20 results on "Harlisya Harun"
Search Results
2. Development of Hierarchical Analytical Scheduling (HAS) - A Conceptual Framework.
- Author
- Harlisya Harun, M. F. Mohd Sharif, N. Mariun, Ungku Chulan, and Khamizon Khazani
- Published
- 2012
3. Improving the Evaluation Performance of Space-Time Trellis Code through Visualisation.
- Author
- Harlisya Harun, Ungku Azmi Iskandar Ungku Chulan, and Khamizon Khazani
- Published
- 2011
4. The design of Viterbi decoder for low power consumption space time trellis code without adder architecture using RTL model
- Author
- Noor Izzri Abdul Wahab, Harlisya Harun, Mohammad Yazdi Harmin, Mohd Azlan Abu, and Muhd Khairulzaman Abdul Kadir
- Subjects
Computer science, Mechanical Engineering, Space–time trellis code, Geotechnical Engineering and Engineering Geology, Viterbi algorithm, Soft-decision decoder, Viterbi decoder, Mechanics of Materials, Electronic engineering, Generator matrix, Electrical and Electronic Engineering, Arithmetic, Encoder, Decoding methods, Civil and Structural Engineering, Phase-shift keying
- Abstract
Purpose: This paper describes the real-time design and implementation of a space-time trellis code (STTC) decoder using an Altera complex programmable logic device (CPLD). Design/methodology/approach: The code uses a generator matrix designed for a four-state STTC with a quadrature phase-shift keying (QPSK) modulation scheme. Decoding is carried out by maximum-likelihood sequence estimation through the Viterbi algorithm. Findings: The results show that the STTC decoder successfully deciphers the symbols produced by the STTC encoder and fully recovers the original data, with a decoder data rate of 50 Mbps. Originality/value: A 96 per cent improvement in total logic element usage in the MAX V CPLD is achieved compared with previously reported designs. (A minimal, illustrative sketch of the Viterbi decoding step follows this record.)
- Published
- 2016
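The decoding step described in this record is standard maximum-likelihood sequence estimation over a trellis. Below is a minimal, illustrative Python sketch of Viterbi decoding over a generic trellis; the trellis tables, the single-stream received symbols and the squared-Euclidean branch metric are simplifying assumptions for illustration and are not taken from the paper's RTL design.

```python
import numpy as np

def viterbi_decode(received, next_state, outputs, n_states):
    """Maximum-likelihood sequence estimation over a trellis.
    next_state[s][b] / outputs[s][b] give the successor state and expected
    symbol when input b is applied in state s."""
    n_inputs = len(next_state[0])
    INF = float("inf")
    path_metric = [0.0] + [INF] * (n_states - 1)    # assume the encoder starts in state 0
    survivors = [[] for _ in range(n_states)]
    for r in received:
        new_metric = [INF] * n_states
        new_surv = [None] * n_states
        for s in range(n_states):
            if path_metric[s] == INF:
                continue                             # state not reachable yet
            for b in range(n_inputs):
                ns = next_state[s][b]
                bm = abs(r - outputs[s][b]) ** 2     # squared-Euclidean branch metric
                m = path_metric[s] + bm
                if m < new_metric[ns]:               # add-compare-select
                    new_metric[ns] = m
                    new_surv[ns] = survivors[s] + [b]
        path_metric, survivors = new_metric, new_surv
    return survivors[int(np.argmin(path_metric))]    # survivor ending in the best state

# Toy 2-state, binary-input trellis with antipodal outputs (illustrative only).
ns  = [[0, 1], [0, 1]]
out = [[+1, -1], [-1, +1]]
print(viterbi_decode([+0.9, -1.1, -0.8], ns, out, n_states=2))   # -> [0, 1, 0]
```

A real STTC decoder would compute the branch metric from the symbols of all transmit antennas and the estimated channel, but the trellis traversal itself is the same.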
5. Threshold-Based Bit Error Rate for Stopping Iterative Turbo Decoding in a Varying SNR Environment
- Author
- Harlisya Harun, Makhfudzah Mokhtar, Wan Azizun Wan Adnan, Kaharudin Dimyati, and Roslina Mohamad
- Subjects
Computational complexity theory, Computer science, Serial concatenated convolutional codes, Eb/N0, Turbo equalizer, Electronic engineering, Bit error rate, Turbo code, Electrical and Electronic Engineering, Algorithm, Decoding methods
- Abstract
Online bit error rate (BER) estimation (OBE) has been used as a criterion for stopping iterative turbo decoding. However, the criterion only works at high signal-to-noise ratios (SNRs) and fails to terminate early at low SNRs, which adds iterations and increases computational complexity. The failure is caused by an unsuitable BER threshold, obtained by estimating the expected BER performance at high SNRs; this threshold does not indicate the correct termination point for convergence and non-convergence outputs (CNCO). Hence, in this paper, a threshold computation based on the BER of the CNCO is proposed for the OBE stopping criterion (OBEsc). The results show that OBEsc is capable of terminating early in a varying SNR environment. The optimum number of iterations achieved by OBEsc allows large savings in the number of decoding iterations and reduces the delay of iterative turbo decoding. (A minimal sketch of the threshold-based stopping loop follows this record.)
- Published
- 2016
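The stopping rule itself reduces to comparing an online BER estimate with a precomputed threshold after each iteration. A minimal Python sketch of that control loop is given below; `decode_iteration`, `estimate_ber` and `ber_threshold` are placeholder names for the SISO decoder, the online estimator and the CNCO-derived threshold, none of which are specified in this record.

```python
def turbo_decode_with_obe(llr_channel, decode_iteration, estimate_ber,
                          ber_threshold, max_iter=8):
    """Iterative turbo decoding with an OBE-style stopping rule: stop as soon
    as the online BER estimate drops below a precomputed threshold."""
    llr_out = llr_channel
    for it in range(1, max_iter + 1):
        llr_out = decode_iteration(llr_out)         # one full turbo iteration (placeholder)
        if estimate_ber(llr_out) <= ber_threshold:  # converged: terminate early
            return llr_out, it
    return llr_out, max_iter                        # fell back to the iteration limit
```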
6. Pruning the algorithm complexity of the Add-Compare Select Unit (ACSU) for the Viterbi Decoder - A Review
- Author
- Norfadilah Shamsudin, Zuhanis Mansor, Ruhaida Abdul Rashid, Sahzilawati Mohamed Nor, and Harlisya Harun
- Subjects
Computer science, Viterbi algorithm, Viterbi decoder, Wireless, Pruning, Forward error correction, Error detection and correction, Algorithm, Encoder, Decoding methods
- Abstract
Viterbi decoders are widely employed alongside convolutional encoders to provide excellent error-correction performance in digital wireless transmission. To achieve lower error rates, the constraint length (k) of the encoder has to be set to higher values. However, a higher k makes implementations of the Viterbi algorithm increasingly complex, so the power consumption of the Viterbi decoder increases exponentially, which is a disadvantage for many wireless communication devices. Many reduced-complexity decoding techniques for the Viterbi decoder presented in the past have focused on the algorithm and architecture levels. Most of these studies concentrated on the Add-Compare-Select Unit (ACSU) of the decoder, because its repetitive processing contributes most of the complexity. This paper presents a review of several variations of the Viterbi algorithm performed in the ACSU, compares them, and identifies the most efficient. A combined algorithmic approach is proposed at the end of the review for future improvement. (A minimal sketch of a single ACS step follows this record.)
- Published
- 2018
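For reference, the ACSU kernel that these reduced-complexity techniques target is the add-compare-select recursion repeated for every trellis state in every symbol period. A minimal sketch of a single radix-2 ACS step is shown below; it is a generic illustration, not any of the specific pruned variants reviewed in the paper.

```python
def acs_butterfly(pm_a, pm_b, bm_a, bm_b):
    """One radix-2 add-compare-select step: extend the two candidate paths
    that merge into the same next state and keep the better (smaller) one."""
    cand_a = pm_a + bm_a          # add: path metric of predecessor a + branch metric
    cand_b = pm_b + bm_b          # add: path metric of predecessor b + branch metric
    if cand_a <= cand_b:          # compare
        return cand_a, 0          # select: survivor arrives from predecessor a
    return cand_b, 1              # select: survivor arrives from predecessor b
```

Because this step runs for every state at every trellis stage, its workload grows with 2^(k-1), which is why the ACSU dominates the decoder's complexity and power consumption.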
7. Power Consumption Optimization Technique in ACS for Space Time Trellis Code Viterbi Decoder
- Author
- Harlisya Harun, Noor Izzri Abdul Wahab, Mohd Azlan Abu, and Mohammad Yazdi Harmin
- Subjects
Iterative Viterbi decoding, Computer science, Space–time trellis code, Total system power, Power optimization, Soft-decision decoder, Viterbi decoder, Electronic engineering, Decoding methods, Computer hardware, Soft output Viterbi algorithm
- Abstract
Energy-efficient, high-performance, low-power decoding is critical for mobile receiver devices in fast digital communication systems. This paper proposes low-power optimization techniques in the Add-Compare-Select (ACS) unit for the space-time trellis code (STTC) Viterbi decoder, using an STTC Viterbi decoder as the reference case. The paper discusses how to lower the power consumed in the ACS architecture in order to optimize the STTC Viterbi decoder and reduce total power consumption. Based on the design, analysis and power-consumption modelling of the Viterbi decoder, a low-power system for the STTC Viterbi decoder is proposed. The design and optimization of the ACS unit in STTC Viterbi decoding are done in the Verilog HDL language, and the power analysis tool in Altera Quartus II software is used to obtain the total system power consumption. The optimization strategy showed an 83% reduction in power compared to previous studies.
- Published
- 2015
8. Enhancement of cross-entropy based stopping criteria via turning point indicator
- Author
- Harlisya Harun and Roslina Mohamad
- Subjects
Computation, Reliability (computer networking), Frame (networking), Cross entropy, Signal-to-noise ratio, Electronic engineering, Bit error rate, Forward error correction, Algorithm, Decoding methods, Mathematics
- Abstract
Cross-entropy (CE) based stopping criteria are known for terminating turbo decoding iterations early without sacrificing performance in terms of bit error rate (BER). This advantage, however, holds only at high signal-to-noise ratio (SNR); at low SNR the criteria require the maximum number of iterations, resulting in unwanted computation and decoding delay. In this paper, a simple turning point (TP) measurement taken from the BER and average iteration number (AIN) graphs of the Genie stopping criterion is introduced. The motivation is twofold: to terminate the iterations at an early stage at low SNR and to maintain the performance of the existing method at high SNR, using the proposed TP measurement. The simulation results show that the enhanced CE-based stopping criteria improve early termination at low SNR and maintain the BER performance for various frame sizes compared to existing stopping criteria. (A minimal sketch of a CE-based stopping loop with a TP hook follows this record.)
- Published
- 2017
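A common form of CE-based stopping compares the cross-entropy between the soft outputs of successive iterations against a small fraction of its first-iteration value; the paper's contribution adds a turning-point-derived check so that iterations can also stop at low SNR. The sketch below only illustrates where such a check would sit in the loop. The cross-entropy approximation is a simplified Hagenauer-style form, and `decode_iteration`, `tp_check` and `eps` are placeholders rather than the paper's exact quantities.

```python
import numpy as np

def cross_entropy(llr_prev, llr_curr):
    """Simplified Hagenauer-style approximation of the cross-entropy between
    the soft outputs of two successive turbo iterations."""
    delta = llr_curr - llr_prev
    return float(np.sum(np.abs(delta) ** 2 * np.exp(-np.abs(llr_curr))))

def ce_stop_with_tp(llr_channel, decode_iteration, tp_check, eps=1e-3, max_iter=8):
    """CE-based stopping augmented with a turning-point (TP) hook.
    tp_check(llr, it) should return True when the TP-derived condition says
    iteration can stop (e.g. a frame judged non-convergent at low SNR)."""
    llr_prev = t1 = None
    llr = llr_channel
    for it in range(1, max_iter + 1):
        llr = decode_iteration(llr)       # one turbo iteration (placeholder)
        if llr_prev is not None:
            t = cross_entropy(llr_prev, llr)
            t1 = t if t1 is None else t1
            if t < eps * t1:              # classic CE rule: effective at high SNR
                return llr, it
        if tp_check(llr, it):             # TP-derived early exit for low SNR (form assumed)
            return llr, it
        llr_prev = llr
    return llr, max_iter
```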
9. Robustness analysis of the improved minimum descriptive length stopping criterion
- Author
- Nuzli Mohamad Anas, Harlisya Harun, Roslina Mohamad, and Murizah Kassim
- Subjects
Computer science, Code rate, Channel reliability, Robustness (computer science), Statistics, Bit error rate, Turbo decoding, Algorithm, Decoding methods
- Abstract
Most turbo decoding stopping criteria assume that perfect channel reliability information is available at the receiver, since they require a threshold based on signal-to-noise ratio (SNR) information. However, operational environments can vary in frame size, code structure and channel reliability requirements, which further aggravates the difficulty of threshold determination and of detecting convergence or non-convergence. This paper provides a robustness analysis of the improved minimum descriptive length (IMDL) stopping criterion with respect to these factors: frame size, channel reliability requirement, code structure and code rate. The robustness analysis is benchmarked against the well-known Genie stopping criterion based on the average iteration number (AIN) and bit error rate (BER) performance. (A minimal sketch of the Genie benchmark follows this record.)
- Published
- 2016
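The Genie criterion used as the benchmark here stops the decoder as soon as the hard-decision output matches the transmitted frame, which is only realisable in simulation where the transmitted bits are known. A minimal sketch, assuming the usual LLR sign convention (negative LLR maps to bit 1); the decoder interface is a placeholder.

```python
import numpy as np

def genie_stop(llr_channel, decode_iteration, tx_bits, max_iter=8):
    """Genie stopping criterion (simulation benchmark): iterate until the
    hard decisions equal the known transmitted bits, or max_iter is reached."""
    llr = llr_channel
    decisions = None
    for it in range(1, max_iter + 1):
        llr = decode_iteration(llr)              # one turbo iteration (placeholder)
        decisions = (np.asarray(llr) < 0).astype(int)
        if np.array_equal(decisions, tx_bits):   # genie knowledge of the true frame
            return decisions, it
    return decisions, max_iter
```

Practical criteria such as IMDL try to approach this benchmark's average iteration number and BER without access to the transmitted bits.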
10. Robust stopping criterion in signal‐to‐noise ratio uncertainty environment
- Author
- Makhfudzah Mokhtar, Wan Azizun Wan Adnan, Kaharudin Dimyati, Roslina Mohamad, and Harlisya Harun
- Subjects
Soft-decision decoder, Signal-to-noise ratio, Convergence, Bit error rate, Electronic engineering, Turbo decoding, Stopping rules, Electrical and Electronic Engineering, Algorithm, Decoding methods, Degradation (telecommunications), Mathematics
- Abstract
A robust stopping criterion, called online-BER (OB), that can terminate iterative turbo decoding in a signal-to-noise ratio (SNR) uncertainty environment is proposed. OB is based on an online bit error rate (BER) estimate and on BER thresholds; both values are used to detect convergent and non-convergent decoder outputs and to halt iterative decoding at various SNRs. Unlike other well-known stopping criteria, OB does not depend on SNR information in its stopping rules and hence is less complex. OB is also more robust than other stopping criteria in an SNR uncertainty environment, while reducing the average iteration number and causing little degradation in BER performance. (A minimal sketch of an online BER estimate derived from decoder soft outputs follows this record.)
- Published
- 2015
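One common way to form an online BER estimate is to average, over the frame, the per-bit error probability implied by each LLR magnitude, since a hard decision on an LLR of magnitude |L| is wrong with probability 1/(1 + e^|L|). Whether this is exactly the estimator used in the paper is not stated in this record, so treat the sketch as an illustration of the idea.

```python
import numpy as np

def online_ber_estimate(llr):
    """Estimate the frame BER from decoder soft outputs: the hard decision on
    an LLR of magnitude |L| is wrong with probability 1 / (1 + exp(|L|))."""
    return float(np.mean(1.0 / (1.0 + np.exp(np.abs(llr)))))

print(online_ber_estimate(np.array([8.0, -9.5, 7.2])))   # reliable LLRs -> small estimate
print(online_ber_estimate(np.array([0.3, -0.1, 0.2])))   # weak LLRs -> estimate near 0.5
```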
11. Optimal generator matrix G
- Author
- Kaharudin Dimyati, Harlisya Harun, and Ungku Azmi Iskandar Ungku Chulan
- Subjects
Aerospace Engineering, Space–time trellis code, Topology, Coding gain, Transmit diversity, Tree structure, Transmission (telecommunications), Electronic engineering, Generator matrix, Pairwise error probability, Multipath propagation, Mathematics
- Abstract
Multiple-antenna transmission methods are currently being developed worldwide for evolving 3G wireless standards. Space–Time Trellis Code (STTC) has been proven to use transmit diversity efficiently; it effectively exploits the effects of multipath fading to increase the information capacity of multiple-antenna transmission systems. STTC is a channel coding technique that maximises the 'distance' between different symbol matrices, so that the probability of transmission error is decreased when redundant symbols are transmitted; in other words, it maximises the minimum determinant. Maximising the minimum determinant is equivalent to obtaining the optimal generator matrix G. Instead of using state diagrams, the optimal generator matrix G discussed in this paper is obtained using an improved algorithm based on the Lisya tree structure. The optimal generator matrix G in this paper has a minimum determinant of 48, which is the highest coding gain obtained so far. (A minimal sketch of the determinant criterion follows this record.)
- Published
- 2013
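The determinant criterion mentioned above scores a code by the minimum of det(B·Bᴴ) over all pairs of distinct codeword matrices, where B is the codeword-difference matrix; the generator matrix G that maximises this minimum gives the best coding gain under the criterion. A minimal sketch of that scoring (brute force over an explicit list of codeword matrices; the tree-structured search from the paper is not reproduced here):

```python
import itertools
import numpy as np

def min_determinant(codewords):
    """Determinant criterion for STTC design: for every pair of distinct
    codeword matrices C1, C2 (shape n_tx x frame_len), form B = C1 - C2 and
    return the minimum of det(B @ B^H) over all pairs."""
    worst = float("inf")
    for c1, c2 in itertools.combinations(codewords, 2):
        b = c1 - c2
        worst = min(worst, np.linalg.det(b @ b.conj().T).real)
    return worst

# Two toy 2x2 codeword matrices (two transmit antennas, two symbol periods).
c1 = np.array([[1 + 0j, -1], [1, 1]])
c2 = np.array([[1 + 0j, 1], [-1, 1]])
print(min_determinant([c1, c2]))   # 16.0
```

In a full code search, the codeword list would be generated from each candidate G and the G with the largest minimum determinant kept.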
12. Impact of increasing threshold level on higher bit rate in free space optical communications
- Author
- Mohd Adzir Mahdi, Mohd Khazani Abdullah, Salasiah Hitam, Mohd Azrizal Fauzi, Harlisya Harun, and Aduwati Sali
- Subjects
Radio propagation, Haze, Transmission (telecommunications), Computer science, Noise reduction, Electronic engineering, Optical communication, Intensity modulation, Power (physics)
- Abstract
The biggest challenge facing free-space optical deployment is optical signal propagation through different atmospheric conditions such as fog, low cloud, rain, snow, dust, haze and various combinations of these. A transmission and detection technique for free-space optical communications is proposed that employs two beams: one modulated with the data and the other with an inverted version of the same data. A differential detection technique is used at the receiver, whereby the inverted data serve as the reference for the decision, as opposed to the fixed threshold used in the conventional technique. The probability of error under threshold instability is compared for the proposed differential technique and the Intensity Modulation/Direct Detection (IM/DD) technique. A simulation under a heavy rainfall condition of 8.33×10⁻⁴ cm/s, at bit rates of 155 Mbps to 10 Gbps with 0 dBm launch power over a 1.5 km distance, shows that the technique achieves an improvement over conventional IM/DD. The differential detection can support a higher bit rate (up to 9 Gbps) because of its noise-reduction capability, due to the higher threshold level implemented in the receiver, whereas IM/DD can support bit rates of only 2.5 Gbps. This analysis focuses on weather conditions in Malaysia. (A minimal sketch contrasting the two decision rules follows this record.)
- Published
- 2009
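The decision rule being compared is simple: conventional IM/DD slices each received intensity against a fixed threshold, whereas the proposed scheme compares the data beam against the inverted-data beam, so the reference tracks the channel. The toy sketch below contrasts the two rules under a crude on-off signal model; the attenuation and noise figures are arbitrary illustration values, not the paper's rain-attenuation model.

```python
import numpy as np

rng = np.random.default_rng(0)

def im_dd_decision(rx, threshold):
    """Conventional IM/DD: compare each received intensity with a fixed threshold."""
    return (rx > threshold).astype(int)

def differential_decision(rx_data, rx_inverted):
    """Proposed differential detection: the inverted-data beam is the per-bit
    reference, so the decision is a direct comparison of the two beams."""
    return (rx_data > rx_inverted).astype(int)

# Toy on-off keyed frame with attenuation and additive noise (illustrative only).
bits = rng.integers(0, 2, 1000)
atten, noise = 0.3, 0.15
rx_d = atten * bits + noise * rng.standard_normal(bits.size)
rx_i = atten * (1 - bits) + noise * rng.standard_normal(bits.size)

print((im_dd_decision(rx_d, atten / 2) == bits).mean())     # fixed-threshold accuracy
print((differential_decision(rx_d, rx_i) == bits).mean())   # differential accuracy
```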
13. On the robustness of measurement of reliability stopping criterion in turbo iterative decoding
- Author
- Harlisya Harun, Makhfudzah Mokhtar, Kaharudin Dimyati, Roslina Mohamad, and Wan Azizun Wan Adnan
- Subjects
Computer science, Real-time computing, Code rate, Frame size, Channel reliability, Robustness (computer science), Turbo code, Bit error rate, Algorithm, Decoding methods
- Abstract
The measurement of reliability (MOR) stopping criterion is able to terminate early at both low and high signal-to-noise ratio (SNR) while maintaining the bit error rate (BER) performance. However, the performance of MOR has only been demonstrated for one code structure, and hence its robustness in turbo iterative decoding is still unknown. This paper therefore tests the robustness of MOR with respect to the following parameters: frame size, code structure, channel reliability and code rate. We then analyse and compare the average iteration number (AIN) and BER performance of MOR against the benchmark Genie stopping criterion to determine its robustness. From the analysis, MOR shows BER degradation at low code rates. MOR also fails to perform well if the correct channel reliability is not available at the receiver, which results in a large degradation in BER performance. However, with the correct channel reliability, MOR performs close to Genie in terms of BER for various frame sizes, code structures and high code rates. MOR is also able to save AIN at low SNR compared to Genie, which can reduce the delay and complexity of turbo codes.
- Published
- 2015
14. Architecture for Combined Energy and Attitude Control System
- Author
- Harlisya Harun, Mohd Nizam Filipski, Renuganth Varatharajoo, and Ibrahim Mustafa Mehedi
- Subjects
Engineering, Multidisciplinary, Control engineering, Transfer function, Energy storage, Space exploration, Flywheel, Attitude control, Test case, Control theory, Torque
- Abstract
Combining the energy and attitude control systems is a feasible technology for small satellites to improve space missions. In the Combined Energy and Attitude Control System (CEACS), a double rotating flywheel is used to replace the conventional battery for energy storage as well as to control the attitude of an Earth-oriented satellite. Each flywheel is controlled in torque mode, and the energy and attitude inputs to the flywheels' control architecture are also in torque mode. All related mathematical representations, along with the relevant transfer functions and the required numerical calculations, are developed. The goal is to analyse the attitude performance for ideal and non-ideal test cases of a chosen reference mission. (A minimal sketch of a torque allocation for a flywheel pair follows this record.)
- Published
- 2005
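In torque mode, each flywheel's commanded torque must simultaneously serve the energy branch (charging or discharging the wheels) and the attitude branch (producing a net reaction torque on the body). The sketch below shows one simple single-axis allocation for a pair of wheels spinning at different speeds; the sign conventions, the counter-rotating-pair assumption and the example numbers are illustrative and do not reproduce the paper's transfer-function model.

```python
def flywheel_torque_commands(tau_attitude, power_demand, omega1, omega2):
    """Allocate torques to a flywheel pair so that it simultaneously delivers
    a net reaction torque on the body (attitude control) and a charge/discharge
    power (energy storage). Solves the 2x2 system:
        tau1 + tau2               = -tau_attitude   (net reaction torque)
        tau1*omega1 + tau2*omega2 =  power_demand   (mechanical power to the wheels)
    """
    det = omega1 - omega2
    if abs(det) < 1e-9:
        raise ValueError("wheel speeds must differ (e.g. a counter-rotating pair)")
    tau1 = (power_demand + tau_attitude * omega2) / det
    tau2 = -tau_attitude - tau1
    return tau1, tau2

# Example: counter-rotating wheels at +/-314 rad/s, drawing 10 W from the wheels
# while commanding 0.02 N*m of attitude torque about the controlled axis.
print(flywheel_torque_commands(0.02, -10.0, 314.0, -314.0))
```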
15. Early stopping turbo iteration at low SNR for CE-based stopping criteria
- Author
- Makhfudzah Mokhtar, Wan Azizun Wan Adnan, Kaharudin Dimyati, Roslina Mohamad, Nuzli Mohamad Anas, and Harlisya Harun
- Subjects
Early stopping, Reliability (computer networking), Reduction (complexity), Signal-to-noise ratio, Iterated function, Electronic engineering, Bit error rate, Algorithm, Decoding methods, Mathematics
- Abstract
The advantage of cross-entropy-based (CE-based) stopping criteria is that they terminate early at high signal-to-noise ratio (SNR) while maintaining the bit error rate (BER) performance. Unfortunately, the criteria fail to cope with the low-SNR region and make the decoder iterate until the maximum, or an effectively infinite, number of iterations. This paper proposes an early termination technique at low SNR for CE-based stopping criteria, using a decoding threshold derived from the measurement of reliability (MOR) at low SNR. In the simulation results and analysis, we compare the average iteration number (AIN) and BER performance of the proposed combination method with those of the existing CE-based stopping criteria. The results show that the combination method is capable of reducing the AIN at low SNR to a minimum of one iteration while maintaining the same AIN at high SNR as the traditional method. This significant reduction can reduce the delay and complexity of existing CE-based stopping criteria while maintaining the BER performance.
- Published
- 2014
16. Performance analysis of stopping turbo decoder iteration criteria
- Author
- Harlisya Harun, Makhfudzah Mokhtar, Wan Azizun Wan Adnan, Kaharudin Dimyati, and Roslina Mohamad
- Subjects
Turbo equalizer, Theoretical computer science, Computer science, BCJR algorithm, Concatenated error correction code, Turbo code, List decoding, Sequential decoding, Serial concatenated convolutional codes, Algorithm, Linear code
- Abstract
The invention of turbo codes has attracted many researchers to explore various fields related to turbo codes, since they provide better error rate performance than existing codes. This good error rate performance, however, comes at the cost of code complexity, including the complexity of the decoding algorithm and of iterative decoding. This paper reviews the history of turbo codes and their structure. It also discusses turbo decoding stopping criteria and analyses the performance of the fixed and cross-entropy (CE) based stopping criteria. The results show that both criteria fail to terminate early at low SNR. However, the CE-based stopping criteria outperform the fixed stopping criterion at high SNR and are able to save more iterations and delay. This leads to energy savings while maintaining the performance of turbo codes.
- Published
- 2014
17. Improving the evaluation of generator matrix G by initial upper bound estimation
- Author
- U. A. N. U. Chulan, K. Khazani, U. A. I. U. Chulan, and Harlisya Harun
- Subjects
Mathematical optimization, Approximation algorithm, Algorithm design, Generator matrix, Upper and lower bounds, Rayleigh fading, Coding, Mathematics
- Abstract
Space-Time Trellis Code (STTC) can achieve both diversity and coding gains. To maximise the advantages of STTC, two design criteria for slow Rayleigh fading channels are used: the rank criterion and the determinant criterion. This paper focuses on the determinant criterion, which involves the evaluation of the generator matrix G. The evaluation is improved by pruning the search process earlier, made possible by estimating an initial upper bound prior to the search. To reduce the search complexity, the initial upper bound is calculated at the minimal cycle. Comparatively, this reduces the search space by 25.8%. (A minimal sketch of bound-based pruning follows this record.)
- Published
- 2013
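The pruning idea can be illustrated with a generic bound-based search: a candidate generator matrix is evaluated pair by pair, and the evaluation is aborted as soon as its running minimum determinant falls below the best value found so far, so seeding that incumbent with a good initial bound lets even the earliest candidates be pruned. The sketch below shows the mechanism only; the paper's minimal-cycle estimate of the initial bound and its 25.8% figure are not reproduced, and the determinant scoring repeats the criterion shown under record 11.

```python
import itertools
import numpy as np

def min_det_with_pruning(codewords, prune_below):
    """Evaluate one candidate code, aborting once its running minimum
    determinant drops below `prune_below` (it can no longer beat the incumbent)."""
    running_min = float("inf")
    for c1, c2 in itertools.combinations(codewords, 2):
        b = c1 - c2
        running_min = min(running_min, np.linalg.det(b @ b.conj().T).real)
        if running_min < prune_below:
            return None                        # pruned early
    return running_min

def search_generator_matrices(candidates, initial_bound):
    """Keep the candidate with the largest minimum determinant, seeding the
    incumbent with an initial bound estimated before the search."""
    best_val, best_idx = initial_bound, None
    for i, codewords in enumerate(candidates):
        val = min_det_with_pruning(codewords, best_val)
        if val is not None and val > best_val:
            best_val, best_idx = val, i
    return best_idx, best_val
```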
18. Development of Hierarchical Analytical Scheduling (HAS) -- A Conceptual Framework
- Author
- Khamizon Khazani, Ungku Azmi Iskandar Ungku Chulan, M.F. Mohd Sharif, Harlisya Harun, and N. Mariun
- Subjects
Computer architecture, CPU cache, Cache coloring, Computer science, Two-level scheduling, Distributed computing, Cache-only memory architecture, Page cache, Cache, Cache pollution, Cache algorithms
- Abstract
With the rapid growth of multicore processors, memory optimisation is necessary to improve the usage of cache memories, including efforts to improve the performance of fetching data from memory. Current prefetching algorithms depend on usage, such that items that are not frequently used will be removed. This can cause delays when infrequently used items are evicted even though they will be needed by many other processors in the near future. To alleviate this limitation, the hierarchical analytical scheduling (HAS) model is proposed in this paper. The model works by determining the relative importance of data during cache replacement. HAS is based on the concept of hierarchical temporal memory (HTM), in which the scheduling is derived from the priority and similarity of data in both space and time. A simulation of the model, implemented in Octave, is used to analyse its behaviour in a hypothetical cache-management scenario. (A minimal sketch of importance-based eviction follows this record.)
- Published
- 2012
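The replacement decision sketched by HAS is to evict the entry with the lowest relative importance rather than simply the least recently or least frequently used one. The toy cache below blends recency, frequency and a caller-supplied priority into one score; the weights and the priority input are placeholders, and the HTM-derived spatial/temporal similarity term from the paper is not modelled here.

```python
import time

class PriorityCache:
    """Toy cache that evicts the entry with the lowest importance score,
    where importance blends recency, access frequency and a supplied priority."""

    def __init__(self, capacity, w_recency=0.3, w_freq=0.3, w_priority=0.4):
        self.capacity = capacity
        self.w = (w_recency, w_freq, w_priority)
        self.store = {}                       # key -> (value, last_access, count, priority)

    def _score(self, entry):
        _, last, count, prio = entry
        wr, wf, wp = self.w
        return wr / (1.0 + time.time() - last) + wf * count + wp * prio

    def put(self, key, value, priority=0.0):
        count = self.store[key][2] if key in self.store else 0
        if key not in self.store and len(self.store) >= self.capacity:
            victim = min(self.store, key=lambda k: self._score(self.store[k]))
            del self.store[victim]            # evict the least important entry
        self.store[key] = (value, time.time(), count + 1, priority)

    def get(self, key):
        value, _, count, prio = self.store[key]
        self.store[key] = (value, time.time(), count + 1, prio)
        return value

cache = PriorityCache(capacity=2)
cache.put("a", 1, priority=0.9)   # important item
cache.put("b", 2, priority=0.1)
cache.put("c", 3, priority=0.1)   # "b" is evicted; "a" survives despite its age
print(sorted(cache.store))        # ['a', 'c']
```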
19. Improving the Evaluation Performance of Space-Time Trellis Code through STTC Visualisation Tool
- Author
- Harlisya Harun, Ungku Azmi Iskandar Ungku Chulan, and Khamizon Khazani
- Subjects
Evaluation strategy, Theoretical computer science, Computer science, Heuristic, Space–time trellis code, Trellis (graph), Machine learning, Visualization, Code (cryptography), Generator matrix, Pruning, Artificial intelligence
- Abstract
In this paper we present a new visualisation approach for improving the evaluation strategy of the space-time trellis code (STTC) generator matrix G. To our knowledge, although visualisation is widely used to handle a variety of problems, it has never been employed specifically to solve the complexity problems related to generator matrix G evaluation. Most approaches are either mathematically or algorithmically inclined; as such, they tend to offer a series of refinements to the currently available method but do not provide fresh insight into the problem at hand. Compared with the enhancement strategy discovered via the conventional approach (i.e., by analysing the algorithm), visualisation inspired an entirely different pruning technique that outperformed the common approach by 20%.
- Published
- 2013
20. Improving FEC performance for baseband signal processing using a lookup table
- Author
- Harlisya Harun, K. Dimyati, and Roslina Mohamad
- Subjects
Channel code, Theoretical computer science, Computer science, Channel capacity, Transmission (telecommunications), Viterbi decoder, Convolutional code, Baseband, Forward error correction, Computer hardware, Communication channel, Data transmission
- Abstract
Forward error correction (FEC) plays an increasingly important role in today's communication systems. It improves the capacity of a channel by adding carefully designed redundant information to the data being transmitted through the channel; the process of adding this redundant information is known as channel coding. In most wireless communication systems, convolutional coding is the preferred method of channel coding for overcoming transmission distortions. This paper discusses the implementation of convolutional coding and Viterbi decoding on the Texas Instruments (TI) TMS320C6711 DSP to improve FEC performance in terms of execution time and real-time capability. (A minimal sketch of lookup-table-based convolutional encoding follows this record.)
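A common way to reach real-time throughput for convolutional coding on a fixed-point DSP is to precompute the encoder's state-transition and output tables so that the per-bit work reduces to table lookups. The sketch below builds such tables for the standard rate-1/2, constraint-length-7 code (generator polynomials 171/133 octal); whether this is the exact code, bit ordering or table layout used on the TMS320C6711 is not stated in this record, so treat it as illustrative.

```python
def build_tables(k=7, g=(0o171, 0o133)):
    """Precompute next-state and output lookup tables for a rate-1/2
    convolutional code with constraint length k and generator polynomials g."""
    n_states = 1 << (k - 1)
    next_state = [[0, 0] for _ in range(n_states)]
    output = [[0, 0] for _ in range(n_states)]
    for s in range(n_states):
        for bit in (0, 1):
            reg = (bit << (k - 1)) | s                         # shift the new bit in
            parities = [bin(reg & poly).count("1") & 1 for poly in g]
            next_state[s][bit] = reg >> 1                      # drop the oldest bit
            output[s][bit] = (parities[0] << 1) | parities[1]  # two coded bits per input bit
    return next_state, output

def encode(bits, tables):
    """Encode a bit sequence using only table lookups (no per-bit convolution)."""
    next_state, output = tables
    state, coded = 0, []
    for b in bits:
        coded.append(output[state][b])
        state = next_state[state][b]
    return coded

print(encode([1, 0, 1, 1, 0, 0], build_tables()))
```

The matching Viterbi decoder can reuse the same tables when computing its branch metrics.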