scispace - formally typeset
Author

Shweta Kukade

Bio: Shweta Kukade is an academic researcher from the College of Engineering, Pune. The author has contributed to research in the topics of telecommunications links and throughput. The author has an h-index of 1 and has co-authored 4 publications receiving 4 citations.

Papers
Book ChapterDOI
01 Jan 2021
TL;DR: An SC-FDMA link is modeled and designed using LabVIEW; the simulation results show that the PAPR and BER of the proposed SC-FDMA link design are significantly lower than those of OFDMA in the downlink.
Abstract: Single-Carrier Frequency-Division Multiple Access (SC-FDMA) is used for uplink data transmission in Long-Term Evolution (LTE) because of its low Peak-to-Average Power Ratio (PAPR). In this paper, an SC-FDMA link is modeled and designed using LabVIEW. Real-time data transmission through the SC-FDMA link is carried out on a Software-Defined Radio (SDR) testbed implemented with Universal Software Radio Peripheral (USRP) devices. Data are transmitted and received over the SC-FDMA link, and performance is evaluated in terms of PAPR and Bit Error Rate (BER). The simulation results show that the PAPR and BER of the proposed SC-FDMA link design are significantly lower than those of OFDMA in the downlink. In addition, BER is simulated for different modulation schemes over a range of Signal-to-Noise Ratio (SNR) values. The analysis shows how PAPR degrades the OFDMA waveform and benefits the SC-FDMA transmission link under any channel condition.
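The DFT-spreading step that gives SC-FDMA its PAPR advantage over OFDMA can be illustrated numerically. The sketch below (plain NumPy with illustrative FFT sizes and localized subcarrier mapping; not the authors' LabVIEW model) compares the mean PAPR of OFDMA and SC-FDMA blocks built from random QPSK symbols:

```python
import numpy as np

rng = np.random.default_rng(0)

def papr_db(x):
    """Peak-to-average power ratio of a time-domain block, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def qpsk(n):
    """Random unit-energy QPSK symbols."""
    bits = rng.integers(0, 2, (n, 2))
    return ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

N, M = 256, 64          # IFFT size and occupied subcarriers (illustrative)

def ofdma_block():
    grid = np.zeros(N, complex)
    grid[:M] = qpsk(M)                       # symbols straight onto subcarriers
    return np.fft.ifft(grid) * np.sqrt(N)

def scfdma_block():
    grid = np.zeros(N, complex)
    grid[:M] = np.fft.fft(qpsk(M)) / np.sqrt(M)   # DFT spreading before the IFFT
    return np.fft.ifft(grid) * np.sqrt(N)

ofdma = np.mean([papr_db(ofdma_block()) for _ in range(200)])
scfdma = np.mean([papr_db(scfdma_block()) for _ in range(200)])
print(f"mean PAPR  OFDMA: {ofdma:.1f} dB   SC-FDMA: {scfdma:.1f} dB")
```

On typical runs the SC-FDMA mean PAPR comes out a few dB below OFDMA's, consistent with the abstract's claim.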

4 citations

Journal ArticleDOI
TL;DR: In this article, an iterative heuristic optimal resource allocation (HORA) algorithm and a chunk-based resource block allocation (CRBA) scheduling algorithm are proposed to determine resource block (RB) allocation among users while satisfying quality-of-service requirements.
Abstract: Long-Term Evolution-Advanced (LTE-A) is the most widely used and promising technology for 4G and 5G mobile networks. LTE achieves significantly high throughput in wireless networks because it uses multiple-access schemes. We propose an iterative heuristic optimal resource allocation (HORA) algorithm and a chunk-based resource block allocation (CRBA) scheduling algorithm to determine resource block (RB) allocation among users while satisfying quality-of-service (QoS) requirements. The heuristic used in HORA offers a tradeoff between computational complexity and performance: it performs RB and power allocation separately to reduce complexity. In the CRBA algorithm, sets of RBs are allocated to groups of users while power is kept constant for all users, and users are selected based on channel conditions to improve throughput. RB allocation is an additive method that maximizes the data transmission rate and energy efficiency. Channel quality indicator feedback from the user equipment (UE) to the eNodeB plays an important role in selecting appropriate modulation and coding schemes and in assigning chunks of RBs to users in the wideband, frequency-selective, channel-dependent time-frequency domain. Both RB usage and the QoS constraint are considered in the scheduling algorithm. The HORA algorithm assigns most RBs to users with high signal-to-noise ratios and continues RB allocation until the QoS criteria of all users are met, subject to a threshold on the power budget. The problems that arise during continuous resource allocation to scheduled users are APX-hard and NP-hard. An RB and power allocation optimization problem is formulated to maximize the data rate in the cellular network.
The simulation results show that the proposed approaches deliver considerable and robust throughput improvement at the user end.
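The abstract describes HORA as assigning RBs to high-SNR users until every user's QoS target is met. A minimal greedy sketch of that idea (illustrative SNR values and QoS threshold; not the paper's exact algorithm, which also optimizes power) could look like:

```python
import numpy as np

rng = np.random.default_rng(1)

N_RB, N_UE = 24, 4
snr = rng.uniform(0, 20, (N_UE, N_RB))        # per-user, per-RB SNR in dB (illustrative)
rate = np.log2(1 + 10 ** (snr / 10))          # Shannon rate per RB (bit/s/Hz)
qos_min = 6.0                                 # minimum rate each UE must reach (illustrative)

alloc = {u: [] for u in range(N_UE)}
achieved = np.zeros(N_UE)

# Pass 1 (QoS): give the most-starved UE its best remaining RB until all meet qos_min.
free = set(range(N_RB))
while (achieved < qos_min).any() and free:
    u = int(np.argmin(achieved))
    rb = max(free, key=lambda b: rate[u, b])
    free.remove(rb)
    alloc[u].append(rb)
    achieved[u] += rate[u, rb]

# Pass 2 (throughput): remaining RBs go to whichever UE has the highest rate on them.
for rb in sorted(free):
    u = int(np.argmax(rate[:, rb]))
    alloc[u].append(rb)
    achieved[u] += rate[u, rb]
```

The two passes mirror the tradeoff in the abstract: satisfy everyone's QoS first, then spend leftover RBs on the users who extract the most rate from them.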

2 citations

Proceedings ArticleDOI
16 Dec 2022
TL;DR: In this paper, an approach based on Python's OpenCV library is used for face detection and recognition; the system is designed to be sufficiently accurate in comparison with other systems.
Abstract: The system's goal is to make the attendance-marking process quick and easy, because the teacher's routine classroom task of monitoring students while marking attendance and ensuring that no proxy attendance is recorded is time-consuming. To solve this problem efficiently, the system employs an approach based on Python's OpenCV library: the Haar cascade algorithm is used for face detection and the LBPH algorithm for face recognition. The system is designed to be sufficiently accurate in comparison with other systems.
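The LBPH recognizer named above is built on the local binary pattern (LBP) operator. To avoid a dependency on OpenCV, here is a plain-NumPy sketch of the basic 3x3 LBP and the normalized code histogram that LBPH compares between faces (function names are illustrative, not OpenCV's API):

```python
import numpy as np

def lbp_codes(img):
    """Basic 3x3 local binary pattern: each interior pixel becomes an 8-bit
    code, one bit per neighbor that is >= the center pixel."""
    img = np.asarray(img, dtype=np.int32)
    c = img[1:-1, 1:-1]                                   # center pixels
    # neighbor offsets, clockwise from the top-left
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(offs):
        nb = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        code |= (nb >= c).astype(np.int32) << bit
    return code

def lbp_histogram(img, bins=256):
    """Normalized histogram of LBP codes - the per-cell feature that LBPH
    compares between a probe face and the enrolled faces."""
    h, _ = np.histogram(lbp_codes(img), bins=bins, range=(0, bins))
    return h / max(h.sum(), 1)
```

In the full LBPH pipeline the face image is split into a grid of cells, one histogram is computed per cell, and the concatenated histograms are matched by a distance measure such as chi-square.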
Proceedings ArticleDOI
06 Jul 2021
TL;DR: In this article, cross-tier interference is addressed in a two-tier 5G architecture in which massive MIMO is used at the base stations (BSs) serving micro user equipment (MUEs).
Abstract: The 3GPP standard is driving the enhancement of 5G development. As a result, a more durable network must be created to enable massive network access and intelligent communication systems beyond 5G and toward 6G. In this paper, cross-tier interference is addressed in a two-tier 5G architecture. Massive MIMO is used at the base stations (BSs), which serve micro user equipment (MUEs), while small-cell base stations (SBSs), equipped with fewer antennas, serve small-cell user equipment (SUEs). Synchronized co-channel TDD is used to communicate over the available bandwidth, and this mode is exploited to estimate the channel. A correlation matrix of the received signal is estimated from the uplink (UL) pilot signal. We modify this correlation matrix so that the channel can be estimated even though precise knowledge of the interfering channel is unavailable. A modified heuristic optimal resource allocation (HORA) algorithm is used for scheduling. The simulations demonstrate considerable and robust throughput improvement at the user end in the two-tier cellular architecture.
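The correlation-matrix step can be sketched in a few lines: estimate R from uplink pilot snapshots, then modify it so channel estimation stays workable without precise knowledge of the interferers. The modification shown here (diagonal loading) is a generic stand-in rather than the paper's exact method, and all dimensions are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

M, T = 8, 200            # BS antennas and pilot snapshots (illustrative)

# Received pilot snapshots: a Rayleigh channel times a unit-modulus pilot, plus noise.
h = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)
Y = np.empty((M, T), complex)
for t in range(T):
    s = np.exp(1j * rng.uniform(0, 2 * np.pi))
    noise = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) * 0.1
    Y[:, t] = h * s + noise

# Sample correlation matrix of the received signal, R ~ E[y y^H].
R = (Y @ Y.conj().T) / T

# Diagonal loading: one simple way to "modify" R so it stays well conditioned
# when the interfering channels are unknown.
R_mod = R + 0.01 * np.trace(R).real / M * np.eye(M)
```

The loaded matrix R_mod can then be used in place of the true covariance inside an MMSE-style channel estimator.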
Posted ContentDOI
12 Apr 2021
TL;DR: The authors have requested that this preprint be removed from Research Square.
Abstract: The authors have requested that this preprint be removed from Research Square.

Cited by
Journal ArticleDOI
TL;DR: In this article, an iterative heuristic optimal resource allocation (HORA) algorithm and a chunk-based resource block allocation (CRBA) scheduling algorithm are proposed to determine resource block (RB) allocation among users while satisfying quality-of-service requirements.
Abstract: Long-Term Evolution-Advanced (LTE-A) is the most widely used and promising technology for 4G and 5G mobile networks. LTE achieves significantly high throughput in wireless networks because it uses multiple-access schemes. We propose an iterative heuristic optimal resource allocation (HORA) algorithm and a chunk-based resource block allocation (CRBA) scheduling algorithm to determine resource block (RB) allocation among users while satisfying quality-of-service (QoS) requirements. The heuristic used in HORA offers a tradeoff between computational complexity and performance: it performs RB and power allocation separately to reduce complexity. In the CRBA algorithm, sets of RBs are allocated to groups of users while power is kept constant for all users, and users are selected based on channel conditions to improve throughput. RB allocation is an additive method that maximizes the data transmission rate and energy efficiency. Channel quality indicator feedback from the user equipment (UE) to the eNodeB plays an important role in selecting appropriate modulation and coding schemes and in assigning chunks of RBs to users in the wideband, frequency-selective, channel-dependent time-frequency domain. Both RB usage and the QoS constraint are considered in the scheduling algorithm. The HORA algorithm assigns most RBs to users with high signal-to-noise ratios and continues RB allocation until the QoS criteria of all users are met, subject to a threshold on the power budget. The problems that arise during continuous resource allocation to scheduled users are APX-hard and NP-hard. An RB and power allocation optimization problem is formulated to maximize the data rate in the cellular network.
The simulation results show that the proposed approaches deliver considerable and robust throughput improvement at the user end.

2 citations

Proceedings ArticleDOI
06 Jul 2021
TL;DR: In this article, cross-tier interference is addressed in a two-tier 5G architecture in which massive MIMO is used at the base stations (BSs) serving micro user equipment (MUEs).
Abstract: The 3GPP standard is driving the enhancement of 5G development. As a result, a more durable network must be created to enable massive network access and intelligent communication systems beyond 5G and toward 6G. In this paper, cross-tier interference is addressed in a two-tier 5G architecture. Massive MIMO is used at the base stations (BSs), which serve micro user equipment (MUEs), while small-cell base stations (SBSs), equipped with fewer antennas, serve small-cell user equipment (SUEs). Synchronized co-channel TDD is used to communicate over the available bandwidth, and this mode is exploited to estimate the channel. A correlation matrix of the received signal is estimated from the uplink (UL) pilot signal. We modify this correlation matrix so that the channel can be estimated even though precise knowledge of the interfering channel is unavailable. A modified heuristic optimal resource allocation (HORA) algorithm is used for scheduling. The simulations demonstrate considerable and robust throughput improvement at the user end in the two-tier cellular architecture.
Posted ContentDOI
12 Apr 2021
TL;DR: The authors have requested that this preprint be removed from Research Square.
Abstract: The authors have requested that this preprint be removed from Research Square.
Book ChapterDOI
01 Jan 2023
TL;DR: In this paper, the authors propose a cross-layer energy-efficient radio resource management scheme for allocating physical resources to UE and MTC devices in the LTE-A system.
Abstract: With the Internet of Things and other network devices demanding faster and more reliable connectivity, combined with exponential data growth, the LTE-Advanced (LTE-A) system provides high data rates and low latency with increased mobility for multimedia applications, along with improved spectral efficiency. LTE-A also supports machine-type communication (MTC) devices, which communicate with machines without human involvement. MTC devices in IoT applications produce small amounts of sensing and monitoring data with low data-rate requirements. To improve the performance of the LTE-A system, the radio resources for every user in the network should be managed efficiently while meeting each user's QoS requirements. The radio resource management algorithm in the LTE-A network provides cross-layer resource allocation between user equipment (UE) and MTC devices. Since MTC devices are battery-powered sensor nodes, they should consume very little power. In this work, we propose a cross-layer energy-efficient radio resource management scheme for allocating physical resources to UE and MTC devices. The rate-adaptive (RA) principle is used to improve energy efficiency and increase channel capacity. The performance of the system is evaluated in terms of per-user data rate and resource allocation, packet loss ratio, fairness index, packet delay, peak signal-to-noise ratio, and time consumption. Compared with existing algorithms, the simulated results of the proposed algorithm guarantee QoS to the user while consuming less energy for UE and MTC devices, with an increased fairness index and decreased packet loss.
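The rate-adaptive (RA) principle mentioned above maximizes the sum rate under a total power constraint, for which water-filling is the classical solution. A sketch of generic water-filling (not the paper's full cross-layer scheme):

```python
import numpy as np

def waterfill(gains, p_total):
    """Water-filling power allocation: maximize sum log2(1 + g_i * p_i)
    subject to sum(p_i) <= p_total - the rate-adaptive criterion."""
    g = np.asarray(gains, float)
    order = np.argsort(g)[::-1]            # strongest channel first
    g_sorted = g[order]
    p = np.zeros_like(g)
    for k in range(len(g), 0, -1):
        # Candidate water level mu using the k best channels.
        mu = (p_total + np.sum(1.0 / g_sorted[:k])) / k
        levels = mu - 1.0 / g_sorted[:k]
        if levels[-1] >= 0:                # weakest active channel still gets power
            p[order[:k]] = levels
            break
    return p
```

Strong channels sit well above the water level and receive most of the power; channels too weak to clear the level are switched off entirely, which is what saves energy for the battery-powered MTC devices.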
Proceedings ArticleDOI
TL;DR: In this paper, the authors propose a multiuser uplink scheduler based on a modified recursive maximum expansion (RME) algorithm that assigns resource blocks (RBs) to active users optimally to improve the system's spectral efficiency.
Abstract: The 5G architecture allows New Radio to be used alongside Long-Term Evolution-Advanced (LTE-A) and enables cellular mobile networks to handle extraordinarily high traffic volumes. The major challenge in current uplink data transmission systems is making resource blocks (RBs) available in contiguous form and assigning them to active users optimally to improve the system's spectral efficiency. A best N-subset (BNS) minimization technique is proposed to identify the best contiguous chunks of RBs within the existing system bandwidth, followed by a modified recursive maximum expansion (RME) algorithm that assigns the RBs to users; together, the two are referred to as the multiuser uplink scheduler BNSRME algorithm. A constrained utility maximization problem is formulated to allocate RBs to UEs, and the utility matrix is converted into a weighted sum-rate maximization problem in which the weights are updated adaptively based on marginal utility values. In addition, the BNSRME-TH algorithm applies a threshold based on the signal-to-noise ratio, which satisfies multiple users in terms of assigned resources and improves system performance, spectral efficiency, and throughput. This multiuser scheduling strategy is applied to the 5G uplink non-standalone cellular network. The findings show that the proposed multiuser algorithm increases system spectral efficiency by 31.65% compared with existing opportunistic algorithms, and that performance increases further under the MU-MIMO framework.
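The BNS idea of picking the best contiguous chunk of RBs can be illustrated with a sliding-window search. This is a toy version: the paper's BNSRME additionally runs the modified RME assignment and adaptive weight updates on top of the chunk selection.

```python
def best_chunk(metric, n):
    """Among all contiguous windows of n resource blocks, return
    (start, score) of the window with the highest summed metric
    (e.g. per-RB SNR). Contiguity mirrors the SC-FDMA uplink
    constraint that a UE's RBs must be adjacent."""
    if n > len(metric):
        raise ValueError("window larger than bandwidth")
    score = sum(metric[:n])
    best_start, best_score = 0, score
    for s in range(1, len(metric) - n + 1):
        score += metric[s + n - 1] - metric[s - 1]   # slide the window by one RB
        if score > best_score:
            best_start, best_score = s, score
    return best_start, best_score
```

For example, with per-RB metrics [1, 5, 5, 1, 9, 9] and a chunk size of 2, the search selects the window starting at RB 4 with score 18.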