Author

Ali Lalbakhsh

Bio: Ali Lalbakhsh is an academic researcher from Macquarie University. The author has contributed to research topics: Antenna (radio) & Directivity. The author has an h-index of 20 and has co-authored 82 publications receiving 1,257 citations. Previous affiliations of Ali Lalbakhsh include Islamic Azad University & Commonwealth Scientific and Industrial Research Organisation.


Papers
Journal ArticleDOI
TL;DR: A response to combat the virus through Artificial Intelligence (AI) is presented, in which different aspects of information from a continuum of structured and unstructured data sources are combined to form user-friendly platforms for physicians and researchers.
Abstract: The COVID-19 outbreak has put the whole world in an unprecedentedly difficult situation, bringing life around the world to a frightening halt and claiming thousands of lives. With COVID-19 having spread to 212 countries and territories, and infection and death counts reaching 5,212,172 and 334,915 (as of May 22, 2020), it remains a real threat to public health systems. This paper presents a response to combat the virus through Artificial Intelligence (AI). Several Deep Learning (DL) methods are described toward this goal, including Generative Adversarial Networks (GANs), Extreme Learning Machines (ELMs), and Long Short-Term Memory (LSTM) networks. The paper delineates an integrated bioinformatics approach in which different aspects of information from a continuum of structured and unstructured data sources are combined to form user-friendly platforms for physicians and researchers. The main advantage of these AI-based platforms is accelerating the diagnosis and treatment of COVID-19. The most recent related publications and medical reports were investigated to choose network inputs and targets that could facilitate a reliable Artificial Neural Network-based tool for COVID-19-related challenges. Furthermore, each platform has specific inputs, including various forms of data, such as clinical data and medical imaging, which can improve the performance of the introduced approaches toward the best responses in practical applications.

358 citations
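Of the DL methods named in the abstract above, the Extreme Learning Machine is the simplest to sketch: a random, untrained hidden layer followed by a closed-form least-squares solve for the output weights. The toy example below fits a sine curve; the data, layer size, and activation are illustrative assumptions, not the paper's actual COVID-19 platform.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data standing in for any signal to be modeled
X = np.linspace(0, np.pi, 200).reshape(-1, 1)
y = np.sin(X)

# ELM hidden layer: weights are drawn randomly and never trained
n_hidden = 50
W = rng.normal(size=(1, n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)            # random nonlinear features

# Output weights are solved in one shot by least squares
beta = np.linalg.lstsq(H, y, rcond=None)[0]
mse = np.mean((H @ beta - y) ** 2)
```

Because training reduces to a single linear solve, ELMs are orders of magnitude faster to fit than backpropagation-trained networks, which is the property that makes them attractive for rapid-turnaround diagnostic platforms.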

Journal ArticleDOI
TL;DR: In this paper, an efficient particle swarm optimization (PSO) algorithm was developed to design a near-field time-delay equalizer metasurface (TDEM) for the purpose of improving directivity and radiation patterns of classical electromagnetic band-gap resonator antennas.
Abstract: This letter presents an efficient particle swarm optimization (PSO) algorithm developed to design a near-field time-delay equalizer metasurface (TDEM) for the purpose of improving the directivity and radiation patterns of classical electromagnetic band-gap resonator antennas. Triple layers of conductive printed patterns in the metasurface were optimized by the PSO algorithm to systematically design the TDEM. Predicted and measured results show a significant improvement in antenna performance, including a 9.6 dB enhancement in directivity, lower sidelobes, and higher gain. The measured directivity of the prototype is 21 dBi, and the 3 dB bandwidth is 11.8%.

165 citations
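The PSO loop underlying such a design flow can be sketched generically. The code below minimizes a placeholder sphere objective rather than the paper's actual TDEM cost function (which would involve full-wave simulation of the metasurface layers); the function name, bounds, and hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimize f over R^dim with a basic global-best particle swarm."""
    x = rng.uniform(-5, 5, (n_particles, dim))   # particle positions
    v = np.zeros_like(x)                         # particle velocities
    pbest = x.copy()                             # per-particle best positions
    pbest_f = np.apply_along_axis(f, 1, x)
    gbest = pbest[pbest_f.argmin()].copy()       # swarm-wide best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # inertia + cognitive (own best) + social (swarm best) terms
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        fx = np.apply_along_axis(f, 1, x)
        improved = fx < pbest_f
        pbest[improved] = x[improved]
        pbest_f[improved] = fx[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# Placeholder objective: in the paper's setting, f would score a candidate
# set of printed-pattern parameters by the resulting antenna performance.
best, best_f = pso(lambda p: np.sum(p ** 2), dim=3)
```

In an antenna-design context each particle encodes a candidate geometry, and the expensive electromagnetic evaluation inside `f` is what motivates an efficient, quickly converging swarm.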

Journal ArticleDOI
TL;DR: In this paper, the authors presented a design methodology for a compact low-cost partially reflecting surface (PRS) for a wideband high-gain resonant cavity antenna (RCA) which requires only a single commercial dielectric slab.
Abstract: This communication presents a design methodology for a compact low-cost partially reflecting surface (PRS) for a wideband high-gain resonant cavity antenna (RCA) which requires only a single commercial dielectric slab. The PRS has one nonuniform double-sided printed dielectric, which exhibits a negative transverse-reflection magnitude gradient and, at the same time, a progressive reflection phase gradient over frequency. In addition, a partially shielded cavity is proposed as a method to optimize the directivity bandwidth and the peak directivity of RCAs. A prototype of the PRS was fabricated and tested with a partially shielded cavity, showing good agreement between the predicted and measured results. The measured peak directivity of the antenna is 16.2 dBi at 11.4 GHz with a 3 dB bandwidth of 22%. The measured peak gain and 3 dB gain bandwidth are 15.75 dBi and 21.5%, respectively. The PRS has a radius of 29.25 mm ($1.1\lambda_{0}$) with a thickness of 1.52 mm ($0.12\lambda_{g}$), and the overall height of the antenna is $0.6\lambda_{0}$, where $\lambda_{0}$ and $\lambda_{g}$ are the free-space and guided wavelengths at the center frequency of 11.4 GHz.

89 citations

Journal ArticleDOI
TL;DR: In this paper, a microstrip lowpass filter is proposed to achieve an ultra-wide stopband with 12th-harmonic suppression and extremely sharp skirt characteristics; the operating mechanism of the filter is investigated based on the proposed equivalent-circuit model, and an overall good agreement between measured and simulated results is observed.
Abstract: A novel microstrip lowpass filter is proposed to achieve an ultra-wide stopband with 12th-harmonic suppression and extremely sharp skirt characteristics. The transition band extends from 1.26 GHz to 1.37 GHz at the -3 dB and -20 dB points, respectively. The operating mechanism of the filter is investigated based on the proposed equivalent-circuit model, and the role of each section in creating null points is theoretically discussed. An overall good agreement between measured and simulated results is observed.

84 citations

Journal ArticleDOI
Abstract: This paper presents an elegant yet straightforward design procedure for a compact rat-race coupler (RRC) with extended harmonic suppression. In the proposed approach, the coupler's conventional $\lambda/4$ transmission lines (TLs) are replaced by a specialized TL that offers significant size reduction and harmonic-elimination capabilities. The design procedure is verified through theoretical, circuit, and electromagnetic (EM) analyses, showing excellent agreement among the different analyses and the measured results. The circuit and EM results show that the proposed TL replicates the frequency behaviour of the conventional one at the design frequency of 1.8 GHz while enabling harmonic suppression up to the 7th harmonic and a size reduction of 74%. According to the measured results, the RRC has a fractional bandwidth of 20%, with an insertion loss of around 0.2 dB and an isolation level better than 35 dB. Furthermore, the total footprint of the proposed RRC is only 31.7 mm $\times$ 15.9 mm, corresponding to $0.28\lambda \times 0.14\lambda$, where $\lambda$ is the guided wavelength at 1.8 GHz.

75 citations
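For context on the conventional $\lambda/4$ line that the proposed TL replicates: at its design frequency a lossless quarter-wave section acts as an impedance inverter, $Z_{in} = Z_0^2/Z_L$. A minimal ABCD-matrix check of that textbook property is sketched below; the impedance values are illustrative, not taken from the paper.

```python
import numpy as np

def tl_abcd(z0, beta_l):
    # ABCD matrix of a lossless transmission line with characteristic
    # impedance z0 and electrical length beta_l (radians)
    return np.array([[np.cos(beta_l), 1j * z0 * np.sin(beta_l)],
                     [1j * np.sin(beta_l) / z0, np.cos(beta_l)]])

# Quarter-wave line: electrical length pi/2 at the design frequency
z0, zl = 70.7, 50.0
A, B, C, D = tl_abcd(z0, np.pi / 2).ravel()

# Input impedance of the line terminated in zl
zin = (A * zl + B) / (C * zl + D)   # expect z0**2 / zl (impedance inversion)
```

Cascading such sections (multiplying their ABCD matrices) is the standard way to analyze replacement TLs like the one in the paper, since the substitute structure must reproduce the same ABCD response at 1.8 GHz while diverging from it at the harmonics.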


Cited by
Journal ArticleDOI
TL;DR: In this paper, a comprehensive survey of the most important aspects of DL, including enhancements recently added to the field, is provided, together with the challenges and suggested solutions to help researchers understand the existing research gaps.
Abstract: In the last few years, the deep learning (DL) computing paradigm has been deemed the gold standard in the machine learning (ML) community. Moreover, it has gradually become the most widely used computational approach in the field of ML, achieving outstanding results on several complex cognitive tasks, matching or even beating human performance. One of the benefits of DL is the ability to learn from massive amounts of data. The DL field has grown quickly in the last few years and has been used successfully to address a wide range of traditional applications. More importantly, DL has outperformed well-known ML techniques in many domains, e.g., cybersecurity, natural language processing, bioinformatics, robotics and control, and medical information processing, among many others. Although several works have reviewed the state of the art in DL, each of them has tackled only one aspect of the field, leading to an overall lack of holistic coverage. Therefore, this contribution proposes a more holistic approach in order to provide a more suitable starting point from which to develop a full understanding of DL. Specifically, this review attempts a more comprehensive survey of the most important aspects of DL, including the enhancements recently added to the field. In particular, the paper outlines the importance of DL and presents the types of DL techniques and networks. It then presents convolutional neural networks (CNNs), the most utilized DL network type, and describes the development of CNN architectures together with their main features, starting with the AlexNet network and closing with the High-Resolution Network (HRNet). Finally, the challenges and suggested solutions are presented to help researchers understand the existing research gaps, followed by a list of the major DL applications. Computational tools including FPGAs, GPUs, and CPUs are summarized along with a description of their influence on DL. The paper ends with the evolution matrix, benchmark datasets, and a summary and conclusion.

1,084 citations

Journal ArticleDOI
22 Mar 2021
TL;DR: In this paper, the authors present a comprehensive view of machine learning algorithms that can be applied to enhance the intelligence and capabilities of an application, and highlight the challenges and potential research directions based on their study.
Abstract: In the current age of the Fourth Industrial Revolution (4IR or Industry 4.0), the digital world holds a wealth of data, such as Internet of Things (IoT) data, cybersecurity data, mobile data, business data, social media data, health data, etc. To intelligently analyze these data and develop the corresponding smart and automated applications, knowledge of artificial intelligence (AI), particularly machine learning (ML), is key. Various types of machine learning algorithms, such as supervised, unsupervised, semi-supervised, and reinforcement learning, exist in the area. In addition, deep learning, which is part of a broader family of machine learning methods, can intelligently analyze data on a large scale. In this paper, we present a comprehensive view of these machine learning algorithms, which can be applied to enhance the intelligence and capabilities of an application. Thus, this study's key contribution is explaining the principles of different machine learning techniques and their applicability in various real-world application domains, such as cybersecurity systems, smart cities, healthcare, e-commerce, agriculture, and many more. We also highlight the challenges and potential research directions based on our study. Overall, this paper aims to serve as a reference point for both academia and industry professionals, as well as for decision-makers, in various real-world situations and application areas, particularly from the technical point of view.

659 citations
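As a minimal illustration of the supervised category the abstract above lists, here is a k-nearest-neighbours classifier on synthetic two-class data. The dataset, class centers, and choice of k are illustrative assumptions, not drawn from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class data: 50 points around (0, 0) and 50 around (3, 3)
X0 = rng.normal(0.0, 0.5, (50, 2))
X1 = rng.normal(3.0, 0.5, (50, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)

def knn_predict(Xtr, ytr, query, k=5):
    """Label a query point by majority vote among its k nearest neighbours."""
    d = np.linalg.norm(Xtr - query, axis=1)          # distances to query
    nearest = np.argsort(d)[:k]                      # indices of k closest
    return np.bincount(ytr[nearest]).argmax()        # majority class

label = knn_predict(X, y, np.array([0.2, -0.1]))     # lands in the first cluster
```

kNN is "lazy" supervised learning: there is no training step, so all the cost is at query time, which is the kind of trade-off such surveys contrast against model-based methods like decision trees or neural networks.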

01 Nov 1984
TL;DR: In this article, a substrate-superstrate printed antenna geometry which allows for large antenna gain is presented, asymptotic formulas for gain, beamwidth, and bandwidth are given, and the bandwidth limitation of the method is discussed.
Abstract: Resonance conditions for a substrate-superstrate printed antenna geometry which allow for large antenna gain are presented. Asymptotic formulas for gain, beamwidth, and bandwidth are given, and the bandwidth limitation of the method is discussed. The method is extended to produce narrow patterns about the horizon, and directive patterns at two different angles.

568 citations
