
Showing papers by "Parma Nand published in 2017"


Journal ArticleDOI
TL;DR: The paper explores NLG architectures and tasks classed under document planning, micro-planning and surface realization modules and identifies the gaps existing in the NLG research which require further work in order to make NLG a widely usable technology.
Abstract: Natural Language Generation (NLG) is defined as the systematic approach for producing human understandable natural language text based on non-textual data or from meaning representations. This is a significant area which empowers human-computer interaction. It has also given rise to a variety of theoretical as well as empirical approaches. This paper intends to provide a detailed overview and a classification of the state-of-the-art approaches in Natural Language Generation. The paper explores NLG architectures and tasks classed under document planning, micro-planning and surface realization modules. Additionally, this paper also identifies the gaps existing in the NLG research which require further work in order to make NLG a widely usable technology.
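The three-stage pipeline surveyed above can be illustrated with a minimal sketch. All data structures and function names here are illustrative assumptions for a toy weather report, not an implementation from the paper.

```python
# Toy sketch of the classic three-stage NLG pipeline:
# document planning -> micro-planning -> surface realization.

def document_planner(data):
    """Document planning: select and order the messages to convey."""
    return [("temperature", data["temp"]), ("condition", data["cond"])]

def micro_planner(messages):
    """Micro-planning: choose lexical items and build sentence specs."""
    specs = []
    for topic, value in messages:
        if topic == "temperature":
            specs.append({"subj": "the temperature", "verb": "is",
                          "obj": f"{value} degrees"})
        elif topic == "condition":
            specs.append({"subj": "the sky", "verb": "is", "obj": value})
    return specs

def surface_realizer(specs):
    """Surface realization: render sentence specs as grammatical text."""
    return " ".join(f"{s['subj'].capitalize()} {s['verb']} {s['obj']}."
                    for s in specs)

report = surface_realizer(micro_planner(document_planner(
    {"temp": 21, "cond": "clear"})))
print(report)  # The temperature is 21 degrees. The sky is clear.
```

Each stage consumes the previous stage's output, which is the modular separation the survey's classification is organized around.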

48 citations


Journal ArticleDOI
TL;DR: It is demonstrated that it is safe and feasible to use the 29 mm SAPIEN 3 in patients with annular dimensions greater than those recommended, with minimal balloon overfilling.
Abstract: Background As the indications for transcatheter aortic valve implantation (TAVI) have expanded, so too have the demands on interventionists to allow as many patients as possible to access this technology. Methods We retrospectively reviewed our TAVI database for patients who had received a 29 mm SAPIEN 3 valve despite having an annular area greater than the manufacturer-recommended upper limit of 683 mm², as determined by multi-detector computed tomography (MDCT). Procedural and inpatient outcome data were collected. Results The study population was 5 of 121 patients receiving a SAPIEN 3 valve since it became available in March 2015. Their annular areas ranged from 691 to 800 mm². Valve deployment was successful in all patients. The deployment balloon volume was nominal, except for an additional 1 ml in one patient. No patient had a new indication for permanent pacing, and no significant valvular or paravalvular regurgitation (PVR) was identified on post-procedure transthoracic echocardiography. All patients survived to hospital discharge. Conclusions In this select group of patients we have demonstrated that it is safe and feasible to use the 29 mm SAPIEN 3 in patients with annular dimensions greater than those recommended, with minimal balloon overfilling.

14 citations


Proceedings ArticleDOI
01 May 2017
TL;DR: A cryptographic technique (a combination of RSA and a symmetric key) and a proposed algorithm are used by which a message can be broadcast from one node to other nodes securely and efficiently.
Abstract: The Vehicular Ad-hoc Network (VANET) is a special type of MANET in which the nodes are high-mobility vehicles. Because of this high mobility, the topology changes very rapidly, and security is therefore the main issue in a VANET. The network is accessible from everywhere within the radio range, so a malicious node can easily target it. The wormhole attack is a dangerous attack that can break the security of a VANET: a malicious node captures messages at one location and tunnels them to another malicious node, which replays them locally. In this paper, a cryptographic technique (a combination of RSA and a symmetric key) and a proposed algorithm are used so that a message (in the form of packets) can be broadcast from one node to other nodes securely and efficiently. In this technique, RSA is used to distribute the shared key and the identifiers (IDs) of the nodes, and further broadcasting of messages with the node ID is carried out under shared-key encryption.
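The hybrid scheme described above can be sketched as follows. This is a toy illustration only: textbook RSA with tiny primes and an XOR keystream are used purely to show the two phases (RSA key distribution, then symmetric broadcast) and are NOT secure, nor the paper's actual parameters.

```python
# Phase 1: RSA distributes a shared symmetric key to a node.
# Phase 2: messages (carrying a node ID) are broadcast under that key.
import hashlib

# Textbook RSA keypair for a receiving node (tiny primes, insecure toy).
p, q = 61, 53
n, e = p * q, 17
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)          # private exponent (Python 3.8+ modular inverse)

shared_key = 42              # symmetric key chosen by the sender (toy value)

# Key distribution: encrypt the shared key with the node's RSA public key.
cipher_key = pow(shared_key, e, n)
recovered_key = pow(cipher_key, d, n)

def keystream(key, length):
    """Derive a repeatable keystream from the shared key (illustrative)."""
    stream, counter = b"", 0
    while len(stream) < length:
        stream += hashlib.sha256(f"{key}:{counter}".encode()).digest()
        counter += 1
    return stream[:length]

def sym_encrypt(key, msg):
    """XOR the message with the keystream (same call decrypts)."""
    return bytes(a ^ b for a, b in zip(msg, keystream(key, len(msg))))

# Broadcast phase: packet carries the node ID, encrypted under the shared key.
packet = sym_encrypt(shared_key, b"ID:7|hello VANET")
print(sym_encrypt(recovered_key, packet))  # b'ID:7|hello VANET'
```

In a real deployment the symmetric cipher would be a standard authenticated scheme and the RSA keys would be of practical size; the sketch only mirrors the paper's division of labour between the two primitives.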

12 citations


Proceedings ArticleDOI
05 May 2017
TL;DR: The DSR and AODV routing protocols are compared and the simulation is analysed for the nodes; Packet Delivery Ratio, throughput and end-to-end delay are calculated.
Abstract: A wireless sensor network is an application tool that plays an important role in monitoring remote environments and tracking targets. It consists of sensor nodes that are small, cheap and intelligent, and that consume little battery power; data is transferred through these nodes. There are many routing techniques that provide significant benefits to wireless sensor networks in terms of both reliability and performance. Many routing techniques have been designed for data transmission in wireless sensor networks, but we consider only two popular ones: AODV and DSR. In this paper we compare the DSR and AODV routing protocols, analysing the simulation for the nodes and calculating the Packet Delivery Ratio, throughput and end-to-end delay. The simulations are carried out using the ns-2 tool.
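The three metrics compared in the paper are conventionally computed from send/receive events parsed out of a simulator trace (e.g. an ns-2 trace file). A minimal sketch, with made-up sample trace tuples:

```python
# (packet_id, send_time_s, recv_time_s or None if dropped, size_bytes)
trace = [
    (1, 0.10, 0.18, 512),
    (2, 0.20, 0.31, 512),
    (3, 0.30, None, 512),   # dropped packet
    (4, 0.40, 0.52, 512),
]

sent = len(trace)
delivered = [t for t in trace if t[2] is not None]

# Packet Delivery Ratio: received packets / sent packets.
pdr = len(delivered) / sent

# Throughput: delivered bits over the observed interval.
interval = max(t[2] for t in delivered) - min(t[1] for t in trace)
throughput_bps = sum(t[3] * 8 for t in delivered) / interval

# End-to-end delay: mean of (receive time - send time) over delivered packets.
avg_delay = sum(t[2] - t[1] for t in delivered) / len(delivered)

print(f"PDR: {pdr:.2%}")                                  # PDR: 75.00%
print(f"Throughput: {throughput_bps:.0f} bit/s")
print(f"Avg end-to-end delay: {avg_delay * 1000:.1f} ms")
```

The same arithmetic applies whichever protocol generated the trace, which is what makes the AODV/DSR comparison possible on equal terms.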

10 citations


01 Jan 2017
TL;DR: A framework to generate patterns which can be used to lexicalize Linked Data is presented, which achieved 70.36% accuracy and a Mean Reciprocal Rank of 0.72 for five DBpedia ontology classes, generating 101 lexicalizations.
Abstract: The concept of Linked Data has attracted increased interest in recent times due to its free and open availability and the sheer volume of data. We present a framework to generate patterns which can be used to lexicalize Linked Data. We use DBpedia as the Linked Data resource, as it is one of the most comprehensive and fastest-growing Linked Data resources freely available. The framework incorporates a text preparation module which collects and prepares the text, after which Open Information Extraction is employed to extract relations which are then aligned with triples to identify patterns. The framework also mines patterns using the lexical semantic resources VerbNet and WordNet. The framework achieved 70.36% accuracy and a Mean Reciprocal Rank of 0.72 for five DBpedia ontology classes, generating 101 lexicalizations.
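The alignment step described above can be sketched as follows. The strings, the DBpedia-style triple and the label-matching logic are simplified assumptions for illustration; the framework's actual alignment is considerably richer.

```python
# Align an Open Information Extraction triple from text with a Linked Data
# triple: when subject and object match, the relation phrase becomes a
# reusable lexicalization pattern with slots.

def mine_pattern(oie_triple, data_triple):
    """Replace the aligned subject/object with slots to form a pattern."""
    subj_text, rel_text, obj_text = oie_triple
    subj, _predicate, obj = data_triple
    # Align by simple label matching (a stand-in for richer alignment).
    if subj_text.lower() == subj.lower() and obj_text.lower() == obj.lower():
        return f"?subject {rel_text} ?object"
    return None

oie = ("Stephen Hawking", "was born in", "Oxford")          # from text
triple = ("Stephen_Hawking", "birthPlace", "Oxford")        # from DBpedia

pattern = mine_pattern(oie,
                       (triple[0].replace("_", " "), triple[1], triple[2]))
print(pattern)  # ?subject was born in ?object
```

Once mined, such a pattern can lexicalize any other triple sharing the same predicate, which is what makes 101 lexicalizations attainable from a modest pattern set.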

7 citations


Proceedings ArticleDOI
01 May 2017
TL;DR: In this work, a Reliable Energy Efficient Adaptation (REEA) algorithm is proposed that considers hop count, link quality and energy consumption for reliable route discovery; energy consumption is decreased in REEA by using a piggyback scheme.
Abstract: Advances in telecommunications, wireless communication, wireless sensors and related technologies have given rise to a new field of research called Wireless Body Area Networks (WBANs). In a WBAN, miniaturized sensor nodes are implanted in the bloodstream or placed on the skin. These sensor nodes are battery dependent; once the battery dies, the node dies too. Minimum energy consumption with reliable data transmission is therefore the primary requirement in a WBAN, and achieving it calls for a self-adaptive routing protocol. In this work, a Reliable Energy Efficient Adaptation (REEA) algorithm is proposed that considers hop count, link quality and energy consumption for reliable route discovery. Energy consumption is decreased in REEA by using a piggyback scheme in which the residual energy of the various nodes is sent to their neighbours on demand, and paths containing nodes with low residual energy are skipped. The REEA algorithm also uses a self-adaptation approach for even distribution of load across the network, minimizing the number of dead nodes and avoiding the energy-hole problem. The proposed approach has been applied to a real data set obtained from NICTA and simulated using the Castalia simulator. The results of the proposed REEA approach are compared with the AODV and LABILE routing algorithms on the basis of various metrics.
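A route-cost calculation in the spirit of the scheme described above can be sketched as follows. The weights, the energy threshold and the node data are illustrative assumptions, not the paper's values; only the idea of scoring on hop count, link quality and residual energy while skipping energy-depleted nodes is taken from the abstract.

```python
# Score candidate routes on hop count, average link quality and average
# residual energy; skip any path through a node below the energy threshold.

def route_cost(route, energy, quality, min_energy=0.2,
               w_hops=1.0, w_quality=2.0, w_energy=2.0):
    """Return a cost for the route, or None if an intermediate node is depleted."""
    if any(energy[n] < min_energy for n in route[1:-1]):
        return None  # skip paths through nodes with low residual energy
    hops = len(route) - 1
    avg_q = sum(quality[n] for n in route) / len(route)
    avg_e = sum(energy[n] for n in route) / len(route)
    # Lower cost is better: penalize hops, poor links and drained batteries.
    return w_hops * hops + w_quality * (1 - avg_q) + w_energy * (1 - avg_e)

energy = {"S": 1.0, "A": 0.9, "B": 0.8, "C": 0.1, "D": 0.7}
quality = {"S": 1.0, "A": 0.9, "B": 0.95, "C": 0.9, "D": 0.8}

routes = [["S", "C", "A"], ["S", "B", "D", "A"]]
scored = [(r, route_cost(r, energy, quality)) for r in routes]
best = min((rc for rc in scored if rc[1] is not None), key=lambda rc: rc[1])
print(best[0])  # ['S', 'B', 'D', 'A'] -- the shorter route through C is skipped
```

Note that the longer route wins here precisely because the hop-count advantage of the short route cannot compensate for routing through a nearly dead node, which is the energy-hole behaviour the abstract describes.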

6 citations


Journal ArticleDOI
TL;DR: A new answer construction and presentation system, which utilizes the linguistic structure of the source question and the factoid answer to construct an answer sentence which closely emulates a human-generated answer.
Abstract: Question Answering over Linked Data (QALD) refers to the use of Linked Data by question answering systems, and in recent times this has become increasingly popular as it opens up a massive Linked Data cloud which is a rich source of encoded knowledge. However, a major shortfall of current QALD systems is that they focus on presenting a single fact or factoid answer which is derived using SPARQL (SPARQL Protocol and RDF Query Language) queries. There is now an increased interest in the development of human-like systems which would be able to answer questions and even hold conversations by constructing sentences akin to humans. In this paper, we introduce a new answer construction and presentation system, which utilizes the linguistic structure of the source question and the factoid answer to construct an answer sentence which closely emulates a human-generated answer. We employ both Semantic Web technology and the linguistic structure to construct the answer sentences. The core of the research resides in extracting dependency subtree patterns from the questions and utilizing them in conjunction with the factoid answer to generate an answer sentence with a natural feel, akin to an answer from a human when asked the question. We evaluated the system for both linguistic accuracy and naturalness using human evaluation. These evaluation processes showed that the proposed approach is able to generate answer sentences which have linguistic accuracy and natural readability quotients of more than 70%. In addition, we also carried out a feasibility analysis on using automatic metrics for answer sentence evaluation. The results from this phase showed that there is not a strong correlation between the results from automatic metric evaluation and the human ratings of the machine-generated answers.
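The core idea of reusing the question's linguistic structure together with the factoid answer can be sketched with a toy wh-substitution. This string-level rewriting is a deliberately simple stand-in for the paper's dependency-subtree patterns.

```python
# Build an answer sentence by transplanting the factoid answer into the
# structure of the question itself.

def build_answer_sentence(question, answer):
    """Turn a wh-question plus a factoid answer into a full sentence."""
    words = question.rstrip("?").split()
    wh = words[0].lower()
    if wh in {"who", "what"}:
        # "Who wrote Hamlet?" -> "Shakespeare wrote Hamlet."
        return f"{answer} {' '.join(words[1:])}."
    if wh == "where" and len(words) >= 3 and words[1].lower() in {"is", "was"}:
        # "Where was Einstein born?" -> "Einstein was born in Ulm."
        parts = [words[2], words[1].lower()] + words[3:] + ["in", answer]
        return " ".join(parts) + "."
    return f"The answer is {answer}."  # fallback for unhandled structures

print(build_answer_sentence("Who wrote Hamlet?", "Shakespeare"))
# Shakespeare wrote Hamlet.
print(build_answer_sentence("Where was Einstein born?", "Ulm"))
# Einstein was born in Ulm.
```

Dependency subtrees generalize this far beyond two hand-written wh-templates, but the transformation direction (question structure + factoid → natural sentence) is the same.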

5 citations


Journal ArticleDOI
TL;DR: The findings suggest that the MHT process is robust, consistent and appropriately allocating a limited treatment resource, and survival following TAVI and SAVR was superior to medical therapy and similar to the age-matched general population.
Abstract: Background Transcatheter aortic valve implantation (TAVI) is an alternative and effective contemporary intervention to surgical aortic valve replacement (SAVR) for patients with severe aortic valve disease at increased surgical risk. Guidelines recommend a multidisciplinary “Heart Team” (MHT) review of patients considered for a TAVI procedure, but this has been little studied. We reviewed the characteristics, treatments and outcomes of such patients reviewed by the MHT at our centre. Methods Data on consecutive patients with severe aortic valve stenosis discussed by the Auckland City Hospital MHT from June 2011 to August 2016 were obtained from clinical records. Patient characteristics, treatment and outcomes were analysed using standard statistical methods. Results Over the 5-year period 243 patients (mean age 80.2 ± 8.0 years, 60% male) were presented at the MHT meeting. TAVI was recommended for 200, SAVR for 26 and medical therapy for 17 patients, with no significant difference in mean age (80.2 ± 8.3, 80.4 ± 6.1, 80.4 ± 7.3 years, respectively) or EuroSCORE II (6.5 ± 4.7%, 5.3 ± 3.6%, 6.7 ± 4.3%, respectively). Over time, there was an increase in the number of patients discussed and treated, with no change in their mean age, but the mean EuroSCORE II significantly decreased (TAVI p = 0.026, SAVR p = 0.004). Survival after TAVI and SAVR was similar to that of the age-matched general population, but superior to medical therapy (p = 0.002); survival was 93% (n = 162), 84% (n = 21) and 73% (n = 18) at one year and 85% (n = 149), 84% (n = 21) and 54% (n = 13) at 2 years, respectively. Conclusions An increasing number of patients were discussed at the MHT meeting, with the majority undergoing TAVI, and with a similar age and EuroSCORE II to those allocated SAVR or medical therapy. Survival following TAVI and SAVR was superior to medical therapy and similar to the age-matched general population. These findings suggest that the MHT process is robust, consistent and appropriately allocating a limited treatment resource.

5 citations


Proceedings ArticleDOI
01 Jan 2017
TL;DR: The aim of the RealText project is to build a scalable framework to transform Linked Data into natural language by generating lexicalization patterns for triples that are accurate and readable and exhibit the qualities of human-produced language.
Abstract: The consumption of Linked Data has dramatically increased with the growing momentum towards the Semantic Web. Linked Data is essentially a very simple format for the representation of knowledge, in that all knowledge is represented as triples which can be linked using one or more components of the triple. To date, most of the effort has been directed towards either creating Linked Data by mining the web or making it available to users as a knowledge base for knowledge engineering applications. In recent times there has been a growing need for these applications to interact with users in natural language, which requires the transformation of the Linked Data knowledge into natural language. The aim of the RealText project described in this paper is to build a scalable framework to transform Linked Data into natural language by generating lexicalization patterns for triples. A lexicalization pattern is a syntactic pattern that transforms a given triple into a syntactically correct natural language sentence. Using DBpedia as the Linked Data resource, we have generated 283 accurate lexicalization patterns for a sample set of 25 ontology classes. We performed human evaluation on a test sub-sample, with inter-rater agreements of 0.86 and 0.80 for readability and accuracy respectively. These results showed that the lexicalization patterns generated language that is accurate and readable and exhibits the qualities of human-produced language.
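Applying a lexicalization pattern to a triple can be sketched as follows. The pattern store and the DBpedia-style triples are illustrative assumptions; the project's actual patterns are mined per ontology class rather than hand-written.

```python
# A lexicalization pattern is a template with subject/object slots; applying
# it to a triple yields a syntactically correct sentence.

def lexicalize(triple, patterns):
    """Fill a predicate's pattern with the triple's subject and object."""
    subj, pred, obj = triple
    pattern = patterns.get(pred)
    if pattern is None:
        return None  # no pattern mined for this predicate
    return (pattern
            .replace("?s", subj.replace("_", " "))
            .replace("?o", obj.replace("_", " ")))

# Assumed pattern store: predicate -> lexicalization pattern.
patterns = {
    "birthPlace": "?s was born in ?o.",
    "author": "?s was written by ?o.",
}

print(lexicalize(("Hamlet", "author", "William_Shakespeare"), patterns))
# Hamlet was written by William Shakespeare.
```

The framework's contribution is in mining such templates automatically and at scale; applying them, as here, is the cheap final step.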

1 citations


Posted Content
TL;DR: This paper discusses eight different industries which currently reap the benefits of Semantic Web, offers a future outlook into Semantic Web applications, and discusses the areas in which Semantic Web would play a key role in the future.
Abstract: The next leap of the internet has already started as the Semantic Web. At its core, the Semantic Web transforms the document-oriented web into a data-oriented web enriched with semantics embedded as metadata. This change in perspective towards the web offers numerous benefits for the vast number of data-intensive industries that are bound to the web and its related applications. These industries are diverse, ranging from oil and gas exploration to investigative journalism, and everything in between. This paper discusses eight different industries which currently reap the benefits of the Semantic Web. The paper also offers a future outlook on Semantic Web applications and discusses the areas in which the Semantic Web would play a key role in the future.

1 citations


Book ChapterDOI
17 Apr 2017
TL;DR: This paper presents a framework that lexicalizes the Linked Data triples into natural language using an ensemble architecture comprised of four different pattern based modules which lexicalize triples by analysing the triple features.
Abstract: Linked Data has revamped the representation of knowledge by introducing the triple data structure, which can encode knowledge with the associated semantics, including context, by interlinking with external resources across documents. Although Linked Data is an attractive and effective mechanism to represent knowledge that is created and consumed by humans in the form of natural language, it still has a dimension of separation from natural language. Hence, in recent times, there has been an increased interest in transforming Linked Data into natural language in order to harness the benefits of Linked Data in applications interacting with natural language. This paper presents a framework that lexicalizes Linked Data triples into natural language using an ensemble architecture. The proposed architecture comprises four different pattern-based modules which lexicalize triples by analysing the triple features. The four pattern mining modules are based on occupational metonyms, Context Free Grammar (CFG), relation extraction using Open Information Extraction (OpenIE), and triple properties. The framework was evaluated using a two-fold evaluation process consisting of linguistic accuracy analysis and human evaluation on a test sample. The linguistic accuracy evaluation showed that the framework can produce 283 accurate lexicalization patterns for a set of 25 ontology classes, resulting in 70.75% accuracy, approximately a 91% increase over the existing state-of-the-art model.
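The ensemble dispatch described above can be sketched as follows. The module bodies here are stubs standing in for the paper's four modules (occupational metonyms, CFG, OpenIE relation extraction, and triple properties); only the dispatch-over-modules structure reflects the described architecture.

```python
# Try each pattern-based module in turn; the first module that can
# lexicalize the triple produces the sentence.

def metonym_module(triple):
    """Stub for the occupational-metonym module."""
    subj, pred, obj = triple
    if pred == "occupation":
        return f"{subj} is a {obj}."
    return None

def cfg_module(triple):
    """Stub for the Context Free Grammar module."""
    return None

def openie_module(triple):
    """Stub for the OpenIE relation-extraction module."""
    return None

def property_module(triple):
    """Stub fallback built on the triple's property name."""
    subj, pred, obj = triple
    return f"The {pred} of {subj} is {obj}."

ENSEMBLE = [metonym_module, cfg_module, openie_module, property_module]

def lexicalize(triple):
    for module in ENSEMBLE:
        sentence = module(triple)
        if sentence is not None:
            return sentence

print(lexicalize(("Marie Curie", "occupation", "physicist")))
# Marie Curie is a physicist.
print(lexicalize(("Berlin", "population", "3.6 million")))
# The population of Berlin is 3.6 million.
```

The ordering matters: specialized modules run first so that the generic property-based fallback only fires when no richer pattern applies.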