Topic

Architecture

About: Architecture is a research topic. Over its lifetime, 25,849 publications have been published within this topic, receiving 225,266 citations.


Papers
Journal ArticleDOI
Igal Charney
01 Jun 2007 - Area
TL;DR: Stressing the significance of high-quality design and iconic architecture helped to wear down deep-rooted antagonism and to channel the debate toward improving the aesthetic qualities of London, a goal that enjoys wide consensus.
Abstract: After 2000 a handful of very tall buildings were approved in central London, a circumstance that challenged well-established planning practices in that part of the city. Their promotion by Ken Livingstone, the mayor, but opposition to them by conservation groups, seemed to signal a fierce campaign ahead; in fact, it was all over in an instant. This article examines how this debate was framed to dismiss the arguments and concerns of those who oppose tall buildings. To make tall buildings acceptable, London's mayor drew on the merits associated with iconic architecture and high-profile architects. Under Livingstone's incumbency, tall buildings were affirmed by the expertise and clout of global architects who provided legitimacy for mayoral ambitions to reach for the sky. Stressing the significance of high-quality design and iconic architecture helped to wear down deep-rooted antagonism and to channel the debate toward improving the aesthetic qualities of London, a goal that enjoys wide consensus.

57 citations

Journal ArticleDOI
TL;DR: A dizzying number of “X-former” models have recently been proposed that improve upon the original Transformer architecture, many of which make improvements around computational and memory efficiency.
Abstract: Transformer model architectures have garnered immense interest lately due to their effectiveness across a range of domains like language, vision, and reinforcement learning. In the field of natural language processing, for example, Transformers have become an indispensable staple in the modern deep learning stack. Recently, a dizzying number of “X-former” models have been proposed—Reformer, Linformer, Performer, Longformer, to name a few—which improve upon the original Transformer architecture, many of which make improvements around computational and memory efficiency. With the aim of helping the avid researcher navigate this flurry, this article characterizes a large and thoughtful selection of recent efficiency-flavored “X-former” models, providing an organized and comprehensive overview of existing work and models across multiple domains.

57 citations
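The survey above frames computational and memory efficiency as the common thread among X-formers. Purely as a loose illustration (not code from the survey, and all names below are hypothetical), here is a minimal Python sketch of one representative efficiency idea, a Linformer-style low-rank projection that compresses keys and values along the sequence axis, so the attention matrix scales with a fixed projected length rather than the full sequence length:

```python
import torch

def low_rank_attention(q, k, v, e_k, e_v):
    """Linformer-style attention sketch (illustrative, not the survey's code).

    q, k, v: (batch, n, d) queries, keys, values for sequence length n.
    e_k, e_v: (k_proj, n) projections that compress the sequence axis,
    so the score matrix is (batch, n, k_proj) instead of (batch, n, n).
    """
    k = torch.matmul(e_k, k)                                # (batch, k_proj, d)
    v = torch.matmul(e_v, v)                                # (batch, k_proj, d)
    scores = q @ k.transpose(-2, -1) / q.shape[-1] ** 0.5   # (batch, n, k_proj)
    return torch.softmax(scores, dim=-1) @ v                # (batch, n, d)

# Example: 1024 tokens compressed to 64 projected positions.
q = k = v = torch.randn(2, 1024, 32)
e = torch.randn(64, 1024)
out = low_rank_attention(q, k, v, e, e)                     # (2, 1024, 32)
```

With the projection, memory for the score matrix grows linearly in sequence length instead of quadratically, which is the kind of trade-off many of the surveyed models make.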

Journal ArticleDOI
TL;DR: The difference between the artistic and literary fields and universes such as architecture, which are usually recognized as "art professions" but enjoy a far lesser degree of autonomy, seemingly constitutes an obstacle to the broader application of the notion of a "field of cultural production" sought by Bourdieu in his Rules of Art.
Abstract: The difference between ‘artistic and literary fields’ and universes such as architecture, usually recognized as ‘art professions’ but enjoying a far lesser ‘degree of autonomy’ than such fields, seemingly constitutes an obstacle to the broader application of the notion of a ‘field of cultural production’ sought by Bourdieu in his Rules of Art. The author of this paper overcomes this obstacle by employing his notion of the ‘field effect’, with the architecture competition serving as the test case. Following Bourdieu, the author replaces the notion of profession with that of the field, for the former is a representation fostered by professional groups themselves. Architecture is a field, but, because architects require clients to construct and realize their works, one unlike the artistic and literary fields, which are markets of symbolic goods where ‘disinterest’ reigns and an autonomy unthinkable elsewhere is enjoyed. However, much like artists and unlike any other ‘professionals’, architects enter competitions…

57 citations

Posted Content
18 Apr 2020
TL;DR: This work proposes a Federated NAS (FedNAS) algorithm to help scattered workers collaboratively search for a better architecture with higher accuracy, and shows that the architecture found by FedNAS can outperform the manually predefined architecture.
Abstract: Federated Learning (FL) has proven to be an effective learning framework when data cannot be centralized due to privacy, communication costs, and regulatory restrictions. When training deep learning models under an FL setting, people employ the predefined model architecture discovered in the centralized environment. However, this predefined architecture may not be the optimal choice, because it may not fit data that are not independent and identically distributed (non-IID). Thus, we advocate automating federated learning (AutoFL) to improve model accuracy and reduce the manual design effort. We specifically study AutoFL via Neural Architecture Search (NAS), which can automate the design process. We propose a Federated NAS (FedNAS) algorithm to help scattered workers collaboratively search for a better architecture with higher accuracy. We also build a system based on FedNAS. Our experiments on a non-IID dataset show that the architecture searched by FedNAS can outperform the manually predefined architecture.

56 citations
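The FedNAS abstract describes the search only at a high level. Purely as a hedged sketch, assuming a DARTS-style setup in which each worker locally updates both network weights and architecture parameters and a server averages both FedAvg-style (the `local_search` interface below is hypothetical, not the paper's API), one communication round might look like:

```python
import copy
import torch

def fednas_round(global_model, global_alphas, workers):
    """One communication round of a FedNAS-like search (illustrative sketch).

    Assumes each worker exposes local_search(model, alphas) -> (model, alphas),
    which runs a few local epochs on the worker's private, non-IID data,
    updating both network weights and DARTS-style architecture parameters.
    """
    states, alpha_list = [], []
    for worker in workers:
        m, a = worker.local_search(copy.deepcopy(global_model),
                                   global_alphas.clone())
        states.append(m.state_dict())
        alpha_list.append(a)
    # FedAvg-style aggregation: average both weights and architecture parameters.
    avg_state = {key: torch.stack([s[key].float() for s in states]).mean(dim=0)
                 for key in states[0]}
    global_model.load_state_dict(avg_state)
    return global_model, torch.stack(alpha_list).mean(dim=0)
```

Under these assumptions, after enough rounds the operations with the largest averaged mixing weights would be selected and retrained, mirroring the abstract's claim that the searched architecture can outperform a manually predefined one.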


Network Information
Related Topics (5)
The Internet: 213.2K papers, 3.8M citations (81% related)
Wireless sensor network: 142K papers, 2.4M citations (81% related)
Energy consumption: 101.9K papers, 1.6M citations (80% related)
Software: 130.5K papers, 2M citations (80% related)
Cloud computing: 156.4K papers, 1.9M citations (79% related)
Performance
Metrics
No. of papers in the topic in previous years
Year: Papers
2024: 4
2023: 5,088
2022: 11,536
2021: 845
2020: 1,174
2019: 1,226