scispace - formally typeset
Author

Makoto Okazaki

Bio: Makoto Okazaki is an academic researcher from the University of Tokyo. The author has contributed to research in topics: Absorption spectroscopy & Extended X-ray absorption fine structure. The author has an h-index of 14 and has co-authored 27 publications receiving 4,914 citations.

Papers
Proceedings ArticleDOI
26 Apr 2010
TL;DR: This paper investigates the real-time interaction of events such as earthquakes in Twitter and proposes an algorithm to monitor tweets and to detect a target event and produces a probabilistic spatiotemporal model for the target event that can find the center and the trajectory of the event location.
Abstract: Twitter, a popular microblogging service, has received much attention recently. An important characteristic of Twitter is its real-time nature. For example, when an earthquake occurs, people make many Twitter posts (tweets) related to the earthquake, which enables detection of earthquake occurrence promptly, simply by observing the tweets. As described in this paper, we investigate the real-time interaction of events such as earthquakes in Twitter and propose an algorithm to monitor tweets and to detect a target event. To detect a target event, we devise a classifier of tweets based on features such as the keywords in a tweet, the number of words, and their context. Subsequently, we produce a probabilistic spatiotemporal model for the target event that can find the center and the trajectory of the event location. We consider each Twitter user as a sensor and apply Kalman filtering and particle filtering, which are widely used for location estimation in ubiquitous/pervasive computing. The particle filter works better than other comparable methods for estimating the centers of earthquakes and the trajectories of typhoons. As an application, we construct an earthquake reporting system in Japan. Because of the numerous earthquakes and the large number of Twitter users throughout the country, we can detect an earthquake with high probability (96% of earthquakes of Japan Meteorological Agency (JMA) seismic intensity scale 3 or more are detected) merely by monitoring tweets. Our system detects earthquakes promptly and sends e-mails to registered users. Notification is delivered much faster than the announcements that are broadcast by the JMA.
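The user-as-sensor idea above can be sketched compactly: treat each geo-tagged tweet as a noisy sensor reading of the event location and run a particle filter over the reports. This is a minimal illustration, not the paper's implementation; the search region, noise level, and jitter scale are assumed values.

```python
import math
import random

random.seed(0)

def estimate_event_center(observations, n_particles=1000, sigma=0.5):
    """Estimate a static 2-D event location (e.g. an epicenter) from noisy
    (x, y) reports, treating each report as one sensor reading."""
    # Start with particles spread uniformly over a plausible region.
    particles = [(random.uniform(-10, 10), random.uniform(-10, 10))
                 for _ in range(n_particles)]
    for ox, oy in observations:
        # Weight each particle by a Gaussian likelihood of the observation.
        weights = [math.exp(-((px - ox) ** 2 + (py - oy) ** 2)
                            / (2 * sigma ** 2)) + 1e-12
                   for px, py in particles]
        # Resample in proportion to weight, then jitter to keep diversity.
        particles = random.choices(particles, weights=weights, k=n_particles)
        particles = [(px + random.gauss(0, 0.1), py + random.gauss(0, 0.1))
                     for px, py in particles]
    cx = sum(p[0] for p in particles) / n_particles
    cy = sum(p[1] for p in particles) / n_particles
    return cx, cy

# Fifty noisy reports scattered around a true center at (3, -2).
reports = [(3 + random.gauss(0, 0.5), -2 + random.gauss(0, 0.5))
           for _ in range(50)]
center = estimate_event_center(reports)
```

The posterior collapses onto the cluster of reports, so the particle mean lands near the true center even though no single report is accurate.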

3,976 citations

Journal ArticleDOI
TL;DR: An earthquake reporting system for use in Japan is developed and an algorithm to monitor tweets and to detect a target event is proposed, which produces a probabilistic spatiotemporal model for the target event that can find the center of the event location.
Abstract: Twitter has received much attention recently. An important characteristic of Twitter is its real-time nature. We investigate the real-time interaction of events such as earthquakes in Twitter and propose an algorithm to monitor tweets and to detect a target event. To detect a target event, we devise a classifier of tweets based on features such as the keywords in a tweet, the number of words, and their context. Subsequently, we produce a probabilistic spatiotemporal model for the target event that can find the center of the event location. We regard each Twitter user as a sensor and apply particle filtering, which is widely used for location estimation. The particle filter works better than other comparable methods for estimating the locations of target events. As an application, we develop an earthquake reporting system for use in Japan. Because of the numerous earthquakes and the large number of Twitter users throughout the country, we can detect an earthquake with high probability (93 percent of earthquakes of Japan Meteorological Agency (JMA) seismic intensity scale 3 or more are detected) merely by monitoring tweets. Our system detects earthquakes promptly and notification is delivered much faster than JMA broadcast announcements.

483 citations

Journal ArticleDOI
TL;DR: In this article, the electron localization in the Anderson model of disordered two-dimensional square lattices is investigated numerically in the case of large systems composed of 10^4 sites, where the Anderson transition at the band center and the localization near the mobility edge are recognized.
Abstract: The electron localization is investigated numerically in the Anderson model of disordered two-dimensional square lattices. The spatial behaviors of wavefunctions are examined for large systems composed of 10^4 sites. Computations are carried out to observe the Anderson transition at the band center and the localization near the mobility edge. A sharp transition at the band center is recognized and the electron localization is clearly visualized. The spatial decay rate of the localized wavefunction is evaluated in the localized region, and its dependence on energy or the degree of disorder is determined.
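The kind of numerical experiment described, diagonalizing a disordered two-dimensional tight-binding Hamiltonian and inspecting how localized the band-center state is, can be sketched as follows. This is a toy 10×10 lattice rather than the paper's 10^4 sites, and the inverse participation ratio used as the localization diagnostic is a standard measure, not necessarily the paper's.

```python
import numpy as np

def band_center_ipr(L=10, W=1.0, seed=1):
    """Inverse participation ratio (IPR) of the eigenstate nearest the band
    center of a 2-D Anderson model on an L x L square lattice.
    IPR ~ 1/N for an extended state and grows toward O(1) on localization."""
    rng = np.random.default_rng(seed)
    N = L * L
    H = np.zeros((N, N))
    # Diagonal disorder: on-site energies uniform in [-W/2, W/2].
    H[np.arange(N), np.arange(N)] = rng.uniform(-W / 2, W / 2, N)
    # Nearest-neighbour hopping t = 1, open boundary conditions.
    for x in range(L):
        for y in range(L):
            i = x * L + y
            if x + 1 < L:                       # neighbour (x+1, y)
                H[i, i + L] = H[i + L, i] = 1.0
            if y + 1 < L:                       # neighbour (x, y+1)
                H[i, i + 1] = H[i + 1, i] = 1.0
    vals, vecs = np.linalg.eigh(H)
    psi = vecs[:, np.argmin(np.abs(vals))]      # state closest to E = 0
    return float(np.sum(psi ** 4))

# Stronger disorder localizes the band-center state (larger IPR).
weak, strong = band_center_ipr(W=1.0), band_center_ipr(W=20.0)
```

Increasing W from 1 to 20 should raise the IPR markedly, mirroring the transition from an extended to a localized band-center wavefunction.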

101 citations

Journal ArticleDOI
TL;DR: In this article, a relativistic generalization of the Green's function method for the energy-band calculation is presented, where the wave function within the atomic spheres is expanded in terms of four-component spherical waves.
Abstract: A relativistic generalization of the Green's function method for the energy-band calculation is presented. The wave function within the atomic spheres is expanded in terms of four-component spherical waves. The resulting expression which gives the relationship between E and k is very similar to the nonrelativistic one. Matrix elements between the spherical waves can be easily computed provided structure constants used in the nonrelativistic calculations are available.

100 citations

Journal ArticleDOI
TL;DR: In this article, the problem of coexistence of the local and band aspects in the fundamental absorption spectra (or the impurity-induced infrared absorption of lattice vibrations) is formulated with the use of the Green's function method.
Abstract: The problem of coexistence of the local and band aspects in the fundamental absorption spectra (or the impurity-induced infrared absorption of lattice vibrations) is formulated with the use of the Green's function method. By a suitable decomposition of the Hamiltonian of the electron-hole relative motion (or the dynamical matrix for the lattice vibrations), one can derive a line-shape expression in which both aspects coexist: the metastable excitons (or the quasi-local modes) on the one hand and the Van Hove singularities on the other. Their interference results in the antiresonance of the quasi-local states and the metamorphism of the Van Hove singularities.

86 citations


Cited by
Proceedings ArticleDOI
28 Mar 2011
TL;DR: There are measurable differences in the way messages propagate, that can be used to classify them automatically as credible or not credible, with precision and recall in the range of 70% to 80%.
Abstract: We analyze the information credibility of news propagated through Twitter, a popular microblogging service. Previous research has shown that most of the messages posted on Twitter are truthful, but the service is also used to spread misinformation and false rumors, often unintentionally. In this paper we focus on automatic methods for assessing the credibility of a given set of tweets. Specifically, we analyze microblog postings related to "trending" topics, and classify them as credible or not credible, based on features extracted from them. We use features from users' posting and re-posting ("re-tweeting") behavior, from the text of the posts, and from citations to external sources. We evaluate our methods using a significant number of human assessments about the credibility of items on a recent sample of Twitter postings. Our results show that there are measurable differences in the way messages propagate, which can be used to classify them automatically as credible or not credible, with precision and recall in the range of 70% to 80%.
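The feature-based pipeline described can be sketched end to end: extract signals from a tweet, predict a credibility label, then score predictions with the precision and recall the paper reports. The features and the two-of-three rule below are hypothetical stand-ins, loosely inspired by the paper's feature classes (text content, citation of external sources, propagation behaviour), not its actual classifier.

```python
def credibility_features(tweet):
    """Hypothetical boolean signals for a tweet dict with 'text', 'retweets'."""
    return [
        "http" in tweet["text"],            # cites an external source
        len(tweet["text"].split()) > 8,     # substantive, not a bare claim
        tweet["retweets"] > 10,             # widely propagated
    ]

def predict_credible(tweet):
    # Toy rule: call a tweet credible when at least two signals fire.
    return sum(credibility_features(tweet)) >= 2

def precision_recall(labels, preds):
    """Standard precision/recall over boolean labels and predictions."""
    tp = sum(1 for t, p in zip(labels, preds) if t and p)
    fp = sum(1 for t, p in zip(labels, preds) if not t and p)
    fn = sum(1 for t, p in zip(labels, preds) if t and not p)
    return tp / (tp + fp), tp / (tp + fn)

tweets = [
    {"text": "Official report with details and source http://ex.org/a b c d e",
     "retweets": 40},
    {"text": "OMG apparently aliens landed!!!", "retweets": 2},
    {"text": "Breaking: confirmed by three agencies, see http://ex.org/b today now",
     "retweets": 25},
    {"text": "trust me bro it happened", "retweets": 50},
]
labels = [True, False, True, False]   # invented ground truth
preds = [predict_credible(t) for t in tweets]
p, r = precision_recall(labels, preds)
```

On this tiny invented set the rule separates the classes perfectly; on real data a learned classifier over many such features yields the 70–80% figures quoted above.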

2,123 citations

Proceedings ArticleDOI
26 Oct 2010
TL;DR: A probabilistic framework for estimating a Twitter user's city-level location based purely on the content of the user's tweets, which can overcome the sparsity of geo-enabled features in these services and enable new location-based personalized information services, the targeting of regional advertisements, and so on.
Abstract: We propose and evaluate a probabilistic framework for estimating a Twitter user's city-level location based purely on the content of the user's tweets, even in the absence of any other geospatial cues. By augmenting the massive human-powered sensing capabilities of Twitter and related microblogging services with content-derived location information, this framework can overcome the sparsity of geo-enabled features in these services and enable new location-based personalized information services, the targeting of regional advertisements, and so on. Three of the key features of the proposed approach are: (i) its reliance purely on tweet content, meaning no need for user IP information, private login information, or external knowledge bases; (ii) a classification component for automatically identifying words in tweets with a strong local geo-scope; and (iii) a lattice-based neighborhood smoothing model for refining a user's location estimate. The system estimates k possible locations for each user in descending order of confidence. On average we find that the location estimates converge quickly (needing just 100s of tweets), placing 51% of Twitter users within 100 miles of their actual location.
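A minimal content-only estimator in the spirit described is a naive Bayes ranking of cities by smoothed word likelihoods. Laplace smoothing here stands in for the paper's lattice-based neighborhood smoothing; the corpus and cities are invented for illustration.

```python
import math
from collections import Counter, defaultdict

def train(docs):
    """docs: list of (city, word_list). Returns per-city word and total counts."""
    counts, totals = defaultdict(Counter), Counter()
    for city, words in docs:
        counts[city].update(words)
        totals[city] += len(words)
    return counts, totals

def rank_cities(counts, totals, words, alpha=1.0):
    """Rank cities by log-likelihood of the words under a smoothed unigram model
    (descending confidence, like the k-best list the paper describes)."""
    vocab = {w for c in counts.values() for w in c}
    scores = {}
    for city in counts:
        s = 0.0
        for w in words:
            p = (counts[city][w] + alpha) / (totals[city] + alpha * len(vocab))
            s += math.log(p)
        scores[city] = s
    return sorted(scores, key=scores.get, reverse=True)

data = [
    ("chicago", "bears deep dish loop".split()),
    ("chicago", "the loop el train bears".split()),
    ("houston", "rockets rodeo astros heat".split()),
    ("houston", "astros humidity rodeo".split()),
]
counts, totals = train(data)
ranking = rank_cities(counts, totals, "bears loop".split())
```

Words with a strong local geo-scope ("bears", "loop") dominate the likelihood, which is exactly why the paper's classifier for such words matters.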

1,213 citations

Journal ArticleDOI
04 May 2011-PLOS ONE
TL;DR: The use of information embedded in the Twitter stream is examined to (1) track rapidly-evolving public sentiment with respect to H1N1 or swine flu, and (2) track and measure actual disease activity.
Abstract: Twitter is a free social networking and micro-blogging service that enables its millions of users to send and read each other's “tweets,” or short, 140-character messages. The service has more than 190 million registered users and processes about 55 million tweets per day. Useful information about news and geopolitical events lies embedded in the Twitter stream, which embodies, in the aggregate, Twitter users' perspectives and reactions to current events. By virtue of sheer volume, content embedded in the Twitter stream may be useful for tracking or even forecasting behavior if it can be extracted in an efficient manner. In this study, we examine the use of information embedded in the Twitter stream to (1) track rapidly-evolving public sentiment with respect to H1N1 or swine flu, and (2) track and measure actual disease activity. We also show that Twitter can be used as a measure of public interest or concern about health-related events. Our results show that estimates of influenza-like illness derived from Twitter chatter accurately track reported disease levels.
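At its core, tracking disease activity from the stream reduces to counting flu-related tweets per time window and correlating that signal with reported case counts. The keywords, tweets, and clinic counts below are invented; real pipelines also need the sentiment classification the study describes.

```python
import math

def keyword_fraction(weekly_tweets, keywords=("flu", "fever", "h1n1")):
    """Per-week fraction of tweets mentioning any flu-related keyword."""
    out = []
    for tweets in weekly_tweets:
        hits = sum(any(k in t.lower() for k in keywords) for t in tweets)
        out.append(hits / len(tweets))
    return out

def pearson(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

weeks = [
    ["nice weather today", "got the flu :(", "match tonight"],
    ["flu everywhere", "fever and chills", "stay home"],
    ["feeling better", "back at work", "coffee time"],
]
reported_ili = [12, 40, 5]   # hypothetical clinic counts per week
r = pearson(keyword_fraction(weeks), reported_ili)
```

A high correlation between the tweet-derived signal and the reported counts is the sense in which "Twitter chatter accurately tracks reported disease levels."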

1,195 citations

Book ChapterDOI
18 Apr 2011
TL;DR: This paper empirically compare the content of Twitter with a traditional news medium, New York Times, using unsupervised topic modeling, and finds interesting and useful findings for downstream IR or DM applications.
Abstract: Twitter as a new form of social media can potentially contain much useful information, but content analysis on Twitter has not been well studied. In particular, it is not clear whether, as an information source, Twitter can simply be regarded as a faster news feed that covers mostly the same information as traditional news media. In this paper we empirically compare the content of Twitter with a traditional news medium, the New York Times, using unsupervised topic modeling. We use a Twitter-LDA model to discover topics from a representative sample of the entire Twitter stream. We then use text mining techniques to compare these Twitter topics with topics from the New York Times, taking into consideration topic categories and types. We also study the relation between the proportions of opinionated tweets and retweets and topic categories and types. Our comparisons show interesting and useful findings for downstream IR or DM applications.
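The topic discovery step can be illustrated with a minimal collapsed Gibbs sampler for plain LDA (the paper uses a Twitter-LDA variant tailored to short posts; this sketch shows only the standard model it builds on, with invented toy documents).

```python
import random

random.seed(7)

def lda_gibbs(docs, K=2, iters=200, alpha=0.1, beta=0.01):
    """Minimal collapsed Gibbs sampler for plain LDA.
    docs: list of tokenized documents (lists of words)."""
    vocab = sorted({w for d in docs for w in d})
    V = len(vocab)
    wid = {w: i for i, w in enumerate(vocab)}
    ndk = [[0] * K for _ in docs]       # document-topic counts
    nkw = [[0] * V for _ in range(K)]   # topic-word counts
    nk = [0] * K                        # words assigned to each topic
    z = []                              # topic assignment of every token
    for d, doc in enumerate(docs):      # random initialization
        zd = []
        for w in doc:
            k = random.randrange(K)
            zd.append(k)
            ndk[d][k] += 1
            nkw[k][wid[w]] += 1
            nk[k] += 1
        z.append(zd)
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for n, w in enumerate(doc):
                k = z[d][n]             # remove the token's assignment
                ndk[d][k] -= 1
                nkw[k][wid[w]] -= 1
                nk[k] -= 1
                # Full conditional p(z = j | all other assignments).
                probs = [(ndk[d][j] + alpha) * (nkw[j][wid[w]] + beta)
                         / (nk[j] + V * beta) for j in range(K)]
                r = random.random() * sum(probs)
                k = 0
                while k < K - 1 and r > probs[k]:
                    r -= probs[k]
                    k += 1
                z[d][n] = k             # record the new assignment
                ndk[d][k] += 1
                nkw[k][wid[w]] += 1
                nk[k] += 1
    # Most probable word of each topic.
    top = [vocab[max(range(V), key=lambda i, k=k: nkw[k][i])] for k in range(K)]
    return top, ndk

docs = [
    "earthquake tremor magnitude epicenter shaking".split(),
    "tremor earthquake shaking epicenter aftershock".split(),
    "election vote ballot candidate debate".split(),
    "candidate vote election debate ballot".split(),
]
top_words, doc_topic = lda_gibbs(docs, K=2)
```

Comparing the resulting topic-word distributions across two corpora (tweets vs. news articles) is the comparison mechanism the paper builds on.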

1,193 citations

Journal ArticleDOI
TL;DR: This article deconstructs the ideological grounds of datafication, an ideology rooted in problematic ontological and epistemological claims that, as part of a larger social media logic, shows characteristics of a widespread secular belief.
Abstract: Metadata and data have become a regular currency for citizens to pay for their communication services and security—a trade-off that has nestled into the comfort zone of most people. This article deconstructs the ideological grounds of datafication. Datafication is rooted in problematic ontological and epistemological claims. As part of a larger social media logic, it shows characteristics of a widespread secular belief. Dataism, as this conviction is called, is so successful because masses of people — naively or unwittingly — trust their personal information to corporate platforms. The notion of trust becomes more problematic because people’s faith is extended to other public institutions (e.g. academic research and law enforcement) that handle their (meta)data. The interlocking of government, business, and academia in the adaptation of this ideology makes us want to look more critically at the entire ecosystem of connective media.

1,076 citations