Institution

AT&T Labs

Company
About: AT&T Labs is a company. It is known for research contributions in the topics: Network packet & The Internet. The organization has 1879 authors who have published 5595 publications receiving 483151 citations.


Papers
Journal ArticleDOI
Lorrie Faith Cranor
01 Nov 2003
TL;DR: The World Wide Web Consortium's Platform for Privacy Preferences (P3P) lets Web sites convey their privacy policies in a computer-readable format and promises to make Web site privacy policies more accessible to users.
Abstract: The World Wide Web Consortium's Platform for Privacy Preferences (P3P) lets Web sites convey their privacy policies in a computer-readable format. Although not yet widely adopted, P3P promises to make Web site privacy policies more accessible to users.
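P3P policies are XML documents built from a small vocabulary (POLICY, STATEMENT, PURPOSE, RECIPIENT, RETENTION, DATA-GROUP). The sketch below is a hedged illustration of how a user agent might pull the declared purposes out of such a policy; the sample policy, namespace handling, and helper function are assumptions made for this example, not material from the paper.

```python
# Minimal sketch: extracting declared purposes from a (hypothetical) P3P policy.
import xml.etree.ElementTree as ET

SAMPLE_POLICY = """
<POLICY name="sample" xmlns="http://www.w3.org/2002/01/P3Pv1">
  <STATEMENT>
    <PURPOSE><current/><admin/></PURPOSE>
    <RECIPIENT><ours/></RECIPIENT>
    <RETENTION><stated-purpose/></RETENTION>
    <DATA-GROUP>
      <DATA ref="#user.home-info.online.email"/>
    </DATA-GROUP>
  </STATEMENT>
</POLICY>
"""

NS = {"p3p": "http://www.w3.org/2002/01/P3Pv1"}  # assumed P3P 1.0 namespace

def purposes(policy_xml):
    """Return the set of purpose names declared across all STATEMENT elements."""
    root = ET.fromstring(policy_xml)
    found = set()
    for stmt in root.findall("p3p:STATEMENT", NS):
        for purpose in stmt.findall("p3p:PURPOSE", NS):
            # strip the namespace prefix from each child tag, e.g. 'current'
            found.update(child.tag.split("}")[-1] for child in purpose)
    return found

print(purposes(SAMPLE_POLICY))  # e.g. {'current', 'admin'}
```

A user agent would compare purposes like these against the user's stated preferences before deciding whether to release data, which is the accessibility gain the abstract refers to.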

205 citations

Book ChapterDOI
14 Jun 1999
TL;DR: This paper presents a framework for computing activity deadlines so that the overall process deadline is met and all external time constraints are satisfied.
Abstract: Time management is a critical component of workflow-based process management. Important aspects of time management include planning of workflow process execution in time, estimating workflow execution duration, avoiding deadline violations, and satisfying all external time constraints such as fixed-date constraints and upper and lower bounds for time intervals between activities. In this paper, we present a framework for computing activity deadlines so that the overall process deadline is met and all external time constraints are satisfied.
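As a hedged illustration of the general idea (not the paper's actual framework), the sketch below propagates the overall process deadline backwards through a workflow graph to obtain a latest-finish deadline for each activity, optionally tightened by fixed-date upper-bound constraints. All names and the data layout are assumptions made for this example.

```python
# Sketch: backward deadline propagation over an acyclic workflow.
def activity_deadlines(duration, successors, process_deadline, upper_bounds=None):
    """duration: {activity: time units}
    successors: {activity: [follow-on activities]}
    process_deadline: latest allowed finish time of the whole process
    upper_bounds: optional {activity: fixed-date latest-finish constraint}
    Returns {activity: latest finish time}."""
    upper_bounds = upper_bounds or {}
    memo = {}

    def latest_finish(a):
        if a in memo:
            return memo[a]
        succ = successors.get(a, [])
        if not succ:
            lf = process_deadline
        else:
            # a must finish early enough for every successor to still
            # run for its full duration and meet its own deadline.
            lf = min(latest_finish(s) - duration[s] for s in succ)
        memo[a] = min(lf, upper_bounds.get(a, lf))
        return memo[a]

    return {a: latest_finish(a) for a in duration}

durations = {"A": 2, "B": 3, "C": 1}
successors = {"A": ["B", "C"], "B": [], "C": []}
print(activity_deadlines(durations, successors, process_deadline=10))
# {'A': 7, 'B': 10, 'C': 10}  -> A must finish by min(10-3, 10-1) = 7
```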

205 citations

Proceedings ArticleDOI
01 Aug 2000
TL;DR: [not recoverable; the extracted text is garbled by a font-encoding error in the source PDF]
Abstract: [not recoverable; the only legible fragment indicates the paper appears in the Proceedings of the International Conference on Knowledge Discovery and Data Mining, 2000]

205 citations

Journal ArticleDOI
TL;DR: This paper primarily focuses on the unsupervised scenario where the labeled source domain training data is accompanied by unlabeled target domain test data, and presents a two-stage data-driven approach by generating intermediate data representations that could provide relevant information on the domain shift.
Abstract: With unconstrained data acquisition scenarios widely prevalent, the ability to handle changes in data distribution across training and testing data sets becomes important. One way to approach this problem is through domain adaptation, and in this paper we primarily focus on the unsupervised scenario where the labeled source domain training data is accompanied by unlabeled target domain test data. We present a two-stage data-driven approach by generating intermediate data representations that could provide relevant information on the domain shift. Starting with a linear representation of domains in the form of generative subspaces of same dimensions for the source and target domains, we first utilize the underlying geometry of the space of these subspaces, the Grassmann manifold, to obtain a `shortest' geodesic path between the two domains. We then sample points along the geodesic to obtain intermediate cross-domain data representations, using which a discriminative classifier is learnt to estimate the labels of the target data. We subsequently incorporate non-linear representation of domains by considering a Reproducing Kernel Hilbert Space representation, and a low-dimensional manifold representation using Laplacian Eigenmaps, and also examine other domain adaptation settings such as (i) semi-supervised adaptation where the target domain is partially labeled, and (ii) multi-domain adaptation where there could be more than one domain in source and/or target data sets. Finally, we supplement our adaptation technique with (i) fine-grained reference domains that are created by blending samples from source and target data sets to provide some evidence on the actual domain shift, and (ii) a multi-class boosting analysis to obtain robustness to the choice of algorithm parameters. We evaluate our approach for object recognition problems and report competitive results on two widely used Office and Bing adaptation data sets.
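To make the geodesic-sampling step concrete, here is a small NumPy sketch that computes intermediate subspaces between a source and a target subspace on the Grassmann manifold via principal angles. It is a simplified illustration of the idea described above, assuming non-degenerate principal angles, and is not the authors' implementation.

```python
import numpy as np

def grassmann_geodesic(Ps, Pt, steps=5):
    """Sample bases along the geodesic between span(Ps) and span(Pt).

    Ps, Pt: (D, d) matrices with orthonormal columns (source/target subspaces).
    Returns a list of (D, d) orthonormal bases from t=0 (source) to t=1 (target).
    """
    U, cos_theta, Vt = np.linalg.svd(Ps.T @ Pt)
    V = Vt.T
    theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))      # principal angles
    sin_theta = np.maximum(np.sin(theta), 1e-12)          # guard zero angles
    # Direction of motion orthogonal to the source subspace.
    H = (Pt @ V - Ps @ U @ np.diag(cos_theta)) @ np.diag(1.0 / sin_theta)
    bases = []
    for t in np.linspace(0.0, 1.0, steps):
        Phi = Ps @ U @ np.diag(np.cos(t * theta)) + H @ np.diag(np.sin(t * theta))
        Q, _ = np.linalg.qr(Phi)                           # re-orthonormalise
        bases.append(Q)
    return bases
```

Each intermediate basis can then be used to project both source and target features, with a discriminative classifier trained on the stacked projections, mirroring the two-stage approach sketched in the abstract.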

205 citations

Book ChapterDOI
01 Jan 2010
TL;DR: In this chapter, the basic components of GRASP are described and implementation strategies of memory-based intensification and post-optimization techniques using path-relinking are discussed.
Abstract: GRASP is a multi-start metaheuristic for combinatorial optimization problems, in which each iteration consists basically of two phases: construction and local search. The construction phase builds a feasible solution, whose neighborhood is investigated until a local minimum is found during the local search phase. The best overall solution is kept as the result. In this chapter, we first describe the basic components of GRASP. Successful implementation techniques are discussed and illustrated by numerical results obtained for different applications. Enhanced or alternative solution construction mechanisms and techniques to speed up the search are also described: alternative randomized greedy construction schemes, Reactive GRASP, cost perturbations, bias functions, memory and learning, local search on partially constructed solutions, hashing, and filtering. We also discuss implementation strategies of memory-based intensification and post-optimization techniques using path-relinking. Hybridizations with other metaheuristics, parallelization strategies, and applications are also reviewed.
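The construction/local-search loop described above can be written down compactly. The skeleton below is a generic, hedged sketch of GRASP with a value-based restricted candidate list and first-improvement local search; the callback names (candidates, cost, neighbors) and the alpha parameter are choices made for this example, not definitions from the chapter.

```python
import random

def grasp(candidates, cost, neighbors, alpha=0.3, iterations=50, seed=0):
    """Minimal GRASP skeleton for a minimisation problem.

    candidates(partial) -> elements that can extend a partial solution
    cost(solution)      -> objective value to minimise
    neighbors(solution) -> neighbouring complete solutions
    alpha               -> greediness/randomness trade-off for the RCL
    """
    rng = random.Random(seed)
    best = None
    for _ in range(iterations):
        # Construction phase: greedy randomized.
        sol = []
        while True:
            cand = list(candidates(sol))
            if not cand:
                break
            ranked = sorted(cand, key=lambda e: cost(sol + [e]))
            cutoff = max(1, int(len(ranked) * alpha))
            sol.append(rng.choice(ranked[:cutoff]))   # pick from the RCL
        # Local search phase: first-improvement descent to a local minimum.
        improved = True
        while improved:
            improved = False
            for nb in neighbors(sol):
                if cost(nb) < cost(sol):
                    sol, improved = nb, True
                    break
        if best is None or cost(sol) < cost(best):
            best = sol
    return best
```

The enhancements surveyed in the chapter (Reactive GRASP, cost perturbations, memory, path-relinking) would slot in around this loop, e.g. by adapting alpha across iterations or relinking each local minimum with an elite set before updating the incumbent.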

204 citations


Authors

Showing all 1881 results

Name                   H-index   Papers   Citations
Yoshua Bengio          202       1033     420313
Scott Shenker          150       454      118017
Paul Shala Henry       137       318      35971
Peter Stone            130       1229     79713
Yann LeCun             121       369      171211
Louis E. Brus          113       347      63052
Jennifer Rexford       102       394      45277
Andreas F. Molisch     96        777      47530
Vern Paxson            93        267      48382
Lorrie Faith Cranor    92        326      28728
Ward Whitt             89        424      29938
Lawrence R. Rabiner    88        378      70445
Thomas E. Graedel      86        348      27860
William W. Cohen       85        384      31495
Michael K. Reiter      84        380      30267
Network Information
Related Institutions (5)
Microsoft
86.9K papers, 4.1M citations

94% related

Google
39.8K papers, 2.1M citations

91% related

Hewlett-Packard
59.8K papers, 1.4M citations

89% related

Bell Labs
59.8K papers, 3.1M citations

88% related

Performance Metrics
No. of papers from the Institution in previous years
Year    Papers
2022    5
2021    33
2020    69
2019    71
2018    100
2017    91