
How to compare two dates in Robot Framework? 
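None of the retrieved insights below addresses the question directly, so for reference: Robot Framework's standard DateTime library provides keywords such as `Convert Date` and `Subtract Date From Date` that can be combined with the BuiltIn keyword `Should Be True` to compare two dates. A minimal sketch (the dates and test-case name are illustrative placeholders):

```robotframework
*** Settings ***
Library    DateTime

*** Test Cases ***
Compare Two Dates
    # Convert both dates to epoch seconds so they compare numerically.
    ${date1}=    Convert Date    2024-01-15    result_format=epoch
    ${date2}=    Convert Date    2024-03-01    result_format=epoch
    Should Be True    ${date1} < ${date2}

    # Alternatively, subtract one date from the other; the result is
    # the difference in seconds, positive when the first date is later.
    ${diff}=    Subtract Date From Date    2024-03-01    2024-01-15
    Should Be True    ${diff} > 0
```

Comparing via `result_format=epoch` keeps the comparison a plain numeric check; `Subtract Date From Date` is the more idiomatic choice when the magnitude of the difference also matters.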

Answers from top 10 papers

The minimum optical dates obtained are geomorphologically and stratigraphically consistent with each other, suggesting that the dates are reliable.
We prove that the optimal robot scheduling of two-cluster tools can be solved in polynomial time.
Our method enables us to compare SLAM approaches that use different estimation techniques or different sensor modalities since all computations are made based on the corrected trajectory of the robot.
Accordingly, agreement between dates from the two techniques (or lack thereof) provides a stringent test of their reliability.
This article argues that International Relations scholars need to question the ways in which these orthodox dates serve as internal and external points of reference, think more critically about how benchmark dates are established, and generate a revised set of benchmark dates that better reflects macro-historical international dynamics.
Proceedings Article (DOI)
Gary Anderson, Gang Yang 
01 Oct 2007
19 Citations
These combined measures can be used as a yardstick by engineers in determining how to design a robot-environment system.
The framework presented in this paper enables the robot team to cooperatively fulfill tasks given as temporal logic specifications while explicitly considering uncertainty and incorporating observations during execution.
Open access Proceedings Article (DOI)
Sangseok You, Lionel P. Robert 
26 Feb 2018
85 Citations
Results showed that human–robot similarity promoted trust in a robot, which led to willingness to work with robots and ultimately willingness to work with a robot over a human co-worker.
Open access Proceedings Article (DOI)
10 Apr 2007
257 Citations
In addition, our framework explicitly captures sensor specifications that depend on the environment with which the robot is interacting, resulting in a novel paradigm for sensor-based temporal logic motion planning.