Simulation of Tactile Sensing Arrays for Physical Interaction Tasks


HAL Id: hal-02925450
https://hal.uca.fr/hal-02925450
Submitted on 29 Aug 2020
Simulation of Tactile Sensing Arrays for Physical
Interaction Tasks
Zhanat Kappassov, Juan Antonio Corrales Ramon, Véronique Perdereau
To cite this version:
Zhanat Kappassov, Juan Antonio Corrales Ramon, Véronique Perdereau. Simulation of Tactile Sensing Arrays for Physical Interaction Tasks. 2020 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), Jul 2020, Boston, USA. pp. 196-201, 10.1109/AIM43001.2020.9158822. hal-02925450

Simulation of Tactile Sensing Arrays for Physical Interaction Tasks
Zhanat Kappassov³, Juan-Antonio Corrales-Ramon², and Véronique Perdereau¹
Abstract: Simulated worlds are important enablers and accelerators of new algorithms for autonomous robot applications. A framework for tactile servoing in the simulated world is presented. This framework includes a general model of tactile sensing arrays that can simulate the behavior of a real tactile sensing array thanks to an empirical characterization procedure. After obtaining the precise sensor model, different tactile servoing schemes can be implemented in the framework by controlling contact features, including points and lines extracted from the simulated contact images. Several experiments have been performed in order to guarantee the correspondence between the simulated results generated by the framework and the real ones executed with different sensors and robots.
I. INTRODUCTION
There is at present a large imbalance between the reliability of physical world simulators in applications that do not require handling of contact and in those that involve careful consideration of physical interactions. For example, whereas a simulation environment is sufficient to gather the data for learning fault-adaptation behaviors without contacts [1], real experiments are needed to collect and process raw signals for solving the problem of safe physical interactions [2].

This imbalance is due to limitations of physical world engines in the simulation of tactile sensing arrays [3], which provide robots with contact force profiles between their links and the environment. This profile is measured by multiple sensing elements organized in a row-column-wise structure within the sensor. With the emerging demand for fine and dexterous manipulation, tactile sensing arrays have gained attention not only in real applications but also in simulations [4], since physical world simulators accelerate design and implementation.

Simulation of these tactile sensing arrays involves the problem of their surface deformations, which occur due to the applied contact forces. This problem can be addressed by applying elasticity theory [5], which is computationally costly and inherently ill-posed, since the authors use inverse filtering techniques that may not have a unique solution.
This work was partially supported by the Institute of Smart Systems and
Artificial Intelligence (ISSAI), Kazakhstan, grant “Variable Stiffness Tactile
Sensor for Robot Manipulation and Object Exploration” 110119FD45119
and project CoMManDIA (SOE2/P1/F0638) cofinanced by Interreg Sudoe
Programme (European Regional Development Fund).
¹ Sorbonne Université, CNRS, Institut des Systèmes Intelligents et de Robotique, ISIR, F-75005 Paris, France. {vperdereau, kappassov}@isir.upmc.com
² Université Clermont Auvergne, CNRS, SIGMA Clermont, Institut Pascal, F-63000 Clermont-Ferrand, France. juan.corrales@sigma-clermont.fr
³ Robotics Department, Nazarbayev University, Astana, 010000, Kazakhstan. zhkappassov@nu.edu.kz
In order to overcome this limitation and simulate realistic tactile sensors, previous approaches propose different solutions (Table I). For example, in SkinSim [6] each sensing element within a sensing array is modeled as an independent rigid body with a spherical shape represented in the Gazebo simulation environment. Solid elements with a mass are attached to the base of the sensor through virtual springs. The response of the sensor is then given by the displacements of these virtual springs.

In contrast to SkinSim, the simulation model of the OpenGrasp skin [7] includes only one rigid body. The body is represented by a triangularization of its surface. A force response at each triangle is then used to form the sensor response. Following the single-body representation, in the RobWorkSim simulator with the Open Dynamics Engine (ODE), Joergensen et al. [8] empirically derive a polynomial function that describes the deformation of an elastic surface as a function of the distance from the point of contact to each of the sensing elements of a WeissRobotics variable resistance sensor.

Other approaches use data coming from range sensors: e.g., Pezzementi et al. [4] construct the model of a Pressure Profile Systems tactile sensor as an array of proximity sensors without inferring the physics of the material covering the sensing cells. Since the mechanically compliant surface of a real sensor creates cross-talk in a tactile image, the sensing array is characterized with a point spread function (PSF).
TABLE I: Tactile sensing array simulation approaches.

Sensor Model        | Sim./Engine    | Friction Model                          | Geometry
--------------------|----------------|-----------------------------------------|------------------------------
SkinSim [6]         | Gazebo/ODE     | Mass-spring-damping model for each cell | Array of rigid spheres
OpenGrasp [7]       | OpenRave/ODE   | LuGre dynamic friction model            | Triangularized meshes
Deformable Skin [8] | RobWorkSim/ODE | Coulomb model                           | Deformable meshes
PPS model [4]       | C++/OpenGL     | Proximity sensing, no contact forces    | Cubes on a planar surface
Developed model     | Gazebo/ODE     | PSF, Coulomb friction model             | Triangularized meshes, cubes
As the first contribution, we present a novel methodology (Fig. 1) to construct a tactile sensing model that takes advantage of each approach: integration with the Gazebo simulation environment, consideration of the effect of contact forces in accordance with a real sensor, and use of a PSF-based characterization of a real sensor. Such a characterization simulates the effects of a soft layer over the sensor on the resulting tactile image.

Using the proposed framework, simulation models were built for the following sensing arrays (Fig. 2 a-c): the resistive WTS0614 by Weiss Robotics, the capacitive 6 × 4 DigiTacts by Pressure Profile Systems, and a capacitive sensor developed by the Laboratory for Integration of Systems and Technology (LIST) of the French Alternative Energies and Atomic Energy Commission (CEA). Accordingly, the efficacy of the models was benchmarked in the following tactile servoing applications: robot end-effector position control in the 6 degrees-of-freedom (dof) Cartesian space, and tactile servoing with a 2-dof pan-and-tilt platform. Our approach can be integrated into autonomous exploration techniques with multiple points of contact, e.g. [9], which would potentially enhance autonomous robots.
The proposed tactile sensor model is presented in the next section (Section II), followed by its calibration and characterization in Section III. A tactile servoing controller is implemented in Section IV, and the responses of the simulated and real sensors are compared. The last section concludes the work.
Fig. 1: Approach.
II. SIMULATED SENSOR
Figure 1 illustrates the proposed pipeline for constructing a model of a tactile sensing array in the simulated world (SW). The pipeline includes a characterization step. The shape of the sensor in the SW is given by a triangular mesh that can be integrated into the Gazebo environment, a rigid multibody kinematics simulator based on a physics engine, for example ODE. The engine provides data about collisions at the points of contact between two objects. Thus, we can construct the response of the sensor in terms of a tactile image, which visualizes the physical collisions between the bodies. From the tactile image, tactile features can be retrieved using computer vision techniques [10]. Usually, the sensing array is attached to a robot in the SW. The robot is defined by its kinematic structure along with joint controllers. The Gazebo simulator is compatible with the Robot Operating System (ROS), which handles the controllers.
A triangular mesh of a sensing array can be designed in one of the following ways: it represents the whole sensor as either a single body or multiple bodies organized in the same way as the sensing elements. Figure 2 d) shows the model of the N × M WTS0614 sensor given by a single body with the sensing elements, consisting of pairs of triangles, depicted with numbers. For this sensor, N = 6 and M = 14; the physical coordinates of a cell are $x = i \cdot \Delta x$ and $y = j \cdot \Delta y$, where the size of each sensing element is $\Delta x = \Delta y = 3.4$ mm.

Fig. 2: Tactile sensing arrays: (a) WeissRobotics, (b) PressureProfileSystems, (c) tactile sensing array developed by the Laboratory for Integration of Systems and Technology of the French Alternative Energies and Atomic Energy Commission, (d) 3D mesh model of the 6×14 WeissRobotics tactile sensing array.
A contact force occurs between a sensing element and the environment when they interact. In reality, this force is spread over the neighboring elements due to the soft elastic top layer of the sensor, which tends to uniformly spread its deformation in response to an input stimulus. Thus, every tactile sensing array has specific spatial and force resolution characteristics. In order to fit the spatial characteristics, we apply a PSF model to capture the effects of the deformation, as Pezzementi et al. [4] proved its efficacy.
A. Point Spread Function
The PSF is the function that represents the spatial response of an array to a point stimulus. We apply a PSF based on a two-dimensional Gaussian function to the contact forces at each sensing element with a convolution mask. The parameters of this Gaussian function are estimated from the empirical data (Sec. III-A)¹.

Therefore, when a force is applied to a sensing element, the effects of the deformations lead to the following relationship:

$$I_{sim}(i, j) = F_{sim}(i, j) * g(a, b) \quad (1)$$

where $i$ and $j$ are the coordinates of a tactile sensing cell (or tactel) in an $N \times M$ sensing array, $I_{sim}(i, j)$ is the resulting tactile image in the simulation, and $F_{sim}(i, j)$ is the input force. The term $g(a, b) = \frac{1}{2\pi\sigma^2} e^{-\frac{a^2 + b^2}{2\sigma^2}}$ is the 2D Gaussian PSF, with $\sigma$ the standard deviation of the normal distribution and $a \times b$ the size of the kernel, with $a = b$; $*$ denotes the convolution in the spatial domain, i.e. $I(i, j) = \sum_{a=a_{start}}^{a_{end}} \sum_{b=b_{start}}^{b_{end}} F(i - a, j - b)\, g(a, b)$.

The result of the PSF model, applied to the contact force profile depicted in Fig. 3 b, is illustrated in Fig. 3 c. Fig. 3 a shows the simulation environment: a Shadow robotic hand with a Weiss tactile sensing array over its palm (white) and the object (orange) that is in contact with the sensor.

The contact force profile is obtained as described in the following section.

¹ In the following, the parameters of the Gaussian function are $\sigma = 0.5$ and the kernel size is $3 \times 3$, determined by comparing real and simulated contact patterns.
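As a concrete illustration, the convolution of eq. (1) with the 3 × 3 Gaussian kernel can be sketched in Python with NumPy. This is a minimal sketch, not the authors' implementation; the function names are illustrative:

```python
import numpy as np

def gaussian_psf(kernel_size=3, sigma=0.5):
    # Discrete 2D Gaussian g(a, b) = exp(-(a^2 + b^2) / (2 sigma^2)) / (2 pi sigma^2),
    # sampled on a kernel_size x kernel_size grid centered at (0, 0).
    half = kernel_size // 2
    a, b = np.meshgrid(np.arange(-half, half + 1), np.arange(-half, half + 1))
    return np.exp(-(a**2 + b**2) / (2.0 * sigma**2)) / (2.0 * np.pi * sigma**2)

def simulate_tactile_image(force_profile, sigma=0.5, kernel_size=3):
    # I_sim = F_sim * g (eq. 1): spread each tactel force over its neighbors.
    # The Gaussian kernel is symmetric, so correlation equals convolution here.
    g = gaussian_psf(kernel_size, sigma)
    half = kernel_size // 2
    padded = np.pad(force_profile, half)  # zero-pad the borders
    rows, cols = force_profile.shape
    image = np.zeros_like(force_profile, dtype=float)
    for a in range(kernel_size):
        for b in range(kernel_size):
            image += g[a, b] * padded[a:a + rows, b:b + cols]
    return image
```

Applied to a 6 × 14 force profile with a single point stimulus, the result is a blurred contact spot whose peak stays at the stimulated tactel, mimicking the cross-talk of the elastic layer.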

Fig. 3: (a) A cylindrical object in contact with the simulated sensor. (b)
The initial tactile image (stimulus). (c) The resulting tactile image of the
convolution of the stimulus with a PSF. Red and blue lines visualize contours
and principal components of contacts, respectively.
B. Contact force
The term $F_{sim}$ in eq. (1) is the force profile, which is built from the forces at each tactel. When the sensor collides with an object, ODE calculates the contact forces using a temporal spring-damper system at the intersection of the surface triangles. The simulator allows small penetrations of the colliding objects, from which it estimates normal and tangential forces based on the stiffness and friction coefficients of the bodies in contact. We assume that the friction coefficients are low enough to allow sliding motions. Then, the contact force $F_{sim}$ at the $(i_0, j_0)$-th sensing element with coordinates $x(i_0)$ and $y(j_0)$ is computed as follows:

$$F_{sim}(i_0, j_0) = k_{scale} \sum_{i,j}^{N_x, N_y} F_n \quad \exists (x, y) : (|x - x(i_0)| \in \Delta x) \wedge (|y - y(j_0)| \in \Delta y) \wedge (|z - z(i_0, j_0)| \in h) \quad (2)$$

where $h$ is the height of the sensor (in the z direction), $N_x$ and $N_y$ are the numbers of tactels along the x- and y-axes, respectively, and $F_n$ is a normal force provided by the Gazebo simulator at the point of contact between an object and the given element with the volumetric dimensions $\Delta x, \Delta y, \Delta z$. The coordinates $x, y, z$ are the coordinates of collisions between the sensor and an object with respect to the sensing frame. Thus, we sum the contact forces that appear on the surface and within the dimensions of the $(i_0, j_0)$-th tactel. We take into account the normal forces only, because the tangential forces tend to zero as the surface friction coefficients are assumed to be negligible. The resulting force is then normalized by a scaling factor $k_{scale}$ to respect the maximum of the real sensor output, and down-sampled in time to make the simulated bandwidth equal to the bandwidth of a real sensor.
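A minimal sketch of the force binning in eq. (2), assuming the engine reports contact points as (x, y, z, normal force) tuples in the sensing frame. The function name, the contact format, and the sensor height value are illustrative assumptions; the 3.4 mm tactel pitch follows the WTS0614 layout described earlier:

```python
import numpy as np

def tactel_force_profile(contacts, n_x=6, n_y=14, dx=3.4, dy=3.4, h=2.0, k_scale=1.0):
    # contacts: iterable of (x, y, z, f_n) collision points in the sensing frame,
    # with f_n the normal force reported by the physics engine (e.g. ODE).
    # Each force is accumulated into the tactel whose footprint contains (x, y)
    # and whose height range contains z, then scaled by k_scale (eq. 2 sketch).
    # h = 2.0 mm is an assumed sensor height, not a value from the paper.
    F = np.zeros((n_x, n_y))
    for x, y, z, f_n in contacts:
        i = int(np.floor(x / dx))
        j = int(np.floor(y / dy))
        if 0 <= i < n_x and 0 <= j < n_y and 0.0 <= z <= h:
            F[i, j] += f_n
    return k_scale * F
```

Contacts falling outside the array footprint or the sensor height are simply ignored, matching the existence condition in eq. (2).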
Thus far, we have derived a custom sensor model. The model takes as input the interaction forces (calculated by the ODE physics engine) and considers their spreading due to the compliance of the elastic surface of the corresponding real sensor. Although the developed sensor model does not apply the deformations of the surface of the sensor in the simulation, it generates a tactile image coherent with the empirical data (Sec. V-A).
C. Tactile image
The simulated sensor is added to the simulation by defining its attachment frame with respect to an end-effector of a robot, e.g. the palm of a robot hand. The sensor then provides this end-effector with the tactile image. The procedure of obtaining this image is summarized in Algorithm 1.

As indicated in the developed algorithm, the tactile image is the result of the convolution of a Gaussian with the contact forces, which are clustered in a row-column structure and provided by a physics engine. The parameter σ of the Gaussian and the stiffness parameter used for deriving the contact forces should follow the real parameters, which are obtained empirically by the sensor characterization procedure described in the next section.
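The two loops of Algorithm 1 amount to binning the engine's contact forces into the tactel grid (eq. (2)) and convolving the result with the Gaussian PSF (eq. (1)). A self-contained sketch, with illustrative names, an assumed contact tuple format, and an assumed sensor height, not the authors' implementation:

```python
import numpy as np

def tactile_image(contacts, n_x=6, n_y=14, dx=3.4, dy=3.4, h=2.0,
                  k_scale=1.0, sigma=0.5, kernel_size=3):
    # Step 1 (eq. 2): sum engine-reported normal forces into the tactel grid.
    # contacts: iterable of (x, y, z, f_n) points in the sensing frame; h is assumed.
    F = np.zeros((n_x, n_y))
    for x, y, z, f_n in contacts:
        i, j = int(x // dx), int(y // dy)
        if 0 <= i < n_x and 0 <= j < n_y and 0.0 <= z <= h:
            F[i, j] += f_n
    F *= k_scale
    # Step 2 (eq. 1): convolve with the 2D Gaussian PSF to model the elastic layer.
    half = kernel_size // 2
    a, b = np.meshgrid(np.arange(-half, half + 1), np.arange(-half, half + 1))
    g = np.exp(-(a**2 + b**2) / (2.0 * sigma**2)) / (2.0 * np.pi * sigma**2)
    padded = np.pad(F, half)
    I = np.zeros_like(F)
    for u in range(kernel_size):
        for v in range(kernel_size):
            I += g[u, v] * padded[u:u + n_x, v:v + n_y]
    return I
```

The default σ = 0.5 and 3 × 3 kernel match the values reported for the WTS0614 characterization.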
III. SENSOR CHARACTERIZATION
In the following, we show how any tactile sensing array can be characterized using a 3-axis manual manipulation platform. The steps include:
1) calibrating every tactel of the sensing array with a ground-truth force measuring device: the first step is dedicated to finding the relation between the ground-truth forces and the responses of each sensing element in the array. The indenter is pressed once on different elements.
2) recording the penetration depths and verifying the repeatability of the sensor responses: the second step is dedicated to identifying the mechanical compliance (stiffness) of the sensing surface. The indenter is pressed on a sensing element, increasing the penetration depth slowly while recording the penetration depth and the force response. The repeatability of a given sensing element is also tested by pressing several times on the same tactel with the indenter.
A case study is performed using the resistive WTS0614 WeissRobotics sensor. The output of the sensor is 12-bit raw voltage level values.
A. Characterization setup
The following tools, illustrated in Fig. 4, can be used to characterize the sensor's response: a schematic drawing of the sensing elements' locations (Fig. 4 c and Fig. 5 a) that helps to determine the locations of the tactels needed for Algorithm 1, the ground-truth ATI Nano17 force and torque sensor (Fig. 4 b), a custom-made cylindrical indenter with a diameter of 1 mm (Fig. 4 b), and the Proxxon MF70 D-54518 milling device (Fig. 4 a) used as a three-axis manipulator. The sensor was fixed (with thin double-sided 3M tape) to a flat surface within the three-axis manipulator's (Proxxon MF70) workspace. The manipulator has a resolution of 0.2 mm in the horizontal directions and 0.05 mm in the vertical direction (red arrows in Fig. 4 a). The ATI Nano17 force sensor was attached to the machine's head using a custom-made adapter piece. This force sensor, with an attached indenter, was pushed against the elastic sensing surface over the centers of the sensor's sensing elements. We captured the deformations in the normal direction of the sensing frame, the tactile sensor outputs, and the force sensor measurements obtained by pressing the elastic surface incrementally up to the saturation of the sensor output. The signals from the tactile array and the ground-truth force sensor were synchronized using a data synchronization tool of the ROS framework.

Fig. 4: Tools used for the characterization of the WTS0614 sensor. (a) Manual 3-axis platform. (b) The ground-truth ATI Nano17 force and torque sensor with the indenter printed using a 3-dimensional rapid prototyping machine. (c) Schematic of the locations of the sensing elements.

Algorithm 1 Tactile image in simulation
  TactileArray ← triangularized meshes
  Stiffness ← real stiffness coefficient (e.g. Fig. 6 b)
  PenetrationDepth ← thickness of the sensing surface
  SensorLocation ← RobotDescription
  Contact forces F ← CollisionsPhysicalWorldSimulator
  N_x × N_y ← NumberOfTactels
  for all F ∈ N_x × N_y do
      F_sim(i, j) ← equation (2), Fig. 5   // scaled force value
  end for
  for all F_sim ∈ N_x × N_y do
      I_sim(i, j) ← equation (1)   // contact image pixel
  end for
  return simulated tactile contact image I_sim
B. Step 1: tactile sensor output calibration
Fig. 5 b shows the sensor's responses versus the ground-truth force measurements. The linearized response is depicted with the black line. We can assume that interaction forces are within the linear range (from 0.2 N to 1.8 N, with an output-to-force ratio of $k_{out \to force} = (1.8 - 0.2)/(3840 - 0) \approx 0.0004$ N per output unit) of the tactile sensor's response². The maximum detectable pressure applied by the indenter with the diameter of 1 mm is given by the maximum normal force $F_{max}$ applied onto the sensor surface with the area of $\pi \cdot r^2$: $F_{max}/(\pi \cdot r^2) = 2\,\mathrm{N}/0.785\,\mathrm{mm}^2 = 2.548 \cdot 10^6$ Pa, that is 0.25 N/mm² or 2.89 N per tactel. There is a significant difference in the minimum trigger level (i.e. the minimum detectable contact force) of tactel 15, which is close to the center, and tactel 1, which is located at the border: the tactel in the center is more sensitive than the tactel near the border. This irregularity is due to the fact that the rubber is attached to the base of the sensor from the sides, where the rubber is less soft than in the center. We take such irregularities into account by calibrating each tactel separately and applying a scaling coefficient to the response of each tactel.

Fig. 5: Responses of the individual sensing elements (tactels) to the forces applied on each element. The sensor outputs are the raw values of the WeissRobotics WTS0614 sensor, which is based on electro-conductive rubber technology. (a) Locations of tactels and (b) the raw output values of tactels versus ground-truth forces.

Fig. 6: (a) Step responses of the same tactel from multiple trials (tactel 39 in this case study). (b) Maximum penetration depth and ground-truth force measurements.

² Though the response of an element of the sensor can be characterized with a higher-order polynomial (depicted with the red color).
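The linearized calibration above maps raw 12-bit output to force with an affine function. A sketch of that linear fit, using the 0.2 N to 1.8 N range and the 3840 raw-unit span from this case study; the function name and the affine form are illustrative:

```python
def raw_to_force(raw, k=(1.8 - 0.2) / 3840.0, f_min=0.2, raw_min=0):
    # Linearized per-tactel calibration: within the linear range,
    # force ~= f_min + k * (raw - raw_min), with k ~= 0.0004 N per raw unit
    # for the 12-bit WTS0614 output (a sketch, not the authors' exact fit).
    # Per-tactel scaling coefficients would multiply k for each cell.
    return f_min + k * (raw - raw_min)
```

Outside the linear range the model extrapolates, so in practice raw values near saturation would need the higher-order polynomial fit mentioned in the footnote.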
C. Step 2: stiffness and maximum penetration
The sensor responses exhibit a creep behavior over a series of deformations, as shown in Fig. 6 a. The response of the same tactel changes from one trial to another: the indenter pushed sensing element number 39 several times. The repeatability of the response of the tactel had a variation of almost 1.5 times at the point around 1.5 N. This happens because of the hysteresis of the flexible surface, temperature variations, and a creep behavior under a constant force [11]. However, such limitations can be mitigated in simulations without undesirable effects. In order to estimate the compliance constant $C_z$, the sensing surface was deformed by displacements of 0.2, 0.4, 0.6, 0.8, 1.0, 1.2, and 1.4 mm. Fig. 6 b shows the response of the ATI Nano17 force sensor (not the raw data of the tactile sensor) to the indentations of the different depths and the linear fit (red line). The maximum displacement and the stiffness of the sensor $k = 1/C_z$ (in this case study $k \simeq 2$ N/mm) define the PenetrationDepth and Stiffness parameters in Algorithm 1, respectively.
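The stiffness estimate of this step is a least-squares line fit of ground-truth force against penetration depth. A sketch with NumPy; the sample data below are illustrative placeholders, not the paper's measurements:

```python
import numpy as np

def fit_stiffness(depths_mm, forces_n):
    # Least-squares linear fit F = k * d + f0: the slope k estimates the
    # sensing-surface stiffness, i.e. the inverse of the compliance C_z.
    k, f0 = np.polyfit(depths_mm, forces_n, deg=1)
    return k, f0

# Illustrative indentation data (NOT the paper's measurements):
depths = np.array([0.2, 0.4, 0.6, 0.8, 1.0, 1.2, 1.4])  # penetration depth, mm
forces = 1.4 * depths + 0.05                             # a roughly linear response, N
```

The fitted slope and the largest commanded displacement would then populate the Stiffness and PenetrationDepth parameters of Algorithm 1.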
IV. TACTILE SERVOING FOR EDGE ORIENTATION
CONTROL IN A SIMULATION
So far, we have developed a method to construct a tactile sensing array in a simulation. If there is a contact between this array and an edge of an object, there are multiple tactels in contact with it. Since the tactels are arranged row-column-wise, we can extract contact features by applying computer vision techniques to the tactile images generated with the simulated sensor. In most cases, there are two types of contact that occur during physical interaction tasks: point-

References
- Robots that can adapt like animals: an intelligent trial-and-error algorithm that allows robots to adapt to damage in less than two minutes, without self-diagnosis or pre-specified contingency plans.
- Safety Evaluation of Physical Human-Robot Interaction via Crash-Testing: evaluates the potential injury risk emanating from a manipulator via several injury mechanisms and Severity Indices.
- Control of contact via tactile sensing: addresses how to model the state of contact in terms that are sufficient for defining general contacts and that bridge a robot task description and the information observable by a tactile sensor.
- RobWorkSim - an Open Simulator for Sensor based Grasping: an open-source grasp simulator supporting full 6D rigid-body dynamics and kinematics, with a rich set of sensors including tactile arrays, cameras, and range scanners.
- Multi-contact haptic exploration and grasping with tactile sensors: simultaneous control of multiple contact points on several links, enabling rapid exploration of complex, non-convex shapes while maintaining low contact forces.