
Ahmad Ostovar

Researcher at Umeå University

Publications -  8
Citations -  65

Ahmad Ostovar is an academic researcher from Umeå University. The author has contributed to research on the topics of Object detection and Object (computer science). The author has an h-index of 5 and has co-authored 8 publications receiving 51 citations.

Papers
Journal ArticleDOI

Adaptive Image Thresholding of Yellow Peppers for a Harvesting Robot

TL;DR: The presented work is part of the H2020 project SWEEPER, whose overall goal is to develop a sweet pepper harvesting robot for use in greenhouses; visual servoing is used to guide the robot toward the fruit.
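The paper's title refers to adaptive image thresholding, i.e. choosing a segmentation threshold from the image itself rather than fixing it in advance. The abstract does not spell out the method, so as a minimal illustrative sketch (not the paper's algorithm), the classic Otsu approach picks the threshold that maximizes between-class variance of the grayscale histogram:

```python
import numpy as np

def otsu_threshold(gray):
    """Return the 0..255 threshold that maximizes the
    between-class variance of the grayscale histogram (Otsu)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    total = hist.sum()
    sum_all = np.dot(np.arange(256), hist)
    best_t, best_var = 0, 0.0
    w_bg = 0.0   # weight (pixel count) of the background class
    sum_bg = 0.0  # intensity sum of the background class
    for t in range(256):
        w_bg += hist[t]
        if w_bg == 0:
            continue
        w_fg = total - w_bg
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / w_bg
        mean_fg = (sum_all - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Synthetic bimodal image: dark background, one bright region
# (a stand-in for a yellow pepper against foliage).
img = np.zeros((50, 50), dtype=np.uint8)
img[10:30, 10:30] = 200
t = otsu_threshold(img)
mask = img > t   # foreground mask for the bright region
```

Because the threshold is recomputed per image, the segmentation adapts to lighting changes between greenhouse scenes, which is the point of an adaptive (rather than fixed) threshold.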
Journal ArticleDOI

Detection and classification of Root and Butt-Rot (RBR) in Stumps of Norway Spruce Using RGB Images and Machine Learning

TL;DR: This study developed and evaluated an approach based on RGB images to automatically detect tree stumps and classify them by the presence or absence of rot, assigning each stump to one of three infestation classes: rot = 0%, 0% < rot < 50%, and rot ≥ 50%.
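The three infestation classes above are defined purely by the rot percentage, so the labeling rule can be sketched directly (the class names here are illustrative, not the paper's; only the percentage boundaries come from the abstract):

```python
def rot_class(rot_pct):
    """Map a stump's rot percentage to one of the study's
    three infestation classes (names are illustrative)."""
    if rot_pct == 0:
        return "no_rot"        # rot = 0%
    if rot_pct < 50:
        return "partial_rot"   # 0% < rot < 50%
    return "severe_rot"        # rot >= 50%

labels = [rot_class(p) for p in (0, 10, 49, 50, 90)]
```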
Book ChapterDOI

Human Detection Based on Infrared Images in Forestry Environments

TL;DR: This paper introduces two human detection methods for forestry environments using a thermal camera: one shape-dependent and one shape-independent approach. Using shape-independent features, the method reaches a precision of 80% and a recall of 76%.
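Precision and recall are the standard detection metrics behind the figures above. The counts below are illustrative (not from the paper); they are chosen only to reproduce the reported 80% precision and 76% recall:

```python
def precision_recall(tp, fp, fn):
    """Precision = TP/(TP+FP): how many reported detections are real.
    Recall = TP/(TP+FN): how many real humans were detected."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return precision, recall

# Hypothetical counts yielding the paper's reported rates:
p, r = precision_recall(tp=76, fp=19, fn=24)
```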
Proceedings ArticleDOI

A Direct Method for 3D Hand Pose Recovery

TL;DR: A novel approach for intuitive 3D gesture-based interaction from depth data acquired by a Kinect; a new version of the optical flow constraint equation is used to directly estimate 3D hand motion without imposing additional constraints.
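For context, the classical optical flow constraint relates image intensity gradients to 2D motion, and range-flow work extends it to depth data; the paper's "new version" presumably builds on constraints of this family (the second equation below is the standard range-flow constraint, shown only as background, not as the paper's formulation):

```latex
% Classical brightness-constancy (optical flow) constraint for 2D motion (u, v):
I_x u + I_y v + I_t = 0

% Range-flow constraint on a depth map Z(x, y, t), adding motion along the
% optical axis w, as used in depth-based flow estimation:
Z_x u + Z_y v + Z_t = w
```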
Proceedings Article

Integrating Kinect Depth Data with a Stochastic Object Classification Framework for Forestry Robots

TL;DR: It is shown how Kinect depth data can be integrated with a stochastic object classification framework for forestry robots.
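The abstract does not describe the integration mechanism, but one common way to combine an independent depth cue with a probabilistic classifier is naive-Bayes-style fusion: multiply per-class likelihoods from each sensor and renormalize. This is a generic sketch under that assumption, not the paper's method:

```python
import numpy as np

def fuse(p_rgb, p_depth):
    """Naive-Bayes-style fusion of per-class likelihoods from two
    (assumed independent) cues: elementwise product, renormalized."""
    p = np.asarray(p_rgb, dtype=float) * np.asarray(p_depth, dtype=float)
    return p / p.sum()

# Hypothetical class likelihoods over, say, (trunk, bush, human):
posterior = fuse([0.6, 0.3, 0.1], [0.2, 0.5, 0.3])
```

The product form means a class must be supported by both cues to score highly, which is why adding depth can suppress RGB false positives.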