
object detection using sensors

Started by October 10, 2009 05:53 PM
3 comments, last by Emergent 15 years, 1 month ago
A group I'm involved in is creating a 3D simulation program in which autonomous vehicles would be equipped with sensors to avoid objects and, if possible, detect movement. I understand the problem is complex, but that's fine. I was linked here from the GDK forums for more direction on this. Right now I'm researching sensors such as LIDAR and LADAR that would detect different objects. I've been told to research waypoints, but how would I develop that into a usable program? I'm new to DarkGDK and a novice at C++ programming, so how would these vehicles detect objects using GDK code? Are there AI object detection and avoidance programs I should look at? [Edited by - cartman23 on October 10, 2009 6:34:39 PM]
So no one knows anything?
One thing that has me a little confused is to what extent your project is to do with providing a simulation environment for robots, and to what extent it's about "AI" (or control) algorithms from robotics.

If it's the former: There are a number of robotics simulators available, including Player/Stage/Gazebo (open source), Microsoft Robotics Studio, and others. Perhaps you should take a look at what these programs provide before writing your own?

If it's the latter: There are lots of problems described in the robotics literature, and you sort of need to pick one. One problem which you may find interesting is SLAM -- simultaneous localization and mapping (though this may well be significantly more difficult than you want to attempt; I don't know).

Most generally, it sounds like you need to define your problem better.
What's up, Emergent. Thanks for the response! My group is creating a 3D simulation in which vehicles will be programmed to behave like autonomous vehicles (have you heard of the DARPA Grand Challenge?). I'm not leading this project, I'm just assisting, and I was given the task of researching the topics I posted. The leader wanted me to look up how sensors like LIDAR work and how I would simulate that system in the simulation, along with obstacle avoidance/collision and pathfinding (A* is the one I know of for this particular app). The simulation would behave like a real-world environment: for instance, if the autonomous vehicle sees another vehicle approaching, it must either slow down or stop to avoid hitting it, and once it does either, it will decide whether to turn left, turn right, or make some other maneuver. That's basically the project.

Any further advice?
Hmmm... well, like I said, for simulation you might want to use one of the available simulators instead of rolling your own. I don't know a ton about LIDAR/IR/ultrasonic (sonar) sensors, so I can't help you much there, but looking both at (1) how other simulators deal with them and (2) spec sheets for particular sensors might be helpful.
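That said, a common way to fake a LIDAR-style range sensor in a simulation is simply to cast a fan of rays from the vehicle and record the nearest hit per beam. Here's a minimal 2D sketch in plain C++ (not DarkGDK-specific; the `Vec2`/`Circle` types, the circular-obstacle world, and all the function names and parameters are made-up placeholders for illustration):

```cpp
// Minimal 2D "LIDAR" sketch: sweep numBeams rays from the vehicle and return
// the distance to the nearest obstacle hit by each ray (or maxRange if none).
// The circular-obstacle world and all names here are illustrative only.
#include <cmath>
#include <vector>

struct Vec2   { double x, y; };
struct Circle { Vec2 center; double radius; };   // stand-in obstacle

// Distance along a ray (origin o, unit direction d) to a circle,
// or -1 if the ray misses (or the near hit is behind the origin).
double rayCircle(const Vec2& o, const Vec2& d, const Circle& c)
{
    double fx = o.x - c.center.x, fy = o.y - c.center.y;
    double b = fx * d.x + fy * d.y;
    double disc = b * b - (fx * fx + fy * fy - c.radius * c.radius);
    if (disc < 0.0) return -1.0;
    double t = -b - std::sqrt(disc);
    return (t >= 0.0) ? t : -1.0;
}

// One full scan: numBeams rays (assumes numBeams >= 2) spread over
// fieldOfView radians centered on the vehicle's heading.
std::vector<double> lidarScan(const Vec2& pos, double heading,
                              const std::vector<Circle>& obstacles,
                              int numBeams, double fieldOfView, double maxRange)
{
    std::vector<double> ranges(numBeams, maxRange);
    for (int i = 0; i < numBeams; ++i) {
        double angle = heading - fieldOfView / 2.0
                     + fieldOfView * i / (numBeams - 1);
        Vec2 dir = { std::cos(angle), std::sin(angle) };
        for (const Circle& obs : obstacles) {
            double t = rayCircle(pos, dir, obs);
            if (t >= 0.0 && t < ranges[i]) ranges[i] = t;
        }
    }
    return ranges;   // ranges[i] == maxRange means "no hit" on that beam
}
```

The returned range array is what you'd feed into whatever avoidance logic you end up using; in a 3D engine you'd typically replace the circle test with the engine's own ray/intersection queries.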

For obstacle avoidance there's a great deal in the robotics literature.

One common approach is the artificial potential method (of which there are many variations): Basically you pretend that obstacles have an associated repulsive force which becomes very large as you get close to the obstacle but which smoothly falls off away from it. To do this you define a potential function which has a "peak" around any obstacle; then the force is its negative gradient, and the control input you choose is the one which moves in the direction of the force as much as possible. In other words, you always try to move "downhill."
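Here's roughly what one variation of that might look like in code: a quadratic attractive well at the goal plus a repulsive term that blows up near each obstacle and vanishes beyond an influence radius. It's just a minimal C++ sketch; the `Vec2`/`Circle` types, gains, and influence radius are made-up placeholders.

```cpp
// Artificial-potential sketch: steer along the negative gradient of an
// attractive potential toward the goal plus repulsive potentials around
// obstacles. All constants and type names are illustrative placeholders.
#include <cmath>
#include <vector>

struct Vec2   { double x, y; };
struct Circle { Vec2 center; double radius; };

Vec2 potentialFieldDirection(const Vec2& pos, const Vec2& goal,
                             const std::vector<Circle>& obstacles)
{
    const double kAttract  = 1.0;     // attractive gain
    const double kRepulse  = 100.0;   // repulsive gain
    const double influence = 5.0;     // obstacles farther than this are ignored

    // Attractive force: straight toward the goal, proportional to distance.
    Vec2 force = { kAttract * (goal.x - pos.x), kAttract * (goal.y - pos.y) };

    // Repulsive force from each nearby obstacle (classic (1/d - 1/d0) form).
    for (const Circle& obs : obstacles) {
        double dx = pos.x - obs.center.x, dy = pos.y - obs.center.y;
        double distCenter = std::sqrt(dx * dx + dy * dy);
        double d = distCenter - obs.radius;          // distance to the surface
        if (d <= 0.0 || d > influence) continue;     // inside, or too far away
        double mag = kRepulse * (1.0 / d - 1.0 / influence) / (d * d);
        force.x += mag * dx / distCenter;            // push directly away
        force.y += mag * dy / distCenter;
    }

    // Normalize: the caller picks the control input that best follows this.
    double len = std::sqrt(force.x * force.x + force.y * force.y);
    if (len > 1e-9) { force.x /= len; force.y /= len; }
    return force;
}
```

One known weakness of potential fields is local minima: spots where the attractive and repulsive terms cancel and the vehicle gets stuck short of the goal, which is one reason people combine them with a planner.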

This is more-or-less a reactive method. In principle there's also nothing to stop you (besides computational complexity) from using so-called deliberative (i.e., planning) methods either: When you spot an obstacle, replan using e.g. D* on an appropriate graph.
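D* itself takes some work to implement, but a simpler stand-in that captures the replanning idea is to rerun plain A* over an occupancy grid whenever a newly sensed obstacle blocks the current path. A rough C++ sketch (the grid representation and all names are illustrative, not from any particular library):

```cpp
// Simplified stand-in for deliberative replanning: instead of full D*, rerun
// A* on a boolean occupancy grid each time sensing reveals a new obstacle.
#include <climits>
#include <cstdlib>
#include <queue>
#include <utility>
#include <vector>

struct Node { int x, y; };

// 4-connected A*; returns true if the goal was reached and fills cameFrom so
// the caller can walk the path back from the goal to the start.
bool planAStar(const std::vector<std::vector<bool>>& blocked,
               Node start, Node goal, std::vector<std::vector<Node>>& cameFrom)
{
    const int W = blocked.size(), H = blocked[0].size();
    auto h = [&](int x, int y) {                       // Manhattan heuristic
        return std::abs(x - goal.x) + std::abs(y - goal.y);
    };
    std::vector<std::vector<int>> g(W, std::vector<int>(H, INT_MAX));
    cameFrom.assign(W, std::vector<Node>(H, {-1, -1}));

    using Entry = std::pair<int, std::pair<int, int>>; // (f-score, cell)
    std::priority_queue<Entry, std::vector<Entry>, std::greater<Entry>> open;
    g[start.x][start.y] = 0;
    open.push({h(start.x, start.y), {start.x, start.y}});

    const int dx[4] = {1, -1, 0, 0}, dy[4] = {0, 0, 1, -1};
    while (!open.empty()) {
        auto [f, cell] = open.top(); open.pop();
        auto [x, y] = cell;
        if (x == goal.x && y == goal.y) return true;
        if (f > g[x][y] + h(x, y)) continue;           // stale queue entry
        for (int k = 0; k < 4; ++k) {
            int nx = x + dx[k], ny = y + dy[k];
            if (nx < 0 || ny < 0 || nx >= W || ny >= H || blocked[nx][ny]) continue;
            if (g[x][y] + 1 < g[nx][ny]) {
                g[nx][ny] = g[x][y] + 1;
                cameFrom[nx][ny] = {x, y};
                open.push({g[nx][ny] + h(nx, ny), {nx, ny}});
            }
        }
    }
    return false;   // no path given the obstacles currently known
}
```

The point of real D*/D*-Lite is to avoid that full replan by repairing only the part of the search affected by the change, which starts to matter when the grid is large or new obstacles show up frequently.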

