Using and Coding an Ultrasonic Sensor

Has anyone ever tried to use an ultrasonic sensor on their FIRST robot? I am trying to get my team to use more sensors in our autonomous program; we have used dead reckoning almost every year, and I think it is time for us to move to the next level of autonomous programming. I think ultrasonic might be a good method of navigating around a FIRST playing field…what do you think? I was thinking of using a sensor such as this one. Can you recommend a better one? How do you write the code for a sensor like this?

Thank you so much for helping!!

Just to add to Jaine’s question, is there any way to program these sensors so that they react only to the playing field itself and not to other robots on the field?

I’m all for any attempt by teams to move away from dead reckoning to something more sophisticated. I think autonomous is going to become a real drag if we don’t start heading in that direction. With that said…

The sensor you listed here wouldn’t have been allowed this year (I don’t think any of the approved suppliers carry it). Hopefully the rules will be relaxed next year, but you never know. I don’t have experience with any of these sensors; however, we did investigate the use of them for our robot this year. Depending on what the rules are next year with regard to infrared emitters on the robot, there are some IR rangefinder devices that I believe are much cheaper than these ultrasonic ones.

Writing code for proximity sensors depends a lot on the sensor in question. Many of the IR proximity sensors just output an analog voltage: hook it up to an analog input on the RC and just read the value. Pretty convenient!

The sensor you selected above appears to work a little differently and would be harder to work with. Basically, you hook the output of that sensor up to a digital input on the RC, and after you tell it to measure distance, it sends a high signal to the digital input for a length of time that is proportional to the distance to the target. This means you need a good way to accurately measure time. So you need to set up one of the timers on the processor to run at a known rate, then hook the sensor up to an interrupt line and trigger the interrupt on both the rising edge and the falling edge (I believe the RC supports this, but it’s been a while since I looked at it). Then you just start the timer when you get the rising-edge interrupt and stop it when you get the falling edge (and don’t forget to account for rollover). Obviously it’d be a lot easier to find a sensor that just gives you an analog output :)
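
To make that a little more concrete, here is a rough C sketch of the pulse-width approach. Instead of starting and stopping a timer, it captures a free-running 16-bit timer on each edge, which amounts to the same thing and makes the rollover handling fall out naturally. The helper names (timer_ticks(), sonar_pin_is_high()) and the tick-to-inch scale factor are placeholders for whatever your controller’s default code actually provides, so treat this as an outline rather than drop-in code:

```c
#include <stdint.h>

/* Placeholders for your controller's own timer-read and digital-input calls. */
extern uint16_t timer_ticks(void);
extern uint8_t  sonar_pin_is_high(void);

/* Made-up scale factor; calibrate against a tape measure. */
#define TICKS_PER_INCH 15u

static uint16_t rise_time;          /* timer value captured on the rising edge */
static volatile uint16_t range_in;  /* last measured range, in inches          */

/* Called on BOTH the rising and the falling edge of the sensor's output pin. */
void sonar_edge_isr(void)
{
    uint16_t now = timer_ticks();   /* read the free-running 16-bit timer */

    if (sonar_pin_is_high()) {
        rise_time = now;            /* pulse just started */
    } else {
        /* Pulse ended: width = now - rise_time.  Unsigned 16-bit
         * subtraction handles a single timer rollover automatically. */
        uint16_t width = now - rise_time;
        range_in = width / TICKS_PER_INCH;
    }
}
```

Your main loop then just reads range_in whenever it needs a distance; the analog-output sensors reduce all of this to a single analog read.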

Unfortunately, no. All these sensors can tell you is whether there’s an object that reflects sound (or IR, for the infrared ones) within its path. The sensor has no knowledge of what type of object it is, which is why navigating with these would be very difficult: you can’t really be sure whether the reflection you’re getting is from the playing field wall or from another robot, or a ball, or some junk that got dropped on the field, or…

I think the best way to use a rangefinder would be in conjunction with other methods of navigation. For example, one could use wheel rotations or maybe an inertial nav system to guide the robot to a specific area and then make fine-tuned movements based on ultrasonic input.
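
As a very rough illustration of that idea (every helper name and threshold below is invented, not something from the default code), the autonomous routine might dead-reckon on encoder counts until it is near the target area and only then switch to creeping in on the rangefinder:

```c
/* Hypothetical two-stage autonomous step: coarse positioning on wheel
 * encoders, then a fine approach on the ultrasonic rangefinder.
 * drive(), encoder_counts() and sonar_range_inches() are placeholders,
 * and every threshold is just an example value.
 */
extern void drive(int left_pwr, int right_pwr);
extern long encoder_counts(void);
extern int  sonar_range_inches(void);

enum auto_state { COARSE_DRIVE, FINE_APPROACH, DONE };
static enum auto_state state = COARSE_DRIVE;

void autonomous_step(void)    /* call once per control loop */
{
    switch (state) {
    case COARSE_DRIVE:                    /* dead-reckon most of the way */
        drive(80, 80);
        if (encoder_counts() > 4000)      /* roughly "near the wall"     */
            state = FINE_APPROACH;
        break;

    case FINE_APPROACH:                   /* close in on the rangefinder */
        if (sonar_range_inches() > 12) {
            drive(30, 30);
        } else {
            drive(0, 0);                  /* stop about a foot out       */
            state = DONE;
        }
        break;

    case DONE:
        drive(0, 0);
        break;
    }
}
```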

That’s what I’d do anyway…

We’re starting to use an ultrasonic sensor on one of our research robots. This sensor would have been illegal last year; however, with the digikey ultrasonic sensors being discontinued, there may be no options for this type of sensor unless the rules are changed. (HINT. HINT.)

The company that makes this sensor is SensComp (formerly Polaroid), and they can be found at www.senscomp.com. They have a variety of options, but one option outputs an analog value proportional to the distance from your robot.

The range is 6" to 10’ (off the top of my head). The sensing angle is a 15 degree cone.
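
If the analog option really does scale linearly over that 6" to 10' window, turning the ADC reading back into inches is just a proportion. The sketch below assumes a 10-bit ADC and that the output sweeps the full scale across the sensor’s range; check the actual transfer function in the SensComp datasheet before trusting it:

```c
/* Convert a 10-bit ADC reading (0..1023) into inches, assuming the
 * sensor's analog output rises linearly from its minimum range (6 in)
 * at 0 counts to its maximum range (120 in) at full scale.  Both the
 * ADC resolution and the linear mapping are assumptions to verify
 * against the datasheet.
 */
#define RANGE_MIN_IN     6L
#define RANGE_MAX_IN   120L
#define ADC_FULL_SCALE 1023L

long sonar_inches(unsigned int adc_counts)
{
    return RANGE_MIN_IN +
           ((long)adc_counts * (RANGE_MAX_IN - RANGE_MIN_IN)) / ADC_FULL_SCALE;
}
```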

As for deciding whether an object is another robot or part of the playing field…

if it moves without any intervention from you, it’s probably a robot. If it stays put, it’s either playing field, dead robot, or movable playing field artifact.

I’m not sure how much use mapping, or using a pre-stored map to help navigate, would be. It’s too easy to get knocked off course, run into an impediment, experience sensor failures, set up in a non-aligned initial position, etc. Using odometry and GPS is probably a dead end if the task relies on complex interaction with the playing field.

We’re starting to use an ultrasonic sensor on one of our research robots. This sensor would have been illegal last year; however, with the digikey ultrasonic sensors being discontinued, there may be no options for this type of sensor unless the rules are changed. (HINT. HINT.)

Actually, there are always options. I managed to find a way around the rules so we could use ultrasonic sensors. I also found an infrared rangefinder, but the website went down before I could find it.

The sensor mentioned here was used in the Frontiers Robotics summer program at WPI. We chose it mostly because of the cost and the chance to illustrate some advanced programming issues like interrupts and timers.

Another ultrasonic sensor is available from DigiKey and does provide an easier interface - it outputs an analog value proportional to the distance measured:

http://dkc3.digikey.com/PDF/T042/1310.pdf

It’s Digi-Key part number 387-1000-ND.

It is true that using ultrasonics on a robot is subject to “interference” from other robots and objects that get in the way. We used ultrasonics in the team 190 robot to autonomously detect the stairs, then the upper platform where we were assured that there would be no obstacles or other robots interfering.

As was mentioned, you need to integrate the information from other sensors to provide a better picture of what’s going on around the robot. We used wheel encoders for precise distance measurement and gyros for pitch and turn information in addition to the ultrasonic rangefinder.

Between all of these sensors we were able to successfully hang autonomously in most matches.

I encourage everyone to think about using the summer/fall to improve the sensing capabilities of their robots. Some of us have a vision of the future that includes full utilization of all three elements of the “sensing-mobility-manipulation” pyramid. Those who can see their way to learning about these technologies will probably find it a worthwhile pursuit.

When you have integrated a few sensors on your robot, you may want some exercises to better understand what you can do with them. I would like to suggest this reference as a starting place to look for a few ideas.

-dave

The sensor you listed here wouldn’t have been allowed this year (I don’t think any of the approved suppliers carry it). Hopefully the rules will be relaxed next year, but you never know. I don’t have experience with any of these sensors; however, we did investigate the use of them for our robot this year. Depending on what the rules are next year with regard to infrared emitters on the robot, there are some IR rangefinder devices that I believe are much cheaper than these ultrasonic ones.

Actually, they would… You would just have to build them. Most of the parts are quite harmless: http://www.robot-electronics.co.uk/htm/srf04tech.htm
Also, the infrared emitters have a maximum distance of three feet, while the ultrasonic emitter is rated for 3 meters. Infrared sensors don’t detect anything transparent to infrared light, and ultrasonic sensors have a hard time detecting fuzzy things. Both have a hard time detecting things at a steep angle, because the signal gets bounced in the wrong direction. You can ask me anything about sensors; I’m pretty knowledgeable about them.

The Sharp infrared proximity sensor, part number GP2Y0A02YK, measures distances between 8" and 60". It’s still half the range of the ultrasonic sensor, but not terrible. The smaller one, part number GPD2D12, measures 3" to 12". They can be wired directly to a 3-pin header on the RC since they run on 5V and return an analog voltage that varies with distance. They are really easy to use. The only source I’ve seen for the JST connectors is www.acroname.com. Digi-Key sells the sensors as well.
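
One wrinkle before writing code for the Sharp units: their output voltage actually rises as the object gets closer, and the curve is far from linear, so most people either calibrate a small lookup table or fit a roughly inverse curve to their own measurements. A minimal sketch of the inverse-curve approach, with a completely made-up calibration constant, might look like this:

```c
/* Rough distance estimate for a Sharp IR rangefinder read on a 10-bit ADC.
 * The response is approximately inverse (higher voltage = closer object),
 * so distance is modeled as K / counts.  SHARP_K is an invented placeholder:
 * measure a few known distances with your own sensor and solve for it.
 */
#define SHARP_K     12000L
#define NOISE_FLOOR    10u   /* below this, assume nothing is in range */

long sharp_inches(unsigned int adc_counts)
{
    if (adc_counts < NOISE_FLOOR)
        return -1;           /* out of range / no valid reading */
    return SHARP_K / adc_counts;
}
```

The wiring is the same single analog input either way; only the conversion math changes.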

The Sharp infrared proximity sensor, part number GP2Y0A02YK, measures distances between 8" and 60". It’s still half the range of the ultrasonic sensor, but not terrible.

Hmmm… I never knew they were being sold at Digi-Key. Personally, I’m starting to have nightmares of robots with these sensors going crazy now.

We are currently trying to resolve a small problem with an alpha particle x-ray spectrometer on our robot that is being used as an in-situ sensor to determine the elemental composition of targeted compounds and other materials under investigation. Prior to full operational use, the spectrometer was stored in a cold soak, reduced pressure environment, with the device powered on to monitor system health and emission source strength, for approximately seven months. For the past seven months, the sensor has been used on an intermittent basis, and returned to an environment similar to the long-term storage conditions for at least half of the operational period. Total device duty cycle is less than five percent. Data sets from this device are fused with data from a vertically-mounted mini thermal emissivity spectrometer, to provide additional discrimination information for environmental characterization and identification of investigated materials.

During recent utilization of the spectrometer the device attempted to execute a number of standard operations, with anomalous results. When the sensor payload software attempted to turn on the payload FPGA, the FPGA failed to power on before the FPGA Power On Timer timed out. Subsequently, the payload software issued a MTES image command. The MTES image command failed due to an MTES interface timeout. Further MTES image commands failed similarly, and generated several more instances of the FPGA Power On Timer timeout EVR. A fault EVR was then generated due to the failure of the PMA azimuth actuator to move. However, since there is a hardware level handshake between MTES and PMA, this is likely to be a secondary failure due to the failure of the FPGA to power on during the original attempt. Initial analysis of the anomaly indicates it is likely due to an insufficient interval between the MTES_ABORT_IMAGE command from e3383 and the subsequent MTES_IMAGE command from p3576. If the timing is “just right”, the pyld software can be put into a state such that all subsequent attempts to power the FPGA will fail. Only a reboot clears the condition. We are currently attempting to recreate the anomaly on the spare backup robot to better understand the condition. In the interim, use of the APXS and other spectrometers, as well as the PMA, is precluded until the anomaly is resolved.

Since you said we can “ask you anything about sensors,” we would welcome any advice you might offer…

-dave

(sorry, but I just couldn’t resist when you left yourself open with a line like that… :) )

C’mon Dave, clearly you ran your robot into a wall, stalled the motors, and burned a CIM… Anyone could tell you that…

BTW, is/was this a problem with one of the Mars rovers?

Oh man, that’s an easy one… I just… don’t wanna answer right now, yeah, yeah, that’s the ticket :P

During recent utilization of the spectrometer the device attempted to execute a number of standard operations, with anomalous results. When the sensor payload software attempted to turn on the payload FPGA, the FPGA failed to power on before the FPGA Power On Timer timed out. Subsequently, the payload software issued a MTES image command. The MTES image command failed due to an MTES interface timeout. Further MTES image commands failed similarly, and generated several more instances of the FPGA Power On Timer timeout EVR. A fault EVR was then generated due to the failure of the PMA azimuth actuator to move. However, since there is a hardware level handshake between MTES and PMA, this is likely to be a secondary failure due to the failure of the FPGA to power on during the original attempt. Initial analysis of the anomaly indicates it is likely due to an insufficient interval between the MTES_ABORT_IMAGE command from e3383 and the subsequent MTES_IMAGE command from p3576. If the timing is “just right”, the pyld software can be put into a state such that all subsequent attempts to power the FPGA will fail. Only a reboot clears the condition. We are currently attempting to recreate the anomaly on the spare backup robot to better understand the condition. In the interim, use of the APXS and other spectrometers, as well as the PMA, is precluded until the anomaly is resolved.

Give me five years and I may be able to answer that question. All I know now is that you used a lot of acronyms. So many acronyms. Maybe you should whack it a few times with a hammer. That always seems to fix our robots. By the way, I was referring to sensors that you would find on FIRST robots, nothing you would find in outer space.