SRX Magnetic Encoder and desktop simulation

Has anyone integrated SRX Magnetic encoders into the new simulation capability? The EncoderSim class uses a lot of hardware-level APIs to manipulate the Encoder during simulation, and the SRX encoder is pretty incompatible with all of that. I thought I’d see if anyone else has worked on this problem before I dive into it.

SRX encoders and Talon controllers are pretty common but don’t appear to be well supported in simulation…

Simulation support is up to vendors to implement. CTRE only has basic support for simulation in terms of motor get/set. I don’t believe they support encoders yet. I’d recommend sending them an email or request to implement this feature.

We’ve actually been working on additional sim support for a while this year.

I’m happy to report we’ve just released an update that adds alpha support for simulation for Talon SRX and Victor SPX.

Note this does require WPILib 2021 Beta 4 to run. There are limitations, but we’ll be improving and adding features up to and after kickoff.

This includes a new API collection for setting what would normally be “Physics” inputs (like sensors). These functions are inside a class called ‘SimCollection’, which is used in the same way as our traditional ‘SensorCollection’.

So if you want to set your quadrature input in simulation, you would call the following:
TalonSRX.getSimCollection().setQuadraturePosition(int pos).

Also note you’ll need to call our FeedEnable() routine to enable your simulated motor controllers.
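Putting the two calls above together, here is a minimal, hypothetical Java sketch of how a physics model might feed a quadrature position into a simulated Talon each loop. The tick-conversion constants (kTicksPerRev, kWheelCircumferenceMeters) are made up for illustration, and the Phoenix calls are shown in comments exactly as described in this post (I believe the Java entry point for FeedEnable is Unmanaged.feedEnable(), but check the release notes):

```java
// Sketch of feeding a simulated quadrature position to a Talon SRX.
// The Phoenix calls are commented out so this compiles standalone;
// the constants below are illustrative, not official values.
public class TalonSimSketch {
  static final int kTicksPerRev = 4096;                // SRX Mag Encoder resolution
  static final double kWheelCircumferenceMeters = 0.48; // hypothetical wheel

  // Convert a simulated wheel distance into raw quadrature ticks.
  static int distanceToTicks(double meters) {
    return (int) (meters / kWheelCircumferenceMeters * kTicksPerRev);
  }

  public static void main(String[] args) {
    double simulatedDistance = 0.96; // would come from your physics model
    int ticks = distanceToTicks(simulatedDistance);
    System.out.println(ticks);
    // In simulationPeriodic() you would then push this into the Talon:
    //   com.ctre.phoenix.unmanaged.Unmanaged.feedEnable(20); // stay enabled
    //   talon.getSimCollection().setQuadraturePosition(ticks);
  }
}
```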

We’ll be updating examples and documentation soon with more information.


How soon will we see some doc? Too many questions here, don’t really want to poke around in the dark with the pre-release…

GitHub examples

I believe our release notes already mention that examples/documentation is coming.
For now I recommend looking at our GitHub examples which incorporate simulation. They are on a branch for now.

Simple C++

Below is a simple C++ example of how Phoenix-simulated devices are robot-enabled by adding a SimulationProcess call. Note you have to call FeedEnable() to “enable” the motor controllers. This may change in the future. Other than that, there isn’t any work needed to talk to our simulated devices. They also show up in Tuner if you enter “localhost” for the Server Address.

Java + basic physics sim

More elaborate Java example with a basic Physics Simulation.


The new sim collection object allows you to set anything that is sim-able like voltage, velocity, etc.


Quick Video

Playing around with WPI sim-gui and Tuner. Note the Tuner “Simulated Device” under “CAN Devices”.


I’ve been doing some work getting WPILib code to run against simulations in Unity rather than the GUI. Any thoughts on whether something like this would fit? The Unity physics aren’t too bad and I can already get encoders working off wheel RPMs.


For interfacing with external simulators like Unity, we recommend using the Hardware WebSockets API. Most things on the GUI are accessible via this path, including everything under the “Other Devices” window.
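For reference, the messages on that path are small JSON objects. This is a rough sketch based on the WPILib WebSockets spec at the time (exact field names may change between betas): a “>” prefix marks a value flowing into robot code (something the external simulator sets), “<” marks a value flowing out of robot code, and “<>” is bidirectional. For example, an external simulator pushing an encoder count might send something like:

```json
{ "type": "Encoder", "device": "0", "data": { ">count": 2048 } }
```

and would read robot outputs (e.g. a PWM speed) from messages with “<”-prefixed fields going the other direction.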


Thanks Peter. I had been looking for documentation on how to use the new WebSocket sim but hadn’t been able to find it. I’ve converted my Unity interface over to WebSockets and it seems to be working. I noticed that the WebSocket simulation doesn’t support PCM/Solenoids. Is that something that will be coming? It is supported in the GUI.

Oh, BTW - the documentation doesn’t call out analog outputs (AO) but it seems they are supported in the WebSocket interface.

Kind regards,



Yes, we are working on adding solenoids. We tried to keep the spec fairly minimal because LabVIEW is adding support for it as well, and we didn’t want to over-specify. Analog Outputs are little used by teams.


Can we get any insight into what is going on under the hood? From my experimentation, I’m guessing you have some sort of update thread so you can simulate your 10 kHz update rate for PID, and that timeouts will occur if you don’t call the feed function.

If this is correct, this creates some determinism issues. Pausing/stepping the time from the HAL GUI, or debugging with breakpoints will cause timeouts and the reported values from the motor controllers stop functioning as expected. This also makes it hard/annoying to write unit tests, as sleeps have to be added in between your mocked robot loops.

I’ve also seen, even with the most rudimentary java unit test, that something inside of the library causes a crash. The unit test reports that it passed, but gradle (and the hal gui) have a non-zero exit code. I couldn’t reproduce it with C++, so I’m guessing something is going wrong with the JNI unloading.

Having said all that, those are relatively minor complaints and I don’t want you to get sidetracked fixing these low reward problems. Overall, this feature is sick, and I’m looking forward to using this over the pretty hacky stuff I’ve done before.

Will you be producing documentation that describes the WebSockets interface for Talon devices? I can see some of what is available by looking at the JSON output from the WebSocket server in the robot code when I include a Talon SRX in my code (e.g. type=SimDevices, device=“Talon SRX[1]”, fields “<percentOutput”, “<>velocity”, etc.), but I’m not quite sure how they fit into the correct behavior, and I would also need the data expected from the other direction. Let me know if this can be extracted from the code instead of having to wait for documentation.

The fields on WebSockets are identical to those in the GUI “Other Devices” window, both in value and behavior. They come from the same place. The only difference is the “<”, “>”, “<>” prefixes are not shown in the GUI. Heads up that there may be some changes in this between beta 4 and the final kickoff release.
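To make that concrete, using the field names quoted above (these are observed beta names, not documented ones), a message pushing a sensor velocity into the simulated Talon might look like:

```json
{
  "type": "SimDevices",
  "device": "Talon SRX[1]",
  "data": { "<>velocity": 1024.0 }
}
```

while a field like “<percentOutput” would only ever flow the other way, from robot code out to the external simulator.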


Thanks Peter. Can you give any insight into the names that will be used for type and device? Will the type include the PCM number? Or will that be part of the device name?

Will Solenoids be in the kick-off release?

Are you asking about Solenoids in particular? We’ve not settled on it yet, but I think we’ll end up with Solenoid as the device type with the ID containing both the PCM number and solenoid number. When we merge it we’ll update the API documentation too. Yes, I’d like to get it in the kickoff release.

Regarding other SimDevices, as described in the spec, vendors can set the device type by putting a “:” in the SimDevice name. The portion before the “:” is not shown in the GUI (I may add an option to do that for debugging purposes though).
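The naming convention described above can be sketched as a simple string split. The example name here (“CANMotor:Talon SRX[1]”) is hypothetical; the point is only that the portion before the first “:” becomes the device type on WebSockets, while the remainder is what the GUI displays:

```java
// Demo of the vendor SimDevice naming convention: "type:displayed name".
public class SimDeviceNameDemo {
  // Split a vendor SimDevice name on the first ':' into (type, displayed name).
  static String[] splitDeviceName(String name) {
    int colon = name.indexOf(':');
    return new String[] { name.substring(0, colon), name.substring(colon + 1) };
  }

  public static void main(String[] args) {
    // Hypothetical vendor name: "CANMotor" is the WebSockets device type,
    // "Talon SRX[1]" is what would show in the GUI.
    String[] parts = splitDeviceName("CANMotor:Talon SRX[1]");
    System.out.println(parts[0] + " / " + parts[1]);
  }
}
```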


timeouts will occur if you don’t call the feed function.

The FeedEnable function is used to signal Phoenix to allow enabled operation. This was originally designed for non-FRC applications, so it might change in the future for sim builds. Also, you can pass the time period to stay enabled, so if you pass 9999999 ms, it will stay enabled for practically your entire session.

I’ve also seen, even with the most rudimentary java unit test, that something inside of the library causes a crash.

Please forward your crash case to


Peter, thanks for pushing the PCM/Solenoid sim out so quickly! We definitely need this for our bot! My initial results integrating with WebSockets in Unity show it to be working fine.

On a slightly different topic, is there a chance there is still an off-by-one bug with buttons? I see you made a change and that looks good, but button one seems to trigger code written to handle button 2. For example, the following code fires when button 1 is pressed:
intake = new IntakeSubsystem();
intakeButton = new JoystickButton(controller, 2);
intakeButton.whenPressed(() -> {
  intakeOpen = !intakeOpen;
  if (intakeOpen) {
    // (body elided in the original post)
  } else {
    // (body elided in the original post)
  }
});

I’ve added some debug code to HALSimWSProviderJoystick::OnNetValueChanged to see what is being received, and it shows button 1 is correctly being received. Here is the code with the debug print statements:
if ((it = json.find(">buttons")) != json.end()) {
  HAL_JoystickButtons buttons{};
  buttons.count =
      std::min(it.value().size(), static_cast<wpi::json::size_type>(32));
  for (int i = 0; i < buttons.count; i++) {
    if (it.value()[i]) {
      buttons.buttons |= 1 << i;
      wpi::outs() << "joy stick button " << i << " pressed\n";
    }
  }
  if (buttons.buttons != 0) {
    wpi::outs() << "joy stick buttons " << buttons.buttons << "\n";
  }
  HALSIM_SetJoystickButtons(m_channel, &buttons);
}

The debug output looks like this:
joy stick button 1 pressed
joy stick buttons 2

Let me know if there is any other data I can get to help debug.

This is a little off topic of the thread, so let me know if you want me to start up a new one.

Happy New Year!


Buttons are 1-indexed in user code. When we map that to an array in WebSockets, the WebSockets array is 0-indexed, so user-code button N shows up at array index N-1.
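That off-by-one between the two numbering schemes is easy to demonstrate in isolation. The helper below is purely illustrative (not part of WPILib): it computes the HAL bitmask bit for a user-facing button number, which matches the debug output above where WebSockets array index 1 produced mask 2, i.e. user button 2:

```java
// Demo of the two button numbering schemes: user code is 1-indexed,
// the WebSockets ">buttons" array (and the HAL bitmask) is 0-indexed.
public class ButtonIndexDemo {
  // Bitmask bit for a user-facing (1-indexed) button number.
  static int maskForButton(int userButton) {
    return 1 << (userButton - 1);
  }

  public static void main(String[] args) {
    // User button 2 lives at WebSockets array index 1, so its mask is 0b10.
    System.out.println(maskForButton(2));
  }
}
```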


Thanks. Sorry I didn’t see that.


I’ve been able to receive the values coming out of the WebSockets interface and can see those values in my code running in unity. When I try to push data in from Unity rather than have everything computed in PhysicsSim I encounter an interesting behavior where each frame, several zero values are output from the sim before my value appears.

From my code in Unity, I echo the JSON data received from the robot code and the data I send from Unity. As a simple test, I send a specific value from Unity for supply current (Talon SRX[1], <>supplyCurrent) and look for it to be echoed by the robot code. I see several frames of output from the robot code where supplyCurrent is 0.0, then finally my set value, then again several frames of 0.0 followed by a single frame of my value. I would expect that once I send in a value, it would continue to be echoed until a new value is provided. Is that not correct? I’ve also tried to send in /Encoder <>position and see the same behavior. Other information of interest: 1) my PhysicsSim does not have any code that sets the supply current; 2) the rate of JSON data transmitted from Unity is less than the period of the robot code; 3) I only have one Talon SRX instantiated within my robot code.

Thoughts? Let me know if you have questions or would like other data.