Python 3 for the FIRST Robotics Competition (FRC)

pynetworktables NT3 rewrite now available for testing!

pynetworktables has been rewritten in the style of ntcore, and now fully supports all of the NT3 features available in ntcore. For the most part, it should all work. There are a few breaking changes I can think of:

  • Connection listeners are different. Sorry.
  • The special array types are gone (yay) and so is the networktables2 package
  • It’s easier to make client connections (though the old way still works)
  • … and that’s about it
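As a rough sketch of what the simpler client connection looks like (the server address is a placeholder for your robot's IP or mDNS name, and the import is guarded here since this is only illustrative):

```python
# Hedged sketch: connecting as an NT3 client with the rewritten API.
# "10.0.0.2" is a placeholder address -- use your robot's IP or mDNS name.
# The import is guarded so this sketch degrades gracefully where
# pynetworktables is not installed.
try:
    from networktables import NetworkTables
except ImportError:  # pynetworktables not installed
    NetworkTables = None


def connect(server="10.0.0.2"):
    """Start a NetworkTables client and return the SmartDashboard table."""
    if NetworkTables is None:
        return None
    NetworkTables.initialize(server=server)
    return NetworkTables.getTable("SmartDashboard")
```

Compare that to the old networktables2-style setup, where you had to flip the client/server mode and set the address in separate calls before retrieving a table.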

I haven’t had the opportunity to try this on a real robot yet, BUT the unit tests have 75% coverage and it works on my machine, so it’s probably good to go if you’re using this on a driver station or coprocessor. Try it out, let me know how it works!

Installation is super easy if you already have python and pip installed:

pip install --pre pynetworktables

Also, if you’re using pynetworktables2js, there’s an alpha release of that available too, which accommodates some of the NT3 changes. However, more work needs to be done to fully support all of the NT3 features in pynetworktables2js.

mjpg-streamer for RoboRIO 2016.3.0 released with OpenCV input plugin

I’m happy to announce the release of an OpenCV input plugin for mjpg-streamer, which allows you to write simple little filter plugins that can process the image from a webcam, and change what is streamed out via HTTP. You can install the mjpg-streamer-cv or mjpg-streamer-py packages using the instructions on our github repo. Here’s an example filter plugin:

import cv2
import numpy as np


class MyFilter:

    def process(self, img):
        """
        :param img: A numpy array representing the input image
        :returns: A numpy array to send to the mjpg-streamer
                  output plugin
        """
        # silly routine that overlays a really large crosshair over the image
        h, w = img.shape[:2]
        w2 = w // 2
        h2 = h // 2
        cv2.line(img, (w // 4, h2), (3 * w // 4, h2), (0xff, 0, 0), thickness=3)
        cv2.line(img, (w2, h // 4), (w2, 3 * h // 4), (0xff, 0, 0), thickness=3)
        return img


def init_filter():
    """
    This function is called after the filter module is imported.
    It MUST return a callable object (such as a function or
    bound method).
    """
    f = MyFilter()
    return f.process

If you scp'd this to the RoboRIO, you could run it with the following command line:

mjpg_streamer -i ' -r 320x240 --fps 15 --quality 30 --filter /usr/local/lib/mjpg-streamer/ --fargs /home/admin/'

Our team used the OpenCV plugin on our robot this weekend, with a python script doing image processing and NetworkTables operations (Lifecam 3000, 320x240, 15fps, quality 30), and CPU usage was around 20%. Not too shabby. In theory you could use this on an RPi or other platform too, as I've pushed the changes (plus some significant build system improvements) upstream to mjpg-streamer.
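As an aside, filter logic like the crosshair above is easy to unit-test off-robot. Here's a sketch of the same overlay in pure numpy (no cv2 or mjpg-streamer required; the function name and bar geometry are mine, not part of the plugin API):

```python
# Hedged sketch: the same crosshair overlay as the plugin example, but in
# pure numpy, handy for unit-testing filter logic without cv2 installed.
import numpy as np


def crosshair(img, color=(255, 0, 0), thickness=3):
    """Overlay a centered crosshair spanning the middle half of the frame."""
    h, w = img.shape[:2]
    h2, w2, t = h // 2, w // 2, thickness // 2
    img[h2 - t:h2 + t + 1, w // 4:3 * w // 4] = color  # horizontal bar
    img[h // 4:3 * h // 4, w2 - t:w2 + t + 1] = color  # vertical bar
    return img


frame = np.zeros((240, 320, 3), dtype=np.uint8)  # blank 320x240 BGR frame
out = crosshair(frame)
```

Since the filter contract is just "numpy array in, numpy array out", anything you can verify this way will behave the same inside mjpg-streamer.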

RobotPy 2016.2.0, PyFRC 2016.2.3 released

RobotPy WPILib 2016.2.0 now has full CANTalon support including enhanced sensor support in simulation, the new motion profiling stuff that was introduced for 2016, and a bunch of new setter functions and other random status things. The simulation hal_data structures have been updated as well, which may break your tests. However, the new API should be easier to use and more consistent.

Additionally, PyFRC 2016.2.3 has been released, with a useful new feature that allows you to select autonomous mode via NetworkTables if you’re using the AutonomousModeSelector object to select autonomous modes (used in the Magicbot framework too). Check out this screenshot:

PyFRC Screenshot

RobotPy releases can be downloaded from our github releases page, and pyfrc can be upgraded using pip.

NavX device support in RobotPy 2016.1.1 released

This is a bugfix release of RobotPy, and we recommend that all RobotPy users upgrade, particularly owners of NavX devices or anyone using the PIDController object.

If you want to see the NavX stuff in action, one of the NavX samples I ported over shows a robot rotating to a specific angle on a button press, and it works in simulation (not yet tested on a real robot). Very cool demo. Note that you will need the latest version of pyfrc installed as well.

For NavX device python documentation, see our readthedocs site.

  • Fixed a crash in the PIDController object
  • New version of robotpy-wpilib-utilities 2016.2.0 includes support for the NavX MXP

RobotPy releases can be downloaded from our github releases page.

mjpg-streamer for RoboRIO available in opkg feed

If you're just trying to view the images from a camera, this can be a great solution. The stream is embeddable in a webpage, so it integrates well with a pynetworktables2js dashboard. The package includes an init script that automatically starts mjpg-streamer when the robot boots, and the settings are easy to edit too.
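If you want to pull frames from the stream programmatically (say, from a dashboard script), here's a sketch using only the standard library. The host is a placeholder, and `grab_frame` is my own helper name, not part of mjpg-streamer; the `?action=snapshot` and `?action=stream` endpoints come from mjpg-streamer's HTTP output plugin:

```python
# Hedged sketch: fetch a single JPEG frame from a running mjpg-streamer
# instance. mjpg-streamer's HTTP output plugin serves single frames at
# ?action=snapshot and the multipart stream at ?action=stream.
from urllib.request import urlopen

SNAPSHOT_URL = "http://roborio.local:8080/?action=snapshot"  # placeholder host


def grab_frame(url=SNAPSHOT_URL, timeout=2.0):
    """Return raw JPEG bytes from the snapshot endpoint, or None on error."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return resp.read()
    except OSError:  # covers URLError, timeouts, connection refused
        return None
```

The bytes that come back are a complete JPEG, so you can hand them straight to an image decoder or write them to disk.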