Chameleon Vision
Apr 28, 2020
Feature overview:
1 Advantages
   1.1 Free
   1.2 Flexible
   1.3 Easy To Use
   1.4 Efficient
   1.5 Fast
2 Features
   2.1 Filtering
   2.2 Offset Calibration
   2.3 Wide Range of Supported Hardware
   2.4 Easy Customization
   2.5 Driver Mode
3 Supported Hardware
   3.1 Coprocessors
   3.2 Cameras
4 Coprocessor Setup
   4.1 Raspberry Pi
5 PC Testing
   5.1 JDK Installation
   5.2 Installing Chameleon Vision
6 Auto Startup And Upgrade
   6.1 Auto Startup
   6.2 Version Upgrade
7 User Interface
   7.1 Opening the UI
   7.2 Getting Familiar with the UI
   7.3 Saving changes
8 Command-Line Arguments
9 Building a 2D Pipeline
   9.1 Step 0: Settings
   9.2 Step 1: Input
   9.3 Step 2: Threshold
   9.4 Step 3: Contours
   9.5 Step 4: Output
10 Robot Code Example
   10.1 NetworkTables
   10.2 Vision Alignment
11 Command Line Information
   11.1 Webserver Port
   11.2 Platform
   11.3 Camera Initialization
   11.4 Network Management
   11.5 NetworkTable State
12 NetworkTables
13 Timestamp
14 SolvePNP
15 Contact Us
   15.1 Bug reports / feature requests
   15.2 Other information
   15.3 Source Code
16 Authors
17 Acknowledgments
18 License
Chameleon Vision is free, open-source software for FRC teams to use for vision processing on their robots.
CHAPTER 1
Advantages
Why should you choose Chameleon Vision?
1.1 Free
Chameleon Vision costs nothing to use, saving teams hundreds of dollars when compared to existing solutions.
1.2 Flexible
Chameleon Vision can be run on any Java-compatible device, including:
• Raspberry Pi 3 and 4
• Most Linux and Windows Desktops
• Jetson Nano and TX2
• and more
This flexibility allows Chameleon Vision to be set up in under an hour for just USD$50 using a Raspberry Pi and a USB camera. Compared to existing solutions, which can cost upwards of USD$300 or take weeks to set up, Chameleon Vision is revolutionising the FRC vision processing field. To learn more about setting up compatible devices, see Supported Hardware.
1.3 Easy To Use
Chameleon Vision contains a modern web interface, which allows any team, no matter the experience level, to tune their vision processing. All the available options can be configured using easy-to-use sliders and buttons.
1.4 Efficient
Boasting lower latency and higher frames per second than similar solutions, Chameleon Vision can improve field performance, make life easier, and keep you one step ahead of the competition.
1.5 Fast
Chameleon Vision is able to reach over 200 fps, outperforming other solutions.
CHAPTER 2
Features
2.1 Filtering
Filter out your target from the background with HSV (Hue, Saturation, Value) thresholds, and then fine-tune using area, width, height, and ratio settings. Using advanced multi-target intersection filtering, you can filter even further.
2.2 Offset Calibration
Oftentimes, it is difficult to place the camera in the direct center of the robot. Chameleon Vision contains special tools to correct for this offset positioning.
2.3 Wide Range of Supported Hardware
Chameleon Vision supports a wide range of both coprocessors and cameras. This allows for the maximum amount of flexibility in both performance and cost.
For optimal performance, the Chameleon Vision team recommends the Raspberry Pi 4 and the PS3 Eye (available via online retailers).
Warning: The Pi Camera Module and all Network Cameras are NOT yet supported.
2.4 Easy Customization
Each camera connected to Chameleon Vision can have individual resolution, FPS, brightness, and exposure settings. The camera views are visible from the Driver Station and can be sent at lower resolutions to conserve bandwidth. This customization allows you to ensure that Chameleon Vision works how you want it to.
2.5 Driver Mode
Cameras can also be passed through to the Driver Station using the “Driver Mode” setting. This allows the driver to take advantage of Chameleon Vision's speed in-game.
CHAPTER 3
Supported Hardware
Hardware listed on this page has been tested by Chameleon Vision developers and is more likely to work out of the box.
3.1 Coprocessors
These are all fully compatible with Chameleon Vision:
• Raspberry Pi 4 (Recommended)
• Raspberry Pi 3 and 3+
• Jetson Nano and TX2
Any Java-compatible 64-bit Windows or Linux device should be able to run it natively.
3.2 Cameras
These cameras are 100% compatible with Chameleon Vision. Using one of these cameras is highly recommended for the best experience.
• PS3 Eyecam (Linux only)
• Microsoft LifeCam HD-3000
• Pi Camera Module (all versions)
These cameras are mostly compatible, but come with no guarantees as to full compatibility.
• Microsoft LifeCam Cinema
• Microsoft LifeCam VX-5500
• Logitech C310
Warning: These cameras are NOT yet compatible with Chameleon Vision. Please contact us if you wish to use one of these cameras.
• Network Cameras (such as the Axis Camera)
CHAPTER 4
Coprocessor Setup
Note: If testing on a separate device, please follow the instructions in PC Testing.
4.1 Raspberry Pi
Chameleon Vision can run on most operating systems available for the Raspberry Pi. However, it is recommended that you install Raspbian Buster Lite, available here. Follow the instructions to install Raspbian onto an SD card.
Ensure that the Raspberry Pi is connected via Ethernet to the Internet. Log in to the Raspberry Pi (username pi and password raspberry) and run the following commands in the terminal:
$ wget https://git.io/JeDUk -O install.sh
$ chmod +x install.sh
$ sudo ./install.sh
$ sudo reboot now
Congratulations! Your Raspberry Pi is now set up to run Chameleon Vision!
Once the Raspberry Pi has rebooted, Chameleon Vision can be started with the following command:
$ sudo java -jar chameleon-vision.jar
When a new version of Chameleon Vision is released, update it by running the following commands:
$ wget https://git.io/JeDUL -O update.sh
$ chmod +x update.sh
$ sudo ./update.sh
Note: Instructions on starting Chameleon Vision automatically on boot are coming soon.
CHAPTER 5
PC Testing
5.1 JDK Installation
Chameleon Vision requires JDK version 12 or newer. To check whether you are running a compatible version, run java --version. You should see a result similar to the one below:
$ java --version
java version "12.0.2" 2019-07-07
Java(TM) SE Runtime Environment (build 12.0.2+10)
Java HotSpot(TM) 64-bit Server VM (build 12.0.2+10, mixed mode, sharing)
If your version is older than JDK 12, follow the instructions below.
5.1.1 Installing JDK 12
Follow the relevant instructions for your platform here.
5.2 Installing Chameleon Vision
Chameleon Vision is available for download from the GitHub repository. The file should be named chameleon-vision-X.X-RELEASE.jar, varying with the current version number.
Then, open a terminal in the folder with the downloaded JAR file and run the following command, replacing X.X with the latest version number.
$ java -jar chameleon-vision-X.X-RELEASE.jar --unmanage-network
Warning: Take care to use the --unmanage-network argument; without it, Chameleon Vision will change your network setup.
CHAPTER 6
Auto Startup And Upgrade
6.1 Auto Startup
Using the Chameleon Vision UI, you can set the software to run at startup with just one click. Under General Settings you'll see an option called Install or Update; clicking the Install current version button will make Chameleon Vision run at startup.
6.2 Version Upgrade
With a new JAR file downloaded to your local machine, click on the file input labelled Choose a newer version and select the newer file. Once selected, the button will change to Install And Update; once clicked, it will transfer the new version to the remote machine and make it run at startup. After that, all that is left is to restart, and you're done.
CHAPTER 7
User Interface
7.1 Opening the UI
To connect to the Chameleon Vision UI, enter the web address <coprocessor-IP>:<port> in your browser. The port defaults to 5800, and the coprocessor's IP address will be in the form 10.TE.AM.XX. If you are testing on a desktop computer, you can use localhost in place of the IP address. For example, with an IP address of 10.15.77.13 and a port of 5800, the web address will be 10.15.77.13:5800.
7.2 Getting Familiar with the UI
The user interface has two main tabs; you can switch between them in the top-right corner:
• Vision: Allows teams to configure pipeline settings with a live preview.
• Settings: For configuring networking and camera settings.
7.2.1 Vision Tab
The Vision tab displays the processed image in real-time to help teams tune and adjust the pipeline setup.
There are five steps to configuring a vision pipeline:
1. Camera and Pipeline Selection
2. Input
3. Threshold
4. Contours
5. Output
Camera and Pipeline Selection
On the left side there is a dropdown labelled “Camera”. This list contains all detected cameras. The pen icon can be used to rename the camera.
On the right side there is a dropdown labelled “Pipeline”. These pipelines contain different settings, and they can be easily switched between. Any edited configurations apply only to the selected pipeline.
Input
The input tab allows the adjustment of camera exposure, brightness, and orientation:
Threshold
The thresholding tab allows teams to adjust the Hue/Saturation/Value (HSV) thresholds. This allows only the parts of the image within the thresholds to be detected, such as when configured for vision tape. Teams can also enable erode and dilate to eliminate speckles.
For a more in-depth explanation of erode and dilate, visit OpenCV's page.
Contours
The contours tab has sliders which constrain the contours that can be considered for sorting. Teams can adjust the minimum or maximum area, aspect ratio (the ratio of width to height of the bounding rectangle of the object), or extent (the ratio of contour area to bounding rectangle area). This tab also allows teams to select only one target or to group two together. Another filtering option is speckle rejection, which ignores small contours (“speckles”) compared to the largest contour seen.
Output
The output tab controls how the contours which make it through thresholding and filtering are sent as the target. Teams can sort contours by leftmost/rightmost/topmost/bottommost, largest, smallest, or closest to the crosshair (centermost).
This tab also allows teams to perform crosshair calibration. Instead of offsetting values in code, teams can line up their robot perfectly by hand, click “calibrate A” and “calibrate B”, and the crosshair will be set to the current position. If the robot needs to shoot gamepieces into a goal from different distances, teams can calibrate A at their closest scoring position and B at their furthest scoring location, and the crosshair will linearly interpolate between the two offsets based on distance (area) from the target.
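As an illustration of how such an interpolation between two calibrated points works (a plain-Java sketch with made-up numbers, not Chameleon Vision's actual implementation):

```java
public class CrosshairInterpolation {
    // Linearly interpolates a crosshair offset between two calibrated points
    // based on the current target area. Point A was calibrated at the closest
    // scoring position (largest area), point B at the furthest (smallest area).
    static double interpolateOffset(double area,
                                    double areaA, double offsetA,
                                    double areaB, double offsetB) {
        double t = (area - areaA) / (areaB - areaA); // 0 at point A, 1 at point B
        return offsetA + t * (offsetB - offsetA);
    }

    public static void main(String[] args) {
        // Halfway between the two calibrated areas gives the midpoint offset
        System.out.println(interpolateOffset(6.0, 10.0, 2.0, 2.0, 6.0)); // 4.0
    }
}
```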
3D
The 3D tab is used for SolvePNP. This is an advanced feature which is not needed for 2D pipelines.
Image / Binary Image
On the right of the Vision tab you will see the camera's image; this is the image being published. You can also choose Threshold to see a binary image of the threshold filtering (HSV, erode & dilate). A white pixel passed the threshold filtering, and a black pixel did not. You can also see the FPS, and the pitch and yaw of the target.
7.2.2 Settings tab
In the Settings tab you can change settings in a couple of categories:
General
Network settings and team number
Cameras
Resolution and FPS for each detected camera
Camera Adjustments
This tab contains the Driver Mode and 3D settings for each camera. Driver Mode disables the overlays on the streamed camera output, making it easier for the driver to see. In this tab you can also set the brightness and exposure for each detected camera.
Note: It might take a couple of seconds for the camera to switch its exposure settings, so toggling Driver Mode on or off can disrupt the vision processing or the driver's view for a few seconds.
You can also calibrate your 3D model as explained here.
7.3 Saving changes
After configuring and tuning your pipeline settings, the changes are saved automatically. Alternatively, they can be saved by pressing the Save button.
Note: On version 1.1.4 or older, changes are NOT saved automatically. They are only saved when the client closes its session (by closing the browser tab or refreshing the page).
CHAPTER 8
Command-Line Arguments
Chameleon Vision supports a few command-line arguments to enable or disable certain features, or to change operational parameters.
Table 1: Arguments

Argument             Parameters                     Description
--nt-server          None                           Enables the internal NetworkTables server.
--nt-client-server   IP address (i.e. 10.8.32.2)    Sets a manual IP address for the NT client to connect to.
--unmanage-network   None                           Disables the automatic network settings manager.
To use these arguments, add them after the run command. For example:
java -jar chameleon-vision.jar --nt-client-server 10.15.77.2
CHAPTER 9
Building a 2D Pipeline
After getting familiar with the UI, you can create your pipeline. We will go over all the vision tabs (except 3D) one by one.
9.1 Step 0: Settings
Before we start, make sure you have configured the settings correctly. Ensure that your:
• Team number is set correctly in Chameleon Vision UI
• Coprocessor has a static IP address (such as 10.TE.AM.11)
• Static IP is set within Chameleon Vision settings
It is also best to have the maximum FPS. This can generally be achieved by decreasing the resolution of the camera.
9.2 Step 1: Input
Adding a green LED ring around the camera ensures that retroreflective targets are bright and saturated, which makes them easier to see. Lowering the camera's exposure and brightness will darken the rest of the field and make it much easier to filter the bright vision targets.
Note: Not all USB Cameras support exposure settings.
9.3 Step 2: Threshold
In this step you will adjust the HSV threshold values.
Hue-Saturation-Value (HSV) is a way of digitally representing colours, similar to Red-Green-Blue (RGB). However, HSV encodes the colour in terms of its hue, making it easier to filter out a certain colour.
We want to create a filter to select only pixels that are “greenish”.
1. Select the threshold image to see the filter output. This makes it easier to see what we’re doing.
2. Adjust the sliders until just the tape is visible. If the rest of the image is too bright, try decreasing the camera exposure.
3. Widen the colour filters to give leeway, as lighting differs between fields.
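To see why hue makes this filtering robust, note that two green pixels of very different brightness share the same hue. Java's standard library can demonstrate this with its RGB-to-HSB conversion (HSB is closely related to HSV); the pixel values here are illustrative:

```java
import java.awt.Color;

public class HsvDemo {
    public static void main(String[] args) {
        // Two greens with very different brightness
        float[] bright = Color.RGBtoHSB(0, 255, 0, null);
        float[] dim    = Color.RGBtoHSB(0, 100, 0, null);

        // Hue is returned in [0, 1); multiply by 360 for degrees.
        // Both pixels share the same hue (~120 deg), so a single hue band catches both.
        System.out.println(Math.round(bright[0] * 360)); // 120
        System.out.println(Math.round(dim[0] * 360));    // 120
    }
}
```

A single hue range therefore catches the tape under varying brightness, which is exactly what an RGB filter struggles with.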
9.4 Step 3: Contours
Once the background has been filtered out, we can treat the remaining groups of white pixels as “contours”. Each contour has:
• area
• ratio between width and height
• height
• width
It is possible to filter out contours which do not meet specified criteria for the above properties. In 2016 the vision targets were a U shape, which had a different W/H ratio compared to the 2017 targets, which were rectangular.
Sometimes the vision targets come in pairs, such as in 2019, where both targets were on an angle. Chameleon Vision allows two contours to be merged into one contour.
The bounding rectangle of the contour will be drawn on the output.
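A minimal sketch of this kind of property-based filtering in plain Java (the Contour record and the thresholds are illustrative, not Chameleon Vision's internal code):

```java
import java.util.List;
import java.util.stream.Collectors;

public class ContourFilter {
    record Contour(double area, double width, double height) {
        double ratio() { return width / height; }
    }

    // Keeps only contours whose area and width/height ratio fall inside the given bounds
    static List<Contour> filter(List<Contour> contours,
                                double minArea, double maxArea,
                                double minRatio, double maxRatio) {
        return contours.stream()
            .filter(c -> c.area() >= minArea && c.area() <= maxArea)
            .filter(c -> c.ratio() >= minRatio && c.ratio() <= maxRatio)
            .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Contour> found = List.of(
            new Contour(500, 40, 10),  // wide and large: passes
            new Contour(8, 2, 2),      // tiny speckle: rejected by area
            new Contour(600, 10, 40)   // tall: rejected by ratio
        );
        List<Contour> kept = filter(found, 100, 1000, 2.0, 6.0);
        System.out.println(kept.size()); // 1
    }
}
```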
9.5 Step 4: Output
Depending on your camera's placement, you might want the final contour(s) to be the topmost, bottommost, leftmost, rightmost, or centermost.
Now that we have built the pipeline, we want to receive the results on the roboRIO. This information is sent via NetworkTables. See the example robot code for how to use it.
CHAPTER 10
Robot Code Example
This example code demonstrates the basics of NetworkTables and vision alignment.
10.1 NetworkTables
This code reads the yaw value to the target from NetworkTables and prints it. It also puts the camera into Driver Mode when the A button is pressed on the joystick.
import edu.wpi.first.wpilibj.Joystick;
import edu.wpi.first.wpilibj.TimedRobot;
import edu.wpi.first.networktables.NetworkTable;
import edu.wpi.first.networktables.NetworkTableEntry;
import edu.wpi.first.networktables.NetworkTableInstance;

public class NetworkTableExampleRobot extends TimedRobot {
    public Joystick joystick;
    public NetworkTableEntry yaw;
    public NetworkTableEntry isDriverMode;

    public void robotInit() {
        // Gets the joystick connected to port 1
        joystick = new Joystick(1);

        // Gets the default instance of NetworkTables
        NetworkTableInstance table = NetworkTableInstance.getDefault();

        // Gets the MyCamName table under the chameleon-vision table
        // MyCamName will vary depending on the name of your camera
        NetworkTable cameraTable = table.getTable("chameleon-vision").getSubTable("MyCamName");

        // Gets the yaw to the target from the cameraTable
        yaw = cameraTable.getEntry("yaw");

        // Gets the driver mode boolean from the cameraTable
        isDriverMode = cameraTable.getEntry("driver_mode");
    }

    public void teleopPeriodic() {
        // Prints the yaw to the target
        System.out.println(yaw.getDouble(0.0));

        // Sets driver mode to true if the A button is pressed
        isDriverMode.setBoolean(joystick.getRawButton(0));
    }
}
10.2 Vision Alignment
This is an example written in Java. It assumes an imaginary robot with a drivetrain consisting of four Spark motor controllers and a Joystick on port 1.
import edu.wpi.first.wpilibj.Joystick;
import edu.wpi.first.wpilibj.Spark;
import edu.wpi.first.wpilibj.SpeedControllerGroup;
import edu.wpi.first.wpilibj.TimedRobot;
import edu.wpi.first.wpilibj.drive.DifferentialDrive;
import edu.wpi.first.networktables.NetworkTable;
import edu.wpi.first.networktables.NetworkTableEntry;
import edu.wpi.first.networktables.NetworkTableInstance;

public class Robot extends TimedRobot {
    // Robot vars (not directly related to the vision system)
    public Joystick joystick;

    // This example uses 4 Spark motor controllers for the drivetrain, with 2 motors on each side
    Spark m_frontLeft;
    Spark m_rearLeft;
    SpeedControllerGroup m_left;
    Spark m_frontRight;
    Spark m_rearRight;
    SpeedControllerGroup m_right;
    DifferentialDrive m_drive;

    /*
    NetworkTables is a collection of databases stored on the roboRIO.
    The coprocessor can change data inside it.
    This database can be viewed in a program named OutlineViewer.
    "table" represents a single database called "MyCamName" under "chameleon-vision".
    */
    NetworkTable table;

    /*
    targetX represents the horizontal angle.
    targetY represents the vertical angle.
    */
    NetworkTableEntry targetX;
    NetworkTableEntry targetY;

    // Error values for the control loop
    double rotationError;
    double distanceError;

    /*
    Control loop constants.
    This example uses a proportional control loop with constant force.
    After you master proportional control you might want to try a PID control loop.
    */
    double KpRot = -0.1;
    double KpDist = -0.1;

    // A deadzone is necessary because the robot can only get so accurate and cannot be perfectly head-on to the target
    double angleTolerance = 5;    // Deadzone for the angle control loop
    double distanceTolerance = 5; // Deadzone for the distance control loop

    /*
    There is a minimum power that you need to give to the drivetrain in order to overcome friction.
    It helps the robot move and rotate at low speeds.
    */
    double constantForce = 0.05;

    /*
    rotationAdjust is the rotational signal for the drivetrain.
    distanceAdjust is the forward signal for the drivetrain.
    */
    double rotationAdjust;
    double distanceAdjust;

    // Initialization function
    public void robotInit() {
        // Initialization of robot drivetrain and joystick
        joystick = new Joystick(1);
        m_frontLeft = new Spark(1);
        m_rearLeft = new Spark(2);
        m_left = new SpeedControllerGroup(m_frontLeft, m_rearLeft);
        m_frontRight = new Spark(3);
        m_rearRight = new Spark(4);
        m_right = new SpeedControllerGroup(m_frontRight, m_rearRight);
        m_drive = new DifferentialDrive(m_left, m_right);

        // Points "table" to the NetworkTables database called "chameleon-vision"
        table = NetworkTableInstance.getDefault().getTable("chameleon-vision").getSubTable("MyCamName");

        // Points to the database values named "yaw" and "pitch"
        targetX = table.getEntry("yaw");
        targetY = table.getEntry("pitch");
    }

    // Periodic function
    public void teleopPeriodic() {
        rotationAdjust = 0;
        distanceAdjust = 0;
        if (joystick.getRawButton(0)) { // the "A" button
            /*
            Fetches the rotation and distance values from the vision coprocessor.
            Sets the value to 0.0 if the value doesn't exist in the database.
            */
            rotationError = targetX.getDouble(0.0);
            distanceError = targetY.getDouble(0.0);

            /*
            Proportional (to targetX) control loop for rotation.
            Deadzone of angleTolerance.
            Constant power is added in the direction the control loop wants to turn (to overcome friction).
            */
            if (rotationError > angleTolerance)
                rotationAdjust = KpRot * rotationError + constantForce;
            else if (rotationError < -angleTolerance)
                rotationAdjust = KpRot * rotationError - constantForce;

            /*
            Proportional (to targetY) control loop for distance.
            Deadzone of distanceTolerance.
            Constant power is added in the direction the control loop wants to drive (to overcome friction).
            */
            if (distanceError > distanceTolerance)
                distanceAdjust = KpDist * distanceError + constantForce;
            else if (distanceError < -distanceTolerance)
                distanceAdjust = KpDist * distanceError - constantForce;

            // Output the power signals to an arcade drivetrain
            m_drive.arcadeDrive(distanceAdjust, rotationAdjust);
        }
    }
}
CHAPTER 11
Command Line Information
When running Chameleon Vision, a lot of information is displayed in the output of the terminal. Most of this information is for debugging and is not needed.
Note: Some of the output is not generated by Chameleon Vision but by external libraries. Please ignore these lines as well.
11.1 Webserver Port
When the program initializes it will print a line similar to this:
Starting Webserver at port 5800
Most of the time it will be port 5800 but it can vary. This is what is used to connect to the UI.
11.2 Platform
It will also print a line similar to this:
Starting Chameleon Vision on platform Windows x64
If you receive a message such as Unknown Platform. OS: ... or an incorrect platform, please contact us.
11.3 Camera Initialization
For each camera detected by Chameleon Vision, the text below will be printed, followed by the initialization time.
CS: <camera-name>: Connecting to USB camera on <camera-path>
If your camera doesn't appear and hasn't been detected by Chameleon Vision, please contact us.
11.4 Network Management
On Linux devices, Chameleon Vision attempts to change the network adapter settings to enable the coprocessor to work on the robot. On Windows this is not supported, so there will be a warning message.
11.5 NetworkTable State
The program will print the message below if it was unable to connect to a NetworkTables server instance. The server IP address can be configured as specified in Command-Line Arguments.
NT Connection has failed!
CHAPTER 12
NetworkTables
NetworkTables is a communication protocol used between the roboRIO and other devices on the network. For more information, see the frc-docs article.
NetworkTables acts like a shared folder which is accessible by all connected devices. Every folder can have sub-folders, which can in turn contain files. These are called “tables”, “subtables”, and “entries”. Each entry has a name, a type, and a value.
A program called OutlineViewer, included with your WPILib installation, can be used to view the NetworkTables entries.
Chameleon Vision stores its entries in a table called chameleon-vision. Each camera will be in a subtable with its name, say, TurretCam. The TurretCam table will contain the pipeline information, detailed below.
• targetPitch (double) The vertical angle to the target in degrees.
• targetYaw (double) The horizontal angle to the target in degrees.
• targetArea (double) The percentage of the image covered by the target contour.
• targetPose (double array) The 3D target position as an array; values are ordered [x (meters), y (meters), angle (degrees)].
• targetFittedWidth (double) & targetFittedHeight (double) The width and height of the blue rectangle in pixels.
• targetBoundingWidth (double) & targetBoundingHeight (double) The width and height of the red rectangle in pixels.
• targetRotation (double) The angle of the blue rectangle in degrees
• pipeline (editable double) The index of the current pipeline Chameleon Vision should run. Change this to switch between pipelines.
• latency (double) The time it took the vision process to compute the image.
• driverMode (editable boolean) Setting this to true enables the Driver Mode.
• isValid (boolean) Whether or not a target was found.
• auxTargets (array) A JSON array of the 5 best targets; each target has its own array. The array values are ordered [pitch, yaw, area, bounding width, bounding height, fitted width, fitted height, target angle, target pose] and have the same types as the selected target's entries.
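As a small illustration of the targetPose ordering described above (plain Java, not WPILib code; the class name and sample values are made up):

```java
public class TargetPose {
    public final double xMeters, yMeters, angleDegrees;

    // Unpacks a targetPose double array, ordered [x (meters), y (meters), angle (degrees)]
    public TargetPose(double[] pose) {
        xMeters = pose[0];
        yMeters = pose[1];
        angleDegrees = pose[2];
    }

    public static void main(String[] args) {
        TargetPose p = new TargetPose(new double[] {1.5, -0.25, 12.0});
        System.out.println(p.xMeters + " " + p.yMeters + " " + p.angleDegrees);
    }
}
```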
CHAPTER 13
Timestamp
CHAPTER 14
SolvePNP
Chameleon Vision includes everything you need to get started with 3D pose reconstruction with the 2020 Infinite Recharge upper vision targets.
Before the 3D features are enabled, the camera must be calibrated. Do this from the Cameras tab, as illustrated in this informational video, using the provided chessboard image printed with your printer's “fit to page” mode and a pair of calipers or a ruler. This must be done for each resolution that SolvePNP is to be used with.
The reason we need to calibrate the camera first is that cameras distort images to a certain degree. The types of distortion are:
• Radial distortion
– Radial distortion causes straight lines to appear bent or curved, and is the most common type.
• Tangential distortion
– Tangential distortion results in objects appearing farther away or closer than they are in real life. This is due to the lens of the camera not being aligned with the plane of the image.
By finding how much radial or tangential distortion there is, we can undistort any image from the camera. In order to measure the amount of distortion, we can take pictures of an object with a known shape and see how much the camera distorts that image. We can then use that information to correct the distortion errors.
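As a concrete sketch of the radial part of the standard distortion model (plain Java; the coefficients are illustrative): a point at normalized radius r from the image center is scaled by (1 + k1·r² + k2·r⁴), so points near the edge are displaced far more than points near the center.

```java
public class RadialDistortion {
    // Applies the radial part of the standard distortion model to a normalized
    // image point (x, y): x_d = x * (1 + k1*r^2 + k2*r^4), likewise for y.
    static double[] distort(double x, double y, double k1, double k2) {
        double r2 = x * x + y * y;
        double factor = 1 + k1 * r2 + k2 * r2 * r2;
        return new double[] { x * factor, y * factor };
    }

    public static void main(String[] args) {
        double k1 = -0.2, k2 = 0.05; // illustrative coefficients (barrel distortion)
        double[] center = distort(0.01, 0.0, k1, k2); // near the optical axis: almost unchanged
        double[] edge   = distort(1.0, 0.0, k1, k2);  // near the edge: visibly displaced

        // Round to 4 places for display
        System.out.println(Math.round(center[0] * 10000) / 10000.0); // 0.01
        System.out.println(Math.round(edge[0] * 10000) / 10000.0);   // 0.85
    }
}
```

Calibration estimates k1, k2 (and the tangential terms p1, p2) so that this mapping can be inverted to undistort the image.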
For example, below you can see an undistorted chessboard. To the right, you can see exaggerated radial and tangential distortions of the same chessboard taken from a camera.
Before calibrating:
• Make sure the chessboard is 8 x 8 and that the length and width of EACH square are the same and correctly entered into Chameleon
• Collect images from different areas, tilts, and distances; tilts of +/- 45 degrees are recommended, but rotating too much may result in inaccurate data
• Lighting: make sure no shadows are being cast onto the chessboard, and that the chessboard is receiving even lighting
• Make sure you see rainbow stripes from the corners of the chessboard before you take a snapshot.
After you complete the calibration process, on the command line you'll see:
• Camera Matrix
– The camera matrix is a 3 by 3 matrix of the form:

  [ fx  0  cx ]
  [  0  fy  cy ]
  [  0   0   1 ]
– fx and fy : These are the focal lengths of the camera expressed in pixels
– cx and cy : These are the principal points and refer to the image center
– The camera matrix is affected by any change in resolution, hence you will need to recalibrate for each resolution.
• Distortion Coefficients
– Distortion coefficients are vectors of up to 8 parameters, formatted as: k1, k2, p1, p2 [, k3 [, k4, k5, k6]]
– k1, k2, ... are radial distortion coefficients
– p1, p2 are tangential distortion coefficients
Now that we have the camera matrix and distortion coefficients, the final parameter we need is the real-world coordinates. These coordinates are derived from the corners of your target in 3D space, for example, following the format:
• -targetwidth/2, targetheight/2, 0
• -targetwidth/2, -targetheight/2, 0
• targetwidth/2, -targetheight/2, 0
• targetwidth/2, targetheight/2, 0
These world coordinates for the 2020 season are premade for the Loading Bay and Upper Port in the dropdown menu, or can be uploaded as a .csv file.
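The four corner points above can be generated mechanically from the target's width and height; a plain-Java sketch (names are illustrative):

```java
public class TargetCorners {
    // Generates the four 3D corner points of a planar target centered at the
    // origin, in the order listed above: top-left, bottom-left, bottom-right,
    // top-right, each as {x, y, z} with z = 0 on the target plane.
    static double[][] corners(double width, double height) {
        double w = width / 2.0, h = height / 2.0;
        return new double[][] {
            {-w,  h, 0},
            {-w, -h, 0},
            { w, -h, 0},
            { w,  h, 0},
        };
    }

    public static void main(String[] args) {
        // e.g. a hypothetical 0.4 m x 0.2 m target
        for (double[] c : corners(0.4, 0.2))
            System.out.println(c[0] + ", " + c[1] + ", " + c[2]);
    }
}
```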
Now that we have calibrated the camera and found the 3D world coordinates, we can use this information to:
• Undistort an image received from the camera
• Use the 3D world coordinates and the 2D target coordinates from the image: the SolvePNP() function returns the camera pose using the camera matrix and distortion coefficients
• Return a re-projection error
OpenCV calculates reprojection error by projecting the three-dimensional chessboard points into the image using the final set of calibration parameters and comparing their positions with the detected corners. An RMS error of 300 means that, on average, each of these projected points is 300 px away from its actual position. The closer the error is to 0, the better the calibration.
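The RMS computation itself is straightforward; a plain-Java sketch over already-projected points (illustrative, not OpenCV's implementation):

```java
public class ReprojectionError {
    // Root-mean-square distance between projected points and detected corners,
    // each given as {x, y} pixel coordinates.
    static double rms(double[][] projected, double[][] detected) {
        double sumSq = 0;
        for (int i = 0; i < projected.length; i++) {
            double dx = projected[i][0] - detected[i][0];
            double dy = projected[i][1] - detected[i][1];
            sumSq += dx * dx + dy * dy;
        }
        return Math.sqrt(sumSq / projected.length);
    }

    public static void main(String[] args) {
        double[][] proj = {{100, 100}, {200, 100}};
        double[][] det  = {{103, 104}, {200, 100}}; // first point is off by (3, 4) px
        System.out.println(rms(proj, det)); // RMS over distances of 5 px and 0 px
    }
}
```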
Before enabling 3D tracking, calculate the pitch between the horizontal axis and the optical axis of your camera and enter it into the Camera Pitch field in the Chameleon UI.
In the NetworkTables table, after enabling 3D tracking, you will see a targetPose double array. Its values are:
• pose[0]: X distance, the distance between the target and the camera in the direction of the optical axis (meters)
• pose[1]: Y distance, which is the distance from the target, perpendicular to the optical axis (meters)
• pose[2]: The angle between the target plane and the optical axis (degrees)
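If you want a single range and bearing to the target, the first two pose values combine in the usual way (a plain-Java sketch; the sample values are made up):

```java
public class PoseMath {
    // Given targetPose values [x, y, angle] as published by Chameleon Vision,
    // computes the straight-line distance to the target and the bearing
    // (angle off the optical axis) toward it.
    public static void main(String[] args) {
        double[] pose = {3.0, 4.0, 15.0}; // x (m), y (m), target angle (deg)

        double distance = Math.hypot(pose[0], pose[1]); // straight-line range in meters
        double bearing = Math.toDegrees(Math.atan2(pose[1], pose[0])); // degrees off the optical axis

        System.out.println(distance); // 5.0
        System.out.println(bearing);
    }
}
```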
Now point your camera at a 2020 FRC vision target, enable 3D from the 3D tab, and get started tracking! It really is Just That Easy.
CHAPTER 15
Contact Us
15.1 Bug reports / feature requests
To report a bug or submit a feature request in Chameleon Vision, you can either:
• Submit an issue on the Chameleon Vision GitHub
• Contact the developers on Discord
If you found a problem in this documentation, you can also submit an issue on the Chameleon Vision Documentation GitHub.
15.2 Other information
For other information about Chameleon Vision that may not be covered in this documentation, you can reach out to us on Discord.
15.3 Source Code
You can find the source code for all Chameleon Vision projects on our GitHub page, or individually as listed below:
• Chameleon Vision
• Chameleon Vision Documentation
• Chameleon Vision Raspberry Pi Image
CHAPTER 16
Authors
• Sagi Frimer - initial work - websocket, settings manager, UI
• Ori Agranat - main coder - vision loop, UI, websocket, networktables
• Omer Zipory - developer - vision loop, websocket, networking
• Banks Troutman - developer - vision loop, websocket, networking
• Matt Morley - developer - documentation
CHAPTER 17
Acknowledgments
• WPILib - Specifically cscore, CameraServer, NTCore, and OpenCV.
• Apache Commons - Specifically Commons Math, and Commons Lang
• Javalin
• Spring Framework
• JSON
• Google - Specifically Gson
CHAPTER 18
License
Usage of Chameleon Vision must fall under all terms of the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International license.