
Meet QCar, A New Platform for Self-Driving Car Research Applications

Before we ever see a fully autonomous transportation infrastructure in society, self-driving cars will need to interact with unpredictable human drivers. That’s why Quanser recently launched the QCar, the feature vehicle of a new Quanser platform for self-driving car research applications.

If you’re not familiar, Quanser specializes in building custom research labs, or augmenting existing ones, for modern engineering applications. The company envisions research test scenarios in which fully autonomous QCars interact with one or more human-driven QCars, ultimately bringing self-driving cars closer to mainstream reality.

What we love about the QCar is that it’s a powerful, flexible platform with everything you need to test your ideas: serious GPU power for real-time image processing, vision and navigation sensors, audio capture, and realistic vehicle lighting for testing under different conditions.

GPU Power for Real-time Image Processing 

At the core of the QCar, an NVIDIA Jetson TX2 plugs into a custom PCB. For a small, mobile platform, this is some serious GPU power, supporting real-time image processing and AI functions. A USB 3.0 hub lets researchers add a variety of high-speed devices, and an Intel RealSense D435 depth camera makes use of that USB 3.0 bandwidth.
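As a minimal sketch of what working with that depth camera looks like, here is a frame grab using Intel’s stock pyrealsense2 Python bindings. This is a generic RealSense example, not Quanser’s own software stack, and the stream settings are just one mode the D435 supports:

```python
# Minimal sketch: grabbing a depth frame from the RealSense D435
# using Intel's pyrealsense2 bindings (not QCar-specific software).
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
# Enable a 640x480 depth stream at 30 fps (one mode the D435 supports).
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)

try:
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()
    # Query the range to whatever is at the center of the frame, in meters.
    distance_m = depth.get_distance(320, 240)
    print(f"Distance at image center: {distance_m:.2f} m")
finally:
    pipeline.stop()
```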

Vision and Navigation 

The QCar includes 4 onboard, wide-angle CSI color cameras to give you 360 degrees of vision. These cameras can stream their data directly to the onboard GPU, so you can get almost 4K resolution at 10 bits per pixel per camera, or switch to lower resolutions and get 120 fps. With the direct interface into the GPU, you can leverage the power of the CUDA cores with minimal latency.
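Quanser provides its own APIs for the QCar, but as a hedged illustration of how CSI cameras are typically pulled into the GPU path on a Jetson, here is a generic GStreamer capture sketch. The pipeline string, resolution, and framerate are assumptions for illustration, not QCar specifics:

```python
# Generic Jetson-style sketch (not Quanser's API): reading a CSI camera
# through GStreamer into OpenCV. Resolution and framerate are illustrative.
import cv2

def csi_pipeline(sensor_id=0, width=1280, height=720, fps=120):
    return (
        f"nvarguscamerasrc sensor-id={sensor_id} ! "
        f"video/x-raw(memory:NVMM), width={width}, height={height}, "
        f"framerate={fps}/1 ! nvvidconv ! "
        "video/x-raw, format=BGRx ! videoconvert ! "
        "video/x-raw, format=BGR ! appsink"
    )

cap = cv2.VideoCapture(csi_pipeline(), cv2.CAP_GSTREAMER)
ok, frame = cap.read()
if ok:
    print("Captured frame:", frame.shape)  # e.g. (720, 1280, 3)
cap.release()
```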

There is also a 2D LIDAR on top for 360-degree ranging. It can enhance your visual processing with the CSI cameras, or you can navigate with LIDAR alone.
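As a small sketch of what LIDAR-only processing can start from, a 360-degree scan of range readings converts to Cartesian points in the vehicle frame like this. The one-reading-per-degree scan format is an assumption, not the QCar driver’s actual output:

```python
# Hedged sketch: turning a 360-degree 2D LIDAR scan (equally spaced
# range readings) into Cartesian points and finding the nearest return.
# The scan format is an assumption, not the QCar's actual driver output.
import numpy as np

ranges = np.random.uniform(0.2, 5.0, size=360)  # placeholder scan, meters
angles = np.deg2rad(np.arange(360))             # one reading per degree

# Convert polar (range, angle) to x/y in the vehicle frame.
x = ranges * np.cos(angles)
y = ranges * np.sin(angles)

nearest = np.argmin(ranges)
print(f"Nearest return: {ranges[nearest]:.2f} m "
      f"at {np.rad2deg(angles[nearest]):.0f} degrees")
```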

Audio Features 

Stereo microphones add another dimension to sensor fusion. For example, could they be used to determine the surface material the car is currently traveling on, and change the control response as a result? Could a response to the audio cues of an emergency vehicle be formulated before it is visible? Could honking horns be spatially located to identify a future threat?
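As one hedged sketch of that last idea, the bearing of a horn can be estimated from the time delay between the two microphone channels via cross-correlation. The sample rate and microphone spacing below are assumed values, not QCar specifications:

```python
# Hedged sketch: estimating the bearing of a sound (e.g. a horn) from
# stereo microphones via the inter-channel time delay. The sample rate
# and mic spacing are assumed values, not QCar specifications.
import numpy as np

FS = 48_000      # sample rate in Hz (assumed)
MIC_DIST = 0.10  # microphone spacing in meters (assumed)
C = 343.0        # speed of sound in m/s

def bearing_from_stereo(left, right):
    # Cross-correlate the channels to find the lag (in samples) that
    # best aligns them, then convert lag -> time delay -> angle.
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)
    delay = lag / FS
    # sin(theta) = c * delay / d; clip for numerical safety.
    return np.degrees(np.arcsin(np.clip(C * delay / MIC_DIST, -1.0, 1.0)))

# Quick check: right channel leads by 7 samples, giving about +30 degrees.
rng = np.random.default_rng(0)
s = rng.standard_normal(4800)
print(f"Estimated bearing: {bearing_from_stereo(s[:-7], s[7:]):.1f} deg")
```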

Signals and Lights 

The QCar features brake lights, turn signals, reverse indicators, and headlights. That means you can check how your image processing routines perform under different lighting conditions, whether to test the robustness of your algorithms or to try dynamically switching your processing models.
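As a toy sketch of that model-switching idea, a routine might measure scene brightness and choose a model accordingly. The threshold and model names here are placeholders, not anything shipped with the QCar:

```python
# Hedged sketch: picking a processing model based on measured scene
# brightness. The threshold and model names are placeholders.
import cv2
import numpy as np

def pick_model(frame_bgr, night_threshold=60):
    # Mean luma of the frame serves as a crude day/night indicator.
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    return "night_model" if gray.mean() < night_threshold else "day_model"

frame = np.full((480, 640, 3), 30, dtype=np.uint8)  # a dark test frame
print(pick_model(frame))  # -> "night_model"
```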

You can learn more about the QCar on Quanser’s blog.

Contact us for detailed specs and pricing information.

Christine Archer