NodeGoose | A Networked nodeJS Driver & GCODE Interpreter for the Dobot SCARA Robotic Arm



Robotic control systems, motion planners, and applications typically require a significant level of specialized technical knowledge. Since robots are usually costly investments, venturing into new applications carries significant risk should an integration fail to work as expected or finish on time. Despite their fantastic utility when fully operational, robots still aren't something you can simply pick up, put on your desk, and have executing sophisticated instruction sets without error.

I wanted to see just how difficult it would be to develop applications for a multi-purpose robot that I could afford to take the risk on, so I backed this project on Kickstarter. Dobot is a company that has developed an inexpensive SCARA robot available for under $500, and they are progressively open-sourcing hardware and software components as they become refined. The Dobot developer network has contributed controls that afford linear and angular motion, as well as independent laser, pump, and gripper adjustments, in a small desktop package. While the Dobot Robotic Arm is not designed for industrial loads, it does provide a reliable mechanical simulation of what industrial robots are capable of, at a safe price.


This technical article will touch on the following topics as they relate to the development of simple and intuitive software that anyone could build using their computer and open-source software resources:

Robotics [n]: "the design, construction, operation, and application of artificial agents"
Controls [n]: "open or closed-loop director of system or device response to stimulus"
GCODE [n]: "the most widely used numerical control (NC) programming language"
Network [n]: "communication between systems using a shared protocol"
Computer Vision [n]: "perception of the physical world from visual input by computer systems"
API [n]: "describing a software component in terms of its operations, inputs, and outputs"

The Idea

Since the Dobot team made their software open-source, I was able to take a look at their communication protocol and use it to make a more adaptable control software. While the firmware and communication protocol are still in beta and had several bugs, they were stable enough for me to successfully implement a low-cost laser-cutter/engraver at 10% of the going rate for even the least expensive desktop laser-cutters, like the famous Glowforge. I wanted to add a little extra convenience to my software, so I included an untethered network experience along with the drivers and G-code interpretation service.

The 3D printing industry was able to expand quickly because the DIY, maker, and open-source communities aligned their common goals with a collaborative development process. No single organization could have completed this alone due to the breadth of knowledge that needed to come together into a concise and well-documented application. Once the downstream APIs (G-code) and drivers (Marlin, Repetier, etc.) were refined, and robust electromechanical integrations of 3D printers (RepRaps, Makerbots, etc.) were published, built, and tested, things started moving very quickly in the direction of mass consumer adoption. Developers like foosel even made a wireless print server called OctoPrint to break the tether required between the printer and the host computer. This allowed users to control their 3D printers untethered, and even remotely queue, kick off, and monitor their prints.

NodeGoose is like OctoPrint in that it enables a top-level untethered network experience for controlling a robotic arm, but it also implements the drivers necessary to interpret and translate the GCODE communication protocol sent from the client into the machine formats needed to control the robotic arm and its components. It is a full-stack implementation using nodeJS as the platform, angularJS as the framework, and JavaScript as the programming language. What this enables is an open collaboration platform for hardware and software innovators to add to, remix, and reinvent components and applications for affordable DIY robot control.

The Implementation

NodeGoose uses a nodeJS server running Express middleware to produce a network interface, JavaScript to enable the driver and G-code interpretation services, and angularJS as the framework for implementing the UI/UX elements like file upload, jog moves, connect/disconnect, etc. The server runs on a host that is connected to a local network and physically connected to the robotic arm via Serial-over-USB emulation. The server listens for requests, and when it receives an HTTP GET request it returns an HTML page with further instructions for the client's browser. The HTML page instructs the browser to fetch and load the supplementary scripts for the angularJS framework and for socket.io communication, which carries commands and data back and forth between the browser and the server. This enables a UI/UX interface through which the client can send instructions from their browser window to the robotic arm via the intermediary server.
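The control loop described above can be sketched as a plain dispatch function that maps an incoming socket payload to a driver action. The payload fields (connect, jog, getState) mirror the socket messages used in the steps below, but the function and driver method names here are illustrative, not the actual NodeGoose routes:

```javascript
// Minimal sketch of server-side message dispatch, assuming a driver object
// exposing connect/jogMoveCartesian/getState methods (illustrative names).
function dispatchClientMessage(driver, data) {
    if (data.connect === true) {
        return driver.connect();
    }
    if (data.jog === true) {
        return driver.jogMoveCartesian({ axis: data.axis, direction: data.direction });
    }
    if (data.getState === true) {
        return driver.getState();
    }
    return null;    // unrecognized payloads are ignored
}
```

Keeping the dispatch logic in one pure function makes it easy to unit-test without a live serial connection.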


I named this open-source software project NodeGoose because, by convention, open-source software projects need to have a fun name.

There are several use cases implemented:

  1. Client controls the robot step by step using jog-mode buttons and visual feedback.
  2. Client uploads a full GCODE program with instructions for the robot to run through.
  3. Client develops a fully autonomous, computer-vision-controlled system for dynamic robot response.

Below is an illustration of the system architecture and high-level control loop:


These are some code snippets covering the technical details of how steps are enabled in the driver and GCODE interpreter service. I use the first use case (jog-mode buttons) as the simplest way to illustrate the core concepts behind how the UI/UX framework running in the client's browser enables control of the robot via the physically connected server:

1. Communication from Server to Robot via USB/Serial using NPM Serialport Module:

var SerialPort = require('serialport');     // requires LIBUSB to be available to the OS

var Dobot = function(COM, BAUD) {
	var that = this;
	var port_params = { baudrate: BAUD };

	// the third argument (false) defers opening the port until .open() is called
	this._PORT = new SerialPort.SerialPort(COM, port_params, false);

	// ...

	this._PORT.on('data', function (data) {

		data = new Buffer(data);
		//console.log("buffer rx length: " + data.length);

		if (data.length == 42) {            // a full 42-byte status frame arrived
			that._STATE = "WAITING";
		}
	});
};

2. Parsing and interpreting GCODE commands using Regular Expressions:

if (g_command === '1') {

	//extract x (e.g. "X220.59" -> 220.59); Y, Z, and F are extracted the same way
	var x_coordinate = command.match(/X([+-]?\d+(?:\.\d+)?)/i);
	if (x_coordinate) { x_coordinate = parseFloat(x_coordinate[1]); }

	// ...

	//create an object with the selected dobot command parameters
	var selected_state = {
		write     : write_mode,
		x_pos     : x_coordinate,
		y_pos     : y_coordinate,
		z_pos     : z_coordinate,
		head_rot  : rh_angle,
		is_grab   : grp_angle,
		laser_pwr : lsr_power,
		feed_rate : feed_rate,
		settings  : false
	};

	//call function to create the command buffer
	command_buffer = this.generateCommandBuffer(selected_state);
}
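The axis-word extraction shown above generalizes to any GCODE letter. As a sketch (the helper name is mine, not part of the NodeGoose driver):

```javascript
// Generalized sketch of the regex extraction used above: pull a signed
// decimal number out of a GCODE line for a given letter word (X, Y, Z, F, ...).
function extractAxisWord(line, letter) {
    var match = line.match(new RegExp(letter + '([+-]?\\d+(?:\\.\\d+)?)', 'i'));
    return match ? parseFloat(match[1]) : null;    // null when the word is absent
}
```

For example, extractAxisWord('G1 F2.00 X220.59', 'X') yields 220.59, while a line with no X word yields null so the interpreter can skip that parameter.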


3. Generating and Sending Command Buffers to Dobot:

Dobot.prototype.generateCommandBuffer = function(data) {
	var command_buffer = new Buffer(42);          //create 42 byte buffer
	command_buffer[0] = 0xA5;                     //write the header
	command_buffer.writeFloatLE(data.x_pos, 9);   //write the x (absolute value)
	command_buffer.writeFloatLE(data.y_pos, 13);  //write the y (absolute value)
	command_buffer.writeFloatLE(data.z_pos, 17);  //write the z (absolute value)
	command_buffer[41] = 0x5A;                    //write the tail in all cases

	return command_buffer;
};

this._PORT.write(buffer);     //send over the buffer on the next() heartbeat!
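This frame layout (0xA5 header, little-endian floats at fixed offsets, 0x5A tail) can be exercised standalone without the arm attached. The offsets and framing bytes come from the snippet above; the free-standing function form is a sketch for illustration:

```javascript
// Standalone sketch of the 42-byte command frame from the driver above:
// 0xA5 header, little-endian x/y/z floats at offsets 9/13/17, 0x5A tail.
function buildFrame(x, y, z) {
    var frame = Buffer.alloc(42);   // Buffer.alloc instead of the deprecated new Buffer()
    frame[0] = 0xA5;                // header byte
    frame.writeFloatLE(x, 9);
    frame.writeFloatLE(y, 13);
    frame.writeFloatLE(z, 17);
    frame[41] = 0x5A;               // tail byte
    return frame;
}
```

The floats can be read back with readFloatLE at the same offsets to verify a frame before it is written to the serial port.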

Whew, okay. Now that we have mechanisms for interpreting commands, generating buffers that the Dobot can understand, and a means to send messages to the Dobot over the serial port, we need a reasonable way to trigger these driver functions. Here are some code snippets covering the technical details of how steps are enabled for the UI/UX elements using the angularJS framework. I'll use a simple example where a Jog Button in the browser causes the Dobot to move in the selected jog axis and direction.

4. The first thing we do when the HTML page is received by the client is load AngularJS and its dependencies, as well as the Angular Controller (detailed in step 5). I've included the ng-controller bindings for the mousedown and mouseup events as an example of the jog-move functionality:

    <script src="./js/angular.min.js"></script>
    <script src="./js/control.min.js"></script>
    <script src="./js/socket.min.js"></script>

    <body ng-app="myApp" ng-controller="userCtrl">
        <td ng-mousedown="jogXpos()" ng-mouseup="jogStop()"><strong>X+</strong></td>
        <td><strong></strong></td>
    </body>

5. We specify the details of the Angular Controller (control.js) to instruct how the loaded AngularJS framework scripts should interpret interaction with the Angular elements embedded in the index page displayed in the client's browser window. The mySocket module is injected into the Angular Controller and is used to emit socket messages when the user presses and releases the Jog Button:

var myApp = angular.module('myApp', [/* inject angular module dependencies here */]);


myApp.controller('userCtrl', ['$scope', '$http', '$window', 'mySocket',
	function($scope, $http, $window, mySocket) {

		$scope.connectDobot = function() {
			mySocket.emit('dobot client', { connect: true });
			$window.updateStateResponseInterval = setInterval(function() {
				mySocket.emit('dobot client', { getState: true });
			}, 1000);     // poll the robot state on a regular interval
		};

		$scope.jogXpos = function() {
			mySocket.emit('dobot client', { jog: true, axis: "X", direction: 1 });
		};

		$scope.jogStop = function() {
			mySocket.emit('dobot client', { jog: true, axis: "STOP", direction: null });
		};
}]);


6. Meanwhile, the server listens for and routes HTTP and socket traffic using Express and socket.io; incoming messages are then routed to the specified methods in the Dobot drivers. This is the server receiving a jog message via socket.io:

socket.on('dobot client', function (data) {
	// ... other handlers (connect, getState, etc.) elided ...
	if (data.jog === true) {
		dobotInstance.jogMoveCartesian( { axis: data.axis, direction: data.direction } );
		socket.emit('server response', { message: 'Jog Command Sent' });
	}
});

7. When the server receives a 'dobot client' socket message containing data.jog, the dobotInstance.jogMoveCartesian( query ) method is triggered, which parses the query received at the route and writes the resulting JOG-priority command buffer into the this._COMMAND_JOG variable:

Dobot.prototype.jogMoveCartesian = function (args) {
	var selection = args.axis;
	var direction = args.direction;

	switch (selection) {
		case "STOP":    //when the button click is released
			var jog_command = this.generateCommandBuffer(
				{ "jog": true, "axis": 0, "speed": 40 });

			this._COMMAND_JOG = jog_command;
			break;

		case "X":
			if (direction > 0) {    //positive direction
				var jog_command = this.generateCommandBuffer(
					{ "jog": true, "axis": 1, "speed": 40 });
			}
			else {                  //negative direction
				var jog_command = this.generateCommandBuffer(
					{ "jog": true, "axis": 2, "speed": 40 });
			}

			this._COMMAND_JOG = jog_command;    //loaded to be sent when next() is called
			break;
	}
};

8. The next() method is triggered on a regular heartbeat interval, and when the Dobot responds that it is ready for a new command the driver pushes this._COMMAND_JOG down the pipeline before any other commands:

Dobot.prototype.next = function () {

	if (this._COMMAND_JOG) {
		this.sendBuffer(this._COMMAND_JOG);    //jog commands take priority
		this._COMMAND_JOG = null;
	}
};


Dobot.prototype.sendBuffer = function (buffer) {

	try {
		//console.log("sending: " + buffer.toString('hex'));
		this._PORT.write(buffer);
	}
	catch (error) {
		console.log("error sending buffer: " + error);
	}
};



You can now connect a small computer (e.g. a Raspberry Pi) to the robotic arm and start the server on that computer at boot. You then connect that small computer to your local area network via wireless or ethernet, and when you access that computer's IP or hostname from another locally connected device (your laptop, tablet, or cell phone) you are returned the interface to control the robotic arm. An extra layer of security SHOULD be implemented on top of this to require authentication and prevent unauthorized use of your robotic arm, but that adds a layer of work I haven't implemented yet. This could be done by establishing a unique user token after the user provides a correct username and password upon contacting the robot server. That token is then used in subsequent communications over a secure HTTPS protocol, and all communication without the correct token is ignored.
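The token scheme described above could be sketched as a small gate applied to every incoming message. Everything here is an assumption about a future implementation, not existing NodeGoose code: the field name token, the in-memory token store, and the random-string token generation (which a real system should replace with the crypto module and HTTPS):

```javascript
// Hypothetical token gate for incoming socket messages: only payloads
// carrying a token issued at login are passed through to the driver.
var issuedTokens = new Set();       // in-memory session store (assumption)

function issueToken() {
    // demo-only token; a real implementation should use crypto.randomBytes
    var token = Math.random().toString(36).slice(2);
    issuedTokens.add(token);
    return token;
}

function isAuthorized(data) {
    return typeof data.token === 'string' && issuedTokens.has(data.token);
}
```

The server would call issueToken() after a successful username/password check, and drop any socket message for which isAuthorized(data) is false.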


Using this control system I have also implemented the capability to upload GCODE files containing sequential commands for processing by the robot. The details of that implementation will not be covered here, but it basically operates by sequentially stepping through the file and sending each newline-delimited command as the robot reports that it is ready to receive the next one. This is done using a command queue that is advanced as the machine becomes ready. The test application shown below demonstrates running a GCODE file to produce a desired output from the robot. The last implementation, using computer vision, will be covered in a future blog entry.
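The queue behavior just described (release the next newline-delimited command each time the robot reports ready) can be sketched as a small constructor; the names here are illustrative rather than taken from the NodeGoose source:

```javascript
// Sketch of the sequential GCODE queue: commands are consumed one at a
// time, each released only when the machine signals that it is ready.
function CommandQueue(gcodeText) {
    this._commands = gcodeText.split('\n')
        .map(function (line) { return line.trim(); })
        .filter(function (line) { return line.length > 0; });   // drop blank lines
    this._index = 0;
}

// Called on each "ready" heartbeat from the robot; returns the next
// command to send, or null when the program is finished.
CommandQueue.prototype.onReady = function () {
    if (this._index >= this._commands.length) { return null; }
    return this._commands[this._index++];
};
```

Driving the queue from the robot's ready signal, rather than a fixed timer, keeps the host from overrunning the arm's command buffer.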

NodeGoose is a very early implementation of browser-based robot control, similar to OctoPrint for 3D printers. OctoPrint provides a utility written in Python using the Flask and Tornado frameworks that allows fully remote control of 3D printers over a local or remote wireless connection. NodeGoose is not as feature-complete as OctoPrint yet, but I plan to keep adding features as I think of new fun applications for the hardware!

Test Application

I chose to use the laser feature as my initial demonstration of a useful application of the Dobot and the NodeGoose software platform that controls it. Laser-cutters are expensive and a highly sought-after prototyping tool. Just look at the tremendous crowd-funding success of the Glowforge! One advantage I have here is price: I made it happen for about $500 right now, whereas the Glowforge is $5,000 and still only on pre-order. The only disadvantage I have with the Dobot currently is that its 12 Watt laser cuts more slowly than the typical 70+ Watt lasers used in commercial laser cutters.

To send encoded instructions to the Dobot, all we need to do is use an improvised version of the GCODE protocol with commands that trigger JUMP and WRITE configurations to turn the connected laser on and off and adjust motion parameters. F controls the feed rate:

C9 JUMP														//switch to move mode and set speeds
G1 F2.00 X220.59 Y-53.01 Z-52.10 	//move to {x,y,z} start position
C9 WRITE													//switch on laser and set to write mode speeds
G1 F2.00 X220.74 Y-51.89 Z-52.10	//move to {x,y,z} at rate while laser cutting
G1 F2.00 X220.91 Y-50.98 Z-52.10	//move to {x,y,z} at rate while laser cutting
G1 F2.00 X221.13 Y-50.15 Z-52.10	//move to {x,y,z} at rate while laser cutting


Other GCODE options to be implemented and tested with computer vision:

RH{value}   - rotation of head
GRP{value}  - gripper angle (open/close)
LSR{value}  - laser power
PUMP{value} - control the vacuum pump
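Parsing these proposed words could follow the same regex approach as the standard axis words. A hedged sketch, assuming the {value} part is a plain signed number (the function name and the numeric-value assumption are mine, not a fixed protocol):

```javascript
// Sketch of a parser for the proposed component words (RH/GRP/LSR/PUMP).
// Returns the matched word and its numeric value, or null for other lines.
function parseComponentCommand(line) {
    var match = line.match(/^(RH|GRP|LSR|PUMP)([+-]?\d+(?:\.\d+)?)$/i);
    if (!match) { return null; }
    return { word: match[1].toUpperCase(), value: parseFloat(match[2]) };
}
```

For example, 'LSR255' would parse to { word: 'LSR', value: 255 }, while a standard motion line like 'G1 X10' falls through to the existing G-code handling.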

By using existing 3D printing software, I was able to transform an SVG file into GCODE, and then do minimal post-processing to configure it for use with the Dobot GCODE protocol/interpreter portion of the Dobot Driver. Here is the resulting video covering the motion of the Dobot as it processed the commands and laser-cut its own logo:
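The post-processing step can be sketched as a line filter that swaps the laser on/off codes a 3D-printing toolchain emits for the C9 WRITE / C9 JUMP mode switches shown earlier. Treating M3/M5 as the slicer's laser-on/laser-off codes is an assumption about the upstream toolchain, not something fixed by the Dobot protocol:

```javascript
// Hedged sketch of the GCODE post-processing step: translate slicer-style
// laser on/off codes (assumed here to be M3/M5) into the C9 WRITE / C9 JUMP
// mode switches used by the Dobot interpreter; motion lines pass through.
function postProcess(gcodeLines) {
    return gcodeLines.map(function (line) {
        if (/^M3\b/i.test(line)) { return 'C9 WRITE'; }     // laser on  -> write mode
        if (/^M5\b/i.test(line)) { return 'C9 JUMP'; }      // laser off -> move mode
        return line;                                        // G1 moves unchanged
    });
}
```

A filter like this lets any slicer-generated toolpath be replayed through the interpreter without hand-editing every mode change.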


I believe we are just at the beginning of affordable and accessible robotics platforms. There are much more complex and powerful libraries and development environments for industrial applications, like ROS, but these are overkill for the uses a typical garage maker would have for an inexpensive small consumer robot. Once we make the powerful features of advanced motion planners, computer vision, and robot control available in a language more makers can understand and implement, I believe we will transform the future through massive development of highly versatile and seemingly intelligent robot counterparts. Say hello to Baxter and Sawyer from Rethink Robotics!

A versatile robotic platform allows for multiple applications whereas task-specific robots can only complete one, constantly running the risk of becoming unsupported, obsolete, or overpriced. When you pair that hardware versatility with easy-to-use software, the sky is the limit.

When someone asks if multi-purpose robots and open-source software will transform the future, the answer is "yes, absolutely!" You can read more about the details of the software implementation and follow updates at GitHub.

Tags: Dobot, Robotics, Lasers, Education, OctoPrint, STEM