CameraServer::GetInstance() segmentation fault

I’m trying to send camera data to the monitor with CameraServer. I followed the instructions, but when I run the code it crashes with “No robot code”. I ran a debugger and it reported a segmentation fault when running CameraServer::GetInstance(). I have tried separating out pieces of the code and wrapping everything in a try-catch statement, but nothing works. Here is my code:

#include <sys/socket.h>
#include <netinet/in.h>
#include <unistd.h>
#include <cstring>
#include <iostream>
#include <sstream>
#include <opencv2/core/core.hpp>
#include <cameraserver/CameraServer.h>

static void CameraServerLoop() {
	try {
		CameraServer * serv = CameraServer::GetInstance(); // crashes here
		cs::CvSource outputStreamStd = serv->PutVideo("Vision Camera", 640, 480);
		int sockfd, newsockfd, clilen;
		char buffer[256];
		struct sockaddr_in serv_addr, cli_addr;
		int n;
		std::stringstream ss;
		sockfd = socket(AF_INET, SOCK_STREAM, 0);
		if (sockfd < 0) 
			return CameraError("Could not open socket");
		std::cout << "Created socket";
		memset(&serv_addr, 0, sizeof(serv_addr)); // note: value and size args were swapped
		//bzero((char *) &serv_addr, sizeof(serv_addr));
		int tru = 1;
		if (setsockopt(sockfd, SOL_SOCKET, SO_REUSEADDR, &tru, sizeof(int)) < 0) 
			return CameraError("Could not set socket options");
		std::cout << "Set socket options";
		serv_addr.sin_family = AF_INET;
		serv_addr.sin_addr.s_addr = INADDR_ANY;
		serv_addr.sin_port = htons(3805);
		if (bind(sockfd, (struct sockaddr *) &serv_addr,
				sizeof(serv_addr)) < 0) 
				return CameraError("Could not bind socket to port");
		std::cout << "Bound socket";
		listen(sockfd, 5); // must listen before accept()
		std::cout << "Done listening";
		clilen = sizeof(cli_addr);
		newsockfd = accept(sockfd, (struct sockaddr *) &cli_addr, (unsigned int*)&clilen);
		if (newsockfd < 0) 
			return CameraError("Could not accept connection");
		std::cout << "Accepted connection";
		cameraLoopStatus = true;
		std::cout << "Connected to raspi.\n";
		long long count = 0;
		while (true) {
			n = read(newsockfd,buffer,256);
			if (n < 0) return CameraError("Could not read camera data!");
			ss.write(buffer, n);
			count += n;
			if (count >= 640*480*3) {
				std::cout << "Got frame from pi\n";
				char imbuffer[640*480*3];
				ss.read(imbuffer, 640*480*3);
				cv::Mat image(cv::Size(640, 480), CV_8UC3, imbuffer, cv::Mat::AUTO_STEP);
				count -= 640*480*3;
			}
		}
	} catch (const std::exception& e) {
		std::cout << "Got exception: " << e.what() << "\n";
		cameraLoopStatus = false;
	}
}

Is there any way to fix this?

When are you starting the camera server loop? It’s possible (but unlikely) for that to happen if you start up CameraServer before the robot initializes. If not, can you see in the debugger where inside GetInstance() it’s crashing?

Also WPILib 2019.4.1, right? The kickoff release had some CS bugs that were fixed in later releases.

It’s not crashing in GetInstance(), it’s crashing trying to allocate the necessary stack space. You have a char[640*480*3] declared as a local variable. That’s WAY bigger than the stack size allocated for each thread. You need to allocate buffers of that size from the heap with malloc() instead.

Ah. That’ll do it.

To extend a bit on what you’re doing: are you using the FRC Pi image? If so, it already has the facilities to send the image directly from the Pi to the Dashboard. Doing it the way your code does will cause a large amount of CPU usage on the RIO to handle the read and to recompress the image from raw back to MJPG; at that image size, this can take up to 50% of the RIO’s CPU to do properly. Using the FRC Pi image will make things much easier and won’t cost any CPU on your RIO. In addition, at 30 FPS that would be about 240 Mbps for camera images alone, roughly 2.4 times the data rate of the 10/100 Ethernet port on the RIO. So you would be flooding your network, causing lots of lost frames in addition to potentially disturbing control of the robot.

This topic was automatically closed 365 days after the last reply. New replies are no longer allowed.