PPro and camera sensor


leon912


Hi,

 

I'm trying to implement communication between the PPro and the OV7670 camera sensor, and to use the VGA output to display the camera's image on a monitor. I have been able to display a simple image on the screen by creating a VHDL component that simulates the camera's output, but communicating with the camera itself turns out to be a more complex issue.

 

By default, the OV7670 camera outputs YCbCr 4:2:2 coding, which means that each pixel is "described" by two bytes. For my purposes, since I only need to display on the monitor the intensity of the light the camera sees, I need to take just one of the two bytes for each pixel (the Y luminance component, not the Cb or Cr component). See:

 

http://embeddedprogrammer.blogspot.co.uk/2012/07/hacking-ov7670-camera-module-sccb-cheat.html

 

The problem is that, to drive the VGA output, I should provide one pixel every clock cycle, but I actually sample one pixel every two clock cycles from the camera.

 

One possible approach would be to store a full image inside the FPGA and then send it to the monitor, but the FPGA cannot hold 640x480x8 bits (about 2.4 Mbit) of data; the Spartan-6 LX9 on the PPro has only 576 Kbit of block RAM.

 

I read that another person on this forum has tried to do what I'm doing, but no details were given.

 

Thanks for the help,

 

cheers


Hello Leon,

 

I've been putting a lot of time into working with the OV7670 camera as well. I've been getting the OV7670 to work with the Xilinx ZYNQ board and Xilinx's image-processing IP, so it's a bit different, but I'd love to compare notes, share what I've learned, and ask for help from the community.

 

I spent a lot of time trying to set up a real-time image-processing pipeline that does not require any memory framebuffer, but I have never had success in getting it to actually work...

 

One very interesting thing I learned about the OV7670 is that it can capture in raw Bayer RGB mode, and in that mode you only need to capture one byte per pixel instead of two. But then you need to de-Bayer the output to get the full RGB data for each pixel. Xilinx has some IP to do so, but unfortunately it is not free. It also requires converting the OV7670 input to AXI-Stream format.

 

I'm just about out of time to write this message today, but I can write more tomorrow. 

 

I think there are two approaches we can take for the Papilio boards. If using the DUO, we can use the SRAM as a framebuffer. Or, given time, Alvie and I have talked about making a Wishbone module for the OV7670 that would use a DMA channel to treat the SDRAM on the Papilio Pro, or the SRAM on the DUO, as a framebuffer.

 

Jack.


Hi Jack,

 

thanks for your answer!

 

Since I'm using a PPro, I can't store a full frame. As I said, the OV7670 outputs each pixel as two bytes in YCbCr 4:2:2 coding, so for my purposes I'll sample only the even bytes, which carry the Y (luminance) component. These bytes will then be fed to one of the VGA's RGB channels, producing an image in a red, green, or blue scale depending on the chosen wire.
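A minimal sketch of that even-byte sampling, assuming the default byte order Y, Cb, Y, Cr (the actual order depends on how the OV7670's registers are configured, so the phase may need flipping). Entity and signal names here are illustrative, not from a real design:

```vhdl
-- Hypothetical sketch: latch only the Y bytes of the OV7670's
-- YCbCr 4:2:2 stream (assumed byte order Y, Cb, Y, Cr, ...).
library ieee;
use ieee.std_logic_1164.all;

entity y_extract is
  port (
    pclk  : in  std_logic;                     -- camera pixel clock
    href  : in  std_logic;                     -- high while a line is valid
    d     : in  std_logic_vector(7 downto 0);  -- camera data bus
    y_out : out std_logic_vector(7 downto 0);  -- latched luminance byte
    y_vld : out std_logic                      -- '1' when y_out updates
  );
end entity;

architecture rtl of y_extract is
  signal phase : std_logic := '0';  -- '0' = Y byte, '1' = chroma byte
begin
  process (pclk)
  begin
    if rising_edge(pclk) then
      y_vld <= '0';
      if href = '0' then
        phase <= '0';               -- resynchronize at each line start
      else
        if phase = '0' then
          y_out <= d;               -- keep the Y byte
          y_vld <= '1';
        end if;                     -- chroma byte is discarded
        phase <= not phase;
      end if;
    end if;
  end process;
end architecture;
```

Which byte comes first is set by the camera's YUV-order registers, so it's worth verifying against the datasheet before trusting the phase assumed here.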

 

To implement everything in VHDL, I plan to use an FSM communicating with the camera that samples 1 byte out of 2 (so the total throughput is halved). Then another FSM handles the communication with the VGA output and runs at half the frequency of the camera-interface FSM; to do so and keep the clocks synchronized, I will use a T flip-flop that halves the incoming frequency. This way, the VGA FSM will see one valid byte every clock cycle and it should, hopefully, work.
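The T flip-flop divider described above could be sketched like this (names are illustrative):

```vhdl
-- Hypothetical sketch: a T flip-flop used as a divide-by-2 clock,
-- halving the camera pixel clock for the VGA-side FSM.
library ieee;
use ieee.std_logic_1164.all;

entity clk_div2 is
  port (
    clk_in  : in  std_logic;  -- camera pixel clock (PCLK)
    rst     : in  std_logic;  -- asynchronous reset
    clk_out : out std_logic   -- half-rate clock for the VGA FSM
  );
end entity;

architecture rtl of clk_div2 is
  signal t : std_logic := '0';
begin
  process (clk_in, rst)
  begin
    if rst = '1' then
      t <= '0';
    elsif rising_edge(clk_in) then
      t <= not t;  -- toggle every cycle: divide-by-2
    end if;
  end process;
  clk_out <= t;
end architecture;
```

One caveat worth noting: a flip-flop-derived clock is routed on general fabric and introduces skew against the source clock. On a Spartan-6 it is usually safer to generate related clocks with a DCM/PLL, or to keep a single clock domain and gate the VGA FSM with a clock-enable that toggles every cycle.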

 

I'll post my results if successful.

 

Leon


Hi Leon,

 

The only "VGA"-sh thing about the camera is the pixel resolution (640x480).  It sends data at 30 fps and there is no VGA timing mode that match that.  See http://martin.hinner.info/vga/timing.html

I think you have to buffer the image somehow if you want to display it on a VGA monitor (like what hamster did), or stream it out via the serial port and display it on the PC monitor (like what voelker did).

 

Cheers,

Magnus


Magnus,

 

I wish I had talked to you earlier. :) I wasted over a week trying to get the OV7670 to feed into a FIFO and then into the AXI-Stream Video Out component. It would never lock, and I finally realized it was because the OV7670 only outputs at 30 fps while 640x480@60Hz needs 60 fps. Oh well, it's a learning process.


Hi all,

 

the problem with capturing and storing the pixels is that the on-chip memory of the Spartan-6 is not large enough. Sending the data to the PC and displaying it on its monitor seems the most viable solution. However, how can I send data from the PPro to the computer?
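The usual route is the board's USB-serial link. As a starting point, here is a minimal sketch of an 8N1 UART transmitter that could stream pixel bytes to the PC; the clock frequency, baud rate, and names are assumptions to adapt to the actual board setup:

```vhdl
-- Hypothetical sketch: minimal 8N1 UART transmitter for streaming
-- bytes to a PC over a USB-serial link. Generic values are assumptions.
library ieee;
use ieee.std_logic_1164.all;

entity uart_tx is
  generic (
    CLK_FREQ : integer := 32_000_000;  -- assumed system clock
    BAUD     : integer := 115_200      -- assumed baud rate
  );
  port (
    clk      : in  std_logic;
    tx_start : in  std_logic;                     -- pulse to send a byte
    tx_data  : in  std_logic_vector(7 downto 0);
    tx_busy  : out std_logic;
    txd      : out std_logic                      -- serial output line
  );
end entity;

architecture rtl of uart_tx is
  constant DIV : integer := CLK_FREQ / BAUD;
  signal baud_cnt : integer range 0 to DIV - 1 := 0;
  signal bit_idx  : integer range 0 to 9 := 0;
  -- stop bit & 8 data bits & start bit, shifted out LSB first
  signal shreg    : std_logic_vector(9 downto 0) := (others => '1');
  signal busy     : std_logic := '0';
begin
  tx_busy <= busy;
  txd     <= shreg(0);

  process (clk)
  begin
    if rising_edge(clk) then
      if busy = '0' then
        if tx_start = '1' then
          shreg    <= '1' & tx_data & '0';  -- frame: start, data, stop
          busy     <= '1';
          baud_cnt <= 0;
          bit_idx  <= 0;
        end if;
      elsif baud_cnt = DIV - 1 then
        baud_cnt <= 0;
        shreg    <= '1' & shreg(9 downto 1);  -- shift out next bit
        if bit_idx = 9 then
          busy <= '0';                        -- whole frame sent
        else
          bit_idx <= bit_idx + 1;
        end if;
      else
        baud_cnt <= baud_cnt + 1;
      end if;
    end if;
  end process;
end architecture;
```

Bear in mind the throughput: at 115200 baud, a 640x480 8-bit frame is 307200 bytes, or about 3 Mbit including start/stop bits, so a single frame takes roughly 27 seconds to transfer. That's fine for still captures but nowhere near live video.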


BTW, there is a nice VHDL camera library here:

https://github.com/jpiat/hard-cv

 

There is an RGB565 interface for the OV7670 here:

https://github.com/jpiat/hard-cv/blob/master/hw/rtl/interface/rgb565_camera_interface.vhd

 

And YUV here:

https://github.com/jpiat/hard-cv/blob/master/hw/rtl/interface/yuv_camera_interface.vhd

 

There is a Xilinx example project of using them here:

https://github.com/fpga-logi/logi-projects/tree/master/logi-camera-demo/hw/logipi/ise

 

These were designed for the LogiPi but should be easy to port to the Papilio.

 

Jack.

