voelker

Papilio + ov7670 camera = spartcam !


I forgot the S3E only has MULT18s. But it's similar.

You didn't actually post the bit depth :P

Convolution should be easy to do with fixed point. I also have an FP library (in the forge) that can be used with ZPUino.

Alvie


The bit depth is 24 bits/pixel when using YUV, but I'm only using the Y component right now (8 bits). I'm already doing convolution with a 3x3 matrix of signed integer values for the Sobel filter, and I have no plan to move to FP (a Canny edge detector could require FP).
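For reference, the 3x3 signed-integer Sobel convolution described above looks roughly like this when modelled in Python (the actual design is streaming VHDL on the FPGA, so this is only a sketch of the per-pixel arithmetic):

```python
# Standard 3x3 signed Sobel kernels, applied to an 8-bit Y image.
GX = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]  # horizontal gradient kernel
GY = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]  # vertical gradient kernel

def sobel(y):
    """Approximate gradient magnitude |gx| + |gy|, saturated to 8 bits."""
    h, w = len(y), len(y[0])
    out = [[0] * w for _ in range(h)]
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            gx = sum(GX[i][j] * y[r - 1 + i][c - 1 + j]
                     for i in range(3) for j in range(3))
            gy = sum(GY[i][j] * y[r - 1 + i][c - 1 + j]
                     for i in range(3) for j in range(3))
            out[r][c] = min(abs(gx) + abs(gy), 255)  # saturate to 8 bits
    return out
```

The |gx| + |gy| approximation avoids the square root a true magnitude would need, which is also the usual choice in hardware.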

I have now added three new elements: erode, dilate, and threshold. I can now clearly isolate a black line on a white background. I now need to perform blob detection to track the line.
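The per-pixel logic of those three stages, modelled in Python on a binary image with a 3x3 structuring element (the real pipeline is streaming VHDL, so this is just the idea):

```python
def threshold(img, t):
    """Binarize: pixel >= t becomes 1, else 0."""
    return [[1 if p >= t else 0 for p in row] for row in img]

def _neighborhood(img, r, c):
    """3x3 neighbourhood of (r, c), clipped at the image borders."""
    h, w = len(img), len(img[0])
    return [img[i][j]
            for i in range(max(r - 1, 0), min(r + 2, h))
            for j in range(max(c - 1, 0), min(c + 2, w))]

def erode(img):
    """Pixel stays 1 only if every 3x3 neighbour is 1 (removes speckles)."""
    return [[1 if all(_neighborhood(img, r, c)) else 0
             for c in range(len(img[0]))] for r in range(len(img))]

def dilate(img):
    """Pixel becomes 1 if any 3x3 neighbour is 1 (fills small gaps)."""
    return [[1 if any(_neighborhood(img, r, c)) else 0
             for c in range(len(img[0]))] for r in range(len(img))]
```

Running erode then dilate (an "opening") is what cleans up the isolated noise pixels before blob detection.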

For line detection I'm using most of the BRAM (just 5 left, 3 if I perform a Sobel), and I'm now thinking of lowering the frame resolution to QVGA (320x240) to save memory. I also found the ov7725 sensor, which can run at 120 Hz in QVGA! When I get everything working with the ov7670 I'll consider changing the sensor to improve the framerate.
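To put rough numbers on the QVGA saving, here is a back-of-the-envelope calculation assuming 18-kbit block RAMs (the Spartan-3E BRAM size) and a 1-bit binarized frame buffer; the actual BRAM breakdown in the design may differ:

```python
BRAM_BITS = 18 * 1024  # Spartan-3E block RAM: 18 kbit each

def brams_for_frame(width, height, bits_per_pixel):
    """BRAMs needed to store one full frame (ceiling division)."""
    bits = width * height * bits_per_pixel
    return -(-bits // BRAM_BITS)

vga_bin = brams_for_frame(640, 480, 1)   # binarized VGA frame
qvga_bin = brams_for_frame(320, 240, 1)  # binarized QVGA frame
```

A binarized QVGA frame fits in 5 BRAMs where a VGA one needs 17, which is the kind of saving that makes the drop to 320x240 attractive.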

I also tested my viewer application under Linux, and it appears that rxtx is much more reliable there. With Windows I sometimes get dropped pixels, but with Linux the image is steady. I believe that using a C application instead of Java might help.


Hi,

Just a quick update on my progress. I got a blob detection algorithm running, and I can now detect blobs in a binarized picture and track their centers. I'm just one step away from line tracking. I wanted to do color blob tracking, but I'm struggling to configure the sensor to work in RGB565. It seems that I can switch to this mode, but while the R component looks right, the G and B components are really noisy. My problem might come from my I2C core or my configuration, but I have failed to correct it so far. I'd be glad to hear from people who have already worked with this module and got the RGB565 output working.
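For anyone comparing notes, the nominal RGB565 pixel layout is sketched below in Python. Note that which byte of the pair arrives first on the OV7670 depends on register configuration, so the high/low pairing here is an assumption:

```python
def rgb565_unpack(hi, lo):
    """Split a high/low byte pair into 8-bit R, G, B values."""
    pixel = (hi << 8) | lo
    r = (pixel >> 11) & 0x1F   # top 5 bits
    g = (pixel >> 5) & 0x3F    # middle 6 bits
    b = pixel & 0x1F           # bottom 5 bits
    return (r << 3, g << 2, b << 3)  # scale back to an 8-bit range
```

If the byte pairing is off by one or the two bytes are swapped, R can still look plausible while G and B turn to noise, which is one possible explanation for the symptom above.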

Guest mun

Do you think you could do a video walkthrough of your work? I think that'd be really helpful.


Hey Voelker,

I'm doing a school project on motion tracking with an FPGA. I was wondering what resources you used to learn about image processing, and how you got started using VHDL for image processing? (I would say I'm a beginner after seeing your code.)

I saw your previous post linking to the VHDL sources, so I'll be studying more of that, and hopefully I will begin to understand your code as well. Any help or tips will be greatly appreciated. Also, how long have you been working on all of this, or working with FPGAs?

-Zephyr


Hi,

I've been studying basic image processing and VHDL at school (engineering school). I think the OpenCV documentation and examples are a good starting point for studying image processing pipelines, but you can also find a lot of good resources on Wikipedia. I did one school project with an FPGA back in 2007, but I really started working with FPGAs (both at work and at home) last September. I started this project in January, as a case study for another of my projects, SystemC to VHDL.

My VHDL coding is not really clean (translated from SystemC), but I prefer to start by writing behavioral VHDL (that can synthesize) and then optimize the components for synthesis once I get things working.

Don't hesitate to PM me if you have trouble understanding my code.

I've made a little progress lately. I made a homemade enclosure for the project and added LEDs to light up the scene for the camera. My blob tracking algorithm was improved to track more blobs more reliably. I'm now stuck, as I would need to grab full-resolution color frames to debug things. I plan to use the FT2232H to get the required bandwidth, but it will take some work. I also built a robotic platform to be used for line tracking with the camera.


Just a quick update. I got the blob tracking to work!

http://youtu.be/8XFV81n-Oyo

The link shows a desktop capture of the viewer software. I'm now using two serial channels: one for the video feed and one for the data feed. The data feed sends the blob values (x, y, width, height), and the viewer app draws the blobs on the image. The values on the right are the ones used to binarize the image, and only the Y channel is used.

Sometimes it seems that blobs are not detected, but that is just because the box drawing is not done properly.

Some blobs are also contained in others, but those are identified as a single blob in memory; I just need to add some data to indicate that the blobs were merged.
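The containment test behind that merge decision is simple bounding-box arithmetic. A small Python sketch, using the same (x, y, width, height) blob format as the data feed (the function names are mine, not from the actual design):

```python
def contains(outer, inner):
    """True if inner's bounding box lies entirely inside outer's."""
    ox, oy, ow, oh = outer
    ix, iy, iw, ih = inner
    return (ox <= ix and oy <= iy and
            ix + iw <= ox + ow and iy + ih <= oy + oh)

def flag_merged(blobs):
    """Indices of blobs fully contained in some other, distinct blob."""
    merged = []
    for i, inner in enumerate(blobs):
        for j, outer in enumerate(blobs):
            if i != j and outer != inner and contains(outer, inner):
                merged.append(i)
                break
    return merged
```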

The video starts with the tracking of the lens cap on a white background; then the magnetic stripe of a card is tracked.

I hope to perform line tracking on a robot very soon.


Just some pictures of my setup:

IMG_2613.JPG

IMG_2614.JPG

I "sandwiched" the Papilio board between two stripboards using standoffs (I only had 50mm standoffs ...). The top board has four LEDs powered from 5V. The LEDs work nicely, but I will add some diffusers to get homogeneous lighting of the scene. I made a wing with stripboard to connect the camera and a reset button, but since then I have had trouble getting the I2C communication to work reliably. The connector on the left wing is a TTL serial port connected to a USB-to-serial (CP2102) module for data communication.

I will design a kind of MegaWing PCB to connect the camera and a reset button, and to provide a FIFO port (for the FT2232H), a serial port, and an LCD port (I have a 2.4" LCD).


I have designed a first version of my camera board. This board features a camera connector that can accommodate three kinds of camera modules (ov7725, ov7670-3.3V, ov7670-2.8V). The board also features an 8-bit port with control signals to interface with FIFOs or an MCU, and an LCD port to connect a 16-bit LCD. The two ports can also be combined to give access to a 16-bit data, 12-bit address memory port.

I have attached a picture of the design; I would appreciate your feedback.

post-8428-13431627535868.png


I modified my PCB a bit ... This PCB is meant to be placed above the Papilio, hence the cutout on the bottom right to let the power socket go through. I also added a mounting hole for the camera.

post-8428-13431627536225.png


I finally received my PCBs from Seeed Studio; I ordered 10 but got 12! While waiting for the PCBs I identified one schematic error (a decoupling capacitor on the regulator wired to the ADJ pin instead of GND ...) and some design mistakes. The most important design mistake is that the port I wired for LCD control provides a 3.3V supply, while most LCDs will take 5V. It's not a big mistake, but in the future I will include a jumper to select between 5V and 3.3V. Another flaw is that when using the ov7670 the sensor is not centered on the board. This is not a big problem since I plan to use the OV7725, but I will move the connector location.

I soldered one board and modified the UCF file, and everything worked on the first try. I still need to test communication with the I2C EEPROM, the FIFO port, and the LCD port. On the HDL side, I have done some refactoring to get fewer warnings on synthesis. There is still a lot of work to do (implement the LCD interface and the FT2232H FIFO interface), but at least I have a clean PCB to work with.

As I won't be using all 12 PCBs, I can spare some for people interested in helping with this project.

post-8428-13431627540463.jpg

post-8428-13431627545891.jpg


Hi,

I got a 2.4" LCD running with my custom board to display the image from the camera and get a live full-resolution view of the processed image. The image is updated in the LCD memory at 30 Hz, but it seems that the refresh rate of the LCD is slightly lower, resulting in some artifacts on fast-moving objects. I use the YUV output of the camera, which I then convert to RGB using some custom, messy HDL code.
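The conversion that the HDL performs is presumably something like the standard BT.601 fixed-point formula; the exact coefficients in the actual design are an assumption here, but a Python model of that style of conversion would be:

```python
def clamp(v):
    """Clamp to the 8-bit range."""
    return max(0, min(255, v))

def yuv_to_rgb(y, u, v):
    """Fixed-point BT.601-style YUV (U/V offset by 128) to RGB."""
    c, d, e = y - 16, u - 128, v - 128
    # Coefficients are round(coef * 256); >> 8 scales back down.
    r = clamp((298 * c + 409 * e + 128) >> 8)
    g = clamp((298 * c - 100 * d - 208 * e + 128) >> 8)
    b = clamp((298 * c + 516 * d + 128) >> 8)
    return r, g, b
```

Scaling the coefficients by 256 and shifting back is exactly the kind of multiply-and-shift arithmetic that maps well onto the FPGA's MULT18 blocks.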

video below

This post has been promoted to an article


Hello,

This is a cool project, and you have made great progress. I am working on a vision module as well, but using an STM32F4 microcontroller.

http://www.negtronics.com/vision-module

I am having difficulty setting up the OV7670 registers. I tried a register setting for QQVGA that I found in some appnote, but I am getting 4 pixels fewer per line. Did you have similar issues?

What image format/mode is working for you? Is it QVGA YUV or QVGA RGB?

Also, in YUV mode, what is the data format? I mean, which bits correspond to Y, which bits to U, etc. Did you find any document that explains the data format?

Thanks


Hello,

I tried the register setup you have for the YUV QVGA image format. After setting up the module, I saw the frame rate to be 7.5 Hz. I am feeding a 24 MHz clock to the camera module. What clock frequency did you use?

Thanks


Hi,

Sorry for the delay. I'm feeding the camera module with 24 MHz, and the frame rate is 30 fps with my settings (checked with a logic analyzer). How did you check the framerate?

The YUV format outputs as YUYV, each component being coded on 8 bits. Both QVGA and VGA work for me.
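To make that concrete: in a YUYV stream, every 4 bytes carry two pixels that share one U/V pair (Y0, U, Y1, V). A minimal Python sketch of the decoding:

```python
def yuyv_to_pixels(data):
    """Turn a YUYV byte sequence into a list of (Y, U, V) pixels."""
    pixels = []
    for i in range(0, len(data) - 3, 4):
        y0, u, y1, v = data[i], data[i + 1], data[i + 2], data[i + 3]
        pixels.append((y0, u, v))  # first pixel of the pair
        pixels.append((y1, u, v))  # second pixel shares the same U/V
    return pixels
```

This sharing of chroma between pixel pairs is why YUYV fits 24-bit color information into an average of 16 bits per pixel.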

You may want to have a look at your I2C communication; I found that the OV7670 sometimes acts weird on the I2C bus.

After a period of inactivity, I have started working on the interface between the Papilio and the FX2LP USB 2.0 component, to be able to display the images at full framerate and full resolution on a PC (the FT2232 of the Papilio only allowed 80x60 @ 30fps). I succeeded, and I can now watch my camera output at 320x240 @ 30fps with a custom application using libusb. The next step is to transmit VGA frames at 30fps (4 times the bandwidth). The FX2LP is a really nice and cheap component (dev boards @ 10€ on eBay) that should allow 40MB/s with a custom driver (libusb executes in user space).
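The "4 times the bandwidth" arithmetic, assuming 2 bytes per pixel on the USB link (e.g. the YUYV format):

```python
def bandwidth_mb_s(width, height, fps, bytes_per_pixel=2):
    """Raw video bandwidth in MB/s for an uncompressed stream."""
    return width * height * fps * bytes_per_pixel / 1e6

qvga = bandwidth_mb_s(320, 240, 30)  # QVGA @ 30fps
vga = bandwidth_mb_s(640, 480, 30)   # VGA @ 30fps, 4x QVGA
```

That works out to about 4.6 MB/s for QVGA and 18.4 MB/s for VGA, so even VGA @ 30fps stays under the ~40 MB/s the FX2LP should sustain with a proper driver.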

I'll post pictures ASAP.


Hi,

I managed to get a Harris corner detector running on the Papilio using the same setup (ov7725 + Papilio + FX2 USB FIFO).

In this video the camera films a chessboard and outputs the positive Harris values (a score based on corner detection).

The design takes 8 BRAMs (1 for Sobel, 6 for Harris, 1 holding the camera configuration), 5 multipliers, and 37% of the slices. It runs on the Papilio 250k.
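For anyone unfamiliar with it, the Harris response for a window of gradient samples is sketched below in Python; the fixed-point scaling in the actual HDL (and its exact k value) are not shown, so treat this as the textbook formula rather than the implemented one:

```python
def harris_response(ix, iy, k=0.05):
    """Harris corner response for one window of Ix, Iy gradient samples."""
    sxx = sum(gx * gx for gx in ix)               # sum of Ix^2
    syy = sum(gy * gy for gy in iy)               # sum of Iy^2
    sxy = sum(gx * gy for gx, gy in zip(ix, iy))  # sum of Ix*Iy
    det = sxx * syy - sxy * sxy
    trace = sxx + syy
    return det - k * trace * trace  # positive at corners, negative on edges
```

Gradients strong in both directions (a corner) give a positive response; gradients in one direction only (an edge) give a negative one, which is why thresholding on positive values picks out the chessboard intersections.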


Hey Voelker!

 

This is amazing work you've been doing!

My group is doing a blob detection project using an FPGA and the OV7670 camera, just like you, but we have been attempting it on an Altera DE0. This has led to a lot of headaches and some dead ends, and now that I have seen your results I am thinking of switching over to the Papilio platform!

 

Thank you for opening your code to the public; I'm sure many people will find it helpful. I feel bad asking any more of you, but is there any way you could release a step-by-step video tutorial on exactly how to take your code and produce the results you have shown in this forum? It would be a life saver for my group's project.

 

A little more info about my group project:

We are making a diffused surface illumination (DSI) multi-touch table. The table has an acrylic sheet as the top, surrounded by IR light strips. The OV7670 camera will sit on the floor of the table looking up. When users touch the top of the table, IR light will beam down at the OV7670, which will see the touches as blobs (the IR filter will be removed). These blobs will be tracked, and the center point of each blob will be sent via serial to a Raspberry Pi running Linux, to be used as touch points.

 

Thanks again!


Newcomer to this group here. Can you use ChipScope or something to show how the camera drives HREF and VSYNC after you configure it for VGA RGB565? If not, I'll show you my waveform from ChipScope when I get a chance. Then perhaps you can tell me if the darn thing is even configured correctly.

