FuseSOC for managing Papilio projects and libraries.



In another discussion about getting back to basics, FuseSOC was mentioned as a possible tool for managing HDL libraries. I've been looking over the documentation (which is extremely sparse) and think it might be a good fit.

I'm attempting first to set up a simple project to autobuild with it. I added a FuseSOC .core file to this existing project:

https://github.com/GadgetFactory/VHDL_Example_Code/tree/master/WebPack_QuickStart
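For anyone curious, a minimal CAPI1-style .core file for a project like this looks roughly as follows. This is only a sketch: the names and the target device are placeholders (shown here for a Papilio Pro), and the file_type strings and [ise] keys are from memory, so double-check them against the FuseSoC documentation.

CAPI=1
[main]
name = ::quickstart:0

[fileset rtl]
files = Quickstart.vhd
file_type = vhdlSource

[fileset constraints]
files = papilio_pro.ucf
file_type = UCF
usage = ise

# target device (a Papilio Pro carries a Spartan-6 LX9)
[ise]
family = spartan6
device = xc6slx9
package = tqg144
speed = -2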

Then on a Linux machine on AWS I ran the following commands:

sudo apt-get install python2 python-pip

git clone https://github.com/GadgetFactory/fusesoc.git
cd fusesoc/
sudo pip install -e .

cd ..
git clone https://github.com/GadgetFactory/VHDL_Example_Code.git
cd VHDL_Example_Code/WebPack_QuickStart/
fusesoc --cores-root=. build quickstart

The result is a fully automated build process that creates a WebPack_QuickStart.bit file in the build/quickstart_0/bld-ise/ directory.

I was unable to figure out how to make changes to the xise project file for the things that we use with the Papilio, so I had to fork the project and hard-code the necessary changes for now. This is the commit I made:

https://github.com/GadgetFactory/fusesoc/commit/5c69ec3a57f23e84d014686d96ea68c3225d1fab

I added the following two lines to the tcl script that creates the project:

project set "Allow unmatched LOC Constraints" true
project set "Create Binary Configuration File" true

The first one lets us use a UCF file that has all the Papilio pins defined without throwing errors for unmatched constraints. The second line creates a .bin file that can be loaded using zpuinoprogrammer.

This is a good first step to test out automated builds; up next is using the library management features and then getting it under CI.

Jack.


Hi Jack,

Great :)

I will check this out. I haven't taken a deeper look into this yet, because CI is personally not a high priority for me. I learned about FuseSoC because I was at a conference held by the FOSSi Foundation:

https://fossi-foundation.org/

They use FuseSoC as the base for LibreCores CI: https://www.librecores.org/static/librecores-ci

Maybe there are some opportunities to share. 

But at least your example motivates me to try out FuseSoC on the Papilio version of Bonfire. I'm already in the process of publishing the projects in an easy-to-reproduce form. My GitHub repos currently only contain the RTL and no toolchain-specific files (e.g. ISE projects), which makes reproducing them hard.

An obstacle for me is that I personally don't like python, which is really a problem nowadays given the popularity of this language. Maybe I need to overcome my aversion ;)

 

Thomas


I got a step further with FuseSOC on gitlab:

https://gitlab.com/Papilio-FPGA/papilio-quickstart-vhdl

I broke the UCF file out into a dependency and then set everything up to be built automatically in the cloud. It's pretty cool that there are only three files in the repository and everything is automatically pulled in, the xise file is autogenerated, and a bit file is synthesized. :)
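Roughly, the split looks like this in .core terms (all names below are placeholders, not the actual repository contents): the quickstart core just declares

depend = ::papilio-boards:0

in its [main] section, and a separate board core carries the UCF in a fileset of its own:

CAPI=1
[main]
name = ::papilio-boards:0

[fileset constraints]
files = papilio_pro.ucf
file_type = UCF
usage = ise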

The documentation doesn't answer enough questions to get you going... I had to constantly dig into the source code to figure out how things work. 

The next step is to create a new repository for the dependency core files and then figure out how to build for all papilio boards. 

It would be really great if you could get the Papilio version of Bonfire running with FuseSOC; then it could be one of the first projects available with this new way of doing things.

Jack.


Hi Jack,

I succeeded in simulating the Bonfire CPU with FuseSoC. The example is limited to simulation because it is my CPU core testbench. Doing the same with the full Bonfire SoC will not be much more effort; I decided to do it in two steps because my understanding of FuseSoC is that a project is composed of cores, so I want to try this approach and hope that the SoC can reuse the CPU as an IP core.
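In .core terms the idea is simply that the SoC core would list the CPU core as a dependency, roughly like this (::bonfire_cpu:0 is the real core name, the SoC name is just a guess):

CAPI=1
[main]
name = ::bonfire_soc:0
# reuse the CPU as an IP core by depending on its core file
depend = ::bonfire_cpu:0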

Nevertheless, it is easy to try it out:

git clone git@github.com:bonfireprocessor/bonfire-cpu.git
cd bonfire-cpu
git checkout riscv_fusesoc
fusesoc --cores-root=. sim --sim=isim bonfire_cpu

It should compile the core in ISim and run the simulation (of course ISE must be installed and in the path). The result should look like this:

at 395 ns(1): Note: Monitor: value 0x0000000B written to address 0x0000001 (/tb_cpu_core/Inst_monitor/).
at 655 ns(1): Note: Monitor: value 0x0000000B written to address 0x0000001 (/tb_cpu_core/Inst_monitor/).
at 875 ns(1): Note: Monitor: value 0x80000007 written to address 0x0000001 (/tb_cpu_core/Inst_monitor/).
at 1235 ns(1): Note: Monitor: value 0x0000000B written to address 0x0000001 (/tb_cpu_core/Inst_monitor/).
at 1495 ns(1): Note: Monitor: value 0x0000000B written to address 0x0000001 (/tb_cpu_core/Inst_monitor/).
at 1715 ns(1): Note: Monitor: value 0x80000007 written to address 0x0000001 (/tb_cpu_core/Inst_monitor/).
at 2075 ns(1): Note: Monitor: value 0x0000000B written to address 0x0000001 (/tb_cpu_core/Inst_monitor/).
at 2335 ns(1): Note: Monitor: value 0x0000000B written to address 0x0000001 (/tb_cpu_core/Inst_monitor/).
at 2555 ns(1): Note: Monitor: value 0x80000007 written to address 0x0000001 (/tb_cpu_core/Inst_monitor/).
at 2915 ns(1): Note: Monitor: value 0x0000000B written to address 0x0000001 (/tb_cpu_core/Inst_monitor/).
at 3175 ns(1): Note: Monitor: value 0x0000000B written to address 0x0000001 (/tb_cpu_core/Inst_monitor/).
at 3395 ns(1): Note: Monitor: value 0x80000007 written to address 0x0000001 (/tb_cpu_core/Inst_monitor/).
at 3755 ns(1): Note: Monitor: value 0x0000000B written to address 0x0000001 (/tb_cpu_core/Inst_monitor/).
at 4015 ns(1): Note: Monitor: value 0x0000000B written to address 0x0000001 (/tb_cpu_core/Inst_monitor/).
at 4235 ns(1): Note: Monitor: value 0x80000007 written to address 0x0000001 (/tb_cpu_core/Inst_monitor/).
at 4495 ns(1): Note: Monitor: value 0x00000001 written to address 0x0000000 (/tb_cpu_core/Inst_monitor/).
at 4495 ns(3): Note: Test finished with result 00000001 (/tb_cpu_core/).

Unfortunately I still have to find out how to end the simulation cleanly. Currently you must hit Ctrl-C to stop it.

But I'm really impressed how well it works.

BTW: The RISC-V Test program the simulation executes is in https://github.com/bonfireprocessor/bonfire-cpu/blob/riscv/riscv_test/timer_irq.S

Thomas

Maybe the problem is the path in the toplevel (constant TestFile := ...). ISim is famous for segfaulting on errors of this type.

 

         dbus_cyc_o : OUT  std_logic;
         dbus_stb_o : OUT  std_logic;
         dbus_we_o : OUT  std_logic;
         dbus_sel_o : OUT  std_logic_vector(3 downto 0);
         dbus_ack_i : IN  std_logic;
         dbus_adr_o : OUT  std_logic_vector(31 downto 2);
         dbus_dat_o : OUT  std_logic_vector(31 downto 0);
         dbus_dat_i : IN  std_logic_vector(31 downto 0);
         irq_i : IN  std_logic_vector(7 downto 0)
        );
    END COMPONENT;
    
    constant TestFile : string :=  "../src/bonfire_cpu_0/ise/tb_bonfire_cpu/compiled_tests/timer_irq.hex";
    

   --Inputs
   signal clk_i : std_logic := '0';
   signal rst_i : std_logic := '0';

I run all my FPGA design stuff in Linux VMs and don't use Windows anymore. Most command-line things like git work better on Linux, and until recently there was also no RISC-V gcc toolchain for Windows.


Hmmm, I just tested it under ubuntu and it is giving me an error:

 

WARNING:HDLCompiler:746 - "../src/bonfire_cpu_0/rtl/riscv_local_memmap.vhd" Line 106: Range is empty (null range)
Compiling architecture rtl of entity counter_64Bit [counter_64bit_default]
Compiling architecture rtl of entity riscv_interrupts [riscv_interrupts_default]
Compiling architecture behavioral of entity riscv_control_unit [\riscv_control_unit(true,"sparta...]
Compiling architecture rtl of entity lxp32_execute [\lxp32_execute(false,true,"spart...]
Compiling architecture rtl of entity riscv_regfile [\riscv_regfile("block")(1,5)\]
Compiling architecture rtl of entity lxp32_cpu [\lxp32_cpu(false,true,"spartands...]
Compiling architecture rtl of entity lxp32u_top [\lxp32u_top(false,true,"spartand...]
Compiling architecture rtl of entity sim_bus [sim_bus_default]
Compiling package std_logic_textio
Compiling architecture behavioral of entity sim_MainMemory [\sim_MainMemory("../src/bonfire_...]
Compiling architecture behavioral of entity sim_memory_interface [\sim_memory_interface(12,4096,27...]
Compiling architecture sim of entity monitor [\monitor(true)\]
Compiling architecture behavior of entity tb_cpu_core
Time Resolution for simulation is 1ps.
Waiting for 30 sub-compilation(s) to finish...
Compiled 55 VHDL Units
Built simulation executable fusesoc.elf
Fuse Memory Usage: 433292 KB
Fuse CPU Usage: 1620 ms
GCC CPU Usage: 7880 ms
ERROR: Failed to run ::bonfire_cpu:0 : Failed to run Isim simulation

 


Edit, ignore the strikethrough text...

Strange. I suspect that there is some small bug somewhere in my HDL code which makes ISim crash; I have had such encounters in the past. Unfortunately I have also had ISim crashes on perfectly legal but possibly "unusual" VHDL code.

As a first try I will check the compiler warning about the empty range.

Can you attach the full log of the run?

What happens when you go to the build/bonfire_cpu_0/sim-isim directory and start  ./fusesoc.elf manually? It should start with an ISim>  prompt. If this works, what happens when you enter "run all" at the prompt?

If your time budget allows it, it would also be nice if you could try running the simulation in interactive mode in ISE by opening the ise/tb_bonfire_cpu ISE project. Most likely you will need to adapt the path to the timer_irq.hex file.

 

Thomas

I have not found any reason why it is not working for you.

On the positive side, I successfully got the simulation working with ghdl. This option makes a fully open source test/verification flow possible.

The main obstacle was that ghdl requires the files to be sorted by dependency, because it builds the work library incrementally. There are other differences and glitches, but my latest commit works with both ISim and ghdl.
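Concretely, with the ghdl backend the files list of a fileset is also the analysis order, so packages have to come before the units that use them. A trimmed sketch (only a few files from the real core, just to show the idea):

[fileset rtl]
# packages first, then the units that depend on them
files =
 util/log2.vhd
 rtl/csr_def.vhd
 rtl/riscv_decodeutil.vhd
 rtl/lxp32_cpu.vhd
 rtl/lxp32u_top.vhd
file_type = vhdlSource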

Thomas


On 9/27/2017 at 5:50 AM, Thomas Hornschuh said:

Edit, ignore the strikethrough text...

Strange. I suspect that there is some small bug somewhere in my HDL code which makes ISim crash; I have had such encounters in the past. Unfortunately I have also had ISim crashes on perfectly legal but possibly "unusual" VHDL code.

As a first try I will check the compiler warning about the empty range.

Can you attach the full log of the run?

Here is the full log of the run...

ubuntu@ip-10-136-174-62:~/bonfire-cpu$ fusesoc --verbose --cores-root=. sim --sim=isim bonfire_cpu
DEBUG: Setup logging at level 10.
DEBUG: Command line arguments: ['/usr/local/bin/fusesoc', '--verbose', '--cores-root=.', 'sim', '--sim=isim', 'bonfire_cpu']
DEBUG: Verbose output
DEBUG: Colorful output
DEBUG: Looking for config files from /etc/fusesoc/fusesoc.conf:/home/ubuntu/.config/fusesoc/fusesoc.conf:fusesoc.conf
DEBUG: Found config files in
DEBUG: build_root=/home/ubuntu/bonfire-cpu/build
DEBUG: cache_root=/home/ubuntu/.cache/fusesoc
DEBUG: cores_root=
DEBUG: Not defined
DEBUG: Checking for cores in .
DEBUG: Adding core ::bonfire_cpu:0
DEBUG: Autodetected 64-bit mode
DEBUG: ::bonfire_cpu:0 : Getting tool for flags {'tool': 'isim', 'flow': 'sim', 'target': 'sim', 'testbench': None}
DEBUG: ::bonfire_cpu:0 :  Matched tool isim
DEBUG: Building EDA API
DEBUG: ::bonfire_cpu:0 : Getting dependencies for flags {'tool': 'isim', 'is_toplevel': True, 'flow': 'sim', 'target': 'sim', 'testbench': None}
DEBUG: Collecting EDA API parameters from ::bonfire_cpu:0
DEBUG: ::bonfire_cpu:0 : Getting parameters for flags '{'tool': 'isim', 'is_toplevel': True, 'flow': 'sim', 'target': 'sim', 'testbench': None}'
DEBUG: ::bonfire_cpu:0 : Found parameters []
DEBUG: ::bonfire_cpu:0 : Getting tool options for flags {'tool': 'isim', 'is_toplevel': True, 'flow': 'sim', 'target': 'sim', 'testbench': None}
DEBUG: ::bonfire_cpu:0 : Found tool options {}
DEBUG: ::bonfire_cpu:0 : Getting VPI libraries for flags {'tool': 'isim', 'is_toplevel': True, 'flow': 'sim', 'target': 'sim', 'testbench': None}
DEBUG: ::bonfire_cpu:0 :  Matched VPI libraries []
DEBUG: ::bonfire_cpu:0 : Getting toplevel for flags {'tool': 'isim', 'flow': 'sim', 'target': 'sim', 'testbench': None}
DEBUG: ::bonfire_cpu:0 : Matched toplevel tb_cpu_core
DEBUG: ::bonfire_cpu:0 : Getting dependencies for flags {'tool': 'isim', 'is_toplevel': True, 'flow': 'sim', 'target': 'sim', 'testbench': None}
INFO: Preparing ::bonfire_cpu:0
DEBUG: ::bonfire_cpu:0 : Exporting ['rtl/riscv_counter_64Bit.vhd', 'util/log2.vhd', 'rtl/csr_def.vhd', 'rtl/riscv_decodeutil.vhd', 'rtl/lxp32_ram256x32.vhd', 'rtl/lxp32_mul16x16.vhd', 'rtl/lxp32_ubuf.vhd', 'rtl/lxp32_compl.vhd', 'rtl/riscv_interrupts.vhd', 'rtl/lxp32_scratchpad.vhd', 'rtl/riscv_regfile.vhd', 'rtl/lxp32_interrupt_mux.vhd', 'rtl/lxp32_shifter.vhd', 'rtl/lxp32_mul_dsp.vhd', 'rtl/lxp32_mul_opt.vhd', 'rtl/lxp32_mul_seq.vhd', 'rtl/lxp32_divider.vhd', 'rtl/riscv_mulsp6.vhd', 'rtl/lxp32_alu.vhd', 'rtl/lxp32_dbus.vhd', 'rtl/riscv_local_memmap.vhd', 'rtl/riscv_csr_unit.vhd', 'rtl/lxp32_fetch.vhd', 'rtl/riscv_decode.vhd', 'rtl/lxp32_decode.vhd', 'rtl/lxp32_execute.vhd', 'rtl/lxp32_cpu.vhd', 'rtl/lxp32u_top.vhd', 'verify/std_logic_textio/std_logic_textio.vhd', 'verify/common_pkg/common_pkg.vhd', 'verify/common_pkg/common_pkg_body.vhd', 'verify/bonfire/sim_bus.vhd', 'verify/bonfire/sim_MainMemory.vhd', 'verify/bonfire/sim_memory_interface.vhd', 'verify/lxp32/src/tb/monitor.vhd', 'verify/bonfire/tb_cpu_core.vhd', 'ise/tb_bonfire_cpu/compiled_tests/timer_irq.hex']
WARNING: ../src/bonfire_cpu_0/ise/tb_bonfire_cpu/compiled_tests/timer_irq.hex has unknown file type ''

DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-isim
DEBUG:     fuse tb_cpu_core -prj isim.prj -o fusesoc.elf
Running: /opt/Xilinx/14.7/ISE_DS/ISE/bin/lin64/unwrapped/fuse tb_cpu_core -prj isim.prj -o fusesoc.elf
ISim P.20131013 (signature 0xfbc00daa)
Number of CPUs detected in this system: 2
Turning on mult-threading, number of parallel sub-compilation jobs: 4
Determining compilation order of HDL files
Parsing VHDL file "../src/bonfire_cpu_0/rtl/riscv_counter_64Bit.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/util/log2.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/rtl/csr_def.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/rtl/riscv_decodeutil.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/rtl/lxp32_ram256x32.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/rtl/lxp32_mul16x16.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/rtl/lxp32_ubuf.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/rtl/lxp32_compl.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/rtl/riscv_interrupts.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/rtl/lxp32_scratchpad.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/rtl/riscv_regfile.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/rtl/lxp32_interrupt_mux.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/rtl/lxp32_shifter.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/rtl/lxp32_mul_dsp.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/rtl/lxp32_mul_opt.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/rtl/lxp32_mul_seq.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/rtl/lxp32_divider.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/rtl/riscv_mulsp6.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/rtl/lxp32_alu.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/rtl/lxp32_dbus.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/rtl/riscv_local_memmap.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/rtl/riscv_csr_unit.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/rtl/lxp32_fetch.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/rtl/riscv_decode.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/rtl/lxp32_decode.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/rtl/lxp32_execute.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/rtl/lxp32_cpu.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/rtl/lxp32u_top.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/verify/std_logic_textio/std_logic_textio.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/verify/common_pkg/common_pkg.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/verify/common_pkg/common_pkg_body.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/verify/bonfire/sim_bus.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/verify/bonfire/sim_MainMemory.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/verify/bonfire/sim_memory_interface.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/verify/lxp32/src/tb/monitor.vhd" into library work
Parsing VHDL file "../src/bonfire_cpu_0/verify/bonfire/tb_cpu_core.vhd" into library work
Starting static elaboration
Completed static elaboration
Fuse Memory Usage: 106364 KB
Fuse CPU Usage: 1170 ms
Compiling package standard
Compiling package std_logic_1164
Compiling package numeric_std
Compiling package math_real
Compiling package common_pkg
Compiling package log2
Compiling package riscv_decodeutil
Compiling architecture rtl of entity lxp32_ubuf [\lxp32_ubuf(62)\]
Compiling architecture rtl of entity lxp32_fetch [\lxp32_fetch(('0','0','0','0','0...]
Compiling architecture rtl of entity riscv_decode [riscv_decode_default]
Compiling package csr_def
Compiling architecture rtl of entity riscv_mulsp6 [riscv_mulsp6_default]
Compiling architecture rtl of entity lxp32_compl [lxp32_compl_default]
Compiling architecture rtl of entity lxp32_divider [lxp32_divider_default]
Compiling architecture rtl of entity lxp32_shifter [lxp32_shifter_default]
Compiling architecture rtl of entity lxp32_alu [\lxp32_alu(true,"spartandsp")(1,...]
Compiling architecture rtl of entity lxp32_dbus [\lxp32_dbus(false,true,('1','1',...]
Compiling architecture rtl of entity riscv_local_memmap [riscv_local_memmap_default]
WARNING:HDLCompiler:746 - "../src/bonfire_cpu_0/rtl/riscv_local_memmap.vhd" Line 106: Range is empty (null range)
Compiling architecture rtl of entity counter_64Bit [counter_64bit_default]
Compiling architecture rtl of entity riscv_interrupts [riscv_interrupts_default]
Compiling architecture behavioral of entity riscv_control_unit [\riscv_control_unit(true,"sparta...]
Compiling architecture rtl of entity lxp32_execute [\lxp32_execute(false,true,"spart...]
Compiling architecture rtl of entity riscv_regfile [\riscv_regfile("block")(1,5)\]
Compiling architecture rtl of entity lxp32_cpu [\lxp32_cpu(false,true,"spartands...]
Compiling architecture rtl of entity lxp32u_top [\lxp32u_top(false,true,"spartand...]
Compiling architecture rtl of entity sim_bus [sim_bus_default]
Compiling package textio
Compiling package std_logic_textio
Compiling architecture behavioral of entity sim_MainMemory [\sim_MainMemory("../src/bonfire_...]
Compiling architecture behavioral of entity sim_memory_interface [\sim_memory_interface(12,4096,27...]
Compiling architecture sim of entity monitor [\monitor(true)\]
Compiling architecture behavior of entity tb_cpu_core
Time Resolution for simulation is 1ps.
Waiting for 28 sub-compilation(s) to finish...
Compiled 53 VHDL Units
Built simulation executable fusesoc.elf
Fuse Memory Usage: 429152 KB
Fuse CPU Usage: 1480 ms
GCC CPU Usage: 8680 ms
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-isim
DEBUG:     ./fusesoc.elf -tclbatch isim.tcl -log isim.log -wdb isim.wdb
ERROR: Failed to run ::bonfire_cpu:0 : Failed to run Isim simulation
ubuntu@ip-10-136-174-62:~/bonfire-cpu$ cd /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-isim
ubuntu@ip-10-136-174-62:~/bonfire-cpu/build/bonfire_cpu_0/sim-isim$ ./fusesoc.elf -tclbatch isim.tcl -log isim.log -wdb isim.wdb
Segmentation fault
ubuntu@ip-10-136-174-62:~/bonfire-cpu/build/bonfire_cpu_0/sim-isim$ ll
total 60
drwxrwxr-x 3 ubuntu ubuntu  4096 Oct  1 18:07 ./
drwxrwxr-x 4 ubuntu ubuntu  4096 Oct  1 18:06 ../
-rw-rw-r-- 1 ubuntu ubuntu  5808 Oct  1 18:07 fuse.log
-rw-rw-r-- 1 ubuntu ubuntu    48 Oct  1 18:06 fuseRelaunch.cmd
-rwxr-xr-x 1 ubuntu ubuntu 21792 Oct  1 18:07 fusesoc.elf*
-rw-rw-r-- 1 ubuntu ubuntu   522 Oct  1 18:07 fuse.xmsgs
drwxrwxr-x 5 ubuntu ubuntu  4096 Oct  1 18:07 isim/
-rw-rw-r-- 1 ubuntu ubuntu  2040 Oct  1 18:06 isim.prj
-rw-rw-r-- 1 ubuntu ubuntu    22 Oct  1 18:06 isim.tcl
ubuntu@ip-10-136-174-62:~/bonfire-cpu/build/bonfire_cpu_0/sim-isim$ ./fusesoc.elf
Segmentation fault

 

On 9/27/2017 at 5:50 AM, Thomas Hornschuh said:

What happens when you go to the build/bonfire_cpu_0/sim-isim directory and start  ./fusesoc.elf manually? It should start with an ISim>  prompt. If this works, what happens when you enter "run all" at the prompt?

Just does a seg fault...

On 9/27/2017 at 5:50 AM, Thomas Hornschuh said:

If your time budget allows it, it would also be nice if you could try running the simulation in interactive mode in ISE by opening the ise/tb_bonfire_cpu ISE project. Most likely you will need to adapt the path to the timer_irq.hex file.

 

Thomas
 

When I try to run it from ISim over X11 forwarding (which is super slow) I get the following:

Started : "Simulate Behavioral Model".

Determining files marked for global include in the design...
Running fuse...
Command Line: fuse -intstyle ise -incremental -lib secureip -o D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/tb_cpu_core_isim_beh.exe -prj D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/tb_cpu_core_beh.prj work.tb_cpu_core {}
Running: C:\Xilinx\14.7\ISE_DS\ISE\bin\nt64\unwrapped\fuse.exe -intstyle ise -incremental -lib secureip -o D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/tb_cpu_core_isim_beh.exe -prj D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/tb_cpu_core_beh.prj work.tb_cpu_core 
ISim P.20131013 (signature 0x7708f090)
Number of CPUs detected in this system: 4
Turning on mult-threading, number of parallel sub-compilation jobs: 8 
Determining compilation order of HDL files
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../rtl/lxp32_mul16x16.vhd" into library work
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../rtl/lxp32_compl.vhd" into library work
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../rtl/csr_def.vhd" into library work
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../rtl/riscv_mulsp6.vhd" into library work
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../rtl/riscv_interrupts.vhd" into library work
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../rtl/riscv_counter_64Bit.vhd" into library work
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../rtl/lxp32_shifter.vhd" into library work
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../rtl/lxp32_mul_seq.vhd" into library work
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../rtl/lxp32_mul_opt.vhd" into library work
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../rtl/lxp32_mul_dsp.vhd" into library work
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../rtl/lxp32_divider.vhd" into library work
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../rtl/riscv_local_memmap.vhd" into library work
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../rtl/riscv_decodeutil.vhd" into library work
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../rtl/riscv_csr_unit.vhd" into library work
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../rtl/lxp32_ubuf.vhd" into library work
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../rtl/lxp32_ram256x32.vhd" into library work
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../rtl/lxp32_dbus.vhd" into library work
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../rtl/lxp32_alu.vhd" into library work
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../verify/common_pkg/common_pkg.vhd" into library work
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../rtl/riscv_regfile.vhd" into library work
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../rtl/riscv_decode.vhd" into library work
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../rtl/lxp32_scratchpad.vhd" into library work
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../rtl/lxp32_interrupt_mux.vhd" into library work
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../rtl/lxp32_fetch.vhd" into library work
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../rtl/lxp32_execute.vhd" into library work
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../rtl/lxp32_decode.vhd" into library work
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../verify/common_pkg/common_pkg_body.vhd" into library work
Parsing VHDL file "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../verify/bonfire/sim_MainMemory.vhd" into library work
ERROR:HDLCompiler:104 - "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../verify/bonfire/sim_MainMemory.vhd" Line 27: Cannot find <std_logic_textio> in library <work>. Please ensure that the library was compiled, and that a library and a use clause are present in the VHDL file.
ERROR:HDLCompiler:854 - "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../verify/bonfire/sim_MainMemory.vhd" Line 30: Unit <sim_mainmemory> ignored due to previous errors.
ERROR:HDLCompiler:374 - "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../verify/bonfire/sim_MainMemory.vhd" Line 53: Entity <sim_mainmemory> is not yet compiled.
ERROR:HDLCompiler:69 - "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../verify/bonfire/sim_MainMemory.vhd" Line 55: <string> is not declared.
ERROR:HDLCompiler:631 - "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../verify/bonfire/sim_MainMemory.vhd" Line 56: Near string "TRUE" ; 0 visible types match here
ERROR:HDLCompiler:69 - "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../verify/bonfire/sim_MainMemory.vhd" Line 59: <size> is not declared.
ERROR:HDLCompiler:24 - "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../verify/bonfire/sim_MainMemory.vhd" Line 59: "**" expects 2 arguments
ERROR:HDLCompiler:69 - "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../verify/bonfire/sim_MainMemory.vhd" Line 60: <std_logic_vector> is not declared.
ERROR:HDLCompiler:69 - "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../verify/bonfire/sim_MainMemory.vhd" Line 63: <tword> is not declared.
ERROR:HDLCompiler:69 - "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../verify/bonfire/sim_MainMemory.vhd" Line 64: <std_logic_vector> is not declared.
ERROR:HDLCompiler:69 - "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../verify/bonfire/sim_MainMemory.vhd" Line 72: <ramfilename> is not declared.
ERROR:HDLCompiler:69 - "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../verify/bonfire/sim_MainMemory.vhd" Line 73: <line> is not declared.
ERROR:HDLCompiler:69 - "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../verify/bonfire/sim_MainMemory.vhd" Line 74: <tword> is not declared.
ERROR:HDLCompiler:904 - "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../verify/bonfire/sim_MainMemory.vhd" Line 78: Near range ; prefix should denote an array type
ERROR:HDLCompiler:617 - "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../verify/bonfire/sim_MainMemory.vhd" Line 78: Near range ; prefix should denote a scalar or array type
ERROR:HDLCompiler:69 - "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../verify/bonfire/sim_MainMemory.vhd" Line 80: <readline> is not declared.
ERROR:HDLCompiler:69 - "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../verify/bonfire/sim_MainMemory.vhd" Line 82: <hread> is not declared.
ERROR:HDLCompiler:69 - "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../verify/bonfire/sim_MainMemory.vhd" Line 84: <read> is not declared.
ERROR:HDLCompiler:69 - "D:/Dropbox/GadgetFactory_Engineering/bonfire-cpu/ise/tb_bonfire_cpu/../../verify/bonfire/sim_MainMemory.vhd" Line 81: <mode> is not declared.Sorry, too many errors..


Process "Simulate Behavioral Model" failed

 


On 9/27/2017 at 11:27 AM, Thomas Hornschuh said:

I have not found any reason why it is not working for you.

On the positive side, I successfully got the simulation working with ghdl. This option makes a fully open source test/verification flow possible.

The main obstacle was that ghdl requires the files to be sorted by dependency, because it builds the work library incrementally. There are other differences and glitches, but my latest commit works with both ISim and ghdl.

Thomas

I would definitely like to try the ghdl simulation version; I could just view that with gtkwave, right? It seems like an easier solution for the direction of a cloud computing environment. I just tried to run the simulation with ghdl too and failed with this error:

ubuntu@ip-10-237-62-72:~/bonfire-cpu$ fusesoc --verbose --cores-root=. sim --sim=ghdl bonfire_cpu
DEBUG: Setup logging at level 10.
DEBUG: Command line arguments: ['/usr/local/bin/fusesoc', '--verbose', '--cores-root=.', 'sim', '--sim=ghdl', 'bonfire_cpu']
DEBUG: Verbose output
DEBUG: Colorful output
DEBUG: Looking for config files from /etc/fusesoc/fusesoc.conf:/home/ubuntu/.config/fusesoc/fusesoc.conf:fusesoc.conf
DEBUG: Found config files in
DEBUG: build_root=/home/ubuntu/bonfire-cpu/build
DEBUG: cache_root=/home/ubuntu/.cache/fusesoc
DEBUG: cores_root=
DEBUG: Not defined
DEBUG: Checking for cores in .
DEBUG: Adding core ::bonfire_cpu:0
DEBUG: Autodetected 64-bit mode
DEBUG: ::bonfire_cpu:0 : Getting tool for flags {'tool': 'ghdl', 'flow': 'sim', 'target': 'sim', 'testbench': None}
DEBUG: ::bonfire_cpu:0 :  Matched tool ghdl
DEBUG: Building EDA API
DEBUG: ::bonfire_cpu:0 : Getting dependencies for flags {'tool': 'ghdl', 'is_toplevel': True, 'flow': 'sim', 'target': 'sim', 'testbench': None}
DEBUG: Collecting EDA API parameters from ::bonfire_cpu:0
DEBUG: ::bonfire_cpu:0 : Getting parameters for flags '{'tool': 'ghdl', 'is_toplevel': True, 'flow': 'sim', 'target': 'sim', 'testbench': None}'
DEBUG: ::bonfire_cpu:0 : Found parameters []
DEBUG: ::bonfire_cpu:0 : Getting tool options for flags {'tool': 'ghdl', 'is_toplevel': True, 'flow': 'sim', 'target': 'sim', 'testbench': None}
DEBUG: ::bonfire_cpu:0 : Found tool options {}
DEBUG: ::bonfire_cpu:0 : Getting VPI libraries for flags {'tool': 'ghdl', 'is_toplevel': True, 'flow': 'sim', 'target': 'sim', 'testbench': None}
DEBUG: ::bonfire_cpu:0 :  Matched VPI libraries []
DEBUG: ::bonfire_cpu:0 : Getting toplevel for flags {'tool': 'ghdl', 'flow': 'sim', 'target': 'sim', 'testbench': None}
DEBUG: ::bonfire_cpu:0 : Matched toplevel tb_cpu_core
DEBUG: ::bonfire_cpu:0 : Getting dependencies for flags {'tool': 'ghdl', 'is_toplevel': True, 'flow': 'sim', 'target': 'sim', 'testbench': None}
INFO: Preparing ::bonfire_cpu:0
DEBUG: ::bonfire_cpu:0 : Exporting ['rtl/riscv_counter_64Bit.vhd', 'util/log2.vhd', 'rtl/csr_def.vhd', 'rtl/riscv_decodeutil.vhd', 'rtl/lxp32_ram256x32.vhd', 'rtl/lxp32_mul16x16.vhd', 'rtl/lxp32_ubuf.vhd', 'rtl/lxp32_compl.vhd', 'rtl/riscv_interrupts.vhd', 'rtl/lxp32_scratchpad.vhd', 'rtl/riscv_regfile.vhd', 'rtl/lxp32_interrupt_mux.vhd', 'rtl/lxp32_shifter.vhd', 'rtl/lxp32_mul_dsp.vhd', 'rtl/lxp32_mul_opt.vhd', 'rtl/lxp32_mul_seq.vhd', 'rtl/lxp32_divider.vhd', 'rtl/riscv_mulsp6.vhd', 'rtl/lxp32_alu.vhd', 'rtl/lxp32_dbus.vhd', 'rtl/riscv_local_memmap.vhd', 'rtl/riscv_csr_unit.vhd', 'rtl/lxp32_fetch.vhd', 'rtl/riscv_decode.vhd', 'rtl/lxp32_decode.vhd', 'rtl/lxp32_execute.vhd', 'rtl/lxp32_cpu.vhd', 'rtl/lxp32u_top.vhd', 'verify/std_logic_textio/std_logic_textio.vhd', 'verify/common_pkg/common_pkg.vhd', 'verify/common_pkg/common_pkg_body.vhd', 'verify/bonfire/sim_bus.vhd', 'verify/bonfire/sim_MainMemory.vhd', 'verify/bonfire/sim_memory_interface.vhd', 'verify/lxp32/src/tb/monitor.vhd', 'verify/bonfire/tb_cpu_core.vhd', 'ise/tb_bonfire_cpu/compiled_tests/timer_irq.hex']

DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/rtl/riscv_counter_64Bit.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/util/log2.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/rtl/csr_def.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/rtl/riscv_decodeutil.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/rtl/lxp32_ram256x32.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/rtl/lxp32_mul16x16.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/rtl/lxp32_ubuf.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/rtl/lxp32_compl.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/rtl/riscv_interrupts.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/rtl/lxp32_scratchpad.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/rtl/riscv_regfile.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/rtl/lxp32_interrupt_mux.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/rtl/lxp32_shifter.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/rtl/lxp32_mul_dsp.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/rtl/lxp32_mul_opt.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/rtl/lxp32_mul_seq.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/rtl/lxp32_divider.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/rtl/riscv_mulsp6.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/rtl/lxp32_alu.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/rtl/lxp32_dbus.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/rtl/riscv_local_memmap.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/rtl/riscv_csr_unit.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/rtl/lxp32_fetch.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/rtl/riscv_decode.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/rtl/lxp32_decode.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/rtl/lxp32_execute.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/rtl/lxp32_cpu.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/rtl/lxp32u_top.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/verify/std_logic_textio/std_logic_textio.vhd
../src/bonfire_cpu_0/verify/std_logic_textio/std_logic_textio.vhd:130:27:warning: universal integer bound must be numeric literal or attribute
../src/bonfire_cpu_0/verify/std_logic_textio/std_logic_textio.vhd:139:27:warning: universal integer bound must be numeric literal or attribute
../src/bonfire_cpu_0/verify/std_logic_textio/std_logic_textio.vhd:185:27:warning: universal integer bound must be numeric literal or attribute
../src/bonfire_cpu_0/verify/std_logic_textio/std_logic_textio.vhd:196:27:warning: universal integer bound must be numeric literal or attribute
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/verify/common_pkg/common_pkg.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/verify/common_pkg/common_pkg_body.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/verify/bonfire/sim_bus.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/verify/bonfire/sim_MainMemory.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/verify/bonfire/sim_memory_interface.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/verify/lxp32/src/tb/monitor.vhd
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -a ../src/bonfire_cpu_0/verify/bonfire/tb_cpu_core.vhd
WARNING: ../src/bonfire_cpu_0/ise/tb_bonfire_cpu/compiled_tests/timer_irq.hex has unknown file type ''
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -e tb_cpu_core
DEBUG: /home/ubuntu/bonfire-cpu/build/bonfire_cpu_0/sim-ghdl
DEBUG:     ghdl -r tb_cpu_core
../src/bonfire_cpu_0/rtl/riscv_regfile.vhd:106:16:@5ns:(report warning): Metavalue in raddr1_i
../src/bonfire_cpu_0/rtl/riscv_regfile.vhd:111:16:@5ns:(report warning): Metavalue in raddr2_i
../src/bonfire_cpu_0/rtl/riscv_regfile.vhd:114:14:@5ns:(assertion error): Metavalue for we_i
../../src/ieee/numeric_std-body.v93:2098:7:@5ns:(assertion warning): NUMERIC_STD.TO_INTEGER: metavalue detected, returning 0
../../src/ieee/numeric_std-body.v93:2098:7:@5ns:(assertion warning): NUMERIC_STD.TO_INTEGER: metavalue detected, returning 0
../src/bonfire_cpu_0/rtl/riscv_regfile.vhd:106:16:@15ns:(report warning): Metavalue in raddr1_i
../src/bonfire_cpu_0/rtl/riscv_regfile.vhd:111:16:@15ns:(report warning): Metavalue in raddr2_i
../../src/ieee/numeric_std-body.v93:2098:7:@15ns:(assertion warning): NUMERIC_STD.TO_INTEGER: metavalue detected, returning 0
../../src/ieee/numeric_std-body.v93:2098:7:@15ns:(assertion warning): NUMERIC_STD.TO_INTEGER: metavalue detected, returning 0
../src/bonfire_cpu_0/verify/lxp32/src/tb/monitor.vhd:66:32:@395ns:(report note): Monitor: value 0x0000000B written to address 0x0000001
../src/bonfire_cpu_0/verify/lxp32/src/tb/monitor.vhd:66:32:@655ns:(report note): Monitor: value 0x0000000B written to address 0x0000001
../src/bonfire_cpu_0/verify/lxp32/src/tb/monitor.vhd:66:32:@875ns:(report note): Monitor: value 0x80000007 written to address 0x0000001
../src/bonfire_cpu_0/verify/lxp32/src/tb/monitor.vhd:66:32:@1235ns:(report note): Monitor: value 0x0000000B written to address 0x0000001
../src/bonfire_cpu_0/verify/lxp32/src/tb/monitor.vhd:66:32:@1495ns:(report note): Monitor: value 0x0000000B written to address 0x0000001
../src/bonfire_cpu_0/verify/lxp32/src/tb/monitor.vhd:66:32:@1715ns:(report note): Monitor: value 0x80000007 written to address 0x0000001
../src/bonfire_cpu_0/verify/lxp32/src/tb/monitor.vhd:66:32:@2075ns:(report note): Monitor: value 0x0000000B written to address 0x0000001
../src/bonfire_cpu_0/verify/lxp32/src/tb/monitor.vhd:66:32:@2335ns:(report note): Monitor: value 0x0000000B written to address 0x0000001
../src/bonfire_cpu_0/verify/lxp32/src/tb/monitor.vhd:66:32:@2555ns:(report note): Monitor: value 0x80000007 written to address 0x0000001
../src/bonfire_cpu_0/verify/lxp32/src/tb/monitor.vhd:66:32:@2915ns:(report note): Monitor: value 0x0000000B written to address 0x0000001
../src/bonfire_cpu_0/verify/lxp32/src/tb/monitor.vhd:66:32:@3175ns:(report note): Monitor: value 0x0000000B written to address 0x0000001
../src/bonfire_cpu_0/verify/lxp32/src/tb/monitor.vhd:66:32:@3395ns:(report note): Monitor: value 0x80000007 written to address 0x0000001
../src/bonfire_cpu_0/verify/lxp32/src/tb/monitor.vhd:66:32:@3755ns:(report note): Monitor: value 0x0000000B written to address 0x0000001
../src/bonfire_cpu_0/verify/lxp32/src/tb/monitor.vhd:66:32:@4015ns:(report note): Monitor: value 0x0000000B written to address 0x0000001
../src/bonfire_cpu_0/verify/lxp32/src/tb/monitor.vhd:66:32:@4235ns:(report note): Monitor: value 0x80000007 written to address 0x0000001
../src/bonfire_cpu_0/verify/lxp32/src/tb/monitor.vhd:66:32:@4495ns:(report note): Monitor: value 0x00000001 written to address 0x0000000
../src/bonfire_cpu_0/verify/bonfire/tb_cpu_core.vhd:248:7:@4495ns:(report failure): Test finished with result 00000001
./tb_cpu_core:error: report failed
./tb_cpu_core:error: simulation failed
ERROR: Failed to run ::bonfire_cpu:0 : Simulation failed

 


Hi Jack,

let us start with the good news: Your  ghdl simulation run was in fact successful and as expected. The "failed" message is caused by my quick and dirty attempt to stop the simulator. 

Test finished with result 00000001

says that it was successful. The testbench contains a special "monitor port": by convention, writing to address 0 of this port marks the end of the simulation, while writes to other addresses are just debug values. I have now changed the testbench to stop the clock at simulation end, which also terminates the simulation because there is no stimulus anymore. Just pull the latest commit.

It also shows that ghdl is more reliable (or at least reproducible) than isim ;)

Your interactive ISim run encountered the same problem as my first try with ghdl: it didn't find the std_logic_textio package. I learned in the meantime that this library is not really part of the IEEE standard libraries; it is an extension from Synopsys. I still have to find out why it is sometimes found and sometimes not. ISE 14.7 definitely has it in its standard installation. For ghdl I added a local copy (check the bonfire-cpu/verify/std_logic_textio directory). The version of sim_MainMemory in the riscv_fusesoc branch is already adapted to load the local version. Maybe a quick workaround; I have to dig deeper into it.

Why ISim segfaults for you is still strange. My first suspect was the small size of the fusesoc.elf in your listing, but mine has exactly the same size. Nevertheless I think it is only some form of stub. If you dig deeper into the "isim" subdir you will find the following:

thomas@thomas-lubuntu1604:~/fusesoc_proj/bonfire-cpu/build/bonfire_cpu_0/sim-isim/isim/fusesoc.elf.sim$ ls -lh
total 580K
-rw-rw-r-- 1 thomas thomas 167K Oct  1 23:57 ISimEngine-DesignHierarchy.dbg
-rwxrwxr-x 1 thomas thomas 385K Oct  1 23:57 fusesoc.elf
-rw-rw-r-- 1 thomas thomas    0 Oct  1 23:57 isimcrash.log
-rw-rw-r-- 1 thomas thomas  579 Oct  1 23:57 isimkernel.log
-rw-rw-r-- 1 thomas thomas  11K Oct  1 23:57 netId.dat
drwxrwxr-x 2 thomas thomas 4.0K Oct  1 23:57 tmp_save
drwxrwxr-x 2 thomas thomas 4.0K Oct  1 23:57 work

Can you check if isimcrash.log contains some content on your computer?

In general it is sometimes hard to get things running...
My initial plan for this weekend was to get the whole Bonfire for Papilio working with FuseSoC. But I started with the idea of backporting the data cache which I developed for the Arty version to the Papilio. In simulation this was easy, but for some reason on the real thing it does not work. Synthesis is correct; I spent hours analyzing the post-synthesis netlist (I still feel exhausted from this exercise) but did not find any obvious problem, it looks OK. I wrote a lot of test programs. Now I'm at least at the point where I have narrowed down the problem, but still have no idea why this happens.

 

Thomas

2 hours ago, Thomas Hornschuh said:

let us start with the good news: Your  ghdl simulation run was in fact successful and as expected.

Sweet, I ran this over SSH on a Linux box and it didn't bring up any GUI on my X server. But I exported a VCD file and opened it on my Windows box with gtkwave and it looks great. :)

Jack


Ok, I made more progress today and have got ZPUino fusesoc cores designed and a vanilla version of ZPUino synthesizing for the Papilio Pro. These are the steps to recreate what I've done:

git clone https://github.com/GadgetFactory/fusesoc.git
cd fusesoc
pip2 install -e .
fusesoc init

cd ~
git clone https://gitlab.com/Papilio-FPGA/Papilio-ZPUino-SOC.git
cd Papilio-ZPUino-SOC
fusesoc --cores-root=. build papilio-pro-zpuinoSOC

The bin and bit files can then be found under:

build/papilio-pro-zpuinoSOC_0/bld-ise/

It's also possible to make the ZPUino SoC the old-fashioned way by doing:

git clone https://gitlab.com/Papilio-FPGA/Papilio-ZPUino-SOC.git
cd Papilio-ZPUino-SOC
git submodule update --init --recursive
make

Next steps are to make a core to generate the bootloader.vhd file and to get this working in an automated fashion in Docker.


Hi Jack,

sounds great. In the meantime I was also able to build the Bonfire project with FuseSoC. I will update the respective thread with instructions. I had some contact with Olof Kindgren, the author of FuseSoC. He fixed a bug I reported quite quickly: https://github.com/olofk/fusesoc/issues/175.

My understanding is that you have your own fork of FuseSoC which, among other things, also adds the "Papilio base library". I think there must be a better way than forking the original sources. FuseSoC is in its infancy, and forking will either create a lot of burden reintegrating upstream progress or cut your version off.

I'm also not sure how to deal with this library stuff. I try to use FuseSoC not only as a build tool; I try to integrate it into my development process, e.g. for maintaining a common base for Vivado and ISE. When using the --no-export and some other FuseSoC options it works quite well. The [provider] section in a core seems to interfere with this kind of workflow, because when present it takes precedence over local source files - which means that FuseSoC clones the source into its cache and uses the cached version to build. Looking at the fusesoc-cores library, it seems they keep a local core file in each repo (without a [provider] section) and add a copy of the core file (with [provider] added) to their library repo. Not really an elegant way of doing things, and a clear violation of the "don't repeat yourself" principle.
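For context, the [provider] section in such a library copy of a core file looks roughly like this (the parameter names are from memory, the values are just an example):

[provider]
name = github
user = bonfireprocessor
repo = bonfire-cpu
version = master

With this present, FuseSoC fetches that revision into its cache and builds from the cached copy, which is exactly what gets in the way when you want to work on a local checkout.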

 

To avoid these problems for the moment, I have added a .core file to all the different parts of Bonfire and added them as git submodules to a top-level project (which itself is almost empty). FuseSoC searches for core files in all subdirectories of a cores root, so running git clone, git submodule update and fusesoc build will do almost the same as using the [provider] section.

 

Nevertheless, I think it could be an idea to decompose ZPUino further into cores, e.g. for UART, SPI, GPIO. Do you (or Alvie?) have plans in this direction? I'm currently thinking of replacing the Bonfire SoC I/O modules with the ZPUino ones, and providing them as cores would help (of course I can also help with this if you like the idea).

Thomas

1 hour ago, Thomas Hornschuh said:

Hi Jack,

sounds great. In the meantime I was also able to build the Bonfire project with FuseSoC. I will update the respective thread with instructions. I had some contact with Olof Kindgren, the author of FuseSoC. He fixed a bug I reported quite quickly: https://github.com/olofk/fusesoc/issues/175.

My understanding is that you have your own fork of FuseSoC which, among other things, also adds the "Papilio base library". I think there must be a better way than forking the original sources. FuseSoC is in its infancy, and forking will either create a lot of burden reintegrating upstream progress or cut your version off.

Yes, I hear you, forking is not the long-term solution. I forked the project because I needed to add special parameters to the tcl script that creates the .xise project file and I didn't see any support for doing so in the code. I also wasn't sure if FuseSoC was going to be suitable for what I wanted to accomplish, so the quickest and easiest thing to do was just hard-code what I needed into the part that generates the tcl script. The plan is to circle back around and either code the functionality I need and push it back to the main project or create an issue and request the functionality. Either way, the fork should just be a temporary thing while I'm figuring out the workflows that I need.

The Papilio base library can be added via a config file from what I've read, so the fork isn't needed for that.
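For example, something along these lines in fusesoc.conf should make an extra cores directory visible without a fork. The cores_root key shows up in the --verbose output earlier in the thread; the [main] section name and the path are assumptions on my part:

# ~/.config/fusesoc/fusesoc.conf
[main]
cores_root = /home/ubuntu/papilio-cores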

1 hour ago, Thomas Hornschuh said:

I'm also not sure how to deal with this library stuff. I try to use FuseSoC not only as a build tool; I try to integrate it into my development process, e.g. for maintaining a common base for Vivado and ISE. When using the --no-export and some other FuseSoC options it works quite well. The [provider] section in a core seems to interfere with this kind of workflow, because when present it takes precedence over local source files - which means that FuseSoC clones the source into its cache and uses the cached version to build. Looking at the fusesoc-cores library, it seems they keep a local core file in each repo (without a [provider] section) and add a copy of the core file (with [provider] added) to their library repo. Not really an elegant way of doing things, and a clear violation of the "don't repeat yourself" principle.

I'm not sure I've gotten as far as you have to really see this problem yet. But one thing I was uncertain about was how to update the cache. When I make a change to my base repository I have to wipe out the local cache and do a fusesoc init again. There must be an easier way that I'm not seeing in the quick help...

1 hour ago, Thomas Hornschuh said:

 

To avoid these problems for the moment, I have added a .core file to all the different parts of Bonfire and added them as git submodules to a top-level project (which itself is almost empty). FuseSoC searches for core files in all subdirectories of a cores root, so running git clone, git submodule update and fusesoc build will do almost the same as using the [provider] section.

One of the problems I ran into is that I want to have my UCF files in the papilio-boards.core file and just add it as a dependency. I have to hard-code a path that would change when the module changes version numbers, which is not a great solution. The only thing I can think of so far is to add the UCF files as git submodules... So maybe a hybrid solution of git submodules and fusesoc is the best solution...

1 hour ago, Thomas Hornschuh said:

 

Nevertheless, I think it could be an idea to decompose ZPUino further into cores, e.g. for UART, SPI, GPIO. Do you (or Alvie?) have plans in this direction? I'm currently thinking of replacing the Bonfire SoC I/O modules with the ZPUino ones, and providing them as cores would help (of course I can also help with this if you like the idea).

Thomas

Yes, absolutely, that is what I was thinking. Part of the problem that people have with using ZPUino is that the board files are buried way down in the directory structure. I would like to get all of the files that you need to modify to customize a ZPUino instance checked into a top-level git project and then use fusesoc to pull in all the supporting files that people don't need to worry about. I would love some help along these lines and also integration with other projects.

Also, I would like to branch out to Altera and ICE Papilio boards. I think fusesoc will help with creating and managing a common code base for non-Xilinx Papilio boards.


13 hours ago, Jack Gassett said:

The plan is to circle back around and either code the functionality I need and push it back to the main project or create an issue and request the functionality.

I have seen that Olof is very helpful and responsive. He also accepted a small pull request from me. I have the feeling that ISE is not much of a focus for them, nor is VHDL. Most people involved in FuseSoC have their roots in the OpenCores/OpenRISC movement and belong to a network of European open source hardware advocates, who in the end aim for open source ASICs. So they are quite open to suggestions on the ISE/VHDL flows. For example, supporting additional options for the tool section ([ise] in our case) should not present a big problem.

 

14 hours ago, Jack Gassett said:

I'm not sure I've gotten as far as you have to really see this problem yet.

Maybe at the moment we just aim at slightly different targets. You focus on a CI flow; I'm focusing more on the problem of managing my development work, which currently targets two FPGA boards (Papilio Pro and Arty) that use different toolchains. So I'm not "packaging" a finished project into FuseSoC; I want to use FuseSoC as a kind of "automake". It actually works very well for this purpose: especially with the --no-export option I can load the generated ISE project into ISE, analyze e.g. synthesis results, and edit the source files at their original location. When I use the [provider] option, it only works with the committed/pushed version of the project.

 

13 hours ago, Jack Gassett said:

So maybe a hybrid solution of git submodules and fusesoc is the best solution...

Yes, this is exactly what I'm doing currently; check out https://github.com/bonfireprocessor/bonfire

 

13 hours ago, Jack Gassett said:

Yes, absolutely, that is what I was thinking. Part of the problem that people have with using ZPUino is that the board files are buried way down in the directory structure.

This was exactly what stopped me from using things like the UART from ZPUino. The other topic for me was that ZPUino uses Wishbone pipelined mode a lot, which is incompatible with non-pipelined mode (Bonfire uses Wishbone burst cycles instead). But especially for UART, GPIO and SPI it should not be too hard to remove the dependencies on the ZPUino packages and replace the required parameters with generics. Maybe I will give the UART a try as a first attempt.

 

14 hours ago, Jack Gassett said:

I would like to branch out to Altera and ICE Papilio boards

Do you have specific plans in this direction ?

On 10/16/2017 at 4:07 PM, Thomas Hornschuh said:

This was exactly what stopped me from using things like the UART from ZPUino. The other topic for me was that ZPUino uses Wishbone pipelined mode a lot, which is incompatible with non-pipelined mode (Bonfire uses Wishbone burst cycles instead). But especially for UART, GPIO and SPI it should not be too hard to remove the dependencies on the ZPUino packages and replace the required parameters with generics. Maybe I will give the UART a try as a first attempt.

 

Do you have specific plans in this direction ?

 

Hi, I stumbled upon this (rather long but interesting) video on "icesoc"; it might give us some ideas.


Hi Jack and Thomas,

 

I just stumbled across this discussion in my semi-regular googling to see if FuseSoC has picked up any new users. Exciting read. I really like the Papilio ecosystem and have been planning on buying some boards. Unfortunately, I have realized that it's lack of time rather than lack of HW that prevents me from playing more with FPGAs outside of work. Anyway, I'm happy to help out and answer questions. I know that you have both contributed patches to FuseSoC already. Also, let me take the opportunity to once again apologize for the embarrassing lack of documentation. I can however answer one of your earliest questions right away.

If you create a file in a fileset with the tclSource file type, it will be picked up by ISE and sourced in the main TCL script that creates the project file. As other backends also support sourcing TCL files, you should add a usage = ise to make sure it's not picked up by other tools.
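Based on that description, something along these lines should let the Papilio-specific project settings travel with the core instead of living in a FuseSoC fork (the fileset and file names here are made up):

[fileset ise_project_settings]
files = papilio_ise_settings.tcl
file_type = tclSource
usage = ise

where papilio_ise_settings.tcl would contain the two "project set" lines from the first post.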

Regarding the inelegance of having .core files with or without provider sections, depending on whether they're local in the source tree or external... I'm aware of that. I have some ideas for how to make that slightly nicer, but it's not prioritized right now. I will get back once I have more to say about that.

 

On 10/16/2017 at 1:54 AM, Jack Gassett said:

One of the problems I ran into is that I want to have my UCF files in the papilio-boards.core file and just add it as a dependency. I have to hard-code a path that would change when the module changes version numbers, which is not a great solution

I was wondering a bit about this one. Couldn't really figure out what you mean. Could you elaborate a bit and perhaps I can see if this is something that is already handled or that I could add handling for.

 

Anyway, hope to find time to try out the stuff you have packaged. Already built and tested bonfire-soc, but not the zpuino stuff. Also, make sure to register your projects at librecores.org. We are building an index there of interesting open source silicon projects. The registration process is very streamlined and should only take a moment. Right now it's mostly an index, but we plan to expand this to automatically handle CI and show relations between cores (especially for FuseSoC-compatible cores) in the future.

Consider this also an invitation to our yearly conference, ORConf. We would love to see some presence from the Papilio corner and I believe many in our audience would like to hear a presentation of your work

 

Cheers,

Olof


Hi Olof,

thanks for stopping by. First I must say that I'm only a happy GadgetFactory customer and Papilio Pro user. The big advantage of the Papilio Pro is that its SDRAM can be accessed with an open source soft core, without having to rely on proprietary IP from the FPGA vendor. FuseSoC is in addition a genius idea for solving the problem of having a tool-vendor-independent build description.

I heard about LibreCores, FuseSoC and FOSSi at the FOSSi RISC-V event last year in Munich. The funny thing is that I didn't give FuseSoC much attention until Jack started to play around with it after my hint. After his initial proof of concept I realized how easy FuseSoC is to use.

In the meantime I have updated Bonfire a lot, separated it into several FuseSoC cores, and also created a few Dockerfiles which allow creating a complete build environment for Bonfire, including the RISC-V toolchain, ghdl for simulation, and the tools to build eLua for Bonfire.

Of course I have the same problem as you: lacking time, especially for documentation.

I also have a version of Bonfire/eLua running on a Digilent Arty board, including TCP/IP networking. Because it uses a Xilinx IP Integrator block design (for accessing the SDRAM and Ethernet PHY with Xilinx IP), it cannot be built with FuseSoC yet. Maybe the TCL script option you mentioned above will help to solve this.

 

On 14.2.2018 at 10:17 PM, Olof Kindgren said:

Also, make sure to register your projects at librecores.org.

I have already considered it. Here too the limiting factor is that the cores don't yet meet my own standards regarding documentation...

 

On 14.2.2018 at 10:17 PM, Olof Kindgren said:

Consider this also an invitation to our yearly conference, ORConf. We would love to see some presence from the Papilio corner and I believe many in our audience would like to hear a presentation of your work

Many thanks, I'm thinking about it. But as stated above I'm just a Papilio Customer ;) 

 

Regards

Thomas

