Altera Intel impacts


nilrods


As I am sure some of you have heard, Altera has finally agreed to be bought out by Intel.

 

With all the drama that was going on between the two earlier in the year, I have to admit I am surprised it went through.

 

What are the thoughts on how this will impact the FPGA landscape?

 

I would think that since probably 90+% of PCs run Intel processors, if they start coupling FPGAs with the processors for application-specific acceleration in commodity parts, that could really change things. I know there was talk of a Xeon + FPGA chip a couple of years ago, but I never heard that it actually went on sale. If it makes it to the average user, or even into mid-range processors, that could be really big. Think of a Zedboard on steroids...

 

Will this purchase help Altera become a larger player in the market than Xilinx? Or will being owned by Intel just cause Altera problems? Or is it just a matter of time before another chip maker buys Xilinx?

 

Anyway, I just thought I would see what others think the impact of this could be.

 

Thanks,
Chris


We had a short discussion on this topic at element14 a few weeks ago: http://www.element14.com/community/thread/43401/l/intel-buys-altera-for-54-cash-is-this-a-good-thing

 

I'll repeat my main comment:

 

I think it's great news... for Xilinx.  Intel has a way of buying companies or technologies and letting them wither on the vine for lack of resources.  I've often observed the "Business Area One" or "Cash Cow" phenomenon at large companies.  Basically, whichever part of the company is bringing in the most money gets to call the shots, and any upstart alternate technologies within the company get crushed, or at least held back, by Business Area One, which sees them as an internal threat.  Examples include IBM 370 mainframes versus PowerPC, Microsoft Windows versus internal tablet OS work, and Intel x86 versus all of Intel's alternative processors (StrongARM, DEC Alpha, iAPX 432, i860, i960).

 

The key to FPGA success is software, not silicon.  Intel is a great company when it comes to high-performance silicon.  OTOH, when I think of Intel software I think of PL/M.  The smartest thing Intel could do with Altera is to open up the bitstream format and let the open source world take over FPGA software.  Open documentation is IMO one of the most important reasons Intel processors have been so successful.  They could revolutionize FPGA software by doing the same with Altera FPGAs.  However, I really doubt they'll do it.

 

JMO/YMMV


Wow, that would be a dream come true if Intel opened up the bitstream format. I would move the Papilio boards over to Altera in a heartbeat!

 

I'm very curious to see how this will shake out for the FPGA community; I hope it will be good.

 

I'd also love to hear others' opinions on how it could go.

 

Jack.


John,

You bring up excellent points. I am not as familiar with past Intel acquisitions.

 

I do know exactly what you're saying about the "Business Area One" phenomenon, though. Like when Compaq bought DEC.

 

The only thing that makes me think they might actually do something with Altera is that nowadays Intel really doesn't have much growth. Consumers are buying PCs much less often, opting for mobile devices instead. Yes, the server market is decent right now. I also see no MAJOR technological advances in CPU design on the near horizon, so their growth will be flat.

 

I see something like the Altera purchase as a means of finding growth. But like you said, it all comes down to how, and whether, they use it.

 

I would 100% agree with you on open-sourcing the bitstream. Does anyone know how much money Altera/Xilinx make off the software versus the hardware? Would it eat into their bottom line much? I'm not sure how much Intel is open-sourcing these days. I know they haven't open-sourced much in the past, but maybe that is changing.

 

The other thing that might keep this from being a boon for Xilinx is if it causes someone else to buy Xilinx. Then they could be in the same boat, or worse. With the recent consolidation in the semiconductor industry, I think that is a real possibility, and maybe even fairly soon.

 

It is nice to see some other viewpoints on it.

 

Just my thoughts


I suspect Intel sees an increase in the use of FPGAs in the embedded space, and they are looking for a way to compete with the ARM + FPGA devices that are already out there.

The other possibility is the recent attention FPGAs are getting in the high-performance computing space.

I agree the biggest winner here is probably Xilinx; even if Intel is committed to investing in Altera, the acquisition will at the very least derail their roadmap.


I hope Intel doesn't ruin Altera; I like having multiple vendors to work with. It makes me wonder what they'll do with the software. Quartus II is quite good, in many ways nicer to work with than ISE. The free version feels more crippled, though; for example, it doesn't use more than one processor core, which is ridiculous in an era when even the cheapest consumer laptops have multi-core CPUs.

 

As far as traditional desktop/laptop PCs go, Intel is a victim of their own success. Computing power has continued to increase by leaps and bounds year after year, and within the last decade or so it has outstripped the needs of most users. Not much more than a year ago I was using a nearly 10-year-old PC, and for the typical consumer stuff of email, web surfing, YouTube videos and playing music it was still perfectly fine. Not only that, but I was using it for PCB CAD, microcontroller and FPGA coding, and software-defined radio, the latter two being pretty heavy-duty stuff. The only reason I finally upgraded was that I wanted to be able to transcode video efficiently, and WinXP was getting really old.

The death of the traditional PC has been greatly exaggerated; there are over 1.2 billion Windows PCs in current use, and most of those are not going away. What has changed is the constant need to upgrade and the sense that a 2-year-old PC is hopelessly obsolete. Mobile devices are relatively new and are still evolving quickly. They are where the PC was in the 90s or so, but I think even they are beginning to plateau. Give it a few years and mobile sales will level off as they too become cheap, mature commodities. The only thing that will stop a PC-like plunge is the fact that we carry mobile devices around everywhere and they get dropped, beat up and broken.

 

As far as Intel, I agree with what was said earlier, they make generally great hardware, but I can't think of any Intel software that has impressed me. An FPGA with lousy development software is useless, and even if the bitstream format becomes open, I think it would take decades for an open source product to mature to the point where it could challenge the vendor specific commercial stuff. The process of turning HDL into a bitstream that will result in an efficient implementation in a particular FPGA is extremely complex. It absolutely amazes me that it works at all.


As far as Intel, I agree with what was said earlier, they make generally great hardware, but I can't think of any Intel software that has impressed me.  An FPGA with lousy development software is useless, and even if the bitstream format becomes open, I think it would take decades for an open source product to mature to the point where it could challenge the vendor specific commercial stuff. The process of turning HDL into a bitstream that will result in an efficient implementation in a particular FPGA is extremely complex. It absolutely amazes me that it works at all.

 

Based on my research over decades, my opinion is that FPGA implementation software isn't that complex. You just need to go through the steps one by one.  It has about the same level of complexity as a good optimizing compiler.  Like an optimizing compiler, the actual complexity depends on the source language and the target architecture.  Like CPU architectures, some FPGAs are a lot more regular and it's easier to write tools for them.
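
To make "going through the steps one by one" concrete, here is a rough, purely illustrative sketch of the classic implementation flow. Every function in it is a hypothetical stub invented for illustration, not any real tool's API, but the stages themselves (synthesis, technology mapping, placement, routing, bitstream generation) are the standard ones.

```python
# Purely illustrative sketch of the classic FPGA implementation pipeline.
# Every stage function here is a hypothetical stub, not real tool code;
# in a real tool chain each stage hides an enormous amount of engineering.

def parse_hdl(src):                  # front end: parse/elaborate the HDL, like a compiler parser
    return {"ast": src}

def synthesize(ast):                 # optimize into a generic gate-level netlist
    return {"netlist": ast}

def tech_map(netlist, device):       # map generic logic onto the device's LUTs, flip-flops, BRAMs
    return {"mapped": netlist, "device": device}

def place(mapped, device):           # assign each mapped cell to a physical site on the die
    return {"placed": mapped}

def route(placed, device):           # connect the placed cells through the switch fabric
    return {"routed": placed}

def emit_bitstream(routed, device):  # encode the configuration bits for the target part
    return b"\x00" * 32

def hdl_to_bitstream(hdl_source, device):
    """Run the stages one by one, much like the passes of an optimizing compiler."""
    ast = parse_hdl(hdl_source)
    netlist = synthesize(ast)
    mapped = tech_map(netlist, device)
    placed = place(mapped, device)
    routed = route(placed, device)
    return emit_bitstream(routed, device)
```

Each stage has a close analogue in an optimizing compiler: parsing, optimization, instruction selection (technology mapping), and the spatial counterpart of register allocation and scheduling (placement and routing).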

 

A major difference is that you can buy good books on writing (or "crafting") compilers, whereas writing FPGA tools requires you to find lore from all over the place -- often hiding in expensive journals and conference proceedings.

 

Now that bitstreams are finally getting published -- with Project IceStorm leading the way -- we should see rapid progress.  IceStorm targets the Lattice iCE40, which has a simple architecture, so it's an excellent first step.  While there are decades of work to do, we can draw on useful results from compiler writing and other fields accumulated over those same decades, so it's not like we have to start from zero.  Plus, computers are a lot more capable (and cheaper!) than they were decades ago, so lack of computing power isn't holding us back.  There's now the Internet, so people can find resources that would have been difficult or impossible to find decades ago.  And there are twice as many people on the planet, so more people to do the work :-)

 

JMO/YMMV


4 weeks later...

Project IceStorm is complete nonsense, basically because no working engineer is in the slightest bit interested in spending months delving into code, trying to work out how it does what it does, and then adding some mystical new feature which is going to open up hitherto unexplored vistas of wondrousness, all without breaking the code. It ain't gonna happen when every major vendor supplies working synthesis tools that do what they say on the tin and took many hundreds of man-years to write.


The open source fantasy that people are interested in modifying tools when they have got work to do is nothing more than Stallman-driven drivel.

 

True, most tool users aren't going to modify their tools.  However, others do improve open-source software and all users benefit from the results.  The reason GNU/Linux and GCC have such high quality is that since the source code is available bugs can be found and fixed, and people who redistribute the improved software are required to make the modified source code available so that improvements can be incorporated back into the mainstream code.

 

IMO the fact that we can get cheap, high-performance microprocessors and microcontrollers is largely because chip manufacturers can adopt GNU/Linux and GCC and only modify drivers, greatly reducing software development and licensing issues so they can concentrate on the silicon.  FPGA manufacturers could have taken advantage of this same opportunity if they had opened their bitstreams and IMO this could have made FPGAs a mainstream technology instead of something expensive and esoteric.  Fortunately, it's not too late for them to come around.

 

The reality is that most FPGA vendor software has a steep learning curve and since there's no competing software there's no incentive for vendors to improve things.  I've found the learning curve gets steeper with each new FPGA family, and recently spent hours debugging a problem because the Xilinx ISE error message steered me in the wrong direction.  Can I improve the error message?  Nope, because it's not open source.

 

Regarding RMS, I think his MacArthur Fellowship and his long list of honorary doctorates and professorships speak for themselves.

 

Regarding IceStorm, I've been playing with IceStorm tools and arachne-pnr over the last month and I'm very impressed with their reliability and speed.  I'm planning to incorporate them into my XXICC software.
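
For anyone curious, the whole open-source iCE40 flow is only a handful of commands. Here is a minimal sketch of how it can be driven, wrapped in Python for convenience; the file names (blinky.v, blinky.pcf) are placeholders, and it assumes Yosys, arachne-pnr and the IceStorm tools are installed and on your PATH, targeting a 1k-LUT iCE40 board such as the iCEstick.

```python
#!/usr/bin/env python3
"""Minimal sketch of the open-source iCE40 flow: Yosys -> arachne-pnr ->
icepack -> iceprog.  File names are placeholders; assumes the tools are
installed and on PATH and the target is a 1k-LUT iCE40 part."""
import subprocess

def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Synthesize the Verilog design to a BLIF netlist with Yosys' iCE40 flow.
run(["yosys", "-p", "synth_ice40 -blif blinky.blif", "blinky.v"])

# Place and route for the 1k device, using a pin-constraint (.pcf) file.
run(["arachne-pnr", "-d", "1k", "-p", "blinky.pcf", "-o", "blinky.asc", "blinky.blif"])

# Pack the textual place-and-route result into a binary bitstream.
run(["icepack", "blinky.asc", "blinky.bin"])

# Program the board over USB.
run(["iceprog", "blinky.bin"])
```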


Without getting too far down the open source rabbit hole: the license matters, as do the interested parties.

 

True, most tool users aren't going to modify their tools.  However, others do improve open-source software and all users benefit from the results.  The reason GNU/Linux and GCC have such high quality is that since the source code is available bugs can be found and fixed, and people who redistribute the improved software are required to make the modified source code available so that improvements can be incorporated back into the mainstream code.

 

 

GCC has seen a dramatic drop-off in contributors since it switched licenses to GPLv3; no vendor dares use the v3 versions in a commercial product. That is why you see vendors stuck on earlier GCC releases, and the move to LLVM and Clang.

Most of the successful large-scale "open source" projects are successful because they are largely written by the commercial entities consuming them. That also tends to be their biggest weakness: all the major contributors have agendas, and if your use of the tool isn't aligned with their vision for it, best of luck actually getting a change for your use case accepted.

 

When I first used the Xilinx tools I was stunned that they could get away with shipping ISE; its usability is horrid. Last night I downloaded the Lattice Semi tools and discovered they were even worse. I'm sure the underlying code-generation portions are quite good, but there is just no thought put into how people interact with them. Any real competition in this space seems like it would be a win.

