Archive for the 'EDA' Category

Are you capturing enough value from your innovation?

Monday, November 3rd, 2014

I read an interesting article, “Capture More Value” by Stefan Michel, in the latest HBR issue (October 2014). The article talks about how companies, while putting effort into value creation, often lose focus on value capture, thus leaving money on the table. It describes five innovation categories in value capture, among them changing the price-setting mechanism, changing the payer, changing the price carrier, and changing the segment.

Value Capture

Most of us in the semiconductor industry would confess to being “techno snobbish”. Working on leading-edge technologies and caught up in the scaling race, we often miss the salient point: customers want technology benefits, not technology per se. While we are steadily moving towards treating hardware chips and systems as vehicles for creating value, and not just numbers in nanometres, we still have a long way to go in exploring the many possible, often innovative, ways to leverage the value we create. Value capture is sadly under-prioritized.

Let me take a few examples of value capture at varying stages of maturity in this industry…

Capturing value by changing the price-setting mechanism:
The idea is to set the price according to the product's value or worth to the customer. Memory (DRAM and NAND flash) pricing is a good example here.

Changing the price carrier:
The price carrier is whatever the seller hangs the price tag on. Semiconductor IP is a good example. IP vendors have evolved their strategies from treating IP as filler for differentiating hardware sockets in a system towards ensuring that the IP works not just as an isolated unit but in the complete system. The offering subsequently evolved towards IP plus services, and is now moving towards a complete IP platform solution (hardware, services, software). The price tag has moved from the die space to the differentiating value provided to the customer, and then to the complete valued package. The bundling and unbundling of various EDA licenses is another example of an EDA company changing the price carrier.

Changing the payer:
It is not always only the consumer of the product or offering who pays for the value received; as in some media, where content is offered free to the public and the costs are borne by advertisers. An example I see here is Qualcomm's push for Chinese fabless design houses to design using its cores. I am not sure of the veracity of this, but I understand that in China, Qualcomm recovers its IP royalty not from the local design houses but from the system houses that use the design solution (with the Qualcomm core) from those design houses.

One potentially big biz for us is the Internet of Things… and it is especially in this space that the semiconductor industry needs to think hard, and differently from its traditional value innovation and capture strategy, if it doesn't want to be left with just a fraction of the pie. I will do a separate post on this shortly.

Do you see other examples of value capture in your industry? What is your opinion? I would be keen to hear your ideas and perspective.

Could TSMC be your next chip design cloud owner?

Tuesday, October 25th, 2011

In a 2009 technical report, “Above the Clouds: A Berkeley View of Cloud Computing”, the authors note that “Cloud Computing is likely to have the same impact on software that foundries have had on the hardware industry”. The underlying logic: the high cost of a semiconductor fabrication line led to the rise of semiconductor fabs, and these in turn “enabled” fabless semiconductor design companies whose value lies in innovative chip design.

A week back, I moderated a panel discussion on cloud computing in the IC design world, especially on accelerating time to market. While security issues – a challenge stacked right on top of the “barriers to entry” – were defended and discussed quite animatedly, it was, surprisingly, the “cloud ownership” aspect that evoked only tepid responses.

Now from where I stand, I see cloud ownership as a vital component of chip design security in the cloud. After all, if I were to place my company's most precious asset – my chip design database – on a cloud, I would definitely like to know who owns that cloud. And this is on top of my regular apprehensions about data security, backup and related aspects.

Let me clarify – I am not talking here about the infrastructure provider, e.g. Amazon and the like. Rather, it is the cloud framework/database owner. The framework here includes components of the existing physical ecosystem integrated together – design database, EDA tools, user interface, etc. – without which cloud computing merely services individual IC design tasks, i.e. storage and processing power requirements; something which, on its own, does not fully leverage this powerful biz paradigm shift aka cloud computing.
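(To make the distinction concrete, here is a small Python sketch. It is purely illustrative – every name and value in it is hypothetical – but it separates the raw infrastructure provider from the framework owner who integrates the design database, EDA tools and user interface on top of it.)

```python
# Illustrative only: the infrastructure provider vs. the framework owner.
# All names and values below are hypothetical.

from dataclasses import dataclass, field

@dataclass
class InfrastructureProvider:
    """Raw compute and storage only (an Amazon-like IaaS)."""
    name: str
    compute_nodes: int
    storage_tb: int

@dataclass
class DesignCloudFramework:
    """The integrated layer that matters for chip design in the cloud:
    design database, EDA tools and user interface, stitched together
    on top of someone else's raw infrastructure."""
    owner: str                          # foundry? EDA vendor? another entity?
    infra: InfrastructureProvider
    design_database: str
    eda_tools: list[str] = field(default_factory=list)
    user_interface: str = "web portal"

# Raw infrastructure alone only services individual storage/compute tasks...
iaas = InfrastructureProvider("GenericIaaS", compute_nodes=10_000, storage_tb=500)

# ...the framework owner is whoever controls the complete, integrated flow.
design_cloud = DesignCloudFramework(
    owner="<to be decided>",            # the open question of this post
    infra=iaas,
    design_database="hypothetical-design-db",
    eda_tools=["synthesis", "place-and-route", "signoff"],
)
```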

So again – who will own the chip design cloud? Will it be the foundries (also cited as “natural design aggregators”), the EDA vendors, the fabless design companies, or yet another entity? The replies gleaned from most of the stormy discussions elsewhere in the nimbus zone gravitate towards the foundry.

Which brings me back to where I started this post – riding on cloud computing, foundries may turn out to influence the hardware industry in more ways than one. And who is better equipped to lead the way here than TSMC??

Logical adjacency or hedging bets?? - Synopsys buys software firm Optical Research

Wednesday, October 13th, 2010

Synopsys has acquired Optical Research Associates, a privately held premier provider of optical design software and engineering services. The acquisition represents Synopsys' first move into markets associated with displays and solid-state lighting using light-emitting diodes. The company said the acquisition will also allow it to expand into markets such as semiconductor lithography equipment and cameras.
Over the last few years, Synopsys has been venturing into various fields beyond its traditional role of supplying EDA tools for the semiconductor market. Earlier it was tools for PV systems, and now this foray into markets associated with displays and solid-state lighting using LEDs.
 

With mixed signals on the semiconductor outlook, coupled with the evolving role of EDA vendors, the line between “hedging bets” and moving into “logical adjacencies” is quite fine…
 

Evolving face of EDA

Friday, December 11th, 2009

There has been a lot of talk recently across the blogosphere (and elsewhere!) about the changing face of EDA: EDA is doomed, EDA needs to change its biz model in order to survive, etc.
What needs to be kept in perspective is the evolving value proposition and risk sharing that the chip design company perceives in an EDA vendor's offering, especially in these times of escalating costs (and diminishing success rates) of chip designs at the higher technology nodes.
I see the following path for the EDA vendors:

  • EDA vendors' offerings will become more service oriented – and I do not mean tool-based troubleshooting here.

  • Point tools across EDA vendors are losing their differentiation but remain essential for the basic design flow. I do not see chip vendors getting into developing these.

  • EDA vendors will partner for point tools. These tools will evolve as per market requirements and be available on a pay-per-use model, bolstered by cloud computing (a back-of-the-envelope sketch of the economics follows this list).

  • As a service, EDA vendors will work closely with chip developers to see a project through from specs to manufacturing. This is the service part of the tools-plus-services offering of the EDA vendors. The major chunk of an EDA vendor's revenue will come from this service part, and it will be the deciding criterion for the vendors' ranking in the chip design world.
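As promised above, a back-of-the-envelope Python sketch of the pay-per-use economics. Every figure here is invented for illustration; the only point is that a metered model wins below some break-even usage level.

```python
# Back-of-the-envelope comparison of a flat annual seat license vs. a
# metered pay-per-use model. All figures are hypothetical.

ANNUAL_SEAT_LICENSE = 100_000   # $/year for one tool seat (assumed)
PAY_PER_USE_RATE = 25           # $/tool-hour on a cloud model (assumed)

def cheaper_option(tool_hours_per_year: float) -> str:
    """Return whichever licensing model costs less at this usage level."""
    metered_cost = tool_hours_per_year * PAY_PER_USE_RATE
    return "pay-per-use" if metered_cost < ANNUAL_SEAT_LICENSE else "seat license"

# Below 4,000 tool-hours a year, the metered model is cheaper
break_even = ANNUAL_SEAT_LICENSE / PAY_PER_USE_RATE
print(f"break-even: {break_even:.0f} tool-hours/year")
print(cheaper_option(1_000))    # -> pay-per-use
print(cheaper_option(6_000))    # -> seat license
```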

I recommend a couple of interesting articles:

What EDA needs to do to start growing again and Cadence goes two-dimensional

 

KLA-Tencor invests in EDA firm

Monday, February 18th, 2008

KT Venture Group LLC, the investment arm of KLA-Tencor Corp, has invested in the latest funding round of Pyxis Technology, an EDA company selling IC routing software and service solutions. The new funds will be used to accelerate deployment of Pyxis' NexusRoute yield-aware auto-router, which was announced last September.

This is yet another example of the trend of different players “crossing over”, or bringing various adjacencies into the fold, in pursuit of a better, integrated solution for the market – and thus a bigger share of the pie.

We saw this with Cadence acquiring Clear Shape Technologies and also with Verigy’s acquisition of Inovys.

EDA & Foundry

Wednesday, June 13th, 2007

Ron Wilson, executive editor of EDN, offers an interesting take on the low-power SoC trend. He writes about the change in the significance of low-power design: pre-90nm, low-power design was something you did in response to a specific application requirement; post-90nm, according to tool vendors at least, low-power design is something you do so that the chip can work at all. This suggests that tools for invasive low-power design will be a gating factor in the industry's migration to 65nm and certainly beyond. And if there's one thing that increases the – shall we say – intimacy of the relationship between the foundries and the EDA industry, it's an obstacle to wafer shipments.

I see this as yet another example of the expanding role and growing prominence of foundries. To fill their billion-dollar fabs, they have to catalyse solutions for issues that may deter new design starts. So, if low-power tooling is a gating factor, they will “collaborate” with the EDA vendors. As I noted in an earlier post, Virtual vs. Vertical, it was the same for DFM; there too, foundries started working with the EDA vendors, sharing information and data that was once under wraps.

As they say, it is the economics!    

 

Low power IC design kit enables representative design

Friday, May 18th, 2007

Cadence is slated to release its Low Power Methodology Kit in late June. The highlight of the kit is a wireless “representative design” implemented using multi-supply voltage and power shutoff methods. It comes with all the necessary command scripts and technology files to complete the design. The design has sample IP including a processor and bus fabric from ARM, Wi-Fi from Wipro, USB 2.0 from ChipIdea, 65nm low-power memories from Virage Logic and 65nm technology libraries from TSMC.

While EDA vendors have so far mostly been dishing out different point tools to address the industry's power concerns, a big challenge is to help designers use the appropriate low-power techniques and tools effectively and seamlessly within their flow on a real design – and in a timely manner. Designers need to be aware of the trade-offs involved, and of some balancing tips to make the exercise productive.
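To make those techniques concrete, here is a toy Python model of the idea behind multi-supply voltage and power shutoff. It is purely illustrative – not any vendor's format or flow: blocks live in power domains, a signal leaving a switchable domain needs an isolation cell so it does not float when the domain is off, and a crossing between different supplies needs a level shifter.

```python
# Toy model of low-power design intent (illustrative, not a real format):
# multi-supply voltage (MSV) + power shutoff with isolation cells.

from dataclasses import dataclass

@dataclass
class PowerDomain:
    name: str
    voltage: float      # supply voltage in volts (MSV: domains may differ)
    switchable: bool    # True if this domain can be shut off

def check_crossing(src: PowerDomain, dst: PowerDomain, has_isolation: bool) -> None:
    """Check one signal crossing from src domain to dst domain."""
    # A signal leaving a switchable domain floats when the domain powers
    # down, so it must be clamped by an isolation cell.
    if src.switchable and not has_isolation:
        raise ValueError(f"{src.name} -> {dst.name}: missing isolation cell")
    # Different supply voltages need a level shifter on the crossing.
    if src.voltage != dst.voltage:
        print(f"{src.name} -> {dst.name}: level shifter required")

always_on = PowerDomain("PD_top", voltage=1.2, switchable=False)
cpu = PowerDomain("PD_cpu", voltage=1.0, switchable=True)

check_crossing(cpu, always_on, has_isolation=True)   # OK; notes a level shifter
check_crossing(always_on, cpu, has_isolation=False)  # OK; source is always-on
```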

A representative design is a step forward in this direction. The objective may be to regain the lead in the format war, but if it helps the end user, it definitely signals well!

ESL tool targets algorithm for FPGA, ASIC devices

Wednesday, April 25th, 2007

Synplicity rolled out its Synplify DSP ASIC Edition software at the Design, Automation and Test in Europe (DATE) conference in France. Its earlier ESL synthesis tool was aimed at FPGA designs. With this new offering, Synplicity is targeting customers who use FPGA prototyping for their DSP-based ASICs.

Other recent news is that TSMC is broadening its IP portfolio, worrying IP providers and prompting speculation in the industry over whether TSMC is moving towards an ASIC-like biz model.

Gives a new meaning to the phrase “ASIC demise”………

UMC joins CPF standard alliance

Monday, April 23rd, 2007

UMC is the latest to join the Cadence camp. Earlier this month, Cadence and TSMC announced the availability of 65nm libraries from TSMC supporting CPF (Common Power Format).

The market forces will decide who the winner is; but the poor user has to cope with this standards battle in the interim.

The dilemma of two languages in low power design

Saturday, March 31st, 2007

So, hopes of a single power format seem remote, and it is increasingly likely that the industry will need to support both standards, i.e. CPF as well as UPF. Well, now market forces will decide the winner…

Integrated DFM solutions still lacking

Tuesday, March 6th, 2007

Walter Ng, senior director of platform alliances at Chartered Semiconductor Manufacturing noted in his presentation at the SPIE Advanced Lithography Conference last week that while there are some good point tools for DFM, integrated DFM solutions are still lacking.

As I noted in an earlier post, Why can't we do it in EDA?, it is a huge task for a single vendor to handle even the most important sources of variation through a single integrated flow. Integration of point tools requires standardization as well as agreement over interfaces and formats.

IBM's Leon Stok has identified four eras in the EDA industry. For the fourth era, i.e. design implementation platforms, he mentioned that we would need to define standards as APIs in order to allow tools to talk to each other.

The trend is moving more and more towards a hybrid approach.

Blaze DFM merges with Aprio

Friday, February 23rd, 2007

So, the DFM consolidation has begun…..

While acquisitions of DFM companies by EDA vendors were already happening, this is the first merger between two DFM companies. Blaze's parametric DFM expertise complements Aprio's lithography analysis skill-set. Together they can synergize on DFM analysis as well as optimization, and address both the manufacturing and the designer camps.

I had pointed out in an earlier post, “Who will be left standing in DFM”, that in the consolidation phase only a select few stand a chance to survive. Together, Blaze DFM and Aprio do fall into this category.

Will this serve as the catalyst for further such mergers and pave the way for “pure-play DFM vendors”, as opposed to EDA vendors selling DFM as part of their “complete design flow portfolio”?

Statistical tool avoids overdesign with excessive margins

Wednesday, February 14th, 2007

A new tool in the DFM arena –

Solido Design Automation has announced a tool for transistor-level statistical design and verification. Unlike most of the tools touted as DFM in the market, this one is to be used by designers prior to layout.

It promises five basic capabilities –

  1. Statistical sampling that describes how processes can vary, so that circuit simulators can estimate the range of possible outputs (a minimal sketch of this follows below).
  2. Tradeoff analysis that lets users adjust specifications to impact yield.
  3. Statistical characterization that shows the user how to improve the design and make it more robust to process variations.
  4. Statistical circuit enhancement that automatically optimizes designs by sizing transistors.
  5. Statistical visualization that lets users explore and view the data.

Looks like a comprehensive set….
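As a flavour of the first capability, here is a minimal Monte Carlo sketch in Python: sample the process parameters from assumed distributions, push them through a stand-in circuit model, and read off the output spread and the yield against a spec. Everything here – the parameters, the "simulator" and the spec – is invented for illustration.

```python
# Minimal statistical-sampling sketch: Monte Carlo over process parameters.
# All distributions, the delay model and the spec are invented.

import numpy as np

rng = np.random.default_rng(0)
N = 10_000

# Process parameters modeled as Gaussians (nominal and sigma are assumptions)
vth = rng.normal(0.30, 0.02, N)   # threshold voltage [V]
tox = rng.normal(1.20, 0.05, N)   # oxide thickness [nm]

def gate_delay(vth, tox):
    """Stand-in for a circuit simulator: delay rises with Vth and tox."""
    return 10.0 * (1 + 2.0 * (vth - 0.30) + 0.5 * (tox - 1.20))   # [ps]

delays = gate_delay(vth, tox)
SPEC = 10.4   # ps, hypothetical timing spec

print(f"mean = {delays.mean():.2f} ps, sigma = {delays.std():.2f} ps")
print(f"estimated yield vs {SPEC} ps spec: {(delays <= SPEC).mean():.1%}")
```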

Post-silicon debugging worth a second look

Monday, February 12th, 2007

With verification consuming up to 70% of design effort, and debug up to 50% of that time (meaning up to 35% of overall design time is spent understanding how a design works or why it doesn't!), I share Richard Goering's musing that it is a wonder EDA vendors have paid so little attention to post-silicon debug.

Post-silicon validation being a confrontational sale may be a significant hurdle for selling wares in this space. Another not-so-insignificant fact is that post-silicon validation is done in two to three phases: first on the standalone chip, then on the system, and then in field trials. The latter two, being heavily dependent on the application and the working environment, pose too many variables in the debug process, and it is no easy task to implement all of these in a tool. Nevertheless, good solutions in this space will be a boon to the chip designer (not to mention the S&M guys who keep their fingers crossed while awaiting reports of field tests, and subsequently news of the first order…).

Who will be left standing in DFM?

Sunday, February 11th, 2007

An interesting exchange of ideas was reported in Electronics News recently.

DFM is the bridge between design and manufacturing. Most of the tools in the DFM arena today sit towards the manufacturing side of the bridge, i.e. improving OPC, while a select few focus on the design side. I agree with the viewpoint that in the consolidation phase, it will be these select few who stand a chance to survive. Moving from binary rules-based information to distribution information coming from manufacturing is not a smooth transition…
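To illustrate that last point, a small Python sketch (all numbers invented): a binary design rule returns pass/fail, while the distribution view turns the same drawn spacing into a probability of failure, given the spread coming back from manufacturing.

```python
# Binary design rule vs. distribution-aware check (illustrative numbers).

from statistics import NormalDist

MIN_SPACING = 100.0     # nm, the classic binary design rule
drawn_spacing = 104.0   # nm, as drawn in the layout

# Binary view: pass/fail, with no notion of how close to the edge we are.
binary_ok = drawn_spacing >= MIN_SPACING   # True

# Distribution view: the printed spacing varies around the drawn value,
# so compute the probability that it lands below the minimum.
litho_sigma = 3.0       # nm, assumed manufacturing variation
fail_prob = NormalDist(mu=drawn_spacing, sigma=litho_sigma).cdf(MIN_SPACING)

print(f"binary rule: {'pass' if binary_ok else 'fail'}")
print(f"P(printed spacing < {MIN_SPACING} nm) = {fail_prob:.1%}")   # ~9%
```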

Cadence deploys CPF

Thursday, February 1st, 2007

Cadence has deployed CPF (Common Power Format) support across its existing tools. Rather than making it a special feature that would have to be paid for separately, Cadence has made most of its existing tools CPF-compliant.

While on one hand this makes things more convenient for the user – he expresses the design's power intent and requirements just once, and the system/design flow takes care of the rest – it poses a potential blocking factor for the user should the industry embrace an alternate power format (UPF or a third one).

However, Cadence has said, “Wherever the industry takes CPF and UPF, if the users want it, we’ll do it. If you’re a Cadence customer, as of now, the power standards thing is over. Go make chips. Whether it’s CPF or UPF or some common thing in the future doesn’t matter any more. We’ve got the software system that will build the chips, and we’ll follow wherever the standard goes.”  

Let’s see what follows from the rival potential standard’s camp……

Low Power Specification Format War

Friday, January 19th, 2007

Cadence's primary EDA rivals felt that the Power Forward Initiative, introduced by Cadence in May ’06, wasn't open and inclusive, and joined another coalition – the Accellera UPF effort – in September. Si2's Low Power Committee (LPC) was set up in October as an attempt to bridge the gap and address users' requirement of a single low-power specification format.

Si2 first approved CPF 1.0, saying that its approval does not constitute taking sides, and that it has declared CPF a “specification” and not a “standard”. This may be a conciliatory offer to Accellera, which says it is actively working with Si2 to converge UPF and CPF into a single standard. Si2 then issued an RFT to complement CPF, and Cadence in its response has now provided the source code of its CPF 1.0 parser – in the process opening the door to tool implementations that support CPF, and hence giving their format another push.

Characterization tool for SSTA

Wednesday, January 17th, 2007

A boost to SSTA… Altos has introduced Variety, an SSTA library characterization tool. While similar tools do exist in the market, Altos' niche is that it supports multiple formats (unlike Cadence, Synopsys, IBM, Magma, etc., which support only their own proprietary formats). This is a definite advantage, as it gives the user the flexibility to switch across flows and vendors.

Characterization speed and accuracy, the two most important aspects of library characterization, are what Altos promises with this tool.
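For flavour, here is a conceptual Python sketch of what SSTA library characterization boils down to: fitting a first-order canonical delay model, delay ≈ d0 + Σ aᵢ·ΔXᵢ, from sampled simulations. The stand-in "simulator" and all the numbers are invented; a real tool would drive SPICE and write the result out in the various vendors' formats.

```python
# Conceptual SSTA characterization: recover a canonical first-order delay
# model from sampled "simulations". Sensitivities and noise are invented.

import numpy as np

rng = np.random.default_rng(1)
N = 2_000

# Normalized process parameters (zero mean, unit sigma)
dL = rng.normal(size=N)     # channel-length variation
dVth = rng.normal(size=N)   # threshold-voltage variation

def simulate_cell_delay(dL, dVth):
    """Stand-in for SPICE: nominal 50 ps plus linear sensitivities + noise."""
    return 50.0 + 3.0 * dL + 1.5 * dVth + rng.normal(0.0, 0.1, len(dL))

delays = simulate_cell_delay(dL, dVth)

# Least-squares fit recovers the canonical model a timing library would store
A = np.column_stack([np.ones(N), dL, dVth])
d0, aL, aVth = np.linalg.lstsq(A, delays, rcond=None)[0]
print(f"delay ≈ {d0:.1f} + {aL:.2f}*dL + {aVth:.2f}*dVth  [ps]")
```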

Synthesis tool meets complex design rules

Thursday, December 14th, 2006

Yet another DFM tool in the market.

Blaze DFM announced a DFM tool, Blaze IF, to address topology variations caused by CMP. It intelligently inserts dummy metal fill patterns into a design layout, taking into account not only design requirements like power and timing, but also electrical issues like signal integrity and IR drop – something which traditional approaches to metal fill do not accomplish.

This comes in addition to Blaze MO, announced earlier this year, which provides guidance to the OPC tool used in manufacturing through an annotation layer in the GDSII database. The tool optimizes the design by slightly tweaking gate lengths (within process limits) to reduce leakage power and improve timing, and passes this guidance on to the OPC tool.
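A toy Python version of the gate-length biasing idea (all coefficients invented): leakage falls off steeply as the drawn gate length is nudged up, while delay grows only mildly, so gates sitting on paths with timing slack can be biased for leakage savings at no timing cost.

```python
# Toy gate-length biasing: spend timing slack to cut leakage.
# The leakage and delay models below are invented for illustration.

import math

def leakage_factor(nudge_nm: float) -> float:
    """Leakage falls roughly exponentially with added gate length."""
    return math.exp(-0.3 * nudge_nm)        # normalized to 1.0 at zero bias

def delay_penalty(nudge_nm: float) -> float:
    """Delay grows roughly linearly with added gate length."""
    return 0.02 * nudge_nm                  # fraction of nominal delay

def bias_gate(slack_fraction: float, max_nudge_nm: float = 4.0, step: float = 1.0) -> float:
    """Nudge the gate length up as far as the path's slack allows."""
    nudge = 0.0
    while nudge + step <= max_nudge_nm and delay_penalty(nudge + step) <= slack_fraction:
        nudge += step
    return nudge

for slack in (0.00, 0.03, 0.10):            # critical, modest, relaxed paths
    n = bias_gate(slack)
    print(f"slack {slack:.0%}: bias +{n:.0f} nm -> leakage x{leakage_factor(n):.2f}")
```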

All of this aims to bridge the gap between design and manufacturing. Instead of a blanket set of rules for the complete design, design-specific optimizations, relevant to the design objectives, are carried forward into manufacturing. Making this an integral part of the flow before handoff to manufacturing is a step closer to closing the open loop caused by changes made to the design after handoff, oblivious to the design issues they may impact.