Marko Marin: Signal Integrity – Design of Predictable Interconnects

Marko Marin, Technical Account Manager at Ansys, stopped by our booth at DesignCon to answer a few questions about signal integrity.

Marko Marin – The Interview

Here are the topics covered in the interview:

  • 0:08 Tell us about your paper.
  • 2:53 If we had a machine that measures the surface roughness and we’d give you the data, would it be helpful?
  • 4:51 Is resistivity a big deal?
  • 5:50 What do you use as simulation software?
  • 7:30 In that ecosystem, it’s not just the PCB. It’s the chip, the packaging…
  • 7:56 We have a new 1-mil trace and space laser technology called Catlam. What do you think about lasering and not etching traces?
  • 8:52 How did you solve that problem of having multiple reflections?
  • 9:00 Tell us about your mission at Ansys.
  • 9:24 How do you deal with signal integrity and power integrity at Ansys?
  • 11:11 Do you have any advice on addressing signal integrity and power integrity?
  • 11:40 How do engineers get the data for simulation?
  • 12:32 Can you explain that phenomenon?
  • 12:53 What do customers request the most?
  • 14:12 What are the industries that need more accuracy?
  • 15:01 How would you compare Ansys software to the other ones on the market?
  • 15:35 How do you make sure you get the same consistent product from vendors?
  • 16:06 How is your experience at DesignCon?

Tell us about your paper from last year. What was the key point of the paper?

The title was “40 Gigahertz PCB Interconnect Validation: Expectation vs. Reality.” Basically, what we wanted to address is that often, when you design boards, you design the board, you do the simulations, you manufacture the board, you measure it, and then your simulations don’t match your measurements. And then you start asking, “Why doesn’t it correlate?”

So, in this paper, we looked in a systematic way at how to make these expectations match the reality. I used the process devised by Dr. Yuriy Shlepnev called the “Sink or Swim” approach. It is a systematic way that, at the end of the process, gives you analysis-to-measurement correlation, and it does so in a minimal number of steps. There are basically three steps that need to be followed.

First, you need to identify the geometry adjustments, because circuit boards are not built as they are designed. That was one of the things we found. Second, there is a lack of material models, especially for digital signals, where you are looking at frequency content from DC up to 40 or 50 gigahertz. So you need broadband, frequency-dependent material models.

And lastly, you need EDA software that is validated for PCB and packaging applications. Those three elements are what is needed to close the loop and close this gap between expectation and reality. It’s really difficult to predict how the board is going to look after you manufacture it. For instance, the traces we put on the outer layers were way off. They were not what the PCB manufacturer told us they were going to be. We also lack information on trace shape: what the etch angle is, and so on.

Solder mask information is also very difficult to get, just think of the solder mask properties. And there is a lack of surface roughness models. That’s the biggest thing we showed: if you do the analysis based upon pre-manufacturing assumptions, well… at 10 gigahertz, for example, there was a 30% mismatch between simulation and measurement, because of the lack of surface roughness models.
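The interview doesn’t say which roughness model eventually closed that gap, but as a rough illustration of why roughness dominates at these frequencies, here is a minimal sketch (with a hypothetical 0.5 µm RMS roughness) of the classic Hammerstad correction factor, which multiplies the smooth-copper conductor loss and grows as the roughness becomes comparable to the skin depth:

```python
import math

MU0 = 4e-7 * math.pi          # vacuum permeability [H/m]
RHO_CU = 1.724e-8             # bulk copper resistivity [ohm*m]

def skin_depth(freq_hz, rho=RHO_CU):
    """Skin depth of copper at a given frequency [m]."""
    return math.sqrt(rho / (math.pi * freq_hz * MU0))

def hammerstad_factor(freq_hz, rq_m):
    """Hammerstad roughness correction: multiplies smooth-copper
    conductor loss. rq_m is the RMS roughness in metres."""
    delta = skin_depth(freq_hz)
    return 1.0 + (2.0 / math.pi) * math.atan(1.4 * (rq_m / delta) ** 2)

# Hypothetical example: 0.5 um RMS roughness
for f in (1e9, 10e9, 40e9):
    print(f"{f/1e9:5.0f} GHz  K = {hammerstad_factor(f, 0.5e-6):.2f}")
```

Hammerstad saturates at a factor of two, so at 40 GHz and beyond, more detailed models such as Huray’s snowball model are typically used instead.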

What if you had a machine that measures the surface roughness? You’d just take that laser that does the measurement and you’d draw that across the trace. And we’d give you that data. Would that be helpful?

Yes and no. Unfortunately, that is probably done at the copper-clad laminate vendors, those who make the laminates. But the thing is, once you want to build your multilayer PCB, the PCB manufacturer applies a treatment to the side of the copper trace that faces the prepreg. They apply oxidation treatment, micro-etching, and so on, to get better adhesion and decrease the risk of delamination. And that impacts the roughness. So the roughness you get prior to manufacturing is not the actual roughness on the manufactured board.

So you need to build validation boards, or test boards, to help you extract the material properties. The simplest method I know of is generalized modal S-parameters. That is the method we have been using. You basically only need two PCB transmission line segments of different lengths. They can be single-ended or differential; it doesn’t matter. You assume that the line structures are pretty much identical, and based upon that, you can do the material extraction. There is a simple way to do it, sketched below.
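As a very simplified sketch of the idea behind the two-line extraction (the actual GMS-parameter procedure is more rigorous), assuming identical launches on both segments, dividing the two transmission coefficients cancels the launch effects and leaves only the extra length of line, from which a per-unit-length propagation constant can be pulled. The function names and branch handling below are illustrative only:

```python
import numpy as np

def gamma_from_two_lines(s21_short, s21_long, len_short_m, len_long_m):
    """Simplified two-line idea: with identical launches and well-matched
    lines, s21_long / s21_short depends only on the extra line length.
    Returns the complex propagation constant gamma = alpha + j*beta [1/m]."""
    dlen = len_long_m - len_short_m
    ratio = np.asarray(s21_long) / np.asarray(s21_short)
    # Principal branch only; real measured data needs phase unwrapping.
    return -np.log(ratio) / dlen

def attenuation_db_per_m(gamma):
    """Attenuation in dB/m from the real part of gamma."""
    return 20.0 * np.log10(np.e) * np.real(gamma)
```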

DOWNLOAD OUR PCB TRANSMISSION LINES eBOOK:

PCB Transmission Lines eBook

And with this, you can identify the copper resistivity, because the resistivity of copper on a PCB is not the same as what you get from physics handbooks. We found that, in our case, the resistivity was about 30% higher than that of pure copper, because the grain structure is different and there are also impurity metals in it. There’s more information about that in Paul Huray’s book, The Foundations of Signal Integrity.
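As a quick back-of-the-envelope check on what that 30% means: at DC the trace resistance goes up by the full 30%, but in the skin-effect region conductor loss scales with the square root of resistivity, so the high-frequency penalty is smaller. A minimal sketch:

```python
import math

rho_bulk = 1.724e-8          # handbook copper resistivity [ohm*m]
rho_pcb  = 1.30 * rho_bulk   # roughly 30% higher, as found in the paper

# Surface resistance Rs = sqrt(pi * f * mu0 * rho), so skin-effect loss
# grows only with the square root of resistivity.
extra_hf_loss = math.sqrt(rho_pcb / rho_bulk) - 1.0
print(f"Extra skin-effect conductor loss: {extra_hf_loss:.1%}")   # ~14%
```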

Is resistivity a big deal?

Well, we found that, especially if you run transient simulations, you need to extrapolate your measurement down to DC. So you need to predict the proper DC point for your transient simulations. So yes, for signal integrity, both the low and high frequencies matter.

Another finding in our paper was that the VNAs we were using gave us wrong results at really low frequencies, because their internals weren’t behaving as expected. We expected it to go asymptotically down with frequency. It went down, but for some measurements, after 50 gigahertz, it went up. And it turned out that that was a defect of the calibration, of these electronic calibration units that are supplied with the network analyzers.

It just shows that it’s really difficult to make accurate measurements from really low frequency up to really high frequency.

What did you guys use for simulation software?

For the simulation software, we did this project using Simbeor. But again, this approach of building validation boards and benchmarking your simulation software, which you can also use to pre-qualify your PCB manufacturer, can be applied independently of which laminate or which software is used. This is a process, a methodology for how to design predictable interconnects. Because in the end, what we want is to predict the behavior of the structures that we put on actual boards.

And in this process, we discovered that localization was very important. The structures you design should be structures that you can accurately simulate. For instance, we had a via transition where, in one case, we had the return vias nearby, and in other cases, we had the return path far away. It turns out that you cannot predict the behavior of that via transition if you don’t localize it.

So when you design your PCB, you have to think about the structures you put on the board and use plenty of stitching vias. People say, “Well, that takes real estate.” But if you want to be able to predict the behavior of your via transition independently of where it is placed on the board, on the edge, in the center, or wherever, you need to localize the structure. That was one of the big eye-openers for me.

So in that ecosystem, it’s not just the PCB, it’s the chip, packaging …

Actually, here we kept it simple. As a hardware designer, you really don’t have control over the chips; you buy them from a vendor. The same goes for the package. So we just focused on the PCB, the bare PCB, nothing else. And just that is challenging enough. It’s a really ambitious goal to get correlation up to 40 gigahertz, and even beyond.

We have a new technology. We laser the trenches where the trace should go, and then we plate inside.

So there’s no etching. There is still the issue of the surface roughness of the copper, because it’s all electroless-deposited copper, but we use a laser to shape the traces. Do you think that would be beneficial, not beneficial, or doesn’t it matter? What do you think?

I think what we should be concerned with is not having impedance-controlled traces or loss control; it’s better to have predictable ones. Because if you can predict it, then you can account for it. When I worked as a hardware designer, in some cases we had trouble with the really short links. People tend to think, “Well, the long traces with a lot of loss are the troublesome ones.” On the contrary, too little loss can give you a headache, actually more of a headache than more loss, or a medium amount of loss. That’s because you then get multiple reflections in the channel, and those are really hard to equalize.

So, as long as we can predict it, that’s the key. Then you can account for it.

How did you solve that problem of having multiple reflections?

Either you need to make the traces longer or, if you cannot do that, maybe switch to a cheaper material, one that has more loss. So it depends. It depends on the application.
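A rough way to see why more loss helps here: a re-reflected wave makes one extra round trip through the channel, so it is attenuated by an additional 2 × (loss per unit length) × (length) relative to the main signal. The numbers below are purely hypothetical:

```python
def echo_suppression_db(loss_db_per_inch, length_inch):
    """Extra attenuation of the first re-reflection: it travels the channel
    two more times than the main cursor before reaching the receiver."""
    return 2.0 * loss_db_per_inch * length_inch

print(echo_suppression_db(0.5, 2))    # short, low-loss link:  2 dB, echoes barely damped
print(echo_suppression_db(1.0, 10))   # longer, lossier link: 20 dB, echoes well suppressed
```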

Tell us a little bit about your transition from your previous life to your new life. What’s different? What do you like better? How are you coping with the change?

At the previous company I was with, I was working as a signal integrity lead, so I was focused on solving problems. My transition to Ansys has been really interesting because Ansys is really big on multiphysics. And the more you think about it, the more advanced designs become, the more need there is for multiphysics, because you cannot separate these physics any more. You need to take all the physics into account at once. For example, if you have a trace routed nearby, that’s going to affect the loss characteristics. So you need to take this whole ecosystem into account.

Alright, for a hardware engineer trying to attack the issue of signal integrity and power integrity, why would you recommend your paper?

Let’s begin with signal integrity, because signal integrity is, we can say, one-dimensional by nature, while power integrity is two-dimensional, and it has a different set of challenges. For signal integrity, you have to know about the measurements, you have to know about the design, and you have to know about the manufacturing. It depends on up to which frequency you intend to work. But I think our paper is a good starting point. Actually, I think more companies should build these validation boards. They are very simple to build, but you can gain so much information from them, and they can be used for multiple purposes.

They can be used for benchmarking the software, and they can be used to qualify the PCB manufacturer. Let’s say you have one design that you do with vendor A and you transfer it to vendor B. How do you know that those boards will be identical, or have comparable behavior? Well, you don’t send your production boards; that would be really difficult. You can send these test boards, let the manufacturer build them, and measure them.

And see if there are any changes. You also need to cross-section the boards to get the geometry adjustments and so on. So you really need to have full control. A tight cooperation with the PCB manufacturer is a must. It can no longer be, “This is my part, then I hand it over to the next person.” No, you have to work in an ecosystem. Everything matters, especially if you are shooting for 40 or 50 gigahertz. Everything matters.

You mentioned a few things, like the right data that you need to run your simulations. You mentioned the surface roughness and the material properties. How would an engineer get that information?

IPC has 20 different methods for obtaining Dk and Df values, and depending on which method you use, you get different results. Plus, many of these methods are discrete: you get the behavior at, say, 1, 5, or 10 gigahertz. But you need a continuous, frequency-dependent model. So no, you cannot get this information from the vendors.

We found that the Panasonic laminate was pretty close to the datasheet values. Not exactly the same, but rather close. But for the surface roughness, there is a complete lack; there is no information. You just get the Ra and Rz values, and those are related more to the mechanical properties of a laminate than to the electrical properties. We have also found that, once you TDR the traces, the impedance goes up a bit with HVLP (hyper very low profile) copper. So these smoother coppers seem to give a higher impedance. And this can be explained if you study the phenomenon using a true electromagnetic field solver.
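The interview doesn’t name the dielectric model used, but the continuous, frequency-dependent model mentioned above is commonly realized as a wideband Debye (Djordjevic-Sarkar) model, which turns a single Dk/Df point from a datasheet into a causal, broadband permittivity. A minimal sketch, with the pole-distribution bounds f1 and f2 assumed:

```python
import numpy as np

def wideband_debye(freq_hz, dk_meas, df_meas, f_meas_hz, f1=1e3, f2=1e12):
    """Wideband Debye sketch: build a causal, continuous complex permittivity
    from one Dk/Df point measured at f_meas_hz. f1/f2 bound the pole
    distribution (assumed values); returns Dk(f) and Df(f)."""
    w  = 2 * np.pi * np.asarray(freq_hz, dtype=float)
    w0 = 2 * np.pi * f_meas_hz
    w1, w2 = 2 * np.pi * f1, 2 * np.pi * f2
    span = np.log(w2 / w1)

    # Fit delta_eps so the imaginary part matches Dk*Df at the measured point,
    # then eps_inf so the real part matches Dk there.
    k = np.log((w2 + 1j * w0) / (w1 + 1j * w0))
    delta_eps = dk_meas * df_meas * span / (-k.imag)
    eps_inf   = dk_meas - delta_eps * k.real / span

    eps = eps_inf + (delta_eps / span) * np.log((w2 + 1j * w) / (w1 + 1j * w))
    return eps.real, -eps.imag / eps.real

# Hypothetical laminate: Dk = 3.6, Df = 0.006 specified at 10 GHz
dk, df = wideband_debye(np.array([1e9, 10e9, 40e9]), 3.6, 0.006, 10e9)
```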

Do you understand the phenomenon? Why is it there?

It has to do with the circulation of the current. You get eddy currents, and this increases the inductance of the trace, so the impedance goes up a bit.
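As a quick sanity check on “goes up a bit”: characteristic impedance scales as the square root of the inductance per unit length, so a small inductance change shows up as roughly half that change in impedance. The 4% figure below is an assumption purely for illustration:

```python
z0_nominal = 50.0            # nominal single-ended impedance [ohm]
dL = 0.04                    # assumed 4% inductance difference between copper foils
print(z0_nominal * (1 + dL) ** 0.5)   # ~51 ohm: Z0 = sqrt(L/C) moves by about dL/2
```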

You’re seeing a lot of customers. I’m assuming they are sophisticated customers with sophisticated problems. Is there any correlation from one customer to the next? Do you see any commonality in terms of the problems they’re trying to solve?

Actually, one of the common things is that people are now becoming more aware that there is a lack of material models.

Because even if you have a gold-standard field solver: garbage in, garbage out. You have to feed it the correct data to get correct results. And there is no standardized way of getting this data. There are techniques like generalized modal S-parameters, which is a very robust method, but no standard. And people are starting to realize that if you want analysis-to-measurement correlation, you need to have proper material models. You need to have proper information about the trace shape, the solder mask, and so on.

So much more cooperation is needed between the PCB manufacturers, the designers, and even the semiconductor vendors, I would say. It’s a complete ecosystem. Because the margins have decreased. Previously, when you were running at lower speeds, there was plenty of margin, so it didn’t matter; maybe you had good enough accuracy. But now, if you want really top-notch accuracy, you need to have insight into all of these domains.

Can you talk a little bit about the industries that need more accuracy, the industries where you’re seeing that?

We are looking for lower power consumption, so, for example, you don’t put a retimer on a board because it consumes more power and takes more real estate. So especially in the industries where you deal with chip-to-module links, you really need to fulfill the spec. And how far you can go, well, you need to be able to predict the behavior of your channel. You cannot just say, “I want impedance-controlled traces and no loss.” As I said, especially if you have a short channel, having too little loss can be problematic. Then you really need to optimize the transitions and discontinuities.

Sorry to bounce around but how would you compare Ansys software to ADS or HyperLynx?

I would say Ansys models the geometry as it is and uses the least amount of assumptions, at least among the tools I work with. The least amount of assumptions, and it solves using full-wave 3D finite element methods. So that’s a strength. It’s a very powerful tool. It has been used for many years for antenna design and microwave design, and especially if you want to go to high frequencies for signal integrity, it’s suitable.

What about SerDes?

It depends. Basically the only thing that you can control as a designer is the channel.

And luckily, many SerDes today are auto-adaptive. You need to take care of the channel, make the channel as clean as possible, and then, hopefully, if the algorithms are designed properly, it will auto-tune.

The big thing in PCB manufacturing is consistency. For a PCB manufacturer to achieve consistency, all the supporting steps, maintaining the tanks, maintaining the processes, maintaining the equipment, have to be extremely consistent. Even from lot to lot, a PCB manufacturer, even if they do their best, will still have inconsistencies.

How do you make sure you’re getting a consistent product from the same vendor, over and over?

You need to have a good dialogue with your PCB vendor, and be honest: what are the limitations? Maybe, as designers, we are expecting too much from the PCB manufacturer. Maybe one solution would be, say, instead of using really narrow traces that are really difficult to control, to use wider traces and so on. But again, an honest dialogue between all the parties is necessary. And I think, again, as long as we can predict it, we can account for it. It’s the unpredictable, the random, that’s the problem.
