Bill Hargin – PCB materials and their signal integrity properties


Bill Hargin and Don DeGroot are “working on an every-man’s VNA” for measuring the properties of PCB materials with the help of Sierra. Learn more in Bill’s interview!

0:10 Sierra has given you some PCB materials to test their signal integrity properties. Can you elaborate?

Well, this is a discussion that goes pretty far back for me. I’ve always been interested in a more accurate representation of materials. I put something in a book a few years ago, The Printed Circuit Handbook, edited by Clyde Coombs. It’s a good book, and I wrote the chapter on signal integrity. What I’ve always been interested in is anything that can be done, in kind of a mass-market sense, to improve signal quality and our understanding of the bad things that the manufacturing world, the physical world, your world as part of a fabricator, does to the electrical behavior of signals.

As part of that, it seemed to me that there’s a bit of ambiguity that could be controlled and resolved in the laminate space, meaning the raw materials, the dielectrics and the copper, that make up the printed circuit board. So Don DeGroot, the founder of the measurement lab CCN, and I got together, and we’ve been working on an every-man’s VNA for measuring material properties. As part of that, we’ve been working with test samples from various laminate vendors and doing our own characterization of things like dielectric constant and loss tangent as a function of frequency.

That’s what we’ve been working on. It’s not on the market yet, but our intent is to give OEMs and fabricators a tool that’s plug-and-play and PC-compatible, so you can take raw PCB materials, laminates, and do your own in-house characterization of those materials from multiple vendors. Any vendor.

2:37 Is this some kind of an instrument?

Yeah. It’s a box whose measurement piece is a cavity resonator. It plugs into a USB port on your PC, which reads the data. You control it from your PC.

2:55 Then you analyze the data through the application?

Right. I would say capture and analyze the data with the application, which is my Z Planner software. Z Planner would then let you take your measurement data into your own library, one that you built, where you use the same methodology for testing every laminate you use. You read the data right into a library that you own, and you design boards with that.
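As a rough illustration of the workflow described here, a per-laminate library of measured Dk and Df versus frequency might look like the Python sketch below. The names, fields, and numbers are all hypothetical; this is not the actual Z Planner data model.

```python
from dataclasses import dataclass, field

@dataclass
class LaminateRecord:
    """One laminate, characterized in-house with a single, consistent method."""
    name: str
    vendor: str
    # Measured (frequency_GHz, Dk, Df) points from the fixture, kept sorted.
    points: list = field(default_factory=list)

    def add_measurement(self, f_ghz: float, dk: float, df: float) -> None:
        self.points.append((f_ghz, dk, df))
        self.points.sort()

    def dk_df_at(self, f_ghz: float):
        """Linearly interpolate Dk and Df at a target frequency."""
        pts = self.points
        if f_ghz <= pts[0][0]:
            return pts[0][1], pts[0][2]
        if f_ghz >= pts[-1][0]:
            return pts[-1][1], pts[-1][2]
        for (f0, dk0, df0), (f1, dk1, df1) in zip(pts, pts[1:]):
            if f0 <= f_ghz <= f1:
                t = (f_ghz - f0) / (f1 - f0)
                return dk0 + t * (dk1 - dk0), df0 + t * (df1 - df0)

# A library you own: every entry measured the same way, on your own bench.
library = {}
rec = LaminateRecord(name="ExampleLaminate", vendor="VendorA")
rec.add_measurement(1.0, 3.65, 0.0065)   # illustrative numbers only
rec.add_measurement(10.0, 3.56, 0.0070)
library[rec.name] = rec

print(library["ExampleLaminate"].dk_df_at(5.0))  # interpolated Dk, Df at 5 GHz
```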

One of the questions I was also asked is whether I think that this data, our data, will be different from what the laminate vendors provide. I try not to pre-judge data, so that you don’t prejudice the results, but we believe that our results are going to be different, and there are a couple of reasons why.

There are about 11 different IPC methods for characterizing laminate PCB materials. Actually, not about: there are exactly 11. It’s this standard, that standard, and the other one. They’re all different, and they’re all good in their own right, but there are 11 of them. You can’t have 11 different standards; for a standard to be effective and useful, there needs to be just one.

A smarter tool from an engineering standpoint

Everybody, and I’m speaking loosely here, everybody uses the standard that favors their material to characterize their laminate. The marketing guys advocate for that. So, if one method measures a Df of .005 and another measures a Df of .009, which one do you choose? The one that said .005. Well, in our thinking, what if there was just one method you used? It happens to be our tool, but it’s also smarter from an engineering standpoint.

What if I took all the laminates that I use, or that I’m considering, characterized them all with one of those 11 methodologies that IPC says are okay, and designed my PCBs with that data? That seems like a smarter idea, because a lot of what we try to do is make trade-offs between materials A, B, and C. So, we account for all of that: resin content, thickness, frequency, all the things people care about, and you control it at your own test bench.

So, to the degree that you can get the samples, you can slap them into the measurement fixture, and off you go. It’s really easy.
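To see why the .005-versus-.009 Df question matters, here is a minimal sketch using the standard first-order formula for dielectric attenuation, with assumed values for frequency and Dk:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def dielectric_loss_db_per_inch(f_hz: float, dk: float, df: float) -> float:
    """Classic dielectric attenuation: alpha_d = pi * f * sqrt(Dk) * Df / c (Np/m)."""
    alpha_np_per_m = math.pi * f_hz * math.sqrt(dk) * df / C
    return alpha_np_per_m * 8.686 * 0.0254  # Np/m -> dB/m -> dB/inch

f = 5e9   # 5 GHz, an assumed operating frequency
dk = 3.7  # an assumed dielectric constant

for df in (0.005, 0.009):
    print(f"Df = {df}: {dielectric_loss_db_per_inch(f, dk, df):.3f} dB/inch")
# Df = 0.005: ~0.11 dB/inch; Df = 0.009: ~0.20 dB/inch.
# Over a 20-inch channel that is roughly 2.2 dB versus 4.0 dB of
# dielectric loss alone, from the same nominal material.
```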

6:02 What is the method that you have chosen out of those 11?

It’s a good question. It’s a cavity resonator method. We’re still experimenting with a few different methodology standards. I would say that by the time DesignCon rolls around, I’ll be able to give you a firmer answer on where our final destination is.

Does it cover a wide frequency range?

As high as you want to go.

6:31 Could you give us a brief idea of what this test is? How does the cavity resonator function?

I don’t know if I’d be qualified to go into the physics of it, but basically, I’ll talk to you about some of the trade-offs we’ve weighed by way of comparison.

So, a lot of people realize that the closest thing you could come to an actual PCB would include copper. In other words, the stripline-based methodologies: SPP is one, S3 is one, Bereskin stripline; there are various stripline techniques. In our case, we’re considering those, but we’re gravitating towards strictly dielectric-based methodologies.

This is a real-time discussion internally. I would even welcome feedback from you or anybody seeing the video. We’re trying to come up with something that we think would be a generic, widely used approach, that’s still accurate. I will admit that when you add copper, you’re adding a degree of loss that’s hard to anticipate without copper, wherever that may come from.
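For a feel of the physics being alluded to, the classic cavity-perturbation relations extract Dk and loss tangent from how a small sample shifts a cavity’s resonant frequency and Q. The sketch below uses the textbook small-sample approximation with made-up numbers; it is not necessarily the formulation their instrument will use.

```python
def cavity_perturbation(f_empty, f_loaded, q_empty, q_loaded, v_cavity, v_sample):
    """Extract (Dk, tan_delta) from the resonant-frequency and Q shift caused by
    a small dielectric sample placed at the cavity's E-field maximum.
    Standard small-perturbation approximation (see e.g. Pozar, Microwave
    Engineering); valid only when the sample is much smaller than the cavity."""
    ratio = v_cavity / v_sample
    eps_real = 1.0 + (f_empty - f_loaded) / f_loaded * ratio / 2.0
    eps_imag = (1.0 / q_loaded - 1.0 / q_empty) * ratio / 4.0
    return eps_real, eps_imag / eps_real

# Illustrative, made-up numbers: a 10 GHz cavity detuned by a thin laminate coupon.
dk, df = cavity_perturbation(
    f_empty=10.000e9, f_loaded=9.897e9,   # Hz
    q_empty=8000.0, q_loaded=3700.0,      # empty vs. loaded quality factor
    v_cavity=1.0e-5, v_sample=2.0e-8,     # cubic meters
)
print(f"Dk = {dk:.2f}, tan delta = {df:.4f}")  # -> Dk = 3.60, tan delta = 0.0050
```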

Textbook vs. real life

I taught a class yesterday at PCB West on material selection and stack-up design. A lot of people don’t realize that when you buy copper as part of a PCB, it doesn’t have the same conductivity, or bulk resistivity, that copper would have in a physics textbook. It’s about 20% lower. Speaking from memory, these numbers could be wrong, but I think a textbook may say something like 4.8 × 10⁷ Siemens per meter for the conductivity of copper.

In a PCB, it’s about 3.9 × 10⁷. Why is that? Why is it lower?

That’s loss. That is signal loss. That’s one source of loss. Then there’s copper roughness; there are uneven surfaces, which tie to roughness. All of these come into play. So Don and I, as part of CCN, are experimenting with a bunch of different methodologies, trying to come up with a lowest-common-denominator approach that anyone and everyone could use to capture their own laminate data.
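As a quick sanity check on what that conductivity gap costs, here is a minimal sketch using the standard skin-effect relations and the figures quoted above (which the speaker flags as from memory):

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability, H/m

def skin_depth_m(f_hz: float, sigma: float) -> float:
    """Skin depth: delta = sqrt(1 / (pi * f * mu0 * sigma))."""
    return math.sqrt(1.0 / (math.pi * f_hz * MU0 * sigma))

def sheet_resistance(f_hz: float, sigma: float) -> float:
    """AC surface resistance per square, Rs = 1 / (sigma * delta)."""
    return 1.0 / (sigma * skin_depth_m(f_hz, sigma))

f = 5e9
sigma_textbook = 4.8e7  # S/m, as quoted in the interview (from memory)
sigma_pcb = 3.9e7       # S/m, ditto

rs_tb = sheet_resistance(f, sigma_textbook)
rs_pcb = sheet_resistance(f, sigma_pcb)
print(f"Rs textbook: {rs_tb*1e3:.2f} mOhm/sq, Rs PCB: {rs_pcb*1e3:.2f} mOhm/sq")
print(f"Conductor loss increase: {100 * (rs_pcb / rs_tb - 1):.1f} %")
# Rs scales as 1/sqrt(sigma), so ~20% lower conductivity means roughly
# 11% more conductor loss, before roughness is even counted.
```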

At this time, we’re leaning towards a cavity resonator approach, which is based on clamping onto the dielectric. We may end up doing a stripline version of it as well; we haven’t decided. Part of what we’re focused on is price. VNAs, as you at Sierra probably know, can be pretty expensive, and a lot of people don’t have them. So, we’re trying to get down to a price point where just about everyone could afford it, even small companies.

10:02 So you’re trying to eliminate the energy loss caused by the copper surfaces, and so on, from the measurement of the dielectric properties of the PCB materials?

We’re trying to isolate the impact of copper, because I think that’s a whole other discussion: what part does copper contribute? You can measure Dk and Df without considering copper, and people do, or you can measure it with copper. But when you get to an actual PCB, the loss and the Dk are usually higher than what people expect them to be.

I don’t know if that’s your experience at Sierra, but most people are of the view, and I’m of the view that, usually, what you experience is higher loss and higher Dk than what you thought it was going to be.

And the reasons for that are still under discussion. Copper is part of it: copper conductivity, like I said, and copper roughness. How is a transmission-line-based methodology different from a straight dielectric approach? Both of them will give you numbers; how different are they?
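One way to frame that question numerically: transmission-line attenuation is roughly the sum of a conductor term and a dielectric term, so a dielectric-only method will always read lower than a copper-included one. The sketch below uses crude first-order stripline approximations with assumed values, good for qualitative comparison only:

```python
import math

C = 299_792_458.0          # m/s
MU0 = 4 * math.pi * 1e-7   # H/m

def alpha_dielectric_db_in(f, dk, df):
    """Dielectric loss: pi * f * sqrt(Dk) * Df / c, in dB/inch."""
    return math.pi * f * math.sqrt(dk) * df / C * 8.686 * 0.0254

def alpha_conductor_db_in(f, sigma, z0, width_m):
    """Crude first-order conductor loss, Rs / (2 * Z0 * width), in dB/inch.
    Assumes current on both faces of a stripline trace and ignores
    roughness, so it is only a rough comparison figure."""
    rs = math.sqrt(math.pi * f * MU0 / sigma)
    return rs / (2.0 * z0 * width_m) * 8.686 * 0.0254

f, dk, df = 5e9, 3.7, 0.005         # assumed material values
sigma, z0, w = 3.9e7, 50.0, 127e-6  # PCB copper, 50-ohm line, 5-mil trace

a_d = alpha_dielectric_db_in(f, dk, df)
a_c = alpha_conductor_db_in(f, sigma, z0, w)
print(f"dielectric: {a_d:.2f} dB/in, conductor: {a_c:.2f} dB/in, "
      f"total: {a_d + a_c:.2f} dB/in")
# A dielectric-only method sees only the first term; a stripline method
# sees both, plus roughness on top. Hence the gap between the numbers.
```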

Looking for the Holy Grail

So, Don and I are discussing and measuring all of that. Fortunately, my advantage in partnering with a guy like Don is that he’s independent. He doesn’t work for a laminate vendor, he doesn’t work for an OEM, he doesn’t work for a copper manufacturer. He has tons of independent test data; historic, independent test data. So, I get the benefit of all of that experience.

We’re kind of looking for the Holy Grail here, and we’re defining the Holy Grail as something that everyone can use. We don’t want some tool that costs 50,000 dollars and that you need a PhD to run. We want something that anybody could run, and we’re aiming for the 20K range on price.

12:10 When do you think this instrument is likely to come out in the market?

We’re targeting having it out by DesignCon. We’re actually giving a paper at DesignCon that addresses the methodology. It’ll use some of the samples that we received from Sierra, in fact, the Isola I-Speed samples. We’re also trying to get samples from some of the Asian manufacturers, possibly others. We want kind of a cross-section. We hope to be able to ship by early next year, which is 2019. We haven’t even named it yet. Do you want to know what our code name is?

This could be the real name, but I’m wide open to suggestions. Get ready to chuckle: in a meeting we had, I named the tool The Magical Mystery Tool. It doesn’t have a real name yet. We’re also considering The Commando 450, which was the name of the shower head on Seinfeld that Kramer used.

But MMT, you could also say, stands for microwave materials tester. We might call it the MMT. We might name it Allegro. I don’t know. We’re thinking about it.

We’re considering Pentium, Itanium, Kleenex, Chevy, Oldsmobile. We’re considering various brand names. Nike. I don’t know.

13:46 Do you have any suggestions for designers who are concerned with this type of design and with PCB materials?

My advice is to get a law degree and go into IP law. Become an IP attorney, they make more money.

Part of the reason that I formed my company, and this is on a very serious note, is that it kind of amazes me that OEM engineering teams will spend 60 hours laying out a board. They’ll spend 40 hours doing signal integrity simulations and iterations. Add that up and it’s about 100 hours. And that’s not even the schematic. That’s just the physical layout, the signal integrity simulation, and the troubleshooting of a board.

Imagine I told you there was a planet where people would spend 100 hours on all of that and only two hours on the stack-up, working with the fabricator. Two hours on the stack-up, for something that takes 100 hours. And the stack-up is the foundation of the board. It touches every signal, and yet engineers and CAD designers only spend two hours on it? What’s that all about?

Spend time on your stack-up design

I wouldn’t even believe you if you told me that, but that’s the world we’re living in. I’ve felt for a long time, not just recently but for the last 20 years, that engineers and CAD designers should spend more time on the stack-up than we do. I’ve been sort of a fan and disciple of Lee Ritchey’s books for a long time. Of all the high-speed authors out there, and there are a lot of them, and I buy all of their books, Lee is the only one who has spent a significant amount of time on stack-up design.

If you’re looking for that last 10-20% of signal quality improvement, as frequencies continue to escalate, I think you really need to look at how you’re modeling the stack-up, because the stack-up itself touches everything. How you model the physics of the copper relative to the dielectrics is critical. That’s what my company focuses on, that’s what my software focuses on, and that’s my passion.
