Yearly Archives: 2013

How to get your board to work before even powering it

As a very lucky embedded systems engineer I get to design and build many, many PCBs. And when I say many, I mean it. Oftentimes I can have 2-3 different designs going at the same time. Some designs are simple: a microcontroller with a few components. Others are complex boards with impedance matching, length matching, and BGAs worth upwards of $5k. However, no matter how much each board is worth, the second a new design comes back from assembly (if I don’t assemble it myself) it gets special treatment. It feels a bit like a police officer interrogating a suspect: I’m suspicious of the whole board, and until it gives me the truth (and runs as expected) I assume it will lie to me and cause all kinds of issues.


Most of the designs work wonderfully. When you take the time and dive into the details to ensure the board will be good on arrival, you’re stacking the odds in your favor. The board is working before it even leaves for manufacturing (as Sun Tzu said, a war is won before a single battle begins). But in engineering, like in many other fields, things happen. That power supply specification you thought was right isn’t anymore, because the customer asked for changes and there was little time to investigate the ramifications. A manufacturer failed to give you the right spec. The assembly house placed a diode with the reverse orientation. All of these things happen. At best, a few component changes resolve the issues and everyone is happy. Blue wires on a board are not desirable, but you live with them. The worst, however, is when issues damage the board. You’re left to figure out what went wrong, and when you finally do, there’s nothing to be done but change components or build a whole new board (the former brings a whole host of issues). This should never happen, for a simple reason: it is mostly avoidable.

The most common issues that can destroy a board are power supply problems. The simple reason is that power supplies typically produce large currents, large currents create heat, and heat destroys components and the PCB (PCB traces can act as wonderful fuses). Clamping diodes in ICs can handle a signal that is over-voltage (within a reasonable range), but a power supply will keep trying to pump more and more current until the desired voltage is reached (not likely when there’s a short). That energy has to go somewhere, and it gets dissipated by components that are not designed for it. Given this, there is a simple regimen that every board I get goes through:


  1. All components are visually checked to ensure there’s nothing strange going on. This means checking polarized components for the right orientation, making sure ICs have pin 1 correctly aligned, and looking for any obvious shorts, missing components, etc. You’ll be surprised at the stuff that Automated Optical Inspection (AOI) doesn’t catch.
  2. All power rails are checked with a multimeter for shorts to ground and to each other; if a rail is shorted, you can’t power the device. Note that the term short is subjective: each multimeter has a different limit for what it considers a short and may still emit a beep. Anything below 1 ohm is typically a hard short (metal to metal). Never underestimate this check; it is probably the most overlooked but useful test. If all is OK your board might not work, but it will likely not blow up.
  3. If the design allows, all on-board power supplies are disconnected from the circuits they power. While disconnected from the load, each supply is powered in isolation and checked to ensure its voltage is correct. Imagine a mistake in the feedback network of a regulator, and 5V gets supplied to a 3.3V part. It’s happened, but checking the voltage before it is applied to the actual circuit can catch this (see the quick calculation after this list).
  4. For powering the on-board power supplies, use a current-limited bench supply with the limit set to a reasonable figure. You might hit the constant-current limit, but that only results in a lower voltage, which is unlikely to damage anything.
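
To make step 3 concrete, here is a minimal sketch of the arithmetic behind the feedback check. It assumes the common adjustable-regulator topology where Vout = Vref * (1 + R_top / R_bottom); the reference voltage and resistor values below are hypothetical, so substitute the numbers from your own regulator’s datasheet and schematic.

    # Quick sanity check for step 3: what voltage should the regulator's
    # feedback divider actually produce? Assumes the common adjustable
    # regulator topology Vout = Vref * (1 + R_top / R_bottom); check your
    # part's datasheet, since some regulators use a different formula.

    def regulator_vout(vref, r_top, r_bottom):
        """Expected output voltage for a resistive feedback divider."""
        return vref * (1.0 + r_top / r_bottom)

    # Hypothetical values: a 0.8 V reference with a 31.6k / 10k divider
    # should land near 3.3 V. Stuffing the wrong top resistor (say 52.3k)
    # pushes the rail toward 5 V -- exactly the mistake step 3 catches.
    print(regulator_vout(0.8, 31.6e3, 10.0e3))   # ~3.33 V
    print(regulator_vout(0.8, 52.3e3, 10.0e3))   # ~4.98 V

Comparing the measured rail against this expected number, before the load is ever connected, tells you immediately whether the divider was stuffed correctly.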

These simple checks ahead of time have saved me a lot of headaches. Once the board is blown there’s typically little time to build new boards, test them, etc. I hope they’ll help you in your next project as well. When you’re confident the basic design is working, you can lighten up on the checks (assuming you trust assembly).

Outlook 2013 is eating your bandwidth for breakfast, lunch and dinner

I recently had to switch hosting to a new service provider. The transition was quite smooth and I was up and running in about an hour. The e-mail server came up nicely, but I noticed something interesting: Outlook 2013 took quite a while to Send/Receive messages. I was used to a pretty quick Send/Receive, given that the e-mails are small (under 100kB).

Since I was quite happy with everything else from the new hosting company, I was inclined to leave the matter alone. It was a bit annoying, though, and I noticed that e-mails would take a while to arrive compared to my phone. Eventually I broke down and contacted the web hosting company, but they could find nothing wrong.

Then a few days ago I checked my bandwidth (it can’t hurt to know how things are going) and I was shocked to see over 9GB of IMAP transfer. Considering the largest mailbox is about 500MB, this would mean a complete download of the account 18 times. In one day alone over 1.2GB was transferred. This couldn’t be happening. IMAP is configured to download only new messages, and most of those are a few MB at most.

Outlook 2013 loves eating IMAP bandwidth
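
If you want to sanity-check the mailbox-size math yourself, a small Python sketch using the standard imaplib module can total up what a folder actually holds on the server, which you can then compare against the transfer numbers your host reports. The server name and credentials below are placeholders, not real values.

    # Rough mailbox-size check over IMAP, for comparing against the host's
    # reported transfer numbers. Server and credentials are placeholders.
    import imaplib

    HOST = "mail.example.com"      # placeholder: your IMAP server
    USER = "user@example.com"      # placeholder: your account
    PASSWORD = "secret"            # placeholder: your password

    imap = imaplib.IMAP4_SSL(HOST)
    imap.login(USER, PASSWORD)
    imap.select("INBOX", readonly=True)

    # Ask the server for the size of every message without downloading bodies.
    typ, data = imap.search(None, "ALL")
    total_bytes = 0
    for num in data[0].split():
        typ, fetched = imap.fetch(num, "(RFC822.SIZE)")
        # Each response looks like: b'1 (RFC822.SIZE 14532)'
        total_bytes += int(fetched[0].split()[-1].rstrip(b")"))

    print("INBOX holds about %.1f MB" % (total_bytes / 1e6))
    imap.logout()

If the transferred bytes dwarf what the mailboxes actually hold, the extra traffic is the client re-downloading messages rather than anything new arriving.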

A quick check online revealed that Outlook 2013 (and perhaps other versions) has issues handling IMAP. Seriously? One of the most common internet protocols, and Outlook has problems supporting it? I already had enough trouble putting up with the new Outlook 2013 interface. To be honest, I don’t like it at all; the large icons make reading e-mails more difficult.

The options aren’t good. I tried switching the e-mail server from Dovecot to another: no luck. No updates from Microsoft seem to fix the issue. Should I just accept it and keep saturating the bandwidth? Absolutely not! Every minute Outlook is open is a ticking time bomb waiting to go off, and using a computer with a more bandwidth-limited connection would prove disastrous.

So what should you do? I went back to Mozilla Thunderbird. Easy interface, good searching, plenty of space for messages.

Masters of Doom and Destiny: Lessons from Making Games

I’ve been reading “Masters of Doom: How Two Guys Created an Empire and Transformed Pop Culture” by David Kushner after a recommendation from Jeff Atwood.
This is the first e-book I read on a tablet (iPad to be specific) and I have to say
I’m hooked, both for the experience of reading on the tablet, and for the book itself.

As an engineer, it was hard for me to accept using a tablet. All my tools naturally run on a laptop or desktop: schematic design, layout, compilers, etc. After all these years I broke down, got a tablet, and now I see what all the fuss was about. Of course, I had worked in some form or fashion in the same group that worked on parts of the original Amazon Kindle and a few other tablets. I knew what Steve Jobs was talking about, but I didn’t realize how great it would be to hold a thin device that you can use to read any book. I was worried about battery life; after designing a few low-power devices with LCDs I was a bit paranoid, but the iPad stayed true and kept going for hours.

Many hours were spent helping Keen save the galaxy

Enough of praising the tablet. I said I was hooked on the book, and honestly I think any entrepreneur should read it. But it doesn’t have anything to do with me, you say? Nonsense. The lessons of the book carry over to any enterprise. First, let’s start by stating that these are the guys who worked on so many of the most popular games of the 90’s.
I fondly remember logging into the BBS and getting Commander Keen, then playing it for hours. It’s amazing now, after all these years, to read about the guys behind it. As a kid, you always imagine that these games were built by a mega corporation with all kinds of resources. I remember the Apogee logo. To me it represented fun, and the guys working on the games clearly had a lot of fun making them.

John Carmack and John Romero are, as the book describes, masters. But they’re not just masters of doom. They’re masters of their craft and business. Why did they succeed where others failed? I chalk it up to several factors which are key
in almost any business.

The first is their ability to create games that pushed the envelope of the technology at the time. They were to games what Apple is to tablets. John Carmack was a master of coding and graphics. He was able to push the limits of every system he worked on, developing techniques that simply didn’t exist before. These techniques allowed the two Johns to provide the realism and gameplay that players were craving. Plenty of companies improve something; here, however, the technical improvements channeled directly into results that gamers could see and feel, and that makes all the difference. Better performance and better graphics made gamers feel like part of the game, so the technical improvements translated directly into a differentiated product.

Both Johns had an obsession with playing and creating games, and had done so for a very long time before starting id Software. This obsession enabled them to understand their target audience. They were gamers and they made the games they wanted to play. They were also programmers and understood what it would take to build them. I don’t think they ever sat down to do market research. They were the market, and when you know the market, you know what customers want. If you deliver it better than anyone, success is likely to follow.

One very important detail in the book is John Carmack’s refusal to patent his technological advances in performance and graphics. Sounds crazy, right? The techniques he developed were very valuable to him and would have been to others. But his hacker ethic didn’t allow him to claim as his own what was built on the work of others. I find this a good slap in the face to patent trolls everywhere and to those who keep patenting ridiculous things. Why? Quite simply because despite not patenting anything, the company succeeded tremendously due to its continuous innovation. They didn’t need the patents. They were obviously first to market with the techniques, but they kept innovating with every game. That was the key. Plenty of companies are granted patents for trivial (and very obvious) things, just trying to keep a fake legal edge on competitors while keeping real innovation at bay. They are missing the bigger picture. Innovate or die (as told in the book Ninja Innovation by Gary Shapiro) is really the only way to do it. But it’s hard and requires continuous improvement. A patent is no panacea for a company sitting on its behind while someone out there is finding a better way to do things. I am not against patents, but recent media stories show just how crazy the system has become.

On the other hand, John Romero was multifaceted, doing graphics as well as the level editors and other tools used to create the games. The ability of the two to naturally divide the work allowed each of them to excel at their own tasks and collaborate when necessary. They agreed on the larger picture and left the details to each other.

As Jeff Atwood mentions, the computer today is so powerful and the resources of information so plentiful, that it isn’t necessary to be like John Carmack and invent everything. But, you still have to innovate, someway, somehow.


Consulting and Latest News

It’s been a while since I’ve posted. Unfortunately, my schedule has become increasingly busy and leaves little time to manage the site.
The MSP430 tutorial continues to be one of the most popular on the web. I am aware that it needs more information and editing. As with everything, there is so much to do. I hope it will continue providing the basics for your MSP430 needs.

In the last year I started my own consulting company, Argenox Technologies. I definitely couldn’t have done it without assistance from others around me, and it’s quite a learning experience. More than anything, I’ve seen the pervasive nature of embedded systems and the growing need for expertise in using them. I dislike shameless plugs, so I will be straightforward. Many will take the tutorial and build from that; that’s exactly what I always wanted. However, others, especially in commercial companies, will need someone with the experience to help them build a product and know that it will work. If that is the case, please contact us at www.argenox.com. We’ll be happy to discuss how we can help you with your embedded systems. We work on a variety of systems, not just the MSP430, and also on wireless: low-power transceivers, Wi-Fi, Bluetooth and GPS.

The Argenox Technologies website will host the latest MSP430 tutorial, which I am working on updating with much-needed information.

Best Wishes,
Gustavo