
The Internet of Things (IoT): everyone talks about it, and no one knows what it is

IoT. Companies are designing it. Companies are acquiring other companies that make it, and it is supposed to become a market worth billions of dollars. But no company has it. What is it, this thing called the Internet of Things?

For a couple of years we’ve been hearing that the Internet of Things (IoT), or Internet of Everything (IoE), is coming and that it is going to revolutionize everything. The funny thing is, I worked on it long before it was even called IoT, and I can tell you not much has changed. These are grand ideas that require a fundamental revolution in our technology, yet they are being built on the back of essentially the same technology we’ve been using for more than a decade.

As a precursor to the IoT, the Department of Defense funded initiatives to develop smart sensor networks that would cost next to nothing and monitor everything. The applications were clearly military in nature: monitoring soldiers and the battlefield. The National Science Foundation also provided funding for more civilian applications such as mine monitoring, structural health monitoring, and medical monitoring. These Wireless Sensor Networks (WSNs) were developed by many universities worldwide. A few startups even began commercializing platforms, and they either died or evolved into developing IP-based products. I myself developed a few wireless sensor nodes. These “motes,” as they came to be called, had a few issues:

1) Power Consumption – A useful platform is one that is operating and sending data, but battery technology has progressed very little compared to semiconductors. This forces these platforms to sleep for long periods of time in order to last long enough. Wireless transmissions are very expensive in terms of power and require complex algorithms to manage (a back-of-the-envelope sketch of the power budget follows this list).

2) Cost – The idea was to deploy millions of low-cost sensors that wouldn’t need to be retrieved. Needless to say, using commercial off-the-shelf components made reasonable systems rather expensive. A few custom ICs that integrated complete WSN functionality were interesting, but never really made it to market.

3) Limited Functionality – The limitations in size and power restricted the applications. Video and audio require significant processing that drains batteries quickly.
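
To make the power problem concrete, here is a minimal back-of-the-envelope sketch in C. Every figure in it is an illustrative assumption, not a measurement from any particular mote; plug in your own datasheet numbers.

    #include <stdio.h>

    int main(void)
    {
        /* All figures below are illustrative assumptions. */
        double active_ma   = 20.0;   /* MCU + radio awake and transmitting */
        double sleep_ma    = 0.005;  /* 5 uA deep-sleep current */
        double active_s    = 0.05;   /* 50 ms awake per wake-up */
        double period_s    = 60.0;   /* wake once per minute */
        double battery_mah = 2500.0; /* roughly a pair of AA cells */

        /* Average current is the time-weighted mix of awake and asleep. */
        double avg_ma = (active_ma * active_s
                         + sleep_ma * (period_s - active_s)) / period_s;

        printf("Average current: %.3f mA -> ~%.0f days on paper\n",
               avg_ma, battery_mah / avg_ma / 24.0);

        /* Wake every second instead of every minute and the active term
           grows ~60x. In practice, battery self-discharge and leakage eat
           into these paper figures too -- hence the aggressive sleeping. */
        return 0;
    }

Run it with different wake periods and the trade-off the motes faced becomes obvious: the radio, not the sensor, sets the battery budget.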

The biggest question, back then and now, is what to monitor. If sensor nodes were free and lasted forever, we would deploy them everywhere. But they are not, so we must strategically fit them where they provide the most value. Today you hear about the Internet of Everything. Do I really need to monitor my chair? Does it really provide me with any value?

The answer to this question is that WSNs and IoT will move at the pace at which they provide value and can be “absorbed” by customers. As an example, WSN technology made it into some data centers, where it could save money by monitoring power consumption and improving power efficiency. That is clearly a good value proposition, where you can show quick savings. Today you see Internet-enabled outlets, smartphone-enabled door locks, and so on.

Another issue back then was the diverging technologies used for wireless connectivity. 2.4 GHz, 900 MHz, Wi-Fi, sonar, and many other low-level physical layers were used, making compatibility impossible. Some of this hasn’t changed.

In many ways, IoT is very much like the gold rushes of old, with the usual results:
1) Most will try and fail to make a dent in the market, only to be acquired or simply disappear.
2) Some will figure out good, sensible applications and solutions that people actually want.
3) Everyone wants a piece of the action and is supposedly moving into position to extract all they can, but few actually understand what it will be.

#2 is critical. When talking with people developing IoT solutions, I often ask what their customers are going to monitor. “Everything” is one of the most common answers I get.
Don’t develop a generic system that does everything. Start attacking problems that IoT technologies can solve. People will pay for that value.

One company that seems to be attacking the issue the right way is Nest Labs, the developer of the Nest Thermostat and Smoke Alarm. Tony Fadell, who worked at Apple designing the iPod and left to start Nest, hates the IoT moniker. He’s right. He also says that companies should focus on solving actual problems. The Nest Thermostat and Smoke Alarm aren’t billed as IoT devices, just devices that improve household items.

 

The Innovation Bandwagon

Everywhere I go, “innovation” seems to be omnipresent. Spend any amount of time on LinkedIn and you will see that almost every company touts how innovative it is in its business sector. I’ve started to get the feeling that “innovation” is the new bandwagon: if you don’t call yourself innovative, then no one should look at you. Bullshit.

First let’s get an idea of the official definition of innovation, as given by the Merriam-Webster dictionary:

in·no·va·tion

noun \ˌi-nə-ˈvā-shən\

: a new idea, device, or method
: the act or process of introducing new ideas, devices, or methods

The strict definition doesn’t give us the complete picture. My experience tells me that innovation is introducing new ideas, devices, or methods that are different and create value. Usually we reserve the word for things that make our lives better and improve old processes. In this sense, Apple may be one of the most easily recognizable companies introducing innovation. The iPhone revolutionized phones primarily through ease of use, tight integration, responsiveness, and a few other factors. Many of the technologies already existed, but Apple made them better. This in turn created value.

But for a company to be innovative, it must constantly introduce new ways of doing things. Innovation is in a sense a spectrum, and many abuse the term by calling whatever they’re doing innovation.

We can’t all be innovative. The reality is that most companies fall on a bell curve or a similar distribution. Some companies fall behind and stick to their old methods for decades, rarely improving or innovating. Most companies introduce some innovation, but very few introduce constant innovation and change the rules. My feeling is that it comes down to getting too comfortable with what you’re doing to risk breaking everything again. After all, most people were probably very happy with their horse and buggy until Ford showed up. The car was innovation; small incremental improvements are minimally innovative.

Innovation feels like the new bandwagon. Years after the Six Sigma craze swept through companies (Six Sigma being the antithesis of innovation: perfecting doing the same thing the same way again and again), companies are jumping on the innovation bandwagon to try to look good.

Innovation doesn’t come from simply saying “we’re innovative.” You can readily tell when something is innovative and when it isn’t. So please, use this term with care, and practice it rather than just preach it. As always, innovation for the sake of innovation is pointless. Simply changing things and introducing new devices without any value doesn’t count as innovation.

The death of the PC – What’s Next?

In I’m the PC: Reports of my death have been greatly exaggerated I talked about the industry declaring the PC dead and what I considered to be the reality of the state of the PC industry. Fast forward to 2014, and it seems the trend I discussed is accelerating. The PC isn’t dead, but the growth is. A few days ago, Sony practically exited the PC market. Even though I have never owned a Sony PC or laptop, I felt sad realizing that things will never go back. The industry has changed, and it’s affecting vendors and OEMs.

Intel realized some time ago that the industry had changed too quickly and halted construction of a new fab, deciding to retrofit existing fabs instead. But just as I predicted, demand for professional workstations seemed to help the situation.

In large part this is unavoidable. The PC industry ran for more than 20 years on the same path. At some point the jig is up, and the iPad was the beginning of the end. I realize this means layoffs, and that is tough. We can only hope that the market can absorb those who have been laid off and use their skills in the next phase.

Many people and companies are now jumping on the Internet of Things and wearable devices wagon. Almost every major technology company is now working on a wearable device, a watch, or a fitness device. I’m not convinced. I do believe that wearable technologies will be important, but they’re not yet selling in any compelling way. For something like that to sell well, a customer has to see the value in it immediately. The customer has to basically say, “If I don’t buy that I’m a sucker, because it solves a ton of problems, it’s cool, and it’s pretty affordable.” If the customer takes too long to decide, you know the product has issues. I’ve yet to see a device get to this level. The problem is that some devices excel in one area but have a small ecosystem. If Apple taught us anything, it’s that you want a large ecosystem that gives a device extended value across a variety of platforms. Google has realized this too, and its services carry across multiple devices.

Finally, I want to discuss Google’s acquisition of Nest, clearly a foray into the Internet of Things. Nest has a recognizable name and a strong brand that people hold in high regard, in no small part due to Tony Fadell. I’ve worked close to the Nest and can tell you it’s a beautiful device, with a lot of thought and effort put into making it great. It’s no wonder Google wanted to get its foot in the door this way. But as news of the acquisition surfaced, many became concerned about their privacy. A thermostat capable of telling when occupants are present can provide an awful lot of information useful for advertising, Google’s main revenue source. I’ve heard that some companies have canceled partnerships because of these concerns. Whether the concerns are justified is hard to tell; it really depends on what Google wants to do with Nest. At the moment Nest remains independent, but Google purchased it to help its revenue and at some point will use it for that purpose.

 

Why doing IoT right is so difficult

If you’ve been following CES 2014 to any degree, you’ve already heard that the Internet of Things (IoT) and wearable devices are the next big thing. Experts claim that by a certain year it will be worth so many billions. I love IoT, I believe in it, and I work on it, but I can tell you that the IoT revolution needs a lot more work. For years people have been working on IoT using Bluetooth, Wi-Fi, ZigBee, 6LoWPAN, and anything else you can imagine. One of the motivations for IPv6 was the sheer number of IP addresses these devices would need, and that was a long time ago. The IoT revolution has been “next year’s revolution” for many, many years.

The reason I say this is that IoT assumes smart devices make our lives easier, when in fact, at the moment, they’re adding more complexity. Removing complexity is, ironically, very hard. IoT devices are technologically rich and by extension very complex. This complexity increases failures, and failures are frustrating to users.

Looking into why IoT is tough to do, I see at least four factors: installation cost, use complexity, increased maintenance, and failure cost. To see how these factors play out, consider the light switch and bulb in your home and the equivalent IoT device that lets you control a light bulb from your phone.

When you move into a house, the installation “cost” of having the light bulb and switch has been borne by a builder who (usually) knows what he’s doing. You never have to play around with the wires to get your bulb to work. IoT devices, on the other hand, are so new that the consumer has to install them. This can involve multiple devices, like the gateway that allows your home network to communicate with the smart device. The sheer number of details and unknowns can cause even the most technology-literate consumer to stay away. Check this light, check that connection, provide the network name and password (oops, I lost it and now I have to spend a few hours). The device is marketed as making our lives easier, yet we have to pay a pretty steep installation “cost.” Clearly we’re off to a bad start.

As for failure cost, have you ever had a light switch fail? In all my years I’ve never heard of it. A light bulb may fail every few years, but the switch itself is 99.999% reliable or better. Replacing the light bulb is a no-brainer and doesn’t require any external information. When an IoT device fails, it’s not always obvious. If it does fail, the replacement cost can be as bad as or even worse than the installation cost discussed above. You might have to reprogram the new device, and sometimes this is worse than a new installation because some parameters need to be checked. Perhaps you named your old part something and need to figure out how to reuse that name. This is made even worse when manufacturers fail to consider the scenario and don’t provide instructions. A standard light bulb costs a few dollars, but a smart one costs much more. And as with all silicon-based devices, your Bluetooth chip might only be qualified for use at a 25% duty cycle over 5-7 years. Many other devices are not designed to operate for extended periods of time and are much more susceptible to heat, moisture, and the elements.

Perhaps the starkest issue is use complexity. Let’s say you managed to install the device, or it was installed for you. Using these devices might not always be as easy as it’s made out to be. To turn the light on or off I have to get my phone, which might be inconveniently placed in my pocket. I then have to unlock the phone with my PIN, find the app among dozens of others, open it, and only then control the device. The time spent doing this is not trivial. Never mind that any number of issues can pop up: no network service, the device too far away for Bluetooth to connect, interference, a low battery on the phone, and so on.

The security aspect is something that is hard to get right. The more technologically complex the device, the harder it is to protect. New features can easily introduce backdoors, allowing a hacker to open or close your doors. The low-power nature of many IoT devices is directly opposed to security and encryption. Encryption requires significant resources, and battery-powered devices cannot always afford to run the encryption and other security schemes that would protect them. What’s more, Denial of Service (DoS) attacks against these devices can be very effective, since the attacker can kill the battery while monitoring capabilities are limited.
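
To see why battery exhaustion is such an effective attack, here is a small sketch in C. The current and capacity figures are assumed, order-of-magnitude values, not numbers from any specific device:

    #include <stdio.h>

    int main(void)
    {
        /* Assumed figures -- substitute your own radio's datasheet values. */
        double battery_mah = 225.0;  /* small coin cell */
        double sleep_ma    = 0.005;  /* node mostly sleeps at ~5 uA */
        double rx_ma       = 15.0;   /* radio held in receive mode */

        /* Quiet network: the node sleeps and the battery lasts for years. */
        printf("Idle life: ~%.1f years\n",
               battery_mah / sleep_ma / 24.0 / 365.0);

        /* Under attack: junk traffic keeps the receiver on, so the node's
           own power budget becomes the attack surface. */
        printf("Life with the receiver pinned on: ~%.0f hours\n",
               battery_mah / rx_ma);
        return 0;
    }

Years of paper battery life collapse into hours, and the attacker never has to break any encryption to take the device down.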

This discussion shows we still have a lot of work to do before IoT gets to where it should be. For people to actually use IoT-enabled devices, the value must be very high. You have to provide something the customer currently can’t do that would really make a difference in their lives. Turning off the lights at home might be important, but as humans we tend to ignore and forget what isn’t in front of us or isn’t critically important. We would like to think we’re good enough to turn off the lights we left on at home, but the cost of electricity isn’t high enough to motivate everyone; this scenario might be better left to automation. Second, you have to have solved all the issues the user will run into. If real estate depends on location, location, location, then IoT depends on testing, testing, and testing. Until IoT is reliable enough, the normal approaches (the humble light switch) can’t be discarded.

Consumer Electronics Show – The Business of Electronics


The Consumer Electronics Show is a trade show held every year in Las Vegas by the Consumer Electronics Association. At its core it’s the place where many of the top brands come to show their latest gizmos, each trying to one-up the competition. The show itself is closed to the general public, so only those in fields related to consumer electronics typically attend. Still, the press is always there covering the latest and greatest, so you won’t miss what’s going on even if you wanted to.

I had the pleasure of going to CES as part of the industry while working at Texas Instruments. Although the media makes the event look like awe and wonder as new products are announced, behind the scenes the days are long for participants, filled with trying to do as much business as possible. The top brands continually vie for the top spot, which is a recipe for outrageous events. When I went in 2011, one of the hard drive manufacturers fired a shotgun at a hard drive. They also applied high voltage, water, and all kinds of things to prove it was the toughest hard drive in the world. It looked impressive, but it was a marketing gimmick nonetheless. The high price tag of a hard drive capable of surviving such abuse makes the product applicable to a very small market.

During my time at CES, demos had to be set up and demonstrated to customers. One of the things you quickly notice at CES is how challenging the environment is for anything wireless. To say that CES is one of the most extreme environments for wireless technologies is an understatement. Imagine thousands of people in the 3.2 million sq ft Las Vegas Convention Center, which can host a maximum of about 200,000 people. The convention center has Wi-Fi hotspots covering most of the floor. Add whatever Wi-Fi and other wireless gear the exhibitors are showing and you get massive congestion. I can’t tell you how difficult it was at times just to get a connection between two devices, let alone achieve the throughput that proved your device was better than the competition’s.

The pressure of showing technology (which tends to fail at the most inopportune moments) can cause many companies to overplay their hand. At CES 2012, Intel was caught showing a pre-rendered video in what was supposed to be a live demonstration of DirectX on the Ivy Bridge CPU’s GPU. I won’t defend the practice of using videos, but the sheer number of variables in play and the desire to show your best make it a very tempting solution. When your speaker is busy addressing a large audience, will he really remember to press this or that button at the exact time? Of course not. He has dozens of things going on. The pressure to do it right will cause errors, and the demo will suffer. A video has far fewer variables, so less is likely to go wrong. In Intel’s case, the demo was apparently added late and might have been difficult to do live on such short notice. Good demos need to be extremely polished, and that is a significant time investment, time nobody has when everyone at CES is trying to be first out of the gate.

The key to getting demos right is, first, to plan. Understand your customers, your strengths, and what you need to do to sell your product and differentiate yourself. You don’t always have to go far. Once you’re there, arrive early and test, test, test in the actual location. Get everything running as soon as you can. You must have a few backups and plenty of whatever supplies your demos need. We typically prepared one demo with two backups just to make sure everything could run smoothly, and we had batteries out the wazoo just to make sure we didn’t run out. You have to be ready to improvise and be flexible. Even if you came in early, another company using Wi-Fi, for example, might cause issues you didn’t have before. That demo that looked cool back at the office suddenly doesn’t look as good on the floor. Worse yet, a competitor is showing something similar and you have to show something that will attract the crowds.

During my time at CES, and having spoken to various companies, I realized the main reason why 3,000+ companies show up in one place and invest significant time and money to establish their presence. For business development and sales people, it’s a no-brainer. With one trip to one location you have thousands of potential customers. Quick meetings can be arranged to discuss business in short order. A customer might not even know you and suddenly realize your solutions are exactly what they need. The demos are ready to be shown, and you can impress customers all at once. Never underestimate how important this is. Many business opportunities fall through when they’re delayed; at an expo like CES there’s little excuse, and meeting people face to face is quite different from getting an e-mail from a stranger. In one day you may have meetings with 5 to 10 potential customers, and even if your success rate is low, the numbers are in your favor.

I don’t think there is another place where you get to see new technologies, products, and applications you didn’t imagine, built by companies you might not have known, as well as what your competitors are doing (although if you are in the business, you should already know).

How to get your board to work before even powering it

As a very lucky embedded systems engineer, I get to design and build many, many PCBs. And when I say many, I mean it. Oftentimes I have 2-3 different designs going at the same time. Some designs are simple: a microcontroller with a few components. Others are complex boards with impedance matching, length matching, and BGAs, worth upwards of $5k. But no matter how much a board is worth, the second a new design comes back from assembly (if I don’t do it myself), it gets special treatment. It feels a bit like a police officer interrogating a suspect: I’m suspicious of the whole board, and until it gives me the truth (and runs as expected), I assume it will lie to me and cause all kinds of issues.


Most of the designs work wonderfully. When you take the time and dive into the details to ensure the board will be good on arrival, you’re stacking the odds in your favor. The board is working before it even leaves for manufacturing (as Sun Tzu said, a war is won before a single battle begins). But in engineering, as in many other fields, things happen. That power supply specification you thought was right isn’t anymore, because the customer asked for changes and there was little time to investigate the ramifications. A manufacturer failed to give you the right spec. The assembly house placed a diode with the reverse orientation. All of these things happen. At best, a few component changes resolve the issues and everyone is happy. Blue wires on a board are not desirable, but you live with them. The worst, however, is when issues damage the board. You’re left to figure out what went wrong, and when you finally do, there’s nothing to be done but change components or build a whole new board (the former brings a whole host of issues). This should never happen, for a simple reason: it is mostly avoidable.

The most common issue that can destroy a board is a power supply problem. The simple reason is that power supplies typically produce large currents, large currents create heat, and heat destroys components and PCBs (PCB traces can act as wonderful fuses). Clamping diodes in ICs can handle a signal that is over-voltage (within a reasonable range), but a power supply will keep trying to pump more and more current until the desired voltage is reached (not likely when there’s a short). That energy has to go somewhere, and it gets dissipated by components that were never designed for it. Given this, there is a simple regimen that every board I receive goes through:

 

  1. All components are visually checked to ensure there’s nothing strange going on. This means checking polarized components for the right orientation and ICs for a correctly aligned pin 1, plus checks for any obvious shorts, missing components, etc. You’d be surprised at the stuff that Automated Optical Inspection (AOI) doesn’t catch.
  2. All power rails are checked for shorts with a multimeter before any power is applied. Note that the term short is very subjective: each multimeter has a different threshold for what it considers a short and may still emit a beep. Anything below 1 ohm is typically a hard short (metal to metal). Never underestimate this check; it is probably the most overlooked but useful test. If all is OK your board might still not work, but it will likely not blow up.
  3. If the design allows, all on-board power supplies are disconnected from the circuits they power. While disconnected from the load, they’re powered in isolation and checked to ensure their voltage is correct. Imagine a mistake in the feedback network of a regulator, so that 5V is supplied to a 3.3V part. It’s happened, and checking the voltage before it is applied to the actual circuit can catch it.
  4. For powering the on-board supplies, use a current-limited bench power supply with the limit set to a reasonable figure. You might hit the constant-current limit, but that just means the rail comes up at a lower voltage, which is unlikely to damage anything (see the sketch below).
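
As a rough illustration of step 4, here is a quick bench calculation in C. The load estimate, margin, and fault voltage are made-up example values, not numbers from any particular design:

    #include <stdio.h>

    int main(void)
    {
        /* Example values only -- use your own design's numbers. */
        double rail_v      = 3.3;   /* nominal rail voltage */
        double expected_ma = 200.0; /* estimated worst-case load current */
        double margin      = 1.5;   /* headroom so normal inrush doesn't trip */

        double limit_ma = expected_ma * margin;

        /* With the limit in place, a hard short makes the supply fold back
           to constant current: the voltage collapses, so the power dumped
           into the fault is bounded instead of running away. */
        double shorted_v = 0.3; /* assumed residual voltage across the fault */
        double fault_w   = (limit_ma / 1000.0) * shorted_v;

        printf("On the %.1f V rail, set the bench supply limit to ~%.0f mA\n",
               rail_v, limit_ma);
        printf("Power into a dead short stays around %.2f W\n", fault_w);

        /* Without the limit, a short draws whatever the supply can source,
           and the PCB traces become the fuse. */
        return 0;
    }

A tenth of a watt into a fault is survivable; an unlimited bench supply dumping tens of watts into a short is how boards die.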

These simple checks ahead of time have saved me a lot of headaches. Once the board is blown there’s typically little time to build new boards, test them, etc. I hope they’ll help you in your next project as well. When you’re confident the basic design is working, you can lighten up on the checks (assuming you trust assembly).

Outlook 2013 is eating your bandwidth for breakfast, lunch and dinner

I recently had to switch hosting to a new service provider. The transition was quite smooth and I was up and running in about an hour. The e-mail server came up nicely, but I noticed something interesting: Outlook 2013 took quite a while to Send/Receive messages. I was used to pretty quick Send/Receive cycles, given my small (<100 kB) e-mails.

Since I was quite happy with everything else from the new hosting company, I was inclined to leave the matter alone. Still, it was a bit annoying, and I did notice that e-mails took a while to arrive compared to my phone. I broke down and contacted the web hosting company, but they could find nothing wrong.

Then a few days ago I checked my bandwidth (it can’t hurt to know how things are going) and was shocked to see over 9GB of IMAP transfer. Considering the largest mailbox is about 500MB, this would mean a complete download of the account 18 times. In one day, over 1.2GB was transferred. This shouldn’t be happening: IMAP is configured to download only new messages, and most of those are a few MB at most.

Outlook 2013 loves eating IMAP bandwidth

A quick check online revealed that Outlook 2013 (and perhaps other versions) has issues handling IMAP. Seriously? IMAP is one of the most common Internet protocols, and Outlook has problems supporting it? I already had enough trouble putting up with the new Outlook 2013 interface, which, to be honest, I don’t like at all; the large icons make reading e-mails more difficult.

The options aren’t good. I tried changing the e-mail server from Dovecot to another; no luck. No updates from Microsoft seem to fix the issue. Should I just take the hit and keep saturating my bandwidth? Absolutely not! Every minute Outlook is open is a ticking time bomb waiting to go off, and using a computer on a more bandwidth-limited connection would prove disastrous.

So what should you do? I went back to Mozilla Thunderbird. Easy interface, good searching, plenty of space for messages.

Masters of Doom and Destiny: Lessons from Making Games

I’ve been reading “Masters of Doom: How Two Guys Created an Empire and Transformed Pop Culture” by David Kushner after a recommendation from Jeff Atwood. This is the first e-book I’ve read on a tablet (an iPad, to be specific), and I have to say I’m hooked, both by the experience of reading on the tablet and by the book itself.

As an engineer, it was hard for me to accept using a tablet. All my tools naturally run on a laptop or desktop: schematic design, layout, compilers, etc. After all these years I broke down, got a tablet, and now I see what all the fuss was about. I had worked, in some form or fashion, in the same group that worked on parts of the original Amazon Kindle and a few other tablets, so I knew what Steve Jobs was talking about, but I didn’t realize how great it would be to hold a thin device you can use to read any book. I was worried about the battery life; after designing a few low-power devices with LCDs I was a bit paranoid, but the iPad stayed true and kept going for hours.

Many hours were spent helping Keen save the galaxy

Enough praising the tablet. I said I was hooked on the book, and honestly I think any entrepreneur should read it. It doesn’t have anything to do with you, you say? Nonsense. The lessons of the book carry over to any enterprise. First, let’s start by stating that these are the guys who built many of the most popular games of the ’90s. I fondly remember logging into the BBS, downloading Commander Keen, and playing it for hours. It’s amazing now, after all these years, to read about the guys behind it. As a kid, you imagine these games were built by a mega-corporation with all kinds of resources. I remember the Apogee logo. I remember it represented fun, and the guys working on the games had a lot of fun making them.

John Carmack and John Romero are, as the book describes, masters. But they’re not just masters of Doom; they’re masters of their craft and business. Why did they succeed where others failed? I chalk it up to several factors that are key in almost any business.

The first is their ability to create games that pushed the envelope of the technology of the time. They were to games what Apple is to tablets. John Carmack was a master of coding and graphics. He was able to push the limits of every system he worked on, developing techniques that were previously nonexistent. These techniques allowed the two Johns to deliver the realism and better gameplay that players were craving. There are plenty of companies that improve something; here, however, the technical improvements channeled directly into results gamers could see and feel, and that makes all the difference. Better performance and better graphics made gamers feel part of the game. Technical improvements channeled directly into differentiating the product.

Both Johns had an obsession with playing and creating games, and had indulged it for a very long time before starting id Software. This obsession enabled them to understand their target audience. They were gamers, and they made the games they wanted to play. They were also programmers and understood what it would take to build them. I don’t think they ever sat down to do market research; they were the market. When you know the market, you know what customers want, and if you deliver it better than anyone, success is likely to follow.

One very important detail in the book is John Carmack’s refusal to patent his technological advances in performance and graphics. Sounds crazy, right? The techniques he developed were very valuable to him and would have been to others. But his hacker ethic didn’t allow him to claim as his own work that was inherited from others. I find this a good slap in the face to patent trolls everywhere and to those who keep patenting ridiculous things. Why? Quite simply because, despite not patenting anything, the company succeeded tremendously due to its continuous innovation. They didn’t need the patents. They were obviously first to market with the techniques, but they kept innovating with every game. That was the key. Plenty of companies are granted patents for trivial (and very obvious) things, just trying to keep a fake legal edge on competitors while keeping real innovation at bay. They are missing the bigger picture. Innovate or die (as told in the book Ninja Innovation by Gary Shapiro) is really the only way to do it. But it’s hard and requires continuous improvement. A patent is no panacea, and no magical charm, for a company sitting on its behind while someone out there is finding a better way to do things. I am not against patents, but recent media stories show just how crazy the system has become.

On the other hand, John Romero was multifaceted, doing graphics as well as the level editors and other tools used to create the games. The two naturally divided the tasks, which allowed each of them to excel at his own work and collaborate when necessary. They agreed on the larger picture and left the details to each other.

As Jeff Atwood mentions, the computer today is so powerful, and the resources of information so plentiful, that it isn’t necessary to be like John Carmack and invent everything. But you still have to innovate, some way, somehow.

 

Consulting and Latest News

It’s been a while since I’ve posted. Unfortunately, my schedule has become increasingly busy and leaves little time to manage the site.
The MSP430 tutorial continues to be one of the most popular on the web. I am aware that it needs more information and editing; as with everything, there is so much to do. I hope it will continue providing the basics for your MSP430 needs.

In the last year I started my own consulting company, Argenox Technologies. I definitely couldn’t have done it without assistance from others around me, and it’s been quite a learning experience. More than anything, I’ve seen the pervasive nature of embedded systems and the growing need for expertise in using them. I dislike shameless plugs, so I will be straightforward. Many will take the tutorial and build from it; that’s exactly what I always wanted. Others, however, especially at commercial companies, will need someone with the experience to help them build a product and know that it will work. If that is the case, please contact us at www.argenox.com. We’ll be happy to discuss how we can help you with your embedded systems. We work on a variety of systems, not just the MSP430, as well as wireless: low-power transceivers, Wi-Fi, Bluetooth, and GPS.

The Argenox Technologies website will host the latest MSP430 tutorial, which I am working on updating with much-needed information.

Best Wishes,
Gustavo

I’m the PC: Reports of my death have been greatly exaggerated

If you’ve been following the fight between PCs and tablets, you’ve read many articles claiming that the PC is dead, that everyone is using tablets, and that you should forget about developing for the PC. I have only one word to say to this: nonsense. Let me explain why I think the industry is full of it.

First, remember that investors usually look for companies with potential for growth. Look at any company: whether it makes 1 million or 1 billion matters less than whether revenues are growing and what the potential for growth is. After all, investors want to end up with more money than they put in, and when a company’s revenues grow significantly quarter after quarter, it’s worth more (and an investor’s share is therefore worth more). More importantly, it draws more investors who invest more money and allows earlier investors to cash out. For any company, there comes a point when the market is saturated; you can still make money, but the growth just isn’t there. One example that comes to mind is Coca-Cola. It is a very profitable company, but let’s face it, anyone who wants a Coke can already get one (outside of smaller, growing markets). This isn’t the kind of thing hungry investors are looking at (no pun intended). Warren Buffett himself had difficulty knowing when to sell Coke stock after it stopped rising, demonstrating that companies can be good investments for a time and then become a thing of the past.

Looking at US Census information and other sources, it’s clear that most US households that want a computer have one. One source I found said that 68% of US households have a PC. Year after year, a new Intel processor comes out that’s faster and better; a new version of Windows, Office, etc. comes along. Some people upgrade, more people buy. But then comes the tablet, creating the same innovation paradigm that has played out countless times before: it enables people who previously didn’t have access to a computer to do many of the same things on a tablet. At the same time, it cannibalizes PCs, as people who aren’t power users and just want to surf the web find a tablet more convenient. This doesn’t signify the death of the PC. It signifies the death of the PC and its software as a hot investment area (which, as I keep saying, is all the industry is looking for and all most want to report about).

Looking at many new software companies, making apps for iPhone/iPad/Android is the cool thing to do. This has an impact on investment in the development of products targeted at the PC. But the PC is still the workhorse for most people, especially professionals. In many cases, the tablet is inconvenient to work with. As an engineer, none of my engineering software runs on tablets. CAD design and similar work necessitates a mouse. Graphic designers, gamers, and video editors are just a few categories that might find tablets interesting for a few applications but could not afford to switch completely. So, unless someone figures out how to get a tablet to do all that nicely, switching just isn’t happening.

The tablet app industry reminds me of the late 1990s and early 2000s, when everyone was trying to figure out how to make money off anything Internet-related. Many companies spent large amounts of money acquiring customers with free products and services, hoping to convert them into paying customers. Some succeeded; most failed. A similar thing is happening with apps for tablets. Many apps are free or have a paid version, with some apps copying others. Actually getting people to pay for apps is hard, especially when someone comes by and copies your app because you failed to find a sustainable competitive advantage. Unless you’re Rovio with Angry Birds, or you’re in the top apps list, you’re probably not rolling in cash.

With Microsoft now entering with its Surface tablet, I am betting some Apple developers will switch to developing for Windows tablets as if it were the 1849 gold rush, simply because it will seem like greener pastures and because, early on, it’s easier to differentiate yourself when the number of apps is relatively small. Microsoft is also apparently offering incentives (which is what it did in the early Windows days). Of course, this would require that the Surface tablet gains a measurable foothold with customers.

It’s easy to get caught up in the hype. Looking around, tablets and ARM are everywhere, and everyone who can is building either application processors or software apps. I believe in a few years the industry will mature and shed its useless and hyped aspects (anyone remember the “I Am Rich” app for the iPhone?). Many companies developing application processors will be gone simply because they jumped on the same ship as everyone else, only to realize that margins are razor thin, R&D is very expensive, and differentiation is difficult, aside from the fact that the major player, Apple, makes its own. If you look around, PCs will still be there and software will still be developed for them. It will be done by fewer people, but it will also be more specialized and focused on core areas such as math and engineering. All this will be painful, but it will set the stage for the next ideas to come around and attract the attention of investors.