Groundbreaking Battery Announcement? 400Wh/kg

I woke up this morning to news that the battery company Envia is claiming to have breached the 400Wh/kg battery barrier. While the average person has no idea what that means, let's just say that's 3-4x better than batteries available today in early 2012. These batteries won't be available until 2014, with smaller improvements between now and then (200-250 Wh/kg batteries available in late 2012-2013).

Extraordinary claims require extraordinary proof. 

Luckily, Envia seems to understand this. They've had the cells tested at private labs, as well as by the US Naval Surface Warfare Center. From the report's conclusion:

The test results from the prototype cells tested at [NSWC] Crane were in line with the results obtained from the manufacturer. The claims of 400 Wh/Kg were substantiated through the cycling tests performed at Crane. This is a 160% energy density increase over the industry standard indicated in paragraph 5.1 [Panasonic’s 2011 year 18650-cell battery, 3.6Ah 245Wh/kg]

The cells are built around a cathode licensed from Argonne National Laboratory; the cathode was first licensed in 2007. The anode uses the new silicon nanotechnology showing up in other batteries (the late 2012 Panasonic and Sony cells are supposed to use Si anodes for a significant boost in capacity).

The second big announcement is price – they tout a $180/kWh price tag for these cells. That is approximately 1/4 the price of batteries in 2010, 1/2 the price of batteries today, and roughly the price expected for batteries in 2015 (around when these cells are set to hit the market). Recall the recent stories about Tesla replacement batteries costing $40,000 for a 53kWh pack (full pack price); at this price the cells would cost only about $10,000 (batteries only). Compared to the Tesla Model S incremental battery prices ($10,000 for 20kWh more), this roughly halves the cost – instead of $10,000 more for 20kWh, it could eventually be $5,000 more.
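
To put the pricing in perspective, here's a quick back-of-the-envelope sketch (the $180/kWh figure and pack sizes are the ones mentioned above; this is cell cost only, with no pack assembly, cooling or electronics overhead, and the helper function is just for illustration):

```python
# Rough cell-cost arithmetic for the claimed $180/kWh price point.
# Cell cost only -- pack assembly, cooling and electronics are extra.
PRICE_PER_KWH = 180  # Envia's touted cell price, $/kWh

def cell_cost(pack_kwh, price_per_kwh=PRICE_PER_KWH):
    """Raw cell cost in dollars for a pack of the given capacity."""
    return pack_kwh * price_per_kwh

print(cell_cost(53))  # the 53 kWh Tesla pack above: ~$9,540 in cells
print(cell_cost(20))  # a 20 kWh Model S battery step-up: ~$3,600 in cells
```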

Battery Math

These batteries are very well suited to full EVs. A 300-mile pack for a Nissan Leaf style vehicle would be about the same size and weight as the current pack and cost about the same as the pack did at launch in 2011. The five key characteristics – capacity, power, weight, volume and cycle life – all land in EV territory: 85kWh, 250kW, roughly 210kg, similar to the current pack volume, and about 275,000 miles before the pack falls to 80% of original capacity. That works out to about 275-300 miles per charge. A cut-down 150-mile pack would last around 150,000 miles, with half the weight and half the power output (but still enough for an EV).

For plug-ins, these batteries would probably need to go into up-sized packs based on power needs (the pack has to produce enough power to push the vehicle up steep mountain grades and pass on the highway). So a Volt pack might go from 16 to 20kWh. The bonus would be more electric range (from 35 miles to 40-45) and longer battery life, and the pack would still be smaller, lighter and cheaper (welcome back, 5th seat!) at roughly half the size, 50kg, and only around $3,500 instead of $8,000 or so.
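
As a sanity check on the pack figures in the last two paragraphs, here's a minimal sketch of the cell-level weight and cost arithmetic at 400 Wh/kg and $180/kWh (real packs add enclosure, cooling and electronics mass and cost on top of this):

```python
# Cell-level mass and cost for the hypothetical packs discussed above,
# assuming the claimed 400 Wh/kg and $180/kWh. Pack overhead not included.
ENERGY_DENSITY_WH_PER_KG = 400
PRICE_PER_KWH = 180

def pack_estimate(pack_kwh):
    """Return (mass in kg, cell cost in dollars) for a given pack size."""
    mass_kg = pack_kwh * 1000 / ENERGY_DENSITY_WH_PER_KG
    cost_usd = pack_kwh * PRICE_PER_KWH
    return mass_kg, cost_usd

print(pack_estimate(85))  # 300-mile EV pack: ~212 kg of cells, ~$15,300
print(pack_estimate(20))  # up-sized Volt-style pack: 50 kg of cells, ~$3,600
```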

These cells aren't really suitable for hybrids or small plug-ins (Prius Plug-in), not without adjusting the manufacturing to change the battery characteristics from high energy density to high power (you'd sacrifice energy storage for power per kg).

HARP 2.0 – Part 1: Getting your ducks in a row

I'm in a severely underwater mortgage (175% LTV), and when Obama announced HARP 2.0 in late 2011, it looked like a way to shave some money off my mortgage payment (and the administration hopes I put that money back into the economy in the form of consumer spending). This and future blog posts under the HARP title will chronicle my progress through the process and hopefully help others who want to do the same thing. It's a departure from the tech stuff I normally talk about, but there is a dearth of information out there about going through HARP 2.0.

The HARP program is long and complex, and every bank and loan servicer I've talked to (Wells, Chase, Quicken Loans) seems to follow its own set of rules on top of the rules the government sets out (the only thing they all agree on is anything constrained by the Fannie Mae/Freddie Mac software they must use to underwrite the loan).

So the first step is to find out if your home loan is backed by Fannie Mae or Freddie Mac. Each one has a lookup tool on its website (Fannie, Freddie) to help you figure this out. If your loan is backed by either one and you haven't missed any payments in the last 12 months, then you are eligible for HARP 2.0 (with a few other conditions). If it's not backed by either one, then you might be eligible for the proposed HARP 3.0 or the upcoming settlement with the banks.

From this point, you can start the refinance process.

Your Steps

Start with your current mortgage servicer. Contact them about the HARP program and go from there. Most institutions aren't accepting HARP-based refinances originated at other banks as of the date this was posted. In my case, my mortgage servicer is not a lending institution, but they referred me to one that picked me up and started the refi process straight away. If your loan is under 125% LTV you should be able to proceed now; if it's at or above that (like mine), you can start getting your ducks in a row now, but nothing can officially start until March 15.

After March 15 it will be possible to "shop around" for HARP refinances, but it is still at the discretion of the lending institution whether to accept loans serviced by other companies (HARP is itself voluntary, but refinancing to a lower monthly payment increases the likelihood people will stay in their home, so it's in the bank's interest to do it).

March 15, 2012

This date pops up a lot in the HARP program. It's the day Fannie and Freddie roll out the new version of their Desktop Underwriter (DU) software. The update will let institutions evaluate loans originated at other institutions and let homes with greater than 125% LTV refinance. It's the day that refi applications like mine can officially be put into the system, the process started, and interest rates locked.

HARP 2.0 Flexibility

HARP 2.0 allows for a bit more flexibility than HARP 1.0 did. The two big features are the expansion of allowed LTV ratios (up to 300%) and the ability to shorten the term of the loan (you can go from 30 years to 20 or 15 years). An additional feature is not needing an appraisal of the house; instead, a software algorithm estimates your home value based on comparable sales and foreclosures.

Paperwork

Things I needed to start the refi process (this is not a comprehensive list of what you might need, especially if you're self-employed, and it may vary from lender to lender):

  • W2/1099 forms
  • Recent bank statements (within the last 3 months)
  • Recent mortgage statement (within the last 3 months)
  • Pay stub (current)
  • Home insurance statement (current)

At this point I have all my paperwork ready except for the pay stub (it has to be the most recent one, so I have to wait). Getting everything else ready now means the loan can close quicker once things kick off.

Crazy Idea: Apple should buy Clearwire, build iDevice LTE network

On a recent episode of Critical Path, it was noted that the fastest growing slice of the earnings pie for carriers around the world is data. Voice and SMS revenues are slumping as users turn to data networks for more and more of their communication. Apps like Apple's iMessage and RIM's BBM move text traffic off SMS and onto data networks. Phone calls will increasingly be replaced with FaceTime calls once cellular networks are up to the task of carrying video traffic (with the exception of calling while driving).

If we look at Apple's iPhone (and most cellular phones in general), the most disappointing facet of the device is often the carrier, specifically data service, followed closely by battery life (that's another article entirely). So what can Apple do to drive additional revenue and give its devices – tablets and phones, plus anything else they may think of in the future – a leg up on the competition? It would need an end-run around the current cellular carriers. And that means owning and operating a cellular network.

This is initially difficult to do on a worldwide scale because of licensing issues. Each country has its own spectrum authority (the FCC here in the USA), and the same slice of spectrum can be allocated for different uses around the world, with the main exception of the unlicensed ISM bands (2.4GHz and 5GHz for WiFi). Steve Jobs reportedly wanted to build Apple's own network using these unlicensed ISM bands, but it was easy to see that it wouldn't be technically feasible.

Clear?

In the United States the obvious choice would be to acquire Clearwire's spectrum and assets. Its market cap is incredibly low (less than $2B), it doesn't need a whole lot of cash to fix up ($900M in the next few years to build and operate a new LTE network), and it is in desperate need of cash to pay its debt obligations, even choosing to skip a debt payment recently. That's cheap considering how much spectrum they're holding in major cities across the USA – 192MHz in many cities, 125MHz in NYC, and as low as 75MHz in Detroit. The difficulty is that it's majority-owned by Sprint; however, Sprint is in need of cash too, and I expect it will have to be acquired by Verizon within the next five years if they don't get their act together. Sprint seems less interested in Clearwire lately, especially since they announced they're going it alone with LTE (using their own spectrum and LightSquared spectrum instead of Clearwire's). The downside to using Clearwire's spectrum is that it sits in the 2490-2690MHz band, which doesn't have the best propagation characteristics (e.g. going through walls, into basements, etc.). Apple would need to use its extensive antenna engineering knowledge to build a device that still gets fantastic reception even with poor signal strength.

The phone would still need (and should use) the voice networks from the existing carriers; there is no need to build that infrastructure again. Apple would roll out a TDD-LTE-Advanced (rel. 10) network on Clearwire's 2.5-2.6GHz spectrum in 2013 and provide tremendous speeds to end users – better than any of the current carriers could offer. While today's LTE offers around 10Mb/s down, Clearwire's enormous spectrum holdings would allow speeds of 50Mb/s on a regular basis, and peak speeds well above that. Putting that spectrum to use in a 50MHz TD-LTE-Advanced configuration provides for over 250Mb/s raw throughput (downlink, 2×2 MIMO) with user speeds around 20-50Mb/s and upload speeds around 10-15Mb/s.
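
A rough sketch of where those throughput numbers come from: raw rate is roughly bandwidth times spectral efficiency. The ~5 bit/s/Hz peak figure for 2×2 MIMO LTE is my assumption for illustration, not a spec quote, and per-user speeds are only a fraction of the raw channel rate.

```python
# Very rough LTE downlink estimate: bandwidth (MHz) * spectral efficiency (bit/s/Hz).
# The 5 bit/s/Hz peak figure for 2x2 MIMO is an assumed round number for illustration;
# real networks deliver only a fraction of this to any single user.
def raw_downlink_mbps(bandwidth_mhz, bits_per_hz=5.0):
    return bandwidth_mhz * bits_per_hz  # MHz * bit/s/Hz = Mb/s

print(raw_downlink_mbps(50))  # 50 MHz TD-LTE-Advanced carrier: ~250 Mb/s raw
print(raw_downlink_mbps(20))  # a single 20 MHz channel: ~100 Mb/s raw
```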

How would the carriers react? A mixed bag – they've invested money in building up a network to handle tons of data, and while they might welcome Apple taking a load off their network (their CapEx would slow down dramatically, at least for a few quarters after rollout), they aren't going to be happy with Apple taking revenue away – presumably everyone would switch to no data plan, or a minimal plan kept only for roaming outside of Apple's initially incomplete network. Apple already took a bite out of their revenue pie by introducing iMessage, reducing carrier revenue from text messages, though that is an order of magnitude smaller than the equivalent data revenues.

Killing Cable?

It also offers a hand in creating their own mini-cable system. With an abundance of spectrum, a separate 20MHz channel could be used just for broadcasting their own live TV over multicast – a 20MHz channel (2×2 MIMO) with an 87:10 down/up ratio would have about 120Mb/s down, enough for ten 12Mb/s 1080p feeds, while the 8Mb/s upstream channel would be used for device authentication and updates only. In true Apple/Pixar fashion, they'd only show a few choice channels with high-quality content. During the low-traffic periods of the day (would Apple sell infomercials? I don't think so…) they could turn off a few channels and stream prime content to devices to be "unlocked" as prime-time TV shows. If they needed to increase throughput, they'd move to 4×4 MIMO and change the ratio to 90:7 for 255Mb/s down (21 1080p channels) and a small control channel up.
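
The channel-count arithmetic behind that paragraph, as a minimal sketch (the 120 Mb/s and 255 Mb/s downlink figures and the 12 Mb/s-per-1080p-feed assumption are the ones used above):

```python
# How many 12 Mb/s 1080p multicast feeds fit into a dedicated TD-LTE TV channel.
FEED_MBPS = 12  # assumed bitrate per 1080p stream

def tv_feeds(total_downlink_mbps, feed_mbps=FEED_MBPS):
    """Whole number of simultaneous feeds that fit in the downlink."""
    return total_downlink_mbps // feed_mbps

print(tv_feeds(120))  # 20 MHz, 2x2 MIMO, 87:10 split (~120 Mb/s): 10 feeds
print(tv_feeds(255))  # 4x4 MIMO, 90:7 split (~255 Mb/s): 21 feeds
```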

Technical Difficulties

Apple would need to build dual-SIM devices: a carrier SIM for voice and SMS, and an Apple SIM for data. However, Apple was rumored to be building a SIM replacement, which would allow for a single physical SIM card with Apple's SIM implemented in software.

Building a network is no easy task, and considering that Clearwire is moving to a co-located configuration with Sprint (the same tower would host both Sprint's and Clearwire's transmitters), any buyout might negate that cost-sharing benefit.

But overcoming one of the last poor aspects of the smartphone experience would be a huge deal, and give Apple a leg up on both other cellphone vendors and their carrier partners, at least here in the US.

Battery Magnitude

Recently, there was news of a new formulation of lithium-ion battery using a Si-Graphene anode that would provide 10 times the charge and 10 times the power of current lithium-ion batteries in the same size (and, I'm assuming, the same weight). Now the standard three-to-five-year disclaimer applies – it has to be brought out of the lab, commercialized, and mass-produced without losing any of its stand-out characteristics.

But it didn't immediately seem apparent to most observers that it would be useful because of rapid capacity fade – the battery would last only 150 cycles before it had just 50% of its original capacity, although that is still five times current battery capacity. Traditional lithium-ion batteries last 300 cycles to 80%, lithium-polymer (the battery in your iPhone and Mac laptop) lasts about 1,000 cycles to 80%, and lithium-titanate batteries can last 5,000+ cycles to 80%.

The difficulty with the Si-Graphene battery is managing the user experience. If a user were to go through their entire battery in a day, every day, in 5 months they'd have only half the capacity. So device makers would have to oversize the batteries but artificially clamp the usable energy to keep heavy users from destroying the battery in a short time frame.

Consumer Electronics? Sure…

Putting this battery into a smartphone to replace the current lithium-polymer battery would let average users go 10 days between charges. However, every 250 days (25-30 charges) the user would notice they've lost a day of use before it's time to recharge, going from 10 days to 9. Will users be upset that they lost a day? Will they even notice? Or will they beat down the door of the store where they bought it, demanding an exchange on a perfectly good product? How can we avoid this? By artificially limiting battery capacity.

If we were to artificially limit the battery capacity to the value an 80th-percentile user would have left after two years of usage, we can save users the trouble of noticing their battery doesn't last as long as it once did. I'll assume this number is 75% of capacity; that is, the 80th-percentile user will go through enough battery capacity in two years to cause a 30% capacity fade (this also factors in an increase in usage due to the bigger battery). So the phone would be set up so the average user can go 7 days before hitting the 25% warning and recharging. Using 4Wh/day, the average user will go through 36.5 full cycles per year, which represents about a 12% capacity fade per year, or 24% after two years. So after two and a half years the average user will start to experience the battery holding less energy, and will probably notice it around the end of year three – about which time they will need to buy a new phone anyway, as this one will be long out of warranty. An 80th-percentile user will probably start to experience capacity fade earlier, around 18-24 months. A 95th-percentile user is likely to do crazy stuff with their phone, like stream audio all day, go through one full cycle per day, and run into capacity fade in 6 months. This last case could actually be handled in software – if the phone notices it's being used heavily, it could ask the user to plug it in while engaging in the heavy activity, or just nerf capacity in software in the name of getting out of the warranty period without having to replace the battery.
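
Here is a toy model of the fade math in that paragraph. The linear-fade assumption (50% of capacity lost over 150 full cycles, roughly 0.33% per cycle) is a simplification for illustration; real fade curves aren't perfectly linear.

```python
# Toy linear fade model for the hypothetical Si-Graphene cell:
# 50% of original capacity lost over 150 full cycles (~0.33% per cycle).
FADE_PER_CYCLE = 0.50 / 150

def remaining_capacity(full_cycles):
    """Fraction of original capacity left after some number of full cycles."""
    return max(0.0, 1.0 - FADE_PER_CYCLE * full_cycles)

def cycles_per_year(wh_per_day, usable_wh):
    """Full-cycle equivalents per year for a given daily draw and usable capacity."""
    return wh_per_day * 365 / usable_wh

yearly = cycles_per_year(4, 40)        # average user: 4 Wh/day against a ~40 Wh clamp
print(yearly)                          # ~36.5 cycles/year (~12% fade per year)
print(remaining_capacity(yearly * 2))  # ~76% of original capacity left after two years
```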

Below is a chart of a traditional Li-Polymer battery (5.3Wh) and a new Si-Graphene battery (53Wh). You can see that while the new battery has much larger capacity, it also fades much more quickly. If you were to artificially limit the Si-Graphene battery to 40Wh of usable capacity, the battery would get to two and a half years of average use before the user experienced any capacity fade.

The downside to this approach compared with traditional batteries is that users might increase their phone use and suck down more battery power per day knowing they have a lot more energy available to them, which is all the more reason to artificially limit capacity in the name of keeping the device useful for 2-3 years.

The same conclusions from the smartphone scenario above would also apply to tablets. Laptop computers would probably see more aggressive artificial capacity restrictions, since users usually run out of battery on a laptop before they are done with whatever they were doing (like doing internet research and writing a blog post about batteries ಠ_ಠ), so the issues of using more energy per day and higher annual cycle counts would apply.

Electric Vehicles? Not so fast…

If the approach of limiting battery capacity in the name of extending its life sounds familiar, it should, as it is how the battery in the Chevy Volt is managed. So what would happen if you applied this to the battery in the Volt? Not much difference, and probably an increase in cost.

If you recharge the Chevy Volt once a day, 365 days per year, that's equivalent to about 237 full battery cycles per year (10.4kWh used for 35 miles, 16kWh capacity), and the battery type they use is expected to have a life of 1,500 cycles (without any depth-of-discharge reduction bonuses). But if you were to drop in a replacement battery with this new technology (assuming the same size, weight, etc.), you'd have a 160kWh battery. That doesn't mean you drive 10 times as far; rather, you use an increasingly small portion of the battery – an initial depth of discharge of 6.5% and a rate of about 25 full cycles per year. By the end of year 10, or 250 cycles, the battery would have degraded enough that it would start to run into problems storing and producing enough energy (assuming the cells can last that long from a calendar standpoint). This isn't a significant change from the current battery regimen, where the battery is warrantied for 8 years or 100,000 miles (12,500 miles per year). The only real benefit to using the Si-Graphene batteries would be the increased power output – the Volt's 9-second 0-60mph time could improve dramatically, along with recharging times.
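
The cycle arithmetic for the Volt scenario, as a quick sketch (the 10.4 kWh per charge, 16 kWh and hypothetical 160 kWh pack sizes, and one charge per day are the figures from the paragraph above):

```python
# Full-cycle equivalents per year for a plug-in charged once a day.
def full_cycles_per_year(kwh_used_per_charge, pack_kwh, charges_per_year=365):
    return charges_per_year * kwh_used_per_charge / pack_kwh

# Today's Volt: 10.4 kWh used per charge out of a 16 kWh pack.
print(full_cycles_per_year(10.4, 16))   # ~237 cycles/year
# Same daily draw against a hypothetical 160 kWh Si-Graphene drop-in:
print(full_cycles_per_year(10.4, 160))  # ~24 cycles/year at ~6.5% depth of discharge
```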

What would help

The problem with this is that batteries are predominantly priced in $/kWh, which makes the above scenarios look prohibitive. The fundamental question is whether it would be more appropriate to charge by mass and volume. Does it cost 10 times as much to make these batteries as traditional ones? I don't think it will. They might be able to charge more than the highest-end batteries, but the $/kWh would need to be discounted compared to other battery types with higher cycle lives. The better figure to use for battery prices is $/lifetime kWh – the total amount of energy a given battery will deliver before it's no longer usable for the specific application (e.g. smartphone, EV, etc.). A battery might cost $700 for 1,500 lifetime kWh, and for certain applications it might not matter whether that's 1kWh of storage for 1,500 cycles or 10kWh for 150 cycles – assuming other factors are held constant (volume, weight, safety). In fact, the latter configuration helps in applications where power demand is high (e.g. a car).
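
A minimal sketch of the $/lifetime-kWh idea, using the made-up $700 figure from above purely for illustration:

```python
# Compare batteries by cost per lifetime kWh delivered, rather than per kWh of capacity.
def cost_per_lifetime_kwh(price_usd, capacity_kwh, usable_cycles):
    """Price divided by total energy delivered over the battery's usable life."""
    return price_usd / (capacity_kwh * usable_cycles)

# Two hypothetical packs with the same lifetime energy throughput:
print(cost_per_lifetime_kwh(700, 1, 1500))   # 1 kWh pack good for 1,500 cycles
print(cost_per_lifetime_kwh(700, 10, 150))   # 10 kWh pack good for 150 cycles
# Both work out to ~$0.47 per lifetime kWh, which is the point of the metric.
```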

So the most basic thing that would help these batteries is an increase in cycle life. Even a relatively small increase would dramatically improve their usefulness and the impact they could have.

If I had $500 to spend on content per year…

I thought about this the other day, about how most websites have garbage ads on the sides, most of which are complete fucking scams. One of the websites I read, I subscribe to for $50/yr. So if I had ten websites' worth, $500 a year, to spend on content subscriptions, where would I spend it?

  1. Ars Technica (which I already subscribe to for $50/yr)
  2. Las Vegas Sun
  3. Green Car Congress
  4. AnandTech
  5. The Verge
  6. BoingBoing
  7. Five Thirty Eight (a blog on the NY Times website, I’d subscribe to the entire site just for this and the Paul Krugman columns)
  8. Fareed Zakaria GPS (a blog on CNN’s website)
  9. Fierce Wireless/Fierce Broadband Wireless
  10. Reddit (not a news site, but still I’d pay)

Movie Studios’ new digital movie experience leaves something to be desired

Ultraviolet is the movie industry's answer to piracy and illegal copying of movies. All the studios except Disney have banded together to let the movie-buying public digitally access the movies they buy physically in stores. The goal is to end the need for ripping DVDs and Blu-ray discs, and to not require users to buy two copies of a movie (one physical disc and one digital copy via iTunes) if they want to legally watch it on both their TV and their iOS devices.

The process starts with buying a supported DVD or Blu-ray disc from a retailer or online. I purchased Horrible Bosses for $22.99 at Best Buy for my test. Inside the case is a 12-digit code to redeem the digital copy of the movie. Because Ultraviolet's streaming partner (I'll get to this later) is responsible for your code, you need to sign up with them in addition to signing up with Ultraviolet, if you don't already have an account. In my case, the partner is Flixster, though it's probably up to each studio to manage its own partners.

Once you’ve signed up and entered the redemption code, you can watch the movie digitally without having to rip or pirate a copy of it.

The most convenient and flexible way to watch your movie is to stream it to a phone or tablet. Flixster manages the streaming for Warner Bros. for Ultraviolet. This means you'll need to use the Flixster account created when you redeemed your movie along with your Ultraviolet login; from there you can stream the movie. Flixster offers apps on iOS and Android to watch your movie streamed from their servers.

As you might expect with the movie industry there are some significant strings attached.

Streaming and downloading are only included for one year; after that it'll cost you. Ultraviolet's FAQ page indicates that for streaming and downloading…

UltraViolet rights include streaming from the selling UltraViolet retailer, at no extra charge above the original purchase price, for at least one year after purchase…

You get at least one year of streaming and downloading for free. After that, who knows! The plus side (for Flixster in this case) is that streaming and downloading are cheap – you can store data on Amazon and stream it for about 10¢/GB. If you watch a movie (about 2GB) six times in a year, that's only about $1.20 in bandwidth you're costing Flixster. Presumably this amount is priced into the cost of the disc when you bought it.
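
The bandwidth-cost estimate, sketched out (the 10¢/GB transfer price and ~2 GB per stream are the rough assumptions from the paragraph above):

```python
# Rough cost to the streaming partner of serving a purchased movie a few times.
COST_PER_GB = 0.10   # assumed commodity storage/transfer price, $/GB
MOVIE_SIZE_GB = 2.0  # assumed size of one streamed copy of the movie

def yearly_streaming_cost(views_per_year):
    """Approximate bandwidth cost of streaming the movie this many times."""
    return views_per_year * MOVIE_SIZE_GB * COST_PER_GB

print(yearly_streaming_cost(6))  # six viewings in a year: about $1.20 in bandwidth
```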

You can't download a copy that will sync to the iPad/iPhone. While you can stream over WiFi (no 3G streaming), you can't download or cache the movie on your device for watching while disconnected from the internet (e.g. on an airline flight; no, in-flight internet isn't fast enough, and they will throttle you). You cannot download the movie unencrypted, and you can't download it in a FairPlay (Apple's copy protection system) encrypted container, though this is Apple's doing because they don't license FairPlay to third parties (probably due to worries it might be cracked like DVD CSS was when XingMPEG's player was cracked back in the late 90s).

Update, November 2011: You can download the movie from within the Flixster app to the device to watch on the go. Horrible Bosses was about 600MB (the SD version, not HD or even 540p). There is a huge usability issue, however – you have to wait inside the Flixster app while the movie is downloading. You can't download it in the background, and you can't sleep your iPad (the download will stop within 10 minutes of quitting the app or sleeping the iPad).

Downloading a copy to your computer requires Adobe AIR and a special application to watch it. You can download the movie to your desktop or laptop to watch later, but this requires an additional piece of software. For Horrible Bosses at least, the download is only SD quality, even if you buy the Blu-ray disc. And as of the writing of this review, the download itself won't even be available until December 20. The downloaded versions are copy protected and won't work on iOS portable devices.

Other than Android and iOS, no other connected devices like TVs are currently supported. While it is early in Ultraviolet's launch, and their website does promise other connected devices will be added soon, you can't hook up your TV or other connected set-top box to Ultraviolet to watch the movies you have stored.

While the goal is worthwhile and laudable, the inability to store a movie on a device is problematic – for many people, the only time they watch movies on an iPhone or iPad is on a plane flight or a long drive (neither of which is suitable for streaming). On the other hand, the ability to watch a movie over and over could be great for kids – who will probably have a new favorite movie in 3 months anyway. Finally, calling it a "rights locker" is a pretty wonky term. What was wrong with "Movie Vault"?

Pros:

  • Digital streaming of the movies to iOS and Android devices
  • Ability to have up to six people on one “Household” account
  • Parental controls to make sure kids don’t see R-rated movies
  • Download copies to laptops and desktops

Cons:

  • No way to cache movies on iOS devices when not connected to WiFi (plane flight, drive, etc)
  • No other devices supported yet

Things I don't know yet that I would like to know:

  • What will the experience be like when the movie's downloading and streaming rights expire in a year? Do I get a reminder to download the movie one last time?
  • Whether the Flixster iOS app will support AppleTV – can I AirPlay or AirPlay-mirror the movie from my iPad to the TV? If so, how does it look blown up to 50+ inches?

Prius Plug-In price announced at $32,000 before rebate

Toyota announced today that the Prius Plug-in, set to debut in March 2012, will be priced at $32,000 before an approximately $2,500 tax rebate. An upmarket version is available at $39,500.

The total difference in price between the baseline Prius Plug-in and a Chevy Volt is about $2,100 after the full tax credits…

Chevy Volt 2012 – MSRP: $39,145 – 7500 = $31,645
Toyota Prius Plug-in 2012 – MSRP: $32,000 – 2500 = $29,500
Difference:  $2,145 (does not include any destination charges or other dealer add-on fees)

After realizing this, my initial leaning towards the Prius is starting to shift back towards the Volt, because there is only about a $2,100 difference in price. The difference goes up to roughly $3,100 if you count the gap between the $1,000 base 240V 3.3kW charger for the Prius and the $2,000 240V 3.3kW Volt charger. The choice comes down to several factors.

  1. Prius top electric speed. The Prius Plug-in has a top all-electric speed of 62MPH, while the Volt is always electric whenever there is capacity in the battery. For me, this means my 70MPH highway commute to and from work and to and from most of my friends' houses has me burning gas in the Prius because of the top speed, while in the Volt I'd always be on electricity as long as I have charge. Because of my large amount of highway driving, the Volt holds an advantage here.
  2. Electric range is 15 miles. With the Prius, the electric range is 15 miles, and probably lower in cold and hot weather (e.g. most of Vegas weather). My commute is 15-16 miles, most of it on the highway. So I'm inclined to think I'd have to charge at work in addition to home, which means working with the building manager to get a charging station.
  3. Weird charging port location. This isn't a deal breaker, but it does get under my skin a little – everyone else is putting the charging port up front near the driver's door. The Prius puts its port at the back right. So I'd have to walk around to the back of my car to plug it in twice a day. Plus, in commercial shopping centers the charging stations are usually at the front of the car. Do cars always get backed into parking spaces in Japan?
When it comes to purchase or lease, I'm again leaning towards a lease, though it really depends on the details. Some individuals are reporting lease quotes of $500/mo or more, which isn't such a great deal, while GM's stated lease rate is $399/mo for 12,000 mi/yr.

Building an all-in-one home server?

I've been thinking about building an all-in-one server lately. The goal would be to consolidate the two servers I currently have (a Win2k3 outward-facing server and a Windows Home Server inside) into one physical box using virtualization.

The difficulty is trying to make up for Windows Home Server 2011's missing features (that were in the original WHS). So I've been reading, and I've been thinking about building a box running ESXi 4.1 with an OpenIndiana VM and a SATA controller card passed through with VT-d, hosting up to 8 HDDs that would serve as the storage backing the WHS 2011 drive. The HDDs would be set up using ZFS RAID-Z2 (2 parity drives); with 3TB per drive, just 4 drives would yield 6TB of usable storage. It would be exposed to WHS 2011 through iSCSI in ESXi.
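
As a quick check on the usable-space math: RAID-Z2 gives up two drives' worth of capacity to parity. The sketch below ignores ZFS metadata overhead and TB-vs-TiB differences.

```python
# Usable capacity of a ZFS RAID-Z2 pool: total drives minus two parity drives.
def raidz2_usable_tb(num_drives, drive_tb):
    """Approximate usable space; two drives' worth of capacity go to parity."""
    assert num_drives > 2, "need more drives than parity devices"
    return (num_drives - 2) * drive_tb

print(raidz2_usable_tb(4, 3))  # 4 x 3TB drives: ~6 TB usable
print(raidz2_usable_tb(8, 3))  # a full 8-drive setup: ~18 TB usable
```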

From there the WHS 2011 VM would use that storage to hold all my media for sharing. Because WHS 2011 doesn't have any sort of data duplication (Drive Extender was removed in 2011), just putting the data on one HDD won't protect it from failure. And WHS 2011 doesn't support drives larger than 3TB (the VHD file format it uses to back up one drive to another for redundancy doesn't support images larger than 2TB).

Finally, I'd also run a Windows 2k3 Server VM as the outward-facing server, hooked to a VT-d NIC (either the 2nd one on the motherboard or a separate add-in card compatible with VT-d). This would be the web server, the server I remote into from work to browse the internet, and so on.

The difficulty is that all this requires a fair amount of computer hardware – just about $2,000 for a nice configuration (from the ground up, no recycled hardware). This includes case, PSU, motherboard, CPU, RAM, SAS/SATA controller card, HDDs, and a 160GB SSD for all the operating systems (maybe I could replace that with a cheap 500GB drive, or two in RAID-1). Considering I just upgraded my desktop this year, I'm not in a hurry to run out and spend more money on computer parts.

Initial Thoughts on the HP Touchpad

I got a great deal on the new HP TouchPad, and I had wanted to get one just to try it out. I think that Palm's (now HP's) WebOS operating system was probably the best phone/tablet operating system other than Apple's iOS.

And after an hour there are already some things I like better about WebOS than iOS, particularly how multitasking works. And there are a few things I definitely don’t like – media sync and the HP Play Beta software for the Mac.

UI

The user interface is good, especially when it comes to multitasking. The WebOS card system is the best way to handle multitasking on a mobile device. Period. Even after a few hours I like it way better than iOS's implementation. Swiping a card up off the screen to kill an app (or an app's tabs/fragments) and swiping up from the bottom to bring up the task switcher just feel more natural than the double-tap of the home button on iOS devices. And the card interface is (sadly) the smoothest experience in the entire OS.

But even after the update designed to improve performance, the touch response is still lacking. I don't know how Apple does it, but everything is always more responsive on iOS than on Android and HP phones and tablets. All the Android phones I've ever played with, and the TouchPad, have that same slightly lagged response. If iOS never existed we might never notice. But coming from iOS devices I certainly can tell, and sometimes I'll get ahead of the screen, resulting in unwanted taps. If they can't fix it (because of the way the device or OS is designed) then we'll just have to live with it, and that's disappointing.

Apps

The dearth of TouchPad-native apps is one of the bigger issues, mostly due to it being a relatively new platform. The nice part is that since WebOS apps are built with web technologies and I've got great web development skills, I should be able to make my own applications fairly easily (unlike the 10 times I tried to make an iOS app and gave up because I couldn't wrap my brain around Objective-C — too much C# and web dev have rotted my core, I guess).

That said, even some of the apps the TouchPad does have look incredibly awful. The USA Today app (also available on the iPad, and one that I use somewhat frequently) has this weird side-scrolling article list for each "section" of the newspaper, and then an article pop-over on the right side. I'm really not sure what design decisions were made, but the app you get on the iPad is infinitely more readable and usable. Similarly with the Facebook app – you just wonder what the designers were thinking. There are at least two or three better layouts they could have used.

There is a free Angry Birds HD app. So, there is that.

Media

For me, the media experience didn't go so well. I have a Mac and currently manage everything through iTunes. My goal was to load about 10GB of music from my library and some movies (540p H.264 medium profile, aka iPad 2 compatible). Loading music is awkward because you have to put the device in USB disk mode, which keeps you from doing anything else while the transfer is going. The HP Play beta software did a half-decent job of importing my iTunes library, but it didn't do very well with Smart Playlists or the folder/playlist hierarchy I have. Also, I tried to add my "Recently Added" songs to the TouchPad, and it tried to import my entire music library (rather than the songs I had recently added to the iTunes library it imported from). Video didn't turn out well either: I copied Iron Man 2 to the device but wasn't able to play it. Googling for help didn't work, since all the links end up being for pay-for software to convert video down for the device, with not much about specifications or what Handbrake settings I could use. As someone who has a large and carefully manicured digital movie and TV show library, it's something of a letdown to not have it work right out of the box.

Edit: I was able to get some movies from my media library working, but many didn’t work. Considering I used the same settings on all the rips, I really don’t know what the problem is.

Follow Up – August 20

HP announced they're discontinuing the TouchPad, which is incredibly heartbreaking, knowing they could have done something really good with it. I took mine back and got my $400 back – luckily it was the last day of the 14-day return period. Then, as the fire sales happened around the internet this weekend, I spent $99 on a new 16GB TouchPad. So I'll still end up with one, but for a lot less.

How the iCloud could be huge

One of the key things that I think a lot of us techies are overlooking is that we're used to syncing music and movies to our iPhones and iPads. It's second nature for us to pick playlists, artists, etc. to copy over. A few weeks later we want to change it up, so we fiddle with the options and sync again.

But what if we didn't have to put up with that garbage anymore? What if we just added new music to our iTunes collection – either by buying it from Apple or adding it to iTunes ourselves – and it showed up on all of our devices? And what if our iDevices were smart enough to know what music we like and listen to, and just used local storage as a cache? It seems to have the following benefits…

1. Increase usability of the iTunes Store – purchases are sideloaded into your cloud storage and then pushed down to your devices automatically if you're on WiFi (and manually if you're on 3G). Amazon lacks the hardware device to make this work, and Google's music store doesn't really have any traction.
2. Increase usability of the iTunes app – now you don't have to manage your music syncing preferences; it just goes and does its thing. As long as Apple's caching algorithm is smart enough, it'll be fine.
3. This can also extend to apps & app data, podcasts, etc. – everything except movies, which are too large to sync over WiFi/3G (though they could be re-encoded to lower bitrates and streamed to devices like the AppleTV and iPad over fast home broadband connections and WiFi).

Doing a mental "full stop" on the current way iTunes works and rethinking how to architect it with the iTunes Store and iCloud at the center makes it really compelling for all of the non-experts who buy Apple products because they're easy. Put another way, I can teach my parents how to do this: buy a song from the Apple music store, wait 30 seconds, music shows up on the iPhone, hit play, listen.

The big question isn't whether the concept of the cloud will work, but whether people are into music enough to do this (especially with a monthly fee), and whether Apple can pull it off without any glitches (like MobileMe had). We'll see in the next few weeks.

Remember, Apple's ultimate goal isn't necessarily to sell you another service to add to their revenue; it's to make the iPhone more compelling than any Android, RIM or Windows phone. To get users to say: wow, that is really amazing, I need to get an iPhone because it fits me and my lifestyle.