So, Federal Senior Execs To Get Nods For Customer Service

December 21, 2014

When President Obama deigned to meet with members of the Senior Executive Service and other managers earlier this month, some three thousand of them showed up. Not a lot of new ideas rolled out, but the old “customer service” idea resurfaced. Specifically, the still-to-be-developed idea of giving some sort of award or bonus to SESers whose agencies deliver excellent customer service.

This notion goes back a while. In the George H.W. Bush administration, they called it service-to-the-citizen. That was before online services came along, but the idea of government services equal to what people get from the private sector took hold. Notwithstanding that customer service in many areas of the private sector stinks, the idea has endured. Nowadays the comparison mostly refers to online and, to some degree, telephone service. My take: online chat, the newest form of customer service, is something the government ought to explore more. I’ve resolved many a technical issue by using chat rather than e-mail (rarely any good) or telephone (it depends).

I remember a one-star Army general who retired and went to work for a large software company, working in its federal division. At an editorial retreat I held for one of my magazine staffs, he was a guest speaker. He got a lot of laughs when he said, “I was trained to break up things and kill people! Now I’ve got to learn to delight the customer!”

I’m not sure what “delighting the customer” might mean for services from the government, but the latest online trend seems to be a hybrid of online transactions executed flawlessly, together with what they used to call high touch, individual-to-individual followup. It’s actually not that new. Six or seven years ago I ordered occasional computer parts from CDW, and the e-mail receipt always had the signature of a real person with a direct phone number.

Let me tell you about a really delightful commercial experience I had this month. Having been a four-eyes since the age of six, I’ve bought many a pair of glasses. I did contacts for 30 years, but gave up on them because of the discomfort and the tiresome routine. A couple of years ago I bought three pairs of glasses from a storefront shop — two contrasting styles for daily wear and retro-looking sports glasses for running (think Kareem Abdul-Jabbar). The three pairs cost me more than $2,000. The store provided fine service, if you back out the schlep of driving there twice, parking, and waiting in the store for help. Recently I lost one of the pairs, the Ray-Ban frames, and I realized I couldn’t read a computer screen with any of the others. It was affecting my broadcast delivery. I could see close up and far away, but not at that magic 18 to 24 or so inches.

Having read about the online glasses phenomenon, I decided to risk it. Long story short — the outfit sends customers five pairs to try on. I e-mailed them a picture of my fancy-shop prescription together with a selfie with a credit card held under my nose, pressed against my upper lip. This wasn’t the payment system; it was how the retailer could figure out the distance between my pupils, since a credit card is a universal, fixed size. Get it? The prescription I had was for so-called progressive lenses, bifocals without the little line in the middle. The optician there extrapolated a single-focus prescription from it.
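
For the technically curious, the math behind the credit card trick is simple scaling. Below is a minimal Python sketch of my own reconstruction; the retailer surely does this with image-recognition software, and the function names and numbers here are purely illustrative. The one hard fact in it is that a standard credit card is 85.60 millimeters wide.

```python
# A minimal sketch of the scaling trick (my reconstruction, not the
# retailer's actual code). An ID-1 card, which is every credit card,
# is a standard 85.60 mm wide, so anything measured in pixels in the
# same photo can be converted to millimeters.

CARD_WIDTH_MM = 85.60  # ISO/IEC 7810 ID-1 standard card width

def pupillary_distance_mm(card_width_px: float, pupil_gap_px: float) -> float:
    """Convert a pixel measurement of pupil separation to millimeters,
    using the credit card in the frame as the known reference length."""
    mm_per_pixel = CARD_WIDTH_MM / card_width_px
    return pupil_gap_px * mm_per_pixel

# Example: if the card spans 620 pixels and the pupils sit 450 pixels
# apart, the pupillary distance works out to about 62 mm, a typical
# adult value.
print(round(pupillary_distance_mm(620, 450), 1))  # -> 62.1
```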

I felt I was taking a risk, but at only $99 for frames and coated lenses, it was a tolerable one if the glasses turned out junky. I’d only be out a c-note. The glasses arrived a few days later in the mail, after a couple of clarifying e-mail exchanges with an actual person at the retailer.

Amazing! They are perfect. All frames, including the fancy designer names, are made in China. These were imitations, privately branded, and as nice as anything in the storefronts. The glasses arrived inside a soft drawstring bag, inside a hard clamshell case. Nothing cheap about them at all, but a third the price I’d have paid at the shop connected to my ophthalmologist. Most important, I can finally see a computer screen. I ordered a second pair — this time with only a couple of mouse clicks, since their system remembered me. Not only that, the same person with whom I’d corresponded sent me a chocolate bar in the mail — with a handwritten note of thanks!

For me, that’s the new “delightful” bar for great customer service. The federal government isn’t going to send chocolates. But it can, with the right resources and focus, reach the kind of service that makes people say, “Gosh, that worked out pretty good!”

AMD Betting On Cloud-Mobility Convergence

March 7, 2014

Cloud computing and mobile computing are somehow mutually reinforcing trends. But how, exactly? One can exist without the other.

I heard a really good answer from Rory Read, the chief executive officer of AMD. Advanced Micro Devices, the perennial No. 2 to Intel in the PC processor market, is in the midst of a bet-the-company strategy change. The PC in the traditional form of a box or a notebook, while not disappearing, has lessened in importance as a platform for innovation. So much energy is going into mobile devices — smart phones, phablets, tablets, and “ultra” this, that and the other with touch screens and solid state drives.

These hardware form factors have been accompanied by a change in the way applications are programmed and architected. Apps exist independently of the data stores they draw from. That means they can synthesize output from lots of data sources, from the rows and columns in relational databases to the geographic data that describes the whole world. Increasingly, Read and others believe, that data will be stored in public clouds. Much of it already is, when you consider how many organizations offer up their information to be available to apps.

Including, of course, federal agencies. At Federal News Radio we’ve reported for years about the galaxy of policies and initiatives the Obama administration has launched around open data, open government and data as a valuable commodity. Lord knows few organizations churn out as much data as the U.S. federal government.

Read said, “The cloud changes everything. It dis-intermediates the device from the data.” The mobile device also re-integrates unlike data, or data from multiple sources, via apps and displays it on a high-resolution screen. So you’ve potentially got a high degree of both numerical and graphics processing required at once — rows-and-columns computing and visually intensive computing. How that happens inside a mobile device is largely a function of the microprocessor architecture. It’s fundamentally different from the x86 architecture found in most PCs and servers for the past umpty-ump years. Read freely acknowledges that Intel missed the mobility movement with its chips. And because AMD was so focused on going toe-to-toe with Intel, it too missed mobility.

An irony is that the ARM chip architecture so widely used in the hundreds of millions of mobile devices sold each year is itself no newcomer: it dates to the mid-1980s, only a few years after the Intel 8088 in the original 1981 IBM PC.

For some time, analysts have been predicting that the mobile ARM architecture is headed upstream to the cloud data center itself. Read said AMD is actively working to make that happen. He reiterated AMD’s “APU,” or accelerated processing unit, approach, in which a compute-optimized and a graphics-optimized processor are fabricated on a single chip incorporating up to 32 cores. It’s coupled with a set of application programming interfaces called HSA, or heterogeneous system architecture. HSA lets programmers create applications that use either or both processing styles for today’s apps that call on disparate data types. AMD executes this now in the x86 architecture; it’s the chip, Read pointed out, in the latest Xbox One and PS4 game consoles. Gaming, he says, very much represents where computing is headed — online, with a mixture of locally cached and cloud data, and lots of computing and graphics.
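
To make the “either or both processing styles” idea concrete, here is a purely conceptual Python sketch. It is not the HSA API or anything AMD ships; it just shows the two kinds of work one application might route to different compute units, with NumPy standing in for the data-parallel, graphics-style path.

```python
# Conceptual sketch only, not AMD's HSA API. It illustrates one
# application routing "rows and columns" work and data-parallel,
# visual-style work to whichever compute style suits each task.
import numpy as np

def cpu_style_aggregate(rows):
    """Branch-heavy, record-at-a-time work: the classic CPU pattern."""
    total = 0.0
    for amount, approved in rows:
        if approved:
            total += amount
    return total

def gpu_style_transform(pixels):
    """Wide, uniform math over a big array: the pattern the graphics
    half of an APU is built for. NumPy stands in for it here."""
    return np.clip(pixels * 1.2 + 10, 0, 255)

def dispatch(task, payload):
    # A real heterogeneous runtime decides this per kernel; here it is
    # just a lookup, to show both styles living in one program.
    workers = {"aggregate": cpu_style_aggregate, "transform": gpu_style_transform}
    return workers[task](payload)

print(dispatch("aggregate", [(100.0, True), (250.0, False), (75.5, True)]))
print(dispatch("transform", np.array([[0, 128, 255]], dtype=float)))
```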

Read said AMD is sampling an ARM-architecture version of the APU-HSA combination to OEMs. He said he’s confident it will help enable a new generation of cloud computing servers that reduce the physical footprint of a unit of cloud power by two-thirds, and power consumption by three-fourths. The HP Moonshot server exemplifies this coming approach, Read said. The cloud of the future will handle larger workloads and a higher density of virtual machines per server, yet with less space and power consumption.

“The huge data center firms, the Rackspaces, Amazons, Googles, they’re all reaching big space, power and performance problems,” Read said. The new style of cloud-tuned servers “link low power server chips together to transform the data center.” Moreover, Read said, the integration of numbers and graphics processing, low power, dis-intermediation between application and data sources, and data streaming from the cloud will become the norm not just in gaming but also in military and civilian systems.

Time to fix the federal spending house of (reporting) horrors

February 18, 2014

Since childhood I’ve loved horror movies. In “The Haunting,” the 1963 film of Shirley Jackson’s “The Haunting of Hill House,” one of the characters comments on the fact that the house has no perfect right angles. She posits that if you take all of the slightly off corners and not-quite-straight lines, “the house adds up to one big distortion as a whole!” Or something like that. How I remember that line from a 1963 movie I have no idea, but it reminds me of federal finance, and of the fact that, despite the billions in sunk investments in systems, financial processes are such that when you add up all the layers, it takes something akin to archaeology for a citizen to unearth a specific fact about where and how money was spent.

The government has made a lot of progress cleaning up its books. At the recent Association of Government Accountants training conference, I caught up with Comptroller General Gene Dodaro. He acknowledged that all the agencies operating under the Chief Financial Officers Act except for Defense have achieved unqualified audits. In DOD, the Marine Corps is the first of the armed services to get there, but the greater Navy, Army and Air Force still have a couple of years to go.

Where things get out of kilter is in the broken relationships among core financial systems and the various other systems that support procurement, disbursements, and programs. This may be why, as one Pentagon manager points out, program people often don’t really know the cost bases of what they do.  Getting auditability and traceability of your cash accounting doesn’t necessarily equate to sound management.

It came to mind late last year when Reuters published a report on vast and tangled financial reconciliation distortions that occur monthly between the Navy, the Defense Finance and Accounting Service and the Treasury. So little of it matches that DFAS employees make up numbers to force everything to square up.

Recently on the Federal Drive, the Government Accountability Office’s Stan Czerwinski, its director of strategic issues, noted how well accounted for the money spent under the 2009 stimulus was. He contrasted it with the relative opaqueness of USAspending.gov, where it’s easy to get lost trying to figure out what’s going on. How ironic is that? Even with all those unqualified audit opinions, it’s still nearly impossible to find specific information about, say, how many dollars were spent with a particular contractor, or to which ZIP codes money went.


When nothing is standard, it’s easy to get lost

Like the fabled Hill House, federal spending has so many formats, data elements and definitions, that when you add it all up it’s almost impossible to decipher. Federal financial systems have their own internal logic and produce assurance that convinces auditors that everything is square. But they have no real connection to USAspending.gov.

Czerwinski noted that strong central control over the site and support from top leadership helped recovery.gov work. That is, a strong governance structure is required before agencies can decide on formats and data elements. USAspending.gov lacks that clear governance and authority structure.  Federal financial people don’t live and die by what’s in it, so they don’t pay that much attention to it.

That’s why the Digital Accountability and Transparency Act is so promising. It’s been knocking around for a year. The House passed a version. But now the White House has sent the Senate a marked-up version that waters down the requirement that financial systems be established as the source of spending information. The markup, as reported by Federal News Radio’s Jason Miller, puts the Treasury and the White House into the mix of authorities the act would establish to set standards and define data elements. It potentially spoils the idea of uniformity across government.

Equally promising are new rules for grant spending, which outweighs contract spending. The new approach sweeps away several old OMB circulars and forces agencies to unify how they account for and report grant spending. It comes from the same White House that is trying to alter the DATA Act.

The concept is simple. Builders and suppliers mean the same universal thing when they say “two-by-four,” “T-square” or “5d nail.” In the domain of federal expenditures, it’s time to make data uniform across the whole enterprise.
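
To carry the two-by-four analogy into code, here is a hypothetical sketch of what uniform spending data elements might look like. The field names are mine, not the DATA Act’s standard; the point is that one shared schema makes governmentwide questions a simple filter-and-sum.

```python
# A hypothetical illustration of uniform spending data elements. The
# field names are invented for this sketch, not the DATA Act standard.
from dataclasses import dataclass
from datetime import date

@dataclass
class SpendingRecord:
    awarding_agency: str      # one code list for every agency
    recipient_name: str
    recipient_zip: str        # one format, so ZIP-level rollups work
    award_id: str
    obligation_amount: float  # always dollars, no agency-specific units
    action_date: date

# With every agency reporting against one schema, "how many dollars went
# to a particular contractor?" becomes a simple filter and sum.
records = [
    SpendingRecord("DOD", "Acme Corp", "22102", "W91-0001", 1_200_000.0, date(2014, 1, 15)),
    SpendingRecord("VA", "Acme Corp", "22102", "VA-0042", 450_000.0, date(2014, 2, 3)),
]
print(sum(r.obligation_amount for r in records if r.recipient_name == "Acme Corp"))
```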

What Should Be Ahead For Federal IT In 2014 After Hurricane ACA

January 6, 2014

The big difference between the failure of healthcare.gov and all other federal IT development failures is this: Establishment of the site was inseparable from the law underlying it. There is no manual healthcare plan exchange. By contrast, failure to automate VA disability case processing or Office of Personnel Management retirement annuity calculations didn’t stop the activities the automation was supposed to support. They existed long before attempts to automate them. The fallback therefore consisted of using the existing process, maybe adding people in a surge to clear backlogs. As we’ve seen with the almost inane, and probably illegal, on-the-fly rewrites of the ACA’s requirements and deadlines, the online exchanges have no real fallback.

In all other respects, healthcare.gov was like other failed systems, the result of a toxic stew of poorly defined requirements, the wrong contractor, insufficient oversight and unclear lines of authority. Moreover, healthcare.gov was what used to be called, disparagingly, a “grand design.” The antithesis of agile, spiral development, it came from planners trying to spawn a nearly impossibly complicated system all at once. That approach has never worked, and it never will.

There’s nothing in the Federal Acquisition Regulation that caused this system failure. That is, it wasn’t a procurement failure, as I’ve written before. And there’s nothing in, say, the proposed Federal Information Technology Acquisition Reform Act that would necessarily prevent it. Thinking, sound management, following the rules already there — that is how these things can be kept from happening.

In short, it was a big and depressing disappointment, all the more so because of the presence of so many celebrated techies in the administration who seem to have been oblivious to what was going on at the Centers for Medicare and Medicaid Services. Now comes word that the administration wants to fast-track the hiring of more technical talent into government ranks. That may sound good, but it won’t solve the problems exhibited by healthcare.gov — lack of project management skill, requirements control, and clear lines of authority. When those things are in place it doesn’t matter whether the coding talent is in-house or contracted.

I say all of this as prelude to what I hope lies ahead in 2014 for federal IT, and in the hope that the lessons learned will be the correct ones. Because there’s no doubt that healthcare.gov was the biggest federal IT story of 2013. It is the Obama administration’s Hurricane Katrina.

Lots of published lists of technology predictions have already hit. Many analysts think Google Glass will be a big deal. I have a personal vow never to have a conversation or any other engagement with anyone wearing them. Anyway, I think they’ll end up being the Nehru jacket of technology — a few nerds will sport them for a while. Others are hoping for flexible smartphones. Forgiving him the Google Glass reference, I think Bob Gourley’s list of tech product trends, as published in Federal Times, is otherwise pretty good.

Here’s what I’m hoping to see more of in the government IT market in the coming year. I warn you, no glamor or drama.

  • Rational cybersecurity. If healthcare.gov was the biggest story, the next biggest, or maybe bigger, story was the Edward Snowden drama. Savior of freedom or traitor, he certainly was the ultimate insider threat turned real. A group of Chinese government IT people I spoke to recently pointedly asked what changes in procedures had been instituted since Snowden. The episode brings together the need for continuous monitoring, ID management and specific procedures to prevent anyone from mass downloads, even if it’s the director of national intelligence.
  • Get mobility right. Let go of the BYOD idea. It only works in the public sector if a narrow set of devices is allowed and the agency has access to and control of the devices. They might as well be government-furnished. And let go of the notion that the “desktop” PC is dead. Pick the right device for the right situation. The hard part is software. Making applications work both mobile and fixed, and managing the licenses, are the two hardest tasks.
  • Rightsize infrastructure. Really and actually find ways to boost interagency services sharing so the net spending on data center elements at least stops growing. Data center consolidation efforts have been going on for 20 years. It’s time to get serious about it.
  • Become a model for the post-password world. Time’s up. Everybody out of the password pool; join 21st-century ID management.
  • Become the Dominique Dawes of development. Agile, that is. Approach every development project as if it were an Olympic final vault. Nail it, then smile. It’s more than a matter of using this development library or that project management scheme. It’s a whole approach that starts with thinking and visualizing the end — then being the toughest S.O.B. there is when it comes to testing and requirements.

Two posts: An app for biz-dev; a closer look at CMS claims

December 2, 2013

Mobile app puts biz-dev in your hand

Sometimes good things do come in small packages. I’ve been intrigued by an iPhone app called Hord, pronounced hoard. The publisher, startup GovTribe, spells it with the “o” adorned by an overline — a character unavailable in WordPress. Hord is an example of using government data to build apps, but not quite in the way the Obama administration has been pushing. The app is free; the service will cost $5 per month after a user tries it for 30 days. So Hord is not in the price league of Bloomberg, Deltek or Govini.

It’s also not a comprehensive environment with consulting and a large and growing database to consult. What it is, is a way of getting instant delivery of changes in solicitations from specific agencies or specific product categories. In later versions, company co-founder Nate Nash told me, users will be able to look for specific product solicitations from specific agencies, for example, “mobile technology” from “Agriculture Department.”

Hord pulls data from the Government Accountability Office, General Services Administration, System for Award Management (SAM) and USASpending.gov. You pick the agencies and categories you want to “hoard” and then receive automatic push notifications when anything changes. One new item tells me the Coast Guard Surface Forces Logistics Center is requesting quotes for a bunch of diesel engine parts, that it has added a small business requirement, and that it was posted by Erika R. Wallace. It even gives me Ms. Wallace’s e-mail address and phone number. Other feeds track awards and protests, and you can even see which hords are popular.
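
The general pattern behind an app like this is straightforward: poll the sources you have chosen to hoard, compare against what you saw last time, and push a notification for anything new. Here is a rough, hypothetical Python sketch of that loop. It is not GovTribe’s code, and the stub below stands in for the real agency feeds.

```python
# A rough sketch of the watch-and-notify pattern, with a stub in place
# of the real FBO/SAM/USASpending feeds. Hypothetical data shapes only.

def fetch_solicitations(agency):
    # Stand-in for a call to the agency's solicitation feed.
    return {
        "USCG-SFLC-2013-101": "RFQ: diesel engine parts, small business set-aside",
    }

def check_for_changes(agency, last_seen):
    current = fetch_solicitations(agency)
    new_items = {k: v for k, v in current.items() if k not in last_seen}
    for sol_id, summary in new_items.items():
        print(f"Push notification: {agency} {sol_id}: {summary}")
    return current  # becomes "last_seen" on the next poll

seen = check_for_changes("Coast Guard", last_seen={})
```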

As an app, Hord is fast and stylish, a total mobile conception. Nash said GovTribe will issue a web version for the office later on.

Why Healthcare.gov still isn’t fixed

If you read the 8-page fix-it report from the Centers for Medicare and Medicaid Services, you could be fooled into thinking, “Gee, job done!” But look carefully and you’ll find claims that are hard to verify and others that don’t point to a business-grade level of Web site operation.

In particular, I note that as of Sunday, CMS was claiming 95.1% availability. That translates to nearly a day and a half per month of downtime. A CIO responsible for a high-activity commercial site would be canned for that level of performance if it lasted very long. Most strive for the “five nines” level of availability, which translates to about five minutes of downtime per year. Even 99% uptime means more than three and a half days down per year.
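
The arithmetic, for anyone who wants to check it, is one line per claim:

```python
# Back-of-the-envelope availability math behind those figures.
HOURS_PER_MONTH = 30 * 24
HOURS_PER_YEAR = 365 * 24

def downtime_hours(availability, period_hours):
    return (1 - availability) * period_hours

print(downtime_hours(0.951, HOURS_PER_MONTH))        # ~35 hours a month, about a day and a half
print(downtime_hours(0.99, HOURS_PER_YEAR) / 24)     # ~3.65 days a year
print(downtime_hours(0.99999, HOURS_PER_YEAR) * 60)  # ~5.3 minutes a year, the "five nines"
```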

CMS is claiming a 4x “registration throughput” improvement, a 3x “database throughput” improvement, and a 2x “capacity” increase — together with a 5x network throughput improvement. Taking the agency at its word, the bottleneck would therefore be “capacity.” Consistent with the weakest-link theory, even with a 10x network throughput increase, users will only see improvement as great as the smallest improvement in the chain. For a site that was down nearly half the time and slow when it was operating, a 2x limiter on performance doesn’t sound like a triumph.
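
Put another way, the user-visible speedup is bounded by the smallest number on the list, which a few lines of Python make plain:

```python
# The weakest-link point in miniature: the overall improvement a user
# sees is capped by the smallest claimed speedup in the chain.
claimed_speedups = {
    "registration throughput": 4,
    "database throughput": 3,
    "capacity": 2,
    "network throughput": 5,
}
bottleneck = min(claimed_speedups, key=claimed_speedups.get)
print(bottleneck, claimed_speedups[bottleneck])  # capacity 2
```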

The whole episode calls into question President Obama’s belief that the failure of healthcare.gov’s launch was connected to the government’s inefficiency at IT procurement. Procurement is a popular canard. Sometimes it really is the problem. But not in this case. The awards to contractors were made in reasonable time using an existing multiple-award vehicle in place at CMS. The technologies used are neither exotic nor out of date as a result of slow procurement. It looks to me like a matter of pure project management, or lack thereof.

CMS cites twice-a-day “standup war room” meetings. It says “the team is operating with private sector velocity and effectiveness” with “clear accountability and decision-making.” Well, okay. The report shows things going in the right direction. But it deepens the mystery of what was going on since March 2010 when the Affordable Care Act became law.

One of the Big Guys Contributes to Open Source

November 25, 2013

Over the years I have been regularly reminded of how much software permeates industrial activity far beyond the software industry itself. Thirty years ago the procurement director at McDonnell Douglas told me that software was the most challenging thing to predict when planning and executing aircraft design and production schedules.

A lot of software expertise resides in companies that don’t publish software. In the national defense domain, software development is diffused among Defense Department components and its many contractors.

That’s prelude to why I was intrigued by a press release last month from Lockheed Martin. Engineers in the company’s Information Systems and Global Solutions unit donated a data search engine and discovery tool to the open source Codice Foundation. Codice is a 2012 offshoot of the Mil-OSS project. Both work to move defense-related software from the proprietary world into the open source world. Codice members modeled the effort after the granddaddy of open source foundations, Apache.

By the way, Apache last month released a new version of the widely-used Hadoop framework for distributed, big-data applications. It was a very big deal in the open source and big data application development world. For the uninitiated, here is a good explanation of Hadoop from InformationWeek.

Lockheed donated to Codice what the company describes as the core software in the Distributed Common Ground System (DCGS) Integration Backbone, or DIB. It’s a mouthful, but DOD uses it to share ISR (intelligence, surveillance and reconnaissance) data. The donated core is dubbed the Distributed Data Framework (DDF).

Andy Goodson, a Lockheed program manager, explained that the DIB started as an Air Force-led joint program a decade ago. Its basic function is to make data discoverable. As an open source tool, the DDF can be adapted to any domain where multiple data sources must be rendered discoverable by an application. It makes data, wherever it resides, available to application algorithms as well as to the processes that present that data to users.

In effect, the DDF furthers the fast-emerging computing model that treats data and applications as separate and distinct resources. The federal government has been trying to adapt to this model for some time, as manifest in (among other things) the Digital Government Strategy. In practice, to be useful, data even from disparate sources within a domain must adhere to that domain’s standards. But it need not be connected to a particular algorithm or application, so the data is reusable.
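
Here is a toy Python illustration of that model. It is not the DDF’s actual interface, just the shape of the idea: sources register with a catalog, and applications query the catalog without caring where or how each source keeps its data.

```python
# A toy catalog showing data sources and applications kept separate.
# Not the DDF's real API; the source names below are invented.

class Catalog:
    def __init__(self):
        self.sources = []

    def register(self, name, search_fn):
        # Each source supplies its own search function; the catalog
        # never sees the underlying storage format.
        self.sources.append((name, search_fn))

    def query(self, term):
        hits = []
        for name, search_fn in self.sources:
            hits.extend((name, item) for item in search_fn(term))
        return hits

catalog = Catalog()
catalog.register("imagery", lambda term: [f"image tagged '{term}'"])
catalog.register("reports", lambda term: [f"report mentioning '{term}'"])
print(catalog.query("bridge"))
```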

Goodson said Lockheed found that the code, because it is independent of any particular data source, was not subject to export control even though it was used in a sensitive military environment. The company had advice and counsel from its federal client in making this determination. Now the software is available through Codice to other agencies, systems integrators and developers dealing with big data applications.

Smartphone vs. classic Instamatic: Fast fotos then and now

November 6, 2013

Hit cable TV drama “Mad Men” is noted for its attention to period detail. In one recent episode I spotted a character using a Kodak Instamatic 104 camera. Remember Flashcubes? I would wager a million closets still harbor an Instamatic tucked into the darkness. Nearly every family had one model or another of an Instamatic from the camera’s introduction in 1963 to its demise in 1988. If you went to college and traveled in the late ’60s through the ’70s, you probably still have scrapbooks filled with square, somewhat garish color photos. The handsome Instamatic 100 and 104 were slightly clunky and, as pieces of industrial design, evoke their era, sort of like the democratic Ford Pinto:


Instamatic


Pinto dash

Just as no one would consider the Pinto for limousine service or luxury motoring, no one ever considered the Instamatic for any serious photography application.

Today’s Instamatic is the smartphone. Or used to be. The iPhone camera app even has a feature that produces square pictures, the format of the Instamatic’s 126 cartridge film. Like the point-and-shoot film cameras, smartphones lack zoom or interchangeable lenses.

That’s where the similarity ends.

Thanks to photographic software that was unknown when the Instamatic reigned, even point-and-shoot smartphones have become fairly powerful imaging devices. Enhanced with a tool like True HDR, which compresses a dynamic range that otherwise is too wide for a smartphone sensor, you can produce astonishing pictures. As organizations, including government agencies, mobilize their workforces, smartphones are fully capable of standing in for “real” cameras in a variety of situations. HDR tools give you the digital equivalent of the old maxim, “Expose for the shadows, develop for the highlights.” Below is a straight shot with the iPhone camera app. The second is an HDR photo. Note the “bald” sky and lack of detail in the dark leaves on the left. (A bare-bones sketch of the exposure-fusion idea follows the photos.)


Regular camera app image


HDR enhanced image
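
For the curious, here is that bare-bones Python sketch of the exposure-fusion idea behind tools like True HDR. It is not that app’s actual algorithm; it simply blends bracketed shots, weighting each pixel toward whichever exposure captured it best.

```python
# Bare-bones exposure fusion: blend bracketed shots, favoring pixels
# that landed mid-toned rather than blown out or crushed. Illustrative
# only; real HDR apps also align frames and tone-map the result.
import numpy as np

def fuse_exposures(exposures):
    """exposures: arrays with values in 0..1 (e.g., dark, normal, bright)."""
    stack = np.stack(exposures)
    # Well-exposed pixels sit near 0.5, so weight them most heavily.
    weights = 1.0 - np.abs(stack - 0.5) * 2 + 1e-6
    return (stack * weights).sum(axis=0) / weights.sum(axis=0)

dark = np.array([[0.05, 0.10], [0.20, 0.30]])
normal = np.array([[0.40, 0.55], [0.70, 0.95]])
bright = np.array([[0.80, 0.90], [0.98, 1.00]])
print(fuse_exposures([dark, normal, bright]).round(2))
```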

But after spending a few weeks concentrating on the finer points of smartphone photography, I still feel their tiny lenses present the biggest challenge. I say that in contrast to their lack of true zoom lenses, which is becoming less of a disadvantage (forget about digital zoom).

With enough pixel density and a perfectly steady hand, you can enlarge any photo to the magnification you want because perspective is a function of distance from the subject, not the focal length of the lens. In zooming in on a wide-field shot, though, you see the limited pixel density give out to the digital equivalent of film grain pretty quickly. So for serious telephotography, you still need a good digital body to which you can attach a telephoto lens.

This picture shows a distant hillside development through glass a few feet away:

Full-frame shot of the distant hillside

Here is a part of the same image enlarged as if shot with a telephoto lens. You get the telephoto appearance, but the resolution doesn’t quite hold up. In fact it has the approximate resolution of an Instamatic image:

The same image enlarged to telephoto framing

According to iPhoto data, the picture is 3264 by 2448 pixels, about 8 megapixels in a file of roughly 3.1 megabytes, insufficient to blow it up too much. Given the progress in lenses, software and resolution, I feel that eventually smartphone cameras will come close to rivaling expensive digital cameras. Newer smartphones are pushing past 8-megapixel sensors. But you’ll still need some way to hold the phone-camera steady and fire it without poking it.
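
The numbers explain why: crop an 8-megapixel frame down to telephoto framing and you throw away most of the pixels. A quick Python check, using a 3x crop as the example:

```python
# Cropping as "digital telephoto": a 3x crop keeps only the center
# third of each dimension, so just one-ninth of the pixels survive.
width, height = 3264, 2448
crop_factor = 3

full_mp = width * height / 1e6
cropped_mp = (width / crop_factor) * (height / crop_factor) / 1e6
print(round(full_mp, 1), round(cropped_mp, 1))  # 8.0 0.9
```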

I was thinking about all of this on a recent two-week, overseas vacation trip. At one time I would have traveled with two Nikon Fs and four or five lenses together with fistfuls of Kodachrome canisters. But I never really made the transition to professional grade digital photography. This trip was the first time I tried to do more or less serious photography with an iPhone.

My conclusion: At this point, smartphones are pretty darn capable photographically, but it’s an effort to produce the effect you want if you’ve spent decades with large, low-glare lenses and complete control over f-stop and shutter speed. Still, for some commercial and documentary uses, the latest generation of smartphones are capable photography devices. With their communications functions and sharing apps they are more useful in some situations than regular cameras. The other benefit is, now I’m starting to become re-inspired about photography itself, which at a long-ago stage in my life was a consuming passion.
