Tuesday 1 September 2015

The new IT is all about the customer



Enterprises have turned outward, investing more and more in improving the customer experience -- giving those closer to customers power to make key technology decisions

A decade ago, most of the grand IT initiatives I heard about involved optimizing internal business processes and reducing the cost of IT. But the direction of technology has shifted -- to improving the customer experience and deploying new applications that are the public face of new and continually improving products.

Decentralization. Technologists, particularly developers and engineers, have become integrated into lines of business. IT no longer likes to be seen as a separate entity. Instead of remaining a static, isolated cost center, IT has spread itself into every corner of the business and associated itself with driving revenue.

Self-service. A highly centralized IT entity cannot keep up with increased demand for customer applications and improvement. Either lines of business have mechanisms to procure resources internally -- or they turn to outside resources, including professional services and cloud service providers.

Outward-facing analytics. Tracking customers goes beyond conventional trends such as seasonal demand to detailed profiling and analysis of behavior, such as that represented by Web clickstreams and analysis of social media.

Increased risk awareness. Technology has become so central to the enterprise that its failure has disastrous consequences. Outages are no longer acceptable and data breaches get CEOs fired.

So how do these four trends affect the technologies enterprises invest in? To answer that question, you need to begin by acknowledging how much more is being demanded of enterprise technology.

Focusing on customers requires a multiplicity of Web and mobile applications that can change continually and scale at the drop of a hat. Building the infrastructure and recruiting the human capital to execute on that endeavor now consumes more and more of the technology spend.

Moreover, enterprises can no longer turn a blind eye to substandard enterprise applications for their own employees. In particular, sales and field service personnel need highly usable applications that can be modified easily as customer needs change, while business analysts need self-service access to analytics, rather than waiting for reports from business intelligence specialists.

To meet this rising demand, applications must be built using preexisting parts rather than from scratch. Those parts have several sources, and in most cases, developers are choosing which to use:

Shared services. Today many Web and mobile applications are built using a microservices architecture. Instead of building monolithic applications with all sorts of internal dependencies, you create an array of shared, API-accessible services that can be used as building blocks for many different applications. (A minimal sketch of such a service appears after this list.)

Open source code. GitHub and other cloud repositories enable developers to share and consume code for almost any purpose imaginable. This reflects today's practical, non-ideological open source culture: Why code it yourself if someone else is offering it free under the most liberal license imaginable?

Cloud APIs. Cloud service APIs from Google, Facebook, LinkedIn, and PayPal have become stalwarts for Web and mobile developers -- and are easily integrated into microservices architectures. Hot new APIs like those offered by Stripe for e-payments emerge all the time, along with more specialized plays such as the popular Twilio for telecom services. (See the payment-API sketch after this list.)

Frameworks everywhere. Programming frameworks, available for all the popular languages, free developers from having to worry about the nonessential details of application development. Choosing the right programming frameworks for the job has become so critical, they've even been referred to as the new programming languages.
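
To make these building blocks concrete, here is a minimal sketch of a shared, API-accessible service, written with Python's Flask framework. The service name, route, and data are illustrative assumptions, not any particular company's design:

```python
# A minimal shared "customer" service exposing one JSON endpoint.
# Hypothetical names and data, for illustration only.
from flask import Flask, jsonify

app = Flask(__name__)

# In-memory stand-in for whatever backing store a real service would use.
CUSTOMERS = {1: {"name": "Ada Lovelace", "tier": "gold"}}

@app.route("/customers/<int:customer_id>")
def get_customer(customer_id):
    customer = CUSTOMERS.get(customer_id)
    if customer is None:
        return jsonify(error="not found"), 404
    return jsonify(customer)

if __name__ == "__main__":
    app.run(port=5000)
```

Any application that can speak HTTP can reuse this one service, which is the whole point of the approach. Consuming a cloud API is even less work. Here's roughly what a charge looks like through Stripe's official Python library -- a hedged sketch, with a placeholder API key and "tok_visa", one of Stripe's published test tokens:

```python
# Charging a test card through Stripe's payments API.
import stripe

stripe.api_key = "sk_test_your_key_here"  # placeholder test key

charge = stripe.Charge.create(
    amount=2000,        # amount in cents: $20.00
    currency="usd",
    source="tok_visa",  # Stripe's test Visa token
    description="Example charge",
)
print(charge.status)    # "succeeded" on a successful test charge
```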

Wrapped around these prebuilt elements are modern development approaches such as agile methodology, which stipulates an iterative, piece-by-piece development process that continually solicits feedback from business stakeholders. Devops tools enable developers to provision their own virtual infrastructure -- or, alternatively, have operations reconfigure dev and test environments faster.
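
As a hedged illustration of that self-provisioning idea, here is roughly what launching a small virtual server looks like with boto3, AWS's Python SDK. It assumes credentials are already configured in the environment, and the machine image ID is a placeholder:

```python
# Self-service provisioning: launch one small EC2 instance.
import boto3

ec2 = boto3.resource("ec2", region_name="us-east-1")

instances = ec2.create_instances(
    ImageId="ami-12345678",  # placeholder machine image ID
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)
print("Launched:", instances[0].id)
```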

Underlying this new, high-speed application assembly line is cloud infrastructure. Wildly unpredictable fluctuations in the number of public users, as well as demands on shared services that may be used by many applications, require an infrastructure that can pour on compute, storage, or network resources as needed.

For customer-facing applications, cloud has become the default. In most cases, enterprises are turning to public IaaS or PaaS providers such as Amazon Web Services or Microsoft Azure rather than trying to build private clouds from scratch.

Perhaps the most profitable area of big data involves gathering clickstream data about user behavior to optimize applications and make it easier to, say, compare and purchase products through an e-commerce application. Big Web companies such as Yahoo are way ahead in this area, with petabytes of data on HDFS to support mobile, search, advertising, personalization, media, and communications efforts.

Enterprises are pouring money into Hadoop, Spark, and Storm deployments -- as well as technologies such as Hive or Impala that enable you to query Hadoop using SQL. The most exciting area today, however, is streaming analytics, where events are processed in near real time rather than in batches -- using clusters of servers packed with huge amounts of memory. The Storm-plus-Kafka combination is emerging as a popular streaming solution, but there are literally dozens of open source projects in the Hadoop ecosystem to experiment with.
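
As a rough illustration of the consume-and-aggregate step in such a pipeline -- a bare-bones stand-in for what a Storm topology would do, not Storm itself -- here is a sketch using the kafka-python client. The topic name, broker address, and message format are assumptions:

```python
# Near-real-time page-hit counting from a Kafka topic.
from collections import Counter
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "clickstream",                      # hypothetical topic of page-view events
    bootstrap_servers="localhost:9092",
)

page_hits = Counter()
for message in consumer:
    page = message.value.decode("utf-8")  # assume each event is a page URL
    page_hits[page] += 1
    if sum(page_hits.values()) % 1000 == 0:
        print(page_hits.most_common(5))   # rolling top-5 pages
```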

Enterprise adoption of these new analytics solutions tends to be somewhat haphazard. Some enterprises encounter problems managing Hadoop at scale; others experiment without clear objectives, resulting in initiatives that never get off the ground. Still others may roll out unconnected projects using similar technology and duplicate their efforts unnecessarily. To avoid the latter case, InfoWorld's Andrew Oliver notes that deploying "Hadoop as a service" is becoming a common pattern: With sufficient preparation, various business units can obtain Hadoop analytics self-service style from a large, centralized, scalable hub.

While most IT decision-making is no longer top down, getting serious about security needs to come from the top. That's because security almost always has a negative effect on productivity -- adding more steps to go through -- and diverts technology resources toward fixing vulnerabilities and away from meeting business goals.

But as many enterprises have learned the hard way, you can focus on user experience all you like, but if a data breach exposes customers' personal information, your brand may never be trusted again.

Making security a high priority needs to come from the C-suite, because you can't break security bottlenecks without it. For example, unpatched systems are the number one vulnerability in almost all enterprises. It would seem relatively simple to establish a program to roll out patches as they arrive, at the very least for high-risk software such as Java, Flash, or Acrobat. But in most enterprises, systems remain unpatched because certain applications rely on older software versions.

You need to carry a big stick to convince a line of business manager to rewrite or replace an application because it's too much of a security risk.

In security, best practices -- such as prompt patching and up-to-date user training -- trump technology every time, but certain security technologies have more impact than others:

Multifactor authentication. Fingerprints, face scans, or sending codes via text message to a user's mobile phone all decrease the likelihood intruders can take over an endpoint and gain access to the network. (A common software form of a second factor is sketched after this list.)

Network monitoring. First, get to know the normal network traffic flow. Then set up alerts when that flow deviates from the norm -- and you may catch data being exfiltrated from your network. (A toy version of this idea appears after this list.)

Encryption by default. Processing power has become so abundant that sensitive data can be encrypted at rest (sketched after this list). The bad guys may be able to steal it, but they can't do anything with it.
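
A few minimal sketches to make those three ideas concrete, all hedged rather than prescriptive. First, one common software form of a second factor is a time-based one-time password (TOTP), shown here with the pyotp library:

```python
# TOTP second factor: the server stores a shared secret; the user's
# authenticator app generates rotating six-digit codes from it.
import pyotp

secret = pyotp.random_base32()  # provisioned once per user
totp = pyotp.TOTP(secret)

code = totp.now()               # what the user's app would display
print(totp.verify(code))        # True while the code is still valid
```

Second, a toy version of baseline-and-alert network monitoring. Real deployments watch flow records (NetFlow and the like) with far more robust statistics; the traffic numbers here are synthetic:

```python
# Learn a baseline of outbound traffic, then flag intervals well above it.
import statistics

baseline_mb = [120, 135, 128, 110, 142, 125, 131]  # past intervals (synthetic)
mean = statistics.mean(baseline_mb)
stdev = statistics.stdev(baseline_mb)
threshold = mean + 3 * stdev

current_mb = 310  # hypothetical current interval
if current_mb > threshold:
    print(f"ALERT: {current_mb} MB outbound vs. threshold {threshold:.0f} MB")
```

Third, encryption at rest, using the Python cryptography package's Fernet recipe (authenticated symmetric encryption). The record is made up, and in practice the key would live in a key manager, not next to the data:

```python
# Encrypt a sensitive record before it ever touches disk.
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in practice: store in a key manager
f = Fernet(key)

record = b"ssn=123-45-6789"        # hypothetical sensitive record
token = f.encrypt(record)          # this ciphertext is safe to write to disk
assert f.decrypt(token) == record  # readable only with the key
```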

Again, all these security measures consume cycles that could be applied to more competitive endeavors that please customers and drive revenue. That's why senior management needs to enforce their implementation and penalize those who fail to comply. It's a lot better than picking up the pieces after a horrific data breach.

While technology decision-making has become more decentralized, security isn't the only area where central control still plays an important role.

The great risk of decentralized IT is a balkanized organization. It's one thing to empower people at the line-of-business or developer level to choose the technology they need to serve customers best. But if they're creating their own data stores in the cloud without considering how that data needs to be consolidated, or building systems that are redundant with others, or signing disastrous contracts with providers...then the agility of decentralization descends into chaos.

The answer is an architectural framework that empowers people but at the same time prevents them from making bad decisions. You need to enforce best practices and promote pre-approved services and technologies -- while giving stakeholders the latitude to experiment and the processes to evaluate exciting new solutions as they come along. That governance needs the full force of senior management behind it.

Today, IT is more federated than centralized, and it needs to be that way to serve customers best. But the policies established at the heart of IT are more important than ever, because they're what holds that federation together.

Better IT support could make users more ignorant


Researchers and theorists may argue about the strategic value Information Technology departments bring to large organizations – or even whether tech prevents social change rather than encouraging it.

But few question the value of IT support to end users – who are often reputed to be more tech savvy than their predecessors, but who may also insist that the tech they need to understand has more to do with getting video or cloud apps on their iPhones than with fundamentals like how to get back online if someone kicks a plug out of the wall.

Close support may improve productivity, but it could also create a dependency that makes end users more ignorant and less competent on tech issues in the long run, according to a new study on the dynamics of geek-human relationships and the unfortunate tendency to let experts do things for you.

In family groups, though probably in others as well, having one member who is technically skilled generally does not encourage others to learn more about the tech they all use, according to the study from researcher Erika Poole, assistant professor of Information Sciences and Technology at Pennsylvania State University.

Penn State researcher Erika Poole looks for healthy patterns in the use of technology. (Photo: Patrick Mansell, PSU)
Poole studied the tech-related behavior of 10 families asked to keep a log of every tech-related interaction or event for several weeks; each week, Poole introduced a new technical challenge, such as asking the family to set up and use an iPod.

Usually that task fell to the more tech-savvy of the adults in the household rather than, for example, to the post-Millennial children.

When the task fell to the non-technical partner, however, Poole identified a consistent problem: Non-technical members of the household often had surprising difficulty with even relatively simple technical tasks, and reported avoiding or ignoring them for fear of getting stuck and feeling like a burden when they had to ask for help.
The more geekish members of the household tended to pick up a technical task and race through it without asking about the configuration preferences of other household members or explaining what was being done so other members of the household could modify or repeat the process on their own.

"They might make decisions about computer settings, for example, without asking the other’s opinion," Poole said.

Depending on others for even simple technical help can make non-geeks even more reluctant to ask for help or training, leaving them far less able to deal with their own technical problems than they would have been otherwise -- and trapping tech-savvy members of the group in permanent tech-support roles.

It was only when the tech-supporting partner was absent that the non-geeks were pushed to learn anything.

"When they put aside their initial reluctance to do an unfamiliar task, their self-confidence ended up increasing when they finally tried it and figured it out", Poole said in a Penn State announcement of the study.

The study didn't answer any questions about how to train tech-aversive end users more effectively. It did raise questions about what constitutes technical competence at a time when it's not unusual for end users to integrate and use half a dozen cloud or mobile services to get a job done, yet be stumped by the need to reboot – or even find – the home router that lets them touch the cloud in the first place.

"Tech is becoming more important everywhere, but not everyone needs to be on the level of a systems administrator," Poole said in the announcement. "I wish I could say there’s a set list of skills that everyone needs to know, but it’s a very individual thing. It’s about learning what you need to know to navigate the technology that’s important to you."

Unfortunately, the ease of use, 24/7 availability and multi-device support that are de rigueur for cloud apps seem to have raised end-user expectations for IT as well.

Internally produced apps and the interfaces to automated support have to be simple; support has to be available 24x7, cover multiple devices and be reachable through phone, email, web, chat and other channels, according to a June 2014 Help Desk Institute study comparing the expectations of IT support groups and end users.

End-user support organizations are trying to meet those demands, but also admit they spend budget and work-hours on things that are important to them – security, reliability and efficiency of support tools and networks, for example – that are not important to end users.

And the focus is still on fixing the problem or answering the question – not on teaching end users to do things themselves.

Part of that is self defensive. You may only feed people for a day if you hand them a fish rather than teach them to fish, but at least your help desk won't be flooded with complaints about hooked fingers, tangled lines and fish that inexplicably burst into flames and exploded despite the user doing "exactly what you told me to."

It's simply easier to fix something for someone than it is to teach them to do it themselves, especially if they don't particularly want to learn.

If following that path doesn't produce users even more ignorant about the technology infrastructure on which everything they do depends, it will only be because it's not possible to fit any more ignorance into that particular group of users.

It's not all ignorance, of course. Many users consider themselves "tech savvy" because they know how to provision, integrate and use cloud apps even knowing nothing at all about the infrastructure underneath.

That's okay(ish). The cloud exists specifically to hide the ugly reality of infrastructure from the delicate sensibilities of users. It's usually not a good idea to ask end users to do too much of their own tech work, anyway.

And it is genuinely a good thing that corporate IT has finally recognized external clouds sufficiently to begin talking support for multiplatform datacenters, hybrid cloud implementations, federated security and flexible, dynamic security – as well as 24/7 multiplatform support for end users.

It's just that the good IT does by handholding and coddling end users in an effort to get their work done also makes them more ignorant about the tools and infrastructure they're using to try to continuously raise their productivity.

At some point the combination of ignorance, co-dependence and not-always-reliable technology is going to congeal into something really ugly, broken and helpless for end users, who may not even know enough about it to figure out who they should call to come fix it.









Windows 10 hits the 75-million mark


Third-party data supports Microsoft's claim that Windows 10 is off to a strong start


Microsoft this week bumped up its claim of Windows 10 devices to 75 million when a company executive tweeted that figure Wednesday.

"More than 75 million devices running Windows 10 -- and growing every day," Yusuf Mehdi, corporate vice president of Microsoft's Windows and Devices division, said on Twitter.

Mehdi had last updated the official Windows 10 tally on July 30, when he said 14 million devices were running the new OS. Microsoft began triggering on-screen Windows 10 upgrade notifications on PCs running Windows 7 and 8.1 on July 29.

While the 75 million cannot be independently verified -- Microsoft is likely citing the number of Windows 10 "activations," the check-in the OS does with Redmond's servers when it's first run to verify that it is linked with a valid product key -- it is in the ballpark of third-party estimates.

Data provided to Computerworld earlier this month by analytics vendor Net Applications showed that by its calculations 3% of all Windows-powered personal computers ran Windows 10 during the week of Aug. 2-8. That 3% translated into approximately 45 million devices, assuming there are 1.5 billion Windows systems worldwide, a figure Microsoft itself has repeatedly used.

It's not unreasonable to think that Microsoft has added another 30 million copies of Windows 10 to the running total in the three weeks since. (Net Applications has declined to provide more recent weekly user share data, saying that its engineers had disabled weekly reporting because they were revamping the back-end infrastructure to prep a new service.)

Another analytics vendor, Dublin-based StatCounter, which tracks a different metric, has posted data that also appears to dovetail with Microsoft's 75-million-device claim. The growth of StatCounter's usage share for Windows 10 -- a measurement of Internet activity -- since July 30 closely matches the increase Microsoft claimed.

The growth rate from 14 million to 75 million -- Mehdi's numbers -- represented a very strong 436%, give or take a decimal point.

Likewise, growth in StatCounter's Windows 10 usage share from the 1.34% on July 30 (when Mehdi touted 14 million near day's end) to the high water mark of 7.26% on Sunday, Aug. 23, was an almost-the-same 441%.
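
For readers who want to check the math, here is a quick sketch of the arithmetic behind those estimates (the 1.5 billion install base is Microsoft's own oft-cited figure):

```python
# Net Applications' 3% share applied to the Windows install base.
windows_base = 1_500_000_000
print(0.03 * windows_base)       # 45,000,000 devices

# The two growth factors being compared.
print((75 - 14) / 14)            # ~4.36, i.e. 436% (Mehdi's figures)
print((7.26 - 1.34) / 1.34)      # ~4.42, i.e. roughly 441% (StatCounter)
```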

Microsoft's 75 million is significantly larger than similar milestones from 2012 and earlier, measured by how fast previous Windows editions left the gate. That's not a shock; everyone has expected stronger uptake because Windows 10 is a free upgrade. Prior Windows editions were not.

In 2012, for example, Microsoft said it took Windows 8 a month to crack the 40-million-licenses-sold mark. In 2010, a little more than four months after Windows 7 debuted, the developer said 90 million licenses of that OS had been sold.

StatCounter's data supports the idea that Windows 10's start has been a record-setter for Microsoft. Windows 10's usage share after 30 days, for instance, was 37% higher than that of Windows 7 after its first 30 days of availability.

Microsoft has set a goal of putting Windows 10 on 1 billion devices within three years: The 75 million represents 7.5% of that target.

It is a good start.















Sunday 30 August 2015

Welcome to the smartphone revolution


You say you want a revolution? The smartphone market as we know it is about to be flipped upside down -- and there's no turning back from this kind of change.
        
It's been a long time coming, gang, but it's finally here. Yes, oh, yes: The smartphone revolution has officially begun.

Hang on a sec -- let me clarify. By "smartphone revolution," I don't mean that mobile technology is just now maturing or becoming an integral part of our lives. That obviously happened a while ago; we've all been toting around and depending on our devices for years, and the technology is pretty much moving forward incrementally at this point.

Nay, the revolution of which I speak isn't about the technology itself -- but rather the way we pay for the privilege of carrying it. For the first time, particularly in America, smartphones are becoming a consumer's game. Both in purchasing them and using them, the tables have at long last started to turn our way.

Stay with me for a minute, because we've got a fair amount of ground to cover here. The revolution is actually happening on a couple different fronts -- first, with the long overdue and much-deserved death of the two-year carrier contract.

The smartphone revolution, part I: The crumbling of the carrier resistance
You've heard about this pending doom by now, right? Sprint announced the other day that it's phasing out contracts and the subsidies that accompany them by the end of this year. Verizon made a similar move earlier this month, while T-Mobile killed off its contracts two years ago. Only AT&T is still clinging to the antiquated notion of locking consumers in, and we'll see how much longer that's able to last.

Of course, those of us in the know have been avoiding carrier contracts for a while. Once you realize that a "$200 phone" actually costs $700 to $900 -- and that under most carrier contract plans, you end up paying that full price and then some via the ongoing inflated monthly service charges -- it becomes a no-brainer to simply purchase phones unlocked and then find an inexpensive (and commitment-free) prepaid plan that fits your needs.
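
To see why, run the numbers on a hypothetical two-year stretch. The prices below are illustrative assumptions, not actual carrier quotes:

```python
# Two ways to pay for the same phone over 24 months.
months = 24

# "Subsidized" path: cheap phone up front, inflated monthly service.
contract_total = 200 + 90 * months   # $2,360

# Unlocked path: full phone price up front, inexpensive prepaid plan.
prepaid_total = 700 + 45 * months    # $1,780

print(contract_total - prepaid_total)  # $580 saved, in this illustration
```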

The problem is that most people are completely unaware that such options exist. In America, the norm has long been to walk into a carrier store and pick out a new device right then and there. Near-ubiquitous ads touting shiny new flagship phones for "ONLY $200!" are pretty deceptive -- and until your eyes are opened to the actual math behind those numbers, it's easy to be misled.

With contracts finally going the way of the dodo, though, the smartphone-purchasing landscape is starting to look drastically different. Everyone's suddenly realizing, holy crap, that "$200 phone" actually costs $815! Maybe the carrier is offering a way to spread out the cost over monthly payment plans separate from your base service -- in other words, a less shady and deceptive way of doing what they were doing before -- but that big final number is no longer hidden behind tiny and out-of-the-way print.


And while the initial sticker shock may come as, well, a shock to some people, trust me: This is a good thing for consumers. Over the course of two years, you'll almost certainly end up paying less than you would have on those old bloated contract plans, even with the misleading smoke and mirrors of a lower upfront cost that accompanied them. And that's not even considering if you take the time to find an inexpensive prepaid plan, like the ones I mentioned before -- which are still widely available and still generally provide better deals than what the carriers themselves will give you.

But that's only part one of the revolution.

The smartphone revolution, part II: The battle of new phone cost

The other and equally important part is the price of the phones themselves -- and this is where things really start to get interesting.

We've been seeing an explosion of surprisingly decent affordable Android phones for a while now. Most of them have been budget-level to midrange devices, but even so, many provide admirable overall experiences -- like Motorola's new Moto G, which costs $180 to $220 and packs honest-to-Goog more than enough muscle for the majority of smartphone users.

But the price drops are no longer stopping at that midrange level. Motorola is now also about to offer its new 2015 flagship Moto X phone for a mere $400. That's $400, unlocked and off-contract, for a beautifully designed and premium phone that promises to be one of the best all-around devices of the year.

That, my friends, is unheard of. And here's what's especially fascinating: That move -- along with the proliferation of the lower-end-but-still-perfectly-decent super-cheap smartphones -- is changing the way we think about mobile device pricing.

A year ago, the conversation surrounding Motorola's 2014 Moto X -- which itself was a bargain at the time, at $500 unlocked -- was that the phone was a particularly good deal. It was unusually inexpensive relative to the then-widely-accepted flagship phone norm.

This year, the conversation is taking a very different shape. It's something I've been noticing as I've worked on my review of Samsung's Galaxy Note 5 and tuned in closely to the public reaction to that device. The new Note is priced in the mid-$700-range (with slight fluctuations depending on where you buy it), which is not at all unusual for a flagship phone; in fact, last year's Note 4 was priced even higher, in the mid-$800-range off-contract, when it first came out.

But -- can you see where this is going? -- two things have changed:

People are more aware of the actual price of phones, thanks to the phasing out of carrier contracts and subsidies.

Along with that awareness, people are seeing that some Android phones are available for significantly less -- both in the budget-to-midrange realm, where you can get "good enough" for a couple hundred bucks, and in the flagship realm, where you can get a major manufacturer's top-of-the-line offering for $400.

As a result, for the first time this year, Samsung's phones are being described as expensive. Take a minute and wrap your head around that: A new flagship phone that actually costs less than its predecessor is being described as expensive -- pretty widely, if you look around -- all because of the way the environment surrounding that phone has evolved.

"Eventually, resistance will become futile

Call it commoditization or call it whatever you want, but the times are definitely a-changin'. And while Samsung still has a firm leg up over Motorola with its huge marketing budget, brand name recognition, and carrier store placement, sooner or later, "normal" smartphone shoppers are gonna start picking up on what we enthusiasts already see.

At some point, another major manufacturer will feel the pressure. Another major manufacturer will start trying to compete with Motorola at those lower off-contract prices. And then the ball will really be rolling. Little by little, over time, the base cost of buying a smartphone is going to plummet -- and eventually, resistance will become futile. It's practically inevitable.

After all, if Motorola manages to deliver on its promise of an excellent camera and superb stamina in the new Moto X (and after seeing what the company managed to do with the lower-end Moto G, I'm optimistic it can), the perception of being "expensive" starts to be a real problem for Samsung. Sure, a lot of people have grown to like Samsung's smartphones -- and the devices certainly have plenty of compelling qualities -- but for most shoppers, is there anything about them that makes them worth an extra $350 over Motorola's equivalent? (To say nothing of being worth an extra $500-and-change over the extremely low-cost alternatives.) We're talking nearly double the price from one flagship to another.

Apple might be able to pull off that kind of "premium pricing" approach, as people really do buy into that brand and the connotations it carries. But Samsung? I'm not sure Samsung is at that same level of allegiance. Not when we're talking a difference of hundreds of dollars -- a difference shoppers are about to see more clearly than ever.

(I'm using the new Note as an example, by the way, since that's where I started to notice this "expensive" discussion beginning. Samsung's smaller Galaxy S6 isn't much cheaper, though: That phone is in the mid-$600-range off-contract. And it's not just Samsung, either: Current flagships from HTC and LG are also in that same general range. This is very much an industry-wide phenomenon.)

Now, don't expect any sort of drastic changes to happen overnight. Like most things, it's going to take time for the effects of this to spread throughout the industry and start making a difference. But just like with the carrier contracts a couple of years ago, a rumbling is rising underfoot. Expectations are evolving. The writing is on the wall.

The smartphone revolution is upon us, my comrades. And for us as consumers, that's nothing but good news.

After years of oppression, the ball is finally in our court.

Wednesday 26 August 2015

PayPal expands One Touch program to new markets in Europe, Australia


The PayPal logo is seen during an event at Terra Gallery in San Francisco, California May 21, 2015.

PayPal Holdings Inc said on Tuesday it would expand its One Touch payments product to 13 new markets in Europe and Australia, bringing a simpler check-out to online shoppers and merchants.

PayPal, which separated from eBay Inc last month, is a formidable player in the fast-growing payments industry, which has attracted new entrants like Apple Inc.

The company said "millions" of consumers had already enabled One Touch to make payments on desktops and through mobile apps in the United States, United Kingdom and Canada and that half of the world's top 100 online retailers were already using the program.

Existing PayPal users can opt in to One Touch once and then no longer be required to re-enter their billing and shipping information into shopping apps or websites that support PayPal in their checkouts. Merchants that already accept PayPal will see One Touch automatically enabled, PayPal said.

A smoother shopping experience is crucial to online retailers as they often blame the high rates of unfinished or abandoned online sales on the tiring process of re-entering payment information.

PayPal said more than half of e-commerce shopping sessions take place on mobile, but that only around 10 to 15 percent of purchases occur on mobile devices.

"If consumers can't check out in one touch or tap and instead have to do data entry on their mobile device they're far less likely to complete the transaction," Bill Ready, PayPal's senior vice president, said in an emailed response to Reuters.

"With One Touch, we're bridging that gap and creating better buying experiences for consumers, which in turn means higher conversions for merchants."

PayPal said the One Touch program had led to a 50 percent or greater improvement in conversion rates and that it planned to eventually launch One Touch in all markets where PayPal operates.

The company, which has more than 169 million customer accounts worldwide, processed 4 billion payments last year totaling $235 billion. PayPal also said $40 billion was spent globally on mobile devices with the company.

Top 10 technology schools in the States

Interested in going to one of the best colleges or universities to study technology? Here are the top 10 schools known for their computer science and engineering programs.




Top technology schools

Every year, Money releases its rankings of every college and university in the U.S., and not surprisingly, a number of those top schools are leaders in the tech space. Here are the top 10 technology schools, according to Money's most recent survey of the best colleges in America.

1. Stanford University


First on the list for not only technology colleges, but all colleges, Stanford University has an impressive 96 percent graduation rate. The average price for a degree is $178,731 and students earn, on average, $64,400 per year upon graduation. Stanford's global engineering program allows its 4,850 students to travel around the globe while studying engineering. There are nine departments in the engineering program: aeronautics and astronautics, bioengineering, chemical engineering, civil and environmental engineering, computer science, electrical engineering, management science and engineering, materials science and engineering, and mechanical engineering.

2. Massachusetts Institute of Technology


The Massachusetts Institute of Technology, located in Cambridge, Mass., is the second best technology school in the country, with a 93 percent graduation rate. The average net price of a degree comes in at a $166,855, but students can expect an average starting salary of $72,500 per year after graduating. As one of the top engineering schools, it's ranked number 1 for chemical, aerospace/aeronautical, computer and electrical engineering. The top employers for the 57 percent of graduates that enter the workforce immediately include companies like Google, Amazon, Goldman Sachs and ExxonMobil. Another 32 percent of students, however, go on to pursue a higher degree.


3. California Institute of Technology


Located in Pasadena, Calif., the California Institute of Technology has a graduation rate of 93 percent. The average cost of a degree is $186,122, and students earn an average starting salary of $72,300. Caltech, as it's often called, has departments in aerospace, applied physics and materials studies, computing and mathematical sciences, electrical engineering, environmental science and engineering, mechanical and civil engineering, and medical engineering. The prestigious college is also home to 31 recipients of the Nobel Prize.


4. Harvey Mudd College


Harvey Mudd College in Claremont, Calif. has a strong technology program, putting it at number 4 on the list of top technology schools. The cost of tuition is also one of the highest on this list, at $196,551 for a degree. Graduates of Harvey Mudd earn an average of $76,400 early on in their careers and the graduation rate is 91 percent. The engineering program at Harvey Mudd College focuses on helping students apply their skills to real world situations. Students can also get professional experience and help solve design problems outside of the classroom through an engineering clinic.


5. Harvard University


Harvard University, located in Cambridge, Mass., technically ties with Harvey Mudd for top technology schools, and top overall colleges. The graduation rate is 97 percent and the average price of a degree is $187,763, while graduates earn an average annual salary of $60,000 when starting their careers. At Harvard's John A. Paulson School of Engineering and Applied Sciences, which traces its roots back to 1847, undergraduate students can study applied mathematics, biomedical engineering, computer science, electrical engineering, engineering sciences and mechanical engineering.


6. University of California at Berkeley


The University of California at Berkeley has a graduation rate of 91 percent, and students can get a degree for around $133,549. After graduation, the average salary for students starting out their careers is $58,300 per year. The electrical engineering and computer science division of the University of California at Berkeley has around 2,000 undergraduate students and is the largest department within the university.


7. University of Pennsylvania


The University of Pennsylvania, located in Philadelphia, Penn., has a graduation rate of 96 percent and the average cost of a degree is $194,148. Students graduating from UPenn and starting out their careers earn an average annual salary of $59,200. The UPenn engineering department focuses on computer and information science. Students can study computer science, computer engineering, digital media design, networked and social systems engineering, computational biology as well as computer and cognitive science.


8. Rice University


Located in Houston, Rice University has a graduation rate of 91 percent and the average cost of a degree is $157,824. Upon graduation, the average starting salary for students comes in at $61,200 per year. Rice University's Department of Computer Science lets students work in faculty research programs and describes the perfect computer science student as a "mathematician seeking adventure," a quote from systems architect Bob Barton. In the electrical and computer engineering department, students can prepare for careers in oil and gas, wearables, entertainment, renewable energy, gaming, healthcare, the space industry, security and aviation.


9. Brigham Young University-Provo


Brigham Young University-Provo, located in Provo, Utah, has a graduation rate of 78 percent, but students won't take on as many loans as at other colleges on this list. The average price of a degree is a moderate $80,988 and the average starting salary for graduates is around $51,600 per year. Brigham Young University-Provo offers degrees in electrical engineering, computer engineering and computer science. With a wide array of programs to choose from in each degree, Brigham Young University-Provo boasts a rigorous course load with an emphasis on gaining practical skills for the workforce.


10. Texas A&M University


College Station, Texas is home to Texas A&M University where 79 percent of students graduate and the average cost of a degree is $84,732. Students can expect to earn an average starting salary of $54,000 per year after graduation. The Texas A&M computer science and engineering program boasts an "open, accepting, and compassionate community that encourages the exploration of ideas." Students should expect to leave the program prepared to help solve real-world challenges in the technology industry through applied research.










Oracle, still clueless about security

Oracle's CSO has some wrongheaded notions about her area of expertise. What is the company doing about that?


Oracle’s chief security officer, Mary Ann Davidson, recently ticked off almost everyone in the security business. She proclaimed that you had to keep security “expertise in-house because security is a core element of software development and you cannot outsource it.” She continued, “Whom do you think is more trustworthy? Who has a greater incentive to do the job right — someone who builds something, or someone who builds FUD around what others build?”

What she said in 2015 was that security reports based on reverse-engineering Oracle code and then applying static or dynamic analysis to it do not lead to “proof of an actual vulnerability. Often, they are not much more than a pile of steaming … FUD.”

Davidson’s blog post is one long rant that boils down to, “How dare people analyze Oracle code?” “I have seen a large-ish uptick in customers reverse engineering our code to attempt to find security vulnerabilities in it. <Insert big sigh here.> This is why I’ve been writing a lot of letters to customers that start with “hi, howzit, aloha” but end with ‘please comply with your license agreement and stop reverse engineering our code, already.’”

Because God forbid someone should find a security hole!

Oracle backed away from Davidson’s position in less than 24 hours. “We removed the post as it does not reflect our beliefs or our relationship with our customers,” wrote Edward Screven, Oracle executive vice president and chief corporate architect.

But Oracle has not taken down Davidson’s 2011 rant, nor others. For example, in an earlier 2015 post, Davidson described security researchers outside Oracle’s Unbreakable walls as little more than greedy brats crying for attention:

A researcher first finds a vulnerability in a widely-used library: the more widely-used, the better … Next, the researcher comes up with a catchy name. You get extra points for it being an acronym for the nature of the vulnerability, such as SUCKS—Security Undermining of Critical Key Systems. Then, you put up a website (more points for a cute animated creature dancing around and singing the SUCKS song). Add links so visitors can Order the T-shirt, Download the App, and Get a Free Bumper Sticker! Get a hash tag. Develop a Facebook page and ask your friends to Like your vulnerability. (I might be exaggerating, but not by much.) Now, sit back and wait for the uninformed public to regurgitate the headlines about “New Vulnerability SUCKS!” If you are a security researcher who dreamed up all the above, start planning your speaking engagements on how the world as we know it will end, because (wait for it), “Everything SUCKS.”

This is so much horse-pucky.

Yes, people want to make money and gain fame by finding and revealing security holes. Is that such a bad thing? It’s certainly better than, say, finding a security hole and then exploiting it, isn’t it? I think so.

Davidson also seems stuck in the dark ages of security. She believes in security by obscurity.

In 2012, for example, Davidson lambasted the Payment Card Industry (PCI) Security Standards Council for requiring “vendors to disclose (dare we say ‘tell all?’) to PCI any known security vulnerabilities and associated security breaches.” Or, as she put it more succinctly, “tell your customers that you have to rat them out to PCI.”

She added, just to make it perfectly clear where she’s coming from, that information on security vulnerabilities at Oracle is on a “need to know” basis.

Perhaps Davidson’s extreme reactionary stance comes from the fact that David Litchfield, the famed U.K. security expert, has made a career of hacking Oracle database software. Back in 2005, Litchfield, who reverse-engineers Oracle code to find its vulnerabilities, said, “It is my belief that the CSO [Davidson] has categorically failed. Oracle security has stagnated under her leadership and it’s time for change.”

Ten years later, people like Davidson who believe that keeping code closed and proprietary is a good thing have grown far fewer in number. Even Microsoft has gotten the open-source message.

Who loves Linux? Microsoft CEO Satya Nadella loves Linux.

Oracle with Linux and MySQL gets open source too. But Davidson? Not so much.

One of open source’s tenets is Linus’s Law: “Given enough eyeballs, all bugs are shallow.” Davidson, with her naked contempt for anyone who examines Oracle’s code, appears to be out of step with Oracle and the open-source method.

Or, is she?

It’s not as if Davidson is saying anything new. She’s been making juvenile attacks — I mean what’s a chief anything officer doing saying “suck” over and over again? — for years now. She’s been Oracle’s CSO for 15 years, and Oracle still lets her babble to the public without any control. Larry Ellison, if no one else, clearly thinks she’s doing a great job.

I don’t pretend to understand what’s going on inside Oracle. People at Oracle who talk to reporters don’t tend to keep their jobs for very long.

From the outside looking in, I see a company that both embraces and rejects the open-source method. That second part is not healthy for its products’ security. And, in the long run, it’s not healthy for Oracle’s future as a company.

Back in 2006, Davidson said her “goal is to be out of a job.” Maybe it’s time for Oracle to take her up on that offer.

Saturday 22 August 2015

Google Is Millions Of Miles Ahead Of Apple In Driverless Cars

Last week’s “exclusive” in the Guardian claiming to “confirm Apple is building self-driving car” raised quite a buzz. Much of that buzz was skeptical, with many pointing out that the facts failed to support the Guardian’s conclusion.

The logical leap that the Guardian made was that an Apple engineer’s interest in the GoMentum Station vehicle test track confirmed Apple’s driverless car program. This is too big a leap, as a range of Apple car-related aspirations—self-driving or not—might have use for such a test track.


Let’s assume, however, that the Guardian is right and Apple does have a driverless car ready for testing. (This is possible, as Apple has hired many automotive engineers, including the former CEO of Mercedes Benz’s Silicon Valley research center.) What would that say about the relative state of Apple’s driverless car?

It would tell us that Apple is millions of miles behind Google, and falling further behind every day.

As one of the few companies in the world richer than Google, Apple can match the cars, sensors, processors, navigational systems and other pieces of hardware that Google might deploy. It can replicate the sophisticated maps that Google has compiled. It will have a very hard time, however, catching up with Google’s on-the-road learning.

Actual road miles are critical because driverless cars learn to drive like humans do—through experience. Sophisticated hardware tells the car where it is and what the other things around it are doing. The actual driving, including identifying those other things, predicting what they might do, and handling all possible situations to safely get the car to where it needs to go, depends on very sophisticated machine learning algorithms. Those algorithms, in turn, depend on analyzing real-world driving situations.

Microsoft fires back at Google with Bing contextual search on Android



'Snapshots on Tap' echoes a feature coming with the next version of Android

Microsoft has pre-empted a feature Google plans to include in the next version of Android: an update released Thursday for the Bing Search app lets users get information about whatever they're looking at by pressing and holding their device's home button.

Called Bing Snapshots, the feature is incredibly similar to the Now on Tap functionality Google announced for Android Marshmallow at its I/O developer conference earlier this year. Bing will look over a user's screen when they call up a Snapshot and then provide them with relevant information along with links they can use to take action like finding hotels at a travel destination.



For example, someone watching a movie trailer can press and hold on their device's home button and pull up a Bing Snapshot that will give them easy access to reviews of the film in question, along with a link that lets them buy tickets through Fandango. 

Google Now On Tap, which is slated for release with Android Marshmallow later this year, will offer similar features with a user interface that would appear to take up less screen real estate right off the bat, at least in the early incarnations Google showed off at I/O.



The new functionality highlights one of the major differences between Android and iOS: on Android, Microsoft can replace system functionality originally controlled by Google Now and use it to push its own search engine and virtual assistant. Microsoft is currently beta testing a version of its virtual assistant Cortana on Android for release later this year as well.

A Cortana app is also in the cards for iOS, but Apple almost certainly won't allow a virtual assistant like Cortana to take over system capabilities there, especially since Google Now remains quarantined inside the Google app on that mobile platform.

All of this comes as those three companies remain locked in a tight battle to out-innovate one another in the virtual assistant market as a means of controlling how users pull up information across their computers and mobile devices. For Microsoft and Google, there's an additional incentive behind the improvements: driving users to their respective assistants has the potential to boost use of the connected search engines.

Friday 21 August 2015

PHP 7 drops first release candidate


The release candidate for the speedy PHP upgrade features bug fixes and stability improvements, but it cannot be used in production yet

Faster PHP is approaching. PHP 7.0.0, which has been promoted as a much quicker upgrade to the server-side scripting language, has just gone into the release candidate stage, bringing its general availability a step closer to fruition.

Available today, the release candidate is the sixth pre-release of the PHP 7 major series, according to PHP.net.  Once again, PHP proponents are advising that this latest release is not to be used in production, as it's just a development preview.

"PHP 7.0.0 RC 1 contains fixes for 27 reported bugs and altogether over 200 commits with various stability improvements for database, array, assert, streams, and other functionality," PHP.net said in its bulletin. 

Featuring a new version of the Zend Engine, version 7.0.0 is up to twice as fast as PHP 5.6, and it offers consistent 64-bit support. Also featured are return type and scalar type declarations and anonymous classes, as shown in the sketch below. Many fatal errors become exceptions in the upgrade, and old, unsupported SAPIs and extensions have been removed. In a recent presentation, PHP founder Rasmus Lerdorf emphasized that the upgrade would require fewer servers as well as offer performance gains for "real-world" applications.
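
For illustration, here's a tiny sample of those headline features -- scalar parameter types, return type declarations, and anonymous classes -- as they look in PHP 7.0. This is a hedged sketch, not code from the release itself:

```php
<?php
declare(strict_types=1);  // opt in to strict scalar type checking (new in PHP 7)

// Scalar parameter types and a return type declaration, both new in PHP 7.
function addPoints(int $a, int $b): int
{
    return $a + $b;
}

// Anonymous classes, also new in PHP 7.
$logger = new class {
    public function log(string $msg)
    {
        echo $msg, PHP_EOL;
    }
};

$logger->log("total: " . addPoints(2, 3));
```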

PHP builders are asked to test the version and report incompatibilities in the project's bug tracking system. A second release candidate is planned for September 3, and the final version of PHP 7.0.0 is set for release on November 12.


Thursday 20 August 2015

Silicon Valley's 'pressure cooker:' Thrive or get out

Spotlight may be on Amazon, but tech jobs are high profit and high stress

It's true. People working in Silicon Valley may cry at their desks, may be expected to respond to emails in the middle of the night, and may be in the office when they'd rather be sick in bed.
But that's the price employees pay to work for some of the most successful and innovative tech companies in the world, according to industry analysts.
"It's a pressure cooker for tech workers," said Bill Reynolds, research director for Foote Partners LLC, an IT workforce research firm. "But for every disgruntled employee, someone will tell you it's fine. This is the ticket to working in this area and they're willing to pay it."
 The tech industry has been like this for years, he added.
Employees are Type A personalities who thrive on the pressure, would rather focus on a project than get a full night's sleep and don't mind pushing or being pushed.
If that's not who they are, they should get another job, probably in another industry.
"A lot of tech companies failed, and the ones that made it, made it based on a driven culture. No one made it working 9 to 5," said John Challenger, CEO of Challenger, Gray & Christmas, an executive outplacement firm. "Silicon Valley has been the vanguard of this type of work culture. It can get out of control. It can be too much and people can burn out. But it's who these companies are."
Work culture at tech companies, specifically at Amazon, hit the spotlight earlier this week when the New York Times ran a story on the online retailer and what it called its "bruising workplace."
The story talked about employees crying at their desks, working 80-plus-hour weeks and being expected to work when they're not well or after a family tragedy.
"At Amazon, workers are encouraged to tear apart one another's ideas in meetings, toil long and late (emails arrive past midnight, followed by text messages asking why they were not answered), and held to standards that the company boasts are "unreasonably high," the article noted.
In response, Amazon.com CEO Jeff Bezos sent a memo to employees saying he didn't recognize the company described in the Times article.
"The article doesn't describe the Amazon I know or the caring Amazonians I work with every day," Bezos wrote. "More broadly, I don't think any company adopting the approach portrayed could survive, much less thrive, in today's highly competitive tech hiring market."
Bezos hasn't been the only one at Amazon to respond. Nick Ciubotariu, head of Infrastructure Development at Amazon.com, wrote a piece on LinkedIn taking on the Times article.
"During my 18 months at Amazon, I've never worked a single weekend when I didn't want to. No one tells me to work nights," he wrote. "We work hard, and have fun. We have Nerf wars, almost daily, that often get a bit out of hand. We go out after work. We have 'Fun Fridays.' We banter, argue, play video games and Foosball. And we're vocal about our employee happiness."

Working for the big players

Amazon has high expectations of its workers because it's one of the largest and most successful companies in the world, according to industry analysts.
The company, which started as an online book store, now sells everything from cosmetics to bicycles and toasters. With a valuation of $250 billion, Amazon even surpassed mega retailer Walmart this summer as the biggest retailer in the U.S.
With that kind of success comes a lot of pressure to stay on top and to come up with new, innovative ways to keep customers happy.
That kind of challenge can lead to a stressful workplace where employees are called on to work long hours and to outwork competitors' own employees.
It's just the way of the beast, according to Victor Janulaitis, CEO of Janco Associates Inc., a management consulting firm.

"If you go to work for a high-powered company where you have a chance of being a millionaire in a few years, you are going to work 70 to 80 hours a week," he said. "You are going to have to be right all the time and you are going to be under a lot of stress. Your regular Joe is really going to struggle there."
This kind of work stress isn't relegated to Amazon alone. Far from it, Janulaitis said.
"I think it's fairly widespread in any tech company that is successful," he noted. "It's just a very stressful environment. You're dealing with a lot of money and a lot of Type A personalities who want to get things done. If you're not a certain type of person, you're not going to make it. It's much like the Wild West. They have their own rules."
Of course, tech companies, whether Amazon, Google, Apple or Facebook, are known to work people hard, going back to the days when IBM was launching its first PCs and Microsoft was making its Office software ubiquitous around the world.
However, tech companies also are known for giving their employees perks that people working in other industries only dream of.
Google, for instance, has world-class chefs cooking free food for its employees, while also setting up nap pods, meditation classes and sandy volleyball courts.
Netflix recently made global headlines for offering mothers and fathers unlimited time off for up to a year after the birth or adoption of a child.
It's the yin and yang of Silicon Valley, said Megan Slabinski, district president of Robert Half Technology, a human resources consulting firm.
"All those perks - the ping pong tables, the free snacks, the free day care -- that started in the tech industry come with the job because the job is so demanding," she said. "There's a level of demand in the tech industry that translates to the work environment."
When asked if Amazon is any harder on its employees than other major tech companies, Slabinski laughed.
"Amazon isn't different culturally from other IT companies," she said. "I've been doing this for 16 years. You see the good, the bad and the ugly. If you are working for tech companies, the expectation is you are going to work really hard. This is bleeding-edge technology, and the trade-off is there's less work-life balance. The people who thrive in this industry, thrive on being on the bleeding edge. If you can't take it, you go into another industry."
Janulaitis noted that top-tier employees are always chased by other companies, but middle-tier workers -- those who are doing a good job but might not be the brightest stars of the workforce -- are hunkering down and staying put.
Fears of a still jittery job market have convinced a lot of people to keep their heads down, put up with whatever their managers ask of them and continue to be able to pay their mortgages, especially if they live in pricey Silicon Valley.
That, said Janulaitis, makes companies more apt to ask even more from their employees, who know they're likely stuck where they are for now.
"Once the job market changes, turnover will increase significantly in the IT field," he said.
Like stock traders working under extreme pressure on Wall Street or medical interns working 36-hour shifts, the tech industry is a high-stress environment -- one that's not suited to every worker.
"If you can't live with that pressure, you should go somewhere else," said Reynolds. "For people in Silicon Valley, it's who they are. It's the kind of person they are."