Tuesday, 29 April 2014

Finland loses Nokia: handset business now completely Microsoft property as $7.2B acquisition completes

Finland loses Nokia's handset business, and its employees, to Microsoft as the transaction completes.

Microsoft today officially closed its $7.2 billion acquisition of Nokia's handset business, welcoming approximately 30,000 new employees to its rolls.

As expected, the new unit has been named "Microsoft Mobile Oy," and will act as a subsidiary. "Oy" is the Finnish equivalent to "Ltd." or "LLC," a limited liability company in the U.K. and the U.S., and to "GmbH" in Germany.

Microsoft Mobile will be headquartered in Espoo, Finland.

Also today, Microsoft formally appointed former Nokia CEO Stephen Elop as the new head of Microsoft's Devices Group. That group, Microsoft said, will add Lumia smartphones and tablets, as well as Nokia mobile phones, to its existing portfolio, which already included Microsoft's Surface tablets, its Xbox gaming console, and hardware accessories such as keyboards and mice.

Before taking the CEO role at Nokia, Elop led Microsoft's Office group.

The Microsoft Mobile subsidiary will develop, manufacture and distribute Lumia, Asha and Nokia X mobile phones and other devices, Microsoft said, indicating that the Redmond, Wash. technology firm will continue to use the brands. The "Nokia Lumia" brand will be shortened to simply "Lumia;" the "Asha" brand has been and will continue to be used for the less-advanced "feature" phones.

"The mobile capabilities and assets ... will advance our transformation," said Microsoft CEO Satya Nadella in a statement today. "Together with our partners, we remain focused on delivering innovation more rapidly in our mobile-first, cloud-first world."

That phrase -- "mobile-first, cloud-first" -- was coined by Nadella on his first day as CEO and has become Microsoft's latest mantra, replacing former CEO Steve Ballmer's "devices and services" label as the company's chosen direction at a time when the PC business is contracting and Microsoft finds itself far behind Google and Apple in the mobile market.

As part of the finalized deal, Microsoft will honor all existing Nokia customer warranties for currently-owned devices, the company said.

Last week, during its March-quarter earnings call, Microsoft excluded the financial impact of the Nokia acquisition from the guidance it gave Wall Street analysts for the June quarter. Instead, it will disclose the financial impact, including one-time integration and severance costs, in July.

Microsoft said its omission of the acquisition costs in forecasts for the June quarter was caused by its inability to access Nokia's data before the deal officially wrapped.

"The reality is we've not had the type of access until close where we could confidently begin to give the type of guidance that I believe we have come and you have come to expect from us in terms of the depth and analysis required to get there," said Amy Hood, Microsoft's CFO, during the earnings call Thursday.

Microsoft announced the acquisition of Nokia's devices arm -- and an associated patent deal -- on Sept. 2, 2013, when Ballmer was still chief executive. At the time, Microsoft said it would pay approximately $5 billion for "substantially all" of Nokia's Devices & Services business and $2.2 billion to license a broad portfolio of Nokia's patents.

Today, Nokia said that the closing price will be slightly higher than the original amount, but did not say by how much.

Let's wish Microsoft well for the future and hope it becomes a world leader again.






Sunday, 20 April 2014

Red-hot IT jobs

Dice.com released its Tech Trends Q1 2014 report, and while the numbers are excellent across IT as a whole, the quarter was especially lucrative for tech consultants.

Most tech jobs gained and lost
Computer systems design pros in the Professional and Business Services sector saw the biggest gain in jobs for Q1, adding 17,200 from Q4 2013.

The news was not as good for those in computer and electronic products manufacturing, which lost 2,900 jobs since Q4. Data processing and hosting jobs also took a hit, losing the second-most: 1,600 since Q4.

Tech unemployment rate by job
Tech Trends broke down unemployment by position, unsurprisingly finding Web developers the most employed, with a minuscule jobless rate.
0.7%: Web developers
0.8%: Computer systems analysts
0.8%: Network architects
2.3%: Computer support specialists
2.6%: Programmers
2.7%: Database administrators
2.8%: Software developers
3.0%: Computer and information systems managers
3.2%: Network and systems administrators

Tech unemployment drops to recovery low
Overall tech unemployment dropped to 2.7% in Q1 2014, a recovery low and a full four percentage points below the national unemployment rate for the quarter.

Here’s how the 2.7% rate compares to 2013:
Q1: 3.5%
Q2: 3.6%
Q3: 3.9%
Q4: 3.5%
Q1’s 2.7% is still higher than the all-time low of 1.8% in Q2 2007.

New consultant jobs skyrocket
The first quarter saw 17,200 new jobs, bringing the tech consulting workforce to more than 1.7 million.

“A survey of those responsible for hiring consultants, conducted by Source Information Services, found nearly all plan technology improvements this year, and most will use consultants to help,” notes Dice’s Mark Feffer. “Half will spend more on technology consultants than they did during 2013 and of those, half plan an increase of more than 10%.”

Consultant demand expected to increase
Surveys suggest consultant spending will continue to rise, especially in verticals such as:
Finance: New regulatory requirements and the popularity of online banking will drive demand.
Retail: The desire for more online offerings and “omnichannel” undertakings (seamless experience between brick-and-mortar and online) will spur more consultant hours.
Pharmaceuticals: A whopping 60% of pharma decision-makers say they plan to increase their consultant budget, but due to the sector’s small size, “the actual number of opportunities will be modest.”

Hours worked hit record high
There are thousands of open jobs in IT, which means organizations are turning to consultants to fill gaps or skill-set shortcomings until permanent hires can be made. This translates into consultants working nearly full-time hours: an average of 38.8 per week in February. Notes Dice’s Nick Kolakowski: “And given how that’s an average, it’s certain that many consultants are working far longer in order to keep their clients happy.”

Hourly rate hits all-time high
According to the Bureau of Labor Statistics, the average hourly wage for tech consultants reached $42.17 in February 2014, an all-time high. By comparison, the average hourly rate was $36-$37 in 2006 and has risen steadily since. Dice attributes the jump "in large part to growth in technology segments such as mobility and the cloud."

It’s a hat trick for tech consultants, says Dice President Shravan Goli: more jobs, higher wages and more hours. The good news just keeps on coming for the for-hire set, which saw a 4% pay increase last year, outdistancing the 3% average seen across the overall tech industry, according to the 2014 Dice Salary Survey.




Sunday, 6 April 2014

The mainframe turns 50, or, why the IBM System/360 launch was the dawn of enterprise IT

In 1964, mainframes weren't new, but the System/360 revolutionized the computer industry

In many ways, the modern computer era began in the New Englander Motor Hotel in Greenwich, Connecticut.

It was there in 1961 that a task force of top IBM engineers met in secret to figure out how to build the next-generation IBM computer.

A new design was sorely needed. IBM already sold a number of successful though entirely separate computer lines, but they were becoming increasingly difficult to maintain and update.


"IBM in a sense was collapsing under the weight of having to support these multiple incompatible product lines," said Dag Spicer, chief content officer for the Computer History Museum, which maintains a digital archive on the creation and success of the System/360.

Fifty years ago on April 7, IBM announced the computer that the task force had designed, the System/360.

The system eventually became a huge success for the company -- and a good thing too. IBM's president at the time, Tom Watson, Jr., killed off other IBM computer lines and put the company's full force behind the System/360. IBM's revenue swelled to $8.3 billion by 1971, up from $3.6 billion in 1965. Through the 1970s, more than 70 percent of mainframes sold were IBM's. By 1982, more than half of IBM's revenue came from descendants of the System/360.

But its impact can be measured by more than just the success it brought to IBM.

"IBM was where everyone wanted to work," said Margaret McCoey, an assistant professor of computer science at La Salle University in Philadelphia, who also debugged operating system code for Sperry/Univac System/360 clones in the late 1970s.

The System/360 ushered in a whole new way of thinking about designing and building computer systems, a perspective that seems so fundamental to us today that we may not realize it was rather radical 50 years ago.

Before the System/360 introduction, manufacturers built each new computer model from scratch. Sometimes machines were even built individually for each customer. Software designed to run on one machine would not work on other machines, even from the same manufacturer. The operating system for each computer had to be built from scratch as well.

The idea hatched at the Connecticut hotel was to have a unified family of computers, all under a single architecture.

Gene Amdahl was the chief architect for the system and Fred Brooks was the project leader.

Amdahl would later formulate Amdahl's Law, which, roughly stated, holds that the speedup gained by splitting a computation across parallel processors is limited by the portion of the work that must still run serially. And Brooks would go on to write "The Mythical Man-Month," which made a kindred argument: adding more people to a software development project can actually slow the software's development, due to the additional burden of managing the extra people.
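In its usual form, with p the fraction of the work that can be parallelized and N the number of processors, the law caps the overall speedup S at:

S(N) = \frac{1}{(1 - p) + \frac{p}{N}}, \qquad \lim_{N \to \infty} S(N) = \frac{1}{1 - p}

so even with unlimited processors the serial portion sets a hard ceiling: a task that is 90 percent parallelizable can never run more than 10 times faster.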

The idea they came up with was to have a common architecture shared among the lower-end, less expensive machines and the priciest high-speed models. The top-end models would perform 40 times as fast as the low-end models. Keep in mind that applying the word "architecture" to the design of a computer was all but unheard of in the early 1960s.

But specifying an architecture, rather than a specific implementation, paved the way for compatibility amongst different models.

"In designing a system with both upward and downward compatibility for both scientific and business customers, IBM was attempting to use a single architecture to meet the needs of an unprecedentedly large segment of its customers," according to a 1987 case study of the System/360 published by the Association for Computing Machinery. In fact, the "360" in the moniker was meant to indicate that the machine could serve all kinds of customers, small or large, business or scientific.

"The System/360 actually unified both business and computing threads in the market into one system," Spicer said.

While the idea seems obvious today, the concept of a unified family of computers had profound consequences for IBM, its customers and the industry as a whole.

IBM was able to use a single OS for all of its computers (though it ended up creating three variants to span the different use cases). A lot of work writing software for separate computers was eliminated, allowing engineers to concentrate on new applications instead.

IBM saved a lot of resources on the hardware as well. No longer would components, such as processors and memory, need to be designed for each machine. Now different models could share general-purpose components, allowing IBM to enjoy more economies of scale.

Customers benefited as well. They could take code written for one System/360 machine and run it on another. Not only could users move their System/360 code to a bigger machine without rewriting it, but they could port it to a smaller model as well.

When an organization bought a new computer in the early 1960s it "generally had to throw out all of its software, or at least rejigger it to work on the new hardware," Spicer said. "There was no idea of having computers that could run compatible software over the generations."

IBM has steadfastly maintained backward compatibility in the decades since. Programs for the original System/360s can still run, sometimes with only slight modification, on IBM mainframes today (which is not to say IBM hasn't aggressively urged customers to upgrade to the latest models for performance improvements).

Compare that longevity to one of IBM's largest competitors in the software market. This month, Microsoft ends support for its Windows XP OS a mere dozen years after its release.

The System/360 and its successor System/370 continued to sell well into the 1970s, as punch cards were slowly replaced by IBM 3270 terminals, known as green screens.

Green screens changed the way the System/360 and /370 could be used. Originally, they did batch processing, where a job was submitted via punch cards. The machine would churn through the data and return the results. The green screens paved the way for more interactive sessions with a machine, noted Greg Beedy, senior principal product manager at CA and a 45-year veteran of working on mainframes.

Beedy noted that the 3270 terminals were always 80 columns wide -- equal to the number of columns on a punch card.

Even after the introduction of terminals, programming was a much more tedious job back in the 1970s; today, programmers have "instant gratification," McCoey said.

"They hit the return key and up pops an answer. That never happened. We would put together a new unit and leave it for the overnight operators to run," she said. "It would take them about 10 hours to run the test and they'd spend two or three hours to make sure everything ran correctly."

Debugging back then involved reviewing a stack of papers with nothing but hexadecimal code. McCoey would have to transcribe the code back into the routines the original programmer had devised, and then try to locate the logical error in the code.

"For me, that was a lot of fun. It was like a puzzle," McCoey said.

The programming world was smaller then as well. Beedy started working on System/360 and similar systems in the mid-1970s, writing COBOL code for insurance companies.

"Back then, it was a tiny cult of people. Everybody knew each other, but the rest of the world didn't know what we did. It was very arcane and obscure," Beedy said. "Even the word 'software' was not that well-known. I told people I worked for a software company, they looked at me like I was crazy."

Pat Toole, Sr., one of the original System/360 engineers and later an IBM division president, observed that there were no commercial enterprise software companies, such as an SAP or Oracle, in the mainframe days. IBM supplied a few standard programs for banks, but customers also wrote their own software and it was a big undertaking.

"It was a big deal if a company spent a fortune and two or three years to write a program for a banking application, then your hardware wouldn't run it and you basically had to redo it all," Toole said.

McCoey recalled how an insurance company she worked for after leaving Sperry would run the billing program for the Wanamaker's department store on its mainframe.

"Twice a month, they would shut down their billing over the weekend and run all Wanamaker's accounts, so Wanamaker's didn't have to employ their own IT department," McCoey said.

Nonetheless businesses saw the value of System/360 and other mainframes.

"They not only allowed business to operate faster and gain competitive advantage but allowed them to have a lot more flexibility in their products and services. Instead of just having one standard product, you could have all these different pricing schemes," Beedy said.

It was not until the emergence of lower-cost mini-computers in the late 1970s that IBM's dominance in computer platforms started to fade, though the company caught the next wave of computers, PCs and servers, in the following decade. IBM has also managed to keep its mainframe business percolating.

Organizations continued to use mainframes for their core operations, if for no other reason than that the cost of porting or rewriting their applications to run on other platforms would dwarf any savings they might enjoy from less expensive hardware, Spicer said.

"IBM, while it has been a big lumbering company at times, has adapted well over the years to keep the mainframe relevant. They managed to bring price/performance down to where mainframe computing stayed viable," Beedy said. "Many times it has been announced that the mainframe was dead, replaced by minicomputers or servers, but what we've seen all along [is that these new technologies] extend what is already there, with the mainframe as the backbone."

(IDG News Service editor James Niccolai contributed to this report.)


Saturday, 5 April 2014

Build: Microsoft Azure embraces outside technologies

Microsoft has open-sourced its new C# compiler and Azure now incorporates the open source Chef and Puppet configuration managers

As it rolled out tools and features for coders at its Build developer conference Thursday, Microsoft showed that it is ready to embrace technologies and platforms not invented within its walls.

Rather than relying solely on internal tools, the Azure cloud services platform has incorporated a number of non-Microsoft technologies, including popular open source tools such as the Chef and Puppet configuration management software, the OAuth authorization standard, and the Hadoop data processing platform.

The company has also taken steps to incorporate open source into its product roadmaps, by releasing the code for its new compiler and setting up a foundation for managing open source .Net projects.

"Clearly Microsoft's message is its support of multi-platform. It will take any part of your stack, it doesn't have to be just Microsoft software," said Al Hilwa, IDC research program director for software development. "This is good for Microsoft and good for the ecosystem."

Microsoft's Azure strategy is to "enable developers to use the best of Windows ecosystem and the best of the Linux ecosystem together ... and one that enables you to build great applications and services that work on every device," Scott Guthrie, Microsoft's new executive vice president overseeing the cloud and enterprise group, told the audience of developers and IT professionals.

On the developer side, the company announced that it has open-sourced its next generation compiler for C# and Visual Basic, code-named Roslyn.

To date, compilers have been "black boxes," explained C# lead architect Anders Hejlsberg.

Roslyn is unique as a compiler because it has a set of APIs (application programming interfaces) that can feed information about a project, as it is being compiled, to Microsoft's Visual Studio IDE (integrated development environment) and to third-party development tools.

Hejlsberg demonstrated how Visual Studio can offer helpful tips through an "interactive prompt," using feedback from the compiler. For instance, it can flag libraries that have been called but not used in the program code.

Microsoft is hoping that other vendors will incorporate the API into their software development tools. Developers can also now add their own features into C# and have the compiler recognize them. Open-sourcing the compiler may also lead to efforts to create versions of C# for other platforms.
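To give a flavor of what those APIs make possible, here is a minimal sketch of a tool built on Roslyn's syntax API, assuming a small console project that references the Microsoft.CodeAnalysis.CSharp package; the source snippet it inspects is purely illustrative.

using System;
using System.Linq;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.CSharp;
using Microsoft.CodeAnalysis.CSharp.Syntax;

class RoslynSketch
{
    static void Main()
    {
        // Parse a small piece of C# source into a syntax tree -- the same data
        // structure the compiler itself works with.
        string source = @"
using System;
using System.Text;   // note: never referenced below

class Hello
{
    static void Main()
    {
        Console.WriteLine(""Hello from Roslyn"");
    }
}";
        SyntaxTree tree = CSharpSyntaxTree.ParseText(source);
        SyntaxNode root = tree.GetRoot();

        // The tree is an ordinary object model a tool can walk: list every using directive.
        foreach (UsingDirectiveSyntax u in root.DescendantNodes().OfType<UsingDirectiveSyntax>())
        {
            Console.WriteLine("using directive found: " + u.Name);
        }

        // Any syntax errors the compiler would report are available to the tool as well.
        foreach (Diagnostic d in tree.GetDiagnostics())
        {
            Console.WriteLine(d);
        }
    }
}

A richer tool could bind the tree against a full compilation to flag, for example, using directives that are never needed, which is roughly the kind of analysis behind the Visual Studio hints Hejlsberg demonstrated.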

The company also released the Visual Studio 2013 Update 2 Release Candidate.

One new capability allows for two-way communication between the Visual Studio IDE and browsers.

Typically, when developers write code for a Web application in Visual Studio, they can check to see if it runs correctly by running it in a browser.

Now, using a technology known as Browser Link, developers can edit the source code directly in the browser. Browser Link will write the changes back to the source code file in Visual Studio. If a file such as a related style sheet is not open, Visual Studio can open the file and make the change as well.

Browser Link works on "any open browser," in Microsoft's words; the company named Google Chrome and Firefox, in addition to Internet Explorer.

In addition to open-sourcing C#, Microsoft has also started an organization, called the .Net Foundation, to manage additional open source .Net projects from Microsoft and others.

The company also announced the general availability of Visual Studio Online, a hosted version of the IDE that works within Azure and is incorporated into Microsoft Team Foundation Service to enable rapid DevOps-styled development.

On the cloud side of operations, Azure has incorporated two of the industry's leading open source configuration management tools, Chef and Puppet. Users can deploy these technologies to quickly boot up, configure or reconfigure large numbers of virtual machines.

Microsoft has also redesigned the Azure portal, giving it a much more flexible interface. It builds on the Windows Tile design, allowing users to add their own tiles that can display live information, such as metrics of how well the user's operations are performing. One tile even keeps a tally of the bill that the user has accumulated in the current billing cycle, which should help eliminate any surprises when the monthly payment comes due, Guthrie noted.

Guthrie touted a wide range of other Azure improvements and new features as well.

Azure now offers staging support. This feature allows a Web developer to set up a working copy of an application that is about to go live in a full production setting, for final testing. This eliminates the need to do the final test on the live production version of the application.

Also new with Azure is Traffic Manager, a service that can route application requests to the copy of a distributed application that is closest to the requester's geographic origin, potentially lowering latency for users.

Microsoft has taken further steps in integrating its Active Directory (AD) directory services into Azure.

Now enterprises can use their AD directories to authenticate mobile users, providing a single sign-on option that lets employees and partners use the same password for both desktop and mobile device access to an organization's resources.

This AD support has also been incorporated into Microsoft's Office 365 hosted service.
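For a rough sense of what that sign-on looks like on the wire, below is a minimal sketch of an OAuth 2.0 token request against an Azure AD tenant. The tenant name, application ID, secret and resource URI are placeholders, and an interactive mobile sign-on would normally use an authorization-code flow rather than this service-to-service client-credentials grant.

using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;

class TokenRequestSketch
{
    static async Task Main()
    {
        string tenant = "contoso.onmicrosoft.com";   // placeholder tenant

        using (var http = new HttpClient())
        {
            // Standard OAuth 2.0 client-credentials form fields; all values are placeholders.
            var form = new FormUrlEncodedContent(new Dictionary<string, string>
            {
                { "grant_type",    "client_credentials" },
                { "client_id",     "00000000-0000-0000-0000-000000000000" },   // placeholder app ID
                { "client_secret", "app-secret-goes-here" },                   // placeholder secret
                { "resource",      "https://contoso.example.com/billing-api" } // placeholder resource
            });

            HttpResponseMessage response = await http.PostAsync(
                "https://login.microsoftonline.com/" + tenant + "/oauth2/token", form);

            // On success the JSON body contains an access_token that the client
            // presents to the protected API on each call.
            Console.WriteLine(await response.Content.ReadAsStringAsync());
        }
    }
}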

On the data side, Azure's SQL Server service now offers more space and a higher promised service level agreement. Users now can store up to 500GB of data, rather than 150GB. Microsoft is also guaranteeing that the service will remain in operation for at least 99.95 percent of the time.
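For a sense of scale, here is a quick back-of-the-envelope calculation of how much downtime a 99.95 percent guarantee actually permits, assuming a 30-day month:

using System;

class SlaMath
{
    static void Main()
    {
        const double slaPercent = 99.95;
        TimeSpan month = TimeSpan.FromDays(30);

        // The allowed outage is the complement of the guaranteed uptime.
        double allowedMinutes = month.TotalMinutes * (100.0 - slaPercent) / 100.0;

        // Prints roughly 21.6 minutes of permitted downtime per 30-day month.
        Console.WriteLine("Allowed downtime: " + allowedMinutes.ToString("F1") + " minutes");
    }
}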

The company has also added a backup service that allows users to revert the database to its state at any point in the prior 31 days. This "roll-back" feature would be valuable to a database administrator who accidentally deletes data or makes some other mistake that could cause irreparable data loss.

Microsoft has also updated its HDInsight Hadoop service to run the latest version of Hadoop, version 2.2, and to incorporate the Hadoop YARN (Yet Another Resource Negotiator) scheduler that can be used to process jobs based on streaming data.
