Wednesday, October 30, 2013

How to more easily upgrade your network to 40/100G Ethernet

Advancing from 10G to 40G or 100G Ethernet not as simple as swapping out a few switches or line cards

Upgrading your network from 10G Ethernet to 40G and 100G is not as easy as swapping out switches and line cards.

Several factors have to be weighed, such as synchronizing switch clocks for the higher-speeds, especially among multivendor equipment; ensuring latency remains at acceptable levels; keeping the network design and architecture optimal for 40/100G; and making sure the existing cabling infrastructure can accommodate the 4x to 10x increase in bandwidth.

One of the caveats that users should be aware of as they migrate from 10G to 40/100G Ethernet is the need to ensure precise clocking synchronization between systems – especially between equipment from different vendors. Imprecise clocking between systems at 40/100G – even at 10G – can increase latency and packet loss.

The latency issue is a bigger problem than most people anticipate, industry experts say. At 10G, especially at high densities, just the smallest difference in the clocks between ports can cause high latency and packet loss. At 40G, it's an order of magnitude more important than it is for 10G.

This is a critical requirement in data centers today because many of the newer innovations there are aimed at lowering latency.
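For reference, precise synchronization between switch clocks is typically handled by a protocol such as IEEE 1588 (Precision Time Protocol), which estimates a local clock's offset from four timestamps exchanged with a master. A minimal Python sketch of that arithmetic, with made-up nanosecond timestamps for illustration:

    def ptp_offset_and_delay(t1, t2, t3, t4):
        # IEEE 1588-style estimate from four timestamps:
        #   t1: master sends Sync        t2: slave receives Sync
        #   t3: slave sends Delay_Req    t4: master receives Delay_Req
        # Assumes a symmetric path; asymmetry shows up as offset error.
        offset = ((t2 - t1) - (t4 - t3)) / 2.0
        delay = ((t2 - t1) + (t4 - t3)) / 2.0
        return offset, delay

    # Hypothetical nanosecond timestamps:
    offset, delay = ptp_offset_and_delay(t1=1000, t2=1600, t3=2000, t4=2400)
    print("slave offset: %.0f ns, path delay: %.0f ns" % (offset, delay))
    # -> slave offset: 100 ns, path delay: 500 ns

The symmetric-path assumption is the catch: at 40/100G, even small asymmetries translate into offset errors that eat into the latency budget.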

“Where you’re going to have the biggest challenges will be different latency configurations if RDMA (remote direct memory access) is used,” says Shaun Walsh, Emulex senior vice president of marketing and corporate development. RDMA is a low-latency, high throughput data transfer capability where application memory is exchanged directly to and from network adapters without copying it to operating system buffers.
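RDMA proper requires RDMA-capable adapters and a verbs library, which is beyond a quick sketch, but the overhead it removes – copying buffers between application memory and operating system buffers – is easy to see. A toy Python timing comparison (illustrative only, not RDMA itself):

    import time

    buf = bytearray(64 * 1024 * 1024)     # a 64MB "application buffer"

    # Conventional path: every send effectively copies the buffer
    # from user space into operating system buffers.
    start = time.perf_counter()
    for _ in range(10):
        staged = bytes(buf)               # simulate the extra copy
    copy_secs = time.perf_counter() - start

    # RDMA-style path: the adapter reads registered memory in place;
    # a memoryview stands in for "no copy is made".
    start = time.perf_counter()
    for _ in range(10):
        view = memoryview(buf)            # zero-copy reference, O(1)
    nocopy_secs = time.perf_counter() - start

    print("with copies: %.3fs   zero-copy: %.6fs" % (copy_secs, nocopy_secs))

Multiply that copy cost by millions of transfers per second and the appeal of bypassing the buffers at 40/100G is obvious.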

“You see a lot more in-rack virtual switching, VM-based switching that is very application specific,” Walsh says. “New line cards in new backplane architectures mean different levels of oversubscription. There’ll be generational tweaks, configuration ‘worrying’ that has to occur. The biggest thing (testers) are running into is making sure you get the 40G you are paying for (with regard to) latency issues, hops, and congestion visibility.”

Emulex late last year acquired Endace, a developer of network performance management tools. Demand for the Endace product and the 40G capabilities of Emulex’s XE201 I/O controller are picking up as more data centers and service providers upgrade from 10G to 40G.

Walsh expects 40G Ethernet to be a $700 million market in four to five years, roughly half the time it took 10G Ethernet to reach that mark. Driving it are next-gen blade server mid-plane interfaces and architectures, big data, analytics, video and data over mobile, BYOD and high frequency trading, Walsh says.

Another challenge is readying the cabling infrastructure for 40/100G, experts say. Ensuring the appropriate grade and length of fiber is essential to smooth, seamless operation.

This is a big consideration for users because it could mean re-wiring a significant portion of their physical plant, if not all of it. That could be an expensive and disruptive undertaking.

At the physical layer, 40G Ethernet is essentially 4x10G “lanes.” But 100G Ethernet is 4x25G lanes, which will be disruptive to the 10G and 40G infrastructure.
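The lane arithmetic is worth spelling out. A quick Python sanity check of the variants named above:

    # Per-lane signaling rates (Gb/s) behind the aggregate speeds.
    variants = {
        "40G Ethernet":  (4, 10),   # four 10G lanes - reuses 10G-class parts
        "100G Ethernet": (4, 25),   # four 25G lanes - new optics and cabling
    }

    for name, (lanes, rate) in variants.items():
        print("%s: %d x %dG = %dG" % (name, lanes, rate, lanes * rate))

Because 40G's lanes run at the familiar 10G rate, much of the existing transceiver and fiber ecosystem carries over; 25G lanes do not have that luxury.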

“100G is going to be even more of a challenge because now you’re dealing with a whole new layer of physical infrastructure,” Walsh says. “You will have a whole new generation of optics, cables, everything will be a whole new generation at that point.”

Moving bits four to 10 times faster, error-free, is a challenge in and of itself. Making sure the higher-level systems – routers and switches – deliver services and non-disruptive service quality at those speeds is equally challenging, if not more so.

Each device has to do all of this in one-fourth or one-tenth of the time it has at 10G. For a router, that means performing all of the packet inspection, queuing, lookups, filtering, policing, prioritization, table updating and logging while meeting SLAs by not dropping or reordering packets, or increasing latency or jitter.
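The per-packet time budget makes this concrete. For minimum-size 64-byte frames (plus the 8-byte preamble and 12-byte inter-frame gap that accompany every frame on the wire), a back-of-the-envelope calculation in Python:

    # Wire time per minimum-size frame: 64-byte frame + 8-byte preamble
    # + 12-byte inter-frame gap = 84 bytes = 672 bits on the wire.
    FRAME_BITS = 84 * 8

    for gbps in (10, 40, 100):
        ns_per_pkt = FRAME_BITS / gbps            # bits / (Gb/s) -> ns
        mpps = gbps * 1e9 / FRAME_BITS / 1e6      # millions of packets/sec
        print("%3dG: %5.1f ns per packet, %6.1f Mpps" % (gbps, ns_per_pkt, mpps))

    #  10G:  67.2 ns per packet,   14.9 Mpps
    #  40G:  16.8 ns per packet,   59.5 Mpps
    # 100G:   6.7 ns per packet,  148.8 Mpps

At 100G line rate, a router has under 7 nanoseconds to dispose of each worst-case packet.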

“Routers aren’t just forwarding packets,” says Scott Register, senior director of product management for Ixia, a maker of test, measurement and visibility tools. “There’s carrier grade NAT, application ID, and security processing and things like that. One of the more interesting tests is, what services can you enable at that rate before you start having problems?”

With carrier grade NAT, the problems get harder as that traffic load increases, Register says. In addition to throughput, increased session capacity and more concurrent connections are also issues as bandwidth climbs from 10G to 40/100G.

“You don’t get more TCP or UDP ports just because you have more traffic,” Register says. “How smart are you at cycling through your NAT tables? Some of the challenges that might not show up at the lower speeds show up at higher speeds. So a lot of the testing that we see is around that kind of high level stuff.”
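To illustrate the point: a NAT has only about 64K ports per public address, so at higher session rates the allocator must recycle idle mappings quickly. A simplified Python sketch of that recycling (a real carrier-grade NAT tracks full 5-tuples, per-subscriber quotas and much more):

    import time

    class NatPortPool:
        # Toy allocator: one public IP, ~64K ports, idle mappings recycled.
        def __init__(self, low=1024, high=65535, idle_timeout=30.0):
            self.free = list(range(low, high + 1))
            self.idle_timeout = idle_timeout
            self.active = {}                  # port -> (flow, last_seen)

        def _reclaim_idle(self):
            now = time.monotonic()
            for port in [p for p, (_, seen) in self.active.items()
                         if now - seen > self.idle_timeout]:
                del self.active[port]
                self.free.append(port)

        def allocate(self, flow):
            if not self.free:
                self._reclaim_idle()          # the "cycling" that gets hard
            if not self.free:                 # at 40/100G session rates
                raise RuntimeError("port pool exhausted")
            port = self.free.pop()
            self.active[port] = (flow, time.monotonic())
            return port

    pool = NatPortPool()
    print(pool.allocate(("10.0.0.5", 54321, "198.51.100.7", 443)))

The fixed port pool is the hard constraint Register alludes to: more bandwidth means more concurrent flows contending for the same 64K ports.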

And that’s pre-deployment testing. Post-deployment presents its own set of challenges, especially in enterprises with strict auditing and compliance requirements.

“Increasing security requirements for compliance, recording e-mail correspondence… it’s easy at 1G; at 10G or 40G or 100G it’s really, really difficult,” Register says. “So if you want to do that you have to be able to do that kind of intelligent load balancing and filtering out of any unnecessary data so that your tools can keep up with that bandwidth.”

That challenge is exacerbated by existing filtering and analysis tools that only run at sub-10G speeds, Register says. Tapping a 40G link would require 30 or so such tools, each monitoring a slice of that 40G traffic, he says.

Ixia offers a switch that sits between the production network and the analysis tools to do just that.

“They would put our switch between the production network and their tools to do the filtering – ex., seeing only the webserver traffic – to only see a very small subset,” Register says. “I can take the input stream and balance it across 32 analysis tools. I can round robin or spread that traffic across a bunch of connected tools so you can preserve your investment with existing tools. There aren’t many analysis tools that’ll run at 40G and certainly there’s nothing that runs at 100G. So a challenge we’ll have is maintaining visibility into that traffic when they do their infrastructure upgrade.”
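The usual way to spread a tapped link across slower tools without splitting conversations is to hash each flow's addresses and ports, so that both directions of a session land on the same analyzer. A minimal Python sketch, assuming 32 downstream tools as in Register's example:

    import hashlib

    N_TOOLS = 32

    def tool_for_flow(src_ip, src_port, dst_ip, dst_port, proto):
        # Symmetric hash: sort the endpoints so both directions of a
        # conversation map to the same analysis tool.
        lo, hi = sorted([(src_ip, src_port), (dst_ip, dst_port)])
        key = ("%s|%s|%s" % (lo, hi, proto)).encode()
        digest = hashlib.sha256(key).digest()
        return int.from_bytes(digest[:4], "big") % N_TOOLS

    # Both directions of one TCP session land on the same tool:
    print(tool_for_flow("10.0.0.5", 54321, "198.51.100.7", 443, "tcp"))
    print(tool_for_flow("198.51.100.7", 443, "10.0.0.5", 54321, "tcp"))

Hashing preserves flow affinity, which round-robin spreading does not; which one a shop picks depends on whether its tools need to see whole conversations.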

And then, of course, there are always multivendor interoperability challenges at 40G. In addition to clock synchronization between systems, adding features or applications could test the stability of that interoperability, Walsh says.

“Where we’re going to have problems is when people try to implement different features,” he says. “Base Ethernet connectivity will work fine – but where we’ll start to see challenges will be in RDMA, when you do lower latency stuff around RoCE (RDMA over Converged Ethernet). And when you load a special version of an SDN application that is trying to meet a specific need. When that SDN switch is plugged into the general Ethernet population, will it interoperate exactly right?”


What's holding back the cloud industry?

Cloud enthusiasts blame processes, not technology
While the cloud enthusiasts roaming the halls of the McCormick Place convention center in Chicago last week at Cloud Connect may be high on the market, the reality is that many enterprise IT shops are still reluctant to fully embrace public cloud computing.

Network World asked some of the best and brightest minds in the industry who were at the event about what’s holding the cloud industry back. Here’s what they said:

Converging organizational silos
Eric Hanselman, chief analyst at 451 Research

Cloud sounds like a great idea, but how will it really work when it's adopted? Hanselman says one of the biggest barriers is an organizational one. Typically, IT organizations are split into groups focusing on compute, network and storage. When applications run from the cloud, those are all managed from one provider. That means the jobs of each of those groups within IT may change. How can organizations evolve? “You’ve got to converge,” Hanselman says. That could be easier said than done with people’s jobs at stake.

Security and application integration
Krishnan Subramanian, director of OpenShift Strategy at Red Hat; founder of Rishidot Research

Security is still the biggest concern that enterprises point to with the cloud. Is that justified? Cloud providers spend a lot of money and resources to keep their services secure, but Subramanian says it’s almost an instinctual reaction for IT pros to be concerned about cloud security. “Part of that is a lack of education,” he says. Vendors could be more forthcoming about the architecture of their cloud platforms and the security around them. But doing so isn’t an easy decision for IaaS providers: Vendors don’t want to give away the trade secrets of how their cloud is run, yet they need to provide enough detail to assuage enterprise concerns.

Once IT shops get beyond the perceived security risks, integrating the cloud with legacy systems is their biggest technical challenge, Subramanian says. It’s still just not worth it for organizations to completely rewrite their applications to run them in the cloud. Companies have on-premises options for managing their IT resources and there just isn’t a compelling enough reason yet to migrate them to the cloud. Perhaps new applications and initiatives will be born in the cloud, but that presents challenges around the connections between the premises and the cloud, and related latency issues.

New apps for a new computing model
Randy Bias, CTO of OpenStack company Cloudscaling

If you’re using cloud computing to deliver legacy enterprise applications, you’re doing it wrong, Bias says. Cloud computing is fundamentally a paradigm shift, similar to the progression from mainframes to client-server computing. Organizations shouldn’t run their traditional client-server apps in this cloud world. “Cloud is about net new apps that deliver new business value,” he says. “That’s what Amazon has driven, and that’s the power of the cloud.” Organizations need to be forward-thinking enough to embrace these new applications, fueled by big data and distributed systems, that produce analytics-based decision-making and agile computing environments.

It’s more than just technology
Bernard Golden, VP of Enterprise Solutions for Enstratius, a Dell company

The biggest inhibitor to more prevalent cloud computing adoption is that organizations are still holding on to their legacy processes, says Golden, who recently authored the Amazon Web Services for Dummies book. It’s not just about being willing to use new big data apps, and spin up virtual machines quickly. It’s the new skill sets for employees, technical challenges around integrating an outsourced environment with the current platform, and building a relationship with a new vendor. “For people to go beyond just a small tweak, there needs to be a significant transformation in many areas of the organization,” he says. “Each time there is a platform shift, established mechanisms are forced to evolve.”

Regulatory compliance
Andy Knosp, VP of Product for open source private cloud platform Eucalyptus

One of the biggest hurdles for broader adoption of public cloud computing resources continues to be the regulatory and compliance issues that customers need to overcome, Knosp says. Even if providers are accredited to handle sensitive financial, health or other types of information, there is “still enough doubt” by executives in many of these industries about using public cloud resources. Many organizations, therefore, have started with low-risk, less mission critical workloads being deployed to the public cloud. Knosp says the comfort level for using cloud resources for more mission critical workloads will grow. It will just take time.



Wednesday, October 9, 2013

How to Close the Big Data Skills Gap by Training Your IT Staff

Research firms paint a dire picture of a massive big data skills gap that will get worse over time. But companies like Persado, which uses big data to help marketers optimize their messages, are finding success training their existing staff in the new big data technologies.

It's difficult to talk about big data without also discussing the big data skills gap in nearly the same breath. But is it as bad as it seems?

According to a recent CompTIA survey of 500 U.S. business and IT executives, 50 percent of firms that are ahead of the curve in leveraging data, and 71 percent of firms that are average or lagging in leveraging data, feel that their staff are moderately or significantly deficient in data management and analysis skills.

These firms see real costs associated with a failure to come to grips with their data, from wasted time that could be spent on other areas of their business to internal confusion over priorities, lost sales, lack of agility and more.

Forecasters paint a seemingly dire portrait of a skills shortfall that will only get worse over time. The McKinsey Global Institute estimates that by 2018, there will be a shortage of 1.7 million workers with big data skills in the U.S. alone—140,000 to 190,000 workers with deep technical and analytical expertise and 1.5 million managers and analysts with the skills to work with big data outputs.

But Tim Herbert, vice president of research and market intelligence at CompTIA and author of its second annual Big Data Insights and Opportunities study, says the situation may not be as drastic as you think.

"Hadoop is a new technology and there really is a skills gap. But you can easily cross-train people. It's not that the technology is incomprehensible. You just need to take existing developers, analysts and admins and cross-train them."

--Sara Sproehnle, vice president of Educational Services at Cloudera

"There will be a situation where, at the highest levels, probably the Fortune 100, there will be a skills shortage," Herbert says. "For most medium and small companies, they probably will be able to satisfy their skills needs by a combination of retraining and additional staff. The tools associated with big data will mature. The capabilities and ease of use will mature over time and that will certainly help. Like a lot of other technologies, there will be individuals that maybe they weren't trained to do it but they will have an aptitude to work with data."

Hadoop Isn't Incomprehensible
Sara Sproehnle, vice president of Educational Services at Cloudera, provider of one of the most popular Hadoop distributions, agrees.

"Training has really been a strategic component of what we do at Cloudera," she says. "Hadoop is a new technology and there really is a skills gap. But you can easily cross-train people. It's not that the technology is incomprehensible. You just need to take existing developers, analysts and admins and cross-train them."

Case in point: Persado, a pioneer in "Marketing Language Engineering." Persado helps brands optimize their marketing messages to their target audience at every digital interaction through a systematic methodology that leverages math, computational linguistics and big data.

"We can look at the different 'genes' of a marketing message and break it down and build it back up using mathematics, linguistics and technology to make it a marketing message that a marketer would be happy to bring to market and a consumer would be more likely to interact with and click on," says Matthew Novick, chief financial officer at Persado.

Achieving this requires continuous data collection and the ability to query that massive volume of data. Persado's business depends upon its data warehouse.

Persado's development team is focused on ensuring that the company's infrastructure is aligned with the needs of its data scientists, including regularly generating key performance indicator (KPI) reports, managing data from heterogeneous sources, preparing customized analyses and implementing specific statistical algorithms in Java based on reference implementations in R.

But in 2010, not long after Persado was born, the relational database management system (RDBMS) the company was using to power its data warehouse was becoming unwieldy. The development team, led by Christos Soulios, software team leader and application architect at Persado, began the process of migrating to a NoSQL environment. With its analytics and reporting needs becoming more sophisticated, it then needed to decouple the online analytical processing (OLAP) system into a technology stack of its own.

Soulios decided that Apache Hadoop was the right solution for collecting, aggregating and processing data from Persado's heterogeneous data sources, including MongoDB, MySQL config servers and Apache logs stored as structured and semi-structured files in Amazon Web Services (AWS) S3 buckets, using libraries built on Apache Kafka and Apache ZooKeeper.
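For a flavor of what such aggregation looks like in practice, Hadoop Streaming lets a rollup job be written as a plain stdin/stdout mapper and reducer. A minimal, hypothetical Python sketch that counts requests per URL in Apache-style logs (field positions, file names and the invocation are illustrative assumptions, not Persado's actual code):

    #!/usr/bin/env python
    # Hadoop Streaming rollup sketch: count requests per URL in Apache logs.
    # Hypothetical invocation: hadoop jar hadoop-streaming.jar \
    #   -mapper "rollup.py map" -reducer "rollup.py reduce" \
    #   -input logs/ -output counts/
    import sys

    def mapper():
        # Emit "url <tab> 1" for every request line on stdin.
        for line in sys.stdin:
            fields = line.split()
            if len(fields) > 6:
                print("%s\t1" % fields[6])   # request path, common log format

    def reducer():
        # Input arrives sorted by key; sum consecutive counts.
        current, total = None, 0
        for line in sys.stdin:
            key, _, count = line.rstrip("\n").partition("\t")
            if key != current:
                if current is not None:
                    print("%s\t%d" % (current, total))
                current, total = key, 0
            total += int(count or 0)
        if current is not None:
            print("%s\t%d" % (current, total))

    if __name__ == "__main__":
        mapper() if "map" in sys.argv[1:] else reducer()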

But those tasks were easier said than done. Persado didn't have the big data engineers on its staff that it needed to grow capabilities and scale its systems. Moreover, while Persado is a global company with headquarters in London and New York, its development team is based in Athens, Greece, making big data talent even harder to come by.

"Most of our development team and the resources are here in Athens, Greece," says Xinyu Huang, vice president of Engineering at Persado. "Unlike in the U.S., where big data is all over the place, in Greece it's still in the early stage."

Persado Looks to Train Its Teams to Use Big Data Tools
Without the ability to buy the talent it needed, Persado decided to create its own, Huang says. Soulios brought in Cloudera—specifically, Cloudera University. Soulios and the development team worked with Cloudera University's curriculum team to tailor a private, week-long onsite training course for Persado.

"We started benefiting from our decision to work with Cloudera almost right away, since no other company offers a full Data Analyst Training targeted at both developers and analysts, which was one of our biggest priorities," says Soulios, speaking of a course on Apache Hive and Apache Pig. "The intensive workshop also included the full Cloudera Developer Training for Apache Hadoop with the option of testing for the sought-after CCDH certification following the class."

"Having the training in-house was really important," adds Huang. "It got our team interacting with the technology and understanding what you could possibly do with it. We have the data, but the team has been dealing with data on an ad-hoc basis, chunk by chunk. Doing the training really helped us to know how these tools really can help us. Over the long run, what was most beneficial for the team was to talk to someone who actually has real experience working with this big data technology. That really opened up the mindset of the developers here, especially the local developers that we have in Athens."

Hadoop Is a Game-Changing Technology
After the training, Huang says Persado has successfully built up its new data warehouse capabilities using Hadoop, Hive and Pig.

"What we find is that Hadoop is kind of a game-changing new technology," says Sproehnle. "It's not that people can't learn it, but they need to invest in that training. They really need to learn this brand new technology. We find that if people fumble around on their own, it's really hard to get Hadoop into production. But if you invest in a week of training you can begin maximizing that investment really quickly."

 

5 Things Businesses Need to Know About Google Glass

Much of the talk surrounding Google Glass has focused on its consumer appeal. However, the device does have enterprise potential. CIOs should consider developing applications such as providing diagnostic advice to field service workers and mobile coupons to retail customers.

When Google Glass went out to beta testers for the Glass explorer program, the hype had already started. Industry experts, gadget geeks and researchers raced to predict the uses of the headset, such as videos, photos, gaming and voice commands.

Much of the talk about Glass, however, centered on how consumers would use the device and what apps would enhance their lives. But where does Glass fit in the enterprise? How will IT departments handle another device to monitor? Are there major business opportunities with the device?
Glass does have potential for businesses, but like any new device on the market, IT departments must vet and explore applications for the new technology. Here are five things CIOs should be aware of before trying Google Glass on for size:

1. It's most useful for mobile workers. The hype over Google Glass, a wearable computer with a head-mounted display, has focused on consumer applications, but there are tantalizing business applications, too.


"[Glass] enables hands-free communication with a camera for busy people on their feet that need to make things come together," says Angela McIntyre, research director at Gartner. For example, field service technicians in industries such as oil and gas, healthcare and manufacturing could use Glass to diagnose an equipment problem by sending a picture to an expert at headquarters or by watching an instructional video to fix the issue.

"Instead of paying experts and flying them all over the world, you have staff that connects with remote experts," says Brent Blum, manager of digital experiences at Accenture Technology Labs. Glass could also advance the use of maps for workers locating a package in a warehouse or making a delivery to a new location.


Other types of mobile workers may also find practical uses of Glass for instantaneous updates of information. Financial services institutions could give Glass to their traders at the stock exchange so they can receive real-time information on stock quotes.

"They need to have information as soon as they can get it to make the deals and trades," McIntyre says. Retail companies could give Glass to sales clerks in stores to look up product information for customers or to conduct transactions.

2. It's a new tool for geofencing. Retailers will like the idea of another device for sending mobile coupons. But "there are some basic building blocks that need to be in place first in terms of Wi-Fi infrastructure and data management," says Blum.

Down the line, stores could send coupons that coincide with their loyalty program as you enter the area around the store or shopping center in hopes of getting your business. Glass currently has Crystal Shopper, an app for scanning bar codes to look up prices and reviews, but eventually consumers may be able to see coupons before their eyes as they walk into a store.
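Under the hood, a basic geofence is just a distance test against the store's coordinates. A minimal Python sketch using the haversine great-circle formula (coordinates and the 200-meter radius are illustrative assumptions):

    import math

    def haversine_m(lat1, lon1, lat2, lon2):
        # Great-circle distance in meters between two lat/lon points.
        r = 6371000.0                         # mean Earth radius, meters
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
        a = (math.sin(dp / 2) ** 2
             + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
        return 2 * r * math.asin(math.sqrt(a))

    def in_geofence(user, store, radius_m=200):
        # True if the user is within radius_m of the store: send the coupon.
        return haversine_m(user[0], user[1], store[0], store[1]) <= radius_m

    # Hypothetical shopper approaching a store entrance:
    print(in_geofence(user=(41.8827, -87.6233), store=(41.8820, -87.6230)))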

3. There are privacy concerns. The camera in Google Glass could be used to surreptitiously take photos or record videos. McIntyre cautions that "people can take pictures of screens and use them in illegal ways." Data captured by Google Glass will need to be secured, like any other computing device.

"If you're providing a service through Glass, it could be a selling point that you offer better encryption," says Shane Walker, an associate director at research firm, IHS. Businesses will also have to be weary of facial recognition because although Google has banned the technology from Glass, hackers have found a loophole.

If companies are seriously considering bringing Glass into the corporate environment, they will have to vet the devices just like they did with smartphones and tablets. "Start thinking about policy now and how you will handle those requests," McIntyre says. "It's another kind of BYOD policy."

4. The price is high but will level out. The current price of Google Glass, about $1,500, is about triple that of a smartphone. When Glass becomes commercially available in 2014, the price will have to come down to the level of a smartphone or tablet before companies will buy it.

"You'd have to understand what problem you're trying to solve that can't be solved by cheaper technologies," says Sam Chesterman, CIO of IPG Mediabrands and a member of the Glass Explorer program for beta-testers. Blum says for businesses, it will be about figuring out how much time and money will be saved by using Glass to determine the ROI.

5. There isn't an app for that. Glass comes standard with a few basic apps like search, messaging and video but doesn't have an app store yet. Still, many developers and businesses are creating their own apps, such as Fidelity Investments' Fidelity Market Monitor app to view stock quotes.

Chesterman agrees, saying companies should "pick a business problem you can solve with [Glass] and see if you have someone on your staff with Android development chops." McIntyre says to get out there before competitors do.

"Companies that want to be seen as leading edge are starting to work on apps for delivering content to consumers through Glass," McIntyre says. For now, Glass is well-suited for content creation across many industries such as media and marketing.


 

Wednesday, October 2, 2013

Millions flood healthcare insurance sites as feds grapple with glitches

Web traffic was 7 times greater than the Medicare site ever saw

The U.S. government's Health Insurance Exchange (HIX) website and its state-run counterparts launched today and were immediately flooded with potential enrollees, causing widespread glitches throughout the country.

As of about 4 p.m. ET, more than 2.8 million people had visited the U.S. Health and Human Services' healthcare website since the HIXes went live at 8 a.m., according to Marilyn Tavenner, administrator for the U.S. Centers for Medicare & Medicaid Services (CMS).

CMS oversees the administration of the HIXes, a cornerstone of the Affordable Care Act (ACA), also known as Obamacare. The ACA requires all states to roll out HIXes or opt for a federally operated version, where consumers can compare plans in one place based on price, deductibles and benefits.

Citizens who apply for healthcare insurance on the open exchange by Dec. 15 will be able to receive coverage beginning Jan. 1, 2014. Those who wait until after Dec. 15 to apply will receive coverage at a later date.

"In less than 15 hours today, our site traffic has tripled from what we saw when we re-launched in June. What's more, there were seven times more users on the [HIX] marketplace website today than have ever been on the Medicare.gov website at any one given time," Tavenner said, referring to HHS's Healthcare.gov site.

The HHS, under which CMS operates, also posted a warning on its website about being flooded with a high volume of requests.

CMS refused to disclose how many people have been able to enroll in health insurance plans, saying only that "people have been able to successfully complete the application and enrollment process."

Officials admitted to having technical issues with their website that stopped some from enrolling. The officials said they're continuing to work on those problems to smooth out the wrinkles.

"We're very pleased with progress states are making," Gary Cohen, deputy administrator and director for the Center for Consumer Information and Insurance Oversight, said during a news conference at 4 p.m.

"We are making improvements as we speak. What we're hearing from other issuers is that problems are being resolved," Tavenner said. "This is day one of a six-month process. You have until Dec. 15 to enroll for coverage for Jan. 1."

The Congressional Budget Office estimates that 12 million consumers will buy health insurance in the HIX market by 2014, with that figure rising to nearly 28 million people by 2019.

The overwhelming majority of those using the HIXes will be low-income people, contractors who don't have an employer-sponsored plan or those already insured through employer plans but whose family members aren't covered.

"About 17 million of those newly insured (those below 133% of the Federal Poverty Level) will receive coverage through an expanded Medicaid program," PricewaterhouseCoopers International (PwC) said in a report released in June.

"There are potentially 14 million new people walking through the electronic front door in light of ACA," said Garland Kemper, health and human services program director at services provider Unisys. "There are [state-based computer] systems that in some cases are 25 years old. They're legacy apps that, to modify the rules to reflect the new federal ones, will be very difficult. It varies state to state.

"This is going to be a huge impact to state government," she added.

One of the problems most often cited by those attempting to sign up for healthcare insurance online involved creating a password, which protects the enrollee's identity. Several members of national news organizations reported that people in states including Maryland, Virginia, Hawaii, Michigan and Florida had trouble logging into the HIX website in order to enroll.

Some public HIXes, such as Massachusetts' Health Connector, have been online for years; other states found themselves behind the curve and faced tough deadlines to enact their exchanges. Massachusetts' Health Connector uses a model where the state evaluates and selects insurers in a competitive bidding process, and then offers those insurers to the public.

However, 36 states opted out of creating their own HIXes and instead allow residents to visit the federally created HIX, which offers a central database of insurers from which they can choose based on their state and economic status.

About 85% of Americans are already covered by some form of insurance, either privately or as part of their benefits from an employer. The other 15% of Americans who are self-employed or unemployed are the target of the HIX system.

The Healthcare.gov site also offers an around-the-clock chat line to assist enrollees with the process; some users found that service down as well. Additionally, a Medicaid calculator tool that allows enrollees to estimate their tax credits after enrolling experienced issues with accuracy.

Tavenner said part of the problem with the calculator has been that it double-checks all the results, which can result in slowdowns, but "we're seeing more accuracy," she added.
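The arithmetic such a calculator performs is a sliding-scale cap on premiums as a share of income: the credit is the benchmark plan's premium minus an expected contribution that rises with income relative to the federal poverty level. A simplified, illustrative Python sketch of the 2014 schedule (the real rules add family size, local benchmark premiums and many edge cases):

    FPL_2013_SINGLE = 11490.0   # 2013 federal poverty level, single person

    # (FPL-ratio range, premium cap as % of income at the range's ends);
    # figures follow the statutory 2014 sliding scale, simplified.
    BRACKETS = [
        (1.00, 1.33, 0.020, 0.020),
        (1.33, 1.50, 0.030, 0.040),
        (1.50, 2.00, 0.040, 0.063),
        (2.00, 2.50, 0.063, 0.0805),
        (2.50, 3.00, 0.0805, 0.095),
        (3.00, 4.00, 0.095, 0.095),
    ]

    def premium_credit(income, benchmark_annual_premium):
        # Credit = benchmark premium minus the expected contribution.
        ratio = income / FPL_2013_SINGLE
        if not 1.0 <= ratio <= 4.0:
            return 0.0      # outside the credit range (or Medicaid-eligible)
        for lo, hi, cap_lo, cap_hi in BRACKETS:
            if lo <= ratio <= hi:
                cap = cap_lo + (cap_hi - cap_lo) * (ratio - lo) / (hi - lo)
                break
        return max(0.0, benchmark_annual_premium - income * cap)

    # Hypothetical: $28,725 income (250% of FPL), $3,600/yr benchmark plan.
    print(round(premium_credit(28725, 3600), 2))   # -> 1287.64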






Apple iPhone 5S vs. Nokia Lumia 1020: Which Camera is Better?

Apple's brand new iPhone 5S has an enhanced iSight digital camera, but how does it stack up to Nokia's impressive 41-megapixel PureView Lumia 1020 camera? CIO.com's Al Sacco pits the two popular devices against each other in this hands-on iPhone vs. Lumia camera comparison. The results might surprise you.

UPDATED: As noted in the comments by user "M_F," the Nokia Lumia 1020 images that were originally included in this post were scaled-down, 5MP images and not the Lumia 1020's full-resolution 34MP images. The Nokia Lumia 1020 saves two versions of every image captured with its Pro Camera: a full-resolution 34MP image and a smaller, 5MP version for sharing. I have updated this post to include iPhone 5S images and both the smaller 5MP and larger 34MP Lumia 1020 images; the 34MP images are marked "High Res" in the photo captions. I have also updated the final Conclusions page.

Apple last week released its latest iPhones, the high-end iPhone 5S and midrange iPhone 5C. Among the most notable features is the iPhones' iSight camera. The new iPhone 5S has an 8MP camera, which is the same megapixel count as last year's iPhone 5. But Apple says the new device has "a redesigned camera sensor that allows for bigger pixels. Bigger pixels equal better photos. And better photos are precisely what inspired the advancements we made with the new iSight camera on iPhone 5s."
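The "bigger pixels" claim is simple geometry: for a given sensor size, fewer pixels means each pixel is larger and gathers more light. A rough Python comparison using approximate published sensor figures (treat the numbers as assumptions):

    # Back-of-the-envelope pixel pitch: sensor width / horizontal pixels.
    # Sensor widths and resolutions are approximate published figures.
    sensors = {
        "iPhone 5S (1/3-inch, 8MP)":     (4.89, 3264),   # mm wide, px across
        "Lumia 1020 (1/1.5-inch, 41MP)": (8.80, 7712),
    }

    for name, (width_mm, px_across) in sensors.items():
        print("%s: ~%.2f micron pixels" % (name, width_mm * 1000.0 / px_across))

The 1020 wins on resolution, but per-pixel light gathering is closer than the megapixel counts suggest because its sensor is much larger.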

The camera is one of the most used and most valued features in today's top-of-the-line smartphones. I put Apple's latest and greatest to the test against the best Nokia has to offer: the Nokia Lumia 1020 Windows Phone, which is among the best camera phones available today.

Apple says, "What makes the iSight camera so remarkable is how beautiful photos look without your having to do anything at all. Just aim and shoot. That's it."

So that's what I did. The following comparison is not meant to be scientific; it's meant to serve as a quick image-quality comparison. I did not use any "advanced" settings; in all cases, I used the cameras' default and "auto" settings.

I also won't get into camera specifications beyond the following basics. If you want details or full camera specs, visit Apple's iPhone 5S page or Nokia's Lumia 1020 page.

Nokia Lumia 1020 rear-facing camera: 41MP PureView camera; Carl Zeiss Tessar lens; f/2.2 aperture; autofocus; and Xenon flash.

Apple iPhone 5S rear-facing camera: 8MP iSight camera; f/2.2 aperture; autofocus; and True Tone flash with dual LEDs.