Friday, September 21, 2012

Why Gamers Are Gluttons for Punishment

Being punished never feels good.

Everyone has a memory of paying the price for at least one error in judgment. Getting grounded. Getting a ticket. Getting a full leg cast because that tree branch was a little less stable than you judged.

We remember these punishments with a grimace. That’s the point, right? The punishment must be unpleasant enough that you (hopefully) won't repeat the mistake, or at least disagreeable enough to stand in stark contrast to the sweet spoils of success. Punishments and consequences are essential because they let us appreciate that success when it does eventually come.

But there exists another category of punishment that most people reading this probably feel more ambivalent about. Losing your gear in a pool of lava in Minecraft. Using up your final life and being shuttled way back to stage 1-1 in Mario. Making yet another long, boring corpse-run in World of Warcraft.

None of these experiences feel good either. But I feel differently about these memories than I do about time spent in detention or other “real world” punishments. Failing in a game just makes me more motivated to succeed. I get back up and try again.

This is not a coincidence. Without a punishment or penalty for failing, what’s the point of play? What’s the purpose of overcoming a challenge if nothing is actually at stake?

It’s worth pointing out that a game being punishing is not the same as a game being difficult. Games can be (and often are) both, but there is an important distinction. A punishing game imposes a significant penalty onto players when they fail. The game itself can be easy, but it whacks gamers heavily when they are defeated.

If I have to run back through Westfall one more time...

I think about video games as a series of rewards. It’s a rush to earn a badass new weapon. To level-up. To open up new areas to explore. But the truth is that video games almost never work if they only offer The Carrot. Great video games also know how to make good use of The Stick. It’s a tightrope-balancing act that game designers must walk. If your game is too punishing, gamers will give up. If it’s too lax, they will grow bored.

All of this might sound obvious. Of course when you miss a jump and hit some spikes, you’re punished with a lost life and forced to restart. This is just a part of the contract of video games. Right? But some games subvert this formula for failure and play upon people’s expectations. Prince of Persia: The Sands of Time let gamers literally rewind time and try again. Failure in that game, then, is just a warning. “Don’t do that again.”

Brutal 8-Bit Memories


By making NES-era games so punishing, developers were increasing a game’s value to the player. Games like Super Mario Bros., Castlevania and Mega Man are fondly remembered for offering hours upon hours of fun, spread across weeks and months of play. But these three games are actually very short. They used their extreme difficulty and their extreme form of punishment (start the whole game over!) to make up for their brevity. The challenge and nasty punishments didn't arise because developers were sadists or didn't yet know how to tune their products. The titles were punishing out of necessity.

Of course, coin-op arcades also played a role. It was in a game's (and by extension, the arcade owner's) best interest to keep gamers staving off failure one quarter at a time.

At home, the very same element that frustrated young you to no end in Castlevania (Go back to the beginning! AGAIN!) is also what made the game such a classic. Imagine if you could rewind time and amend your mistakes as in Prince of Persia. You would have destroyed Dracula within a single afternoon. Sure, you might have popped the cart back in once in a while to admire the graphics or replay the adventure, but would gamers still consider the title such a classic?


Castlevania-ville this wasn't.

The penalty for failure contributed to Castlevania’s epic feeling. It made the boss battles much more harrowing. It caused you to shout out in joy when, on the brink of death, you found a hidden turkey. It made the adventure feel longer and grander. Gamers might have hated it. But the game needed it. Without the looming threat of being sent back to the start, the turkey means nothing, and no one wants a meaningless turkey. No one.

Like many gamers, when I was young I would endlessly complain and moan about a game’s extreme punishments. But sometimes gamers like me can’t be trusted to accurately pinpoint what we want from a game. Even if a gamer insists they want an easier path, I believe what they really want is a challenge worth overcoming, and a penalty they fear enough to give them a rush when they avoid or overcome it.

In this same NES era, advances in technology allowed developers to push the entire video game category forward. As games grew in size, length and sophistication, it was no longer feasible to expect gamers to happily trudge back to the beginning when they failed. The increasingly complex nature of video games suddenly made the most accepted and established method of punishment (start over!) too severe. The punishment didn’t change; video games had changed. Game developers had to adjust.

What worked before simply wasn’t an option for games like Final Fantasy, Metroid or The Legend of Zelda. Luckily, through sheer ingenuity (password saves) and technology (battery back-up saves), game makers were able to soften the blow of failure and allow games to continue their slow march towards greater size, complexity, and ultimately accessibility.
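
For readers who never used one, a password save simply packs the handful of values a game actually tracks into a short code the player writes down. The sketch below, in Python, is purely illustrative: the field names, bit widths, checksum, and alphabet are all invented for this example and reflect no real game's format. The idea is to concatenate fixed-width fields into one bit string, append a small checksum to catch mistyped codes, and emit one character per five bits.

    # Illustrative password-save encoder. All fields, widths, and the
    # alphabet are assumptions for this sketch, not any real game's scheme.
    FIELDS = [("stage", 4), ("lives", 3), ("weapon", 3), ("bosses", 5)]  # 15 bits
    ALPHABET = "ABCDEFGHJKLMNPQRSTUVWXYZ23456789"  # 32 symbols, no 0/O or 1/I

    def encode(state: dict) -> str:
        bits, width = 0, 0
        for name, size in FIELDS:
            bits = (bits << size) | (state[name] & ((1 << size) - 1))
            width += size
        bits = (bits << 5) | (bits % 31)  # append a 5-bit checksum
        width += 5
        # Emit one password character per 5 bits, most significant first.
        return "".join(ALPHABET[(bits >> s) & 31] for s in range(width - 5, -1, -5))

    print(encode({"stage": 7, "lives": 3, "weapon": 2, "bosses": 20}))  # -> "Q4W7"

A decoder reverses the process and rejects any code whose checksum doesn't match, which is one reason mistyped passwords in games of that era tended to fail outright rather than load garbage.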

 
When Games Lost Their Bite

As the game industry grew up and transcended the bedrooms of teenage boys, in-game punishments lost their teeth. There are exceptions – for every friendly, take-you-on-an-adventure game like Journey there is a cruel, exacting experience like Dark Souls. But generally speaking, games have had all their sharp corners rounded into nice, smooth curves to help ensure no one gets hurt.

There are examples everywhere you look. If you died in EverQuest in 1999, you were forced to return to your corpse to recover your lost items, and you lost a big chunk of experience on top of it. By the time World of Warcraft launched in 2004, dying imposed no penalty other than a small repair bill and a few minutes of downtime. A poorly handled engagement in a shooter like Doom or Half-Life would leave you permanently damaged and vulnerable, potentially causing you to lose a future firefight because of a mistake you made many minutes before. But now persistent player health has been all but replaced with rapidly recharging health systems in virtually every major shooter.


I actually really don't have time to bleed.

The above examples are observations, not value judgments. I don’t yearn to return to the days of quicksaving every 30 seconds in Half-Life 2 to ensure every battle goes perfectly. Recharging health offers an elegant solution to the outmoded practice of littering every FPS level with health packs. And aside from a few very specific exceptions (like arcade titles), the idea of actually starting a game over from scratch as a viable form of instructive player punishment now seems ludicrous. Modern, more player-friendly gaming is undoubtedly an improvement in many ways.
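
For concreteness, here is a minimal sketch in Python of the recharge-after-delay pattern that has displaced health packs. The numbers (delay, regeneration rate) are invented for illustration, and no particular game's implementation is implied: any hit resets a timer, and health only regenerates once the player has avoided damage for a few seconds.

    # Minimal recharging-health component; all tuning values are invented.
    class RechargingHealth:
        def __init__(self, max_hp=100.0, regen_delay=5.0, regen_rate=25.0):
            self.max_hp = max_hp
            self.hp = max_hp
            self.regen_delay = regen_delay  # seconds without damage before regen
            self.regen_rate = regen_rate    # hit points restored per second
            self.since_hit = 0.0

        def take_damage(self, amount: float) -> None:
            self.hp = max(0.0, self.hp - amount)
            self.since_hit = 0.0            # any hit restarts the recharge timer

        def update(self, dt: float) -> None:
            # Called once per frame; dt is the elapsed time in seconds.
            self.since_hit += dt
            if self.hp > 0 and self.since_hit >= self.regen_delay:
                self.hp = min(self.max_hp, self.hp + self.regen_rate * dt)

The tension the essay describes lives in that regen_delay constant: set it near zero and every firefight becomes consequence-free; remove regeneration entirely and you are back to Doom-style persistent damage.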

Of course every rule is also made to be broken – plenty of incredibly successful high-score games like Temple Run and Super Hexagon force full restarts onto unwilling players. But it is undeniable that video games as a whole are friendlier now than ever before.

But it can be hard to judge how far developers can take player accommodation before something important is lost. How easy is too easy?

World of Warcraft was widely praised (and made Blizzard billions of dollars) because of its much more player-friendly policies. Bonus XP, rapidly recharging health, relaxed death penalties and other “user-friendly” designs contributed significantly to the game’s success. Professional gamers and casual players alike praised the changes. But how far is too far? If WoW resurrected players instantly, on the spot, wouldn't most of them simply ask “what’s the point of death?” Clearly some sort of penalty is an absolute must.

And yet 2012’s Guild Wars 2 lowers the death penalty even further. Players can resurrect just a few seconds away from the fight and sometimes can re-join the same battle, essentially making failure impossible.

This trend obviously isn't exclusive to massively multiplayer online games. 2006's Prey notably sent players to an undying purgatory of sorts, where they fought their way back to life. 2008's Prince of Persia reboot literally rescued you from death every time you fell. Gears of War lets AI and co-op teammates revive each other, and even the just-released Borderlands 2 features a "fight for your life" mechanic that lets you leapfrog death and return to partial health if you score a kill before you bleed out.

 
What Happens When Failure Isn't an Option

To truly examine the impact a complete lack of punishments and failure states can have on video games, players need to look no further than the rise of Zynga and other social game makers. It’s not a coincidence that core gamers have such a deep and intense disdain for the entire category. It’s true that you hear complaints about how they make their money. You hear complaints about their spammy nature. But what underlies all of it is a low grumble that “they aren’t even actual games.”

This complaint is the perfect window into the mindset of a true gamer. It’s the perfect summation of why the potential to lose, and the punishment that inevitably follows, is absolutely essential to our enjoyment of games. Even though we don't like losing. Without the possibility of failure, there can be no success.


My crops may die, but I never will!

Naysayers notwithstanding, Zynga’s games are actually incredibly complex. They feature elaborate storylines and detailed artwork, and they offer collaboration with friends and strangers on a grand scale. What’s more, they give players an almost unprecedented level of freedom for self-expression. Despite all this, many core gamers don’t consider FarmVille a game because it is literally impossible to lose. Try as you might, the game will keep giving you more coins. Your game will never truly be over. The worst you can ever experience is a farm plot full of withered crops.

Yet these “little” flash games, with their lack of failure states or player punishments, have managed to eclipse in sheer reach anything else the games industry has ever accomplished. At its peak, Zynga’s CityVille had 100 million active players.

But the jury is still out on whether this truly represents a sea change for the video game industry. Zynga is on rocky ground, with many professional pundits beginning to wonder whether the entire social game category is a massive fad.

The meteoric rise of Zynga, built on the backs of games that appease the player at every turn and never allow them to fail, sent traditional game companies scrambling. But it now looks more and more likely that the basic psychology of failure used by game designers for decades hasn't actually been upended after all.

When I was little, I was once allowed to eat whatever I wanted for dinner. Eight-year-old me picked cookie dough, and I ended up with a terrible stomachache that lasted all night. It was too much of a good thing. It turns out I only tolerate cookies after a real, balanced meal.

Zynga’s recent fortunes might be proof writ large that game players don’t know what they want from a game’s difficulty. Just as eight-year-old me picked cookie dough for dinner (and would have picked an easier, friendlier Castlevania), Zynga’s casual game players have picked the company’s friendly fare over the more punishing titles the traditional game makers produce. It is very possible that the waning interest in social games is the latest proof of the maxim that players can’t be asked what they want – they sometimes need to be told what they want.


In one version of the future, a generation of gamers may never see a screen like this.

 
Why Fear is Good
 
Gamers want to be challenged. We want a game to bring us directly to the peak of our abilities and push us into a flow state. But I believe that this alone isn’t enough. Without a fear of punishment, a victory over a game feels hollow. Everyone absolutely loathes losing rare gear to the aforementioned unexpected patches of lava in Minecraft, but without that fear of loss, there is no tension. Exploring the creepy, blocky cave wouldn’t be as fun. Even though we don't like it, it makes Minecraft a better game.

Game makers have been pushing more “user-friendly” consequences and punishments onto players for as long as video games have existed. But this trend can’t continue indefinitely. If you follow that thread to its end, it terminates at the Zyngas of the world, with a library of “games” that aren’t games at all. They are colorful interactive distractions.

But if '80s gamers weaned on Castlevania and Mega Man were to skip ahead 30 years, would they view Gears of War or Uncharted any differently than gamers today view social titles? Both shooters feature the recharging health and checkpoint systems that gamers insist they don’t want to live without. But do we truly know what we want, or are we just eating too much cookie dough?
 

Tuesday, September 18, 2012

Dump Internet Explorer Until Microsoft Issues Patch

If you use Internet Explorer 6, 7, 8 or 9 as your default browser on a Windows PC, security experts are advising you to use a different Web browser until Microsoft patches a critical vulnerability in IE. Microsoft on Monday confirmed that hackers were actively exploiting an IE vulnerability that could allow an attacker to take over your PC. The exploit does not affect users running IE10 on the Windows 8 Release Preview.

So far, Microsoft says it has received reports of “a small number of targeted attacks” using this exploit. The software maker is working on a security patch for the problem, but the company has not yet said whether it will issue the fix as soon as possible or as part of its monthly “patch Tuesday” update cycle. The next “patch Tuesday” would be October 9.

The exploit was first discovered in the wild by security researcher Eric Romang and was made public through security firm Rapid7's Metasploit Project. Metasploit is advising users to dump IE until Microsoft issues a security update. According to Metasploit, the new IE security flaw was developed by the same group that created the recent Java zero-day flaw.

Microsoft's Internet Explorer accounts for 48.75 percent of active Web browsers worldwide, according to Net Market Share.


The Exploit

Microsoft said the exploit makes it possible for a hacker to take advantage of corrupted memory in your system and execute malicious code on your PC. The end result is that, if attacked, a hacker would have the same control over your PC that you do. So if you log in as an administrative user, as many Windows users do, the hacker would be able to do everything you can, including installing or removing programs; viewing, changing, or deleting files; and even creating new user accounts with full administrative rights.

How It Could Happen

For most home users, the exploit would require you to visit a malicious Website where the attack could be carried out. The attack is also possible via compromised sites that may have malicious advertisements on them or host user-provided content. The most likely scenario for getting hit with this exploit appears to be phishing attempts where a hacker attempts to trick you into visiting a malicious site.

What Microsoft Advises

While Microsoft works on a patch for the new IE exploit, the software maker is advising users to employ a multi-step workaround that includes downloading and installing a security toolkit and setting your Internet security zone (via Tools > Internet Options > Security) to “High.” The company is also advising you to configure Internet Explorer to either disable Active Scripting or prompt you before running any script. You can find more details in Microsoft's security advisory.
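
For admins who need to apply the zone change across many machines, the same setting lives in the per-user registry. The Python sketch below is a hedged illustration, not Microsoft's documented procedure: it assumes the standard zone-key layout, where zone 3 is the Internet zone and 0x12000 is the template value for the “High” level. Verify those assumptions against Microsoft's advisory before relying on anything like it.

    # Hedged sketch (Windows only): set the IE Internet zone (zone 3) to
    # "High" for the current user. The key path and the 0x12000 level are
    # assumptions based on the standard zone registry layout; verify first.
    import winreg

    ZONE3 = r"Software\Microsoft\Windows\CurrentVersion\Internet Settings\Zones\3"
    HIGH = 0x12000  # assumed template value for the "High" security level

    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, ZONE3, 0, winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "CurrentLevel", 0, winreg.REG_DWORD, HIGH)

Note that this covers only the zone setting; the toolkit and the Active Scripting change still have to be applied per Microsoft's instructions.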

Think About Switching, For Now

Employing this workaround will make the flaw much harder to exploit, but it won't eliminate the problem entirely. That's a lot of hassle to go through just to mitigate, but not eliminate, a serious security flaw, which is why it may be more advisable to simply dump IE until the problem is fixed.
Popular alternatives to Internet Explorer include Google's Chrome browser, Mozilla Firefox, and Opera.

By Ian Paul Sep 18, 2012 6:36 AM
http://www.pcworld.com/article/2010031/dump-internet-explorer-until-microsoft-issues-patch-security-experts-warn.html

Thursday, September 13, 2012

Nintendo Wii U

The Nintendo Wii U will launch in the US on Sunday, November 18, Nintendo of America president and CEO Reggie Fils-Aime revealed at a New York City press conference this morning. It'll arrive in Europe on November 30, though no price was given for our friends abroad. Like its Japanese release, the console comes in two varieties: a base-level bundle in white with just 8GB of internal memory for $300 and a premium bundle in black with 32GB of internal memory for $350.

Each version contains the console itself, a Wii U GamePad, a charging stand, a play stand, and a stand for the console. The premium version, however, adds a subscription to Nintendo's Premium Network, which offers various rewards for digital purchases -- it also gets a full 32GB of internal memory, a major step up from the 8GB model. It's hard to imagine either having enough internal storage compared with current-gen consoles, but the Wii U's memory is expandable via USB.

Fils-Aime also said that Wii remotes are getting rebranded for the Wii U, and will be available in retail shops shortly.


Wednesday, September 12, 2012

GoDaddy offers users one month credit following service outage

GoDaddy customers are being given an apology and one month of free service after grappling with Monday's service snafu.

In an e-mail sent to GoDaddy users, the company's CEO Scott Wagner apologized for the outage that affected Web sites, e-mail availability, and other services.

Go Daddy"We let you down and we know it," the e-mail read. "We take our responsibilities -- and the trust you place in us -- very seriously. I cannot express how sorry I am to those of you who were inconvenienced."

To appease its customers, GoDaddy is kicking in a credit good for one month of service for all active and published sites. Customers can click on a link in the e-mail to redeem the credit but must take advantage of the offer within the next seven days.

GoDaddy is one of the biggest Web site hosting companies and also one of the largest domain registrars. So Monday's outage could have affected thousands, if not millions, of sites.

A hacker with the Twitter name "Anonymous Own3r" took credit for the outage. But GoDaddy attributed the cause to an internal network problem that corrupted the data tables used by its routers. After finding the issue, the company was able to get its service back up. GoDaddy has assured customers that no credit card data, names, addresses, or passwords were compromised.

The company also promises to learn from its mistakes.

"Throughout our history, we have provided 99.999% uptime in our DNS infrastructure," the e-mail added. "This is the level of performance we expect from ourselves. Monday, we fell short of these expectations. We have learned from this event and will use it to drive improvement in our services."


September 12, 2012 8:29 AM PDT
http://news.cnet.com/8301-1023_3-57511288-93/godaddy-offers-users-one-month-credit-following-service-outage/

Monday, September 10, 2012

William Moggridge

The next time you hinge open your notebook PC and smile at a feature that makes it easier to use, give a thought to Bill Moggridge, who passed away Saturday from cancer at the age of 69. The pioneering designer invented the clamshell form factor seen in virtually every laptop since, and is also regarded as a father of interaction design.

The Compass computer he designed for GRiD Systems, with its screen folding over the keyboard, appeared in 1981, flew on the Space Shuttle, and inspired virtually every notebook design since. Perhaps more importantly, when he tried to use the machine himself, Moggridge was exasperated by its difficulty and decided to take the human factor into account in software design. To that end, he engaged experts from fields like graphic design and psychology, and tried to "build empathy for the consumer into the product," according to his former partner, Professor David Kelley. The pair merged their design firms to form Ideo in 1991, and worked with clients like Apple, Microsoft and Procter & Gamble, designing products like the first Macintosh mouse and the Palm V handheld along the way.

In 2010, Moggridge became director of the Smithsonian's Cooper-Hewitt National Design Museum in New York, and was a recipient of that institution's lifetime achievement award. He also won the Prince Philip Designers Prize, the longest-running award of its type in the UK, given for "a design career which has upheld the highest standards and broken new ground." See why that's true in Cooper-Hewitt's tribute video.


http://www.engadget.com/2012/09/10/william-moggridge-portable-computer-pioneer-dies/

Wednesday, September 5, 2012

A Murder Is Announced

Well, it pains Linux Girl to have to write these words, but it looks like the "Death of Desktop Linux" story is back for another round.
Yes, after countless debates and discussions of the topic ad nauseam over the years -- the most recent being just a few short months ago, in fact -- it recently reared its ugly head again, like a zombie that just won't quit.
The culprit this time? None other than Miguel de Icaza, of GNOME and Mono fame.
The claim? Essentially, that Apple killed the Linux desktop.
Only problem is, FOSS fans can't seem to find any evidence that the crime ever happened.

'Then OS X Is on Life Support'

"Another one of these? Please," exclaimed Google+ blogger Linux Rants. "Now Apple killed the Linux desktop? No. I'm afraid not."
In fact, "the Mac OS in one form or another has been around since 1984, and in that time has managed to gain 6 to 7 percent market share," Linux Rants pointed out. "Linux has been around since 1991, and has managed to gain at least 1 to 2 percent market share. Probably more. Possibly much more, depending on who you ask.
"If desktop Linux is dead -- which I feel wholeheartedly that it is not -- then OSX is on Life Support and it's not looking good," he asserted.
The reality is that "this is a very exciting time for desktop Linux, with Windows 8 threatening to popularize it like we've never seen before, and gaming companies committing to supporting it in unprecedented numbers," Linux Rants noted.
So "no, desktop Linux is not dead," he concluded. "It's had some difficulty gaining traction because it was a decade late to the Operating System market. Despite that, once it gets going it will be impossible to stop."

'It Seems to Be Working for Me'

Indeed, "if the Linux desktop is dead, why am I using it now?" asked Google+ blogger Kevin O'Brien. "It seems to be working for me as well as anything."
The real question, O'Brien suggested, "is what you want to accomplish. If it is total domination, with Linux having 100 percent of the desktop market, not only will that not happen, I wouldn't want it to happen.
"Monoculture never works well," he added. "So, I think de Icaza identifies some problems with development in Linux, but there's problems in everything."

'Killed? No Way.'

Blogger Robert Pogson took a similar view.
"Apple killed nothing," Pogson told Linux Girl.
Rather, "Apple's fanbois just wish they had 1K+ retail stores pushing product in China and India like Canonical has Dell doing," he explained. "They wish they were shipping more than 20 million PCs -- GNU/Linux will ship on that many PCs with Ubuntu next year. That leaves hundreds of other distros being installed by individuals and organizations on a global scale.
"Walmart Brazil barely sells any Apple products," Pogson added. "GNU/Linux and that other OS top them in popularity."
In short, "killed? No way," he concluded.

'We Have an Opportunity'

"I don't think Apple killed anything," consultant and Slashdot blogger Gerhard Mack agreed. "'Killed' implies a permanent state, and I don't think it's actually permanent -- I'm seeing more interest from my non-techie friends, and announcements such as the porting of Steam to Linux give me hope for the future."
De Icaza "is correct that the constant breakage caused by people completely rearranging interfaces and breaking apps on a constant basis set the Linux desktop back by years," Mack conceded. However, "he is completely out of line for blaming Linus for it."
Looking ahead, meanwhile, "the sad thing is that we have an opportunity to take market share, since Microsoft seems to be going out of their way to get rid of their entire userbase with Windows 8, but I don't think we will have a non-Gnome 3/Unity distro ready in time to take advantage," he concluded.

'It's the Devs'

Slashdot blogger hairyfeet took an even stronger view.
"It's the devs," hairyfeet charged. "The devs can't stand bug fixing and instead would rather write something 'New!' even if it breaks compatibility, makes third party support impossible, and makes Linux drivers practically impossible to keep 100 percent functional past a single update."
Meanwhile, "you have Apple giving you 5 years of support, making sure their ABI doesn't break software so companies like AutoCAD and Photoshop can actually support them, in short they make it NICE for the user, what a concept!" he asserted. "And you still have the BSD underpinnings, so the old-school Unix heads can have their CLI and have a functional system too!"

'Dead on Arrival'

In fact, "the desktop distribution Linux community really has no concern as to whether it gets widespread adoption," opined Robin Lim, a lawyer and blogger on Mobile Raptor.
"In the past few months, maybe out of frustration, I have gone the same route," Lim explained. "I love my Linux distro, I use it, I benefit from it, but I do not bother to promote it with anyone anymore. This was some time after I got into a 'discussion' in a Linux forum about the issue of the need for change for widespread adoption -- the overwhelming response was, 'who cares?'"
So, "how can it win, when it is not even trying to fight?" Lim concluded. "Excellent article by Mr. Miguel de Icaza. But he is wrong about his conclusion: Mac OS did not kill Linux; Linux on the desktop was dead on arrival. His own article explains why."

'They Just Want Their Problems Solved'

Linux on the desktop has had "a number of important successes, but these are still very much niche cases," noted Chris Travers, a Slashdot blogger who works on the LedgerSMB project.
Breaking into the mainstream, however, "has not happened and it isn't about to happen," Travers opined. "Linux makes a great desktop tailored at each and every user, but nobody has really figured out how to make users see why they should consider a switch."
De Icaza's article focuses primarily on technical problems with the attempts thus far to bring Linux to the desktop, but "in the end this doesn't matter if you can't convince users to switch, and you can't do this by merely building a great desktop environment," he said. "It doesn't matter how great your desktop is, you have to find some way to sell the move to users, because moving operating systems is always a certain amount of trouble.
"If you don't market it," in other words, "you won't sell it," he added.
"People don't care what is technically best," Travers concluded. "They just want their problems solved."

By Katherine Noyes
http://www.technewsworld.com/story/A-Murder-Is-Announced-but-No-Corpse-Can-Be-Found-76067.html

Tuesday, September 4, 2012

Bad Piggies


Bad Piggies, the alternate universe's answer to Angry Birds, lands September 27th

Imagine a world where everyone's evil and wears a goatee, while our avian allies from Angry Birds are actually the villains of the piece. That's the premise behind Bad Piggies, Rovio's newest productivity killer, which promises entirely new game mechanics (and no slingshots!). It'll arrive on iOS, Android and OS X on September 27th, with Windows and Windows Phone 8 versions following shortly afterward.

Press Release

Rovio to launch Bad Piggies on September 27!
Espoo, Finland - September 4, 2012 - Rovio Entertainment, the creators of Angry Birds, today announced their newest game, Bad Piggies, launching September 27 for iOS, Android and Mac. This innovative game turns the franchise on its head by letting fans play as the pigs – with all-new, never-before-seen gameplay – and not a slingshot in sight!
"There's a lot of empathy towards the lovable enemies from the Angry Birds games, and we've been constantly asked: what about the pigs' side of the story?" said Mikael Hed, CEO of Rovio. "Bad Piggies gives you the chance to play as the second-most-loved characters in the Angry Birds universe, and explore this rich world through their green eyes."
The new game will launch on iOS, Android and Mac on September 27. Windows Phone, Windows 8 and PC versions will follow shortly.
"We've had a lot of fun creating a totally new and unique gameplay experience," said Petri Järvilehto, EVP Games at Rovio. "There's so much more to these pigs than what is seen in the Angry Birds games, and Bad Piggies is the first glimpse into what's going on in the imaginative and ingenious minds of the pigs."

Posted Sep 4th, 2012, 2:02PM
http://www.engadget.com/2012/09/04/bad-piggies/