Booker Prize winner features in murky Mega piracy row

When Cameron Slater, a colourful New Zealand blogger who writes online as Whale Oil, accused Kim Dotcom’s new online privacy company of facilitating the pirating of the Booker Prize-winning novel The Luminaries last week, people were quick to pile on and condemn Dotcom’s new website.

But perhaps they were too quick.

The accuser has been fisked by an online posse who have questioned and analysed the evidence and found it wanting. But while Whale Oil may have been forced onto the back flipper, the truth remains elusive.

Whale Oil said it is probably impossible to control piracy these days.

“But to have Kim Dotcom, a cheerleader for New Zealand and his ‘New Zealand’ company Mega essentially provide a platform for people to distribute a fellow Kiwi’s hard work – and a Booker Prize winning one at that, well… it’s not on.

“It is a clear example of why he can’t just sit there and claim no responsibility for any of it. His Mega service clearly aids and abets the stealing of copyrighted works, thereby denying the artists and publishers of their well earned income.”

The New Zealand Publishers’ Association was quick to weigh in, saying it was disappointing that a New Zealand-registered company was involved.

“Everyone is rightly proud of the achievements of Eleanor Catton on the world stage so to see her work given away without her consent by a fellow Kiwi company is really appalling,” Publishers Association of New Zealand president Sam Elworthy said in a statement.

“We should be doing all we can to support the good work of not only these two artists but also every New Zealander who makes an honest living from his or her creative works.

“Mega should do more to ensure this kind of thing does not occur.”

Catton’s publisher Fergus Barrowman, from Victoria University Press, expressed his disappointment as well. And the media followed.

Then another blogger, Public Address’s Russell Brown, chimed in, noting some oddities in the story so far.

According to Mega CEO Vikram Kumar, he said, the infringing e-book file had only been accessed by the same customer who uploaded it. That customer, Kumar said, registered with a throwaway email address.

“Kumar says Mega was able to identify the file URL from the Whale Oil blog post, even though it was partially obscured. But where did Whaleoil get it from?” Brown asked.

“As I understand the Mega service, it’s not possible to discover such a URL unless you have created it – that is, if you are the uploader – or it has been advertised to you by the uploader, who wishes to share the file.

“So there’s the two ways that Slater could have got the link on which he based his story.”

That’s where the crowd picked the matter up, in 174 comments at the time of writing.

Kumar explained how Mega had been able to locate the file and the account, given Mega’s encryption and its claims to privacy.

In another comment, Kumar wrote that Victoria University Press’s release was based on Whale Oil’s blog post. It appeared, he said, that VUP did not download the file or independently confirm the alleged copyright infringement.

VUP publisher Barrowman responded:

“In the interest of transparency, the press release was ‘Issued for Publishers Association of NZ by Pead PR’, and was different from the draft I approved,” he said. “And no, we didn’t attempt to download the files or confirm the alleged infringement.”

Meanwhile, Whale Oil denied any suggestion he had uploaded the file.

“I’ll repeat again Russell [Brown] for your benefit since you seem slow on the uptake. I [didn’t] upload the book to Mega.

“And again for your benefit and for all the enablers of Dotcom…I will not be revealing my source or details of how I came by the story.”

So who to believe, Whale Oil or Kumar?

Mega’s Bram van der Kolk posted on Twitter that although Slater was talking about protecting his “source” for the story, his original post claimed that he found the infringing file by dint of just “a bit of poking around on the internet”.

Brown said the two positions didn’t tally.

Another commenter said you have to poke pretty deep on the internet to find any pirate links to Mega.

“It really doesn’t seem to be a popular service for such activities.”

Kumar said the stats speak for themselves: 435 million files uploaded to Mega in nine months, of which 0.05% received a notice of alleged copyright infringement.
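Kumar’s percentage translates into a concrete file count with one line of arithmetic; the figures are from his statement, while the calculation below is my own back-of-the-envelope check:

```python
# Back-of-the-envelope check of the Mega takedown statistics
# quoted by Kumar (435 million uploads, 0.05% noticed).
files_uploaded = 435_000_000
notice_rate = 0.05 / 100  # 0.05% expressed as a fraction

files_with_notices = files_uploaded * notice_rate
print(f"{files_with_notices:,.0f} files received an infringement notice")
# roughly 217,500 out of 435 million
```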

But Copyright Licensing NZ has taken issue with Mega on another front, claiming it found an educational text on the site. It said it had issued takedown notices.

And that’s the rub. With safe harbour provisions that protect providers of online services from responsibility for their users’ infringements, the issue really comes down to how Mega responds to such takedown notices.

Kumar said it responds promptly.

Meanwhile, Mega and Dotcom, still fighting extradition by US authorities, appear to be embedding themselves as part of the fabric of New Zealand’s online world.

Dotcom has teamed with ISP Orcon to front a new campaign for better internet access. Yesterday he also inked a deal with another ISP, Slingshot, to provide services to its customers.


Microsoft may end antivirus updates on XP in April


Just days after sending a clear message about the trouble awaiting Windows XP users next April, when Microsoft ends security updates for the operating system (and for Office 2003), the company is saying that it may also stop delivering antivirus signature updates for Microsoft Security Essentials, its free antimalware product.

A spokesperson issued the following statement:

Microsoft will not guarantee updates of our antimalware signature and engine after the XP end of support date of April 8, 2014. Running antivirus on out of support operating systems is not an adequate solution to help protect against threats. Running a well-protected solution starts with using modern software and hardware designed to help protect against today’s threat landscape.  In addition, Microsoft recommends best practices to protect your PC such as:  1) running up to date antivirus, 2) regularly applying security updates for all software installed, and 3) using modern software that has advanced security technologies and is supported with regular security updates.

Microsoft’s message about the advantages of consistently updated software is well-covered ground. By continuing antivirus updates, it would just be enabling behavior it has done its best to discourage. Users who insist on continuing to run Windows XP can shift to one of the other free products (such as Panda, AVG and Avira).


Surface 2: Thinking different about my device experience

I admit it. I was one of those crazies who stood in line to buy a first-generation Surface RT in the first hour it was available. And I did so without having had a chance to test drive a device for more than a few minutes beforehand.


This time around (given that I was moved from the “banned” to the “approved” Surface tester list), I decided to hold off on deciding whether to buy the Surface 2 until I had a chance to test Microsoft’s second version of its ARM tablet.

After using a Surface 2 tablet loaned to me by Microsoft for the past week, I’ve come to realize that many of the features I like about it have nothing to do with the new core device itself.

Yes, the new Tegra 4 ARM chip, a step up from the current Surface RT’s Tegra 3 core, allows apps and Web sites to open more quickly. And the new higher-resolution screen makes colors really pop. I like the new dual-position kickstand better than the single-position one on the Surface RT. Like Peter Bright at Ars Technica, I still wouldn’t call the new Surfaces truly “lapable,” as they are still less stable on my lap than any laptop I’ve ever used.

The Surface 2’s magnesium-colored body shows fingerprints less than the original black. And the new “ring of light” power cord is easier to connect correctly than the original Surface power cord. (Microsoft is bundling this new cord with new Surface 2s, but not with Surface Pro 2s — at least for now. So be warned.)

But, in the end, what I really like most about the Surface 2 is the new, backlit Type keyboard. In fact, I am pretty sure I am going to spring for a replacement for my existing Type Cover, at $129.99. The new Type 2 keyboard/cover makes typing on ARM-based Surfaces even nicer/easier than the original Type keyboard. Though I also received a Touch 2 keyboard for review, I didn’t put it through its paces. I want and need something that allows me to type accurately and at full speed, and to me, the Touch covers are more novelty than useful peripherals. (Your mileage may vary.)

After a week with the Surface 2, I’ll say I also like Windows RT 8.1. I’m reserving judgment about IE11.

I hesitate about the browser here because my experience with IE11 for the majority of the past week was non-optimal. And as ARM-based Surface users know, this is a problem, since our only browser choice on these devices, due to their locked-down nature, is IE.

IE11 for me has been, in Microsoft parlance, “non-performant.” It’s been crashing, hanging and randomly restarting for most of the past seven days I’ve used it. There seemed little rhyme or reason as to which sites or circumstances were causing problems. The one site I use frequently that’s bombed on IE11 the most for me has been the Web version of Twitter’s official client. (This was true in both the Windows RT 8.1 preview and the RTM version.)

I’ve received a lot of reader and colleague advice about IE11 over the past few days. I’ve gotten suggestions about disabling syncing of tabs. And I’ve learned when all else fails, the best way to try to “fix” the Metro-Style version of IE is to open the Desktop complement, go to Internet Options and do a reset. (Thanks to ZDNet’s Ed Bott, for that one.)

The past day or so, however, I’ve noticed improvements to my IE11 browsing experience. I’ve been applying nearly daily the various updates (this IE11 reliability update, among them) that Microsoft has been pushing to us Windows 8.1 users. So maybe something has finally helped.

The other piece of my Surface 2 experience which has improved noticeably in the past week is around battery life. When I received my Surface 2 tablet loaner a week ago, it came in a cloth bag inside of bubble wrap in a delivery box. (Microsoft provided some reviewers with Surface 2 and Surface Pro 2 devices in retail packaging; others of us got just the Surface 2 in a cloth bag.) Somehow, the loaner device I received wasn’t powered completely down before shipping, so when I received it, the outer box was actually warm to the touch. When I opened the box, the device was hot. Not warm. Hot.

After letting the device cool for about an hour, I plugged it in and repowered. On that first charge, the battery lasted six hours, max. I know some other reviewers claimed to have seen battery life ranging from 14 hours to 10 hours to 6 hours. I was definitely at the low end of this range and right around where I’d been with my Surface RT prior to upgrading to Windows RT 8.1. (Anyone with excessive battery drain who upgraded their first-generation Surface RT to Windows 8.1 may want to apply this quietly-released Microsoft battery-life update and see if that helps.)

After powering up the Surface 2 loaner for a second time, I got better battery life out of it. I have a theory that I can’t prove: I think those touting Surface 2 battery times above Microsoft’s own claims of 10 hours (for video playback specifically, if you look at Microsoft’s fine print) are using these devices more or less continuously. When I’ve turned the device back on after multiple hours of non-use, my battery had drained considerably. Is this a Connected Standby issue? I am not sure.

As I am using the ARM-based version of Surface, I can’t make use of the Surface docking station (for sale in limited quantities in the U.S. and coming to the rest of the world by early next year). A $200 Power Cover, which adds several extra hours of battery life to Microsoft’s Surface 2, Surface Pro and Surface Pro 2 devices, won’t be available for purchase until early 2014.

I would be interested in purchasing an Ethernet adapter for a Surface device, and supposedly support for these kinds of adapters is on its way. Earlier this week, Microsoft updated its Surface Support page to indicate that users running a Windows 8.1 or Windows RT 8.1 device could now use Ethernet adapters with them. Microsoft has subsequently removed the Ethernet-support wording from its Support site, but workarounds are reportedly allowing some users to connect Ethernet adapters to their Surface RTs and Surface 2s nonetheless. I’ve heard Microsoft does plan to support Ethernet adapters with its ARM-based Surfaces at some point.

So am I all in?

There’s a lot to like about the Surface 2. But price isn’t one of those things. A new 32 GB Surface 2, without a cover, costs $449. It’s difficult to get any significant trade-in value for a current Surface RT from Microsoft’s own stores or its U.S. retail partners like Best Buy or Staples. (Of these three, Best Buy is probably the most generous. If I mail in my Surface RT, plus power cord, I might be able to get a store credit of $150 for a Surface RT in very good condition.)


Are a nicer finish, kickstand and somewhat better CPU worth a few hundred bucks to me? Given that I use my Surface RT as a companion device to my primary work PCs — for browsing, checking mail, and some light writing/editing — probably not. I’m leaning toward keeping my current generation Surface RT, but updating it with a new Type 2 keyboard.

My biggest takeaway, after a week of use of the new Surface 2, is that we need to rethink how we think about mobile devices.

Microsoft is building and deploying Windows and IE a lot differently than it used to, even as recently as with Windows 7. RTM doesn’t mean it’s done and won’t be updated for a year or so. Microsoft already has pushed a bunch of updates for Windows 8.1 and IE 11 since the official RTM in late August — and even since general availability on October 22. This concept takes some getting used to for those of us who grew up expecting RTM to mean a new Windows variant was well-tested and stable enough for everyday use for months if not a year until the next update from Microsoft.

The breakneck release pace of new devices from Microsoft, its OEM partners and its competitors also means users have to just take a leap into the new product stream at some point, knowing there could be something better/faster and maybe cheaper literally just around the corner.

Nokia announced its own ARM-based tablet a week ago, which is supposedly due to ship in mid-November. Microsoft is rumored to be releasing its first “Surface Mini” 8-inch tablet in the spring of 2014. It’s also readying an LTE-enabled, ARM-based Surface tablet for spring 2014. (There’s no word on whether the Mini is the LTE Surface, or if these are two different devices.) And for the vendor-agnostic, shiny new iPads are going on sale tomorrow. It’s always been the case that buying a new gadget may mean immediately having to say you’re sorry. But these days, that’s more of a risk than ever.

Who else out there has been kicking the new Surface tires? Do the new devices offer enough of an incentive for you to jump on the bandwagon? If not, why not?



Who offers Italy’s best mobile broadband? The results are in

In their quest to find the right mobile broadband supplier, Italians can now count on extra help: the measurements provided by AgCom, the local telecoms regulator.

On Wednesday, the watchdog published the findings of its first nationwide tests of mobile connectivity quality for the four main operators in Italy.

By breaking down the numbers for each of Italy’s 20 regions, AgCom’s scores offer consumers some indication of which brand delivers the best connection in the different zones of the Boot, showing how much variation in performance there can be from one operator to another in the various areas.

If you live in Genova, in the north west of Italy, for instance, you might be tempted to go for Tre (Three’s Italian arm). According to AgCom, the H3G-controlled operator will give you an average download speed of 9.3Mbps, compared to 5.7Mbps with Telecom Italia, 7.0Mbps with Vodafone and 6.0Mbps with Wind.

But if you plan to connect to the internet in the southern city of Bari and value your download speeds, then you might be inclined to opt for Telecom Italia or Vodafone. There, their average speeds, as calculated by AgCom, are respectively 7.2Mbps and 7.0Mbps, while Tre (5.6Mbps) and Wind (4.4Mbps) lag behind.

The watchdog’s analysis found that in some places, like Trento in the north east, there doesn’t seem to be much of a contest: when it comes to who’s the fastest, Telecom Italia (11.1Mbps) offers almost double the speeds of the rest of the pack. It’s the same in Potenza, in the south, with Vodafone (6.8Mbps) clearly standing out compared to its rivals. However in other areas, like Milan in the north or Ancona in the centre, the race is more closely run.

Download speed is not the only variable taken into account by AgCom’s measurements: the watchdog has also studied upload speed (if you’re in Rome and eager to share content, giving Telecom Italia a try is probably wise), round-trip time and jitter.

Looking at the country as a whole, AgCom’s data shows Vodafone outperforming rivals when it comes to download speeds, while Telecom Italia tops the rest for uploads. Wind seems to come a distant fourth in both categories.

AgCom’s analysis was carried out using USB dongles provided by the carriers on the same plans commercially available to consumers. The goal of the tests – according to AgCom, which will perform these tests twice a year – is to evaluate the performance of the carriers using the best technology they currently provide on the market.

“We want to give the consumers the opportunity to compare various offers both in terms of price and quality,” Sergio Del Grosso, head of AgCom’s QoS Office, told ZDNet. “Plus, the mobile carriers can get insights on where to make their network better, which should benefit the market overall.”

In the future, the Italian telecoms authority hopes to start testing phones as well. “We are planning to compare the carriers directly on smartphones, so users can see how the mobile providers perform on the same top-of-the-range device,” Del Grosso added.

Salesforce.com adds private storefront option to AppExchange


SAN FRANCISCO—Salesforce.com’s annual (and ever-growing) Dreamforce expo is still a few weeks away, but the CRM giant is pulling back the curtain on some of its new developments already.

In an unveiling to the local tech media on Wednesday, the San Francisco-headquartered corporation said it is extending its cloud app store with the launch of the Private AppExchange.

Essentially, the idea is to satisfy employee demands for the latest web and mobile apps within the boundaries of each IT department’s rules and regulations.

“The employee wants whatever app they need to get their job done,” remarked Leyla Seka, vice president of the AppExchange unit as well as partner operations at Salesforce, specifying they don’t want to wait for tickets and approval from IT.

The private version of the AppExchange is supposed to fulfill those wishes, she continued, noting that IT managers “can go crazy customizing it or make it very mellow, being able to customize everything from logos to banners to app categories.”

Salesforce is also attaching its Chatter social feed to each app listed on the Private AppExchange so that employees can have a conversation around the app, bringing the “water cooler” discussion to each listing, as Seka described it.

Apps uploaded to the Private AppExchange can be existing AppExchange apps, new apps added specifically to the private portal, and apps made on platforms other than Salesforce.

Salesforce originally launched the AppExchange seven years ago. In the first five years, the platform grew to one million installs for roughly 1,000 apps, according to Ron Huddleston, senior vice president of Global ISV and channel alliances at Salesforce.

To stress how strong the momentum for the hub has been since then, Huddleston cited that the figure has accelerated to two million installs for 2,000 apps in the last two years alone.

“The momentum is driven by our platform value proposition,” Huddleston asserted, encouraging businesses to develop directly on the Salesforce platform. Salesforce is known for offering plenty of third-party support for products across the portfolio, but there are plenty of incentives in place to keep as many customers as possible in-house. 

Seka noted that there are roughly three million custom apps built on the Salesforce platform.

The total application count is up 27 percent year-over-year, growing at an average rate of 30 percent per year. Installs are also growing at a rate of roughly 33 percent annually.

“When people think about the AppExchange, people think about front-office applications,” Huddleston said, pointing out that there are also a lot of back-office, horizontal apps. He added that it is this side of the business that is growing quickly, but one that people don’t generally recognize.

“Right now people are realizing that the enterprise space is where the money is,” Huddleston argued. “We did some simple math. You can do this math yourself.”

He continued that on the Apple App Store, developers see an average of $7,000 per app in returns, adding that the average is $400,000 for apps listed on the AppExchange.
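Huddleston’s “simple math” is indeed easy to reproduce; the two per-app averages are the figures he quoted, and the ratio is the only derived number:

```python
# Reproducing Huddleston's "simple math": average per-app revenue
# on the Apple App Store versus the AppExchange, per his quoted figures.
apple_avg_per_app = 7_000
appexchange_avg_per_app = 400_000

ratio = appexchange_avg_per_app / apple_avg_per_app
print(f"AppExchange average is about {ratio:.0f}x the App Store average")
# about 57x
```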

The Private AppExchange will go online on Friday, November 1. 




FDIC closes bank in Florida bringing total number of failed banks in 2013 to 23


The Bank of Jackson County, located in Graceville, FL, was closed down by the FDIC on Wednesday, Oct. 30. This bank failure is the first in the month of October, and brings the total number of failed banks in 2013 to 23.


10/30/2013 *** FL *** Graceville *** Bank of Jackson County *** $5.1 million estimated FDIC DIF cost.

The total DIF for failed banks this week is $5.1 million.

If you were banking at the Bank of Jackson County in Graceville, FL, you are now banking at the First Federal Bank of Florida.

For more on the FDIC bank closure lists, you can go to the FDIC website and search through its report of failed banks, credit unions, and trusts.

In 2012, there were a total of 51 banks that went into receivership, merged with other financial institutions, or closed their doors entirely.

Entering 2013, there were nearly 1,000 banks and other financial organizations on the troubled list due to mortgages, derivatives, and bad investments.

At the current rate of banks and financial institutions closing their doors so far this year, the estimated total number of failures for 2013 could reach 27.
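The article’s year-end estimate is consistent with a simple run-rate extrapolation. The failure count and closure date come from the article; the extrapolation method itself is my assumption about how such a figure would be derived:

```python
# Run-rate extrapolation of 2013 bank failures from the article's
# figures: 23 failures as of the Oct. 30 closure.
from datetime import date

failures_so_far = 23
as_of = date(2013, 10, 31)  # roughly end of October
days_elapsed = (as_of - date(2013, 1, 1)).days + 1

projected_full_year = failures_so_far * 365 / days_elapsed
print(f"Projected 2013 failures: {projected_full_year:.1f}")
# about 27.6, in line with the article's estimate of 27
```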

Kenneth Schortgen Jr is a writer for, and hosts the popular web blog, The Daily Economist. Ken can also be heard Friday evenings giving a weekly economic report on the Angel Clark radio show.




Rare Hybrid Solar Eclipse To Be Visible This Sunday 11/3


The moon will blot out the sun Sunday in an eclipse that will be visible from eastern North America to the Middle East.


Sunday’s celestial event is a relatively rare occurrence known as a hybrid solar eclipse. It will begin as an annular or “ring of fire” eclipse along the path of totality, then shift to a total eclipse as the moon’s shadow sweeps across our planet.


What you’ll observe depends on where you live. Skywatchers in the eastern United States, northeastern South America, southern Europe, the Middle East and most of Africa will be treated to a partial solar eclipse, while people along the path of totality in central Africa will see the sun totally obscured by Earth’s nearest neighbor for a few dramatic moments.


If you live in eastern North America, you’ll have to get up early to enjoy the show. The partial eclipse will be visible at sunrise — about 6:30 a.m. local time — and last for about 45 minutes, experts say. Viewers in Boston and New York will see the sun more than 50 percent covered by the moon, while our star will appear 47 percent obscured from Miami and Washington, D.C. [Solar Eclipses: An Observer’s Guide (Infographic)]



All of the action in this part of the world will be occurring low in the sky, less than 8 degrees from the east-southeast horizon. (Your fist held at arm’s length measures about 10 degrees.) So you’ll want to find a spot that affords a good look at the horizon, without any buildings or hills blocking the view.


The path of totality, meanwhile, starts in the Atlantic Ocean off the eastern U.S. and runs through Gabon, the Democratic Republic of Congo and several other African nations before petering out in southern Ethiopia and Somalia around sunset.


Warning: If you are planning to watch Sunday’s solar eclipse in person, be extremely careful. Never look directly at the sun, either with the naked eye or through telescopes or binoculars without proper filters. To safely view solar eclipses, you can buy special solar filters or No. 14 welder’s glass to wear over your eyes. Standard sunglasses will NOT provide sufficient protection.


You can also build a simple pinhole camera, or look at the shadows filtering onto the ground through the leaves on a tree. (The spaces between leaves often create many natural pinholes).






NSA spied on Pope under guise that the Vatican was a ‘threat to the financial system’


In the world of fictional Big Brother, or a lesser form of sociopathic tyranny like the United States has become, spying on others becomes an obsession for those in power, much more than simply a justified means of ensuring security.

Which is why it can no longer be considered shocking when new information reveals that the NSA’s spying arm has infiltrated even the Vatican, and has used its measure of ‘higher power’ to record communications from perhaps even the Pope himself.

All under the guise that the Catholic Church was a potential ‘threat to the financial system’.


According to a Reuters report, the “spy agency had eavesdropped on Vatican phone calls, possibly including when former Pope Benedict’s successor was under discussion, but the Holy See said it had no knowledge of any such activity. Panorama magazine said that among 46 million phone calls followed by the U.S. National Security Agency (NSA) in Italy from December 10, 2012, to January 8, 2013, were conversations in and out of the Vatican.” But while it is unclear just what divine information the NSA had hoped to uncover by spying on the Vatican, what is an absolute headbanger, is that according to Panorama one of the reasons for the illegal wiretaps was to be abreast of “threats to the financial system.” – Zerohedge

This new revelation comes at a time when the NSA and the Obama administration are under serious heat from Congress, the American people, and world leaders for using their wiretapping apparatus to collect data from nearly every individual on the planet.

The Vatican is a major player in the global financial system, but has little power over the affairs of the world’s central banks and sovereign economies. The Vatican Bank has a long history of neutrality when it comes to how it deals with individual, church, and sovereign funds, as a large portion of its wealth resides in real estate, not the derivatives and government bonds that make up the majority of assets holding the global financial system on the brink of collapse.

The ruse by the NSA that its spying efforts were directed towards preventing threats to the global financial system when it sought to wiretap the Vatican and the Pope is but a cover, perhaps masking the greater agenda of learning who might succeed Pope Benedict in this year’s change of papacy. Knowing the character and agenda of a man who would come to power, and have a modicum of control over 1.2 billion people throughout the world, is a far more important detail for the NSA and the Obama administration than whatever the Vatican Bank may be doing financially in its daily affairs.

Kenneth Schortgen Jr is a writer for, and hosts the popular web blog, The Daily Economist. Ken can also be heard Friday evenings giving a weekly economic report on the Angel Clark radio show.







500 people assembled on October 19th on Ocean Beach in San Francisco and formed letters with their bodies to demonstrate their growing concern about eventual fallout on the west coast.


Experts at Canada’s Vancouver Aquarium say they are puzzled by what is causing thousands of sunflower starfish, or sea stars, to die in the waters of Vancouver Harbor and Howe Sound.

What is even more startling is the way the creatures perish — by quickly dissolving in a phenomenon the aquarium has dubbed Sea Star Wasting Syndrome.


“They have disintegrated, and now there is just goo left,” says research diver and taxonomist Donna Gibbs.


Radiation Levels Will Concentrate in Pockets at Certain West Coast Locations

An ocean current called the North Pacific Gyre is bringing Japanese radiation to the West Coast of North America.


The leg of the Gyre closest to Japan – the Kuroshio current – begins right next to Fukushima.



While many people assume that the ocean will dilute the Fukushima radiation, a previously-secret 1955 U.S. government report concluded that the ocean may not adequately dilute radiation from nuclear accidents, and there could be “pockets” and “streams” of highly-concentrated radiation.


Physicians for Social Responsibility notes:

An interesting fact for people living on the US west coast is also included in the UNSCEAR [United Nations Scientific Committee on the Effects of Atomic Radiation] report: only about 5% of the directly discharged radiation was deposited within a radius of 80 km from the Fukushima Dai-ichi nuclear power station. The rest was distributed in the Pacific Ocean. 3-D simulations have been carried out for the Pacific basin, showing that within 5–6 years, the emissions would reach the North American coastline, with uncertain consequences for food safety and health of the local population.


The University of Hawaii’s International Pacific Research Center created a graphic showing the projected dispersion of debris from Japan.


Last year, scientists from the National Oceanic and Atmospheric Administration’s (NOAA) Pacific Marine Environmental Laboratory and 3 scientists from the GEOMAR Research Center for Marine Geosciences showed that radiation on the West Coast of North America could end up being 10 times higher than in Japan:


After 10 years the concentrations become nearly homogeneous over the whole Pacific, with higher values in the east, extending along the North American coast with a maximum (~1 × 10−4) off Baja California.


With caution given to the various idealizations (unknown actual oceanic state during release, unknown release area, no biological effects included, see section 3.4), the following conclusions may be drawn. (i) Dilution due to swift horizontal and vertical dispersion in the vicinity of the energetic Kuroshio regime leads to a rapid decrease of radioactivity levels during the first 2 years, with a decline of near-surface peak concentrations to values around 10 Bq m−3 (based on a total input of 10 PBq). The strong lateral dispersion, related to the vigorous eddy fields in the mid-latitude western Pacific, appears significantly under-estimated in the non-eddying (0.5°) model version. (ii) The subsequent pace of dilution is strongly reduced, owing to the eastward advection of the main tracer cloud towards the much less energetic areas of the central and eastern North Pacific. (iii) The magnitude of additional peak radioactivity should drop to values comparable to the pre-Fukushima levels after 6–9 years (i.e. total peak concentrations would then have declined below twice pre-Fukushima levels). (iv) By then the tracer cloud will span almost the entire North Pacific, with peak concentrations off the North American coast an order-of-magnitude higher than in the western Pacific.



(“Order-of-magnitude” is a scientific term which means 10 times higher. The “Western Pacific” means Japan’s East Coast.)


In May, a team of scientists from Spain, Australia and France published their own projection of how the radioactive cesium would disperse.

A team of top Chinese scientists has just published a study in the Science China Earth Sciences journal showing that the radioactive plume crosses the ocean in a nearly straight line toward North America, and that it appears to stay together with little dispersion:


On March 30, 2011, the Japan Central News Agency reported monitored radioactive pollution 4,000 times higher than the standard level. Whether or not these nuclear pollutants will be transported to the Pacific-neighboring countries through oceanic circulation has become a worldwide concern.


The time scale of the nuclear pollutants reaching the west coast of America is 3.2 years if it is estimated using the surface drifting buoys and 3.9 years if it is estimated using the nuclear pollutant particulate tracers.



The half-life of cesium-137 is so long that it produces more damage to humans. Figure 4 gives examples of the distribution of the impact strength of Cesium-137 at year 1.5 (panel (a)), year 3.5 (panel (b)), and year 4 (panel (c)).


It is worth noting that because the current near the shore cannot be well reconstructed by the global ocean reanalysis, some nuclear pollutant particulate tracers may come to rest in near-shore areas, which may result in additional uncertainty in the estimation of the impact strength.


Since the major transport mechanism of nuclear pollutants for the west coast of America is the Kuroshio-extension currents, after four years, the impact strength of Cesium-137 in the west coast area of America is as high as 4%.
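The excerpt’s point about cesium-137’s persistence follows from simple exponential decay. As a rough illustration (the half-life of roughly 30 years is general knowledge, not a figure from the article), the fraction remaining after a given time is 0.5 raised to the elapsed time divided by the half-life:

```python
# Fraction of cesium-137 remaining after a given number of years,
# assuming the commonly cited half-life of about 30.2 years
# (general knowledge, not a figure stated in the article).
HALF_LIFE_YEARS = 30.2

def fraction_remaining(years: float) -> float:
    """Exponential decay: 0.5 ** (elapsed time / half-life)."""
    return 0.5 ** (years / HALF_LIFE_YEARS)

for t in (4, 10, 30.2, 100):
    print(f"after {t:>5} years: {fraction_remaining(t):.1%} remains")
```

After the four-year transit time the study describes, over 90% of the original cesium is still radioactive, which is why dilution, not decay, is the only mitigating factor on those time scales.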


Bluefin tuna on the California shore tested positive for radiation from Fukushima, and there are reports of highly radioactive fish in Canada.


The CBS show The Doctors warned that we should be moderate with our fish intake, and that children and pregnant women should be especially careful.




Global Research





Connecticut couple charged after 9-year-old girl punished with dog shock collar



DANBURY, Conn. (AP) – Police say a Danbury couple faces charges after a 9-year-old girl in their care was punished with an electric shock collar used to discipline barking dogs.


Forty-three-year-old Eduardo Montanez is held on $250,000 bond on charges including third-degree assault, risk of injury to a child and cruelty. Thirty-four-year-old Paula Montanez is held on $200,000 bond on charges including risk of injury to a child, conspiracy to commit third-degree assault and conspiracy to commit cruelty.


WVIT-TV reports police said Eduardo Montanez allegedly forced the girl to bark so the collar would shock her, and that his wife didn’t stop him. Police said the couple was upset about the girl’s school progress.


They were arrested Tuesday after an investigation triggered when the girl told a teacher. They are due to appear Wednesday in Danbury Superior Court. It’s not immediately known who’s representing them.


