
Thursday, May 05, 2016

Racism and 'Racial Conflict' in America - The Atlantic

"When viewed through this lens, Obama’s election and reelection represented not a logical endpoint for racial conflict, nor even a sign that the occupation is over, but a promising proof-of-concept for discussing that occupation. Obama’s candidacy in 2008 was the beginning—not the end—of a new wave of black and brown organizing. It came just as the mirage of progress lifted and the centuries-old sinews of exploitation and exclusion were revealed again. The racist backlash and gatekeeping appeared first and most often as opposition to Obama, the most visible emblem of the organizing power of people of color. As Jamelle Bouie at Slate aptly notes, that racist reaction set off a chain of events that now sees Donald Trump––a candidate who peddles racism as a first order––as the presumptive Republican nominee and will likely see the party coalesce around him, #NeverTrump sentiments be damned.



Obama could never have been both the post-racial harmonizer and the racial equalizer that many analysts seem to have expected. “Racial conflict,” like more polite euphemisms, suggests a grand poetic struggle between groups and ideas— a negotiator like Obama should’ve been able to broker a peace. But in reality, addressing institutional racism tends to intensify societal racism; promoting interracial conciliation and promoting racial equality are often antithetical. This has always been the central issue at the core of “racial conflict” in America, from the backlash during Reconstruction to outrage about the Voting Rights Act. There is a much better word for what Obama has been confronted with, and it has always sufficed: racism."



Racism and 'Racial Conflict' in America - The Atlantic

Apple MacBook Air (13-Inch, Early 2015) - Full Review and Benchmarks

The Pros

Excellent battery life; Strong overall performance; Blazing flash storage; Comfortable keyboard



The Cons

Design could use an update; Lower-res display than competitors



Verdict

While we wish it had a better display, the 2015 MacBook Air is superfast and lasts longer on a charge than any other ultraportable today.





Apple MacBook Air (13-Inch, Early 2015) - Full Review and Benchmarks

HP Chromebook 13 hands on review | TechRadar




The Chromebook Pixel just met its first worthy rival that's actually another Chromebook, the HP Chromebook 13. Rather, I should say "frenemy", as HP and Google collaborated quite closely to bring forth one of the most pristine and productive Chromebooks to market.



What you see before you is a 13-inch, full-metal machine packing some serious power and versatility. Driven by Intel's Core m series of processors and not one, but two USB-C ports with Thunderbolt, the laptop can drive up to two Full HD (or one 4K) displays at once.



While Google has made huge headway in the past year or so in making Chromebooks close to 100% viable for the office, HP aims to usher in the sea change with one gorgeous vessel.



HP Chromebook 13 hands on review | TechRadar

HP Chromebook 13 Release Date, Price and Specs - CNET



HP Chromebook 13 Release Date, Price and Specs - CNET

Wednesday, May 04, 2016

NYTimes: Moore’s Law Running Out of Room, Tech Looks for a Successor

"SAN FRANCISCO — For decades, the computer industry has been guided by a faith that engineers would always find a way to make the components on computer chips smaller, faster and cheaper.

But a decision by a global alliance of chip makers to back away from reliance on Moore’s Law, a principle that has guided tech companies from the giant mainframes of the 1960s to today’s smartphones, shows that the industry may need to rethink the central tenet of Silicon Valley’s innovation ethos.

Chip scientists are nearly at the point where they are manipulating material as small as atoms. When they hit that mark within the next five years or so, they may bump into the boundaries of how tiny semiconductors can become. After that, they may have to look for alternatives to silicon, which is used to make computer chips, or new design ideas in order to make computers more powerful.

It is hard to overstate the importance of Moore’s Law to the entire world. Despite its official sound, it is not actually a scientific rule like Newton’s laws of motion. Instead, it describes the pace of change in a manufacturing process that has made computers exponentially more affordable.

In 1965, the Intel co-founder Gordon Moore first observed that the number of components that could be etched onto the surface of a silicon wafer was doubling at regular intervals and would do so for the foreseeable future.

When Dr. Moore made his observation, the densest memory chips stored only about 1,000 bits of information. Today’s densest memory chips have roughly 20 billion transistors. To put it another way, the iPad 2, which went on the market in 2011 for $400 and fits in your lap, had more computing power than the world’s most powerful supercomputer in the 1980s, a device called the Cray 2 that was about the size of an industrial washing machine and would cost more than $15 million today.

That iPad 2, mind you, is slow compared to newer models.

Without those remarkable improvements, today’s computer industry wouldn’t exist. The vast cloud-computing data centers run by companies like Google and Amazon would be impossibly expensive to build. There would be no smartphones with apps that allow you to order a ride home or get dinner delivered. And scientific breakthroughs like decoding the human genome or teaching machines to listen would not have happened.

Signaling their belief that the best way to forecast the future of computing needs to be changed, the Semiconductor Industry Associations of the United States, Europe, Japan, South Korea and Taiwan will make one final report based on a chip technology forecasting system called the International Technology Roadmap for Semiconductors.

Nearly every big chip maker, including Intel, IBM and Samsung, belongs to the organization, though Intel says it is not participating in the last report."

NYTimes: Moore’s Law Running Out of Room, Tech Looks for a Successor
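
As a rough back-of-the-envelope check of the doubling claim quoted above (my own sketch, not part of the Times piece), assume one doubling roughly every two years starting from the article's 1965 figure of about 1,000 components:

# Sketch only: the two-year doubling interval is an assumption, not stated in the excerpt.
start_year, end_year = 1965, 2015
start_components = 1_000                   # densest memory chips circa 1965, per the article
doublings = (end_year - start_year) // 2   # 25 doublings over 50 years
projected = start_components * 2 ** doublings
print(f"{doublings} doublings -> ~{projected:,} components")
# 25 doublings -> ~33,554,432,000, the same order of magnitude as the
# "roughly 20 billion transistors" figure quoted above.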

Sunday, May 01, 2016

Artificial Intelligence: Siri vs Alexa | The Verge



The Verge

"As 2015 drew to a close, you might be forgiven for thinking the encryption debate was all talk. There had been a lot of speeches and it was clear the FBI didn’t like Apple’s default encryption system — but what could they actually do about it? They had been leaning on Congress all year and getting nowhere.

Then, everything changed. On February 16th, the FBI took Apple to court over an iPhone used by one of the San Bernardino attackers, putting encryption at the center of the largest terrorism-linked shooting in the US in years. A similar phone-unlocking order was already being argued in New York, and the two cases plunged Apple into a legal crisis, as the company faced the possibility that a single ruling might undo years of security work.

Now, two months later, that fight is effectively over. The government backed out of the San Bernardino case on March 28th, after paying for a new method to break into the phone, and on Friday, the government pulled a similar move in New York. Late Friday night, investigators said they had discovered the passcode to the iPhone at the center of the New York case. It was an embarrassing retreat, announced at a time that would generate as little press coverage as possible, hastily closing an appeal that prosecutors had sworn to continue just two weeks earlier."

With its retreat in New York, the FBI has lost the encryption fight | The Verge

Scientists are working to make cows obsolete

"We depend on cows for food, clothing, and sometimes even insulin. Cattle, though, are expensive and inefficient—each cow drinks a bathtub of water and emits three times that volume of methane daily. There are also the ethics of animal slaughter. But we might no longer need the cow.

Scientists are synthesizing the substances we normally get from cows by using bovine cells, yeast, and even bacteria."

Scientists are working to make cows obsolete