frequency

New Apple iPad mini, 4th-generation iPad reach the FCC

Source: http://www.engadget.com/2012/10/23/new-apple-ipad-mini-4th-generation-ipad-reach-the-fcc/


Apple’s iPad mini and 4th-generation iPad didn’t arrive alone. In the company’s time-honored tradition, it has timed the FCC filings for both devices to show up alongside the products themselves. Each iOS tablet has been approved in one WiFi-only edition and two cellular editions: the iPad mini has appeared as the WiFi-only A1432 as well as the A1454 and A1455 for worldwide HSPA+, EV-DO and LTE coverage, while the full-size iPad has been cleared in the directly parallel A1458, A1459 and A1460 versions. Not surprisingly, the frequency range matches that of the iPhone 5 and suggests that we’re dealing with the same Qualcomm MDM9615 chip. We’ll know more once the two iPads are in our hands and those of teardown artists, but for now you can explore Apple’s regulatory gymnastics in full at the source links.





Tuesday, October 23rd, 2012 news No Comments

‘99% Of Sales Come From People Who Don’t Interact With Ads’

Source: http://www.businessinsider.com/facebook-99-of-sales-come-from-people-who-dont-interact-with-ads-2012-10


Facebook is trying to close the loop between ad exposure on the social network and real-life buying habits. 

For years, it has been difficult to prove that someone seeing an ad on Facebook (or anywhere else for that matter) became more likely to buy the product.

Brad Smallwood, Facebook’s director of Monetization Analytics, expanded on what the social network’s new partnership with Datalogix means for marketers at the IAB Mixx Conference during Advertising Week.

The partnership will allow Facebook clients to match user data with Datalogix sales data, and draw conclusions about whether ads on Facebook actually increased purchases. (Datalogix holds purchase data on 70 million American households.)

The initiative measures “the outcomes that happen in the grocery store, at the car dealership,” Smallwood said, and “for the first time ever” draws that elusive straight line from ad exposure to purchase.

One overall takeaway from the data — which Smallwood said doesn’t identify consumers by name — is clear: unless you’re dealing with a specific type of campaign (i.e., direct online sales), the answer isn’t direct response or clicks.

According to Smallwood, “99 percent of sales come from people who don’t interact with ads. They consume the message and then when they go to the store they purchase.”

Other important takeaways include:

  • In Facebook’s study, which measured 50 campaigns, 70 percent saw a 3x or greater return on ad spend, and 49 percent saw a 5x or greater return on ad spend.
  • “Reach is a crucial driver,” Smallwood said, and digital campaigns that managed to find the proper reach were 70 percent more effective at driving purchases, as measured by ROI.
  • Smallwood said that marketers see a 40 percent increase in ROI by finding the “optimal frequency point.” He compared finding the frequency “sweet spot” in social to other platforms: “In TV you don’t want to send 50 impressions to one person, but you also don’t want to send one.”
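
To make the exposure-to-purchase matching concrete, here is a minimal sketch of the kind of lift calculation such a match-up enables. It is purely illustrative (not Facebook’s or Datalogix’s actual methodology), and the household spend figures are invented:

```python
# Hypothetical illustration only: compare average household spend between
# households that were exposed to a campaign and matched households that
# were not. This is not Facebook's or Datalogix's actual methodology, and
# the spend figures are made up.

exposed_spend = [12.40, 0.00, 8.75, 22.10, 0.00, 15.30]   # saw the campaign
control_spend = [9.80, 0.00, 0.00, 11.25, 7.60, 0.00]     # did not see it

def average(values):
    return sum(values) / len(values)

# Relative increase in average spend among exposed households.
lift = (average(exposed_spend) - average(control_spend)) / average(control_spend)
print(f"Purchase lift among exposed households: {lift:.0%}")
```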

Although some privacy groups are asking the FTC to investigate whether this partnership violates consumers’ privacy, Smallwood portrayed the new initiative as a “move away from the models that don’t maximize.”

“We at Facebook are dedicated to help you understand stuff like that.”




Monday, October 1st, 2012 news No Comments

Apple’s A6 CPU actually clocked at around 1.3GHz, per new Geekbench report

Source: http://www.engadget.com/2012/09/26/apple-a6-cpu-13ghz-geekbench-confirmed-overclocking/


As the initial wave of iPhone 5 reviews hit, it looked as if Apple’s dual-core A6 processor was sporting a clock speed of around 1GHz. We saw reports (and confirmed with our own handset) ranging between 1.00 and 1.02GHz, but a new Geekbench build (v2.3.6) has today revealed a horse of a different color. According to Primate Labs’ own John Poole, the latest version of the app — which landed on the App Store today — “features a dramatically improved processor frequency detection algorithm, which consistently reports the A6’s frequency as 1.3GHz.” In speaking with us, he affirmed that “earlier versions of Geekbench had trouble determining the A6’s frequency, which led to people claiming the A6’s frequency as 1.0GHz as it was the most common value Geekbench reported.”

When we asked if he felt that the A6 was capable of dynamically overclocking itself for more demanding tasks, he added: “I don’t believe the A6 has any form of processor boost. In our testing, we found the 1.3GHz was constant regardless of whether one core or both cores were busy.” Our own in-house iPhone 5 is regularly displaying 1.29GHz, while a tipster’s screenshot (hosted after the break) clearly displays 1.30GHz. Oh, and if anyone wants to dip their iPhone 5 in a vat of liquid nitrogen while trying to push things well over the 2GHz level, we certainly wouldn’t try to dissuade your efforts.

[Thanks, Bruno]






Wednesday, September 26th, 2012 news No Comments

How Google Crunches All That Data

Source: http://gizmodo.com/5495097/how-google-crunches-all-that-data

If data centers are the brains of an information company, then Google is one of the brainiest there is. Though always evolving, it is, fundamentally, in the business of knowing everything. Here are some of the ways it stays sharp.

For tackling massive amounts of data, the main weapon in Google’s arsenal is MapReduce, a system developed by the company itself. Whereas other frameworks require a thoroughly tagged and rigorously organized database, MapReduce breaks the process down into simple steps, allowing it to deal with any type of data, which it distributes across a legion of machines.

Looking at MapReduce in 2008, Wired imagined the task of determining word frequency in Google Books. As its name would suggest, the MapReduce magic comes from two main steps: mapping and reducing.

The first of these, the mapping, is where MapReduce is unique. A master computer evaluates the request and then divvies it up into smaller, more manageable “sub-problems,” which are assigned to other computers. These sub-problems, in turn, may be divided up even further, depending on the complexity of the data set. In our example, the entirety of Google Books would be split, say, by author (but more likely by the order in which they were scanned, or something like that) and distributed to the worker computers.

Then the data is saved. To maximize efficiency, it remains on the worker computers’ local hard drives, as opposed to being sent, the whole petabyte-scale mess of it, back to some central location. Then comes the second central step: reduction. Other worker machines are assigned specifically to the task of grabbing the data from the computers that crunched it and paring it down to a format suitable for solving the problem at hand. In the Google Books example, this second set of machines would reduce and compile the processed data into lists of individual words and the frequency with which they appeared across Google’s digital library.

The finished product of the MapReduce system is, as Wired says, a “data set about your data,” one that has been crafted specifically to answer the initial question. In this case, the new data set would let you query any word and see how often it appeared in Google Books.
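
To make the two steps concrete, here is a minimal, single-machine sketch of the word-frequency example in Python. It illustrates the map/reduce pattern described above; it is not Google’s actual MapReduce API, and the function names and sample documents are made up:

```python
from collections import defaultdict

# Illustrative sketch of the map/reduce flow described above. The "documents"
# stand in for scanned books that would really be split across many workers.

def map_words(document):
    """Map step: emit a (word, 1) pair for every word in one sub-problem."""
    for word in document.lower().split():
        yield word, 1

def shuffle(pairs):
    """Group intermediate (word, 1) pairs by word, ready for reduction."""
    groups = defaultdict(list)
    for word, count in pairs:
        groups[word].append(count)
    return groups

def reduce_counts(word, counts):
    """Reduce step: collapse all counts for one word into a single frequency."""
    return word, sum(counts)

documents = [
    "call me ishmael",
    "it was the best of times it was the worst of times",
]

mapped = (pair for doc in documents for pair in map_words(doc))
frequencies = dict(reduce_counts(word, counts)
                   for word, counts in shuffle(mapped).items())

print(frequencies["times"])    # 2
print(frequencies["ishmael"])  # 1
```

In the real system, the map step runs in parallel on the worker machines that hold the data, and a separate set of workers handles the shuffle and reduce steps.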

MapReduce is one way in which Google manipulates its massive amounts of data, sorting and resorting it into different sets that reveal new meanings and have unique uses. But another Herculean task Google faces is dealing with data that’s not already on its machines. It’s one of the most daunting data sets of all: the internet.

Last month, Wired got a rare look at the “algorithm that rules the web,” and the gist of it is that there is no single, set algorithm. Rather, Google rules the internet by constantly refining its search technologies, charting new territories like social media and refining the ones in which users tread most often with personalized searches.

But of course it’s not just about matching the terms people search for to the web sites that contain them. Amit Singhal, a Google Search guru, explains, “you are not matching words; you are actually trying to match meaning.”

Words are a finite data set. And you don’t need an entire data center to store them—a dictionary does just fine. But meaning is perhaps the most profound data set humanity has ever produced, and it’s one we’re charged with managing every day. Our own mental MapReduce probes for intent and scans for context, informing how we respond to the world around us.

In a sense, Google’s memory may be better than any one individual’s, and complex frameworks like MapReduce ensure that it will only continue to outpace us in that respect. But in terms of the capacity to process meaning, in all of its nuance, any one person could outperform all the machines in the Googleplex. For now, anyway. [Wired, Wikipedia, and Wired]


Memory [Forever] is our week-long consideration of what it really means when our memories, encoded in bits, flow in a million directions, and might truly live forever.


Wednesday, March 17th, 2010 news No Comments

Social “media” is created – the total quantity, reach, frequency, and intensity are not pre-known

Social “media” is created – the total quantity, reach, frequency, and intensity are not known in advance. The “media” that is generated can be positive, negative, or both. Extremity, or “extreme-ness,” is usually a necessary ingredient: extremely positive, extremely entertaining, extremely negative, and so on.

Consider the Oprah Winfrey KFC grilled chicken coupon debacle: her reputation may have been permanently tarnished after she was found to have been paid by KFC to promote the coupon tied to the launch of KFC’s grilled chicken.

[Chart: Oprah/KFC blog post volume]

Domino’s was on the hot seat when two employees shot a video of themselves sticking mozzarella cheese up their noses and then putting it on a pizza.

[Chart: Domino’s blog mentions]

Motrin offended the sensibilities of moms when its ad implied that a baby was a cool “accessory.” The blogosphere and Twitter lit up with people taking exception to that.

[Chart: Motrin blog mentions]


Wednesday, July 29th, 2009 social networks No Comments

Top Posts for Week Ending July 26th

  • Notes from the front lines: Facebook advertising metrics and benchmarks
  • crispin porter bogusky’s beta site
  • The Perfect Babe – Megan Fox (pics)
  • The hardest thing to do in web 2.0 …
  • marketing misconceptions, advertising misconceptions, social media misconceptions
  • What is Web 3.0? Characteristics of Web 3.0
  • Bing is bigger than CNN, Digg, Twitter? Not so fast!
  • Smaller social networks are losing even the few users they have…
  • Harry Potter and the Half-Blood Prince and other Harry Potter Movies
  • Branding is still a useful activity? Reach and frequency is still a useful metric?

    Monday, July 27th, 2009 digital No Comments

    Top Posts Week Ending July 17, 2009

  • Notes from the front lines: Facebook advertising metrics and benchmarks
  • crispin porter bogusky’s beta site
  • The hardest thing to do in web 2.0 …
  • The Perfect Babe – Megan Fox (pics)
  • marketing misconceptions, advertising misconceptions, social media misconceptions
  • Bing is bigger than CNN, Digg, Twitter? Not so fast!
  • Smaller social networks are losing even the few users they have…
  • Branding is still a useful activity? Reach and frequency is still a useful metric?
  • What is Web 3.0? Characteristics of Web 3.0
  • Merovingian Knot (video)

    Tuesday, July 21st, 2009 digital No Comments

    social intensity – kpi (key performance indicator)

    social networks – the places where people go online to socialize


    social actions – the act of socializing on social networks


    social intensity – the frequency and quantity of social actions; this can/should be a new KPI (key performance indicator) for marketers to assess whether digital marketing efforts are working and yielding positive results against business objectives
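
As a rough sketch, if social intensity is measured as social actions per engaged user per day (one possible reading of the definition above, not something the post specifies), it could be computed like this:

```python
# Rough sketch of a "social intensity" KPI, assuming it is measured as social
# actions (likes, comments, shares, posts) per engaged user per day. The
# formula and numbers are illustrative; the post only defines the concept as
# the frequency and quantity of social actions.

def social_intensity(total_actions, engaged_users, days):
    """Average number of social actions per engaged user per day."""
    return total_actions / (engaged_users * days)

print(social_intensity(total_actions=45_000, engaged_users=3_000, days=30))  # 0.5
```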


    Wednesday, January 21st, 2009 digital No Comments

    Dr. Augustine Fou is Digital Consigliere to marketing executives, advising them on digital strategy and Unified Marketing(tm). Dr. Fou has over 17 years of in-the-trenches, hands-on experience, which enables him to provide objective, in-depth assessments of their current marketing programs and recommendations for improving business impact and ROI using digital insights.

    Augustine Fou portrait
    http://twitter.com/acfou
    Send Tips: tips@go-digital.net
    Digital Strategy Consulting
    Dr. Augustine Fou LinkedIn Bio
    Digital Marketing Slideshares
    The Grand Unified Theory of Marketing