
Thread: The Problems with Facebook

  1. Link to Post #201
    Avalon Member norman's Avatar
    Join Date
    25th March 2010
    Location
    too close to the hot air exhaust
    Age
    68
    Posts
    8,902
    Thanks
    9,946
    Thanked 55,108 times in 8,176 posts

    Default Re: The Problems with Facebook

    Quote Posted by Bill Ryan (here)
    I've just finished watching Part 2 of the new PBS documentary The Facebook Dilemma. It was fascinating (as was Part 1) — but My God, there are major problems here with (e.g.) using A.I. to filter out 'hate speech' and 'fake news'.

    I'm sure I'm not alone in detecting a marked Liberal leaning to the entire documentary — factually based though it appears, or presents itself, to be.

    (At this point, these questions of course spill over to other threads and discussions.)
    Left-leaning is an understatement. Considering the timing, it's really a thinly disguised party political broadcast that says "understand why you voted for Trump, and don't be fooled again".

    On the plus side, it shows very well what a complete idiot Zuckerberg is, but then it (fraudulently) lines up all the wrong reasons to 'smack his backside'.
    ..................................................my first language is TYPO..............................................

  2. The Following 4 Users Say Thank You to norman For This Post:

    Ba-ba-Ra (3rd November 2018), justntime2learn (11th January 2019), onawah (16th November 2018), Tintin (13th June 2019)

  3. Link to Post #202
    UK Avalon Founder Bill Ryan's Avatar
    Join Date
    7th February 2010
    Location
    Ecuador
    Posts
    34,271
    Thanks
    209,031
    Thanked 457,541 times in 32,788 posts

    Default Re: The Problems with Facebook

    Quote Posted by Bill Ryan (here)
    Yes, it's a VERY good documentary.

    Amazingly, here's a piece about this from Dr Mercola. Kudos to him.
    The Facebook Dilemma

    Story at-a-glance
    • As of the third quarter of 2018, 2.27 billion people actively used Facebook, the world’s largest social media site, up from 1 billion in 2012
    • Facebook is unique in its ability to monetize the time people spend on its platform. During the third quarter of 2018, the site generated more than $6 per user
    • Ninety-eight percent of Facebook’s revenue comes from advertising, which totaled $39.9 billion in 2017
    • The addition of the “Like” button in 2009 revolutionized the company’s ability to gather personal data — information about your preferences that can then be sold
    • The significant danger with giving out personal data is that you’re opening yourself up to be a target of manipulation — whether you’re being manipulated to buy something you don’t need or believe something that isn’t true.
    As of the third quarter of 2018, 2.27 billion people actively used Facebook, the world's largest social media site, up from 1 billion in 2012. On average, each user spends about 41 minutes on the site daily, down from an average of 50 minutes in 2016.

    Some, of course, spend far more. Teens, for instance, may spend up to nine hours perusing the site, the consequences of which are only beginning to be understood.

    As noted by The Motley Fool, Facebook is unique in its ability to monetize the time people spend on its platform. During the third quarter of 2018, the site generated more than $6 per user. For the fourth quarter of 2017, Facebook raked in a total of $12.97 billion, $4.3 billion of which was net profit.

    Most of this revenue — $11.4 billion for the fourth quarter alone — came from mobile ads, which are customized to users' preferences and habits. According to CNN Money, 98 percent of Facebook's revenue comes from advertising, totaling $39.9 billion in 2017.
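    As a rough sanity check, here is a minimal Python sketch (a back-of-the-envelope calculation only, using the approximate figures quoted above even though they come from different quarters) showing how the "more than $6 per user" claim compares with total revenue divided by active users:

```python
# Back-of-the-envelope check of the revenue-per-user figures quoted above.
# Inputs are the approximate numbers cited in the article; they come from
# different quarters, so the result is only indicative.

quarterly_revenue_usd = 12.97e9   # total revenue, Q4 2017
active_users          = 2.27e9    # active users, Q3 2018
ad_share              = 0.98      # share of revenue said to come from advertising

revenue_per_user = quarterly_revenue_usd / active_users
print(f"Revenue per active user per quarter: ~${revenue_per_user:.2f}")                # ~$5.71
print(f"Ad revenue per active user per quarter: ~${revenue_per_user * ad_share:.2f}")  # ~$5.60
```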

    Facebook's Primary Business Is Collecting and Selling Your Personal Data

    Facebook has repeatedly been caught mishandling users' data and/or lying about its collection practices. The fact is, its entire profit model is based on the selling of personal information that facilitates everything from targeted advertising to targeted fraud.

    Like Google, Facebook records, tracks and stores every single thing you do on Facebook: every post, comment, "like," private message and file ever sent and received, contacts, friends lists, login locations, stickers and more. Even the recurrent use of certain words is noted and can become valuable currency for advertisers.

    For individuals who start using Facebook at a young age, the lifetime data harvest could be inconceivably large, giving those who buy or otherwise access that information a very comprehensive picture of the individual in question.

    Facebook also has the ability to access your computer or smartphone's microphone without your knowledge. If you suddenly find yourself on the receiving end of ads for products or services you just spoke about out loud, chances are one or more apps are linked into your microphone and are eavesdropping.

    In the featured video, "The Facebook Dilemma," Frontline PBS correspondent James Jacoby investigates Facebook's influence over the democracy of nations, and the lax privacy parameters that allowed for tens of millions of users' data to be siphoned off and used in an effort to influence the U.S. elections.

    The Early Days of Facebook

    The Frontline report starts out showing early video footage of Zuckerberg in his first office, complete with a beer keg and graffiti on the walls, talking about the success of his social media platform. At the time, in 2005, Facebook had just hit 3 million users.

    In an early Harvard lecture, Zuckerberg talks about how he believes it's "more useful to make things happen and apologize later than it is to make sure you dot all your i's now, and not get stuff done." As noted by Roger McNamee, an early Facebook investor, it was Zuckerberg's "renegade philosophy and disrespect for authority that led to the Facebook motto, 'Move fast and break things.'"

    While that motto speaks volumes today, "It wasn't that they intended to do harm, as much as they were unconcerned about the possibility that harm would result," McNamee says. As for the sharing of information, Zuckerberg assured a journalist in an early interview that no user information would be sold or shared with anyone the user had not specifically given permission to.

    In the end, Zuckerberg’s quest to “Give people the power to share and make the world more open and connected,” has had far-reaching consequences, affecting global politics and technology, and raising serious privacy issues that have yet to be resolved.

    For years, however, employees firmly believed Facebook had the power to make the world a better place. As noted by Tim Sparapani, Facebook director of public policy from 2009 to 2011, Facebook "was the greatest experiment in free speech in human history," and a "digital nation state."

    However, the company — with its largely homogenous workforce of 20-something tech geeks — has proven to be more than a little naïve about its mission to improve the world through information sharing. Naomi Gleit, vice president of social good, the company's growth team, says they were slow to understand "the ways in which Facebook might be used for bad things."

    The Facebook News Feed

    One of the key features of Facebook that keeps users engaged is the news feed, described by former product manager on Facebook's advertising team, Antonio Garcia Martinez, as "Your personalized newspaper; your 'The New York Times' of you, channel you. It is your customized, optimized vision of the world."

    However, the information that appears in your newsfeed isn't random. From the very beginning, it was driven by a secret algorithm, a mathematical formula that ranked stories in terms of importance based on your individual preferences. This personalization is "the secret sauce," to quote Martinez, that keeps users scrolling and sharing.
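    The actual ranking formula is proprietary and has never been published. Purely as an illustrative sketch of the general idea described above (personalized, engagement-driven ranking with a preference for fresh content), here is a minimal Python example; all signal names, weights and the decay rule are hypothetical, not Facebook's real algorithm:

```python
# Illustrative sketch of engagement-based feed ranking. This is NOT Facebook's
# actual algorithm (which is secret); the signals and weights are hypothetical.

import math
import time

def rank_feed(posts, now=None, half_life_hours=6.0):
    """Return posts sorted by a simple hypothetical relevance score:
    affinity with the author x log(engagement) x recency decay."""
    now = now or time.time()

    def score(post):
        age_hours = (now - post["created_at"]) / 3600.0
        decay = 0.5 ** (age_hours / half_life_hours)            # newer posts score higher
        engagement = post["likes"] + 2 * post["comments"] + 3 * post["shares"]
        return post["affinity"] * math.log1p(engagement) * decay

    return sorted(posts, key=score, reverse=True)

# Example usage with made-up posts: a close friend's recent post can outrank
# a stranger's older but more popular one.
posts = [
    {"id": 1, "affinity": 0.9, "likes": 10, "comments": 2, "shares": 0,
     "created_at": time.time() - 1 * 3600},
    {"id": 2, "affinity": 0.2, "likes": 500, "comments": 80, "shares": 40,
     "created_at": time.time() - 24 * 3600},
]
print([p["id"] for p in rank_feed(posts)])
```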

    The addition of the "Like" button in 2009 revolutionized the company's ability to gather personal data — information about your preferences that can then be sold for cold hard cash. It also "acted as a social lubricant" and a "flywheel of engagement," Soleio Cuervo, a former product manager for the company, says.

    The ability to get feedback through "likes" made people feel like they were being heard, and this ultimately became "the driving force of the product," Cuervo says. However, the "Like" button also suddenly allowed Facebook to determine who you care about most among your friends and family, what kind of content makes you react or take action, and which businesses and interests are truly important to you — information that helps build your personality profile and can be sold.

    The Legal Provision That Allowed Facebook to Exist and Flourish

    The Facebook news feed was made possible by laws that do not hold internet companies liable for the content posted on their website. As explained by Sparapani, "Section 230 of the Communications Decency Act is the provision which allows the internet economy to grow and thrive. And Facebook is one of the principal beneficiaries of this provision."

    Section 230 of the Communications Decency Act basically says an internet provider cannot be held responsible if someone posts something violent, offensive or even unlawful on their site. According to Sparapani, Facebook “took a very libertarian perspective” with regard to what it would allow on its site.

    Aside from a few basic common decency rules, the company was “reluctant to interpose our value system on this worldwide community,” Sparapani says. Were they concerned about truth becoming obfuscated amid a flood of lies? Jacoby wonders. “No,” Sparapani says. “We relied on what we thought were the public’s common sense and common decency to police the site."

    Real-World Impacts of Social Media

    The tremendous impact of social media, the ability to share information with like-minded individuals, became apparent during the so-called “Arab Spring” in 2011, when a Facebook page created by Wael Ghonim, a Google employee in the Middle East, literally sparked a revolution that led to the resignation of Egyptian President Muhammad Hosni El Sayed Mubarak, just 18 days after a Facebook call-out for protest resulted in hundreds of thousands of people taking to the streets.

    Around the world, it became clear that Facebook could be used to create democratic change; that it has the power to change society as we know it. Alas, with the good comes the bad. After the revolution, conflict in the Middle East spiraled out of control as the polarization between opposing sides grew — and the social media environment both bred and encouraged that polarization.

    What's worse, Facebook's news feed algorithm was actually designed to reward polarizing material with greater distribution. The end result played out in the streets, where sectarian violence led to bloodshed.

    "The hardest thing for me was seeing the tool that brought us together tearing us apart,” Ghonim says, adding, “These tools are just enablers for whomever; they don’t separate between what’s good and bad. They just look at engagement metrics.” Since the Arab Spring, the rise of fake news has been relentless.

    "Everything that happened after the Arab Spring should have been a warning sign to Facebook,” says Zeynep Tufekci, a researcher and former computer programmer. One major problem, she believes, is that Facebook was unprepared to monitor all of the content coming from every corner of the globe.

    She urged the company to hire more staff, and to hire people who know the language and understand the local culture in each region where Facebook is available. Still, it's unlikely that any company, of any size, would be able to police the content of a social network with more than 2 billion users.

    Privacy — What Privacy?

    In order for Facebook to go public, it had to be profitable, which is where the selling of user data comes in. Facebook sells the information it has collected about you as you move through content and even web pages outside of Facebook, "liking" and commenting on posts along the way, so that marketers can target their chosen market.

    While this seems innocuous enough at first glance, this data harvesting and selling has tremendous ramifications, opening people up to be purposely deceived and misled.

    Zuckerberg, whose experience with advertising was limited, hired former Google vice president of global online sales and operations, Sheryl Sandberg, as chief operating officer. In one interview, Sandberg stresses that Facebook is "focused on privacy," and that their business model "is by far the most privacy-friendly to consumers."

    "That's our mission," Zuckerberg chimes in, adding "We have to do that because if people feel like they don't have control over how they're sharing things, then we're failing them." "It really is the point that the only things Facebook knows about you are things you've done and told us," Sandberg says.

    Internally, however, Sandberg demanded revenue growth, which meant selling more ads, which led to data harvesting that today exceeds people’s wildest imagination.

    How to Build an Orwellian Surveillance Machine

    By partnering with data brokering companies, Facebook has access to an incredible amount of data that has nothing to do with what you post online — information on your credit card transactions, where you live, where you shop, how your family is spending its time, where you work, what you eat, read, listen to and much more.

    Information is also being collected about all other websites you’re perusing, outside of Facebook’s platform. All of this information, obtained by companies without your knowledge, is shared with Facebook, so that Facebook can sell ads that target specific groups of users. As noted by Tufekci, in order for Facebook’s business model to work, “it has to remain a surveillance machine."

    In short, it's the most powerful advertising tool ever created. The price? Your privacy. Sparapani was so uncomfortable with this new direction of Facebook that he resigned before the company's partnering with data brokers took effect.

    The extent of Facebook's data collection remained largely unknown until Max Schrems, an Austrian privacy advocate, filed 22 complaints with the Irish Data Protection Commission, where Facebook's international headquarters are located.

    Schrems claimed that Facebook's personal data collection violated European privacy law, as Facebook was not telling users how that data was being used. In the end, nothing happened. As noted by Schrems, it was obvious that "even if you violate the law, the reality is it's very likely not going to be enforced." In the U.S., the situation is even worse, as there are no laws governing emerging technologies which utilize the kinds of data collection done by Facebook.

    Federal Trade Commission Investigates Privacy Concerns

    A 2010 investigation of Facebook's data collection by the U.S. Federal Trade Commission (FTC) revealed the company was sharing user data with third party software developers without the users' consent — conduct the FTC deemed deceptive.

    The FTC also grew concerned about the potential misuse of personal information, as Facebook was not tracking how third parties were using the information. They just handed over access, and these third parties could have been absolutely anyone capable of developing a third-party app for the site. Facebook settled the FTC's case against them without admitting guilt, but agreed by consent order to "identify risk to personal privacy" and eliminate those risks.

    Internally, however, privacy issues were clearly not a priority, according to testimony by Sandy Parakilas, Facebook's platform operations manager between 2011 and 2012 who, during his time with the company, ended up in charge of solving the company's privacy conundrum — a responsibility he felt significantly underqualified for, considering its scope.

    The Cambridge Analytica Scandal

    Facebook, with founder Mark Zuckerberg at its helm, faced a firestorm after The New York Times and British media outlets reported Cambridge Analytica used "improperly gleaned" data from 87 million Facebook users to influence American voters during the 2016 presidential election.

    Cambridge Analytica data scientist Christopher Wylie, who blew the whistle on his employer, revealed the company built "a system that could profile individual U.S. voters in order to target them with personalized political advertisements" during the presidential campaign.

    Parakilas insisted Facebook could have prevented the whole thing had they actually paid attention to and beefed up their internal security practices. Indeed, Cambridge Analytica used the very weakness the FTC had identified years before — a third-party personality quiz app called "This Is Your Digital Life."

    The Dark Side of Social Media Rears Its Ugly Head Again

    Indeed, the U.S. Department of Defense has also expressed its concerns about Facebook, noting the ease with which it can spread disinformation. As noted by former Defense Advanced Research Projects Agency program manager, Rand Waltzman, the significant danger with giving out personal data is that you’re opening yourself up to be a target of manipulation — whether you’re being manipulated to buy something you don’t need or believe something that isn’t true.

    Between 2012 and 2015, Waltzman and colleagues published 200 scientific papers on the potential threats posed by social media, detailing how Facebook and other platforms could be used for nefarious purposes. According to Waltzman, disinformation can be turned "into a serious weapon" on Facebook, as you have the ability to mislead enormous numbers of people with very little effort.

    Essentially, Facebook allows for the propagation of propaganda at an enormous scale. "It's the scale that makes it a weapon," Waltzman says. Jacoby interviews a young Russian who claims to have worked as a paid social media propagandist for the Russian government, using fake Facebook profiles to spread false information and sow distrust of the Ukrainian government.

    The reach of this disinformation was made all the greater by the fact that you can pay to promote certain posts. In the end, all of the tools created by Facebook to benefit advertisers work equally well as government propaganda tools. The end result is tragic, as fake news has mushroomed to incomprehensible levels. Taking anything at face value these days is risky business, no matter how legitimate it may appear.

    Understand the Risks of Social Media Use

    Social media has many wonderful benefits. But there's a dark side, and it's important to be aware of this. Sen. Ron Wyden (D-OR) has actually drafted legislation to protect consumer information by enforcing strict punishments, including jail time for up to 20 years, for senior company executives who fail to follow the guidelines to protect user data. As reported by Engadget:
    "The FTC would add 175 new members to its staff to carry out enforcement and would be given the ability to penalize a company up to four percent of its revenue for its first violation. Companies would also be required to submit regular reports to the FTC to disclose any privacy lapses that have occurred.
    Companies making more than $1 billion in revenue and handling information from more than 1 million people and smaller companies handling the data of more than 50 million people would be subject to the regular check-ins. Failure to comply would carry a punishment of potential jail time for executives.
    The legislation would also institute a Do Not Track list. When a consumer joins the list, companies would be barred from sharing their data with third parties or using it to serve up targeted advertisements … Even if consumers don't choose to join the list, they would be granted the ability to review information collected about them, see who it has been shared with or sold to and challenge any inaccuracies."
    Aside from privacy concerns and fake news, Facebook lurking has also been linked to decreased emotional well-being, and online bullying, social isolation and depression have all become serious problems among our youth.

    The obvious answer to all of these issues is to minimize your use of Facebook, and be mindful of what you post, click on and comment on while there. Information is still being gathered on your personal life by other data brokers, but at least it won’t be as effectively “weaponized” against you if it’s not tied to your Facebook profile.

  4. The Following 10 Users Say Thank You to Bill Ryan For This Post:

    Carmody (17th August 2019), Denise/Dizi (2nd August 2019), Hervé (16th November 2018), Johan (Keyholder) (24th August 2019), justntime2learn (11th January 2019), Magnus (18th November 2018), meeradas (17th November 2018), onawah (16th November 2018), Tintin (11th January 2019), Yoda (16th November 2018)

  5. Link to Post #203
    Avalon Member Flash's Avatar
    Join Date
    26th December 2010
    Location
    Montreal
    Posts
    9,637
    Thanks
    38,027
    Thanked 53,692 times in 8,940 posts

    Default Re: The Problems with Facebook

    I am actually using Messenger to communicate with two friends about a trip we're planning. No facebooking about it, and no research on my side on Google about the 2 hotels we were looking at.

    Well, the next day, I open my Facebook to find advertising from the two hotels we had chatted about. A private conversation, used to pull advertising on us. Everything is registered, used and abused. No private conversations on Messenger, ever.

    It really felt like I had been spied on, and it actually is spying on us. I told my friends and they also felt aggravated and spied on. Very uncomfortable.

    - no more sex talk on messenger - lol just kidding.
    Last edited by Flash; 16th November 2018 at 21:12.
    How to let the desire of your mind become the desire of your heart - Gurdjieff

  6. The Following 11 Users Say Thank You to Flash For This Post:

    Bill Ryan (16th November 2018), Carmody (17th August 2019), Denise/Dizi (2nd August 2019), Deux Corbeaux (7th December 2018), Hervé (16th November 2018), justntime2learn (11th January 2019), meeradas (17th November 2018), onawah (16th November 2018), ThePythonicCow (16th November 2018), Tintin (11th January 2019), Valerie Villars (17th November 2018)

  7. Link to Post #204
    Avalon Member Flash's Avatar
    Join Date
    26th December 2010
    Location
    Montreal
    Posts
    9,637
    Thanks
    38,027
    Thanked 53,692 times in 8,940 posts

    Default Re: The Problems with Facebook

    That is it, it's official, I hate Facebook!! The exchanges are at the lowest common denominator: rude, stupid people take over any chat, most are not even able to read, and most never, ever admit that anything they wrote could be wrong, nor that others do not always have bad intentions.

    But the unbelievable lack of basic civility still astonishes me. Usually I only repost some information; the only real usefulness is to contact some friends, and even there you are pursued with advertising because your private conversations are read by Facebook.

    Of course, someone was rude to me today and, for one of the rare times, I answered back; the stupidity was too much to keep silent about.

    I was also acquainted with someone with a really low but deemed-normal IQ coupled with zero heart, just a miserable, despicable being similar to the one commenting today on Facebook.

    It discourages me from ever imagining that humanity can survive, if what I saw were average humans.

    OK, my rant is over; I quite needed to blow off some steam.
    How to let the desire of your mind become the desire of your heart - Gurdjieff

  8. The Following 6 Users Say Thank You to Flash For This Post:

    avid (7th December 2018), Carmody (17th August 2019), Denise/Dizi (2nd August 2019), justntime2learn (11th January 2019), petra (6th March 2019), ThePythonicCow (7th December 2018)

  9. Link to Post #205
    United States Avalon Member onawah's Avatar
    Join Date
    28th March 2010
    Language
    English
    Posts
    22,209
    Thanks
    47,682
    Thanked 116,102 times in 20,640 posts

    Default Re: The Problems with Facebook

    You might want to try Facebook Purity. It gives you more options for managing your FB page.
    Quote Posted by Flash (here)
    That is it, it's official, I hate Facebook!!
    Each breath a gift...
    _____________

  10. The Following 3 Users Say Thank You to onawah For This Post:

    avid (7th December 2018), Denise/Dizi (2nd August 2019), Flash (7th December 2018)

  11. Link to Post #206
    UK Avalon Member avid's Avatar
    Join Date
    19th March 2010
    Location
    NW UK
    Language
    English
    Posts
    2,884
    Thanks
    58,293
    Thanked 15,637 times in 2,654 posts

    Default Re: The Problems with Facebook

    Thanks for this Flash, I have tried to fathom the privacy settings - which keep changing - for ages. Any search on Google is spotted on Facebook, therefore advertising et al...
    Our searches are filed into our history, however much we try to switch off/log out. Any conversations linked to other platforms are documented and archived. There is little space left for privacy. An innocent telephone call is recorded. Let's face it: we are encompassed by vile interfaces, so is the future 'snail-mail'? I still keep my FB account as I use it for disseminating links to health-related info, such as Fluoride Action Network and devices to stop weather manipulation, and to try to assure folk this is not just my paranoia - with evidence.
    Don’t despair, who cares if some stupid spy wants to see where you are going on holiday - it’s pathetic, they’ll miss out on your enjoyment 😉
    The love you withhold is the pain that you carry
    and er..
    "Chariots of the Globs" (apols to Fat Freddy's Cat)

  12. The Following 3 Users Say Thank You to avid For This Post:

    Denise/Dizi (2nd August 2019), Flash (7th December 2018), justntime2learn (11th January 2019)

  13. Link to Post #207
    UK Avalon Member avid's Avatar
    Join Date
    19th March 2010
    Location
    NW UK
    Language
    English
    Posts
    2,884
    Thanks
    58,293
    Thanked 15,637 times in 2,654 posts

    Default Re: The Problems with Facebook

    Quote Posted by onawah (here)
    You might want to try Facebook Purity. It gives you more options for managing your FB page.
    Quote Posted by Flash (here)
    That is it, it's official, I hate Facebook!!
    Agreed, once my iMac was back up to date, FB Purity was great, no ads etc. etc... I've just relied on my iPad/iPhone for the past 2 years, so upgraded iMac here I come. The CD drive failed, and I couldn't install new system upgrades... 🙄
    The love you withhold is the pain that you carry
    and er..
    "Chariots of the Globs" (apols to Fat Freddy's Cat)

  14. The Following 4 Users Say Thank You to avid For This Post:

    Denise/Dizi (2nd August 2019), Flash (7th December 2018), justntime2learn (11th January 2019), onawah (7th December 2018)

  15. Link to Post #208
    UK Avalon Founder Bill Ryan's Avatar
    Join Date
    7th February 2010
    Location
    Ecuador
    Posts
    34,271
    Thanks
    209,031
    Thanked 457,541 times in 32,788 posts

    Default Re: The Problems with Facebook

    From https://cnbc.com/2019/01/08/facebook...ss-blamed.html

    Inside Facebook's 'cult-like' workplace, where dissent is discouraged and employees pretend to be happy all the time
    8 Jan 2019
    • More than a dozen former Facebook employees detailed how the company's leadership and its performance review system has created a culture where any dissent is discouraged.
    • Employees say Facebook's stack ranking performance review system drives employees to push out products and features that drive user engagement without fully considering potential long-term negative impacts on user experience or privacy.
    • Reliance on peer reviews creates an underlying pressure for Facebook employees to forge friendships with colleagues for the sake of career advancement.
    At a company-wide town hall in early October, numerous Facebook employees got in line to speak about their experiences with sexual harassment.

    The company called the special town hall after head of policy Joel Kaplan caused an internal uproar for appearing at the congressional hearing for Judge Brett Kavanaugh. A young female employee was among those who got up to speak, addressing her comments directly to COO Sheryl Sandberg.

    "I was reticent to speak, Sheryl, because the pressure for us to act as though everything is fine and that we love working here is so great that it hurts," she said, according to multiple former Facebook employees who witnessed the event.

    "There shouldn't be this pressure to pretend to love something when I don't feel this way," said the employee, setting off a wave of applause from her colleagues at the emotional town hall in Menlo Park, California.

    The episode speaks to an atmosphere at Facebook in which employees feel pressure to place the company above all else in their lives, fall in line with their manager's orders and force cordiality with their colleagues so they can advance. Several former employees likened the culture to a "cult."

    This culture has contributed to the company's well-publicized wave of scandals over the last two years, such as governments spreading misinformation to try to influence elections and the misuse of private user data, according to many people who worked there during this period. They say Facebook might have caught some of these problems sooner if employees were encouraged to deliver honest feedback. Amid these scandals, Facebook's share price fell nearly 30 percent in 2018 and nearly 40 percent since a peak in July, resulting in a loss of more than $252 billion in market capitalization.

    Meanwhile, Facebook's reputation as being one of the best places in Silicon Valley to work is starting to show some cracks. According to Glassdoor, which lets employees anonymously review their workplaces, Facebook fell from being the best place to work in the U.S. to No. 7 in the last year.

    But employees don't complain in the workplace.

    "There's a real culture of 'Even if you are f---ing miserable, you need to act like you love this place,'" said one ex-employee who left in October. "It is not OK to act like this is not the best place to work."

    This account is based on conversations with more than a dozen former Facebook employees who left between late 2016 and the end of 2018. These people requested anonymity in describing Facebook's work culture, including its "stack ranking" employee performance evaluation system and their experiences with it, because none is authorized by Facebook to talk about their time there. This stack ranking system is similar to the one that was notoriously used by Microsoft before the company abandoned it in 2013, the former Facebook employees said.

    Facebook declined to comment on former employees' characterization of the work place as "cult-like."

    Inside the bubble

    Former employees describe a top-down approach where major decisions are made by the company's leadership, and employees are discouraged from voicing dissent — in direct contradiction to one of Sandberg's mantras, "authentic self."

    For instance, at an all-hands meeting in early 2017, one employee asked Facebook Vice President David Fischer a tough question about a company program. Fischer took the question and answered, but within hours, the employee and his managers received angry calls from the team running that program, this person said.

    "I never felt it was an environment that truly encouraged 'authentic self' and encouraged real dissent because the times I personally did it, I always got calls," said the former manager, who left the company in early 2018.

    The sentiment was echoed by another employee who left in 2017.

    "What comes with scale and larger operations is you can't afford to have too much individual voice," said this person. "If you have an army, the larger the army is, the less individuals have voice. They have to follow the leader."

    In this employee's two years at Facebook, his team grew from a few people to more than 50. He said "it was very much implied" to him and his teammates that they trust their leaders, follow orders and avoid having hard conversations.

    The company's culture of no-dissent prevented employees from speaking up about the impact that News Feed had on influencing the 2016 U.S. election, this person added.


    [Video: "Facebook is obviously in disarray," says Jim Cramer (CNBC, 19 Nov 2018)]

    The message was clear in August 2016 when the company laid off the editorial staff of its trending news team, shortly after some workers on that team leaked to the press that they were suppressing conservative-leaning stories. Employees were further discouraged from speaking up following the election, when CEO Mark Zuckerberg brushed off the accusation that Facebook could have impacted the election, calling that idea "crazy."

    The former employee described "a bubble" at the company in which employees are dissuaded from giving managers critical feedback or challenging decisions.

    "I'm pretty disappointed in that because I have a lot of respect for Sheryl, and she preaches about giving hard feedback," the employee said.

    "All the things we were preaching, we weren't doing enough of them. We weren't having enough hard conversations. They need to realize that. They need to reflect and ask if they're having hard conversations or just being echo chambers of themselves."

    Show no weakness

    Many former employees blamed the cult-like atmosphere partly on Facebook's performance review system, which requires employees to get reviews from approximately five of their peers twice a year. This peer review system pressures employees to forge friendships with colleagues at every possible opportunity, whether it be going to lunch together each day or hanging out after work.

    "It's a little bit of a popularity contest," said one manager who left the company in 2017. "You can cherry-pick the people who like you — maybe throw in one bad apple to equalize it."

    Peers can provide feedback directly to their colleagues, or they can send the reviews to the employee's manager. That feedback is typically treated as anonymous and cannot be challenged.

    "You have invisible charges against you, and that figures mightily into your review," said an employee who left in October. "Your negative feedback can haunt you for all your days at Facebook."

    Several former employees said that peers and managers iced them out because they had personal commitments or problems that required significant attention outside of work.

    For instance, one employee who left in recent weeks said a manager was critical in a public team meeting because the employee didn't attend a team-building event outside work. At the time, this person was going through a divorce.

    "She definitely marked me down for not attending those team-building events, but I couldn't attend because I was going through my own issues and needed work-life balance," said the employee.

    Employees are not required to attend after-hours events, according to a Facebook spokeswoman, who added that collaboration is important at the company.

    Another manager who also left the company in recent weeks said she once took multiple weeks of vacation instead of going on medical leave to treat a major illness. She says she did this based on advice from her supervisor.

    "I was afraid that if I told too many people or took too much time off, I would be seen as unable to do my job," the former manager said. "I was scared that if I let up in any way, shape or form they would crumble me, and they did."


    [Video: Facebook's costs mount as data scandals deepen (CNBC, 20 Dec 2018)]

    Ironically, one of the best ways to see the desperation to be liked is to follow Facebook employees on Facebook itself.

    Employees parade the company's projects and post any report on the benefits of working at the company or the positive impact the company is making on the world. This is in part a show for peers and managers, former employees said.

    "People are very mindful about who they're connected with on Facebook who they also work with and how what they're posting will put them in a favorable light to their managers," an employee who left in 2016 said.

    As with many social media users, the online content does not always reflect the offline emotions.

    "There's so many people there who are unhappy, but their Facebook posts alone don't reflect the backdoor conversations you have with people where they're crying and really unhappy," she said.

    How employees are graded

    Twice a year, this peer feedback comes into play in so-called calibration meetings, where employees are given one of seven grades.

    Managers deliberate with their peers to grade employees in all levels below them. As the review process moves up the chain over the course of multiple weeks, lower-level managers gradually leave the room, until the company's vice presidents finish the calibration. At this point, Zuckerberg and Sandberg sign off that their vice presidents have done due diligence, and each employee's grade for the past six months is finalized.

    But there's a companywide limit on the percentage of employees who can receive each grade. So during the reviews process, managers compete against their peer managers to secure strong grades for their direct reports. Managers are compelled to vouch fiercely for their favorite employees, but don't speak up for employees they don't like or who have previously received poor ratings.

    "There's a saying at Facebook that once you have one bad half, you're destined for bad halves the rest of your time there. That stigma will follow you," said a manager who left in September.

    According to two former executives, the grade breakdown is approximately as follows:
    • "Redefine," the highest grade, is given to fewer than 5 percent of employees
    • "Greatly exceeds expectations": 10 percent
    • "Exceeds": 35 percent
    • "Meets all": 35 to 40 percent
    • "Meets most," a low grade that puts future employment at risk, goes to most of the remaining 10 to 15 percent
    • "Meets some" grades are extremely rare and are seen as an indication that you're probably getting fired, according to multiple employees.
    • "Does not meet" are exceptionally rare, as most employees are fired before they get to that level.
    The distribution of these grades is not a hard limit but rather recommended guidance for managers to follow, according to a Facebook spokeswoman.
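    To make the forced-distribution idea concrete, here is a minimal Python sketch of quota-based grade allocation. The grade labels and target shares follow the approximate breakdown above (where a range was given, one value inside it is picked arbitrarily); the input scores are hypothetical calibration ratings, not anything taken from Facebook's actual system:

```python
# Minimal sketch of a forced-distribution ("stack ranking") grade allocation.
# Grade labels and shares approximate the breakdown quoted above; the two
# rarest bottom grades are folded into the leftover bucket for simplicity.

GRADE_SHARES = [
    ("Redefine",                     0.05),
    ("Greatly exceeds expectations", 0.10),
    ("Exceeds",                      0.35),
    ("Meets all",                    0.38),   # article says 35 to 40 percent
    ("Meets most",                   0.12),   # article says 10 to 15 percent
]

def assign_grades(scores):
    """scores: dict of employee -> calibration rating (higher is better)."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    grades, start = {}, 0
    for grade, share in GRADE_SHARES:
        quota = round(share * len(ranked))    # with small teams, rare top grades may get zero slots
        for name in ranked[start:start + quota]:
            grades[name] = grade
        start += quota
    for name in ranked[start:]:               # anyone left over lands in the lowest listed grade
        grades[name] = GRADE_SHARES[-1][0]
    return grades

print(assign_grades({"ana": 9.1, "bo": 7.4, "cy": 6.0, "dee": 5.2, "ed": 3.3}))
```

    Note how the quotas, not the absolute scores, determine the outcome: managers are effectively competing for a fixed number of slots per grade, which is exactly the dynamic described above.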

    Facebook isn't the only tech company to use a performance evaluation system where a percentage of employees is pegged to each performance grade, meaning that there's always a fixed population at risk of being fired. Pioneered by Jack Welch at General Electric in the 1990s and sometimes known as "stack ranking," this method is fairly common in Silicon Valley and was most notoriously used by Microsoft until the company got rid of it in 2013 after widespread employee complaints.


    [Video: Facebook VP Bickert discusses efforts to combat hate speech (CNBC, 12 Dec 2018)]

    Stack ranking systems work well at companies with competitive environments that compare employees on objectively measurable performance, according to Alexandra Michel, professor at the University of Pennsylvania who studies work culture. However, the system tends to break down and cause distrust among employees and create a political atmosphere when applied by companies that measure performance subjectively, or companies that demand employee loyalty in exchange for benefits and the promise of career advancement, Michel said.

    "If you have an environment that is completely cutthroat like Wall Street, this system works pretty well," Michel said. "But if you have employees who come in and want to be taken care of, want to learn, want to be part of a warm group and people who care about them — that's a very jarring mismatch."

    Since early 2017, Facebook has become more rigorous in distributing grades by specific percentages, according to multiple former employees.

    "I had a boss literally say to me 'You don't have enough people in 'meets some,' 'meets most,' and 'meets all,'" said a former director who left earlier this year. "I was finding myself making up things to be hypercritical of employees to give them lower ratings than they really deserved."

    These twice-yearly reviews encourage employees to be particularly productive around June and December, working nights and weekends as they race to impress bosses before reviews, which are typically completed in August and February. It's especially true in December, the half Facebook predominantly uses to determine which employees will receive promotions.

    This rush causes employees to focus on short-term goals and push out features that drive user engagement and improve their own metrics without fully considering potential long-term negative impacts on user experience or privacy, multiple former employees said.

    "If you're up for promotion, and it's based on whether you get a product out or not, you're almost certainly going to push that product out," a former engineer said. "Otherwise you're going to have to wait another year to get that promotion."

    As employees begin gathering peer reviews and buckling up for their next round of calibrations in February, the process will reveal how employees are thinking of the company after a bruising 2018, according to employees who left recently.

    There will be an extra level of anxiety around the process this time, one person said. Folks who have been wanting to leave will be hoping to notch a high rating so they can depart on good terms. Others who are committed to the company will be torn between speaking up about their concerns or staying in line for the sake of their careers. Any changes to the company's grading process this time could reveal whether Facebook is taking special steps to keep valued employees around, or continuing along the same lines.

    "This review cycle will be particularly colorful for them," according to a director who left recently.

    [Video: Here's how to see which apps have access to your Facebook data — and cut them off (CNBC, 22 March 2018)]
    Last edited by Bill Ryan; 11th January 2019 at 14:57.

  16. The Following 13 Users Say Thank You to Bill Ryan For This Post:

    avid (11th January 2019), Billy (11th January 2019), Debra (11th January 2019), Denise/Dizi (2nd August 2019), Deux Corbeaux (11th January 2019), Hervé (11th January 2019), justntime2learn (11th January 2019), meeradas (11th January 2019), norman (11th January 2019), Sophocles (11th January 2019), ThePythonicCow (11th January 2019), Tintin (11th January 2019), Valerie Villars (11th January 2019)

  17. Link to Post #209
    UK Avalon Member avid's Avatar
    Join Date
    19th March 2010
    Location
    NW UK
    Language
    English
    Posts
    2,884
    Thanks
    58,293
    Thanked 15,637 times in 2,654 posts

    Default Re: The Problems with Facebook

    Thanks, got rid of change.org and gofundme, the only 2 that could access me. 👍
    The love you withhold is the pain that you carry
    and er..
    "Chariots of the Globs" (apols to Fat Freddy's Cat)

  18. The Following 3 Users Say Thank You to avid For This Post:

    Bill Ryan (11th January 2019), Denise/Dizi (2nd August 2019), Tintin (6th March 2019)

  19. Link to Post #210
    Avalon Member norman's Avatar
    Join Date
    25th March 2010
    Location
    too close to the hot air exhaust
    Age
    68
    Posts
    8,902
    Thanks
    9,946
    Thanked 55,108 times in 8,176 posts

    Default Re: The Problems with Facebook

    Quote Posted by Bill Ryan (here)

    From https://cnbc.com/2019/01/08/facebook...ss-blamed.html

    Inside Facebook's 'cult-like' workplace, where dissent is discouraged and employees pretend to be happy all the time
    8 Jan 2019

    I've posted a link to an edited audio recording of Alex Jones expanding much more broadly on the extent of what's going on with social media and what's about to hit us in the next few months. It's a huge operation that was originated by Obama even before they knew Hillary was not going to be President. It seems little has slowed or prevented their plan from coming into being.

    Get ready folks, it's all about to notch up to the next level.

    https://projectavalon.net/forum4/show...=1#post1269061
    ..................................................my first language is TYPO..............................................

  20. The Following 5 Users Say Thank You to norman For This Post:

    avid (12th January 2019), Bill Ryan (12th January 2019), Carmody (17th August 2019), Denise/Dizi (2nd August 2019), Tintin (6th March 2019)

  21. Link to Post #211
    UK Avalon Member sunwings's Avatar
    Join Date
    23rd May 2016
    Location
    Barcelona
    Age
    40
    Posts
    659
    Thanks
    3,253
    Thanked 4,667 times in 638 posts

    Default Re: The Problems with Facebook

    The most recent craze on Facebook is the ten-year challenge. However, many people are asking: Was the Facebook '10 Year Challenge' a way to mine data for facial recognition AI?

    Last week a new Facebook challenge went viral, asking users to post a photo from 10 years ago and one from today with the caption "how did aging affect you?" It is now being called the "10-Year Challenge." Over 5.2 million people, including many celebrities, have participated in this challenge.

    “Imagine that you wanted to train a facial recognition algorithm on age-related characteristics and, more specifically, on age progression (e.g., how people are likely to look as they get older). Ideally, you'd want a broad and rigorous dataset with lots of people's pictures. It would help if you knew they were taken a fixed number of years apart—say, 10 years,” said O’Neill.

    https://www.forbes.com/sites/nicolem.../#73c514975859
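    Purely to illustrate the kind of paired dataset O'Neill describes (and not anything Facebook is confirmed to have built), here is a minimal Python sketch in which each training example pairs a person's older photo with a current one and records the known time gap; all names and fields are hypothetical:

```python
# Hypothetical illustration of an age-progression training pair: an older
# photo, a recent photo, and the number of years between them.

from dataclasses import dataclass
from datetime import date

@dataclass
class AgeProgressionPair:
    user_id: str
    photo_then: str      # path or URL of the older photo
    photo_now: str       # path or URL of the recent photo
    years_apart: float   # label a model could train against

def make_pair(user_id, photo_then, date_then, photo_now, date_now):
    gap_years = (date_now - date_then).days / 365.25
    return AgeProgressionPair(user_id, photo_then, photo_now, round(gap_years, 1))

pair = make_pair("u123", "me_2009.jpg", date(2009, 1, 15), "me_2019.jpg", date(2019, 1, 15))
print(pair.years_apart)   # 10.0
```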

  22. The Following 5 Users Say Thank You to sunwings For This Post:

    Bill Ryan (20th January 2019), Carmody (17th August 2019), Constance (6th March 2019), Tintin (6th March 2019), waree (20th January 2019)

  23. Link to Post #212
    Avalon Member Flash's Avatar
    Join Date
    26th December 2010
    Location
    Montreal
    Posts
    9,637
    Thanks
    38,027
    Thanked 53,692 times in 8,940 posts

    Default Re: The Problems with Facebook

    Quote Posted by sunwings (here)
    The most recent craze on Facebook is the ten-year challenge. However, many people are asking: Was the Facebook '10 Year Challenge' a way to mine data for facial recognition AI?

    Last week a new Facebook challenge went viral, asking users to post a photo from 10 years ago and one from today with the caption "how did aging affect you?" It is now being called the "10-Year Challenge." Over 5.2 million people, including many celebrities, have participated in this challenge.

    “Imagine that you wanted to train a facial recognition algorithm on age-related characteristics and, more specifically, on age progression (e.g., how people are likely to look as they get older). Ideally, you'd want a broad and rigorous dataset with lots of people's pictures. It would help if you knew they were taken a fixed number of years apart—say, 10 years,” said O’Neill.

    https://www.forbes.com/sites/nicolem.../#73c514975859
    spot on, I am sure this is it
    How to let the desire of your mind become the desire of your heart - Gurdjieff

  24. The Following 5 Users Say Thank You to Flash For This Post:

    Bill Ryan (21st January 2019), sunwings (21st January 2019), Tintin (6th March 2019), Valerie Villars (21st January 2019), waree (20th January 2019)

  25. Link to Post #213
    United States Avalon Member Bluegreen's Avatar
    Join Date
    18th July 2014
    Location
    Ø
    Language
    ¿
    Posts
    10,823
    Thanks
    45,835
    Thanked 52,179 times in 10,108 posts

    Default Re: The Problems with Facebook

    Internal Facebook Leak Reveals Global Bribery Scheme to Soften Data Privacy Laws

    http://www.zerohedge.com/news/2019-0...a-privacy-laws

  26. The Following 4 Users Say Thank You to Bluegreen For This Post:

    Bill Ryan (11th June 2019), Carmody (17th August 2019), Tintin (6th March 2019), Valerie Villars (6th March 2019)

  27. Link to Post #214
    France On Sabbatical
    Join Date
    7th March 2011
    Location
    Brittany
    Posts
    16,763
    Thanks
    60,315
    Thanked 95,891 times in 15,481 posts

    Default Re: The Problems with Facebook

    ...


    The real problem with Facebook:


    "La réalité est un rêve que l'on fait atterrir" San Antonio AKA F. Dard

    Troll-hood motto: Never, ever, however, whatsoever, to anyone, a point concede.

  28. The Following 8 Users Say Thank You to Hervé For This Post:

    avid (11th June 2019), Bill Ryan (11th June 2019), Carmody (17th August 2019), Constance (12th June 2019), Deux Corbeaux (11th June 2019), PurpleLama (1st August 2019), Sophocles (13th June 2019), Tintin (11th June 2019)

  29. Link to Post #215
    France On Sabbatical
    Join Date
    7th March 2011
    Location
    Brittany
    Posts
    16,763
    Thanks
    60,315
    Thanked 95,891 times in 15,481 posts

    Default Re: The Problems with Facebook

    ...

    The real problem with all of 'em:


    How NeoCon Billionaire Paul Singer Is Driving the Outsourcing of US Tech Jobs to Israel

    by Whitney Webb
    June 11th, 2019


    Paul Singer | AP photo archive
    Several U.S. tech giants including Google, Microsoft and Intel Corporation have filled top positions with former members of Israeli military intelligence and are heavily investing in their Israeli branches while laying off thousands of American employees, all while receiving millions of dollars in U.S. government subsidies funded by American taxpayers.
    WASHINGTON — With nearly 6 million Americans unemployed and regular bouts of layoffs in the U.S. tech industry, major American tech companies like Google, Microsoft and Intel Corporation are nonetheless moving key operations, billions in investments, and thousands of jobs to Israel — a trend that has largely escaped media attention or concern from even “America first” politicians. The fact that this massive transfer of investment and jobs has been so overlooked is particularly striking given that it is largely the work of a single leading neoconservative Republican donor who has given millions of dollars to President Donald Trump.

    To make matters worse, many of these top tech companies shifting investment and jobs to Israel at record rates continue to collect sizable U.S. government subsidies for their operations while they move critical aspects of their business abroad, continue to lay off thousands of American workers, and struggle to house their growing company branches in Israel. This is particularly troubling in light of the importance of the tech sector to the overall U.S. economy, as it accounts for 7.1 percent of total GDP and 11.6 percent of total private-sector payroll.

    Furthermore, many of these companies are hiring members of controversial Israeli companies — known to have spied on Americans, American companies, and U.S. federal agencies — as well as numerous members of Israeli military intelligence as top managers and executives.

    This massive transfer of the American tech industry has largely been the work of one leading Republican donor — billionaire hedge fund manager Paul Singer, who also funds the neoconservative think tank American Enterprise Institute (AEI), the Islamophobic and hawkish think tank Foundation for Defense of Democracies (FDD), the Republican Jewish Coalition (RJC), and also funded the now-defunct Foreign Policy Initiative (FPI).

    Singer’s project to bolster Israel’s tech economy at the U.S.’ expense is known as Start-Up Nation Central, which he founded in response to the global Boycott, Divest and Sanctions (BDS) movement that seeks to use nonviolent means to pressure Israel to comply with international law in relation to its treatment of Palestinians.

    This project is directly linked to Israeli Prime Minister Benjamin Netanyahu, who in recent years has publicly mentioned that it has been his “deliberate policy” to have former members of Israel’s “military and intelligence units … merge into companies with local partners and foreign partners” in order to make it all but impossible for major corporations and foreign governments to boycott Israel.

    In this report, MintPress identifies dozens of former members of an elite Israeli military intelligence unit who now hold top positions at Microsoft, Google and Facebook.

    Singer’s nonprofit organization has acted as the vehicle through which Netanyahu’s policy has been realized, via the group’s close connections to the Israeli PM and Singer’s long-time support for Netanyahu and the Likud Party. With deep ties to Netanyahu, the American Israel Public Affairs Committee (AIPAC), and controversial tech companies — like Amdocs — that spied on the American government, this Singer-funded organization has formed a nexus of connections between the public and private sectors of both the American and Israeli economies with the single goal of making Israel the new technology superpower, largely at the expense of the American economy and government, which currently gives $3.2 billion in aid to Israel annually.

    Researched and developed in Israel
    In recent years, the top U.S. tech companies have been shifting many of their most critical operations, particularly research and development, to one country: Israel. A 2016 report in Business Insider noted that Google, Facebook, Microsoft, Amazon and Apple had all opened up research and development (R&D) centers in recent years, with some of them having as many as three such centers in Israel, a country roughly the size of New Jersey. Other major tech companies that have also opened key operation and research centers in Israel include Sandisk, Nvidia, PayPal, Palantir and Dell. Forbes noted last year that the world’s top 10 tech companies were now “doing mission-critical work in Israel that’s core to their businesses back at HQ.”

    Yet, some of these tech giants, particularly those based in the U.S., are heavily investing in their Israeli branches while laying off thousands of American employees, all while receiving millions of dollars in U.S. government subsidies funded by American taxpayers.

    For example, Intel Corporation, which is the world’s second largest manufacturer of semiconductor computer chips and is headquartered in California, has long been a major employer in Israel, with over 10,000 employees in the Jewish state. However, earlier this year, Intel announced that it would be investing $11 billion in a new factory in Israel and would receive around $1 billion in an Israeli government grant for that investment. Just a matter of months after Intel announced its major new investment in Israel, it announced a new round of layoffs in the United States.

    Yet this is just one recent example of what has become a trend for Intel. In 2018, Intel made public its plan to invest $5 billion in one of its Israeli factories and had invested an additional $15 billion in Israeli-created autonomous driving technology a year prior, creating thousands of Intel jobs in Israel. Notably, over that same time frame, Intel has cut nearly 12,000 jobs in the United States. While this great transfer of investment and jobs was undermining the U.S. economy and hurting American workers, particularly in the tech sector, Intel received over $25 million in subsidies from the U.S. federal government.

    A similar phenomenon has been occurring at another U.S.-based tech giant, Microsoft. Beginning in 2014 and continuing into 2018, Microsoft has laid off well over 20,000 employees, most of them Americans, in several different rounds of staff cuts. Over that same time period, Microsoft has been on a hiring spree in Israel, building new campuses and investing billions of dollars annually in its Israel-based research and development center and in other Israeli start-up companies, creating thousands of jobs abroad. In addition, Microsoft has been pumping millions of dollars into technology programs at Israeli universities and institutes, such as the Technion Institute. Over this same time frame, Microsoft has received nearly $197 million in subsidies from the state governments of Washington, Iowa and Virginia.

    Though Israeli politicians and tech company executives have praised this dramatic shift as the result of Israel’s tech prowess and its growing reputation as a technological innovation hub, much of it has been the work of Singer’s Netanyahu-linked effort to counter a global movement aimed at boycotting Israel and to make Israel a global “cyber power.”

    Start-Up Nation Central and the Neocons


    Paul Singer | AP photo archive

    In 2009, a book titled Start Up Nation: The Story of Israel’s Economic Miracle, written by American neoconservative Dan Senor and Jerusalem Post journalist Saul Singer (unrelated to Paul), quickly rose to the New York Times bestseller list for its depiction of Israel as the tech start-up capital of the world. The book — published by the Council on Foreign Relations, where Senor was then serving as Adjunct Senior Fellow — asserts that Israel’s success in producing so many start-up companies resulted from the combination of its liberal immigration laws and its “leverage of the business talents of young people with military experience.”

    “The West needs innovation; Israel’s got it,” wrote Senor and Singer. In a post-publication interview with the blog Freakonomics, Senor asserted that service in the Israeli military was crucial to Israel’s tech sector success, stating that:
    “Certain units have become technology boot camps, where 18- to 22-year-olds get thrown projects and missions that would make the heads spin of their counterparts in universities or the private sector anywhere else in the world. The Israelis come out of the military not just with hands-on exposure to next-gen technology, but with training in teamwork, mission orientation, leadership, and a desire to continue serving their country by contributing to its tech sector — a source of pride for just about every Israeli.”
    The book, in addition to the many accolades it received from the mainstream press, made a lasting impression on top Republican donor Paul Singer, known for funding the most influential neoconservative think tanks in America, as noted above. Paul Singer was so inspired by Senor and Singer’s book that he spent $20 million to fund and create an organization with a similar name, Start-Up Nation Central (SUNC), in 2012, just three years after the book’s release.

    To achieve his vision, Singer – who is also a top donor to the Republican Party and Trump – tapped Israeli economist Eugene Kandel, who served as Netanyahu’s national economic adviser and chaired the Israeli National Economic Council from 2009 to 2015.

    Senor was likely directly involved in the creation of SUNC, as he was then employed by Paul Singer and, with neoconservatives Bill Kristol and Robert Kagan, had co-founded the Foreign Policy Initiative (FPI), which Singer long funded before it closed in 2017. In addition, Dan Senor’s sister, Wendy Singer (unrelated to either Paul or Saul), who had long directed AIPAC’s office in Israel, became the organization’s executive director.

    SUNC’s management team, in addition to Eugene Kandel and Wendy Singer, includes Guy Hilton as the organization’s general manager. Hilton is a long-time marketing executive at Israeli telecommunications company Amdocs, where he “transformed” the company’s marketing organization. Amdocs was once highly controversial in the United States after it was revealed by a 2001 Fox News investigation that numerous federal agencies had investigated the company, which then had contracts with the 25 largest telephone companies in the country, for its alleged role in an aggressive espionage operation that targeted the U.S. government. Hilton worked at Microsoft prior to joining Amdocs.

    Beyond the management team, SUNC’s board of directors includes Paul Singer, Dan Senor and Terry Kassel, who work for Singer at his hedge fund, Elliott Management, as well as Raphael Ouzan. Ouzan was an officer in Unit 8200, Israel’s elite foreign military intelligence unit, which is often compared to the U.S. National Security Agency (NSA); he co-founded BillGuard the day after he left the unit. Within five months of its founding, BillGuard was backed by funding from PayPal co-founder Peter Thiel and former Google CEO Eric Schmidt. Ouzan is also connected to U.S. tech companies that have greatly expanded their Israeli branches since SUNC’s founding, such as Microsoft, Google, PayPal and Intel, all of which support Ouzan’s nonprofit, Israel Tech Challenge.

    According to reports from the time published in Haaretz and Bloomberg, SUNC was explicitly founded to serve as “a foreign ministry for Israel’s tech industry” and “to strengthen Israel’s economy,” while also aiming to counter the Boycott, Divestment and Sanctions (BDS) movement, which seeks to use a nonviolent boycott to end the illegal military occupation of the West Bank and Israeli apartheid, as well as the growth of illegal Jewish-only settlements in occupied Palestinian territory.

    Since its founding, SUNC has sought to transfer tech jobs from foreign companies to Israel by developing connections and influence with foreign governments and companies so that they “deepen their relationship with Israel’s tech industry.” Though SUNC has since expanded to include other sectors of the Israeli “start-up” economy, its focus has long remained on Israel’s tech, specifically its cybersecurity industry. Foreign investment in this single Israeli industry has grown from $227 million in 2014 to $815 million in 2018.

    In addition to its own activities, SUNC appears to be closely linked to a similar organization, sponsored by Coca-Cola and Daimler Mercedes-Benz, called The Bridge, which also seeks to connect Israeli start-up companies with large international corporations. Indeed, SUNC, according to its website, was actually responsible for Daimler Mercedes-Benz’s decision to join The Bridge, thanks to a delegation from the company that SUNC hosted in Israel and the connections made during that visit.

    Teaming up with Israel’s Unit 8200


    Members of Israel’s signals intelligence Unit 8200 work under a Saudi flag. Photo | Moti Milrod

    Notably, SUNC has deep ties to Israel’s military intelligence unit known as Unit 8200 and, true to Start Up Nation’s praise of IDF service as key to Israel’s success, has been instrumental in connecting Unit 8200 alumni with key roles in foreign companies, particularly American tech companies. For instance, Maty Zwaig, a former lieutenant colonel in Unit 8200, is SUNC’s current director of human capital programs, and SUNC’s current manager of strategic programs, Tamar Weiss, is also a former member of the unit.

    One particularly glaring connection between SUNC and Unit 8200 can be seen in Inbal Arieli, who served as SUNC’s Vice President of Strategic Partnerships from 2014 to 2017 and continues to serve as a senior adviser to the organization. Arieli, a former lieutenant in Unit 8200, is the founder and head of the 8200 Entrepreneurship and Innovation Support Program (EISP), which was the first start-up accelerator in Israel aimed at harnessing “the vast network and entrepreneurial DNA of [Unit] 8200 alumni” and is currently one of the top company accelerators in Israel. Arieli was the top executive at 8200 EISP while working at SUNC.

    Another key connection between SUNC and Unit 8200 is SUNC’s promotion of Team8, a company-creation platform whose CEO and co-founder is Nadav Zafrir, former commander of Unit 8200. In addition to prominently featuring Team8 and Zafrir on the cybersecurity section of its website, SUNC also sponsored a talk by Zafrir and an Israeli government economist at the World Economic Forum, often referred to as “Davos,” that was attended personally by Paul Singer.

    Team8’s investors include Google’s Eric Schmidt, Microsoft, and Walmart — and it recently hired former head of the NSA and U.S. Cyber Command, retired Admiral Mike Rogers. Team8 described the decision to hire Rogers as being “instrumental in helping strategize” Team8’s expansion in the United States. However, Jake Williams, a veteran of NSA’s Tailored Access Operations hacking unit, told CyberScoop:
    “Rogers is not being brought into this role because of his technical experience. …It’s purely because of his knowledge of classified operations and his ability to influence many in the U.S. government and private-sector contractors.”
    In addition to connections to Unit 8200-linked groups like Team8 and 8200 EISP, SUNC also directly collaborates with the IDF in an initiative aimed at preparing young Israeli women to serve in Unit 8200. That initiative, called the CyberGirlz Club, is jointly funded by Israel’s Defense Ministry, SUNC and the Rashi Foundation, the philanthropic organization set up by the Leven family of Perrier-brand water, which has close ties to the Israeli government and IDF.

    “Our aim is to bring the girls to this process already skilled, with the knowledge needed to pass the exams for Unit 8200 and serve in the military as programmers,” Zwaig told Israel National News.

    Seeding American tech
    The connections between SUNC and Unit 8200 are troubling for more than a few reasons, one being that Unit 8200, often likened to the U.S. NSA, closely coordinates with Israel’s intelligence agency, the Mossad, and is responsible for 90 percent of the intelligence material obtained by the Israeli government, according to its former commander Yair Cohen. Cohen told Forbes in 2016 that “there isn’t a major operation, from the Mossad or any intelligence security agency, that 8200 is not involved in.” For obvious reasons, it is troubling indeed that an organization founded by an American billionaire is actively promoting the placement of former military intelligence officers in foreign companies, specifically American companies, while also promoting the transfer of jobs and investment to Israel.

    Particularly troubling is the fact that, since SUNC’s founding, the number of former Unit 8200 members in top positions in American tech companies has skyrocketed. Based on a non-exhaustive MintPress analysis of over 200 LinkedIn accounts of former Israeli military intelligence and intelligence officers at three major tech companies, numerous Unit 8200 alumni were found to currently hold top managerial or executive positions at Microsoft, Google and Facebook.

    At Microsoft, managers for at least 15 of the company’s products and programs — including Microsoft’s lead managers for engineering, product strategy, threat analytics and cloud business intelligence — publicly listed their affiliation with Unit 8200 on their LinkedIn accounts. In addition, the general manager of Microsoft’s Israeli Research and Development Center is also a former member of Unit 8200. In total, 50 of the 200 accounts analyzed belonged to people currently working at Microsoft.

    Similarly, at Google, 28 former Unit 8200 members at the company were identified from their LinkedIn accounts. Among them are Google’s Engineering Director, its strategic partner manager, two growth marketing leads, its lead technical manager, and six product and program managers, including Google’s manager for trust and safety search.

    Facebook also has several Unit 8200 members in prominent positions, though fewer than Google and Microsoft. MintPress identified at least 13 Unit 8200 alumni working for Facebook, including its director of engineering, its lead manager for Express Wi-Fi, and a technical program manager. Notably, Facebook has spent the last several years collaborating with Israel’s government to censor Israel’s critics.

    Of course, Unit 8200’s influence on these companies likely goes well beyond what this non-exhaustive analysis revealed, given that many of these companies have acquired Israeli start-ups run and staffed by Unit 8200 alumni, many of whom went on to found new companies and start-ups shortly after those acquisitions. Furthermore, owing to the limitations of LinkedIn’s search, MintPress was not able to access the complete list of Unit 8200 alumni at these three tech companies, meaning that the eye-opening numbers above were generated from a relatively small sample.

    This jump in the number of Unit 8200 alumni in top positions at tech companies of global importance reflects a policy long promoted by Netanyahu, whose long-time economic adviser is the chief executive of SUNC. During an interview with Fox News last year, Netanyahu was asked by Fox News host Mark Levin if the large growth seen in recent years in Israel’s technology sector was part of Netanyahu’s plan. Netanyahu responded, “That’s very much my plan … It’s a very deliberate policy.” He later added that “Israel had technology because the military, especially military intelligence, produced a lot of capabilities. These incredibly gifted young men and women who come out of the military or the Mossad, they want to start their start-ups.”

    Netanyahu further outlined this policy at the 2019 Cybertech conference in Tel Aviv, where he stated that Israel’s emergence as one of the top five “cyber powers” had “required allowing this combination of military intelligence, academia and industry to converge in one place” and that this further required allowing “our graduates of our military and intelligence units to merge into companies with local partners and foreign partners.” The direct tie-ins of SUNC to Netanyahu and the fact that Paul Singer has also been a long-time political donor and backer of Netanyahu suggest that SUNC is a key part of Netanyahu’s policy of placing former military intelligence and intelligence operatives in strategic positions in major technology companies.

    Notably, just as SUNC was founded to counter the BDS movement, Netanyahu has asserted that this policy of ensuring Israel’s role as a “cyber power” is aimed at increasing its diplomatic power and specifically undermining BDS as well as the United Nations, which has repeatedly condemned Israel’s government for war crimes and violations of international law in relation to the Palestinians.

    Building the bi-national surveillance state


    A Google data center in Hamina, Finland. (AP/Google)

    U.S. tech giants have filled top positions with former members of Israeli military intelligence and have moved strategic, critical operations to Israel, boosting Israel’s economy at the expense of America’s, and SUNC’s role in this marked shift merits scrutiny.

    A powerful American billionaire has built an influential organization with deep connections to the U.S.-Israel lobby (AIPAC), to an Israeli company repeatedly investigated for spying on the U.S. government (Amdocs), and to Israel’s elite military intelligence unit (Unit 8200), and that organization has used its connections to the U.S. government and the U.S. private sector to dramatically shift the operations and make-up of major companies in a critical sector of the U.S. economy.

    Further consider that U.S. government documents leaked by Edward Snowden flagged Israel as a “leading threat” to the infrastructure of U.S. financial and banking institutions, which use much of the software produced by these top tech companies, and also flagged Israel as a top espionage threat. One U.S. government document cited Israel as the third most aggressive intelligence service targeting the U.S., behind Russia and China. Thus, Paul Singer’s pet project, Start-Up Nation Central, has undermined not only the U.S. economy but arguably U.S. national security as well.

    This concern is further exacerbated by the deep ties connecting top tech companies like Microsoft and Google to the U.S. military. Microsoft and Google are both key military contractors — Microsoft in particular, given that it is set to win a lucrative contract for the Pentagon’s cloud management and has partnered with the Department of Defense to produce a “secure” election system known as ElectionGuard that is set to be implemented in some U.S. states for the 2020 general election.
    Whitney Webb is a MintPress News journalist based in Chile. She has contributed to several independent media outlets including Global Research, EcoWatch, the Ron Paul Institute and 21st Century Wire, among others. She has made several radio and television appearances and is the 2019 winner of the Serena Shim Award for Uncompromised Integrity in Journalism.
    =====================================

    Quote MintPress identifies dozens of former members of an elite Israeli military intelligence unit who now hold top positions at Microsoft, Google and Facebook.
    Voilà... why 'em monopolies ain't following "New Testament" nor even Torah's "old testament" ethic and values but only the ever shifting Talmudic precepts of the Sabbatean Frankists...
    Last edited by Hervé; 13th June 2019 at 13:27.
    "La réalité est un rêve que l'on fait atterrir" San Antonio AKA F. Dard

    Troll-hood motto: Never, ever, however, whatsoever, to anyone, a point concede.

  30. The Following 6 Users Say Thank You to Hervé For This Post:

    Bill Ryan (13th June 2019), Carmody (17th August 2019), Franny (1st August 2019), onevoice (13th June 2019), Sophocles (13th June 2019), ThePythonicCow (13th June 2019)

  31. Link to Post #216
    France On Sabbatical
    Join Date
    7th March 2011
    Location
    Brittany
    Posts
    16,763
    Thanks
    60,315
    Thanked 95,891 times in 15,481 posts

    Default Re: The Problems with Facebook

    Via Jim Stone:


    File the following under "you can't make this stuff up"

    Facebook denies shadow banning, then receives patent for its proprietary shadow banning methods

    From The New American:
    "Facebook has continually denied that it participates in the practice of shadow banning - a method of blocking a users' posts or comments from everyone except the user who made the post or comment. But a newly granted patent shows that Facebook not only does practice shadow banning, but wants to protect - by patent - the method it uses for doing so.
    "Despite the fact that Facebook executives denied the practice in congressional testimony in April, the company was awarded a patent by the U.S. Patent and Trademark Office (USPTO) earlier this month for an automated system that would
    "receive a list of proscribed content and block comments containing the proscribed content by reducing the distribution of those comments to other viewing users" while continuing to "display the blocked content to the commenting user such that the commenting user is not made aware that his or her comment was blocked."

    A better definition of shadow banning would be hard to write.

    "And since Facebook would use the patented system to shadow ban "proscribed" (read: banned) content, one can safely assume that would include political speech deemed unacceptable by the social-media behemoth. After all, Facebook recently slapped down a post by this magazine's parent organization, The John Birch Society as "hate speech." That post consisted of the cover of the July 8 issue of the print edition of The New American. That cover showed a real picture of an illegal border crossing and carried the caption, "Immigrant Invasion."
    [...]
    "With a newly patented automatic system for shadow banning anything "proscribed" by Facebook, the company would have no trouble pressing the digital mute button on JBS or any other user whose posts run counter to accepted liberal norms."
    Jim Stone's comment: Well Well Well, Facebook got caught lying. Again. More at the link.
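    As a purely illustrative aside (and emphatically not Facebook's actual patented system), the behavior described in the patent language above can be sketched in a few lines of Python: a comment matching a proscribed-content list is hidden from every viewer except its own author, who is never told it was suppressed. The term list, function names and data layout below are invented for illustration only.
    Code:
    # Illustrative sketch of the "shadow ban" behavior described in the patent
    # language quoted above: comments containing proscribed content are hidden
    # from other viewers but still shown to their author.
    # All names here are hypothetical; this is not Facebook's actual system.

    PROSCRIBED_TERMS = {"example_banned_phrase"}  # hypothetical block list

    def contains_proscribed(text: str) -> bool:
        lowered = text.lower()
        return any(term in lowered for term in PROSCRIBED_TERMS)

    def visible_comments(comments: list[dict], viewer_id: str) -> list[dict]:
        """Return the comments a given viewer is allowed to see.

        A flagged comment is suppressed for everyone except its own author,
        so the author never learns it was blocked.
        """
        visible = []
        for comment in comments:
            flagged = contains_proscribed(comment["text"])
            if not flagged or comment["author_id"] == viewer_id:
                visible.append(comment)
        return visible

    if __name__ == "__main__":
        thread = [
            {"author_id": "alice", "text": "Perfectly ordinary comment"},
            {"author_id": "bob", "text": "This contains example_banned_phrase"},
        ]
        print([c["text"] for c in visible_comments(thread, "bob")])    # bob sees both
        print([c["text"] for c in visible_comments(thread, "carol")])  # carol sees only alice's comment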
    "La réalité est un rêve que l'on fait atterrir" San Antonio AKA F. Dard

    Troll-hood motto: Never, ever, however, whatsoever, to anyone, a point concede.

  32. The Following 10 Users Say Thank You to Hervé For This Post:

    avid (1st August 2019), Bill Ryan (1st August 2019), Cara (2nd August 2019), Carmody (17th August 2019), Denise/Dizi (2nd August 2019), Ernie Nemeth (1st August 2019), Franny (1st August 2019), meeradas (1st August 2019), NancyV (2nd August 2019), Valerie Villars (1st August 2019)

  33. Link to Post #217
    UK Avalon Founder Bill Ryan's Avatar
    Join Date
    7th February 2010
    Location
    Ecuador
    Posts
    34,271
    Thanks
    209,031
    Thanked 457,541 times in 32,788 posts

    Default Re: The Problems with Facebook

    On 13 August 2019,

    Mercola.com leaves Facebook today


    Story at-a-glance
    • As of today, Mercola.com will no longer maintain an active Facebook page. The page will remain, but no further posts will be made. Anyone who has been following me on Facebook is urged to sign up as a subscriber to my newsletter instead
    • Earlier this year I issued a poll to see how you would feel about my leaving Facebook. The results arrived in late March, 2019, with just over 65% agreeing with my decision to withdraw from the platform
    • July 24, 2019, the U.S. Federal Trade Commission announced Facebook will pay a $5 billion fine to settle some of the known privacy breaches, including that of Cambridge Analytica
    • Corporate changes are also required under the FTC order. As CEO, Zuckerberg will be required to provide the FTC with a quarterly statement guaranteeing Facebook’s privacy program complies with the order, and “a false certification could trigger civil or even criminal penalties”
    • $5 billion is only one month's worth of revenue for Facebook, and its planned integration of Instagram, Messenger and WhatsApp will turn it into a global super-monopoly with unprecedented data mining capabilities
    As you probably know, Facebook has promised to combat "fake news" on its platform, but its censorship doesn't end at blatantly fake news articles — far from it. Information that is unfavorable to Facebook (or its advertisers) keeps getting censored out as well.

    One example was the censoring of U.S. presidential candidate Sen. Elizabeth Warren, D-Mass. Warren is an outspoken proponent of breaking up monopolies such as Amazon, Facebook and Google, and has vowed to introduce "sweeping new regulation of Silicon Valley," should she be elected president.

    Three of Warren's ads were reportedly removed by Facebook in March 2019, with a message saying the ads were deleted because they went "against Facebook's advertising policies." Warren took to Twitter to comment on the removal, saying this is an example of why her proposal is so sorely needed.

    Facebook is also "hiding" content that is critical of vaccines, and has barred ads that contain "misinformation" about vaccines. It's likely only a matter of time before the platform starts censoring other health-related content as well — anything that doesn't parrot the drug industry's propaganda — as is already being done by Google.

    In May 2019, Google updated its quality rater guidelines in such a way that even expert views are now buried if deemed "harmful to the public." Google used to rank pages based on whether an author could prove their expertise based on how many people visited a page or the number of other reputable sites that linked to that page. It no longer works that way. Now, if a page is deemed harmful to the public, it gets the lowest possible rating regardless of expertise.
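    As a purely illustrative aside (not Google's actual quality-rater system, whose internals are not public), the change described above amounts to a hard override: a page flagged as "harmful to the public" receives the lowest possible rating no matter how strong its expertise or link signals are. The names, weights and numbers below are invented for illustration only.
    Code:
    # Toy illustration of a "harmful" flag overriding expertise signals.
    # Weights, field names and the scoring formula are invented; this is not
    # Google's actual ranking or rater-guideline implementation.

    LOWEST_RATING = 0.0

    def page_quality(expertise_score: float, inbound_links: int, harmful: bool) -> float:
        """Toy quality score: expertise and links count only if the page
        is not flagged as harmful to the public."""
        if harmful:
            return LOWEST_RATING
        return 0.7 * expertise_score + 0.3 * min(inbound_links / 100, 1.0)

    print(page_quality(0.9, 250, harmful=False))  # strong signals -> high score
    print(page_quality(0.9, 250, harmful=True))   # same signals, flagged -> 0.0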

    Mercola.com will no longer maintain a Facebook presence

    Earlier this year I issued a poll to see how you would feel about my leaving Facebook. The results arrived in late March 2019, with just over 65% agreeing with my decision to withdraw from the platform.
    So, as of today, Mercola.com will no longer maintain an active Facebook page. The page will remain, but no further posts will be made. Anyone who has been following me on Facebook is urged to sign up as a subscriber to my newsletter instead.

    If you have any friends or relatives who are seriously interested in their health, please share important articles with them and encourage them to subscribe to our newsletter as well. Here's how to make sure you continue receiving your Mercola Newsletter subscription.

    Why I'm leaving Facebook

    As described in previous articles, Facebook has repeatedly been caught subverting users' privacy. Your hobbies, habits and preferences are meticulously tracked by the site, and your personal data is then sold to whomever wants access to it.

    This is ostensibly done for the sole purpose of creating targeted marketing, but there have been no real safeguards in place to prevent scammers and even political agents from using the data, as detailed in Frontline's "The Facebook Dilemma," featured above.

    In it, Frontline PBS correspondent James Jacoby investigates Facebook's influence over the democracy of nations, and the lax privacy parameters that allowed for tens of millions of users' data to be siphoned off and used in an effort to influence the U.S. elections.

    The entire profit model of Facebook is based on the selling of your personal information. For individuals who start using Facebook at a young age, the lifetime data harvest is likely to be inconceivably large, giving those who buy or otherwise access that information an extraordinarily comprehensive picture of the individual in question.

    Facebook even has the ability to access your computer or smartphone's microphone without your knowledge. If you suddenly find yourself on the receiving end of ads for products or services you just spoke about out loud, chances are one or more apps are linked into your microphone and are eavesdropping.

    The idea that I'm contributing to the invasive data mining of my 1.8 million Facebook followers has never sat well with me, and I feel leaving the platform and going back to depending on email is the most responsible way to move forward.

    Facebook fact-checkers protect advertisers

    Like Google, Facebook employs third-party fact-checking organizations such as Snopes in an effort to prevent the proliferation of fake news. Their fact-checking is far from unbiased, however, and the bias appears to be directed by Facebook leadership.

    According to a December 2018 report by The Guardian, Brooke Binkowski, former managing editor for Snopes, stated that "it appeared that Facebook was pushing reporters to prioritize debunking misinformation that affected Facebook advertisers."

    At that point, "You're not doing journalism any more. You're doing propaganda," Binkowski told The Guardian. I couldn't agree more, and my site has been on the receiving end of that agenda.

    Below is a screenshot of a Facebook post for one of my Splenda articles, which, based on "fact-checking" by Snopes, was classified as "False," thereby reducing its potential views by an average of 80%. This despite the fact that I'm reporting published, peer-reviewed science.



    Snopes also bungled its fact-checking of a vaccine injury report by former CBS correspondent Sharyl Attkisson. Snopes clearly had an agenda, which was to discredit Attkisson's report, as they simply didn't look at the facts presented. According to Attkisson:
    "[T]he Snopes article debunks claims that were never made and uses one-sided references as its sources — other propagandists — without disclosing their vaccine industry ties."
    The fact of the matter is, Snopes engages in massive censorship of natural health, and promotes industry talking points regardless of what the scientific reality is.

    Indeed, I would argue there's simply no way one can trust any given organization or company to dictate credibility and preside over what's true and what's not. There are typically two or more sides to any story, and money can easily tip the scales on which side gets to be "true" and which is deemed "false."

    Facebook's $5 billion fine should have been $2 trillion

    Facebook is currently facing a number of legal probes and lawsuits regarding its controversial data-sharing practices and poor security measures. July 24, 2019, the U.S. Federal Trade Commission announced Facebook will pay a $5 billion fine to settle some of the known privacy breaches, including that of Cambridge Analytica.

    While $5 billion may seem like a lot of money, Facebook's stock actually rose after the announcement. The company should have paid $2 trillion in fines for its serious privacy violations.

    The federal government gave Facebook a sweetheart deal, despite the FTC's own glowing description of the settlement:
    "If you've ever wondered what a paradigm shift looks like, you're witnessing one today. The FTC's $5 billion civil penalty against Facebook for violations of an earlier FTC order is record-breaking and history-making.
    In addition, the settlement requires Facebook to implement changes to its privacy practices, its corporate structure, and the role of CEO Mark Zuckerberg that are seismic in scope. Simply put, when it comes to the business of consumer privacy, it's no longer business as usual at Facebook."
    Facebook faces corporate restructuring

    The new corporate structure of Facebook "will hold the company accountable for the decisions it makes about its users' privacy," the FTC notes. However, while this is the largest penalty ever imposed on a company for consumer privacy violations, it still amounts to only about one month's worth of revenue: Facebook's 2018 revenue was roughly $56 billion, or about $4.7 billion per month.

    In fact, the fine was so small that Facebook's market value rose by MORE than $5 billion the day the fine was announced. The fine should have been at least $50 billion, and more likely $500 billion. So, while the FTC tries to trumpet the fine as a considerable win, skepticism may still be warranted.

    The agency had originally considered holding Facebook founder Mark Zuckerberg personally accountable, but that didn't happen. And, considering how little $5 billion actually means to Facebook, marked change seems unlikely without personal accountability. As reported by Ars Technica:
    "Democratic members of Congress blasted the settlement. 'This reported $5 billion penalty is barely a tap on the wrist, not even a slap,' Sen. Richard Blumenthal (D-Conn.) said in a statement.
    'Such a financial punishment for purposeful, blatant illegality is chump change for a company that makes tens of billions of dollars every year. Will Facebook be compelled to alter its present, systematic abuse of privacy? Based on the reported settlement, the answer is sadly, no.'
    Sen. Ron Wyden (D-Ore.) agreed. 'Despite Republicans' promises to hold big tech accountable, the FTC appears to have failed miserably at its best opportunity to do so,' Wyden said.
    'No level of corporate fine can replace the necessity to hold Mark Zuckerberg personally responsible for the flagrant, repeated violations of Americans' privacy. That said, this reported fine is a mosquito bite to a corporation the size of Facebook.'"
    Will Facebook change?

    Zuckerberg has repeatedly demonstrated a complete lack of integrity when it comes to fulfilling his promises of privacy.

    In fact, in a 2010 talk given at the Crunchie awards, he stated that "privacy is no longer a social norm," implying that using social media automatically strips you of the right to privacy, and that is why they do not respect it. Will any of this change? According to the FTC, the new order requires Facebook to:
    "… implement a stringent program to monitor third-party developers and terminate access to any developer that doesn't follow the rules. In addition, Facebook can't use for advertising purposes the phone numbers it obtained specifically for security.
    When it comes to facial recognition technology, the order requires Facebook to give clear notice of how it uses that information and it must get consumers' express consent before putting that data to a materially different use.
    Facebook also will have to encrypt passwords and can't ask people for their passwords to other services, and must report any privacy incident to the FTC within 30 days. On top of everything Facebook will have to do to protect consumers' privacy, it also has to implement a comprehensive data security program.
    Another important consideration: These new accountability provisions don't just apply to Facebook. They also apply to companies Facebook controls, like Instagram, WhatsApp, and other Facebook-owned affiliates that it shares consumers' information with between now and 2039."
    Facebook faces privacy oversight

    While Zuckerberg escaped personal accountability for his decisions up to this point, his job description is being overhauled by the FTC's new order. FTC writes:
    "The order explains in detail a new system of independent control, multi-layer accountability, and personal responsibility over Facebook's practices, and substantially limits Mr. Zuckerberg's unfettered say in privacy decisions.
    In fact, for the next 20 years, anytime Facebook makes a privacy decision, multiple independent watchdogs will be looking over its shoulder … Facebook's Board of Directors will name a new subgroup that will serve as an Independent Privacy Committee.
    Facebook officers and employees — including Mr. Zuckerberg — are disqualified from membership. The Committee will be briefed about all material privacy risks and issues at the company, and has approval-and-removal authority over a new cadre of designated compliance officers and a third-party assessor that will not answer to Facebook."
    Designated compliance officers approved by the Independent Privacy Committee will oversee Facebook's day-to-day privacy program, and "a third-party assessor with broad monitoring powers" will be appointed (and approved by the FTC) to evaluate Facebook's privacy practices every two years.

    Zuckerberg himself will have no control over any of these parties (the Independent Privacy Committee, the compliance officers or the third-party assessor), and he will henceforth also have some of his own skin in the game.

    As CEO, Zuckerberg will be required to provide the FTC with a quarterly statement guaranteeing Facebook's privacy program complies with the order, and "a false certification could trigger civil or even criminal penalties," the FTC states.

    The FTC will also have "unparalleled access to Facebook's decision-making" and can at any time request documentation pertaining to any decision made without running into red tape that might limit its right to discovery.

    Facebook still has unprecedented data mining capabilities

    According to the FTC, its goal with this settlement "is the creation of a new culture at Facebook where the company finally lives up to the privacy promises it has made to the millions of American consumers who use its platform."

    Whether this intention turns into reality remains to be seen. The fact of the matter is, Facebook is still a global monopoly, and its plan to integrate Instagram, Messenger and WhatsApp will turn it into a global super-monopoly.

    This merger has been criticized by tech experts, as it robs users of their ability to choose between messaging services, leaving them virtually no choice but to submit to Facebook's invasive privacy settings. Even if privacy is improved through the FTC's new order, it still gives Facebook unprecedented data mining capabilities.

    On February 7, 2019, Forbes reported that the German antitrust regulator, the Bundeskartellamt, had become the first to prohibit "the cross-application data sharing that underpins Facebook's advertising business model."

    Facebook's services will be banned in Germany if it integrates the three messaging platforms, Bundeskartellamt warns. If other countries follow suit, the merger would fall through, as it probably should. Facebook's data mining already poses a large enough threat.

    Whether you worry about data mining or not, if you're a chronic user of Facebook, you may still want to consider unplugging from time to time for your psychological health. According to a study by researchers at New York University and Stanford, Facebook users report feeling happier and more satisfied with life after leaving the platform for a month.

    They were also less likely to report feelings of anxiety, depression and loneliness — a finding that supports the idea that social media is a poor substitute for actual face-to-face interactions.

  34. The Following 13 Users Say Thank You to Bill Ryan For This Post:

    anandacate (17th August 2019), avid (17th August 2019), Carmody (17th August 2019), Hervé (17th August 2019), Kryztian (17th August 2019), Mark (Star Mariner) (17th August 2019), meeradas (19th September 2019), Mike (17th August 2019), Rosemarie (17th August 2019), Sadieblue (17th August 2019), Sue (Ayt) (17th August 2019), Valerie Villars (17th August 2019), Yoda (17th August 2019)

  35. Link to Post #218
    Avalon Member Carmody's Avatar
    Join Date
    19th August 2010
    Location
    Winning The Galactic Lottery
    Posts
    11,389
    Thanks
    17,597
    Thanked 82,316 times in 10,234 posts

    Default Re: The Problems with Facebook

    Still not a member of Facebook.......

    Even though I should be, for commercial reasons (it is literally costing me money), the answer is still no, and is likely to remain that way.

    Secondly, I don't trust them to take their hooks out of me, if I was a member and then quit.

    I suspect the agreement, or their interpretation of the agreement... is that they have the rights to follow me and collect and use data from me...forever.
    Last edited by Carmody; 17th August 2019 at 18:08.
    Interdimensional Civil Servant

  36. The Following 7 Users Say Thank You to Carmody For This Post:

    Bill Ryan (17th August 2019), Hervé (17th August 2019), Kryztian (18th August 2019), meeradas (19th September 2019), Muzz (17th August 2019), Philippe (17th August 2019), wondering (17th August 2019)

  37. Link to Post #219
    Moderator (on Sabbatical) Cara's Avatar
    Join Date
    12th February 2014
    Location
    Dubai, United Arab Emirates
    Language
    English
    Posts
    1,431
    Thanks
    9,850
    Thanked 7,481 times in 1,331 posts

    Default Re: The Problems with Facebook

    Facebook steps into the role of the judiciary.... this may be a precedent for corporate managed and controlled law in other spheres of social life.

    As quoted in the article below, Mark Zuckerberg:
    “We expect the board will only hear a small number of cases at first, but over time we hope it will expand its scope and potentially include more companies across the industry as well”

    Quote Facebook will bankroll an ‘independent supreme court’ to moderate your content & set censorship precedents
    Published time: 18 Sep, 2019 01:43

    Facebook has unveiled the charter for its ‘supreme court,’ a supposedly independent content moderation board that will take money from, and be appointed by, Facebook itself – while making binding decisions. What could go wrong?

    Facebook has released preliminary plans for an “Oversight Board” tasked with reviewing content disputes. The 40-member body, referred to previously as Facebook’s “supreme court,” will have the authority to make binding decisions regarding cases brought to it by users or by the social media behemoth itself, according to a white paper released Tuesday, which stresses that the new board will be completely independent of Facebook, by popular request.

    The company has clearly taken pains to make this new construct look independent, the sort of place a user might be able to go to get justice after being deplatformed by an algorithm incapable of understanding sarcasm or context. But board members will be paid out of a trust funded by Facebook and managed by trustees appointed by Facebook, while the initial board members will also be appointed by Facebook.

    “We agreed with feedback that Facebook alone should not name the entire board,” the release states, proceeding to outline how Facebook will select “a small group of initial members,” who will then fill out the rest of the board. The trustees – also appointed by Facebook – will make the formal appointments of members, who will serve three-year terms.

    Facebook insists it is “committed to selecting a diverse and qualified group” – no current or former Facebook employees or spouses thereof, current government officials or lobbyists (former ones are apparently OK), high-ranking officials within political parties (low-ranking is apparently cool), or significant shareholders of Facebook need apply. A law firm will be employed to vet candidates for conflicts of interest, but given Facebook’s apparent inability to recognize the conflict of interest inherent in paying “independent” board members to make binding content decisions, it’s hard to tell what would qualify as a conflict.

    How will Facebook decide which cases get the democracy treatment? Cases with significant real-world impact – meaning they affect a large number of people, threaten “someone else’s voice, safety, privacy, or dignity,” or have sparked public debate – and are difficult to parse with regard to existing policy will be heard first. “For now,” only Facebook-initiated cases will be heard by the board – Facebook users will be able to launch their own appeals by mid-2020. Is the company merely reaching for an “independent” rubber-stamp to justify some of its more controversial decisions as the antitrust sharks start circling? Decisions will not only be binding, but also applicable to other cases not being heard, if they’re deemed similar enough – potentially opening a Pandora’s box of far-reaching censorship.

    In a letter accompanying the white paper, Facebook CEO Mark Zuckerberg claims the company’s moderators take into account “authenticity, safety, privacy, and dignity – guided by international human rights standards” when they make a decision to take down content. Given that the company’s own lawyers have questioned the very existence of users’ privacy, what does this bode for the other “values,” let alone international human rights standards?

    Perhaps most ominously, Zuckerberg seems to have bigger things in mind for his Oversight Board than merely weighing in on Facebook content moderation decisions. “We expect the board will only hear a small number of cases at first, but over time we hope it will expand its scope and potentially include more companies across the industry as well” (emphasis added). Not exactly a throwaway line from the man who said he wanted Facebook to become an internet driver’s license. The private-sector social credit score may be closer than we think – and Zuckerberg would very much like to be the scorekeeper.

    By Helen Buyniski, RT
    From: https://www.rt.com/news/469041-faceb...mpression=true
    *I have loved the stars too dearly to be fearful of the night*

  38. The Following 2 Users Say Thank You to Cara For This Post:

    Bill Ryan (19th September 2019), meeradas (19th September 2019)

  39. Link to Post #220
    Moderator (on Sabbatical) Cara's Avatar
    Join Date
    12th February 2014
    Location
    Dubai, United Arab Emirates
    Language
    English
    Posts
    1,431
    Thanks
    9,850
    Thanked 7,481 times in 1,331 posts

    Default Re: The Problems with Facebook

    Facebook enters the mind control tech sector:

    Quote Facebook to buy startup that lets people control computers with their mind
    Published time: 24 Sep, 2019 09:57

    Social media giant Facebook has agreed to acquire New York neural interface startup CTRL-Labs for up to a reported $1 billion. The company develops software allowing users to control computer devices with their brain.

    Facebook is seeking “more natural, intuitive ways” to work with different devices and wants to build “this kind of technology at scale,” the company’s vice-president of augmented reality and virtual reality divisions, Andrew Bosworth, said in a post on Monday announcing the acquisition. CTRL-Labs will join the company’s Reality Labs team, a division formerly known as Oculus Research, which focuses on AR and VR technology.

    While the financial terms of the deal have not been revealed, media reports claim that the tech startup cost the social media giant between $500 million and $1 billion, which could make it one of Facebook’s most substantial acquisitions in the last 5 years.

    In his post, Bosworth mentioned CTRL-Labs’ flagship product, a wristband that measures neural activity and translates it into computer input. The device does not exactly read your mind, however; it decodes the electrical impulses that come from muscle fibers as they move and converts them into a digital signal your device can understand.

    “Technology like this has the potential to open up new creative possibilities and reimagine 19th century inventions in a 21st century world,” the vice-president said. “This is how our interactions in VR and AR can one day look. It can change the way we connect.”

    CTRL-Labs was founded in 2015 by Thomas Reardon and Patrick Kaifosh, both PhDs in neuroscience. Reardon told Forbes that he considered the bracelet a “universal controller for all your interactions with technology.” In February, the startup drew interest from other tech giants, as it raised $28 million from Alphabet GV and Amazon Alexa Fund.
    From: https://www.rt.com/business/469486-f...mpression=true
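    For readers curious about the general idea described in the article, below is a toy sketch of how surface electromyography (EMG) readings from a wristband might be mapped to a discrete input command. This is not CTRL-Labs' actual decoding pipeline; the channel names, windowing and threshold are invented purely for illustration.
    Code:
    # Toy sketch of the idea described above: electrical activity from muscle
    # fibers (EMG) is sampled per channel and mapped to a crude command.
    # Channel names, threshold and the decision rule are invented; this is
    # not CTRL-Labs' actual method.
    from statistics import mean

    def rms(window):
        """Root-mean-square amplitude of one channel's sample window."""
        return mean(x * x for x in window) ** 0.5

    def decode_gesture(channels, threshold=0.5):
        """Return the name of the most active channel if it crosses the
        threshold, otherwise None (interpreted as 'no input')."""
        activity = {name: rms(window) for name, window in channels.items()}
        name, level = max(activity.items(), key=lambda kv: kv[1])
        return name if level >= threshold else None

    if __name__ == "__main__":
        sample = {
            "index_finger": [0.9, -0.8, 0.7, -0.9],  # strong activity
            "thumb": [0.1, -0.05, 0.08, -0.1],       # weak activity
        }
        print(decode_gesture(sample))  # -> "index_finger"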
    *I have loved the stars too dearly to be fearful of the night*
