You know it's coming… the next Android flagship from Samsung, the Galaxy Note 20 theverge.com/2020/7/7/21316609/samsung-galaxy-unpacked-note-20-ultra-fold-2-event-date-august-5-announcement
Still using Zoom? Zoom won’t encrypt free calls because it wants to comply with law enforcement

If you’re a free Zoom user waiting for the company to roll out end-to-end encryption for better protection of your calls, you’re out of luck. Free calls won’t be encrypted, and law enforcement will be able to access your information in case of ‘misuse’ of the platform.

Zoom CEO Eric Yuan said today that the video conferencing app’s upcoming end-to-end encryption feature will be available only to paid users. After announcing the company’s financial results for Q1 2020, Yuan said the firm wants to keep the feature away from free users so it can work with law enforcement in case of the app’s misuse: "Free users, for sure, we don’t want to give that [end-to-end encryption]. Because we also want to work it together with FBI and local law enforcement, in case some people use Zoom for bad purpose."

In the past, platforms with end-to-end encryption, such as WhatsApp, have faced heavy scrutiny in many countries because they were unable to trace the origins of problematic and misleading messages. Zoom likely wants to avoid being in such a position and to comply with local laws so it can keep operating across the globe. Alex Stamos, working as a security consultant with Zoom, said the company wants to catch repeat offenders for hate speech or child exploitative content by not offering end-to-end encryption to free users.

In March, The Intercept published a report stating that the company didn’t use end-to-end encryption, despite claiming otherwise on its website and in its security white paper. Zoom later apologized and issued a clarification specifying that it didn’t provide the feature at that time. Last month, the company acquired Keybase.io, an encryption-based identity service, to build out its end-to-end encryption offering. Yuan said today that the company has gotten a lot of feedback from users on encryption and is working on executing it. However, he didn’t specify a release date for the feature.
According to the Q1 2020 results, the company grew 169% year-on-year in terms of revenue. Zoom has more than 300 million daily participants attending meetings through the platform.
Yup, Facebook and Mark are evil. Facebook reportedly ignored its own research showing algorithms divided users

An internal Facebook report presented to executives in 2018 found that the company was well aware that its product, specifically its recommendation engine, stoked divisiveness and polarization, according to a new report from The Wall Street Journal. Yet, despite warnings about the effect this could have on society, Facebook leadership ignored the findings and has largely tried to absolve itself of responsibility with regard to partisan divides and other forms of polarization it directly contributed to, the report states. The reason? Changes might disproportionately affect conservatives and might hurt engagement, the report says.

“Our algorithms exploit the human brain’s attraction to divisiveness,” one slide from the presentation read. The group found that if this core element of its recommendation engine were left unchecked, it would continue to serve Facebook users “more and more divisive content in an effort to gain user attention & increase time on the platform.” A separate internal report, crafted in 2016, said 64 percent of people who joined an extremist group on Facebook only did so because the company’s algorithm recommended it to them, the WSJ reports.

FACEBOOK FOUND THAT ITS ALGORITHMS WERE PUSHING PEOPLE TO JOIN EXTREMIST ORGANIZATIONS

Leading the effort to downplay these concerns and shift Facebook’s focus away from polarization has been Joel Kaplan, Facebook’s vice president of global public policy and former chief of staff under President George W. Bush. Kaplan is a controversial figure in part due to his staunch right-wing politics — he supported Supreme Court Justice Brett Kavanaugh throughout his nomination — and his apparent ability to sway CEO Mark Zuckerberg on important policy matters.
Kaplan has taken on a larger role within Facebook since the 2016 election, and critics say his approach to policy and moderation is designed to appease conservatives and stave off accusations of bias. Kaplan, for instance, is believed to be partly responsible for Facebook’s controversial political ad policy, in which the company said it would not regulate misinformation put forth in campaign ads by fact-checking them. He’s also influenced Facebook’s more hands-off approach to speech and moderation over the last few years by arguing the company doesn’t want to seem biased against conservatives.

The Wall Street Journal says Kaplan was instrumental in weakening or entirely killing proposals to change the platform to promote social good and reduce the influence of so-called “super-sharers,” who tended to be aggressively partisan and, in some cases, so hyper-engaged that they might be paid to use Facebook or might be a bot. Kaplan pushed back against some of the proposed changes — many of which were crafted by News Feed integrity lead Carlos Gomez Uribe — for fear they would disproportionately affect right-wing pages, politicians, and other parts of the user base that drove up engagement.

One notable project Kaplan undermined was called Common Ground, which sought to promote politically neutral content on the platform that might bring people together around shared interests like hobbies. But the team building it said it might require Facebook to take a “moral stance” in some cases by choosing not to promote certain types of polarizing content, and that the effort could harm overall engagement over time, the WSJ reports. The team has since been disbanded.

In a statement, a Facebook spokesperson tells The Verge, “We’ve learned a lot since 2016 and are not the same company today. We’ve built a robust integrity team, strengthened our policies and practices to limit harmful content, and used research to understand our platform’s impact on society so we continue to improve.
Just this past February we announced $2M in funding to support independent research proposals on polarization.”