The end of all those Chinese phone brands' plans in India? China’s Oppo canceled the live online launch of its flagship smartphone in India on Wednesday after a border clash between the two countries that has renewed calls from local Indian trader groups to shun Chinese products. business.financialpost.com/pmn/business-pmn/oppo-cancels-live-online-phone-launch-in-india-amid-calls-to-boycott-chinese-goods-2

Related Feeds

Finally, someone to beat TikTok? Also from China, tho. TikTok has a new competitor: Zynn, a nearly button-for-button clone of TikTok that differentiates itself with one key twist — it pays users to sign up, watch videos, and convince others to follow suit. The app launched at the beginning of May, and it’s now the number one free app in Apple’s App Store and in the top 10 on Google’s Play Store. theverge.com/2020/5/29/21274994/zynn-tiktok-clone-pay-watch-videos-kuaishou-bytedance-rival

That’s one way to get users...

Just stop using FB's WhatsApp, if you care about privacy.

Signal announces new face-blurring tool for Android and iOS

Encrypted messaging app Signal has announced a new face-blurring tool that will be incorporated into the latest Android and iOS versions of the software. Users sharing pictures through the app will be able to quickly blur faces, adding another layer of privacy to pictures, though not necessarily hiding the subject’s identity completely.

In a blog post announcing the update, Signal co-founder Moxie Marlinspike linked the update to the worldwide protests against racism and police violence sparked by the killing of George Floyd by law enforcement. These protests have led to record downloads for Signal, which uses end-to-end encryption to make messages harder to intercept. “We’ve also been working to figure out additional ways we can support everyone in the street right now,” writes Marlinspike. “One immediate thing seems clear: 2020 is a pretty good year to cover your face.”

When you take a picture through Signal and select the Blur option in the toolbar, the app will automatically detect any faces it spots in your image. If it misses any, users can simply blur out faces by hand, or blur any other features they want to hide. All processing is done on-device, meaning uncensored images never leave the user’s phone.

Although blurring faces in photographs certainly makes pictures more private, it’s by no means a foolproof way of anonymizing images and hiding someone’s identity. Some blurring and pixelation methods can be reversed with the right tools, for example. And anyone seeking to identify someone in a picture can work from other information, such as clothing and tattoos, which can be compared with other, un-blurred images. Even if attendees at a protest, for example, hide the identity of fellow protesters, that doesn’t mean other groups and individuals will do the same. Surveillance cameras, police body cameras, and press photographers are all capturing images. Ultimately, the best way to obscure your identity is to take matters into your own hands and wear a mask.
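Signal hasn't published the details of its detector or blur pass in the article above, but the general idea (detect face regions on-device, then blur just those regions) is straightforward. Here is a minimal, hypothetical sketch in Python using OpenCV; the file names and blur parameters are placeholders, not Signal's implementation:

```python
# Sketch of on-device face blurring, in the spirit of Signal's feature.
# Assumes opencv-python is installed; "photo.jpg" is a placeholder input file.
import cv2


def blur_faces(input_path: str, output_path: str) -> int:
    """Detect faces in an image, blur each detected region, and save the result.

    Returns the number of faces blurred.
    """
    image = cv2.imread(input_path)
    if image is None:
        raise FileNotFoundError(input_path)

    # Haar-cascade face detector that ships with OpenCV. All processing stays
    # local, mirroring the article's point that uncensored images never leave
    # the device.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    for (x, y, w, h) in faces:
        roi = image[y:y + h, x:x + w]
        # Heavy Gaussian blur over the detected face region. As the article
        # notes, blurring is not foolproof anonymization; it only obscures
        # the region it covers.
        image[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 30)

    cv2.imwrite(output_path, image)
    return len(faces)


if __name__ == "__main__":
    print(f"Blurred {blur_faces('photo.jpg', 'photo_blurred.jpg')} face(s)")
```

Any faces the detector misses would still need manual blurring, which is exactly why Signal also lets users blur regions by hand.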

Yup, Facebook and Mark are evil.

Facebook reportedly ignored its own research showing algorithms divided users

An internal Facebook report presented to executives in 2018 found that the company was well aware that its product, specifically its recommendation engine, stoked divisiveness and polarization, according to a new report from The Wall Street Journal. Yet, despite warnings about the effect this could have on society, Facebook leadership ignored the findings and has largely tried to absolve itself of responsibility with regard to partisan divides and other forms of polarization it directly contributed to, the report states. The reason? Changes might disproportionately affect conservatives and might hurt engagement, the report says.

“Our algorithms exploit the human brain’s attraction to divisiveness,” one slide from the presentation read. The group found that if this core element of its recommendation engine were left unchecked, it would continue to serve Facebook users “more and more divisive content in an effort to gain user attention & increase time on the platform.” A separate internal report, crafted in 2016, said 64 percent of people who joined an extremist group on Facebook only did so because the company’s algorithm recommended it to them, the WSJ reports. In other words, Facebook found that its own algorithms were pushing people to join extremist organizations.

Leading the effort to downplay these concerns and shift Facebook’s focus away from polarization has been Joel Kaplan, Facebook’s vice president of global public policy and former chief of staff under President George W. Bush. Kaplan is a controversial figure in part due to his staunch right-wing politics — he supported Supreme Court Justice Brett Kavanaugh throughout his nomination — and his apparent ability to sway CEO Mark Zuckerberg on important policy matters. Kaplan has taken on a larger role within Facebook since the 2016 election, and critics say his approach to policy and moderation is designed to appease conservatives and stave off accusations of bias. Kaplan, for instance, is believed to be partly responsible for Facebook’s controversial political ad policy, in which the company said it would not regulate misinformation put forth in campaign ads by fact-checking them. He’s also influenced Facebook’s more hands-off approach to speech and moderation over the last few years by arguing the company doesn’t want to seem biased against conservatives.

The Wall Street Journal says Kaplan was instrumental in weakening or entirely killing proposals to change the platform to promote social good and reduce the influence of so-called “super-sharers,” who tended to be aggressively partisan and, in some cases, so hyper-engaged that they might be paid to use Facebook or might be a bot. Kaplan pushed back against some of the proposed changes — many of which were crafted by News Feed integrity lead Carlos Gomez Uribe — for fear they would disproportionately affect right-wing pages, politicians, and other parts of the user base that drove up engagement.

One notable project Kaplan undermined was called Common Ground, which sought to promote politically neutral content on the platform that might bring people together around shared interests like hobbies. But the team building it said it might require Facebook to take a “moral stance” in some cases by choosing not to promote certain types of polarizing content, and that the effort could harm overall engagement over time, the WSJ reports. The team has since been disbanded.
In a statement, a Facebook spokesperson tells The Verge, “We’ve learned a lot since 2016 and are not the same company today. We’ve built a robust integrity team, strengthened our policies and practices to limit harmful content, and used research to understand our platform’s impact on society so we continue to improve. Just this past February we announced $2M in funding to support independent research proposals on polarization.”
