Google is putting its foot down on phishing scams on YouTube Shorts.
The search giant has recently announced it is making certain links posted on YouTube Shorts videos unclickable to prevent more people from falling for phishing scams.
YouTube launched its YouTube Shorts offering to the public in 2021 in response to TikTok's rising popularity at the time.
Clickable No More
Google mentioned in its YouTube Help announcement that clickable social media icons will no longer show on desktop channel banners starting Aug. 10. Additionally, links in the descriptions, comment sections, and the vertical live feed of YouTube Shorts videos will become unclickable starting Aug. 31, rolling out gradually.
According to a report from Android Police, the change is a response to scammers attempting to steal information from potential victims by posting fake links on YouTube Shorts videos. With YouTube's short-form video platform taking off in 2021 to the tune of two billion monthly logged-in users, it is no wonder bad actors have begun to take advantage of the community that formed around the feature.
"While we have existing systems and policies in place to detect and remove these kinds of links, we need to take additional preventative measures to make it harder to take advantage of users via links," Google explained in its announcement.
The new rule is expected to make it harder for scammers to fool potential victims. However, it also makes it harder for YouTubers to plug their channels, content, and other external sites.
Thankfully, Google revealed it will work on tools and features to help creators more safely include links in their content. It said it would introduce a safer way to post such links "by the end of September."
Meanwhile, YouTube viewers on mobile and desktop will start seeing prominent clickable links on creators' channel profiles near the 'Subscribe' button. This feature is expected to help YouTubers link to their content in the meantime.
Other Anti-Phishing Measures
Apart from making links unclickable in YouTube Shorts videos, Google is also cracking down on impersonation across YouTube and deploying more powerful comment moderation tooling to help reduce phishing scams beyond fake links. The search giant said that it made substantial improvements to its policies and systems that help detect and remove impersonating channels.
These improvements particularly target channels that impersonate popular YouTubers, artists, and public figures. You may remember that Elon Musk called out YouTube's impersonation problem, as Forbes previously reported.
With the improvements Google made, there should be fewer impersonators on its video-sharing platform soon.
Aside from these improvements, Google also shared that it enhanced its "Increase Strictness" feature, which detects and holds potentially spammy and inappropriate comments for optional review. Following the enhancements, Google saw a 200% increase in comments held for review through Increase Strictness, comparing the first week of June (after the improvements) with the first week of May (before the improvements).