Social Bot

A social bot (also: socialbot or socbot) is a type of chatbot that is employed in social media networks to automatically generate messages (e.g. tweets) or, more generally, to advocate certain ideas, support campaigns, or shape public relations, either by acting as a "follower" or as a fake account that gathers followers itself. In this respect, social bots can be said to have passed the Turing test.[1][2] Social bots appear to have played a significant role in the 2016 United States presidential election,[3][4] and their history appears to go back at least to the 2010 United States midterm elections.[5] It is estimated that 9-15% of active Twitter accounts may be social bots,[6] and that bots made up 15% of the Twitter population active in the discussion of the US presidential election; at least 400,000 bots were responsible for about 3.8 million tweets, roughly 19% of the total volume.[7] All of these estimates are disputed.

Twitterbots are already well-known examples, but corresponding autonomous agents have also been observed on Facebook and elsewhere. Social bots can now generate convincing internet personas that are capable of influencing real people,[8][1][9] although they are not always reliable.[10]

Besides being able to produce messages autonomously, social bots also share many traits with spambots, notably their tendency to infiltrate large user groups.[11]

Unless strict regulations on their use are passed, social bots are expected to play a major role in shaping public opinion by acting as incessant, never-tiring influencers.[12][13][14]

Uses

Lutz Finger identifies five immediate uses for social bots:[15]

  • foster fame: having an arbitrary number of (unrevealed) bots as (fake) followers can help simulate real success
  • spamming: having advertising bots in online chats is similar to email spam, but a lot more direct
  • mischief: e.g. signing up an opponent with many fake identities and spamming the account, or helping others discover this, in order to discredit the opponent
  • bias public opinion: influence trends by countless messages of similar content with different phrasings
  • limit free speech: important messages can be pushed out of sight by a deluge of automated bot messages

The effects of all of these uses resemble, and can support, the methods of traditional psychological warfare.

Detection

The first generation of bots could sometimes be distinguished from real users by their superhuman capacity to post messages around the clock (and at massive rates). Later developments have succeeded in imprinting more "human" activity and behavioural patterns on these agents.[16] To detect social bots reliably, a variety of criteria must be applied together using pattern-detection techniques, some of which are:[9]

  • cartoon figures as user pictures (sometimes random real users' pictures are appropriated instead, a form of identity fraud)
  • reposting rate
  • temporal patterns
  • sentiment expression
  • followers-to-friends ratio[17][18]
  • length of user names
  • variability in (re)posted messages
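The criteria above can be combined into a simple rule-based score. The following is a minimal illustrative sketch, not a validated detector; all thresholds, field names, and weights are assumptions chosen for the example, not values from the cited literature:

```python
from dataclasses import dataclass

@dataclass
class AccountStats:
    """Summary features for one account (all fields illustrative)."""
    posts_per_day: float        # reposting rate
    followers: int
    friends: int                # accounts this account follows
    name_length: int
    message_variability: float  # 0 = identical messages, 1 = all distinct

def bot_score(a: AccountStats) -> float:
    """Combine several weak signals into a 0..1 score.

    Each test contributes one equal vote; the thresholds are
    made-up examples, not empirically calibrated values.
    """
    signals = [
        a.posts_per_day > 100,                             # superhuman posting rate
        a.friends > 0 and a.followers / a.friends < 0.1,   # skewed followers-to-friends ratio
        a.name_length > 15,                                # long, generated-looking name
        a.message_variability < 0.2,                       # near-duplicate messages
    ]
    return sum(signals) / len(signals)

suspect = AccountStats(posts_per_day=400, followers=12, friends=900,
                       name_length=18, message_variability=0.05)
print(bot_score(suspect))  # all four signals fire, so the score is 1.0
```

Real systems such as Botometer weigh far more features (over a thousand) and learn the weights from labelled data rather than hand-picking thresholds.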

Botometer[19] (formerly BotOrNot) is a public web service that checks the activity of a Twitter account and gives it a score based on how likely the account is to be a bot. The system leverages over a thousand features.[20][6] An active method that worked well for detecting early spam bots was to set up honeypot accounts that posted obviously nonsensical content, which was nevertheless blindly reposted (retweeted) by bots.[21] Another detection method is to analyse how quickly social-network metrics change: in particular, the number of friends or followers of a social bot grows very quickly, while its clustering coefficient stays very low. This is explained by the use of "friend farm" services to collect a large number of friends in a short period of time.[22]
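The growth-rate signal can be sketched in a few lines. The friend counts below are invented for illustration; the point is only that a "friend farm" account gains connections orders of magnitude faster than an organic one:

```python
def growth_rate(friend_counts, days):
    """Average number of new friends gained per day over the window."""
    return (friend_counts[-1] - friend_counts[0]) / days

# Friend counts observed on four consecutive days (illustrative data).
organic = [120, 124, 129, 133]
farmed  = [0, 800, 1900, 3100]

print(growth_rate(organic, 3))  # a few new friends per day
print(growth_rate(farmed, 3))   # over a thousand per day: a strong bot signal
```

A fuller implementation would track this longitudinally alongside the clustering coefficient of the account's neighbourhood, since organic accounts tend to accumulate friends who also know each other.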

References

  1. ^ a b "What is socialbot? - Definition from WhatIs.com". whatis.techtarget.com. 
  2. ^ https://www.nytimes.com/2014/11/20/fashion/social-media-bots-offer-phony-friends-and-real-profit.html
  3. ^ Bessi, A & Ferrara, E. (2016) Social Bots Distort the 2016 US Presidential election online discussion. First Monday 21(11), 2016
  4. ^ Shao, Chengcheng; Giovanni Luca Ciampaglia; Onur Varol; Kaicheng Yang; Alessandro Flammini; Filippo Menczer (2018). "The spread of low-credibility content by social bots". arXiv:1707.07592. 
  5. ^ Ratkiewicz, Jacob; Michael Conover; Mark Meiss; Bruno Gonçalves; Alessandro Flammini; Filippo Menczer (2011). "Detecting and Tracking Political Abuse in Social Media". Proc. 5th International AAAI Conf. on Web and Social Media (ICWSM). 
  6. ^ a b Varol, Onur; Emilio Ferrara; Clayton A. Davis; Filippo Menczer; Alessandro Flammini (2017). "Online Human-Bot Interactions: Detection, Estimation, and Characterization". Proc. International AAAI Conf. on Web and Social Media (ICWSM). 
  7. ^ Bessi, A & Ferrara, E. (2016) Social Bots Distort the 2016 US Presidential election online discussion. First Monday 21(11), 2016
  8. ^ Alessandro Bessi and Emilio Ferrara (2016-11-07). "Social bots distort the 2016 U.S. Presidential election online discussion". First Monday. 
  9. ^ a b Ferrara, Emilio; Varol, Onur; Davis, Clayton; Menczer, Filippo; Flammini, Alessandro (2016). "The Rise of Social Bots". Communications of the ACM. 59 (7): 96-104. doi:10.1145/2818717. 
  10. ^ China kills AI chatbots after they start praising US, criticising communists Yahoo! News August 5, 2017
  11. ^ Ferrara, Emilio (2017). "Measuring social spam and the effect of bots on information diffusion in social media". arXiv:1708.08134. 
  12. ^ "How robots could shape Germany's political future". The Local. 21 November 2016. "Social Bots" were the sinister cyber friend in the US elections who didn't actually exist. Could they also shape how Germans vote next year? 
  13. ^ "The rise of political bots on social media". Deutsche Welle. August 6, 2016. 
  14. ^ "How online 'chatbots' are already tricking you". BBC. 2014-06-09. Intelligent machines that can pass for humans have long been dreamed of, but as Chris Baraniuk argues, they're already among us. 
  15. ^ Lutz Finger (Feb 17, 2015). "Do Evil - The Business Of Social Media Bots". forbes.com. 
  16. ^ Romanov, Aleksei; Alexander Semenov; Oleksiy Mazhelis; Jari Veijalainen (2017). "Detection of Fake Profiles in Social Media - Literature Review". Proceedings of the 13th International Conference on Web Information Systems and Technologies. 
  17. ^ "How to Find and Remove Fake Followers from Twitter and Instagram : Social Media Examiner". 
  18. ^ "TwitterAudit". 
  19. ^ "Botometer". 
  20. ^ Davis, Clayton A.; Onur Varol; Emilio Ferrara; Alessandro Flammini; Filippo Menczer (2016). "BotOrNot: A System to Evaluate Social Bots". Proc. WWW Developers Day Workshop. doi:10.1145/2872518.2889302. 
  21. ^ "How to Spot a Social Bot on Twitter". technologyreview.com. 2014-07-28. Social bots are sending a significant amount of information through the Twittersphere. Now there's a tool to help identify them 
  22. ^ Romanov, Aleksei; Alexander Semenov; Jari Veijalainen (2017). "Revealing Fake Profiles in Social Networks by Longitudinal Data Analysis". Proceedings of the 13th International Conference on Web Information Systems and Technologies. 

  This article uses material from the Wikipedia page available here. It is released under the Creative Commons Attribution-Share-Alike License 3.0.
