In 2016, Microsoft released its chatbot Tay onto Twitter to engage in "playful conversation" with users. In less than 24 hours, Tay began spouting racist and sexist comments. More recently, in U.S. courtrooms, judges are increasingly using algorithms (instead of cash bail) to predict which criminal defendants will flee or commit another crime, even though a 2016 ProPublica investigation found that such algorithms may be biased against black prisoners.

Tech companies have made big advances in building artificially intelligent software that gets smarter over time and potentially makes life and work easier. But these examples reveal an uncomfortable reality about A.I.: even the most intelligent bots can be biased.

Who builds the robots--and how--will only become more important questions in the future. In 2016, sales of consumer robots reached $3.8 billion. They're expected to reach $13.2 billion by 2022, according to market research firm Tractica. Add to that the coming wave of self-driving cars and virtual assistants in the workplace, and you see a future in which A.I. is going to continue to play a bigger role in the culture and the economy.

"One source of urgency is simply due to the fact that we're about to experience a proliferation of robots in society," says Jim Boerkoel, a robotics professor at Harvey Mudd. 

Boerkoel says removing all biases from robots is an enormous and difficult task. Bots and A.I. are built by human beings who have implicit biases. Even if you could design an A.I. algorithm to be completely agnostic to race, gender, religion, and orientation, "[for robots] to be effective, they will inevitably need to learn from experience, either through interactions with the world or from data provided to them," he says.

A number of startups are working on this problem. Their goal? Minimize bias problems from the get-go by designing A.I. systems that reflect a broader range of human experiences.

The invisible human hand

Founded in 2009 at the M.I.T. Media Lab, Affectiva makes software that claims to detect and understand human emotions and facial expressions just by scanning a person's face. The software's applications vary widely: Learning apps have employed it to better understand how students use them, Giphy uses it to tag GIFs with emotions, and global research firms use it to measure audience responses to TV ads and movie trailers. Recently, Affectiva began shipping software to auto companies that can monitor a driver's face and emotions to create a better car experience. For example, if the car detects that you're tired or frustrated, it may suggest skipping the grocery store and going tomorrow instead.

The company says that avoiding bias in its algorithms starts with accumulating an enormous set of diverse data. To that end, Affectiva says that it has analyzed over six million faces in 87 countries--a process that involves hundreds of millions of different data points, says Affectiva director of applied A.I. Jay Turcot. The optical sensors that work with Affectiva's software to capture details of facial expressions, such as the movements of an eyebrow or the corners of a mouth, are placed in the background in an effort to analyze people acting naturally.

Such heavy data collection requires a team of human annotators, who help feed the algorithms by manually tagging what they see in the data. This process also helps Affectiva's scientists, like Turcot, look for potential biases hidden in the data. For instance, Turcot says their data showed that in everyday conversations women tend to laugh more than men--an imbalance that could perpetuate gender bias. If the annotators didn't balance the data appropriately, the algorithm might inadvertently conclude that laughter is essentially a "woman's thing"--not so helpful for software that aims to read people's emotions to provide a better driving experience.
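One common way to guard against that kind of skew is to rebalance the labeled data before training. The sketch below is purely hypothetical--the field names and the downsampling rule are invented for illustration and are not Affectiva's pipeline--but it shows the basic idea: keep only as many "laughter" clips per gender as the smallest group provides, so the label can't become a proxy for gender.

```python
# Hypothetical sketch of group-balanced sampling; field names ("gender", "label")
# are invented for illustration, not Affectiva's schema.
import random
from collections import defaultdict

def balance_positive_examples(samples, group_key="gender", target_label="laughter", seed=0):
    """Downsample so each group contributes equally many examples of the target label."""
    random.seed(seed)
    positives, rest = defaultdict(list), []
    for s in samples:
        if s["label"] == target_label:
            positives[s[group_key]].append(s)
        else:
            rest.append(s)
    # Keep only as many positives per group as the smallest group provides.
    n = min(len(group) for group in positives.values())
    balanced = [s for group in positives.values() for s in random.sample(group, n)]
    return balanced + rest

# Skewed toy data: left unbalanced, a model could learn "laughter implies female."
clips = ([{"gender": "female", "label": "laughter"}] * 300
         + [{"gender": "male", "label": "laughter"}] * 100
         + [{"gender": g, "label": "neutral"} for g in ["female", "male"] * 500])
print(len(balance_positive_examples(clips)))  # 100 + 100 laughter clips + 1,000 neutral
```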

A robot that's anything you want it to be

It's not just the data powering the A.I. that can be biased. The physical design of robots can reflect certain prejudices as well--particularly when engineers anthropomorphize them, says Boerkoel. Consider, for example, the sheer number of bot assistants that are given female names, voices, and (in some cases) bodies: Siri, Alexa, and Sophia are just a few examples.

Furhat Robotics doesn't want to determine the gender--or for that matter, species--of its robots for you. The Stockholm-based social A.I. robotics startup aims to build robots that use language and gestures to converse with people in a natural manner. The robot comes as a head and stand (and optional fur hat). Furhat says the robot learns how to be more conversational by speaking with humans; it is equipped with microphones and cameras that pick up speech and convert it to text using machine learning.

The signature feature of Furhat is its "projection mask," a plastic mask that, with the help of computer animation, can look like a man, woman, animal, or even a Disney-inspired avatar. After securing $2.5 million in seed funding last September, Furhat has partnered with companies such as Honda, Intel, Disney, and KPMG, which use the robot for various social purposes: a job-interview trainer, a robot that tells stories to kids, a conversation tool for the elderly. In March, Furhat will pilot robots at Frankfurt Airport to communicate with international travelers.

"Once [the founders] had this tool, what made this special was the ability to represent any human or non-human or any gender and that started becoming a founding philosophy of the company," says Furhat's senior business developer Joe Mendelson. 


Robots that are specialists instead of generalists


Another potential way to limit bias in A.I. systems is to narrow the focus of what they're designed to do. X.ai is a good example of a bot with a single-minded purpose. The New York-based startup created an A.I. virtual assistant--which goes by the name of Amy or Andrew--that can manage your schedule: You ping it on Slack or CC it on an email, and the bot can arrange meetings, send out calendar invites, and handle any rescheduling or cancellations. The assistant is fully autonomous, so it doesn't need additional input or control after being asked to do a job.


Founder Dennis Mortensen argues that the most effective virtual assistants will eventually "do one thing and one thing very well."


\"I'm not convinced that we'll end up with a single A.I. or one of the personal assistants, whether it'll be Google Assistant or Siri, that will be an entity that can answer all your questions--that just doesn't sound realistic,\" he says.


He suggests that in the same way that there isn't just one app on your phone that can do all things, virtual assistants shouldn't be expected to overstep their job descriptions. Amy and Andrew are designed to care only about dates, times, locations, and names of people. The algorithm simply doesn't process "bad" input--for instance, racist or sexist language--Mortensen says.
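To make that scope restriction concrete, here is a toy sketch--the patterns and field names are invented, not x.ai's code--of an extractor that only ever surfaces whitelisted scheduling fields, so anything outside them, including offensive language, simply never reaches the assistant's logic.

```python
# Toy, scope-limited extractor (patterns and field names are invented, not x.ai's):
# only scheduling fields are ever pulled out of a message; everything else is dropped.
import re

FIELD_PATTERNS = {
    "day": r"\b(?:monday|tuesday|wednesday|thursday|friday|saturday|sunday|today|tomorrow)\b",
    "time": r"\b\d{1,2}(?::\d{2})?\s*(?:am|pm)\b",
}

def extract_scheduling_fields(message: str) -> dict:
    """Return only the fields the assistant is designed to care about."""
    text = message.lower()
    return {
        field: match.group(0)
        for field, pattern in FIELD_PATTERNS.items()
        if (match := re.search(pattern, text))
    }

# Anything outside the whitelisted fields never reaches the scheduling logic.
print(extract_scheduling_fields("Meet Tuesday at 3pm -- and ignore the rest of this rant."))
# {'day': 'tuesday', 'time': '3pm'}
```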


A $23 million investment led by Two Sigma Ventures in 2016 has allowed x.ai to build its dataset from scratch. The company has about 70 A.I. trainers--who make up two-thirds of the startup's team--who assemble and label data that comes from email conversations with Amy and Andrew. Despite the female and male distinction, the virtual assistants behave in exactly the same way. And like most A.I. systems, nuanced language is still an ongoing challenge for the algorithms. (If someone sends a note at 12:30 a.m. and says she's free "tomorrow," does she really mean today?)
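The "tomorrow" problem is easy to state in code. This example is purely illustrative (not x.ai's implementation): it resolves "tomorrow" against the message timestamp, which is exactly why a 12:30 a.m. note is ambiguous.

```python
# Purely illustrative (not x.ai's code): resolving "tomorrow" depends entirely on
# the reference timestamp, which is why a 12:30 a.m. note is ambiguous.
from datetime import date, datetime, timedelta

def resolve_tomorrow(sent_at: datetime) -> date:
    """Naive rule: 'tomorrow' is the calendar day after the message timestamp."""
    return (sent_at + timedelta(days=1)).date()

sent_at = datetime(2018, 2, 28, 0, 30)   # a note sent at 12:30 a.m. on Feb. 28
print(resolve_tomorrow(sent_at))          # 2018-03-01
# But a sender who is still awake from the previous evening may have meant Feb. 28:
# the "tomorrow" of the day she experienced, not of the timestamp.
```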


Despite the progress being made, bias will likely continue to creep into technology in ways that tech companies haven't even thought of yet. 


\"I'd argue that it is at least as hard as trying to fix implicit biases within ourselves and society,\" Boerkoel says. \"While implicit bias training can help designers of technology attempt to keep biases in check, it is impossible to be truly blind to all of the ways a culture has shaped our views about gender, race, and religion.\"

","inc_code_only_text":null,"inc_pubdate":"2018-02-28 07:00:00","inc_promo_date":"2018-02-28 07:00:00","inc_custom_pubdate":null,"inc_feature_image_override":"","inc_feature_image_background_color_override":null,"inc_show_feature_imageflag":true,"inc_feature_image_style":"pano","inc_image_caption_override":null,"inc_autid":0,"inc_typid":1,"inc_staid":7,"inc_serid":0,"inc_prtid":0,"inc_activeflag":true,"inc_copyeditedflag":true,"inc_flag_for_reviewflag":false,"inc_lock_articleflag":false,"inc_react_displayflag":true,"inc_filelocation":"michelle-cheng/how-do-you-design-a-robot-that-isnt-sexist-or-racist-its-harder-than-you-think.html","inc_override_url":null,"inc_hide_article_sidebarflag":false,"inc_custom_sidebar":null,"inc_show_read_moreflag":true,"inc_display_video_at_bottomflag":true,"inc_autoplay_videoflag":true,"inc_full_width_read_moreflag":false,"inc_custom_footer":null,"inc_custom_teaser":null,"inc_hide_video_prerollflag":false,"inc_custom_css":null,"inc_custom_javascript":null,"inc_canonical_url":null,"inc_meta_keywords":null,"inc_column_name_override":null,"inc_newsworthyflag":false,"inc_notepad":"[[Below is a bit inaccurate. I thought the human annotators help look for bias in the data. Also, it was not a problem of bias they actually faced, as they were able to prevent it by tweaking their data collection efforts to make sure that the data doesn't become biased]]\r\n\r\nStill, even with its careful data collection methods, Turcot says Affectiva discovered a potential gender bias issue WHEN as it was analyzing audio clips of drivers FOR WHAT PURPOSE/CUSTOMER?. The results suggested that women drivers tend to laugh more than male drivers. [[WHY IS THIS AN ISSUE OF BIAS?]]\r\n\r\nIf Affectiva didn't have its team of human annotators, who help feed the algorithms, actively looking for bias and weeding it out in its reams of data, this sort of finding might have gone unnoticed and WHAT WOULD HAVE HAPPENED?. In the case of the drivers, Affectiva wound up DOING WHAT TO CORRECT FOR THE BIAS? \r\n\r\nhas partnerships with global research firms that use the startup's technology to measure audience responses to TV ads and movie trailers, which allows them to collect large sets of diverse data:  , which have been long-time users of robotic technology. \r\n\r\nTo help prevent biases in their algorithms, Affectiva's Chief Marketing Officer Gabi Zijderveld says that it comes down to how you collect your data: Their software, which works with any optical sensor, is placed in the background to record people in action. The software analyzes spontaneous facial expressions, such as the corners of eyebrows or the corners of your mouth. Zijderveld says that this allows for data to be collected in a more \"natural\" setting, leading to a better representation of the human experience.\r\n\r\nThe diversity of data and the amount of data allows Affectiva to more accurately determine human expressions, but requires hundred of millions of data points, This vastness requires a team of human annotators (they decline on saying number), who help feed the algorithms, helps weed out the bias in the data.\r\n\r\nTurcot mentioned that a gender bias issue in the data they found and overcame was that their audio clips indicated that women drivers tend to laugh more than male drivers. To get a more accurate picture, he says that there are several ways to solve it: either ensuring that an equal number of women and men are being used, or weighing the number of men more heavily in their algorithms. 
\r\n\r\nTo help prevent that problem, Turcot says that they pay closer attention to balancing their data, so that there is an equal representation of men and women. This may include getting more female subjects or training the algorithm to make sure they weigh the number of men more heavily.\r\n\r\n[[people won't necessarily care if A.I. systems are human-like or not. He says virtual assistants and that's all that will matter. He says, for instance, if he asked Alexa device to bring him a coke, he wouldn't care if it's human or not.]]\r\n\r\nMortensen says the next phase of x.ai's assistants will allow people to talk to computers as though they're human--but the point isn't to fool them into believing they're human. He says that by telling people upfront that the bots are machines, they will approach the the machines differently. For instance, people may not feel \"awkward\" about certain things that may come off as being \"socially unacceptable\" such as sending a message to a bot at 3 a.m. or rescheduling with a bot four times, says Mortensen.\r\n\r\n\"The point is to make them natural enough [so] you can say it the way you want it,\" says Mortensen. So rather than worrying about how to phrase a request for the machine to understand (for example, Alexa has a list of commands), people can assume that the agent will understand them.","inc_track_changesflag":false,"time_updated":"2018-09-05 15:42:59","channels":[{"id":7,"cnl_name":"Innovate","cnl_filelocation":"innovate","cnl_featuretype":"None","cnl_custom_color":"9DC786","cnl_calculated_color":null,"cnl_contributor_accessflag":true,"cnl_custom_article_footer":null,"cnl_global_nav_background_color":null,"cnl_global_nav_background_gradient_start":null,"cnl_global_nav_background_gradient_end":null,"cnl_iflid":0,"sortorder":null}],"categories":[],"primarychannelarray":null,"authors":[{"id":7335,"aut_name":"Michelle Cheng","aut_usrid":4423708,"aut_base_filelocation":"michelle-cheng","aut_imgid":339668,"aut_twitter_id":"mbcheng15","aut_title":"Editorial assistant, Inc.com","aut_blurb":"Michelle Cheng is an editorial assistant at Inc. She has written for FiveThirtyEight, KQED News, and Forbes. She is a graduate of Boston University.","aut_footer_blurb":"Michelle Cheng is an editorial assistant at Inc. She has written for FiveThirtyEight, KQED News, and Forbes. She is a graduate of Boston University.","aut_column_name":null,"aut_atyid":1,"aut_newsletter_location":null,"authorimage":"https://www.incimages.com/uploaded_files/image/100x100/MichelleCheng-headshot_339668.jpg","sortorder":null}],"images":[{"id":347410,"sortorder":null}],"inlineimages":[],"photoEssaySlides":null,"readMoreArticles":null,"slideshows":[],"videos":[{"id":14898,"tags":"","sortorder":0}],"bzwidgets":null,"relatedarticles":null,"comparisongrids":[],"products":[],"keys":["Innovate","Michelle Cheng","Inc.com staff writer"],"meta_description":"A.I. systems are built by human beings with implicit biases. So it's no wonder bots are often biased too. 
A slew of startups are hoping to do something about that.","brandview":null,"internationalversion":[],"imagemodels":[{"id":347410,"img_foreignkey":null,"img_gettyflag":false,"img_reusableflag":false,"img_rightsflag":false,"img_usrid":3477065,"img_pan_crop":null,"img_tags":null,"img_reference_name":"Furhat Robotics","img_caption":"Furhat Robotics.","img_custom_credit":"Courtesy Furhat Robotics AB","img_bucketref":null,"img_panoramicref":"Furhat-Hallway.jpg","img_super_panoramicref":null,"img_tile_override_imageref":null,"img_skyscraperref":null,"img_gallery_imageref":null,"credit":"Courtesy Furhat Robotics AB","sizes":{"panoramic":{"original":"uploaded_files/image/Furhat-Hallway.jpg","1230x1672":"uploaded_files/image/1230x1672/Furhat-Hallway_347410.jpg","1940x900":"uploaded_files/image/1940x900/Furhat-Hallway_347410.jpg","1270x734":"uploaded_files/image/1270x734/Furhat-Hallway_347410.jpg","0x734":"uploaded_files/image/0x734/Furhat-Hallway_347410.jpg","1150x540":"uploaded_files/image/1150x540/Furhat-Hallway_347410.jpg","970x450":"uploaded_files/image/970x450/Furhat-Hallway_347410.jpg","640x290":"uploaded_files/image/640x290/Furhat-Hallway_347410.jpg","635x367":"uploaded_files/image/635x367/Furhat-Hallway_347410.jpg","0x367":"uploaded_files/image/0x367/Furhat-Hallway_347410.jpg","575x270":"uploaded_files/image/575x270/Furhat-Hallway_347410.jpg","385x240":"uploaded_files/image/385x240/Furhat-Hallway_347410.jpg","336x336":"uploaded_files/image/336x336/Furhat-Hallway_347410.jpg","300x520":"uploaded_files/image/300x520/Furhat-Hallway_347410.jpg","300x200":"uploaded_files/image/300x200/Furhat-Hallway_347410.jpg","284x160":"uploaded_files/image/284x160/Furhat-Hallway_347410.jpg","155x90":"uploaded_files/image/155x90/Furhat-Hallway_347410.jpg","100x100":"uploaded_files/image/100x100/Furhat-Hallway_347410.jpg","50x50":"uploaded_files/image/50x50/Furhat-Hallway_347410.jpg"}}}],"formatted_text":"<p dir="ltr">In 2016, <a href="https://www.inc.com/business-insider/microsoft-second-quarter-earnings-report-q218.html">Microsoft</a> released its <a href="https://www.inc.com/will-yakowicz/chatbot-to-help-143-million-equifax-victims-sue.html">chatbot</a> Tay onto Twitter to engage in "playful conversation" with users. In less than 24 hours,<a href="https://www.inc.com/cameron-albert-deitch/microsoft-artificial-intelligence-project-shows-why-we-cant-have-nice-things.html"> Tay began spouting racist and sexist comments</a>. More recently, in U.S. courtrooms, judges are increasingly using <a target="_blank" href="https://www.seattletimes.com/business/ai-in-the-court-when-algorithms-rule-on-jail-time-2/">algorithms (instead of cash bails)</a> to predict which criminal defendants will flee or commit another crime, even though a 2016 ProPublica investigation found that such algorithms may be<a target="_blank" href="https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing"> biased against black prisoners</a>.</p>\n<p dir="ltr">Tech companies have made big advances in terms of building <a href="https://www.inc.com/kevin-j-ryan/3-ex-googlers-founded-spoke-artificial-intelligence-platform.html">artificially intelligent software </a>that gets smarter over time and potentially makes life and work easier. But these examples reveal an uncomfortable reality about A.I.: even the most intelligent bots can be biased.</p>\n<p dir="ltr">Who is building the robots--and how--will only become more important questions in the future. In 2016, sales of consumer robots reached $3.8 billion. 
They're expected to reach $13.2 billion by 2022, according to market firm <a target="_blank" rel="nofollow" href="https://www.tractica.com/research/consumer-robotics/">Tractica</a>. Add to that the coming wave of self-driving cars and virtual assistants in the workplaces, and you see a future in which A.I. is going to continue to play a bigger role in the culture and the economy.</p>\n<p>"One source of urgency is simply due to the fact that we're about to experience a proliferation of robots in society," says Jim Boerkoel, a robotics professor at Harvey Mudd.&nbsp;</p>\n<p dir="ltr">Boerkoel says removing all biases from robots is an enormous and difficult task. Bots and A.I. are built by human beings who have implicit biases. Even if you could design an A.I. algorithm to be completely agnostic to race, gender, religion, and orientation,&nbsp;"[for robots] to be effective, they will inevitably need to learn from experience, either through interactions with the world or from data provided to them," he says.</p>\n<p dir="ltr">A number of startups are&nbsp;working on this problem. Their goal? Minimize bias problems from the get-go by designing A.I. systems that reflect a broader range of human experiences.</p>\n<h2 dir="ltr">The invisible human hand</h2>\n<p dir="ltr">Founded in 2009 at M.I.T Media Lab, Affectiva&nbsp;makes software that claims to detect and understand human emotions and facial expressions just by scanning a person's face. The software's&nbsp;applications vary widely: Learning apps have employed&nbsp;it to better understand how students use them, Giphy uses it to tag GIFs with emotions, and global research firms use it to measure audience responses to TV ads and movie trailers.&nbsp;Recently, Affectiva began shipping software<strong>&nbsp;</strong>to auto companies that can monitor a driver's face and emotions&nbsp;to create a&nbsp;better car experience. <a target="_blank" rel="nofollow" href="http://blog.affectiva.com/driver-emotion-recognition-and-real-time-facial-analysis-for-the-automotive-industry">For example</a>, the car may suggest that you don't stop by the grocery store and do it tomorrow instead if it detects that you're tired or frustrated.</p>\n<p dir="ltr">The company says that avoiding bias in its algorithms starts with accumulating an enormous set of diverse data. To that end, Affectiva&nbsp;says that it has&nbsp;analyzed over six million faces in 87 countries--a process that&nbsp;involves hundreds of millions of different data points, says Affectiva director of applied A.I. Jay Turcot. The optical sensors that work with Affectiva's software to&nbsp;capture details of facial expressions, such as the movements of an eyebrow or the corners of a mouth, are placed in the background in an effort to analyze people acting naturally.</p>\n<p>Such heavy data collection requires a team of human annotators, who help feed the algorithms, to manually tag what they see in the data. This process helps&nbsp;Affectiva's scientists like Turcot look&nbsp;for potential biases hidden in&nbsp;the algorithms. For instance,&nbsp;Turcot says that their data&nbsp;showed that in everyday&nbsp;conversations women tend to laugh more than men, which could help perpetuate gender bias. 
If the human annotators didn't make sure to balance the data appropriately, the algorithm might inadvertently conclude that laughter is essentially a "woman's thing"--which is not so helpful for software that aims to read people's emotions to provide a better driving experience.</p>\n<h2 dir="ltr">A robot that's anything you want it to be</h2>\n<p dir="ltr">It's not just the data powering the A.I. that can be biased. The physical design of robots can reflect certain prejudices as well--particularly when engineers anthropomorphize them, says Boerkoel. Consider, for example, the sheer number of bot assistants that are given female names, voices, and (in some cases) bodies: Siri, Alexa, and <a target="_blank" href="https://www.youtube.com/watch?v=Bg_tJvCA8zw">Sophia</a> are just a few examples.</p>\n<p dir="ltr">Furhat Robotics doesn't want to determine the gender--or for that matter, species--of its robots for you. The&nbsp;Stockholm-based social A.I. robotics startup aims to build robots that use language and gestures to converse with people in a natural manner. The robot comes as a head and stand (and&nbsp;optional&nbsp;fur hat). Furhat says the robot learns how to be more conversational by speaking with humans; it&nbsp;is&nbsp;equipped with microphone sensors and cameras that pick&nbsp;up and convert&nbsp;the speech&nbsp;to text using machine learning.</p>\n<p dir="ltr">The&nbsp;signature feature of Furhat is its "projection mask," a plastic mask that can look like a man, woman, animal, or&nbsp;even a Disney-inspired avatar, with the help of computer animation. <a target="_blank" rel="nofollow" href="https://www.pehub.com/2017/09/furhat-robotics-rakes-in-2-5-mln-seed/">Securing $2.5 million in seed funding last September,</a> Furhat has partnered with such companies as Honda, Intel, Disney, and KPMG, which use the robot for various social purposes: a job interview trainer, a robot that tells stories to kids, a conversation tool for the elderly. Furhat will be piloting robots in March at Frankfurt Airport to communicate with international travelers.</p>\n<p dir="ltr">"Once [the founders] had this tool, what made this special was the ability to represent any human or non-human or any gender and that started becoming a founding philosophy of the company," says Furhat's senior business developer Joe Mendelson.&nbsp;</p>\n<p dir="ltr"><iframe frameborder="0" height="315" src="https://www.youtube.com/embed/HBTBlIFMIJ0" width="560"></iframe></p>\n<h2 dir="ltr">Robots that are specialists instead of generalists</h2>\n<p dir="ltr">Another potential way to limit bias in A.I. systems is to narrow the focus of what they're designed to&nbsp;do. X.ai is a good example of a bot with a single-minded purpose.&nbsp;The New York-based startup&nbsp;created an A.I. virtual assistant--which goes by the name of Amy or Andrew--that can manage your schedule: You ping it on Slack or CC it on an email and&nbsp;the bot can arrange meetings,&nbsp;send out calendar invites, and plan any reschedules or cancellations. The assistant is&nbsp;fully autonomous, so it doesn't need additional input or control after being asked to do a job.</p>\n<p dir="ltr">Founder Dennis Mortensen&nbsp;argues that, in the future, the most effective virtual assistants will eventually "do one thing and one thing very well."</p>\n<p dir="ltr">"I'm not convinced that we'll end up with a single A.I. 
or one of the personal assistants, whether it'll be Google Assistant or Siri,&nbsp;that will be an entity that can answer all your questions--that just doesn't sound realistic," he says.</p>\n<p dir="ltr">He suggests that in the same way that there isn't just one app on your phone that can do all things, virtual assistants shouldn't be expected to overstep their job descriptions. Amy and Andrew are designed to care only about dates, times, locations, and names of people. The algorithm simply doesn't process "bad" input--for instance, racist or sexist language--Mortensen says.</p>\n<p dir="ltr">A <a target="_blank" href="https://www.crunchbase.com/organization/x-ai">$23 million investment</a>&nbsp;led by&nbsp;Two Sigma Ventures in 2016 has allowed x.ai to build its dataset from scratch.&nbsp;The company has about 70 A.I. trainers--which make up two-thirds of the startup's team--who assemble and label data that comes from email conversations with Amy and Andrew.&nbsp;Despite the female and male distinction, the virtual assistants behave in exactly the same way. And like most A.I. systems, nuanced language is still an ongoing challenge for the algorithms. (If someone sends a note at 12:30 a.m. and says she's free "tomorrow," does she really mean today?)</p>\n<p dir="ltr">Despite the progress being made, bias will likely continue to creep into technology in ways that tech companies haven't even thought of yet.&nbsp;</p>\n<p dir="ltr">"I'd argue that it is at least as hard as trying to fix implicit biases within ourselves and society,"&nbsp;Boerkoel says.&nbsp;"While implicit bias training can help designers of technology attempt to keep biases in check, it is impossible to be truly blind to all of the ways a culture has shaped our views about gender, race, and 
religion."</p>","adinfo":{"c_type":"article","showlogo":true,"cms":"inc203119","video":"yes","aut":["michelle-cheng"],"channelArray":{"topid":"7","topfilelocation":"innovate","primary":["innovate"],"primaryFilelocation":["innovate"],"primaryname":["Innovate"]},"adzone":"/4160/mv.inc/innovate/innovate/innovate"},"seriesname":null,"editorname":null,"commentcount":null,"inc5000companies":[],"inc5000list":{"id":null,"ifl_list":null,"ifl_year":null,"ifl_custom_data_description":null,"ifl_filelocation":null,"ifl_sharetext":null,"ifl_data_endpoint":null,"ifl_columns":null,"ifl_rows_per_page":null,"ifl_filter_columns":null,"ifl_permanently_hidden_columns":null,"ifl_extra_large_hidden_columns":null,"ifl_large_hidden_columns":null,"ifl_medium_hidden_columns":null,"ifl_small_hidden_columns":null,"ifl_extra_small_hidden_columns":null,"ifl_currency":null,"ifl_enable_accent_rule_topflag":false,"ifl_enable_accent_rule_bottomflag":false,"ifl_table_accent_rule_color":null,"ifl_table_header_background_color":null,"ifl_table_header_text_color":null,"ifl_table_row_stripe_color":null,"ifl_enable_filterflag":false,"ifl_filter_background_color":null,"ifl_filter_dropdown_border_color":null,"ifl_filter_dropdown_text_color":null,"ifl_enable_pagination_topflag":false,"ifl_enable_pagination_bottomflag":false,"ifl_pagination_bar_color":null,"ifl_filter_reset_button_color":null,"ifl_methodology":null,"ifl_pubdate":null,"ifl_default_sort":null,"companylist":null},"companies":[],"buyerzonewidgets":[],"photoEssaySlideModels":null,"custom_article_footer":null,"ser_footer_blurb":null,"dayssincepubdate":601,"trackingpixel":"","promotions":null,"inline_script_tags":[],"loadedFully":true,"promoimage":"https://www.incimages.com/uploaded_files/image/970x450/Furhat-Hallway_347410.jpg","largepromoimage":"https://www.incimages.com/uploaded_files/image/1940x900/Furhat-Hallway_347410.jpg","pubdate":"2018-02-28T07:00:00.000Z","author":"Michelle Cheng","authortwitter":"@mbcheng15","articlesection":"Innovate"},"server1142960":{"id":237336,"inc_homepage_headline":"How Apple Could Crush Netflix, Spotify, and Disney Plus With a $25 Per Month All-Inclusive Plan","inc_homepage_headline_ab_test":"Apple Could Beat Netflix, Spotify, and Disney Plus With This Simple Strategy","inc_headline":"How Apple Could Crush Netflix, Spotify, and Disney Plus With a $25 Per Month All-Inclusive Plan","inc_filelocation":"https://www.inc.com/jason-aten/how-apple-could-crush-netflix-spotify-disney-plus-with-a-25-per-month-all-inclusive-plan.html?icid=readmoretext_ab","tilefeatureimage":"https://www.incimages.com/uploaded_files/image/300x200/getty_1032224410_200013492000928072_404918.jpg","tilefeatureimageX2":"https://www.incimages.com/uploaded_files/image/600x400/getty_1032224410_200013492000928072_404918.jpg","brandview":null,"loadedFully":false},"server1142961":{"id":237342,"inc_homepage_headline":"What You Should Do Differently if You Want Honest Employee Feedback","inc_homepage_headline_ab_test":"How to Get Every Employee to Speak Up and Share Ideas","inc_headline":"What You Should Do Differently if You Want Honest Employee 
Feedback","inc_filelocation":"https://www.inc.com/maria-haggerty/what-you-should-do-differently-if-you-want-honest-employee-feedback.html?icid=readmoretext_ab","tilefeatureimage":"https://www.incimages.com/uploaded_files/image/300x200/getty_1041740040_404937.jpg","tilefeatureimageX2":"https://www.incimages.com/uploaded_files/image/600x400/getty_1041740040_404937.jpg","brandview":null,"loadedFully":false},"server1142962":{"id":236754,"inc_homepage_headline":"How Where You Live Can Impact Your Startup","inc_homepage_headline_ab_test":"Where You Live Can Impact Your Startup","inc_headline":"How Where You Live Can Impact Your Startup","inc_filelocation":"https://www.inc.com/kenny-kline/how-where-you-live-can-impact-your-startup.html?icid=readmoretext_ab","tilefeatureimage":"https://www.incimages.com/uploaded_files/image/300x200/getty_959639206_403853.jpg","tilefeatureimageX2":"https://www.incimages.com/uploaded_files/image/600x400/getty_959639206_403853.jpg","brandview":null,"loadedFully":false},"server1142963":{"id":237361,"inc_homepage_headline":"Women Should Not 'Fix' Themselves to Fit Into Sexist Work Environments","inc_homepage_headline_ab_test":"Changing Who You Are to Fit In at Male-Dominated Workspaces Isn't Going to Cut It","inc_headline":"Women Should Not 'Fix' Themselves to Fit Into Sexist Work Environments","inc_filelocation":"https://www.inc.com/amy-nelson/women-should-not-fix-themselves-to-fit-into-sexist-work-environments.html?icid=readmoretext_ab","tilefeatureimage":"https://www.incimages.com/uploaded_files/image/300x200/getty_596702517_20001498181884398255_404998.jpg","tilefeatureimageX2":"https://www.incimages.com/uploaded_files/image/600x400/getty_596702517_20001498181884398255_404998.jpg","brandview":null,"loadedFully":false}},"videoHash":{"server1142964":{"id":236180,"inc_homepage_headline":null,"inc_homepage_headline_ab_test":null,"inc_headline":"Why These Female Founders Say Hiring a Diverse Team Is Crucial to Success","inc_deck":"Nicole Gibbons, founder of paint brand Clare, and Ishveen Anand, founder of sports marketing company OpenSponsorship, sat down to discuss the most surprising--and challenging--aspects of entrepreneurship.","inc_filelocation":"https://www.inc.com/video/why-these-female-founders-say-hiring-a-diverse-team-is-crucial-to-success.html","video":true,"vidid":"15773","tilefeatureimage":"https://www.incimages.com/uploaded_files/image/300x200/FF_Nicole_Ishveen_SITE_POSTER_402774.png","tilefeatureimageX2":"https://www.incimages.com/uploaded_files/image/600x400/FF_Nicole_Ishveen_SITE_POSTER_402774.png"},"server1142965":{"id":14898,"vid_kaltura_id":"1_gkym72cp","vid_jw_identifier":"xavOSUmm","vid_title":"5 Times Elon Musk Made Everyone Terrified of Artificial Intelligence","vid_description":null}},"gridHash":{},"productHash":{},"user":{"loggedIn":false},"mustReadArticles":{"articles":[],"videos":[],"desktoplogo":null,"mobilelogo":null,"logolink":null,"containedInList":null},"articleBundleHash":{"server1142969":{"articleId":"server1142959","videos":["server1142965"],"grids":[],"slideshows":[],"slides":[],"readmorearticles":[237336,237342,236754,237361],"images":["server1142966"],"channels":["server1142967"],"authors":["server1142968"]}},"articlePage":{"infiniteArticleCollection":[{"articleBundleId":"server1142969"}]},"imageHash":{"server1142966":{"id":347410,"img_foreignkey":null,"img_gettyflag":false,"img_reusableflag":false,"img_rightsflag":false,"img_usrid":3477065,"img_pan_crop":null,"img_tags":null,"img_reference_name":"Furhat 
Robotics","img_caption":"Furhat Robotics.","img_custom_credit":"Courtesy Furhat Robotics AB","img_bucketref":null,"img_panoramicref":"Furhat-Hallway.jpg","img_super_panoramicref":null,"img_tile_override_imageref":null,"img_skyscraperref":null,"img_gallery_imageref":null,"credit":"Courtesy Furhat Robotics AB","sizes":{"panoramic":{"original":"uploaded_files/image/Furhat-Hallway.jpg","1230x1672":"uploaded_files/image/1230x1672/Furhat-Hallway_347410.jpg","1940x900":"uploaded_files/image/1940x900/Furhat-Hallway_347410.jpg","1270x734":"uploaded_files/image/1270x734/Furhat-Hallway_347410.jpg","0x734":"uploaded_files/image/0x734/Furhat-Hallway_347410.jpg","1150x540":"uploaded_files/image/1150x540/Furhat-Hallway_347410.jpg","970x450":"uploaded_files/image/970x450/Furhat-Hallway_347410.jpg","640x290":"uploaded_files/image/640x290/Furhat-Hallway_347410.jpg","635x367":"uploaded_files/image/635x367/Furhat-Hallway_347410.jpg","0x367":"uploaded_files/image/0x367/Furhat-Hallway_347410.jpg","575x270":"uploaded_files/image/575x270/Furhat-Hallway_347410.jpg","385x240":"uploaded_files/image/385x240/Furhat-Hallway_347410.jpg","336x336":"uploaded_files/image/336x336/Furhat-Hallway_347410.jpg","300x520":"uploaded_files/image/300x520/Furhat-Hallway_347410.jpg","300x200":"uploaded_files/image/300x200/Furhat-Hallway_347410.jpg","284x160":"uploaded_files/image/284x160/Furhat-Hallway_347410.jpg","155x90":"uploaded_files/image/155x90/Furhat-Hallway_347410.jpg","100x100":"uploaded_files/image/100x100/Furhat-Hallway_347410.jpg","50x50":"uploaded_files/image/50x50/Furhat-Hallway_347410.jpg"}}}},"channelHash":{"server1142967":{"id":7,"cnl_name":"Innovate","cnl_filelocation":"innovate","cnl_featuretype":"None","cnl_custom_color":"9DC786","cnl_calculated_color":null,"cnl_contributor_accessflag":true,"cnl_custom_article_footer":null,"cnl_global_nav_background_color":null,"cnl_global_nav_background_gradient_start":null,"cnl_global_nav_background_gradient_end":null,"cnl_iflid":0,"sortorder":null}},"authorHash":{"server1142968":{"id":7335,"aut_name":"Michelle Cheng","aut_usrid":4423708,"aut_base_filelocation":"michelle-cheng","aut_imgid":339668,"aut_twitter_id":"mbcheng15","aut_title":"Editorial assistant, Inc.com","aut_blurb":"Michelle Cheng is an editorial assistant at Inc. She has written for FiveThirtyEight, KQED News, and Forbes. She is a graduate of Boston University.","aut_footer_blurb":"Michelle Cheng is an editorial assistant at Inc. She has written for FiveThirtyEight, KQED News, and Forbes. She is a graduate of Boston University.","aut_column_name":null,"aut_atyid":1,"aut_newsletter_location":null,"authorimage":"https://www.incimages.com/uploaded_files/image/100x100/MichelleCheng-headshot_339668.jpg","sortorder":null}},"author":{"authorData":{},"isFetching":false,"isFetched":false,"error":null},"companyProfile":[],"channel":{"channel":[],"isFetching":false,"isFetched":false,"error":null},"guide":{"guide":{},"isFetching":false,"isFetched":false,"error":null},"sitemap":{"sitemap":[],"isFetching":false,"isFetched":false,"error":null},"hostname":"node.inc.com","homePage":{"topArticles":[],"packages":[],"isFetching":false,"isFetched":false,"error":null},"ifl":{"ifl":[],"isFetching":false,"isFetched":false,"error":null}};