Executives today are held responsible not only for business performance, but also for their company's impact on society and the greater good. This is especially true in the artificial intelligence (AI) space -- and rightfully so. While AI shows great promise, serious concerns remain about its responsible use.

I'm used to engaging in conversations about AI ethics with peers and people in my network. But a recent visit to my son Adam's classroom got me thinking that we need to expand the conversation to a somewhat unexpected group: kids.

Kids today view emerging technologies differently than their parents do -- after all, they've grown up surrounded by AI, and that shapes their perspective on how it should fit into our lives. We need to pay attention to what these kids have to say, because they represent the next generation of consumers and technologists who will build AI and live with it.

I wanted to dig into this further, so at my company Affectiva's recent Emotion AI Summit, we held a panel to discuss the future of AI with three students: Adam (age 10), Tessel (age 13) and Kavya (age 17). 

Here are three lessons that tech leaders can learn from kids: 

1. We need to do better when it comes to privacy, transparency, and education. 

One resounding takeaway I heard from these kids was concern about data privacy and the need for transparency. On the panel, Tessel cited a great example: click-through agreements. We've all seen these -- when you download an app or a new upgrade on your phone, you're asked to read and agree to terms and conditions that are often excessively long and dense with legalese.

Tessel explained that, when we click through to accept an upgrade on our phones, it's not always clear what we're agreeing to. We're agreeing to it legally, but are we really okay with it? 

This is precisely the problem. How can people be comfortable with a new technology like AI, and consent to using it, without a clear understanding of what the technology is doing -- whether it's collecting our data, whether that data is being stored, and how it will be used?

If kids can make these observations, then we as an industry are long overdue in addressing them. We need to take a long, hard look at our efforts to educate users, and push ourselves to honestly examine whether consent is really informed consent. Otherwise, the public will never trust AI -- and as a consumer, I can't blame them.

2. Power needs to be distributed.

I often say that technology itself is neutral: depending on how it's used, it can be a force for good or for bad. Kids are acutely aware of this. On the panel, Adam expressed a concern that AI will give too much power to governments or tech giants, and that in the wrong hands, it can be abused.

This issue of power asymmetry demands our attention. Powerful technologies are often in the hands of large corporations or governments, but the people who use these technologies and share their data don't derive the same value, and often don't have equal access. This has dire implications for social and economic mobility: people with access to AI will be able to work more efficiently and will have a leg up on those without it. I worry about the impact on communities and populations that are already impoverished, as AI could continue to widen that gap.

We need to create guidelines that ensure AI is applied equitably and does not deepen inequality. On the flip side, Adam was also optimistic that, if we can apply AI equitably, it has the potential to help solve some of society's biggest problems, like hunger or access to education. I truly believe this is the case, but if we don't start thinking about power distribution now, we risk institutionalizing AI in ways that exacerbate existing inequalities.

3. Diversity needs to be a priority.

It's no secret that diversity (or lack thereof) continues to be a major issue for the tech industry. As a woman and a computer scientist, I worry that young women will be discouraged from pursuing technical careers for this reason. 

But I'm encouraged by stories of young women working to change that imbalance, even before they start their careers. On the panel, Kavya spoke about the need to get girls involved in STEM from a younger age. She cited research showing that, even as tech jobs grow in number, the percentage of women in those roles is increasing at a troublingly slow pace. To help combat this, Kavya started a Girls Who Code club at her high school -- which is amazing.

The tech industry needs to do better and prioritize diversity of all kinds -- gender, ethnicity, background, education and age. It's not just the right thing to do; it's also a business imperative. Without diverse teams, tech companies will build products that fail to serve large portions of the population.

I love that kids are internalizing the need for diversity at a young age. And looking at the panelists, I was proud of the diverse perspectives they represented: all different ages and backgrounds (Adam is Egyptian-American, Tessel is Dutch-American, and Kavya is Indian-American). 

As a parent, I sometimes worry about the adverse effects of technology on my kids. But I was struck by how aware and savvy young people are about technology and its implications. Young as they are, these kids raised issues that we as an industry should have addressed long ago. The future is in good hands, but we need to take action today on concerns that affect not just the next generation, but all of us. So tech leaders, take note.
