{"id":6574,"date":"2023-01-11T22:02:16","date_gmt":"2023-01-11T22:02:16","guid":{"rendered":"https:\/\/techclot.com\/index.php\/2023\/01\/11\/program-teaches-us-air-force-personnel-the-fundamentals-of-ai-mit-news\/"},"modified":"2023-01-11T22:02:16","modified_gmt":"2023-01-11T22:02:16","slug":"program-teaches-us-air-force-personnel-the-fundamentals-of-ai-mit-news","status":"publish","type":"post","link":"https:\/\/techclot.com\/index.php\/2023\/01\/11\/program-teaches-us-air-force-personnel-the-fundamentals-of-ai-mit-news\/","title":{"rendered":"Program teaches US Air Force personnel the fundamentals of AI | MIT News"},"content":{"rendered":"<p><a href=\"https:\/\/www.google.com\/url?rct=j&#038;sa=t&#038;url=https:\/\/news.mit.edu\/2023\/ai-training-program-us-air-force-0111&#038;ct=ga&#038;cd=CAIyHDkyYmU1MGQ5NjY1NjYxZTA6Y28udWs6ZW46R0I&#038;usg=AOvVaw0sB0ur6Tzf-wpg-0e_INJe\">Program teaches US Air Force personnel the fundamentals of AI | MIT News<\/a><\/p>\n<div><img data-recalc-dims=\"1\" decoding=\"async\" src=\"https:\/\/i0.wp.com\/techclot.com\/wp-content\/uploads\/2023\/01\/GPlFkD.jpg?w=640&#038;ssl=1\" class=\"ff-og-image-inserted\"><\/div>\n<div class=\"news-article--content--body--inner\">\n<div class=\"paragraph paragraph--type--content-block-text paragraph--view-mode--default\">\n<p>A new academic program developed at MIT aims to teach U.S. Air and Space Forces personnel to understand and utilize artificial intelligence technologies. 
In a recent <a href=\"https:\/\/ieeexplore.ieee.org\/abstract\/document\/9962632\" target=\"_blank\" rel=\"noopener\">peer-reviewed study<\/a>, the program researchers found that this approach was effective and well-received by employees with diverse backgrounds and professional roles.<\/p>\n<p>The project, which was funded by the Department of the Air Force\u2013MIT Artificial Intelligence Accelerator, seeks to contribute to AI educational research, specifically regarding ways to maximize learning outcomes at scale for people from a variety of educational backgrounds.<\/p>\n<p>Experts in MIT Open Learning built a curriculum for three general types of military personnel \u2014 leaders, developers, and users \u2014 utilizing existing MIT educational materials and resources. They also created new, more experimental courses that were targeted at Air and Space Forces leaders.<\/p>\n<p>Then, MIT scientists led a research study to analyze the content, evaluate the experiences and outcomes of individual learners during the 18-month pilot, and propose innovations and insights that would enable the program to eventually scale up.<\/p>\n<p>They used interviews and several questionnaires, offered to both program learners and staff, to evaluate how 230 Air and Space Forces personnel interacted with the course material. They also collaborated with MIT faculty to conduct a content gap analysis and identify how the curriculum could be further improved to address the desired skills, knowledge, and mindsets.<\/p>\n<p>Ultimately, the researchers found that the military personnel responded positively to hands-on learning; appreciated asynchronous, time-efficient learning experiences to fit in their busy schedules; and strongly valued a team-based, learning-through-making experience but sought content that included more professional and soft skills. Learners also wanted to see how AI directly applied to their day-to-day work and the broader mission of the Air and Space Forces. 
They were also interested in more opportunities to engage with others, including their peers, instructors, and AI experts.<\/p>\n<p>Based on these findings, which the program researchers recently <a href=\"https:\/\/ieeexplore.ieee.org\/abstract\/document\/9962632\" target=\"_blank\" rel=\"noopener\">shared at the IEEE Frontiers in Education Conference<\/a>, the team is augmenting the educational content and adding new technical features to the portal for the next iteration of the study, which is currently underway and will extend through 2023.<\/p>\n<p>\u201cWe are digging deeper into expanding what we think the opportunities for learning are, that are driven by our research questions but also from understanding the science of learning about this kind of scale and complexity of a project. But ultimately we are also trying to deliver some real translational value to the Air Force and the Department of Defense. This work is leading to a real-world impact for them, and that is really exciting,\u201d says principal investigator Cynthia Breazeal, who is MIT\u2019s dean for digital learning, director of MIT RAISE (Responsible AI for Social Empowerment and Education), and head of the Media Lab\u2019s Personal Robots research group.<\/p>\n<p><strong>Building learning journeys<\/strong><\/p>\n<p>At the outset of the project, the Air Force gave the program team a set of profiles that captured educational backgrounds and job functions of six basic categories of Air Force personnel. 
The team then created three archetypes it used to build \u201clearning journeys\u201d \u2014 a series of training programs designed to impart a set of AI skills for each profile.<\/p>\n<p>The Lead-Drive archetype is an individual who is making strategic decisions; the Create-Embed archetype is a technical worker who is implementing AI solutions; and the Facilitate-Employ archetype is an end-user of AI-augmented tools.<\/p>\n<p>It was a priority to convince the Lead-Drive archetype of the importance of this program, says lead author Andr\u00e9s Felipe Salazar-Gomez, a research scientist at MIT Open Learning.<\/p>\n<p>\u201cEven inside the Department of Defense, leaders were questioning if training in AI is worth it or not,\u201d he explains. \u201cWe first needed to change the mindset of the leaders so they would allow the other learners, developers, and users to go through this training. At the end of the pilot we found they embraced this training. They had a different mindset.\u201d<\/p>\n<p>The three learning journeys, which ranged from six to 12 months, included a combination of existing AI courses and materials from MIT Horizon, MIT Lincoln Laboratory, MIT Sloan School of Management, the Computer Science and Artificial Intelligence Laboratory (CSAIL), the Media Lab, and MITx MicroMasters programs. Most educational modules were offered entirely online, either synchronously or asynchronously.<\/p>\n<p>Each learning journey included different content and formats based on the needs of users. For instance, the Create-Embed journey included a five-day, in-person, hands-on course taught by a Lincoln Laboratory research scientist&nbsp;that offered a deep dive into technical AI material, while the Facilitate-Employ journey comprised self-paced, asynchronous learning experiences, primarily drawing on MIT Horizon materials that are designed for a more general audience.<\/p>\n<p>The researchers also created two new courses for the Lead-Drive cohort. 
One, a synchronous online course called The Future of Leadership: Human and AI Collaboration in the Workforce,<em> <\/em>developed in collaboration with Esme Learning, was based on the leaders\u2019 desire for more training around ethics and human-centered AI design and more content on human-AI collaboration in the workforce. The researchers also crafted an experimental, three-day, in-person course called Learning Machines: Computation, Ethics, and Policy that immersed leaders in a constructionist-style learning experience where teams worked together on a series of hands-on activities with autonomous robots that culminated in an escape-room style capstone competition that brought everything together.<\/p>\n<p>The Learning Machines course was wildly successful, Breazeal says.<\/p>\n<p>\u201cAt MIT, we learn by making and through teamwork. We thought, what if we let executives learn about AI this way?\u201d she explains. \u201cWe found that the engagement is much deeper, and they gained stronger intuitions about what makes these technologies work and what it takes to implement them responsibly and robustly. I think this is going to deeply inform how we think about executive education for these kinds of disruptive technologies in the future.\u201d<\/p>\n<p><strong>Gathering feedback, enhancing content<\/strong><\/p>\n<p>Throughout the study, the MIT researchers checked in with the learners using questionnaires to obtain their feedback on the content, pedagogies, and technologies used. They also had MIT faculty analyze each learning journey to identify educational gaps.<\/p>\n<p>Overall, the researchers found that the learners wanted more opportunities to engage, either with their peers through team-based activities or with faculty and experts through synchronous components of online courses. 
And while most personnel found the content to be interesting, they wanted to see more examples that were directly applicable to their day-to-day work.<\/p>\n<p>Now in the second iteration of the study, researchers are using that feedback to enhance the learning journeys. They are designing knowledge checks that will be a part of the self-paced, asynchronous courses to help learners engage with the content. They are also adding new tools to support live Q&amp;A events with AI experts and help build more community among learners.<\/p>\n<p>The team is also looking to add specific Department of Defense examples throughout the educational modules, and include a scenario-based workshop.<\/p>\n<p>\u201cHow do you upskill a workforce of 680,000 across diverse work roles, all echelons, and at scale? This is an MIT-sized problem, and we are tapping into the world-class work that MIT Open Learning has been doing since 2013 \u2014 democratizing education on a global scale,\u201d says Maj. John Radovan, deputy director of the DAF-MIT AI Accelerator. \u201cBy leveraging our research partnership with MIT, we are able to research the optimal pedagogy of our workforce through focused pilots. We are then able to quickly double down on unexpected positive results and pivot on lessons learned. This is how you accelerate positive change for our airmen and guardians.\u201d<\/p>\n<p>As the study progresses, the program team is sharpening their focus on how they can enable this training program to reach a larger scale.<\/p>\n<p>\u201cThe U.S. Department of Defense is the largest employer in the world. When it comes to AI, it is really important that their employees are all speaking the same language,\u201d says Kathleen Kennedy, senior director of MIT Horizon and executive director of the MIT Center for Collective Intelligence. \u201cBut the challenge now is scaling this so that learners who are individual people get what they need and stay engaged. 
And this will certainly help inform how different MIT platforms can be used with other types of large groups.\u201d<\/p>\n<\/div>\n<\/div>\n<p>Published at Wed, 11 Jan 2023 05:11:45 +0000<\/p>\n<p><a href=\"https:\/\/www.google.com\/url?rct=j&#038;sa=t&#038;url=https:\/\/www.thehindubusinessline.com\/info-tech\/putting-the-art-in-artificial-intelligence\/article66356234.ece&#038;ct=ga&#038;cd=CAIyHDkyYmU1MGQ5NjY1NjYxZTA6Y28udWs6ZW46R0I&#038;usg=AOvVaw2GvgFgdZmmLhg8gH2b7SgD\">AI art. Putting the art in artificial intelligence &#8211; The Hindu Business Line<\/a><\/p>\n<p>Teddy bears working on new AI research underwater with 1990s tech. An astronaut playing with cats as pixel art. A bowl of soup that is a portal to another dimension drawn on a cave wall. <\/p>\n<p>These are just a few of the infinite bizarre combinations of prompts that users can type into DALL-E 2, an AI platform from OpenAI that lets users generate art pieces from the textual prompts they type in. What started as a research project is now available to users as a beta version, with a promise of photorealism, high resolution, and accurate images. 
<\/p>\n<h5 class=\"title inline-title\">\nGenerating AI art on Dream by Wombo<br \/>\n<\/h5>\n<p><span class=\"caption-cont bold\">Here is a short walkthrough of how to generate AI art on Dream by Wombo using the prompt, \u2018A dog playing with a yarn of wool on the moon.\u2019<\/span><\/p>\n<div class=\"img-full-width\"><img decoding=\"async\" src=\"https:\/\/bl-i.thgim.com\/public\/news\/llvdd5\/article66356300.ece\/alternates\/FREE_660\/Dream_TradingCard.jpg\" alt=\"AI art generated on Dream with the prompt \u2018A pond filled with blooming lilies and some frogs\u2019\" title=\"AI art generated on Dream with the prompt \u2018A pond filled with blooming lilies and some frogs\u2019\" width=\"100%\" height=\"100%\">\n<div class=\"caption-cont \">\n<h4>\nAI art generated on Dream with the prompt \u2018A pond filled with blooming lilies and some frogs\u2019<br \/>\n<\/h4>\n<\/div>\n<\/div>\n<p>Generative AI is now everywhere: beyond art, <a href=\"https:\/\/www.thehindubusinessline.com\/info-tech\/internet-sensation-chatgpt-attracts-the-dark-side-of-tech\/article66353627.ece\" target=\"_blank\" rel=\"noopener\">OpenAI\u2019s ChatGPT<\/a> mimics human conversations and can write anything from student essays to computer programmes. There is also Lensa AI, an AI-based photo editing app that allows users to generate avatars from pre-existing photos. From code to conversations, art to music compositions, it seems anything can be generated, by anyone, with artificial intelligence at the helm. 
<\/p>\n<p><b>Also read: <a href=\"https:\/\/www.thehindubusinessline.com\/news\/variety\/lensa-ai-filter-is-this-app-safe\/article66265711.ece\" target=\"_blank\" rel=\"noopener\">Lensa AI: Is this app safe?<\/a>&nbsp;<\/b><\/p>\n<p>As this wave surges across online platforms, the talk surrounding AI art is growing louder, with digital artists voicing their views and critiques.\n<\/p>\n<h5 class=\"sub_head\">\nLearning curve<br \/>\n<\/h5>\n<p>Mira Malhotra, founder of Studio Kohl, a graphic design studio, detailed her experience with different AI text-to-image websites. \u201cMidjourney was a bit of a learning curve because I had to first learn how to use Discord, which wasn\u2019t very user-friendly. DALL-E was easier to use but generated a lot of \u201ccursed\u201d faces and immediately assumed I needed close crops. Wonder AI had similar issues,\u201d she said. <\/p>\n<p>\u201cSince then, I\u2019ve studied how prompts are built to generate pleasing and usable images, and I see that while great flukes are definitely possible, users need to be very specific on how they write prompts, and for that, you definitely do have to have experience in art.\u201d Malhotra also said that \u201cAI is creating a very cookie-cutter, banal, and uninspiring bunch of images unless used correctly by a real creative.\u201d <\/p>\n<h5 class=\"sub_head\">\nThe criticism<br \/>\n<\/h5>\n<p>AI art is by no means free of criticism. Some artists complain about the \u2018uncanny valley\u2019 effect of AI-generated faces, while others stress the need for proper regulation. Regardless, the widespread backlash against AI art has swept art forums, with \u201cNo AI Art\u201d images flooding sites such as ArtStation and Twitter. The online discourse began when artists such as @ZakugaMignon tweeted an image, writing, \u201cAi \u201cart\u201d is currently scraping the web for art and uses it in datasets. No artist gave consent to have their art used. 
We were not compensated.\u201d<\/p>\n<div class=\"inline_embed article-block-item\">\n<blockquote class=\"twitter-tweet\">\n<p lang=\"en\" dir=\"ltr\">1\/6 I created this image for everyone to use wherever they want. <br \/>Ai creates the &#8220;art&#8221; you see on the backs of artists being exploited. Ai &#8220;art&#8221; is currently scraping the web for art and uses it in datasets. No artist gave consent to have their art used. We were not compensated <a href=\"https:\/\/t.co\/eGn352MyCj\">pic.twitter.com\/eGn352MyCj<\/a><\/p>\n<p>\u2014 \ud83c\udfee Zakuga Mignon Art\ud83c\udfee (@ZakugaMignon) <a href=\"https:\/\/twitter.com\/ZakugaMignon\/status\/1602629045254082560?ref_src=twsrc%5Etfw\">December 13, 2022<\/a><\/p><\/blockquote>\n<\/div>\n<p>The need for clear differentiation then becomes pertinent. Jayesh Joshi, art director of Schbang, a creative agency, believes that AI art should be treated differently from digital art. \u201cMan-made and machine-made art are clumped together, which isn\u2019t fair.\u201d <\/p>\n<p>A paramount concern that many digital artists have expressed is that several AI art generators use an image bank of pre-existing, man-made art with no way of crediting the original artist. Joshi further said, \u201cAI development companies should be mindful of what they feed this image bank, and use their platform to uplift artists whose works they\u2019re using &#8211; possibly commissioning the creators.\u201d\n<\/p>\n<h5 class=\"sub_head\">\nStriking a balance<br \/>\n<\/h5>\n<p>Another artist, Aditya Mehta, CEO of Art&amp;Found, observed the need to strike a balance between AI and man-made art. \u201cIt is interesting to see how AI and Machine Learning keep evolving, which is an inevitable and endless pursuit of how AI can co-exist with and help humanity.<\/p>\n<p>\u201cMy view is that you can\u2019t compare the two. You can have them co-exist and collaborate. Art simply created using AI-generated art tools doesn\u2019t excite or shock me. 
What excites me is seeing how someone uses them to tell a compelling story,\u201d he said.<\/p>\n<p><b>Also read: <a href=\"https:\/\/www.thehindubusinessline.com\/info-tech\/as-ai-rises-lawmakers-try-to-catch-up\/article66313171.ece\" target=\"_blank\" rel=\"noopener\">As AI rises, lawmakers try to catch up<\/a><\/b><\/p>\n<p>To reach that level of collaboration, AI art can be deployed for several purposes, including generating reference images that let artists convey their vision through prompts. The nuance here, according to these artists, is that AI is used as an aiding tool, not as the medium itself. <\/p>\n<p>\u201cAI art can also streamline processes and help in developing quick thumbnail sketches, palette experiments, composition layouts, and lighting variations. If done right, AI can assist human artists in creating brilliant new work,\u201d added Joshi. <\/p>\n<p>With the spotlight on AI art shining brighter, one thing seems certain: it is going to be near impossible to nip its growth in the bud. \u201cI see the spread of AI growing exponentially in 2023 to memes, metaverse, animation, films, CGI, gaming, comics, graphic novels, communication aid, and education,\u201d said Mehta. 
\u201cAnd one Black Mirror episode.\u201d<\/p>\n<div class=\"share-date\">Published on January 11, 2023<\/div>\n<p>Published at Wed, 11 Jan 2023 05:07:56 +0000<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Program teaches US Air Force personnel the fundamentals of AI | MIT News A new&#8230;<\/p>\n","protected":false},"author":3,"featured_media":6573,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":false,"_jetpack_newsletter_tier_id":0,"_jetpack_memberships_contains_paywalled_content":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[3],"tags":[],"class_list":["post-6574","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-artificial-intelligence"],"jetpack_featured_media_url":"https:\/\/i0.wp.com\/techclot.com\/wp-content\/uploads\/2023\/01\/GPlFkD.jpg?fit=1920%2C1280&ssl=1","jetpack_sharing_enabled":true,"jetpack_shortlink":"https:\/\/wp.me\/p3orZX-1I2","jetpack-related-posts":[],"_links":{"self":[{"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/posts\/6574","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/types\/po
st"}],"author":[{"embeddable":true,"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/comments?post=6574"}],"version-history":[{"count":0,"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/posts\/6574\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/media\/6573"}],"wp:attachment":[{"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/media?parent=6574"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/categories?post=6574"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/tags?post=6574"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}