
AI coding assistants can help startups develop products, seed VCs believe

By now, there’s hardly a coder in the world who isn’t using an AI copilot in some way. But using GitHub Copilot or Cursor.AI to ask technical questions and get debugging help could be just the beginning. AI coding may one day involve agents that can write the programs themselves based on a natural language […]


GitHub Copilot moves beyond OpenAI models to support Claude 3.5, Gemini

The large language model-based coding assistant GitHub Copilot will switch from exclusively using OpenAI's GPT models to a multi-model approach over the coming weeks, GitHub CEO Thomas Dohmke announced in a post on GitHub's blog.

First, Anthropic's Claude 3.5 Sonnet will roll out to Copilot Chat's web and VS Code interfaces over the next few weeks. Google's Gemini 1.5 Pro will come a bit later.

Additionally, GitHub will soon add support for a wider range of OpenAI models, including o1-preview and o1-mini, which are intended to be stronger at advanced reasoning than GPT-4, which Copilot has used until now. Developers will be able to switch between models (even mid-conversation) to fit the model to the task at hand, and organizations will be able to choose which models their team members can use.
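For a sense of what per-request model selection looks like in practice, here is a minimal, hypothetical sketch of the same idea (not GitHub’s implementation): one shared chat history routed to whichever backend is selected for each turn. It assumes the official openai and anthropic Python SDKs and API keys in the environment.

```python
# Illustrative sketch only: not GitHub's implementation, just the general
# pattern of routing one shared chat history to a user-selected backend.
# Assumes the official `openai` and `anthropic` Python SDKs are installed
# and OPENAI_API_KEY / ANTHROPIC_API_KEY are set in the environment.
import anthropic
from openai import OpenAI

openai_client = OpenAI()
anthropic_client = anthropic.Anthropic()


def ask(model: str, messages: list[dict]) -> str:
    """Send the same conversation to whichever model the user picked."""
    if model.startswith("claude"):
        resp = anthropic_client.messages.create(
            model=model, max_tokens=1024, messages=messages
        )
        return resp.content[0].text
    resp = openai_client.chat.completions.create(model=model, messages=messages)
    return resp.choices[0].message.content


# Start the conversation on one model...
history = [{"role": "user", "content": "Explain Python list comprehensions."}]
answer = ask("gpt-4o", history)

# ...then switch models mid-conversation, carrying the history along.
history += [
    {"role": "assistant", "content": answer},
    {"role": "user", "content": "Now show the same thing with a generator."},
]
print(ask("claude-3-5-sonnet-20241022", history))
```

An organization-level policy of the kind described in the announcement would amount to restricting the set of model names such a helper is allowed to accept.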


AI Literacy: Getting Started

The speed of recent innovation is head-spinning. Here’s some help.

GUEST COLUMN | by Delia DeCourcy

“As artificial intelligence proliferates, users who intimately understand the nuances, limitations, and abilities of AI tools are uniquely positioned to unlock AI’s full innovative potential.” 

Ethan Mollick’s insight, from his recent book Co-Intelligence: Living and Working with AI, is a great argument for why AI literacy is crucial for our students and faculty right now. To understand AI, you have to use it, and use it a lot: not only so you know how AI can assist you, but also, as Mollick explains, so you know how it will affect you and your current job (or, in the case of students, the job they’ll eventually have).

What is AI Literacy?

Definitions of AI literacy abound but most have a few characteristics in common:

 

Deeper dimensions of that second bullet could include knowing the difference between AI and generative AI, understanding the biases and ethical implications of large language model training, and mastering prompting strategies, to name a few.

AI Literacy and Future Readiness

If the two-year generative AI tidal wave set off by ChatGPT’s public launch isn’t enough to convince you of the need for AI literacy, consider these facts and statistics:

  • Studies from the National Artificial Intelligence Advisory Committee (NAIAC) in 2023 show that 80% of the US workforce performs some tasks that will be affected by large language models, and that 20% of jobs will see about half of their daily tasks affected by AI.
  • A poll conducted by Impact Research for the Walton Family Foundation revealed that as of June 2024, about half of K-12 students and teachers said they use ChatGPT at least weekly. 
  • According to a June report from Pearson, 56% of higher education students said that generative AI tools made them more efficient in the spring semester, while only 14% of faculty were confident about using AI in their teaching. 
  • AI is already integrated into many of the devices and platforms we use every day. That’s now true in education as well, with the integration of the Gemini chatbot into Google Workspace for Education and Microsoft’s offering of Copilot to education users.

Supporting institutions, educators, and students with AI literacy

Institutions – Assess, Plan, Implement

Assessing institutional readiness for generative AI integration, planning, and implementation means looking not only at curriculum integration and professional development for educators, but also at how this technology can be used to personalize the student experience, streamline administration, and reduce operating costs, not to mention the critical step of developing institutional policies for responsible and ethical AI use. This complex planning process assumes a certain level of AI literacy for the stakeholders contributing to the planning. So some foundational learning might be in order prior to the “assess” stage.

‘This complex planning process assumes a certain level of AI literacy for the stakeholders contributing to the planning. So some foundational learning might be in order prior to the “assess” stage.’

Fortunately for K-12 leaders, the Council of the Great City Schools and CoSN have developed a Gen AI Readiness Checklist, which helps districts think through implementation necessities, from executive leadership to security and risk management, to ensure a rollout aligns with existing instructional and operational objectives. It’s also helpful to look at model districts like Gwinnett County Public Schools in Georgia, which has been integrating AI into its curriculum since before ChatGPT’s launch.

Similarly, in higher education, EDUCAUSE provides a framework for AI governance, operations, and pedagogy, and has also published the 2024 EDUCAUSE AI Landscape Study, which helps colleges and universities better understand the promise and pitfalls of AI implementation. For an example of what AI assessment and planning looks like at a leading institution, see The Report of the Yale Task Force on Artificial Intelligence, published in June of this year. The document explains how AI is already in use across campus, provides a vision for moving forward, and suggests actions to take.

Educators – Support Innovation through Collaboration

Whether they are teaching or administering, in a university or a K-12 district, educators need to upskill and develop a generative AI toolbox. The more we use the technology, the better we will understand its power and potential. Fortunately, both Google Gemini and Microsoft Copilot have virtual professional development (PD) courses that educators can use to get started. From there, it’s all about integrating these productivity platforms into our day-to-day work to “understand the nuances, limitations, and abilities” of the tools. And for self-paced AI literacy learning, Common Sense Education’s AI Foundations for Educators course introduces the basics of AI and ethical considerations for integrating this technology into teaching.

The best learning is inherently social, so working with a team or department to share discoveries about how generative AI can help with personalizing learning materials, lesson plan development, formative assessment, and daily productivity is ideal. For more formalized implementation of this new technology, consider regular coaching and modeling for new adopters. At Hillsborough Township Public Schools in New Jersey, the district has identified a pilot group of intermediate and middle school teachers, technology coaches, and administrators who are exploring how Google Gemini can help with teaching and learning this year. With an initial PD workshop before the school year, followed by regular touch points, coaching, and modeling, the pilot will give the district a view of whether and how it wants to scale generative AI with faculty across all of its schools.

‘The best learning is inherently social, so working with a team or department to share discoveries about how generative AI can help with personalizing learning materials, lesson plan development, formative assessment, and daily productivity is ideal.’

In higher education, many institutions are providing specific guidance to faculty about how generative AI should and should not be used in the classroom, as well as how to address it in their syllabi with regard to academic integrity and acceptable use. At the University of North Carolina at Chapel Hill, faculty are engaging in communities of practice that examine how generative AI is being used in their discipline and the instructional issues surrounding its use, and that redesign curricula to integrate this new technology. These critical AI literacy efforts are led by the Center for Faculty Excellence and funded by Lenovo’s Instructional Innovation Grants program at UNC. This early work on generative AI integration will support future scaling across campus.

Students – Integrate AI Literacy into the Curriculum

The time to initiate student AI literacy is now. Generative AI platforms are plentiful and students are using them. In the work world, this powerful technology is being embraced across industries. We want students to be knowledgeable, skilled, and prepared. They need to understand not only how to use AI responsibly, but also how it works and how it can be harmful. 

‘We want students to be knowledgeable, skilled, and prepared. They need to understand not only how to use AI responsibly, but also how it works and how it can be harmful.’

The AI literacy students need will vary based on age. Fortunately, expert organizations like ISTE have already made recommendations about which vocabulary and concepts K-12 educators can introduce at which grade levels to help students understand and use AI responsibly. AI literacy must be integrated across the curriculum in ways that are relevant for each discipline. But this is one more thing to add to educators’ already full plates as they develop their own AI literacy. Fortunately, MIT, Stanford, and Common Sense Education have developed AI literacy materials that can be integrated into existing curriculum. And Microsoft has an AI classroom toolkit that includes materials on teaching prompting.

The speed of recent innovation is head-spinning. Remaining technologically literate in the face of that innovation is no small task. It will be critical for educators and institutions to assess and implement AI in ways that matter, ensuring it helps them achieve their goals. Just as importantly, educators and institutions play an essential role in activating students’ AI literacy as students take their first steps into this new technology landscape and ultimately embark on their first professional jobs outside of school.

Delia DeCourcy is a Senior Strategist for the Lenovo Worldwide Education Portfolio. Prior to joining Lenovo, she had a 25-year career in education as a teacher, consultant, and administrator, most recently as the Executive Director of Digital Teaching and Learning for a district in North Carolina. Previously, she was a literacy consultant serving 28 school districts in Michigan, focusing on best practices in reading and writing instruction. Delia has also been a writing instructor at the University of Michigan, where she was awarded the Moscow Prize for Excellence in Teaching Composition. In addition, she served as a middle and high school English teacher, assistant principal, and non-profit director. She is the co-author of the curriculum text Teaching Romeo & Juliet: A Differentiated Approach, published by the National Council of Teachers of English. Connect with Delia on LinkedIn.


GitHub’s Copilot goes multi-model and adds support for Anthropic’s Claude and Google’s Gemini

GitHub today announced that it will now allow developers to switch between a number of large language models when they use Copilot Chat, its code-centric ChatGPT-like service. Until now, Copilot Chat was powered by OpenAI’s GPT-4. Going forward, developers can choose between Anthropic’s Claude 3.5 Sonnet, Google’s Gemini 1.5 Pro, and OpenAI’s GPT-4o, o1-preview, and […]


GitHub’s Copilot comes to Apple’s Xcode

At its Universe conference, GitHub today announced a number of major new products, including the Spark project for writing applications entirely with AI, as well as multi-model support for its Copilot service. But Copilot itself is also getting quite a few updates. With this release, Microsoft-owned GitHub is bringing Copilot to Apple’s Xcode environment for […]

