Copilot is ready for takeoff: Microsoft rolls out artificial intelligence for Windows


Already we’ve seen studies reviewing the security of GitHub Copilot’s code contributions. A paper published last August by researchers on arXiv, the preprint server hosted by Cornell University, examined how secure, or how vulnerable, code becomes when developers rely on GitHub Copilot to augment their coding skills.

The paper indicates that “as the user adds lines of code to the program, Copilot continuously scans the program and periodically uploads some subset of lines, the position of the user’s cursor, and metadata before generating some code options for the user to insert.”

The AI generates code that is functionally relevant to the program as implied by comments, docstrings, and function names, the paper states. “Copilot also reports a numerical confidence score for each of its proposed code completions, with the top-scoring (highest-confidence) score presented as the default selection for the user. The user can choose any of Copilot’s options.”

Copilot-generated code can create vulnerabilities

The study tested 1,692 programs generated in 89 different code-completion scenarios and found that 40% of them were vulnerable. As the authors indicated, “while Copilot can rapidly generate prodigious amounts of code, our conclusions reveal that developers should remain vigilant (‘awake’) when using Copilot as a co-pilot. Ideally, Copilot should be paired with appropriate security-aware tooling during both training and generation to minimize the risk of introducing security vulnerabilities.”
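To make that risk concrete, here is a minimal, hypothetical sketch of the kind of weakness the study’s CWE-based scenarios target: a SQL query built by string interpolation (CWE-89, SQL injection) next to the parameterized version a code reviewer or a security-aware scanner would steer you toward. The snippet is illustrative only; it is not taken from the study’s generated programs, and the table and function names are invented for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")

def get_email_unsafe(name: str):
    # The pattern an assistant can happily autocomplete: the user-supplied value
    # is interpolated straight into the SQL text, so input such as
    # "x' OR '1'='1" rewrites the query (CWE-89, SQL injection).
    return conn.execute(f"SELECT email FROM users WHERE name = '{name}'").fetchall()

def get_email_safe(name: str):
    # Parameterized query: the driver treats the value as data, not as SQL.
    return conn.execute("SELECT email FROM users WHERE name = ?", (name,)).fetchall()

print(get_email_unsafe("x' OR '1'='1"))  # leaks every row in the table
print(get_email_safe("x' OR '1'='1"))    # returns an empty list
```

A static analyzer such as Bandit or CodeQL will typically flag the first function, which is exactly the “security-aware tooling” backstop the authors recommend pairing with Copilot.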

Ultimately, you need to start thinking about and planning for your firm’s implementation of any and all AI modules that will arrive in your operating systems, your API implementations, or your code. Using AI doesn’t mean the application or code is vetted by default; it’s just another type of input that you need to review and manage.

Of the Microsoft AI features coming to desktops and applications, some, like Copilot in Windows, are native to the platform, carry no additional cost, and can be managed with Group Policy, Intune, or other management tools. Once the October security updates have been deployed to a sample Windows 11 22H2 workstation, an IT department can proactively manage Copilot in Windows using the Group Policy or Intune settings noted here.
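For admins who want to experiment before pushing a policy broadly, the per-user setting behind the “Turn off Windows Copilot” Group Policy is the TurnOffWindowsCopilot registry value. The sketch below writes it with Python’s winreg module on a test workstation; it is a local illustration only, not a substitute for the Group Policy or Intune deployment described above, and you should confirm the key path against Microsoft’s current documentation before relying on it.

```python
# Minimal sketch: apply the per-user "Turn off Windows Copilot" policy by writing
# HKCU\Software\Policies\Microsoft\Windows\WindowsCopilot\TurnOffWindowsCopilot = 1.
# Run on a Windows 11 22H2 test machine as the target user; in production, deliver
# the same setting through Group Policy or an Intune configuration profile.
import winreg

KEY_PATH = r"Software\Policies\Microsoft\Windows\WindowsCopilot"

with winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
    # REG_DWORD 1 hides Copilot for this user; delete the value (or set it to 0)
    # to re-enable it after testing.
    winreg.SetValueEx(key, "TurnOffWindowsCopilot", 0, winreg.REG_DWORD, 1)

print("Policy value written; sign out and back in for the change to take effect.")
```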

