Prompt Security Launches With AI Protection for the Enterprise


January 24, 2024 at 10:05AM

Prompt Security has launched a solution that uses AI to secure companies’ AI products, preventing prompt injection and jailbreaks. Its approach also aims to prevent accidental exposure of sensitive data to tools like ChatGPT. Recognizing the potential risks of generative AI adoption, Prompt Security offers protection by inspecting prompts and model responses, cataloging the AI tools in use, and defining access policies. The co-founders previously worked at Orca Security, and the company raised a $5 million seed round led by Hetz Ventures.

The launch of Prompt Security, which offers an AI-based solution to secure companies’ AI products, comes amid notable industry concern about the potential risks of adopting generative AI (GenAI). A recent Dark Reading survey highlights several such risks, including the opaqueness of third-party tools, a lack of consensus on GenAI guidelines and policies, and data governance concerns.

Prompt Security’s solution aims to address these risks by safeguarding interactions with GenAI in organizations. It checks each prompt and model response to protect against exposure of sensitive data, block harmful content, and defend against GenAI-specific attacks such as prompt injection, jailbreaking, and data extraction. Additionally, it utilizes contextual LLM-based models to detect and redact sensitive data, safeguarding customer and employee information as well as intellectual property from accidental exposure.
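To make the inspection step concrete, here is a minimal sketch of what prompt inspection and redaction could look like. This is an illustration only, not Prompt Security's implementation: the article says the product uses contextual LLM-based models, whereas this sketch uses simple regular-expression patterns, and all pattern and function names are hypothetical.

```python
import re

# Hypothetical signatures of common prompt-injection phrasings.
# A production system would use contextual models, not fixed regexes.
INJECTION_PATTERNS = [
    re.compile(r"ignore (all )?previous instructions", re.IGNORECASE),
    re.compile(r"you are now in developer mode", re.IGNORECASE),
]

# Hypothetical sensitive-data patterns (email addresses, US SSNs).
PII_PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def inspect_prompt(prompt: str) -> dict:
    """Flag likely injection attempts and redact sensitive data."""
    flagged = any(p.search(prompt) for p in INJECTION_PATTERNS)
    redacted = prompt
    for label, pattern in PII_PATTERNS.items():
        redacted = pattern.sub(f"[{label} REDACTED]", redacted)
    return {"blocked": flagged, "sanitized": redacted}
```

The same check can run symmetrically on model responses, so that sensitive data is caught whether it flows into the model or out of it.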

The solution also provides the capability to catalog the array of AI tools used within an organization, allowing the security team to monitor their usage and define access policies by application and user group.
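A policy model like the one described, keyed by application and user group, might be sketched as a simple lookup table with wildcard fallbacks. The policy names, actions, and lookup order below are assumptions for illustration, not the product's actual schema.

```python
# Hypothetical policy table: (application, user group) -> action.
# "redact" means the tool is allowed but sensitive data is stripped.
POLICIES = {
    ("ChatGPT", "engineering"): "allow",
    ("ChatGPT", "finance"): "redact",
    ("*", "contractors"): "block",
}

def resolve_policy(app: str, group: str) -> str:
    """Resolve the action for an (app, group) pair, most specific first."""
    for key in ((app, group), (app, "*"), ("*", group), ("*", "*")):
        if key in POLICIES:
            return POLICIES[key]
    return "block"  # default-deny for uncataloged tools
```

A default-deny fallback reflects the cataloging idea in the article: a tool the security team has not yet reviewed gets no access until a policy is written for it.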

The co-founders of Prompt Security, Itamar Golan and Lior Drihem, previously held positions at Orca Security. The $5 million seed round for Prompt Security was led by Hetz Ventures, with participation from Four Rivers and multiple angel investors.
