The Roadmap

Wisent-Guard is not fully optimized. We will continue to add new features to support your journey and decrease the time to happiness.

Support for Other Models

Make Qwen, DeepSeek, and Mistral easy to use, with utility functions for determining the optimal layer to read representations from, backed by extensive tests
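The layer-selection utility is still on the roadmap; as a rough illustration of the idea, the sketch below scores each layer by how well its activations separate contrastive (harmful vs. harmless) examples, using a Fisher-style separability criterion. Everything here, the `best_layer` helper and the simulated activations, is our own hypothetical example, not Wisent-Guard API:

```python
import numpy as np

def best_layer(harmful, harmless):
    """Pick the layer whose activations best separate the two classes.

    harmful, harmless: arrays of shape (n_layers, n_samples, hidden_dim)
    holding per-layer activations for contrastive prompt pairs. Each layer
    is scored by the distance between class means, normalized by the
    within-class spread; the highest-scoring layer index is returned.
    """
    scores = []
    for h, s in zip(harmful, harmless):
        gap = np.linalg.norm(h.mean(axis=0) - s.mean(axis=0))
        spread = h.std() + s.std() + 1e-8
        scores.append(gap / spread)
    return int(np.argmax(scores))

# Simulated activations: 4 layers, 32 samples, 16-dim hidden states.
rng = np.random.default_rng(0)
harmful = rng.normal(size=(4, 32, 16))
harmless = rng.normal(size=(4, 32, 16))
harmless[2] += 3.0  # make layer 2 clearly separable

print(best_layer(harmful, harmless))  # → 2
```

With real models, the per-layer arrays would come from the model's hidden states rather than random draws; the scoring logic stays the same.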

Integrations for Easier Deployment

Please let us know what open-source inference frameworks Wisent-Guard should be integrated with. We want to make it easy and intuitive for you to deploy our software!

Documentation in Other Languages

Especially Mandarin, to help our users

Optimal Layers for Different Models

Research and provide recommended optimal layers for different model families (Qwen, Mistral, Llama) to maximize representation quality and detection accuracy

Optimal Layers for Different Tasks

Task-specific layer recommendations for different use cases (harmful content detection, hallucination detection, bias detection, etc.)

Optimal Activation Collection Strategy

Improved methods for efficiently collecting and processing activations to minimize computational overhead while maintaining detection quality
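One common way to keep that overhead low is to accumulate running statistics instead of storing every activation. The accumulator below is a minimal sketch of that pattern, our illustration rather than Wisent-Guard code:

```python
import numpy as np

class ActivationAccumulator:
    """Streaming mean of activations: O(hidden_dim) memory no matter
    how many tokens or batches are processed."""

    def __init__(self, hidden_dim):
        self.total = np.zeros(hidden_dim)
        self.count = 0

    def update(self, batch):
        # batch: (n_tokens, hidden_dim) activations from one forward pass
        self.total += batch.sum(axis=0)
        self.count += batch.shape[0]

    def mean(self):
        return self.total / max(self.count, 1)

acc = ActivationAccumulator(8)
for _ in range(10):
    acc.update(np.ones((4, 8)))

print(acc.count)  # → 40 tokens seen, yet only one 8-dim vector is stored
```

The same trick extends to running covariance or per-class sums, which is usually all a linear detection probe needs.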

Latency Measurement and Optimization

Built-in tools to measure and optimize latency when using Wisent-Guard for generating detection scores in production environments
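Until those tools land, latency can be measured by hand with warmup runs and percentile reporting, since tail latency matters more than the mean in production. A stdlib-only sketch (the `measure_latency` helper is hypothetical, not part of Wisent-Guard):

```python
import time
import statistics

def measure_latency(fn, *args, warmup=3, runs=20):
    """Time fn(*args) over several runs and report p50/p95 in milliseconds."""
    for _ in range(warmup):          # warm caches before timing
        fn(*args)
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn(*args)
        samples.append((time.perf_counter() - start) * 1e3)
    samples.sort()
    return {
        "p50_ms": statistics.median(samples),
        "p95_ms": samples[int(0.95 * (len(samples) - 1))],
    }

# Stand-in workload for a detection-score call:
stats = measure_latency(lambda: sum(range(10_000)))
print(stats["p50_ms"] <= stats["p95_ms"])  # → True
```

In practice `fn` would be the guard's scoring call, measured on the same hardware and batch sizes used in production.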

Optimal Training Sample Size

Research-backed recommendations for the optimal number of training samples needed to achieve good classifier performance for different detection tasks
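Sample-size guidance of this kind typically comes from learning curves: train the classifier on growing subsets and see where held-out accuracy plateaus. The toy sketch below does this with a nearest-centroid probe on synthetic "activations"; all names and numbers are illustrative, not Wisent-Guard internals:

```python
import numpy as np

def centroid_accuracy(X_train, y_train, X_test, y_test):
    """Nearest-class-mean probe, a stand-in for the activation classifier."""
    c0 = X_train[y_train == 0].mean(axis=0)
    c1 = X_train[y_train == 1].mean(axis=0)
    pred = (np.linalg.norm(X_test - c1, axis=1)
            < np.linalg.norm(X_test - c0, axis=1)).astype(int)
    return float((pred == y_test).mean())

rng = np.random.default_rng(1)

def make_split(n, dim=16, shift=1.0):
    """Synthetic activations: class 1 is shifted by `shift` in every dim."""
    y = np.tile([0, 1], n // 2)
    X = rng.normal(size=(n, dim))
    X[y == 1] += shift
    return X, y

X_test, y_test = make_split(400)
for n in (8, 32, 128, 512):              # growing training budgets
    X_train, y_train = make_split(n)
    acc = centroid_accuracy(X_train, y_train, X_test, y_test)
    print(f"{n:4d} samples -> accuracy {acc:.2f}")
```

The budget where the curve flattens is the recommendation; real curves would be built from real model activations per task.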

Azure Marketplace Integration

One-click deployment on Azure infrastructure with seamless integration into Azure Machine Learning and Azure OpenAI Service

AWS Marketplace Integration

Deploy on AWS with auto-scaling support, integration with Amazon SageMaker, and compatibility with AWS Bedrock models

Google Cloud Platform Integration

Integration with Vertex AI and Cloud Run for scalable deployment, with support for Google Cloud's Gemini and PaLM models

Want to influence our roadmap?

We value your feedback and want to build features that matter most to you. Let us know what you'd like to see!