
Conversation

@marouanetalaa
Contributor

Description

Implement a Gemini LLMClient and test for it.

Related Issue

Implement a Gemini LLMClient #1901

Type of Change

  • 📚 Examples / docs / tutorials / dependencies update
  • 🔧 Bug fix (non-breaking change which fixes an issue)
  • 🥂 Improvement (non-breaking change which improves an existing feature)
  • 🚀 New feature (non-breaking change which adds functionality)
  • 💥 Breaking change (fix or feature that would cause existing functionality to change)
  • 🔐 Security fix

Checklist

  • I've read the CODE_OF_CONDUCT.md document.
  • I've read the CONTRIBUTING.md guide.
  • I've written tests for all new methods and classes that I created.
  • I've written the docstring in Google format for all the methods and classes that I used.
  • I've updated pdm.lock by running pdm update-lock (only applicable when pyproject.toml has been
    modified)

@kevinmessiaen kevinmessiaen self-requested a review June 11, 2024 02:05
@kevinmessiaen
Member

Thanks for the contribution 👍 I'll take a look at the PR!


@kevinmessiaen kevinmessiaen left a comment


The client is working properly, however it doesn't integrate with the rest of Giskard (scan, RAGET, ...).

The ChatMessage objects sent as input to the LLM client can have three roles (system, assistant, and user).

In our case, assistant should be mapped to model. For system, we will need to map it to user with a custom format so that the model understands the importance of the system prompt: https://www.googlecloudcommunity.com/gc/AI-ML/Implementing-System-Prompts-in-Gemini-Pro-for-Chatbot-Creation/m-p/715501

Let me know if you need help on this. You can take a look at our Bedrock client implementation.
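
To make the mapping concrete, here is a minimal sketch of the role conversion described above. The `ChatMessage` dataclass and `to_gemini_history` helper are hypothetical stand-ins (the real `ChatMessage` lives in `giskard.llm.client`), and the "# System instruction" wrapper for system prompts is just one possible format, following the workaround linked above:

```python
from dataclasses import dataclass
from typing import List, Sequence


@dataclass
class ChatMessage:
    # Minimal stand-in for giskard.llm.client.ChatMessage
    role: str
    content: str


def to_gemini_history(messages: Sequence[ChatMessage]) -> List[dict]:
    """Map ChatMessage roles onto Gemini's two-role scheme.

    Gemini chat turns only accept the roles 'user' and 'model', so:
      * assistant -> model
      * system    -> user, with the content wrapped so the model
        treats it as a high-priority instruction
      * user      -> user, unchanged
    """
    history = []
    for msg in messages:
        if msg.role == "assistant":
            history.append({"role": "model", "parts": [msg.content]})
        elif msg.role == "system":
            # Gemini has no native system role here; prepend a marker
            # so the instruction stands out from regular user turns.
            history.append(
                {"role": "user", "parts": [f"# System instruction\n{msg.content}"]}
            )
        else:
            history.append({"role": "user", "parts": [msg.content]})
    return history
```

The resulting list of `{"role": ..., "parts": [...]}` dicts matches the shape Gemini's chat interface expects for conversation history.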

PS: ideally this code snippet should work:

client = GeminiClient()
response = client.complete(messages=[
    ChatMessage(role='system', content='You are a "ping" service that always replies with "IT WORKS!"'),
    ChatMessage(role='user', content='Hello, does it work?'),
    ChatMessage(role='assistant', content='IT WORKS!'),
    ChatMessage(role='user', content='What is your goal?'),
])

assert response.role == 'assistant'  # 'model' should be mapped back to 'assistant' in the response too
assert response.content == 'IT WORKS!'
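
The response-side mapping the first assertion relies on can be sketched as a one-line helper. The function name `to_chat_message_role` is hypothetical, not part of the Giskard API:

```python
def to_chat_message_role(gemini_role: str) -> str:
    """Map Gemini's reply role back to the ChatMessage convention.

    Gemini labels its own turns 'model'; Giskard's ChatMessage uses
    'assistant', so responses must be converted before being returned.
    """
    return "assistant" if gemini_role == "model" else gemini_role
```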

@kevinmessiaen kevinmessiaen self-assigned this Jun 24, 2024
@kevinmessiaen kevinmessiaen enabled auto-merge June 24, 2024 06:11
@kevinmessiaen kevinmessiaen merged commit 565f58c into Giskard-AI:main Jul 3, 2024
