Freddy Copilot for Developers is a Generative AI-powered development experience tool that enables quicker, more intuitive app development. Spanning the Freshworks app development journey, it integrates seamlessly with Visual Studio Code and can suggest code snippets through conversational messages.
It is a powerful VS Code extension designed to help developers streamline their Freshworks application development process. With Freddy Copilot, developers can:
- Learn app development via chat-based tutorials
- Generate code rapidly
- Perform 1-Click actions such as:
  - 1-Click Review
  - 1-Click Documentation
  - 1-Click Publish
- VS Code 1.5 or higher
- FDK 9.0.2 or higher
- Node v18.13.0 or higher
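A quick way to check whether an installed tool satisfies the minimums above is a version-sort comparison. This is a minimal sketch in plain shell using `sort -V`; the version strings are hard-coded for illustration, so substitute the output of `node --version` (and the FDK's reported version) from your own machine:

```shell
# Returns success if the installed version meets or exceeds the minimum.
# Uses sort -V (version sort); leading "v" prefixes are stripped first.
meets_minimum() {
  installed="${1#v}"; minimum="${2#v}"
  # the minimum must sort first (or be equal) for the check to pass
  [ "$(printf '%s\n%s\n' "$minimum" "$installed" | sort -V | head -n 1)" = "$minimum" ]
}

meets_minimum "v18.13.0" "v18.13.0" && echo "node ok"    # meets the Node minimum
meets_minimum "8.6.0" "9.0.2" || echo "fdk too old"      # below the FDK minimum
```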
- Open the Copilot interface by clicking the extension icon.
- You can also invoke Freddy from the Command Palette (Cmd + Shift + P) by searching for the relevant command.
- Use the rapid code generation workflow for your input queries.
- Use the Run App and Stop App options to manage the app lifecycle.
- Use Pack and Publish to upload the app to the application management portal (AMP).
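The lifecycle and packaging options above correspond to Freshworks CLI commands that can also be run from a terminal. A hedged sketch of that mapping follows; the `run_step` wrapper is illustrative scaffolding only, added so the walkthrough degrades gracefully when the FDK is not installed:

```shell
# Run App / Stop App correspond to starting and stopping `fdk run`,
# which serves the app locally for testing.
run_step() {
  # guard: skip with a message when fdk is not on PATH
  if command -v fdk >/dev/null 2>&1; then
    fdk "$@"
  else
    echo "skipped: fdk $* (FDK not installed)"
  fi
}

run_step validate   # static checks on the app source before packaging
run_step pack       # builds the dist/ zip that Pack and Publish uploads
```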
- To get started, make sure you go through the necessary prerequisites and understand them thoroughly.
- Create a developer account. Within the developer account, sign up for a free trial with a Freshworks product of your choice.
- Install the Freshworks CLI (FDK).
- Configure Freddy Copilot, and start building Freshworks applications.
- Install the extension from the VS Code Marketplace.
- Navigate to the extension settings.
- Configure the setting values for "FDK Path", "API Key", "User Name", and "Context Settings".
The extension supports the following configuration keys:
- Freddy: Fdk Path - Path to the directory where the FDK is installed
- Freddy: Freddy Api Key - API key to access Freddy Copilot for Developers
- Freddy: User Name - User name to be displayed against input prompts in the conversational interface
- Freddy: Context Settings - Number of previous responses in the conversation that are remembered and used as context when framing a new answer. A larger context yields more accurate answers.
Note: Context Settings
- Enabling this feature consumes more tokens, roughly corresponding to the number of messages in the conversation (about 2,800 tokens).
- The default value is 5, and it can be raised to a maximum of 10.
- Increasing this value enhances accuracy, but it significantly raises token consumption per query.
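The configuration keys above can also be set directly in VS Code's `settings.json`. The sketch below is hypothetical: the setting IDs and values are illustrative assumptions, so copy the real IDs from the extension's settings UI rather than these:

```json
// User settings.json (VS Code permits comments in this file).
// Setting IDs and values below are illustrative placeholders.
{
  "freddy.fdkPath": "/usr/local/bin/fdk",
  "freddy.freddyApiKey": "<your API key>",
  "freddy.userName": "Jane",
  "freddy.contextSettings": 5
}
```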
Refer to the detailed instructions available here.