Insert my custom information to reply to my customers #31
Comments
See how to create a custom model? Also, consider checking this issue |
Thank you, my dear friend. I did what the doc said; I just don't understand how to say that I'm using Gemini, where do I configure it? After making all the changes like you said, when I try to send a message using my custom model, I receive this message in my prompt: It seems like it's trying to use OpenAI instead of Gemini.. /* Models config files */ const config: Config = { export default config; |
Even using Gemini AI, do I need to have an OpenAI account too? Or is it possible to make it work with just Gemini? |
Actually, my custom model is not working with my custom prefix "!get" Custom: [ If I start a chat using my custom prefix !get, I receive the message about the AI key because it tries to use OpenAI, which is the default: It happens even if I configure defaultModel: 'Gemini' like above... And I'd like to know how to keep my custom model enabled all the time, even if I don't start a chat using the prefix. Is that possible? Thank you so much! |
Done! Working as expected!! Thank you so much! Something like: "Do you need something more?", just so it seems like a human is there, something more natural.. |
How do I keep using my custom model without using a prefix? Custom: [ |
Good to hear. Star this repo and help us reach 128 stars.
It is possible to send a message if the system remains idle for N seconds, but I'm not going to add this to the master branch; you can implement it on your side. Try this in |
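(For reference, a minimal sketch of that idle-reply idea in TypeScript, assuming whatsapp-web.js and that it is called from the bot's message handler. `scheduleIdleReply`, `IDLE_SECONDS`, and the follow-up text are placeholder names and values, not part of this repo.)

```ts
// Sketch only: send a follow-up if a chat stays idle for N seconds.
import { Message } from 'whatsapp-web.js';

const IDLE_SECONDS = 60; // assumed idle window
const idleTimers = new Map<string, NodeJS.Timeout>();

export function scheduleIdleReply(message: Message): void {
  const chatId = message.from;

  // Reset any timer already running for this chat.
  const existing = idleTimers.get(chatId);
  if (existing) clearTimeout(existing);

  idleTimers.set(
    chatId,
    setTimeout(() => {
      idleTimers.delete(chatId);
      // Follow-up text mentioned earlier in this thread.
      message.reply('Do you need something more?').catch(console.error);
    }, IDLE_SECONDS * 1000)
  );
}
```

Calling `scheduleIdleReply(message)` at the end of the existing message handler would reset the timer on every incoming message, so the follow-up only fires after a quiet period.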
Let me try! Thanks! |
Currently, it is not supported. Maybe I'll add it later; keep this issue open. Appendix: [How to send whatsapp message in whatsapp-web.js](https://stackoverflow.com/questions/65157125/how-to-send-whatsapp-message-in-whatsapp-web-js) |
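(For anyone following along, sending a message with whatsapp-web.js roughly looks like the sketch below; the phone number is a placeholder. A malformed id is one common cause of the "invalid wid" error that comes up later in this thread.)

```ts
// Minimal whatsapp-web.js send example (placeholder number).
import { Client, LocalAuth } from 'whatsapp-web.js';

const client = new Client({ authStrategy: new LocalAuth() });

client.on('ready', async () => {
  // Individual chat ids use the form '<country code><number>@c.us'.
  await client.sendMessage('5511999999999@c.us', 'Hello from the bot!');
});

client.initialize();
```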
Please do it! Because when we use a commercial WhatsApp number for this purpose, we will always have people asking questions about the company, so it makes sense. While you haven't implemented this possibility, can I replace the content of all received messages with something like: Is that possible? Where can I do it? |
Trying to do what you said: private async onMessage(message: Message) {
When I start a chat, after receiving the first message I get this error: Error: Evaluation failed: Error: wid error: invalid wid Node.js v21.0.0 **Something about chatId: |
DONE! Please keep one thing in mind: if you're making money out of this bot, then you should also support this project, since it is open source. |
Yep! For sure! I'm just trying to check the possibilities when we think about a real scenario. We already have people doing that to make money, like the link I sent you above. The last question: hahaha |
I didn't understand what you meant by connecting the custom model to a DB. If you are asking about loading context from a database, then yes, it is possible by adding a URL to your content. In this case it would be, |
|
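(To illustrate that, a custom model entry could point its `context` at a remote text URL instead of a local file. This follows the config shape shown later in this thread; the URL is a placeholder.)

```ts
Custom: [
  {
    modelName: 'whatsapp-respostas',
    prefix: '!bot',
    enable: true,
    modelToUse: 'Gemini',
    // Context fetched from a text URL instead of a local .txt file (placeholder URL).
    context: 'https://example.com/whatsapp-respostas.txt'
  }
]
```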
I just saw your profile. Are you the CEO of the company? |
Yep! But my company has just one employee, me! It's not a big company! |
How do I use a custom model without a prefix? I'm trying to do it this way, but it kills the application: Custom: [ |
:-) Let me know if you need my help. I'm a student doing open-source work on GitHub |
First of all, install the latest code and set - don't use an empty space as the prefix |
Let me check! |
Still not working.. I can send messages without a prefix, but it doesn't work; it isn't finding the answers in my custom model. Custom: [ |
Wait, let me check |
/* Models config files */
import { Config } from './types/Config';
const config: Config = {
chatGPTModel: "gpt-3.5-turbo", // learn more about GPT models https://platform.openai.com/docs/models
models: {
ChatGPT: {
prefix: '!chatgpt', // Prefix for the ChatGPT model
enable: true // Whether the ChatGPT model is enabled or not
},
DALLE: {
prefix: '!dalle', // Prefix for the DALLE model
enable: true // Whether the DALLE model is enabled or not
},
StableDiffusion: {
prefix: '!stable', // Prefix for the StableDiffusion model
enable: true // Whether the StableDiffusion model is enabled or not
},
GeminiVision: {
prefix: '!gemini-vision', // Prefix for the GeminiVision model
enable: true // Whether the GeminiVision model is enabled or not
},
Gemini: {
prefix: '!gemini', // Prefix for the Gemini model
enable: true // Whether the Gemini model is enabled or not
},
Custom: [
{
/** Custom Model */
modelName: 'whatsapp-respostas', // Name of the custom model
prefix: '!bot', // Prefix for the custom model
enable: true, // Whether the custom model is enabled or not
/**
* context: "file-path (.txt, .text, .md)",
* context: "text url",
* context: "text"
*/
modelToUse: 'Gemini',
context: './static/whatsapp-respostas.txt', // Context for the custom model
}
]
},
enablePrefix: {
/** if enable, reply to those messages start with prefix */
enable: false, // Whether prefix messages are enabled or not
/** default model to use if message not starts with prefix and enable is false */
defaultModel: 'Gemini' // Default model to use if no prefix is present in the message
}
};
export default config; |
Fixed, my dear friend! |
Did you see anything about this error? I tried to work with your example, but no luck... |
Welcome. If you need any kind of help, let me know |
I think this code will not work; remove it for now. I'll add this feature maybe after a few days; I'm still busy with my university FYP (Final Year Project) |
I don't know if it is possible for you, but I'm thinking we could have a meeting to talk about some possibilities together. |
It's okay! Take your time, my friend! |
Talking about what? |
We'd build a tool to be integrated into one of my web applications; it's about a job. |
Hello!
I'm trying to check whether it is possible to make your project like this one:
https://www.youtube.com/watch?v=Sh94c6yn5aQ
It seems like you have almost everything done; I'm just trying to insert my custom information like this guy did, and then interact with my WhatsApp Business. Is it possible?
I've created my .env file and also connected with my WhatsApp through the generated QR code, so in my terminal I have this message:
QR has been generated! | Scan QR Code with you're mobile.
✔ User Authenticated!
✔ Client is ready | All set!
After that, I don't know what else I can do to reach the level shown above.
thank you so much!