About

Overview / What it is

Bro AI is a tool that brings frontier Large Language Models (LLMs) right inside Maya. Use natural language to control Maya, edit scenes, perform tasks, and create new tools.

AI Providers

Bro AI supports multiple AI providers. This includes, but is not limited to:

Cloud

  • OpenAI (ChatGPT)
  • Anthropic (Claude)
  • OpenRouter

Local

  • Ollama
  • Ooba TextGeneration WebUI
  • llama.cpp web server

And more! BroAI supports any AI provider with an OpenAI-compatible API.
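As a rough illustration of what "OpenAI-compatible" means (a sketch only, not how BroAI connects internally), the same chat-completions request works against any such provider by changing just the base URL and key. The openai Python package, model names, and local address below are assumptions for the example:

    # Illustrative sketch: different providers, same request shape.
    # The `openai` package, model names, and URLs here are example assumptions,
    # not a description of BroAI's internals.
    from openai import OpenAI

    # Cloud provider: default base URL (api.openai.com), real API key required.
    cloud = OpenAI(api_key="sk-...")

    # Local provider (e.g. Ollama): only the base URL and a dummy key change.
    local = OpenAI(base_url="http://127.0.0.1:11434/v1", api_key="none")

    response = local.chat.completions.create(
        model="llama3",  # hypothetical local model name
        messages=[{"role": "user", "content": "Write Maya Python that lists all cameras."}],
    )
    print(response.choices[0].message.content)

Because the request shape is identical, switching providers is mostly a matter of pointing at a different server.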

Usage

  1. Enter your prompt
  2. Click one of the buttons below and wait for the AI to respond
  3. Check the code
  4. Run the code
  5. Save the code

It's really that simple and straightforward.

Generate Code

This button generates code, which shows up in the code editor inside the BroAI window. The AI is made aware of scene contents and other contextual information.
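For example, a prompt like "create five cubes in a row and group them" might produce Maya Python along these lines. This is a hedged sketch of typical output; the exact code the AI returns will vary:

    # Example of the kind of code such a prompt might generate (illustrative only).
    import maya.cmds as cmds

    cubes = []
    for i in range(5):
        transform, shape = cmds.polyCube(name="row_cube_%d" % (i + 1))
        cmds.move(i * 3, 0, 0, transform)  # space the cubes 3 units apart on X
        cubes.append(transform)

    cmds.group(cubes, name="cube_row_grp")

Generated code appears in the editor exactly like this, ready for you to review before running it.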

Run Generated Code

This button runs the code currently in the BroAI window.

Save Generated Code

This button saves the code into the "custom_scripts" folder of BroTools. It will then be available from the BroTools - Extras - Custom Scripts menu. You will need to use BroTools - Extras - Custom Scripts - Reload (To ADD new scripts) for new scripts to show up in that list.

You can also create a shelf button from the generated code by selecting it and dragging it onto the shelf with the Left Mouse Button (unlike the Middle Mouse Button drag used by Maya's built-in Script Editor).

Cloud use

To use cloud providers you will need to get a token or API key. You can set or change the API key for the currently selected provider by clicking the Edit Token button. If there is no token for the current provider, BroAI will ask for one when you try to run generation for the first time.

For OpenAI you can get your API key from their OpenAI Platform: https://platform.openai.com/api-keys

For Anthropic you can get your API key from the Anthropic Console: https://console.anthropic.com/settings/keys

Local use

One of the main features I wanted to add to BroAI is the ability to use local models like LLaMA, Qwen, Mistral and so on. This means that all data processing happens on your machine, either your own PC or your own server, without going to the cloud. This can be crucial for data privacy and protection. Local models may also be better suited for certain tasks, and with the appropriate hardware they can often run even faster than cloud alternatives. And local models are only going to get better and faster.

To connect to local models you will need to do 2 main things:

  1. Set up BroAI to connect to your local AI server
  2. Set up and run local AI

1. Set up BroAI to connect to your local AI server

This is an easy task.

  1. Select OpenAI from the providers dropdown
  2. Enable the Custom server checkbox
  3. Enter the IP and port of your server, for example http://127.0.0.1:5000 for Ooba TextGen WebUI (its default) or http://127.0.0.1:11434 for Ollama (its default)
  4. If it asks you for an API key/token, you can enter pretty much anything there (for example, none), or just leave your OpenAI token there if you used it before

That's it for the BroAI side!
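If you want to sanity-check the local server independently of BroAI, a quick request against its OpenAI-compatible chat endpoint will confirm it is reachable. This is a rough sketch using only the Python standard library; the address and model name are assumptions and should match your own setup:

    # Quick connectivity check for a local OpenAI-compatible server (sketch only).
    # Adjust the URL and model name to match your setup; "llama3" is just an example.
    import json
    import urllib.request

    url = "http://127.0.0.1:11434/v1/chat/completions"  # Ollama's default port
    payload = {
        "model": "llama3",
        "messages": [{"role": "user", "content": "Say hello in one word."}],
    }
    request = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer none",  # local servers usually ignore the key
        },
    )
    with urllib.request.urlopen(request) as response:
        reply = json.loads(response.read())

    print(reply["choices"][0]["message"]["content"])

If this prints a response, BroAI should be able to reach the same server with the settings from the steps above.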

2. Set up Local AI

To set up local AI you will need to install one of the local backends listed above, for example:

  • Ollama
  • Ooba TextGeneration WebUI
  • llama.cpp web server

For TextGeneration WebUI you will need to enable the API by running it with the --api flag.

You can also run your AI server on any other PC (server) that you have access to. In that case you will need to change the IP, and if it's located outside your network, expose it to the internet or set up a VPN. But this is out of the scope of this documentation, as it's nothing specific to BroTools, BroAI, or even LLMs in general; it's just general networking.

Disclaimer

BroTools, including BroAI, is provided as is, without any guarantees or warranty. Use of BroTools is at your own risk. The author of BroTools is not liable for any damages, including but not limited to:

  • loss of data
  • loss of profit
  • loss of reputation
  • any other possible damages
  • any damages caused by the generated code

The current AI technology (Large Language Models, LLMs) used by BroAI is not perfect and may generate code that is incorrect, non-functional, suboptimal, or does not do what you expect it to do. It is your responsibility to review and test the generated code before using it in production.

This is true of ANY AI tool that relies on Large Language Models. Due to the nature of the technology they can hallucinate with confidence. They don't know what they don't know, and will happily supply you with wrong information or broken code.

That being said, AI code generation can be a very powerful tool that can greatly speed up your workflow and provide good results most of the time.