
Releases: DeabLabs/cannoli

1.7.4

10 May 16:14

Dataview integration currently broken, removing for now

1.7.2

10 May 03:46

Remove logging

1.7.1

10 May 03:40

Fix dataview replacement issue (modifiers still broken)

1.7

10 May 02:00

Dataview integration and more AI providers

Be sure to delete and reload the Cannoli College folder from settings to see the new example canvases. Check out sections 2.2 and 4.6 for examples of the new features.

Cannoli now renders Dataview DQL queries so the LLM can see them!

If you have the Dataview plugin enabled on a vault and there's a valid query inside a node or embedded note, the LLM will see the results of that query as a Markdown list or table.
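As a sketch (this query and the "Projects" folder are hypothetical examples, not from the release), a node or embedded note might contain a DQL block like:

```dataview
TABLE status, file.mtime AS "Modified"
FROM "Projects"
WHERE status != "done"
SORT file.mtime DESC
```

Cannoli replaces the query with its rendered results, so the LLM sees the resulting Markdown table rather than the raw DQL.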

DataviewJS queries are not implemented yet, but we're working to get those in as well. (P.S. if you're reading this and have arcane Dataview knowledge that might help, please reach out on the repo.)

Groq, Gemini, and Anthropic

New LLM providers are now available. In addition to OpenAI and Ollama, you can use these new providers as the default or select a provider at the node level with config arrows.

You can edit a provider's settings from the settings page by selecting the provider you'd like to edit in the "AI Provider" dropdown.

Thank you!

Momentum has returned and we have more features coming soon. Please reach out to us on the repo with feature ideas and suggestions!

We'd also love to see any of the delicious cannolis you've made so far.

1.6.1

21 Apr 22:28

Ollama

Cannoli now has support for running local LLMs with Ollama!

To switch to local LLMs, change the "AI provider" top-level setting to Ollama, and make sure the Ollama URL reflects your setup (the default usually works).

You'll also need to set the OLLAMA_ORIGINS environment variable to "*" so that requests from the Obsidian desktop app can reach the Ollama server. See the Ollama documentation for how to configure this environment variable on each operating system; for example, on macOS you would run launchctl setenv OLLAMA_ORIGINS "*" in your terminal and restart Ollama.
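As a rough per-OS sketch (these follow Ollama's documented environment-variable mechanisms; restart Ollama after setting the variable):

```shell
# macOS: set the variable for launchd, then restart Ollama
launchctl setenv OLLAMA_ORIGINS "*"

# Linux (systemd): add an override for the Ollama service, then restart it
#   sudo systemctl edit ollama.service
#     [Service]
#     Environment="OLLAMA_ORIGINS=*"
#   sudo systemctl restart ollama

# Windows (PowerShell): set a persistent user-level variable, then restart Ollama
#   [Environment]::SetEnvironmentVariable("OLLAMA_ORIGINS", "*", "User")
```

Setting the variable to "*" allows requests from any origin, which is what lets Obsidian's app:// origin through; scope it more narrowly if that's a concern in your setup.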

You can change the default model in the settings and define the model per-node in cannolis themselves using config arrows as usual. Note that a model has to load every time it changes, so a cannoli that uses several models will take longer to run.

Function calling is not implemented yet, so Choice arrows and Field arrows currently don't work with Ollama.

All OpenAI chat models available

All OpenAI chat models are now available in Cannoli, as long as you give the correct model name string in the settings or in a config arrow. Not all models have correct price numbers yet, but you no longer have to wait on us to update the list to use new OpenAI models.

1.5.10

08 Dec 16:35

Add gpt-3.5-turbo back into model list

1.5.9

07 Dec 03:15

Sorry for the delay on these (embarrassingly simple) fixes. I'm excited to get back in the swing of things! Next big thing I'm working on is enabling the image generation models in call nodes.

1.5.5

17 Oct 02:50
  • Added SELECTION variable (text highlighted with cursor)

    • This variable can be used the same way as the NOTE variable
    • Can be used to write to selection
    • More details in Cannoli College section 4.5
  • Toggle swapping colors for content and call nodes in settings (under Canvas preferences)

    • Colorless nodes seem to be a better fit for content nodes than purple (colorless is the default; there tend to be more content nodes than call nodes in a cannoli, and LLM nodes should feel more intentional)
    • Give it a try and see how that schema feels, and let me know what you think!
    • There's no time like the present for an ugly breaking change
  • Miscellaneous bug fixes

1.5.4

05 Oct 15:48
  • Add plimit setting
  • Add chatConverter message limiting
  • Fix bugs with settings pane
  • Update Cannoli College
  • Check for new cannoli command on file creation

1.5.3

05 Oct 03:32
  • Enable editing note properties using the ":propertyName" format in arrows
  • Add the loop number variable "{{#}}", which renders as the current iteration of the loop group the node is in. Multiple pound signs, like "{{##}}", correspond to higher-level loop groups in the case of nested loops
  • Update Cannoli College
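
To illustrate the loop number variable (the node text here is a made-up example, not from the release), a node inside a loop group that runs three times might read:

```
Attempt {{#}} of 3: revise the draft below.
```

If that loop group were nested inside another, "{{##}}" in the same node would render the iteration of the enclosing loop group.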