How to recover from “Error processing request (java.nio.BufferUnderflowException)” after importing an .nt file?
After many successful imports of mixed RDF triple formats (ttl, nt, etc.), the import chokes on one file, entering what seems like a very long (possibly infinite) loop.
When trying to connect to the triplestore instance, I see the following message:
“Error processing request (java.nio.BufferUnderflowException).”
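Since the import hangs rather than reporting which line is bad, one option is a rough pre-flight check on the file before handing it to GraphDB. Below is a minimal stdlib-only sketch; the check itself is a simplification (a real validator such as Apache Jena's riot would parse the full N-Triples grammar), and any file name used with it is hypothetical:

```python
def find_bad_lines(lines):
    """Return (line_number, line) pairs that fail a rough N-Triples check.

    Every non-blank, non-comment line in an N-Triples file must end
    with a terminating '.', so a line that doesn't is a likely sign of
    truncation or corruption. This is only a coarse screen, not a parser.
    """
    bad = []
    for no, line in enumerate(lines, start=1):
        stripped = line.strip()
        if not stripped or stripped.startswith("#"):
            continue  # blank lines and comments are fine
        if not stripped.endswith("."):
            bad.append((no, stripped))
    return bad


# Usage (file name is hypothetical):
# with open("data.nt", encoding="utf-8") as f:
#     for no, line in find_bad_lines(f):
#         print(f"line {no}: {line[:80]}")
```

Running this over each candidate file before import can at least isolate which one triggers the exception.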
GraphDB Talk to Your Graph feature not supporting OpenSearch VectorDB
I am trying to set up GraphDB’s Talk to Your Graph feature using OpenSearch as a vector database.
It seems the ChatGPT retrieval plugin doesn’t support OpenSearch integration as of now. The error occurring is as below:
Slow connection to GraphDB from a third-party application
I’m trying to test if GraphDB can be used along with the NLP annotation tool called INCEpTION.
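To separate GraphDB’s own latency from the third-party tool’s overhead, it can help to time a raw SPARQL request sent straight to the repository endpoint. A minimal stdlib-only sketch — the endpoint URL and repository name here are assumptions, not taken from the question:

```python
import time
import urllib.parse
import urllib.request


def build_query_url(endpoint, query):
    """Encode a SPARQL SELECT query as a GET request URL."""
    return endpoint + "?" + urllib.parse.urlencode({"query": query})


def timed_select(endpoint, query, timeout=30):
    """Run the query and return (elapsed_seconds, raw response bytes)."""
    req = urllib.request.Request(
        build_query_url(endpoint, query),
        headers={"Accept": "application/sparql-results+json"},
    )
    start = time.perf_counter()
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        body = resp.read()
    return time.perf_counter() - start, body


# Usage (endpoint and repository name are hypothetical):
# elapsed, body = timed_select(
#     "http://localhost:7200/repositories/test",
#     "SELECT * WHERE { ?s ?p ?o } LIMIT 10",
# )
# print(f"{elapsed:.3f}s, {len(body)} bytes")
```

If the direct query is fast but the annotation tool is slow, the bottleneck is likely in how that tool issues or batches its requests rather than in GraphDB itself.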
Can’t get Azure GPT model to work for TTYG ChatGPT Retrieval Plugin
I can’t get my Azure OpenAI deployment configured correctly, although everything works for the Talk to Your Graph (TTYG) feature with my regular OpenAI API key. I can also confirm that my Azure OpenAI key/config works with other tools.
GraphDB File Import UI Bug
When I try to import an RDF file through the import user data tab in the UI, I hit a bug and am unable to select any upload option (see screenshot). It worked in Firefox at one point.
JDBC connector for docker compose
I’ve been trying to set up the MySQL JDBC connector following the instructions I found here, but I’m afraid I’m still having issues with my setup.
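For reference, one common way such a setup is wired is shown in the bare-bones docker-compose sketch below. Everything in it — image tags, ports, the driver file name, and the mount path — is an assumption rather than the instructions the question refers to; check the GraphDB documentation for the correct driver location in your version:

```yaml
# Hypothetical layout: a GraphDB service alongside MySQL, with the
# MySQL JDBC driver jar mounted into the GraphDB container so the
# connector can load it. All paths and tags are assumptions.
services:
  graphdb:
    image: ontotext/graphdb:10.6.2
    ports:
      - "7200:7200"
    volumes:
      - graphdb-data:/opt/graphdb/home
      # Driver jar made visible to GraphDB (mount path is an assumption)
      - ./drivers/mysql-connector-j-8.3.0.jar:/opt/graphdb/dist/lib/mysql-connector-j-8.3.0.jar
  mysql:
    image: mysql:8.3
    environment:
      MYSQL_ROOT_PASSWORD: example
volumes:
  graphdb-data:
```

Comparing a layout like this against the failing setup (driver visible inside the container? services on the same compose network?) often narrows down where the issue lies.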
Can I augment GraphDB Entity Linking with my own GraphDB content?
I have got this working. However, is there any way to augment the reference data (Wikidata, DBpedia) with my own local GraphDB content? If I wanted to recognise my own entities, could I add my database to the knowledge base?