Community
Search works differently with email addresses in /search and /admin/items - is there a setting where we can change this?
These work the same:
/search?term="[email protected]" -> 1 result
/admin/items?term="[email protected]" -> 1 result
These work differently:
/search?term=[email protected] -> 1 result
/admin/items?term=[email protected] -> 999 results (shows everybody in the company with "domain.com" anywhere in the item)
It seems /admin/items does not treat [email protected] as one (1) string but as many strings, split at the @ sign.
Is there a setting we (or you) can adjust so they work the same?
ChatGPT suggests
> To handle email addresses as single terms, you can configure a custom analyzer in your Elasticsearch index settings. This custom analyzer can use the keyword tokenizer, which treats the entire string as a single token. Here's how you can define and use a custom analyzer in your Elasticsearch index:
```json JSON
{
  "settings": {
    "analysis": {
      "analyzer": {
        "email_analyzer": {
          "type": "custom",
          "tokenizer": "keyword",
          "filter": ["lowercase"]
        }
      }
    }
  },
  "mappings": {
    "properties": {
      "email": {
        "type": "text",
        "analyzer": "email_analyzer"
      }
    }
  }
}
```
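To see why the unquoted term behaves so differently, here is a rough illustration (not Onify's actual analyzer, and `user@domain.com` is a made-up address) of what a standard-style tokenizer does to an email compared to a keyword tokenizer:

```javascript
// Rough sketch only: Elasticsearch's standard analyzer splits text on
// non-alphanumeric characters, so an email becomes several tokens that each
// match independently. A keyword tokenizer keeps the whole string as one token.
function standardTokenize(text) {
  return text.toLowerCase().split(/[^a-z0-9]+/).filter(Boolean);
}

function keywordTokenize(text) {
  return [text.toLowerCase()];
}

console.log(standardTokenize('user@domain.com')); // [ 'user', 'domain', 'com' ]
console.log(keywordTokenize('user@domain.com')); // [ 'user@domain.com' ]
```

If something like the first behavior applies to /admin/items, that would explain the 999 hits: every item containing the token `domain` (or `com`) matches.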
Posted by Dennis about 1 month ago
Is it possible to use the same time zone in data and result tables?
Dates are stored in ISO:
![](https://files.readme.io/2dd6d7b-image.png)
and the search query in Onify admin uses this when searching.
However, the result table that shows the results seems to show them in my local time (+2 hours), which can produce strange results when searching with dates. 😁
![](https://files.readme.io/0ab6b68-image.png)
Is there a way to get them to use the same time zone? If not, consider this a feature request to use one consistent way. Or even better - a toggle/setting which let me decide which format to use everywhere. 😎
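For reference, a quick sketch of the mismatch (the timestamp is made up): the stored ISO value is UTC, while JavaScript's locale formatting in a UTC+2 browser renders it two hours later:

```javascript
// The stored/search value is an ISO-8601 UTC timestamp; the result table
// apparently formats it in the browser's local time zone instead.
const iso = '2024-06-01T10:00:00.000Z'; // hypothetical stored value (UTC)
const when = new Date(iso);

console.log(when.toISOString()); // what search compares against (UTC)
console.log(when.toLocaleString()); // what the table shows; +2 h in a CEST browser
```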
Posted by Dennis about 2 months ago
How can I upload for example a jpg-file with the api?
Creating a file from a base64-encoded dummy string works perfectly:
![](https://files.readme.io/980e50a-image.png)
However, for the endpoint **upload** I get "Path is required". I have tried /test, /test/test.txt etc. but get the same error.
![](https://files.readme.io/a46214e-image.png)
I'm not even sure what to put in the field "content" yet; I'm just trying to get an OK on the path first.
Ultimately I would like to upload for example a jpg-file. How would one accomplish that?
Swagger just says
![](https://files.readme.io/7e8d3c1-image.png)
and doesn't really advise or give an example of how to use it, or explain the difference between creating and uploading. :)
Posted by Dennis about 2 months ago
How can I call onifyElevatedApiRequest by javascript?
I am trying to replicate this Connector call with onifyElevatedApiRequest in JavaScript.
![](https://files.readme.io/08d8b63-image.png)
I assumed this was gonna work:
```javascript
const forfragan = { // "förfrågan" = Swedish for "request"
  url: '/my/config/settings',
  query: { tag: 'azure' },
  role: 'flow'
};
try {
  let result = await environment.services.onifyElevatedApiRequest(forfragan);
} catch (err) {
  // ...
}
```
but I get "Request failed: Error: Missing authentication (401)".
Also, what is "role"? It is not in the list of arguments at <https://support.onify.co/docs/workflows-2>
Maybe the Connector automatically includes some header + authorization that is needed to be added manually in js?
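If that guess is right, one workaround (purely speculative, not from Onify docs) might be to pass the authorization along explicitly. A sketch of building such a request object:

```javascript
// Speculative: merge an explicit Authorization header into the request,
// in case the script call doesn't get the header the Connector injects.
function withAuth(request, token) {
  return {
    ...request,
    headers: { ...(request.headers || {}), Authorization: `Bearer ${token}` },
  };
}

const forfragan = { url: '/my/config/settings', query: { tag: 'azure' } };
console.log(withAuth(forfragan, 'TOKEN').headers.Authorization); // Bearer TOKEN
```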
Posted by Dennis 2 months ago
How can I use a script in flow to call the httpRequest service?
I have used the service (connector) `httpRequest` in a service task. Is it also possible to call the function from a script (task)?
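For what it's worth, a guess at the calling pattern, mirroring how onifyElevatedApiRequest is invoked from scripts elsewhere in this thread; whether `httpRequest` is exposed on `environment.services` the same way is an assumption. The mock below exists only so the shape is runnable outside Onify:

```javascript
// Mocked environment for illustration; in a real script task `environment`
// would be provided by the flow engine, not defined by hand.
const environment = {
  services: {
    httpRequest: async ({ url, method = 'GET' }) => ({ statusCode: 200, url, method }),
  },
};

async function callFromScript() {
  // Assumed pattern: same service-call style as onifyElevatedApiRequest.
  const res = await environment.services.httpRequest({ url: '/ping' });
  return res.statusCode;
}

callFromScript().then((code) => console.log(code)); // 200
```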
Posted by Robert Lundsten 2 months ago
Possibility to change the 5 minute limit of "short running synchronous (non-persistent) workflows"?
We have a subflow, consisting of a few tasks, that takes anywhere from 1 to approximately 9 minutes to finish.
When calling any subflow with /api/v2/my/workflows/run/{key}, it is limited to a runtime of 5 minutes before it fails, as per <https://support.onify.co/reference/postmyworkflowsrunkey>
This seems to be a hardcoded value, probably set to be enough for most such tasks. But this limit will in our case force us to move the logic from the subflow into our main flow to avoid timeouts, which is unfortunate in terms of maintainability etc.
Is there any possibility to change this limit to another value within our own Onify instance?
In return I can provide you with an enhanced and more reliable version of this blueprint. ;)
<https://github.com/onify/blueprint-cloudblue-commerce-change-subscription>
Posted by Dennis 3 months ago
Blueprint for data mapping in Onify?
"Data mapping is the process of matching fields from one database to another. It's the first step to facilitate data migration, data integration, and other data management tasks."
We need to start some data mapping processes between Onify and other systems. Is there such a blueprint for Onify's standard tables and fields that we can start from? For custom items etc. we will of course still have to do it from scratch.
Btw, here is a free template for this in Miro:
<https://miro.com/templates/entity-relationship-diagram/>
![](https://files.readme.io/8996b86-image.png)
Posted by Dennis 3 months ago
Feature request: Implement retry cycles for service tasks (failedJobRetryTimeCycle)
It would be great if we could use the Camunda "retry cycle" feature in Onify to retry a task that sometimes fails, a specified number of times with a specified time apart. Right now we need to build this logic ourselves around tasks that need such a "safety net", so that some intermittent downtime or an unreliable service etc. doesn't abort the whole flow. 😊
![](https://files.readme.io/5cf3f89-image.png)
```xml
<bpmn:serviceTask id="sendEmailTask" name="Send Email Task"
  camunda:asyncBefore="true"
  camunda:failedJobRetryTimeCycle="R3/PT30S">
</bpmn:serviceTask>
```
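Until something like failedJobRetryTimeCycle exists, the hand-rolled "safety net" mentioned above tends to look roughly like this (illustrative sketch; R3/PT30S = up to 3 retries, 30 s apart):

```javascript
// Retry a flaky async task up to `retries` times, waiting `delayMs` between
// attempts -- a manual equivalent of Camunda's R3/PT30S retry cycle.
async function withRetries(task, retries = 3, delayMs = 30000) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await task();
    } catch (err) {
      if (attempt >= retries) throw err; // retries exhausted: fail the flow
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}
```

e.g. `await withRetries(() => sendEmail(), 3, 30000)` wrapped around the unreliable service call.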
Posted by Dennis 3 months ago