llm-providers need proxy agent support


I don’t know much about VS Code extension development, but here is a temporary workaround for the desktop (Node) runtime that I have tested:

    import { fetch, ProxyAgent } from 'undici'

    fetch(apiEndpoint, {
        method: 'POST',
        body: JSON.stringify({ contents: messages }),
        headers: {
            'Content-Type': 'application/json',
        },
        // ++++ added line: route the request through the proxy given by the https_proxy env var
        dispatcher: new ProxyAgent({ uri: new URL(process.env.https_proxy!).toString() }),
        signal,
    })

Then set the https_proxy environment variable in whatever environment VS Code is launched from, so the extension’s Node process inherits it.
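
For reference, a minimal sketch of the same idea made a bit safer: it only creates a ProxyAgent when https_proxy (or HTTPS_PROXY) is actually set, so the call still works when no proxy is configured. The postToProvider wrapper and the HTTPS_PROXY fallback are my own assumptions, not part of the original snippet:

    import { fetch, ProxyAgent, Dispatcher } from 'undici'

    // Create a proxy dispatcher only when a proxy URL is configured;
    // with undefined, undici falls back to its global dispatcher.
    function proxyDispatcher(): Dispatcher | undefined {
        const proxyUrl = process.env.https_proxy ?? process.env.HTTPS_PROXY
        return proxyUrl ? new ProxyAgent(proxyUrl) : undefined
    }

    // Hypothetical wrapper mirroring the request shown above.
    async function postToProvider(apiEndpoint: string, messages: unknown, signal?: AbortSignal) {
        return fetch(apiEndpoint, {
            method: 'POST',
            body: JSON.stringify({ contents: messages }),
            headers: { 'Content-Type': 'application/json' },
            dispatcher: proxyDispatcher(),
            signal,
        })
    }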