<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"><channel><category>pcsoft.br.windev</category><copyright>Copyright 2026, PC SOFT</copyright><lastBuildDate>29 Jan 2026 12:24:08 Z</lastBuildDate><pubDate>29 Jan 2026 11:35:42 Z</pubDate><description>&lt;# &#13;
Phoenix AI - Windows 11 Setup (Ollama + LLaMA + RAG + LangGraph)&#13;
- Installs the Python dependencies&#13;
- Creates a venv&#13;
- Pulls the models with Ollama&#13;
- Creates the kb/ structure&#13;
- Generates the index_kb.py and rag_graph.py scripts&#13;
- Runs the indexer and starts the RAG chat&#13;
&#13;
How to use:&#13;
1) Install manually: Python 3.11+ (check "Add to PATH") and Ollama for Windows.&#13;
2) Open PowerShell as a regular user.&#13;
3) Run:  .\setup_phoenix_ai.ps1&#13;
&#13;
Note: if you get an ExecutionPolicy error:&#13;
Set-ExecutionPolicy -Scope CurrentUser RemoteSigned&#13;
#&gt;&#13;
&#13;
$ErrorActionPreference = "Stop"&#13;
&#13;
function Write-Step($msg) {&#13;
  Write-Host "`n==&gt; $msg" -ForegroundColor Cyan&#13;
}&#13;
&#13;
function Assert-Command($cmd, $hint) {&#13;
  if (-not (Get-Command $cmd -ErrorAction SilentlyContinue)) {&#13;
    Write-Host "`nERROR: '$cmd' not found in PATH." -ForegroundColor Red&#13;
    Write-Host "Hint: $hint" -ForegroundColor Yellow&#13;
    exit 1&#13;
  }&#13;
}&#13;
&#13;
Write-Step "Checking prerequisites (python, pip, ollama)"&#13;
Assert-Command "python" "Install Python 3.11+ (64-bit) and check 'Add Python to PATH'."&#13;
Assert-Command "pip" "pip ships with Python. Reinstall Python with the PATH option checked."&#13;
Assert-Command "ollama" "Install Ollama for Windows and reopen PowerShell."&#13;
&#13;
Write-Step "Showing versions"&#13;
python --version&#13;
pip --version&#13;
ollama --version&#13;
&#13;
# Project folder&#13;
$ProjectDir = Join-Path $PWD "phoenix_ai"&#13;
Write-Step "Creating project folder: $ProjectDir"&#13;
if (-not (Test-Path $ProjectDir)) {&#13;
  New-Item -ItemType Directory -Path $ProjectDir | Out-Null&#13;
}&#13;
Set-Location $ProjectDir&#13;
&#13;
# Create the venv&#13;
$VenvDir = Join-Path $ProjectDir ".venv"&#13;
Write-Step "Creating venv at: $VenvDir"&#13;
if (-not (Test-Path $VenvDir)) {&#13;
  python -m venv .venv&#13;
} else {&#13;
  Write-Host "Venv already exists. Keeping it." -ForegroundColor DarkGray&#13;
}&#13;
&#13;
# Activate the venv&#13;
Write-Step "Activating venv"&#13;
$ActivatePs1 = Join-Path $VenvDir "Scripts\Activate.ps1"&#13;
if (-not (Test-Path $ActivatePs1)) {&#13;
  Write-Host "ERROR: could not find $ActivatePs1" -ForegroundColor Red&#13;
  exit 1&#13;
}&#13;
. $ActivatePs1&#13;
&#13;
# Upgrade pip&#13;
Write-Step "Upgrading pip"&#13;
python -m pip install -U pip&#13;
&#13;
# Install the Python deps&#13;
Write-Step "Installing dependencies (langgraph, langchain, ollama, chroma)"&#13;
pip install -U langgraph langchain langchain-ollama langchain-chroma chromadb langchain-text-splitters&#13;
&#13;
# Pull the Ollama models&#13;
Write-Step "Pulling the Ollama models (llama3.1 + embeddings)"&#13;
ollama pull llama3.1&#13;
ollama pull mxbai-embed-large&#13;
&#13;
# KB structure&#13;
Write-Step "Creating the knowledge-base structure (kb/...)"&#13;
$KbDir = Join-Path $ProjectDir "kb"&#13;
$KbCommands = Join-Path $KbDir "commands"&#13;
$KbFunctions = Join-Path $KbDir "functions"&#13;
$KbCases = Join-Path $KbDir "cases"&#13;
&#13;
foreach ($d in @($KbDir, $KbCommands, $KbFunctions, $KbCases)) {&#13;
  if (-not (Test-Path $d)) { New-Item -ItemType Directory -Path $d | Out-Null }&#13;
}&#13;
&#13;
# Create a minimal example (so you can test right away)&#13;
$SampleMd = Join-Path $KbCommands "print.md"&#13;
if (-not (Test-Path $SampleMd)) {&#13;
@"&#13;
# print&#13;
&#13;
## What it does&#13;
Prints text to standard output.&#13;
&#13;
## Syntax&#13;
print(text)&#13;
&#13;
## Parameters&#13;
- text: string&#13;
&#13;
## Return&#13;
- void&#13;
&#13;
## Example&#13;
print("Hello, Phoenix!")&#13;
"@ | Set-Content -Path $SampleMd -Encoding utf8&#13;
}&#13;
&#13;
# Generate index_kb.py&#13;
Write-Step "Generating index_kb.py"&#13;
$IndexPy = Join-Path $ProjectDir "index_kb.py"&#13;
@"&#13;
import os&#13;
from pathlib import Path&#13;
&#13;
from langchain_text_splitters import RecursiveCharacterTextSplitter&#13;
from langchain_chroma import Chroma&#13;
from langchain_ollama import OllamaEmbeddings&#13;
&#13;
KB_DIR = Path("kb")&#13;
CHROMA_DIR = Path("chroma_db")&#13;
COLLECTION = "phoenix_kb"&#13;
&#13;
def load_markdown_files(root: Path):&#13;
    docs = []&#13;
    for p in root.rglob("*.md"):&#13;
        text = p.read_text(encoding="utf-8", errors="ignore")&#13;
        rel = p.relative_to(root).as_posix()&#13;
        docs.append({"text": text, "metadata": {"source": rel}})&#13;
    return docs&#13;
&#13;
def main():&#13;
    if not KB_DIR.exists():&#13;
        raise SystemExit("kb/ folder not found. Create kb/ and put .md files inside it.")&#13;
&#13;
    embeddings = OllamaEmbeddings(model="mxbai-embed-large")&#13;
&#13;
    vectorstore = Chroma(&#13;
        collection_name=COLLECTION,&#13;
        persist_directory=str(CHROMA_DIR),&#13;
        embedding_function=embeddings,&#13;
    )&#13;
&#13;
    raw_docs = load_markdown_files(KB_DIR)&#13;
&#13;
    splitter = RecursiveCharacterTextSplitter(&#13;
        chunk_size=1200,&#13;
        chunk_overlap=150,&#13;
    )&#13;
&#13;
    texts = []&#13;
    metadatas = []&#13;
    for d in raw_docs:&#13;
        chunks = splitter.split_text(d["text"])&#13;
        for i, c in enumerate(chunks):&#13;
            texts.append(c)&#13;
            metadatas.append({**d["metadata"], "chunk": i})&#13;
&#13;
    vectorstore.add_texts(texts=texts, metadatas=metadatas)&#13;
&#13;
    print(f"OK: indexed {len(texts)} chunks into {CHROMA_DIR}/ collection {COLLECTION}")&#13;
&#13;
if __name__ == "__main__":&#13;
    main()&#13;
"@ | Set-Content -Path $IndexPy -Encoding utf8&#13;
&#13;
# Generate rag_graph.py&#13;
Write-Step "Generating rag_graph.py"&#13;
$RagPy = Join-Path $ProjectDir "rag_graph.py"&#13;
@"&#13;
from typing import List&#13;
from typing_extensions import TypedDict&#13;
&#13;
from langgraph.graph import StateGraph, START, END&#13;
from langchain_chroma import Chroma&#13;
from langchain_ollama import ChatOllama, OllamaEmbeddings&#13;
&#13;
CHROMA_DIR = "chroma_db"&#13;
COLLECTION = "phoenix_kb"&#13;
&#13;
class State(TypedDict):&#13;
    question: str&#13;
    contexts: List[str]&#13;
    answer: str&#13;
&#13;
def retrieve(state: State) -&gt; dict:&#13;
    embeddings = OllamaEmbeddings(model="mxbai-embed-large")&#13;
    vs = Chroma(&#13;
        collection_name=COLLECTION,&#13;
        persist_directory=CHROMA_DIR,&#13;
        embedding_function=embeddings,&#13;
    )&#13;
    retriever = vs.as_retriever(search_kwargs={"k": 6})&#13;
    docs = retriever.invoke(state["question"])  # .invoke replaces the deprecated get_relevant_documents&#13;
    contexts = [d.page_content for d in docs]&#13;
    return {"contexts": contexts}&#13;
&#13;
def generate(state: State) -&gt; dict:&#13;
    llm = ChatOllama(model="llama3.1", temperature=0)&#13;
&#13;
    context_text = "\n\n---\n\n".join(state["contexts"]).strip()&#13;
&#13;
    prompt = f"""&#13;
You are the official assistant for the Phoenix language.&#13;
Answer ONLY using the context below. If the context does not contain the answer, say:&#13;
"I don't know based on the current knowledge base."&#13;
&#13;
CONTEXT:&#13;
{context_text}&#13;
&#13;
QUESTION:&#13;
{state['question']}&#13;
&#13;
ANSWER (include a code example and usage notes):&#13;
""".strip()&#13;
&#13;
    out = llm.invoke(prompt)&#13;
    return {"answer": out.content}&#13;
&#13;
def build_app():&#13;
    g = StateGraph(State)&#13;
    g.add_node("retrieve", retrieve)&#13;
    g.add_node("generate", generate)&#13;
&#13;
    g.add_edge(START, "retrieve")&#13;
    g.add_edge("retrieve", "generate")&#13;
    g.add_edge("generate", END)&#13;
&#13;
    return g.compile()&#13;
&#13;
if __name__ == "__main__":&#13;
    app = build_app()&#13;
&#13;
    while True:&#13;
        q = input("\nQuestion&gt; ").strip()&#13;
        if not q:&#13;
            continue&#13;
        if q.lower() in ("exit", "quit"):&#13;
            break&#13;
&#13;
        result = app.invoke({"question": q, "contexts": [], "answer": ""})&#13;
        print("\n" + result["answer"])&#13;
"@ | Set-Content -Path $RagPy -Encoding utf8&#13;
&#13;
# Index the KB&#13;
Write-Step "Indexing the knowledge base (index_kb.py)"&#13;
python .\index_kb.py&#13;
&#13;
# Run the RAG chat&#13;
Write-Step "Starting the RAG chat (rag_graph.py). To quit: exit"&#13;
python .\rag_graph.py&#13;
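&#13;
For intuition about what the compiled graph does at runtime: it runs a straight retrieve-then-generate pipeline, merging each node's partial return into the running state. A simplified plain-Python sketch (this is an illustration, not the actual LangGraph implementation; the fake_* nodes are hypothetical stand-ins for the real ones above):&#13;
&#13;
```python
# Simplified sketch: each node receives the current state and returns a
# partial update, which is merged back into the state before the next node.
def run_pipeline(state, nodes):
    for node in nodes:
        state = {**state, **node(state)}  # merge the node's partial update
    return state

# Hypothetical stand-ins for the real retrieve/generate nodes.
def fake_retrieve(state):
    return {"contexts": ["print(text) writes to standard output."]}

def fake_generate(state):
    return {"answer": f"Based on {len(state['contexts'])} context(s): use print()."}

result = run_pipeline(
    {"question": "How do I print?", "contexts": [], "answer": ""},
    [fake_retrieve, fake_generate],
)
print(result["answer"])
```
&#13;
This is why rag_graph.py can invoke the app with empty contexts/answer: retrieve fills contexts, then generate fills answer.&#13;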
&#13;
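As the replies below note, once Ollama is running, any HTTP client (including a WebDev/WinDev REST call) can talk to it directly, since Ollama serves a REST API on port 11434. A minimal Python sketch against the documented POST /api/generate endpoint; the build_payload/ask helper names are illustrative, not part of any library:&#13;
&#13;
```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(prompt, model="llama3.1"):
    # "stream": False makes Ollama return one JSON object instead of a token stream
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode("utf-8")

def ask(prompt):
    # POST the JSON payload and read the single JSON response
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Inspect the request body without needing a live server:
body = json.loads(build_payload("hello").decode("utf-8"))
print(body["model"])
```
&#13;
Calling ask("...") with the Ollama service up returns the model's text in the "response" field of the JSON reply.&#13;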
--&#13;
Adriano José Boller&#13;
______________________________________________&#13;
Consultant and Official Representative of&#13;
PcSoft in Brazil&#13;
+55 (41) 99949 1800&#13;
adrianoboller@gmail.com&#13;
skype: adrianoboller&#13;
http://wxinformatica.com.br/</description><ttl>30</ttl><generator>WEBDEV</generator><language>pt_BR</language><link>https://forum.pcsoft.fr/es-ES/pcsoft.br.windev/5370-como-instalar-pelo-powershell-sua-usar-ela-com/read.awp</link><title>Como instalar pelo Powershell a sua iA e usar ela com os seus sistemas</title><managingEditor>moderateur@pcsoft.fr (El moderador)</managingEditor><webMaster>webmaster@pcsoft.fr (El webmaster)</webMaster><item><author>Boller</author><category>pcsoft.br.windev</category><comments>https://forum.pcsoft.fr/es-ES/pcsoft.br.windev/5370-como-instalar-pelo-powershell-sua-usar-ela-com-5372/read.awp</comments><pubDate>29 Jan 2026 12:24:08 Z</pubDate><description>Sim. WebDev + Ollama é ainda mais fácil que WinDev, porque você já está no mundo HTTP.&#13;
&#13;
You have two ways:&#13;
&#13;
a) The right way (…</description><guid isPermaLink="true">https://forum.pcsoft.fr/es-ES/pcsoft.br.windev/5370-como-instalar-pelo-powershell-sua-usar-ela-com-5372/read.awp</guid><link>https://forum.pcsoft.fr/es-ES/pcsoft.br.windev/5370-como-instalar-pelo-powershell-sua-usar-ela-com-5372/read.awp</link><source url="https://forum.pcsoft.fr/es-ES/pcsoft.br.windev/5370-como-instalar-pelo-powershell-sua-usar-ela-com/read.awp">Como instalar pelo Powershell a sua iA e usar ela com os seus sistemas</source><title>Re: Como instalar pelo Powershell a sua iA e usar ela com os seus sistemas</title></item><item><author>Boller</author><category>pcsoft.br.windev</category><comments>https://forum.pcsoft.fr/es-ES/pcsoft.br.windev/5370-como-instalar-pelo-powershell-sua-usar-ela-com-5371/read.awp</comments><pubDate>29 Jan 2026 11:46:00 Z</pubDate><description>Yes — you can use it with WX (WinDev/WebDev/WinDev Mobile). It's just not “running Python inside WX”. The pattern is:&#13;
&#13;
WX = app/IDE&#13;
AI = …</description><guid isPermaLink="true">https://forum.pcsoft.fr/es-ES/pcsoft.br.windev/5370-como-instalar-pelo-powershell-sua-usar-ela-com-5371/read.awp</guid><link>https://forum.pcsoft.fr/es-ES/pcsoft.br.windev/5370-como-instalar-pelo-powershell-sua-usar-ela-com-5371/read.awp</link><source url="https://forum.pcsoft.fr/es-ES/pcsoft.br.windev/5370-como-instalar-pelo-powershell-sua-usar-ela-com/read.awp">Como instalar pelo Powershell a sua iA e usar ela com os seus sistemas</source><title>Re: Como instalar pelo Powershell a sua iA e usar ela com os seus sistemas</title></item></channel></rss>
