feat: add AI chat and summarize endpoints with database integration

- Implemented a new chat endpoint that uses Groq for chat completions based on news articles.
- Added a summarize endpoint that fetches news articles from the database and generates summaries using Groq.
- Introduced a new package "@vueuse/core" for improved reactivity.
- Created a comprehensive command UI component with various subcomponents for better user interaction.
- Developed a scraping module using Scrapy to fetch news articles from Google News.
- Added validation and sanitization for slug parameters in the fetch article endpoint.
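The slug validation and sanitization mentioned in the last bullet could look roughly like the Python sketch below. The function name, the allowed character pattern, and the length cap are illustrative assumptions, not the actual rules used by the fetch article endpoint.

```python
import re

# Hypothetical slug sanitizer for a fetch-article endpoint.
# Pattern and length cap are assumptions, not the real rules.
SLUG_RE = re.compile(r'^[a-z0-9]+(?:-[a-z0-9]+)*$')

def sanitize_slug(raw: str, max_len: int = 128) -> str:
    slug = raw.strip().lower()
    if len(slug) > max_len or not SLUG_RE.match(slug):
        raise ValueError(f'invalid slug: {raw!r}')
    return slug
```

Rejecting anything outside a strict whitelist (rather than trying to strip bad characters) keeps path-traversal input like `../etc/passwd` from ever reaching the database query.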
yuanhau 2025-05-10 21:57:38 +08:00
parent 92a0358744
commit bf357f1c84
35 changed files with 809 additions and 11 deletions

1
scraping/.python-version Normal file

@@ -0,0 +1 @@
3.13

12
scraping/main.py Normal file

@@ -0,0 +1,12 @@
import scrapy

class BlogSpider(scrapy.Spider):
    name = 'blogspider'
    start_urls = ['https://news.google.com/u/4/home?hl=zh-TW&gl=TW&ceid=TW:zh-Hant&pageId=none']

    def parse(self, response):
        for title in response.css('.oxy-post-title'):
            yield {'title': title.css('::text').get()}
        for next_page in response.css('a.next'):
            yield response.follow(next_page, self.parse)
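The selector logic in `parse` above — collect the text of every `.oxy-post-title` element — can be illustrated without Scrapy using only the standard library's `html.parser`. This is a stdlib sketch of the same extraction idea, not the spider's actual code path; the sample HTML is invented for the demonstration.

```python
from html.parser import HTMLParser

class TitleExtractor(HTMLParser):
    """Collects the text content of elements carrying the
    oxy-post-title class, mirroring response.css('.oxy-post-title::text')."""
    def __init__(self):
        super().__init__()
        self._depth = 0      # >0 while inside a title element
        self.titles = []

    def handle_starttag(self, tag, attrs):
        classes = (dict(attrs).get('class') or '').split()
        if 'oxy-post-title' in classes:
            self._depth += 1
            self.titles.append('')
        elif self._depth:
            self._depth += 1  # nested tag inside a title element

    def handle_endtag(self, tag):
        if self._depth:
            self._depth -= 1

    def handle_data(self, data):
        if self._depth:
            self.titles[-1] += data

# Invented sample markup for illustration only.
html = ('<h2 class="oxy-post-title">First post</h2>'
        '<p>body</p>'
        '<h2 class="oxy-post-title">Second post</h2>')
parser = TitleExtractor()
parser.feed(html)
print(parser.titles)  # → ['First post', 'Second post']
```

Note that Google News renders most content client-side, so a plain CSS-selector spider like the one above may see little of the page; the actual module may handle that differently.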

7
scraping/pyproject.toml Normal file

@@ -0,0 +1,7 @@
[project]
name = "scraping"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.13"
dependencies = []


@@ -0,0 +1 @@
scrapy

8
scraping/uv.lock generated Normal file

@@ -0,0 +1,8 @@
version = 1
revision = 1
requires-python = ">=3.13"

[[package]]
name = "scraping"
version = "0.1.0"
source = { virtual = "." }