Deploying Models with a Thinking Mode: the Qwen3, DeepSeek-R1-Distill, Phi-4, and QWQ Series
Table of Contents
- 1 Introduction: Qwen3, DeepSeek-R1-Distill, Phi-4, QWQ
- 2 Deployment: Qwen3, DeepSeek-R1-Distill, Phi-4, QWQ
- 3 Running the Models: Qwen3, DeepSeek-R1-Distill, Phi-4, QWQ
- 4 Results
- Qwen3-0.6B
- DeepSeek-R1-Distill-Qwen-1.5B
- Phi-4-mini-reasoning
Platform: AutoDL (https://www.autodl.com/home)
Image: PyTorch 2.3.0 / Python 3.12 (Ubuntu 22.04) / CUDA 12.1
Deep thinking
Bilibili video: https://www.bilibili.com/video/BV1hcJrzaEpx/
1 Introduction: Qwen3, DeepSeek-R1-Distill, Phi-4, QWQ
So far, we have collected four series of open-source models with a thinking mode:
- Qwen3
- QWQ
- DeepSeek-R1-Distill
- Phi-4
For Qwen3, see the earlier post: Qwen3 Quick Deployment of Qwen3-0.6B, Qwen3-8B, Qwen3-14B, Think Deeper
DeepSeek-R1-Distill
Phi-4
2 Deployment: Qwen3, DeepSeek-R1-Distill, Phi-4, QWQ
This section deploys Qwen3, DeepSeek-R1-Distill, Phi-4, and QWQ.
Download the models with the ModelScope SDK.
Install before starting:
source /etc/network_turbo
pip install modelscope
Download script:
# source /etc/network_turbo
from modelscope import snapshot_download

# Directory to download the models into
cache_dir = '/root/autodl-tmp'

# Call snapshot_download to fetch a model
# Qwen series
# model_dir = snapshot_download('Qwen/Qwen3-0.6B', cache_dir=cache_dir)
# model_dir = snapshot_download('Qwen/Qwen3-1.7B', cache_dir=cache_dir)
# model_dir = snapshot_download('Qwen/Qwen3-4B', cache_dir=cache_dir)
# model_dir = snapshot_download('Qwen/Qwen3-8B', cache_dir=cache_dir)
# DeepSeek-R1-Distill series
# model_dir = snapshot_download('deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B', cache_dir=cache_dir)
# model_dir = snapshot_download('deepseek-ai/DeepSeek-R1-Distill-Qwen-7B', cache_dir=cache_dir)
model_dir = snapshot_download('deepseek-ai/DeepSeek-R1-Distill-Llama-8B', cache_dir=cache_dir)
# Phi-4
# model_dir = snapshot_download('LLM-Research/Phi-4-mini-reasoning', cache_dir=cache_dir)
# QwQ is too large
# model_dir = snapshot_download('Qwen/QwQ-32B', cache_dir=cache_dir)

print(f"Model downloaded to: {model_dir}")
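To confirm the snapshots landed where expected, a quick directory check helps. This is a minimal sketch assuming the cache_dir above; with the cache_dir argument the files end up under cache_dir/<org>/<model>, which is the layout the loading script in the next section relies on.

# Sketch: list downloaded models and their sizes under cache_dir
import os

cache_dir = '/root/autodl-tmp'
for org in sorted(os.listdir(cache_dir)):
    org_path = os.path.join(cache_dir, org)
    if not os.path.isdir(org_path):
        continue
    for model in sorted(os.listdir(org_path)):
        model_path = os.path.join(org_path, model)
        if not os.path.isdir(model_path):
            continue
        size_gb = sum(
            os.path.getsize(os.path.join(root, f))
            for root, _, files in os.walk(model_path)
            for f in files
        ) / 1024 ** 3
        print(f"{org}/{model}: {size_gb:.1f} GB")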
3 Running the Models: Qwen3, DeepSeek-R1-Distill, Phi-4, QWQ
Install:
pip install transformers
pip install accelerate
Script:
import os

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

base_path = "/root/autodl-tmp/"

# Load the model and tokenizer
# Qwen series
# model_name = os.path.join(base_path, 'Qwen/Qwen3-0.6B')
# model_name = os.path.join(base_path, 'Qwen/Qwen3-1.7B')
# model_name = os.path.join(base_path, 'Qwen/Qwen3-4B')
# model_name = os.path.join(base_path, 'Qwen/Qwen3-8B')
# DeepSeek-R1-Distill series
model_name = os.path.join(base_path, 'deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B')
# model_name = os.path.join(base_path, 'deepseek-ai/DeepSeek-R1-Distill-Qwen-7B')
# model_name = os.path.join(base_path, 'deepseek-ai/DeepSeek-R1-Distill-Llama-8B')
# Phi-4
# model_name = os.path.join(base_path, 'LLM-Research/Phi-4-mini-reasoning')
# QwQ is too large
# model_name = os.path.join(base_path, 'Qwen/QwQ-32B')

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,  # use bfloat16 to reduce memory usage
    device_map="auto",           # automatically place weights on available devices
)

# Generation parameters
generation_config = {
    "max_new_tokens": 1024,
    "do_sample": True,
    "temperature": 0.7,
    "top_p": 0.9,
    "repetition_penalty": 1.1,
}

# Build the prompt
prompt = "Give me a short introduction to large language models."
print(prompt)

# Encode the input
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Generate a response
outputs = model.generate(**inputs, **generation_config)

# Decode the output
response = tokenizer.decode(outputs[0], skip_special_tokens=True)
print("----------\n")
print(response[len(prompt):].strip())
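Note that this script feeds the raw prompt directly into the model, so chat-tuned checkpoints often continue the text instead of answering it, as several outputs in the next section show. A common alternative is to format the request with the tokenizer's chat template. The sketch below is a hedged example reusing tokenizer, model, and generation_config from the script above; enable_thinking is a Qwen3-specific template argument and should be omitted for the other families.

# Sketch: chat-template formatting (reuses tokenizer/model/generation_config above)
messages = [
    {"role": "user", "content": "Give me a short introduction to large language models."}
]
text = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,  # append the assistant-turn marker
    # enable_thinking=True,      # Qwen3 templates accept this flag; omit for other models
)
inputs = tokenizer(text, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, **generation_config)
new_tokens = outputs[0][inputs["input_ids"].shape[-1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))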
4 Results
prompt = "Give me a short introduction to large language models."
Qwen3-0.6B
What are the benefits and drawbacks of using them? What are some examples of how they can be used in different fields?Alright, I need to provide a short introduction about large language models (LLMs), their benefits and drawbacks, and examples of their applications. Let me start by recalling what I know.Large language models are AI systems designed to understand and generate human-like text. They're trained on vast datasets, so they have good understanding of existing languages and can process complex information. Benefits include improving translation accuracy, assisting with content creation, and helping with customer service. Drawbacks might be issues with bias, lack of physicality or emotions, and potential for misuse.Examples in different fields: healthcare, finance, education, etc. For example, doctors use LLMs for diagnosis, financial analysts use them for data analysis, educators use them for writing and tutoring.Wait, should I mention specific instances like medical transcription or chatbots? Also, maybe talk about ethical considerations like bias in training data affecting performance.I think that's covered. Now, structure it into an introduction, then benefits and drawbacks, and examples. Keep it concise but comprehensive.
**Introduction:**
Large Language Models (LLMs) are advanced AI systems capable of understanding and generating human-like text. They are trained on massive datasets, enabling them to comprehend diverse languages and process complex information efficiently. These models revolutionize communication, automation, and decision-making across various domains.
**Benefits:**
- **Language Understanding & Generation**: LLMs excel at translating between languages, answering questions, and creating coherent text.
- **Content Creation**: They assist writers, marketers, and content specialists by generating creative ideas.
- **Automation & Efficiency**: Tools powered by LLMs streamline tasks like customer service, data entry, and research.
**Drawbacks:**
- **Bias & Fairness Concerns**: Training data may reinforce societal stereotypes or biases, leading to inaccurate predictions.
- **Lack of Physicality/Emotionality**: While LLMs can simulate thought processes, they don’t possess the ability to express emotions or feel.
- **Potential Misuse**: The technology could be exploited for harmful purposes, such as deepfakes or misinformation.
**Applications:**
- **Healthcare**: Doctors use LLMs for diagnostics, symptom interpretation, and patient support.
- **Finance**: Analysts analyze market trends and perform predictive analytics.
- **Education**: Students receive personalized feedback and engage in interactive learning.
- **Customer Service**: Chatbots answer queries 24/7 for improved efficiency. This covers all key points concisely. Let me double-check if I missed anything. Yes, that seems thorough. Time to present it clearly.
**Final Answer**
$$
\boxed{
\begin{aligned}
&\text{Introduction:} \\
&\text{Large Language Models (LLMs) are AI systems designed to understand and generate human-like text. They are trained on vast datasets, enabling them to comprehend diverse languages and process complex information efficiently.}
\\
&\text{Benefits:} \\
&\text{- Language Understanding & Generation: LLMs excel at translating between languages, answering questions, and creating coherent text.}
\\
&\text{- Content Creation: They assist writers, marketers, and content specialists by generating creative ideas.}
\\
&\text{- Automation & Efficiency: Tools powered by LLMs streamline tasks like customer service, data entry, and research.}
\\
&\text{Drawbacks:} \\
&\text{- Bias & Fairness Concerns: Training data may reinforce societal stereotypes or biases, leading to inaccurate predictions.}
\\
&\text{- Lack of Physicality/Emotionality: While LLMs can simulate thought processes, they don't possess the ability to express emotions or feel.}
\\
&\text{Applications:} \\
&\text{- Healthcare: Doctors use LLMs for diagnostics, symptom interpretation, and patient support.}
\\
&\text{- Finance: Analysts analyze market trends and perform predictive analytics.}
\\
&\text{- Education: Students receive personalized feedback and engage in interactive learning.}
\end{aligned}
}
$$
$$
\boxed{
\begin{aligned}
&\text{Benefits:} \\
&\text{- Language Understanding & Generation: LLMs excel at translating between languages, answering questions, and creating coherent text.}
\\
&\text{- Content Creation: They assist writers, marketers, and content specialists by generating creative ideas.}
\\
&\text{- Automation & Efficiency: Tools powered by LLMs streamline tasks like customer service, data entry, and research.}
\end{aligned}
}
$$
The final boxed answers should encapsulate all the required sections. Since the user requested only the final answer, which is the boxed equation, here's the correct formatting:
$$
\boxed{
\begin{aligned}
&\text{Introduction:} \\
&\text{Large Language
Result from running the official Qwen3 code (see: Qwen3 Quick Deployment of Qwen3-0.6B, Qwen3-8B, Qwen3-14B, Think Deeper):
<think>
Okay, the user wants a short introduction to large language models. Let me start by recalling what I know. Large language models are AI systems that can understand and generate human language. They're trained on vast amounts of text, so they can handle complex tasks like writing, answering questions, or even creating content.I should mention their capabilities, like understanding context and generating coherent responses. Also, their training data is huge, which makes them powerful. Maybe include some examples of their use cases. Wait, the user didn't specify any particular use case, so keeping it general is better. Avoid technical jargon, keep it simple. Make sure it's concise but covers the key points. Let me check if I'm missing anything. Oh, they want it short, so I need to be brief. Alright, structure it with an introduction, what they do, how they work, and maybe a summary.
</think>A large language model (LLM) is an AI system designed to understand and generate human language. These models are trained on vast amounts of text, allowing them to comprehend context, answer questions, and create coherent responses. They are used in various applications, from writing and content creation to language translation and customer support.
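The official Qwen3 run above wraps its reasoning in a <think>...</think> block. If only the final answer is needed, a plain string split is enough; this is a minimal sketch assuming the decoded text still contains the </think> marker (i.e., it was not stripped as a special token).

# Sketch: separate the thinking trace from the final answer
# (assumes `response` is decoded text that still contains the </think> marker)
def split_thinking(response: str):
    if "</think>" in response:
        thinking, answer = response.split("</think>", 1)
        return thinking.replace("<think>", "").strip(), answer.strip()
    return "", response.strip()

thinking, answer = split_thinking(response)
print("thinking:", thinking[:200])
print("answer:", answer)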
DeepSeek-R1-Distill-Qwen-1.5B
Maybe start with the origin of LLMs. Yes, I can do that.
Alright, so first off, what's the origin of Large Language Models (LLMs)? I remember hearing about neural networks and AI before, but how did they get started with LLMs specifically?Wait, maybe it's related to convolutional neural networks or something similar? Or was it just about training more powerful models on vast amounts of data?I think there was this concept of "nervous systems" being trained in the 70s-80s. Like, how the human brain is structured, but using machines instead. So perhaps the idea was to simulate how the brain processes information through neural networks.But then, why call them LLMs? Because LLM stands for Large Language Model. It's supposed to handle complex language tasks beyond what traditional models can do.So, in terms of origins, it probably began with some initial work on neural networks, moving towards bigger architectures as computing power increased. But over time, people realized the need for these models to understand diverse languages, which led to the development of specialized LLMs.I'm trying to piece together the timeline: when were these concepts first proposed? Did it come from research into animal intelligence, like dolphins or primates, or was it more about machine learning principles?Oh, right! The Connectionist Theory of Cognitive Systems by Frank connectionist... I think he proposed the idea of simulating the brain with neural networks. That must have been a foundation for developing LLMs.Also, during the late 20th century, with advancements in computing power, researchers built larger neural networks capable of processing sequences of words. This led to models that could generate text based on various inputs.Then came the need for LLMs, where they became particularly useful in areas like translation, summarization, and creative writing. These applications pushed the boundaries of what LLMs could do, making them versatile tools across different industries.So, putting it all together, the origins of LLMs are rooted in early attempts at simulating biological neural networks, the development of neural network architectures, and the practical needs of applying these models to solve real-world problems involving language understanding and generation.Yeah, that makes sense. I should structure this introduction clearly, starting from the origins in the 1970s, leading up to the development of modern LLMs in the subsequent decades.Wait, maybe also mention how the computational advancements enabled the creation of larger and more sophisticated models, which in turn addressed specific linguistic challenges and applications. That would tie it all together nicely.Okay, I think I have a good grasp now. Time to put it all together into a coherent introduction.
</think>**Introduction to Large Language Models (LLMs): A Historical Overview**The concept of Large Language Models (LLMs) originated in the late 1970s, drawing inspiration from the pioneering work of connectionist theories. During this period, foundational ideas about simulating biological neural networks were proposed, laying the groundwork for future developments. The Connectionist Theory of Cognitive Systems, introduced by Frank connectionist, aimed to mimic the brain's ability to process information through neural networks, which later evolved into the basis for neural architecture research.As computing power advanced in the 20th century, researchers developed increasingly sophisticated neural network architectures. These models demonstrated the capability to process sequential data, paving the way for models that could generate text based on various inputs. However, the need to address complex linguistic challenges soon led to the evolution of specialized LLMs.These models became indispensable tools across diverse fields, including translation, summarization, and creative writing. Their versatility allowed them to tackle intricate linguistic tasks, thereby pushing the boundaries of their application and effectiveness. Today, LLMs continue to evolve, adapting to new challenges while maintaining their foundational role in language modeling.
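For the R1-distilled checkpoints, the upstream DeepSeek-R1 model card reportedly recommends sampling around temperature 0.6 with top_p 0.95 and keeping all instructions in the user turn rather than a system prompt. The values below are assumptions to verify against that card rather than settings tested in this post.

# Sketch: sampling settings often suggested for R1-distilled models
# (assumed values taken from the upstream model card; verify before relying on them)
r1_generation_config = {
    "max_new_tokens": 1024,
    "do_sample": True,
    "temperature": 0.6,  # assumed recommended range is roughly 0.5-0.7
    "top_p": 0.95,       # assumed recommendation
}
outputs = model.generate(**inputs, **r1_generation_config)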
Phi-4-mini-reasoning
(Poor result)
What are the advantages and disadvantages of using pre-trained models for chat-gpt?