
Support Vector Machine (SVM)

The optimal separating hyperplane is obtained by maximizing the margin. The method formalizes this as a convex quadratic programming problem, which is also equivalent to minimizing a regularized hinge loss.

1. Optimization Objective and Margin

\begin{align*}
\hat y = \text{sign}(w^T \cdot x + b)
\end{align*}

Objective: maximize the distance from the support vectors to the hyperplane, where a support vector is a correctly classified sample with the smallest distance to the decision boundary.

\begin{align*}
&\min \limits_{w,b}\frac{1}{2}\left\|w\right\|^2 \\
&\text{s.t. } y_i(w^T x_i+b)\geq 1
\end{align*}

Hinge loss naturally weights hard samples more heavily in the later stages of training, which is also instructive for deep learning.
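For reference, the hinge loss on a sample (x, y) with y ∈ {−1, +1} is:

\begin{align*}
\ell_{\text{hinge}}(x, y)=\max\left(0,\ 1-y\,(w^T x+b)\right)
\end{align*}

Samples classified correctly with margin at least 1 contribute zero loss, so late in training the loss is dominated by the hard, margin-violating samples.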

KKT conditions
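For reference, with the multipliers α_i introduced in the next section, the KKT conditions of the margin-maximization problem are:

\begin{align*}
\alpha_i\geq 0, \qquad y_i(w^T x_i+b)-1\geq 0, \qquad \alpha_i\left(y_i(w^T x_i+b)-1\right)=0
\end{align*}

Complementary slackness implies α_i > 0 only for samples that lie exactly on the margin, i.e., the support vectors.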

2. Duality

  • The constrained objective attains its minimum where the gradient of the objective is parallel to the normal vector of the constraint. The Lagrange multiplier method converts the constrained optimization problem into an unconstrained one.

\begin{align*}
L(w,b,\alpha)=\frac{1}{2}\left\|w\right\|^2+\sum_{i=1}^{m} \alpha_i\left(1-y_i(w^T x_i+b)\right)
\end{align*}
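Setting the gradients of L with respect to w and b to zero and substituting back gives the standard (hard-margin) dual problem, worked out here for completeness:

\begin{align*}
&\frac{\partial L}{\partial w}=0 \Rightarrow w=\sum_{i=1}^{m}\alpha_i y_i x_i, \qquad
\frac{\partial L}{\partial b}=0 \Rightarrow \sum_{i=1}^{m}\alpha_i y_i=0 \\
&\max_{\alpha}\ \sum_{i=1}^{m}\alpha_i-\frac{1}{2}\sum_{i=1}^{m}\sum_{j=1}^{m}\alpha_i\alpha_j y_i y_j\, x_i^T x_j
\qquad \text{s.t. } \alpha_i\geq 0,\ \sum_{i=1}^{m}\alpha_i y_i=0
\end{align*}

Only the inner products x_i^T x_j appear in the dual, which is what makes the kernel trick in the next section possible.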

3. Kernel Trick

  • Inner product: a measure of the relationship between two samples.

  • Polynomial kernel: analogous to cross features in linear regression.

  • Gaussian (RBF) kernel: maps samples into an infinite-dimensional space. An intuitive view is to place several landmarks and compute each point's similarity to them, where each landmark is widened into a Gaussian probability density. The RBF kernel is a linear combination of polynomial kernels of all orders (see the sketch below).
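A minimal Python sketch of the kernels above (function names and parameter values are illustrative, not from the original notes):

import numpy as np

def linear_kernel(x, z):
    # plain inner product between two samples
    return np.dot(x, z)

def polynomial_kernel(x, z, degree=2, c=1.0):
    # implicitly encodes cross features up to the given degree
    return (np.dot(x, z) + c) ** degree

def rbf_kernel(x, z, gamma=0.5):
    # Gaussian similarity; corresponds to an infinite-dimensional feature map
    return np.exp(-gamma * np.sum((np.asarray(x) - np.asarray(z)) ** 2))

# example: rbf_kernel(np.array([1.0, 2.0]), np.array([1.5, 2.5]))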

4. Soft Margin

Introduce slack variables, which allow some samples to violate the margin, but at a cost. A new hyperparameter controls the penalty strength. As with a typical loss function, correctly classified samples (outside the margin) incur zero penalty, while violations are penalized.
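Written out (the standard soft-margin formulation, added for reference), with slack variables ξ_i and penalty weight C:

\begin{align*}
&\min \limits_{w,b,\xi}\frac{1}{2}\left\|w\right\|^2+C\sum_{i=1}^{m}\xi_i \\
&\text{s.t. } y_i(w^T x_i+b)\geq 1-\xi_i,\quad \xi_i\geq 0
\end{align*}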

5. Q&A

  • What's the difference between logistic regression and SVM?

    • Loss type: logistic loss for LR, hinge loss for SVM

    • LR is a parametric model (Bernoulli distribution); SVM with an RBF kernel is non-parametric

    • For SVM, only the support vectors influence the model, whereas every sample influences the LR model

    • SVM naturally performs structural risk minimization (an L2 norm on w is built in); plain LR performs empirical risk minimization

    • SVM commonly uses kernel functions to handle nonlinear problems; LR typically does not

  • Pros

    • Performs well in high-dimensional spaces

    • Effective when classes are separable

    • Outliers have limited impact, since only support vectors define the boundary

    • Well suited to extreme-case binary classification

  • Cons

    • Slow: training on large datasets takes a long time

    • Performs poorly when classes overlap heavily

    • Selecting appropriate hyperparameters (e.g., C) is important

    • Selecting an appropriate kernel function can be tricky

6. Code

# Learning strategy: maximize the margin by solving a convex quadratic program (simplified SMO below)

import numpy as np
import random
from sklearn import datasets

def load_data():
    iris = datasets.load_iris()
    x, y = iris.data, iris.target
    return x, y

class SVM(object):
    def __init__(self, kernal, C):
        self.kernal = kernal
        self.C = C

    def fit(self, x_train, y_train):
        """by SMO"""
        self.x_train = x_train
        self.y_train = y_train
        self.N = x_train.shape[0]
        self.alpha = np.zeros((self.N, 1))  # alpha from the dual representation
        self.SMO(max_iter=1000)

    def predict_score(self, x_new):
        # decision value f(x) = w^T x + b for a single sample
        y_score = np.dot(x_new, self.w) + self.b
        return y_score[0]

    def predict(self, x_new):
        # sign of the decision value; works for a single sample or a batch
        y_new = np.sign(np.dot(x_new, self.w) + self.b)
        return y_new

    def SMO(self, max_iter):
        n_iter = 0
        while n_iter < max_iter:
            n_iter += 1
            for i in range(self.N):
                # step 1: pick alpha_i sequentially and a random alpha_j
                j = random.randint(0, self.N - 1)
                if j == i:
                    continue
                else:
                    x_i, y_i, x_j, y_j = (
                        self.x_train[i, :],
                        self.y_train[i, 0],
                        self.x_train[j, :],
                        self.y_train[j, 0],
                    )
                    alpha_i, alpha_j = self.alpha[i, 0], self.alpha[j, 0]

                    L, H = self.calculate_LU(y_i, y_j, alpha_i, alpha_j)
                    self.calculate_w_b()

                    y_i_score = self.predict_score(x_i)
                    y_j_score = self.predict_score(x_j)

                    E_i = self.calculate_E(y_i_score, y_i)
                    E_j = self.calculate_E(y_j_score, y_j)

                    # step 2: update alpha_j (clipped to [L, H]) and then alpha_i
                    eta = self.calculate_gram(x_i, x_j)
                    if eta == 0:
                        continue  # identical samples: skip to avoid division by zero
                    alpha_j_new = alpha_j + y_j * (E_i - E_j) / eta
                    alpha_j_new = min(alpha_j_new, H)
                    alpha_j_new = max(alpha_j_new, L)
                    alpha_i_new = alpha_i + y_i * y_j * (alpha_j - alpha_j_new)
                    self.alpha[i, 0] = alpha_i_new
                    self.alpha[j, 0] = alpha_j_new

            self.w, self.b = self.calculate_w_b()
        return self.alpha

    def calculate_LU(self, yi, yj, alpha_i, alpha_j):
        if yi == yj:
            L = max(0, alpha_j + alpha_i - self.C)
            H = min(self.C, alpha_j + alpha_i)
        else:
            L = max(0, alpha_j - alpha_i)
            H = min(self.C, self.C + alpha_j - alpha_i)
        return L, H

    def calculate_E(self, y_hat, y):
        E = y_hat - y
        return E

    def calculate_gram(self, x_i, x_j):
        # returns eta = K(x_i, x_i) + K(x_j, x_j) - 2 * K(x_i, x_j), the SMO update denominator
        x_i = x_i.reshape(1, -1)
        x_j = x_j.reshape(1, -1)
        if self.kernal == "linear":
            k_ij = x_i.dot(x_i.T) + x_j.dot(x_j.T) - 2 * x_i.dot(x_j.T)
            return k_ij[0, 0]

    def calculate_w_b(self):
        self.w = np.dot(self.x_train.T, (self.alpha * self.y_train).reshape(-1, 1))
        self.b = np.mean(self.y_train - np.dot(self.x_train, self.w))
        return self.w, self.b

    def hinge_loss(self):
        pass

    def __str__(self):
        return "weights\t:%s\n bias\t:%f\n" % (self.w, self.b)

if __name__ == "__main__":
    x, y = load_data()
    y[np.where(y == 0)] = -1
    y[np.where(y == 2)] = -1  # binary labels: class 1 vs. the rest
    y = y.reshape(-1, 1)
    print("x shape", x.shape)
    print("y shape", y.shape)
    svm = SVM(kernal="linear", C=1.0)
    svm.fit(x, y)
    y_hat = svm.predict(x)
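For comparison, a minimal sketch of the same binary task using scikit-learn's built-in SVC (an off-the-shelf alternative to the hand-written SMO above; the kernel and C values are illustrative):

from sklearn import datasets
from sklearn.svm import SVC

iris = datasets.load_iris()
x, y = iris.data, (iris.target == 1).astype(int)  # class 1 vs. the rest
clf = SVC(kernel="rbf", C=1.0)
clf.fit(x, y)
print(clf.score(x, y))  # training accuracy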



