hyperplane n. [Mathematics] hyperplane.

When newly added data are learned incrementally, the samples that belong to the set Bok but lie near the hyperplane play an important role in reshaping the new hyperplane, and most of the change is concentrated in the local region around the added samples. Based on these two observations, an improved incremental SVM learning procedure, a local incremental learning algorithm, is proposed, and the experiments fully confirm both points.

The separating hyperplane is constructed from the support vectors. Because real-world data sets are generally large, high optimization efficiency is required. Decomposition was the first practical method for handling large data sets: it splits the training set into an active part and an inactive part, the active part being called the working set, and each iteration solves only the sub-optimization problem on the working set.

The support vector machine is a general learning machine developed from statistical learning theory. To solve a complicated classification task, it maps the input vectors, through a kernel function, into a feature space in which a linear separating hyperplane is constructed; the margin is the distance between that hyperplane and a parallel hyperplane passing through the closest points.
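For reference, in standard SVM notation (added here as an illustration, not part of the quoted abstract): for a separating hyperplane $w \cdot x + b = 0$, scaled so that the closest training points satisfy $|w \cdot x_i + b| = 1$, the distance from the hyperplane to the parallel hyperplane through those closest points is $1/\|w\|$, so the full margin between the two classes is

$$ \gamma \;=\; \frac{2}{\|w\|}. $$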

A geometric transversal is an affine subspace (such as a point, a line, a plane, or a hyperplane) that intersects every member of a given family of sets; we then say the subspace transverses the family (a point transversal, line transversal, plane transversal, and so on). In Part I we discuss three kinds of such problems. In Chapter 2 we discuss point transversals to a family of translates of a convex set in the plane, where we prove a well-known conjecture of Grünbaum's for some special cases by a concrete and straightforward method.

The advantages of the multistage support vector machine are reflected in three aspects. First, in the unclassifiable regions left by other multiclass support vector machines, the multistage support vector machine predicts more correctly. Second, the experimental comparison in the dissertation shows its high test accuracy. Finally, for a multiclass problem the multistage support vector machine needs markedly fewer support vectors to construct its multistage hyperplanes than the other three methods, so it has better generalization ability.

Abstract: Based on sliding mode control (SMC) theory, a methodology is discussed that reduces the input fuzzy sets to a single hyperplane of the generalized state (tracking) error. A method of varying the range of the nonlinear fuzzy sets through a parameter is proposed, and several internal properties of the fuzzy controller are analyzed, showing that in terms of stability, steady-state error and similar criteria the fuzzy controller outperforms the PID controller.
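As a minimal illustration of such a generalized-error hyperplane (standard SMC notation, assumed here rather than taken from the paper): for a tracking error $e(t)$, the sliding surface

$$ s(t) \;=\; \dot{e}(t) + \lambda e(t) = 0, \qquad \lambda > 0, $$

collapses the two-dimensional error state $(e, \dot{e})$ onto the single scalar $s$, which can then serve as the lone input to the fuzzy controller.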

The separating hyperplane with maximal margin is the optimal separating hyperplane, and it has good generalization ability. Finding the optimal separating hyperplane leads to a quadratic programming problem, a special kind of optimization problem. After the optimization each vector (sample) is assigned a weight; the vectors whose weights are nonzero are called support vectors, and the separating hyperplane is constructed from these support vectors.
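In standard notation (an illustrative formulation, not quoted from the source), the quadratic program in question is usually written in its dual form

$$ \max_{\alpha}\; \sum_i \alpha_i \;-\; \tfrac{1}{2}\sum_{i,j} \alpha_i \alpha_j\, y_i y_j\, x_i \cdot x_j \quad \text{subject to} \quad \sum_i \alpha_i y_i = 0,\; \alpha_i \ge 0, $$

where $\alpha_i$ is the weight assigned to sample $x_i$; the samples with $\alpha_i > 0$ are the support vectors, and the hyperplane normal is $w = \sum_i \alpha_i y_i x_i$.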

The basic idea of the SVM is to map the input data, through a nonlinear transformation, into a high-dimensional feature space and to construct the optimal separating hyperplane in that new space. It shows many distinctive advantages in solving small-sample, nonlinear and high-dimensional pattern recognition problems, and it extends to other machine learning problems such as function fitting.
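A minimal sketch of this idea using scikit-learn's SVC (the library and the toy data are illustrative assumptions, not named in the source):

```python
# An RBF-kernel SVM implicitly maps the inputs into a high-dimensional
# feature space and fits a maximal-margin hyperplane there.
from sklearn.svm import SVC
import numpy as np

X = np.array([[0, 0], [1, 1], [0, 1], [1, 0]])  # XOR-like data, not linearly separable in 2-D
y = np.array([0, 0, 1, 1])

clf = SVC(kernel="rbf", C=1.0, gamma="scale")   # nonlinear mapping via the RBF kernel
clf.fit(X, y)
print(clf.support_vectors_)                      # the samples that define the hyperplane
print(clf.predict([[0.9, 0.1]]))
```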

Unlike traditional incremental learning methods, the idea proposed here is that newly added data lying near the separating hyperplane are significant for forming the new hyperplane, regardless of whether the former hyperplane classifies those data into the test-error set Berr or the test-right set Bok.

If these points can be cut by a hyperplane - in other words, by the n-dimensional geometric figure corresponding to the line in the example above - then there is a set of weights and a threshold that define a TLU whose classifications match this cut.
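A minimal sketch of such a threshold logic unit (the weights, threshold and sample points are illustrative values, not from the source):

```python
# A TLU: weights w and threshold theta define the hyperplane w.x = theta;
# the unit outputs 1 on one side of the hyperplane and 0 on the other.
import numpy as np

def tlu(x, w, theta):
    return 1 if np.dot(w, x) >= theta else 0

w = np.array([1.0, 1.0])   # normal vector of the separating hyperplane
theta = 1.5                # threshold: the hyperplane is x1 + x2 = 1.5
print(tlu(np.array([1, 1]), w, theta))  # 1: on or above the hyperplane
print(tlu(np.array([0, 1]), w, theta))  # 0: below the hyperplane
```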

In Chapter 4 we obtain the Helly number for hyperplane transversals to translates of a convex cube in R^d, where we prove that the Helly number for such families is 5 when d = 2 and is greater than or equal to d + 3 when d >= 3.

In addition, with the proposed control strategy the system states already lie on the sliding hyperplane at the initial instant, so the reaching phase of SMC is eliminated and the global robustness and stability of the closed-loop system can be guaranteed.

The conclusion is that if a set of points in n-space is cut by a hyperplane, then applying the perceptron training algorithm will eventually result in a weight distribution that defines a TLU whose hyperplane makes the wanted cut.
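A minimal sketch of the perceptron training rule (the data and learning rate are illustrative assumptions, not from the source): for linearly separable points, repeatedly correcting misclassified samples converges to weights and a bias whose hyperplane makes the desired cut.

```python
import numpy as np

X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -1.0], [-2.0, 1.0]])
y = np.array([1, 1, -1, -1])          # labels in {-1, +1}, separable by x1 = 0
w, b, lr = np.zeros(2), 0.0, 1.0

for _ in range(100):                   # enough epochs for this separable toy set
    errors = 0
    for xi, yi in zip(X, y):
        if yi * (np.dot(w, xi) + b) <= 0:   # misclassified (or on the hyperplane)
            w += lr * yi * xi               # move the hyperplane toward the point
            b += lr * yi
            errors += 1
    if errors == 0:
        break

print(w, b)   # defines the separating hyperplane w.x + b = 0
```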

For this problem, a separating hyperplane is designed on the principle of maximizing the distance between the two class centers, and a novel support vector machine, called the maximal class-center margin support vector machine (MCCM-SVM), is constructed.

The principal component classifier (PCC) takes the normal vector of a hyperplane as the projecting direction, chosen so that the algebraic sum of all samples' projections onto it is maximized, so that samples of one class can be well separated from the other by this hyperplane.

Chapter 2 systematically discusses the machine learning problem, which is the foundation of the SVM, together with Vapnik's statistical learning theory (SLT). Chapter 3 then derives the optimal hyperplane from the pattern recognition problem.

The multiple-hyperplane classifier, investigated here in terms of both the complexity of its optimization problem and its generalization performance, is a natural extension of the optimal separating hyperplane classifier.

The SVM maps the input feature vectors nonlinearly into a high-dimensional feature space and constructs the optimal separating hyperplane in that space to classify the data, thereby realizing modulation recognition of communication signals.

When a traditional support vector machine separates data containing noise, the hyperplane it obtains is often not the optimal one.