MATLAB text detection

Zhao Hui, 《Visual C++/MATLAB图像处理与识别实用案例精选》 (Selected Practical Cases of Image Processing and Recognition with Visual C++ and MATLAB)

Program code descriptions

P0201: Assignment in MATLAB
P0202: The for loop in MATLAB
P0203: for loops and if conditions in MATLAB
P0205: Basic image processing operations in MATLAB
P0206: Advanced image processing operations in MATLAB
P0207: Creating a grayscale image from an RGB image
P0208: Inverting a binary image
P0209: Displaying an image with the imshow function
P0210: Displaying two images in the same window
P0301: Displaying digital image matrix data and its Fourier transform
P0302: Image compression with the two-dimensional discrete cosine transform
P0303: Enhancing image contrast with grayscale transformations
P0304: Histogram equalization
P0305: Simulating the effect of Gaussian white noise and salt-and-pepper noise on an image
P0306: Filtering an image corrupted by salt-and-pepper noise with the 2-D median filter function medfilt2
P0307: Mean filtering of a noise-corrupted image with the MATLAB function filter2
P0308: Adaptive Wiener filtering of an image
P0309: Image sharpening with five different gradient enhancement methods
P0310: High-pass filtering and mask processing of an image
P0311: Smoothing a noise-corrupted image with a Butterworth low-pass filter
P0312: Sharpening an image with a Butterworth high-pass filter
P0401: Detecting image edges with the Prewitt operator
P0402: Detecting image edges with LoG operators of different σ values
P0403: Detecting image edges with the Canny operator
P0404: Threshold-based image segmentation
P0405: Image segmentation with the watershed method
P0406: Quadtree decomposition of a matrix
P0407: Classifying an image into text and non-text regions
P0408: Detecting the edges of a binary image with the morphological gradient
P0409: Morphology example: removing all current traces from a PCB image and keeping only the chip objects
P0502: Computing the Euler number of an image
P0610: A neural network example
P0701: Cell edge detection
P0702: Morphological analysis of cancer cells
P0703: Color analysis of cancer cells
P0801: Segmentation of call-number text images
P0802: Splitting touching characters
P0803: Character recognition
P0804: Color license-plate segmentation
P0805: Segmentation of trademark text
Recognition: recognition subfunction for character recognition
StrDetect01: structural-feature-extraction subfunction for character recognition
P0901: Grayscale AGV path recognition
P0902: Color AGV path recognition
P0903: AGV path recognition in the HSI color space
P0904: Locating the path centerline
P0905: AGV path deviation detection with the Radon transform

Character recognition here is done with a neural network. For details, see the Neural Network Toolbox guide, in particular the Character Recognition application:

Appcr1: Character Recognition

It is often useful to have a machine perform pattern recognition. In particular, machines that can read symbols are very cost effective. A machine that reads banking checks can process many more checks than a human being in the same time. This kind of application saves time and money, and eliminates the requirement that a human perform such a repetitive task. The demonstration appcr1 shows how character recognition can be done with a backpropagation network.

Problem Statement

A network is to be designed and trained to recognize the 26 letters of the alphabet. An imaging system that digitizes each letter centered in the system's field of vision is available. The result is that each letter is represented as a 5-by-7 grid of Boolean values. For example, here is the letter A.

Load the alphabet letter definitions and their target representations.

[alphabet,targets] = prprob
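
The bitmap figure of the letter A is not reproduced here. As a minimal sketch, assuming prprob stores each letter as one column of the 35-by-26 alphabet matrix, the first column can be folded back into its 5-by-7 grid and displayed:

bitmapA = reshape(alphabet(:,1),5,7);          % fold the 35 Boolean values into a 5-by-7 grid
imagesc(bitmapA'), colormap(gray), axis image  % quick look at the letter A (transpose is cosmetic)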

However, the imaging system is not perfect, and the letters can suffer from noise.

Perfect classification of ideal input vectors is required, and reasonably accurate classification of noisy vectors.

The twenty-six 35-element input vectors are defined in the function prprob as a matrix of input vectors called alphabet. The target vectors are also defined in this file with a variable called targets. Each target vector is a 26-element vector with a 1 in the position of the letter it represents, and 0's everywhere else. For example, the letter A is to be represented by a 1 in the first element (as A is the first letter of the alphabet), and 0's in elements two through twenty-six.
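
A quick check of this layout, using the targets matrix loaded above:

tA = targets(:,1);   % target column for the letter A
find(tA)             % returns 1: only the first of the 26 elements is nonzero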

Neural Network

The network receives the 35 Boolean values as a 35-element input vector. It is then required to identify the letter by responding with a 26-element output vector. The 26 elements of the output vector each represent a letter. To operate correctly, the network should respond with a 1 in the position of the letter being presented to the network. All other values in the output vector should be 0.

In addition, the network should be able to handle noise. In practice, the network does not receive a perfect Boolean vector as input. Specifically, the network should make as few mistakes as possible when classifying vectors with noise of mean 0 and standard deviation of 0.2 or less.

Architecture

The neural network needs 35 inputs and 26 neurons in its output layer to identify the letters. The network is a two-layer log-sigmoid/log-sigmoid network. The log-sigmoid transfer function was picked because its output range (0 to 1) is perfect for learning to output Boolean values.

The hidden (first) layer has 25 neurons. This number was picked by guesswork and experience. If the network has trouble learning, then neurons can be added to this layer. If the network solves the problem well, but a smaller, more efficient network is desired, fewer neurons could be tried.

The network is trained to output a 1 in the correct position of the output vector and to fill the rest of the output vector with 0's. However, noisy input vectors can result in the network not creating perfect 1's and 0's. After the network is trained, the output is passed through the competitive transfer function compet. This makes sure that the output corresponding to the letter most like the noisy input vector takes on a value of 1, and all others have a value of 0. The result of this postprocessing is the output that is actually used.
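
A minimal sketch of that postprocessing step, assuming a trained network net and a 35-element (possibly noisy) input column noisyA:

y = sim(net,noisyA);     % raw 26-element network output
y = compet(y);           % winner-take-all: 1 at the largest output, 0 elsewhere
letterIndex = find(y)    % 1 corresponds to 'A', 26 to 'Z'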

Initialization

Create the two-layer network with newff.

net = newff(alphabet,targets,25)

Training

To create a network that can handle noisy input vectors, it is best to train the network on both ideal and noisy vectors. To do this, the network is first trained on ideal vectors until it has a low sum-squared error.

Then the network is trained on 10 sets of ideal and noisy vectors. The network is trained on two copies of the noise-free alphabet at the same time as it is trained on noisy vectors. The two copies of the noise-free alphabet are used to maintain the network's ability to classify ideal input vectors.
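
A sketch of that two-stage procedure, assuming the net created above and the alphabet/targets matrices from prprob (the epoch count and loop structure here are illustrative, not the exact settings used by appcr1):

net.trainParam.epochs = 500;                 % stage 1: train on ideal vectors only
net = train(net,alphabet,targets);
for pass = 1:10                              % stage 2: ideal plus noisy vectors
    noisy = alphabet + randn(35,26)*0.2;     % Gaussian noise, mean 0, std 0.2 (see above)
    P = [alphabet, alphabet, noisy];         % two noise-free copies preserve ideal accuracy
    T = [targets, targets, targets];
    net = train(net,P,T);
end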

[Figure omitted: diagram of the two-layer network, showing a 35 x 1 input vector feeding the hidden layer and a 26 x 1 output vector.]

