Notes on 《Web安全之机器学习入门》, Chapter 11, Section 11.2: The Apriori Algorithm

        This section demonstrates basic usage of the Apriori algorithm. Apriori is used mainly in recommender systems to mine association rules: it finds the itemsets that occur frequently in the data, and the patterns in those frequent itemsets can support decision making.
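Two quantities drive the algorithm: the support of an itemset X is the fraction of transactions that contain X, and the confidence of a rule X -> Y is support(X ∪ Y) / support(X). Apriori first collects every itemset whose support reaches a minimum threshold (minSupport, 0.5 in the example below), then derives rules from those frequent itemsets and keeps only the rules whose confidence reaches a second threshold (minConf, 0.7 below).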

1. Source code fixes

The companion source code that ships with the book targets Python 2; running it under Python 3 fails with the following error:

C:\ProgramData\Anaconda3\python.exe C:/Users/liujiannan/PycharmProjects/pythonProject/Web安全之机器学习入门/code/11-1.py
Traceback (most recent call last):
  File "C:/Users/liujiannan/PycharmProjects/pythonProject/Web安全之机器学习入门/code/11-1.py", line 95, in 
    L, suppData = apriori(myDat, 0.5)
  File "C:/Users/liujiannan/PycharmProjects/pythonProject/Web安全之机器学习入门/code/11-1.py", line 45, in apriori
    L1, suppData = scanD(D, C1, minSupport)
  File "C:/Users/liujiannan/PycharmProjects/pythonProject/Web安全之机器学习入门/code/11-1.py", line 17, in scanD
    numItems = float(len(D))
TypeError: object of type 'map' has no len()

Before the fix:

    C1 = createC1(dataSet)
    D = map(set, dataSet)

After the fix:

    C1 = list(createC1(dataSet))
    D = list(map(set, dataSet))
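
The root cause of the original error is that in Python 3, map() returns a lazy iterator instead of a list: it has no len() and can only be consumed once, so scanD() fails as soon as it calls len(D). Wrapping the call in list() materializes the result and restores the Python 2 behaviour. A minimal illustration (not part of the book's code):

    dataSet = [[1, 3, 4], [2, 3, 5]]
    D = map(set, dataSet)
    # len(D) would raise: TypeError: object of type 'map' has no len()
    D = list(map(set, dataSet))
    print(len(D))  # 2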

2. Complete source code (after the fix)

# Build the candidate 1-itemsets C1: every distinct item, wrapped in a frozenset
# so it can be used as a dictionary key.
def createC1(dataSet):
    C1 = []
    for transaction in dataSet:
        for item in transaction:
            if [item] not in C1:
                C1.append([item])
    C1.sort()
    return map(frozenset, C1)


# Scan the transactions D, count how often each candidate in Ck occurs, and
# return the candidates whose support is at least minSupport plus their support values.
def scanD(D, Ck, minSupport):
    ssCnt = {}
    for tid in D:
        for can in Ck:
            if can.issubset(tid):
                ssCnt[can] = ssCnt.get(can, 0) + 1
    numItems = float(len(list(D)))
    retList = []
    supportData = {}
    for key in ssCnt:
        support = ssCnt[key] / numItems
        if support >= minSupport:
            retList.insert(0, key)
        supportData[key] = support
    return retList, supportData


# Join frequent (k-1)-itemsets whose first k-2 items agree to produce the candidate k-itemsets Ck.
def aprioriGen(Lk, k):
    retList = []
    lenLk = len(Lk)
    for i in range(lenLk):
        for j in range(i + 1, lenLk):
            L1 = list(Lk[i])[:k - 2]
            L2 = list(Lk[j])[:k - 2]
            L1.sort()
            L2.sort()
            if L1 == L2:
                retList.append(Lk[i] | Lk[j])
    return retList


# Apriori main loop: starting from the frequent 1-itemsets, repeatedly generate and
# prune larger candidates until no itemset meets the support threshold.
def apriori(dataSet, minSupport=0.5):
    C1 = list(createC1(dataSet))
    D = list(map(set, dataSet))
    L1, suppData = scanD(D, C1, minSupport)
    L = [L1]
    k = 2

    while (len(L[k - 2]) > 0):
        Ck = aprioriGen(L[k - 2], k)
        Lk, supK = scanD(D, Ck, minSupport)
        suppData.update(supK)
        L.append(Lk)
        k += 1
    return L, suppData


# Compute the confidence of each rule (freqSet - conseq) -> conseq and keep the
# consequents of rules that meet minConf.
def calcConf(freqSet, H, supportData, brl, minConf=0.7):
    prunedH = []
    for conseq in H:
        conf = supportData[freqSet] / supportData[freqSet - conseq]
        if conf >= minConf:
            print(freqSet - conseq, '-->', conseq, 'conf:', conf)
            brl.append((freqSet - conseq, conseq, conf))
            prunedH.append(conseq)
    return prunedH


# Recursively merge consequents to derive rules with larger right-hand sides from a frequent itemset.
def rulesFromConseq(freqSet, H, supportData, brl, minConf=0.7):
    m = len(H[0])

    if len(freqSet) > m + 1:
        Hmp1 = aprioriGen(H, m + 1)
        Hmp1 = calcConf(freqSet, Hmp1, supportData, brl, minConf)

        if len(Hmp1) > 1:
            rulesFromConseq(freqSet, Hmp1, supportData, brl, minConf)


# Walk the frequent itemsets of size >= 2 and produce all association rules that satisfy minConf.
def generateRules(L, supportData, minConf=0.7):
    bigRuleList = []
    for i in range(1, len(L)):
        for freqSet in L[i]:
            H1 = [frozenset([item]) for item in freqSet]

            if i > 1:
                rulesFromConseq(freqSet, H1, supportData, bigRuleList, minConf)
            else:
                calcConf(freqSet, H1, supportData, bigRuleList, minConf)
    return bigRuleList

if __name__ == '__main__':
    myDat = [ [ 1, 3, 4 ], [ 2, 3, 5 ], [ 1, 2, 3, 5 ], [ 2, 5 ] ]

    L, suppData = apriori(myDat, 0.5)
    rules = generateRules(L, suppData, minConf=0.7)
    print('rules:\n', rules)

3. Test results

frozenset({5}) --> frozenset({2}) conf: 1.0
frozenset({2}) --> frozenset({5}) conf: 1.0
frozenset({1}) --> frozenset({3}) conf: 1.0
rules:
 [(frozenset({5}), frozenset({2}), 1.0), (frozenset({2}), frozenset({5}), 1.0), (frozenset({1}), frozenset({3}), 1.0)]
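
As a sanity check against the sample transactions [[1, 3, 4], [2, 3, 5], [1, 2, 3, 5], [2, 5]]: item 5 appears in 3 of the 4 transactions and the pair {2, 5} also appears in 3 of the 4, so conf({5} -> {2}) = support({2, 5}) / support({5}) = 0.75 / 0.75 = 1.0, which matches the first rule printed above. The reverse rule {2} -> {5} has the same confidence because item 2 occurs in exactly those same three transactions.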
