Example project scenario: using the DFA algorithm with a sensitive-word dictionary to filter and check text.
Problem description
When the text filter/check interface is called, every call re-runs the initialization, CPU usage spikes above 96%, and under high concurrency the interface starts returning inconsistent results.
import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

public class SensitiveWordUtil {

    public static ConcurrentHashMap sensitiveWordMap;

    // Initialize the sensitive-word dictionary
    public static synchronized void init(Set<String> sensitiveWordSet) {
        initSensitiveWordMap(sensitiveWordSet);
    }

    private static void initSensitiveWordMap(Set<String> sensitiveWordSet) {
        // Pre-size the container to reduce resizing
        sensitiveWordMap = new ConcurrentHashMap(sensitiveWordSet.size());
        String key;
        Map nowMap;
        Map<String, String> newWordMap;
        // Iterate over sensitiveWordSet
        Iterator<String> iterator = sensitiveWordSet.iterator();
        while (iterator.hasNext()) {
            // Current keyword
            key = iterator.next();
            nowMap = sensitiveWordMap;
            for (int i = 0; i < key.length(); i++) {
                // Current character of the keyword
                char keyChar = key.charAt(i);
                // Look the character up at the current trie level
                Object wordMap = nowMap.get(keyChar);
                if (wordMap != null) {
                    // Node already exists: descend into it for the next iteration
                    nowMap = (Map) wordMap;
                } else {
                    // Otherwise build a new node and mark it as non-terminal (isEnd = 0)
                    newWordMap = new HashMap<>();
                    newWordMap.put("isEnd", "0");
                    nowMap.put(keyChar, newWordMap);
                    nowMap = newWordMap;
                }
                if (i == key.length() - 1) {
                    // Last character of the keyword: mark this node as terminal
                    nowMap.put("isEnd", "1");
                }
            }
        }
    }
}
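The filter/check interface itself is not shown in the post. For context, here is a minimal sketch of how a dictionary built this way is usually walked; the method name checkSensitiveWord, its parameters, and the shortest-match policy are assumptions based on the standard DFA word-filter pattern, not code from the project:
// Hypothetical lookup sketch (not from the original project): starting at beginIndex,
// walk the trie one character at a time and return the length of the matched
// sensitive word, or 0 if no word in the dictionary starts there.
public static int checkSensitiveWord(String text, int beginIndex) {
    boolean matched = false;
    int matchLength = 0;
    Map nowMap = SensitiveWordUtil.sensitiveWordMap;
    for (int i = beginIndex; i < text.length(); i++) {
        // Descend one trie level for the current character
        nowMap = (Map) nowMap.get(text.charAt(i));
        if (nowMap == null) {
            break; // no sensitive word continues with this character
        }
        matchLength++;
        if ("1".equals(nowMap.get("isEnd"))) {
            matched = true; // terminal node reached: a complete sensitive word was matched
            break;
        }
    }
    return matched ? matchLength : 0;
}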
Cause analysis:
Inspecting memory locally with JVM tools (VisualVM, JConsole) showed the heap climbing to about 3 GB in a sawtooth pattern, repeatedly rising and falling; GC ran continuously and could not keep up.
sensitiveWordMap = new ConcurrentHashMap(sensitiveWordSet.size());
Because init is invoked on every interface call, this line keeps creating fresh ConcurrentHashMap (and nested HashMap) objects, which are built and then garbage-collected over and over.
Solution: build the dictionary once when the application starts and put it in a local cache, storing the sensitiveWordMap itself in that cache; when the method is called later, initialize from the cached map instead of rebuilding it.
// Initialize at startup by implementing an application-ready listener
public class AsynInitWordBiz implements ApplicationListener<ApplicationReadyEvent> {

    // On startup, initialize the default sensitive-word data
    @Override
    public void onApplicationEvent(ApplicationReadyEvent applicationReadyEvent) {
        // Get the enum values for every dictionary type
        SensitiveEnum[] sensitiveEnums = SensitiveEnum.values();
        for (SensitiveEnum sensitiveEnum : sensitiveEnums) {
            SensitiveWordInitRequest sensitiveWordInitRequest = new SensitiveWordInitRequest();
            sensitiveWordInitRequest.setType(sensitiveEnum.getType());
            initMap(sensitiveWordInitRequest);
            log.info("Warmed up the sensitive-word dictionary of type {} at startup", sensitiveEnum.getType());
        }
    }
}
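Note: for onApplicationEvent to fire, AsynInitWordBiz must be registered as a Spring bean (for example with @Component) or added via SpringApplication.addListeners; how it is registered, and where the log field comes from (presumably Lombok's @Slf4j), is not shown in the original post.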
// Initialization method
public void initMap(SensitiveWordInitRequest sensitiveWordInitRequest) {
    // findAllWordsResponse is the result of querying the full word list for this type (the query call is not shown here)
    Set<String> sensitiveWords = findAllWordsResponse.getSensitiveWords();
    SensitiveWordUtil.init(sensitiveWords);
    // Store the built dictionary in the local cache
    CacheUtil.putCache(RiskContant.getRiskKey(sensitiveWordInitRequest.getType()), SensitiveWordUtil.sensitiveWordMap);
}
// Overloaded re-initialization method on the util class: assign an already-built dictionary
public static synchronized void init(ConcurrentHashMap map) {
    sensitiveWordMap = map;
}
// When the interface is called, fetch the dictionary from the cache and initialize from it
ConcurrentHashMap concurrentHashMap = (ConcurrentHashMap) CacheUtil.getCache(RiskContant.getRiskKey(sensitiveWordFindAllWordsRequest.getType()));
SensitiveWordUtil.init(concurrentHashMap);
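CacheUtil and RiskContant are project classes that the post does not show. A minimal sketch of what such a local cache could look like, assuming nothing more than a process-wide ConcurrentHashMap (the real implementation may differ):
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical local cache (an assumption, not the project's actual CacheUtil)
public class CacheUtil {
    private static final ConcurrentHashMap<String, Object> LOCAL_CACHE = new ConcurrentHashMap<>();

    // Store a value under the given key
    public static void putCache(String key, Object value) {
        LOCAL_CACHE.put(key, value);
    }

    // Return the cached value, or null if the key is absent
    public static Object getCache(String key) {
        return LOCAL_CACHE.get(key);
    }
}
With the dictionary held in a long-lived cache like this, each request's init only swaps a reference instead of rebuilding the whole trie, which removes both the GC churn and the CPU spikes described above.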