SpringBoot Advanced Tutorial (87): Data Compression
Recently, while working with Redis, I kept running into rarely-used big keys that put a noticeable burden on storage, so I wanted to compress them. There are several ways to go about it, such as splitting the JSON string, compressing the JSON string, optimizing the JSON size, streaming large JSON, and storing it in segments.
1. Splitting the JSON string
1.1 Splitting by structure
Array splitting: if the JSON contains a large array, it can be split into several smaller arrays.
// Example: split a big array into several sub-arrays
JSONArray bigArray = new JSONArray(jsonString);
int chunkSize = 100;
for (int i = 0; i < bigArray.length(); i += chunkSize) {
JSONArray chunk = new JSONArray();
for (int j = i; j < Math.min(i + chunkSize, bigArray.length()); j++) {
chunk.put(bigArray.get(j));
}
String chunkJson = chunk.toString();
// process or persist chunkJson
}
Object splitting: if the JSON is a nested object, it can be split into sub-objects level by level.
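For example, a minimal sketch using the same org.json API as above (splitting on top-level keys is just one possible layout):
import org.json.JSONObject;
import java.util.HashMap;
import java.util.Map;
// Split a nested object into one small JSON document per top-level key
JSONObject bigObject = new JSONObject(jsonString);
Map<String, String> parts = new HashMap<>();
for (String key : bigObject.keySet()) {
    // Wrap each top-level entry in its own object so it stays valid JSON on its own
    parts.put(key, new JSONObject().put(key, bigObject.get(key)).toString());
}
// The entries of parts can now be stored separately and merged back when needed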
1.2 Splitting by size (streaming)
Use a streaming API (such as Jackson's JsonParser) to read the JSON content piece by piece instead of loading it into memory all at once:
JsonFactory factory = new JsonFactory();
try (JsonParser parser = factory.createParser(new File("large.json"))) {
while (parser.nextToken() != null) {
// process token by token, e.g. split on whatever condition you need
}
}
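A slightly fuller sketch of the same idea, assuming large.json holds one big top-level array (the chunk size and the chunk-N.json output names are only illustrative):
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.File;
import java.util.ArrayList;
import java.util.List;
ObjectMapper mapper = new ObjectMapper();
int chunkSize = 100;
try (JsonParser parser = mapper.getFactory().createParser(new File("large.json"))) {
    if (parser.nextToken() == JsonToken.START_ARRAY) {
        List<JsonNode> buffer = new ArrayList<>();
        int chunkIndex = 0;
        while (parser.nextToken() != JsonToken.END_ARRAY) {
            buffer.add(mapper.readTree(parser)); // read one array element as a small tree
            if (buffer.size() == chunkSize) {
                mapper.writeValue(new File("chunk-" + chunkIndex++ + ".json"), buffer);
                buffer.clear();
            }
        }
        if (!buffer.isEmpty()) { // flush the last, possibly partial, chunk
            mapper.writeValue(new File("chunk-" + chunkIndex + ".json"), buffer);
        }
    }
}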
2. Compressing the JSON string
2.1 Compressing with GZIP
import java.util.zip.GZIPOutputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
public static byte[] compress(String data) throws IOException {
ByteArrayOutputStream bos = new ByteArrayOutputStream(data.length());
try (GZIPOutputStream gzip = new GZIPOutputStream(bos)) {
gzip.write(data.getBytes());
}
return bos.toByteArray();
}
// the compressed bytes can then be transferred or stored
byte[] compressed = compress(jsonString);
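A quick way to see what GZIP buys you is to compare the byte counts before and after (a minimal check reusing the compress method above; very small payloads can even grow slightly):
byte[] raw = jsonString.getBytes(java.nio.charset.StandardCharsets.UTF_8);
byte[] gzipped = compress(jsonString);
System.out.printf("raw=%d bytes, gzip=%d bytes (%.1f%%)%n",
        raw.length, gzipped.length, 100.0 * gzipped.length / raw.length);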
2.2 Compressing with Deflater
import java.util.zip.Deflater;
import java.io.ByteArrayOutputStream;
public static byte[] deflateCompress(String data) {
Deflater deflater = new Deflater();
deflater.setInput(data.getBytes());
deflater.finish();
byte[] buffer = new byte[1024];
ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
while (!deflater.finished()) {
int count = deflater.deflate(buffer);
outputStream.write(buffer, 0, count);
}
deflater.end();
return outputStream.toByteArray();
}
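Data compressed with Deflater has to be restored with Inflater; a matching decompression sketch, mirroring deflateCompress above, could look like this:
import java.util.zip.Inflater;
import java.util.zip.DataFormatException;
public static String inflateDecompress(byte[] data) throws DataFormatException {
    Inflater inflater = new Inflater();
    inflater.setInput(data);
    byte[] buffer = new byte[1024];
    ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
    while (!inflater.finished()) {
        int count = inflater.inflate(buffer); // throws DataFormatException on corrupt input
        outputStream.write(buffer, 0, count);
    }
    inflater.end();
    // uses the platform default charset, mirroring data.getBytes() in deflateCompress
    return new String(outputStream.toByteArray());
}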
3. Optimizing JSON size
3.1 Removing redundant whitespace
Use the compact format (no indentation or line breaks):
new JSONObject(jsonString).toString(); // compact format by default
3.2 Shortening key names
Replace long field names with short ones:
{"n":"Alice","a":30} // 原始键名可能为"name"、"age"v流式处理大型JSON
4. Streaming large JSON
Parse the document incrementally with a streaming API to avoid running out of memory:
// Jackson streaming API example
ObjectMapper mapper = new ObjectMapper();
try (JsonParser parser = mapper.getFactory().createParser(new File("large.json"))) {
JsonToken token;
while ((token = parser.nextToken()) != null) {
if (token == JsonToken.START_ARRAY) {
while (parser.nextToken() != JsonToken.END_ARRAY) {
// handle the array elements one by one
JsonNode node = parser.readValueAsTree();
// work with node...
}
}
}
}
5. Paged processing
This is really just another form of splitting: break the data into a number of pages and handle them one at a time.
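A minimal sketch of the idea (the page size and the key-suffix naming are assumptions for illustration only):
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
public static Map<String, String> toPages(List<?> records, int pageSize) throws JsonProcessingException {
    // Serialize each page separately; the caller can then store the values under
    // suffixed keys such as "user:list:0", "user:list:1", ...
    ObjectMapper mapper = new ObjectMapper();
    Map<String, String> pages = new LinkedHashMap<>();
    for (int page = 0; page * pageSize < records.size(); page++) {
        List<?> slice = records.subList(page * pageSize,
                Math.min((page + 1) * pageSize, records.size()));
        pages.put("user:list:" + page, mapper.writeValueAsString(slice));
    }
    return pages;
}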
6. A practical approach
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;
import org.apache.commons.codec.binary.Base64;
import java.nio.charset.StandardCharsets;
public class CompressHelper {
private static final ObjectMapper objectMapper = new ObjectMapper();
/**
 * Option 1: strip redundant whitespace/line breaks from the JSON (plain-text compaction)
 * @param formattedJson a pretty-printed JSON string (with spaces and line breaks)
 * @return the same JSON in compact form
 * @throws IOException if the JSON cannot be parsed
*/
public static String compressJsonByRemovingSpaces(String formattedJson) throws IOException {
JsonNode jsonNode = objectMapper.readTree(formattedJson);
return objectMapper.writeValueAsString(jsonNode);
}
/**
 * Option 2: binary-compress the JSON string with GZIP (well suited to transfer over the network)
 * @param json the original JSON string
 * @return the compressed data, Base64-encoded so it can be transferred as text
 * @throws IOException if compression fails
*/
public static String compressJsonByGzip(String json) throws IOException {
try (ByteArrayOutputStream byteOut = new ByteArrayOutputStream();
GZIPOutputStream gzipOut = new GZIPOutputStream(byteOut)) {
gzipOut.write(json.getBytes(StandardCharsets.UTF_8));
gzipOut.finish();
return Base64.encodeBase64String(byteOut.toByteArray());
}
}
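/**
 * Inverse of compressJsonByGzip: Base64-decode the input and GZIP-decompress it back to the JSON string
 * @param source Base64-encoded GZIP data produced by compressJsonByGzip
 * @return the original JSON string
 * @throws IOException if decompression fails
 */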
public static String decompressJson(String source) throws IOException {
byte[] compressedData = Base64.decodeBase64(source);
try (ByteArrayInputStream byteIn = new ByteArrayInputStream(compressedData);
GZIPInputStream gzipIn = new GZIPInputStream(byteIn);
ByteArrayOutputStream byteOut = new ByteArrayOutputStream()) {
// read the compressed data and inflate it
byte[] buffer = new byte[1024];
int len;
while ((len = gzipIn.read(buffer)) != -1) {
byteOut.write(buffer, 0, len);
}
return byteOut.toString(StandardCharsets.UTF_8.name());
}
}
}
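Coming back to the original Redis problem, the helper can be wired in front of the reads and writes, for example with Spring's StringRedisTemplate (the BigJsonRedisStore class and the key handling below are illustrative only):
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.redis.core.StringRedisTemplate;
import org.springframework.stereotype.Component;
import java.io.IOException;
@Component
public class BigJsonRedisStore {
    @Autowired
    private StringRedisTemplate redisTemplate;
    public void saveBigJson(String key, String json) throws IOException {
        // Compress with the helper above before writing the value to Redis
        redisTemplate.opsForValue().set(key, CompressHelper.compressJsonByGzip(json));
    }
    public String loadBigJson(String key) throws IOException {
        // Decompress transparently when reading the value back
        String stored = redisTemplate.opsForValue().get(key);
        return stored == null ? null : CompressHelper.decompressJson(stored);
    }
}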
Source code
https://github.com/toutouge/javademosecond/tree/master/hellolearn