Commit
Merge branch 'DataLinkDC:dev' into dev
gaoyan1998 authored Oct 29, 2023
2 parents c2ab7ef + 2d5b1a1 commit 1acfae8
Showing 154 changed files with 1,556 additions and 13,422 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/backend.yaml
@@ -89,7 +89,7 @@ jobs:
fail-fast: true
matrix:
jdk: [8, 11]
flink: [1.13, 1.14, 1.15, 1.16, 1.17]
flink: [1.14, 1.15, 1.16, 1.17, 1.18]

timeout-minutes: 30
env:
7 changes: 3 additions & 4 deletions .github/workflows/docker_build.yaml
@@ -105,7 +105,6 @@ jobs:
fail-fast: true
matrix:
url: [ registry.cn-hangzhou.aliyuncs.com ,docker.io ]
# FLINK_VERSION: [1.11.6 , 1.12.7 , 1.13.6 , 1.14.6 , 1.15.2 , 1.16.0, 1.17.0]
include:
- url: registry.cn-hangzhou.aliyuncs.com
namespace: dinky
@@ -154,14 +153,12 @@ jobs:
fail-fast: true
matrix:
url: [ registry.cn-hangzhou.aliyuncs.com ,docker.io ]
FLINK_VERSION: [1.13.6 , 1.14.6 , 1.15.2 , 1.16.0, 1.17.0]
FLINK_VERSION: [1.14.6 , 1.15.2 , 1.16.0, 1.17.0, 1.18.0]
include:
- url: registry.cn-hangzhou.aliyuncs.com
namespace: dinky
- url: docker.io
namespace: dinkydocker
- FLINK_VERSION: 1.13.6
FLINK_BIG_VERSION: 1.13
- FLINK_VERSION: 1.14.6
FLINK_BIG_VERSION: 1.14
- FLINK_VERSION: 1.15.4
@@ -170,6 +167,8 @@ jobs:
FLINK_BIG_VERSION: 1.16
- FLINK_VERSION : 1.17.1
FLINK_BIG_VERSION: 1.17
- FLINK_VERSION: 1.18.0
FLINK_BIG_VERSION: 1.18
steps:
- uses: actions/checkout@v3
- name: Move Dockerfile
18 changes: 10 additions & 8 deletions README.md
@@ -1,19 +1,21 @@
# Dinky

<div align="center">
[![License](https://img.shields.io/badge/license-Apache%202-4EB1BA.svg?style=socialflat-square&)](https://www.apache.org/licenses/LICENSE-2.0.html)
[![Total Lines](https://img.shields.io/github/stars/DataLinkDC/dinky?style=socialflat-square&label=stars)](https://github.com/DataLinkDC/dinky/stargazers)
[![CN doc](https://img.shields.io/badge/文档-中文版-blue.svg?style=socialflat-square&)](README_zh_CN.md)
[![EN doc](https://img.shields.io/badge/document-English-blue.svg?style=socialflat-square&)](README.md)
</div>

[![Stargazers over time](https://starchart.cc/DataLinkDC/dinky.svg)](https://starchart.cc/DataLinkDC/dinky)

## Introduction

Dinky is an out of the box one-stop real-time computing platform dedicated to the construction and practice of Unified Streaming & Batch and Unified Data Lake & Data Warehouse. Based on Apache Flink, Dinky provides the ability to connect many big data frameworks including OLAP and Data Lake.
Dinky is an out-of-the-box, one-stop, real-time computing platform dedicated to the construction and practice of Unified Streaming & Batch and Unified Data Lake & Data Warehouse. Based on Apache Flink, Dinky provides the ability to connect many big data frameworks including OLAP and Data Lake.

## Feature

Its main feature are as follows:
Its main features are as follows:

- Immersive Flink SQL Data Development: Automatic prompt completion, syntax highlighting, statement beautification, online debugging, syntax verification, execution plan, MetaStore, lineage, version comparison, etc.
- Support FlinkSQL multi-version development and execution modes: Local, Standalone, Yarn/Kubernetes Session, Yarn Per-Job, Yarn/Kubernetes Application.
@@ -29,7 +31,7 @@ Its main feature are as follows:
- Support automatically managed SavePoint/CheckPoint recovery and triggering mechanisms: latest, earliest, specified, etc.
- Support resource management: Cluster instance, cluster configuration, jar, data source, alarm group, alarm instance, document, global variable, system configuration, etc.
- Support enterprise-level management: multi-tenant, user, role, project space.
- More hidden features are waiting for friends to explore.
- More hidden features await exploration by our users.

## Principle

@@ -88,7 +90,7 @@ See [source code compilation](https://github.com/DataLinkDC/dinky/blob/dev/docs/

## How to Upgrade

Due to many functions, there are many bugs and optimization points. It is strongly recommended to use or upgrade to the latest version.
Due to the numerous functionalities, there are several bugs and optimization points that need attention. It is strongly recommended to use or upgrade to the latest version.

Upgrade steps:

@@ -98,9 +100,9 @@ Upgrade steps:

## Thanks

Standing on the shoulders of giants, Dinky was born. For this we express our heartfelt thanks to all the open source software used and its communities! We also hope that we are not only beneficiaries of open source, but also contributors to open source. We also hope that partners who have the same enthusiasm and belief in open source will join in and contribute to open source together.
Standing on the shoulders of giants, Dinky was born. For this we express our heartfelt thanks to all the open source software used and its communities! We also hope that we are not only beneficiaries of open source, but also contributors to open source. We also hope that partners who share our enthusiasm and belief in open source will join us in contributing to the open-source community.

A partial list of acknowledgements follows:
Below is a partial list of acknowledgements:

[Apache Flink](https://github.com/apache/flink)

@@ -124,13 +126,13 @@ A partial list of acknowledgements follows:

[SpringBoot]()

Thanks to [JetBrains](https://www.jetbrains.com/?from=dlink) for sponsoring a free open source license.
Thanks to [JetBrains](https://www.jetbrains.com/?from=dlink) for providing a free open-source license.

[![JetBrains](https://raw.githubusercontent.com/DataLinkDC/dinky/dev/images/main/jetbrains.svg)](https://www.jetbrains.com/?from=dlink)

## Get Help

1.Create an issue and describe it clearly.
1. Create an issue and provide a clear description.

2. Visit the [official website](http://www.dlink.top/#/) to read the latest documentation.

@@ -24,6 +24,7 @@
import org.dinky.data.enums.ProcessStepType;
import org.dinky.data.enums.ProcessType;
import org.dinky.data.enums.SseTopic;
import org.dinky.data.enums.Status;
import org.dinky.data.exception.BusException;
import org.dinky.data.exception.DinkyException;
import org.dinky.data.model.ProcessEntity;
@@ -75,6 +76,9 @@ public List<ProcessEntity> list() {
}

public ProcessEntity getProcess(String processName) {
if (logPross.containsKey(processName)) {
return logPross.get(processName);
}
try {
String filePath = String.format("%s/tmp/log/%s.json", System.getProperty("user.dir"), processName);
String string = FileUtil.readString(filePath, StandardCharsets.UTF_8);
@@ -120,14 +124,14 @@ public void appendLog(String processName, String stepPid, String log) {
*/
public void registerProcess(ProcessType type, String processName) throws RuntimeException {
if (logPross.containsKey(processName)) {
throw new BusException("Another user is running an action to suppress this request");
throw new BusException(Status.PROCESS_REGISTER_EXITS);
}
ProcessEntity entity = ProcessEntity.builder()
.key(UUID.fastUUID().toString())
.log(new StringBuilder())
.status(ProcessStatus.INITIALIZING)
.type(type)
.title(processName)
.title(type.getValue())
.startTime(LocalDateTime.now())
.children(new CopyOnWriteArrayList<>())
.build();
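For context, here is a minimal, self-contained sketch of the lookup pattern this hunk introduces: serve a still-registered process from the in-memory map first, and fall back to the JSON file that finished processes are flushed to. The ProcessEntity stub and the JSONUtil parse step are assumptions for illustration, not Dinky's actual classes.

```java
import java.nio.charset.StandardCharsets;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import cn.hutool.core.io.FileUtil;
import cn.hutool.json.JSONUtil;

public class ProcessLookupSketch {

    // Stand-in for Dinky's ProcessEntity; the real class carries much more state.
    public static class ProcessEntity {
        private String title;

        public String getTitle() { return title; }

        public void setTitle(String title) { this.title = title; }
    }

    // Live processes keyed by process name, mirroring the logPross map in the diff.
    private final Map<String, ProcessEntity> logPross = new ConcurrentHashMap<>();

    public ProcessEntity getProcess(String processName) {
        // Fast path: a process that is still registered is served from memory.
        ProcessEntity live = logPross.get(processName);
        if (live != null) {
            return live;
        }
        // Slow path: finished processes were flushed to tmp/log/<name>.json.
        String filePath = String.format(
                "%s/tmp/log/%s.json", System.getProperty("user.dir"), processName);
        String json = FileUtil.readString(filePath, StandardCharsets.UTF_8);
        return JSONUtil.toBean(json, ProcessEntity.class);
    }
}
```

The memory-first check is what lets a running process report live status while older runs remain retrievable from disk.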
@@ -67,13 +67,13 @@ public static SseEmitter connectSession(String sessionKey) {
log.warn("Session key already exists: {}", sessionKey);
closeSse(sessionKey);
}
SseEmitter sseEmitter = new SseEmitter(60 * 1000L);
SseEmitter sseEmitter = new SseEmitter(60 * 1000L * 10);
sseEmitter.onError(err -> onError(sessionKey, err));
sseEmitter.onTimeout(() -> onTimeout(sessionKey));
sseEmitter.onCompletion(() -> onCompletion(sessionKey));
try {
// Set the client reconnection interval, 0 to reconnect immediately
sseEmitter.send(SseEmitter.event().reconnectTime(0));
sseEmitter.send(SseEmitter.event().reconnectTime(1000));
} catch (IOException e) {
throw new RuntimeException(e);
}
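The hunk above raises the emitter timeout from one minute to ten (60 * 1000L * 10) and tells clients to wait one second before reconnecting instead of reconnecting immediately. Below is a minimal Spring MVC sketch of the same configuration; the endpoint path and log statements are illustrative only.

```java
import java.io.IOException;

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.servlet.mvc.method.annotation.SseEmitter;

@RestController
public class SseDemoController {

    @GetMapping("/sse/connect")
    public SseEmitter connect(@RequestParam String sessionKey) throws IOException {
        // 10-minute server-side timeout, matching the diff (60 * 1000L * 10).
        SseEmitter emitter = new SseEmitter(60 * 1000L * 10);
        emitter.onError(err -> System.err.println("sse error: " + sessionKey));
        emitter.onTimeout(() -> System.out.println("sse timeout: " + sessionKey));
        emitter.onCompletion(() -> System.out.println("sse done: " + sessionKey));
        // Ask the browser's EventSource to wait 1s before reconnecting,
        // so a dropped connection does not retry in a tight loop.
        emitter.send(SseEmitter.event().reconnectTime(1000));
        return emitter;
    }
}
```

A non-zero reconnect interval trades a second of latency after a drop for far fewer redundant reconnect attempts.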
@@ -21,11 +21,14 @@

import org.dinky.data.model.History;
import org.dinky.data.result.ProTableResult;
import org.dinky.data.result.Result;
import org.dinky.service.HistoryService;

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

import com.fasterxml.jackson.databind.JsonNode;
@@ -62,4 +65,14 @@ public class HistoryController {
public ProTableResult<History> listHistory(@RequestBody JsonNode para) {
return historyService.selectForProTable(para);
}

/**
* Get all information of the Job instance
*/
@GetMapping("/getLatestHistoryById")
@ApiOperation("Get latest history info by id")
@ApiImplicitParam(name = "id", value = "task id", dataType = "Integer", paramType = "query", required = true)
public Result<History> getLatestHistoryById(@RequestParam Integer id) {
return Result.succeed(historyService.getLatestHistoryById(id));
}
}
@@ -37,4 +37,12 @@ public interface HistoryService extends ISuperService<History> {
*/
@Deprecated
boolean removeHistoryById(Integer id);

/**
* Get latest history info by task id.
*
* @param id The ID of the task.
* @return History info.
*/
History getLatestHistoryById(Integer id);
}
@@ -27,6 +27,8 @@

import org.springframework.stereotype.Service;

import com.baomidou.mybatisplus.core.conditions.query.QueryWrapper;

/**
* HistoryServiceImpl
*
@@ -43,4 +45,12 @@ public boolean removeHistoryById(Integer id) {
}
return removeById(id);
}

@Override
public History getLatestHistoryById(Integer id) {
return baseMapper.selectOne(new QueryWrapper<History>()
.eq("task_id", id)
.orderByDesc("start_time")
.last("limit 1"));
}
}
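The QueryWrapper above renders to roughly `SELECT * FROM <history table> WHERE task_id = ? ORDER BY start_time DESC LIMIT 1` (table name illustrative). Note that `.last(...)` appends its argument as raw SQL, so the `limit 1` clause is dialect-specific. A hedged sketch of the equivalent lambda-style query inside the same ServiceImpl, assuming History exposes getTaskId() and getStartTime() getters:

```java
import com.baomidou.mybatisplus.core.conditions.query.LambdaQueryWrapper;

// Same query with type-safe column references instead of string column names,
// so a renamed property fails at compile time rather than at runtime.
@Override
public History getLatestHistoryById(Integer id) {
    return baseMapper.selectOne(new LambdaQueryWrapper<History>()
            .eq(History::getTaskId, id)
            .orderByDesc(History::getStartTime)
            .last("limit 1"));
}
```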
@@ -88,7 +88,7 @@ private IResult executeMSFlinkSql(StudioMetaStoreDTO studioMetaStoreDTO) {

@Override
public JdbcSelectResult getCommonSqlData(Integer taskId) {
return ResultPool.getCommonSqlCache(taskId);
return (JdbcSelectResult) ResultPool.getCommonSqlCache(taskId);
}

@Override
@@ -160,8 +160,6 @@ private String[] buildParams(int id) {
.url(dsProperties.getUrl())
.username(dsProperties.getUsername())
.password(dsProperties.getPassword())
.dinkyAddr(SystemConfiguration.getInstances().getDinkyAddr().getValue())
.split(SystemConfiguration.getInstances().getSqlSeparator())
.build();
String encodeParam = Base64.getEncoder()
.encodeToString(JsonUtils.toJsonString(appParamConfig).getBytes());
@@ -179,7 +177,7 @@ public void preCheckTask(TaskDTO task) throws TaskNotDoneException, SqlExplainEx
&& task.getJobInstanceId() > 0) {
JobInstance jobInstance = jobInstanceService.getById(task.getJobInstanceId());
if (jobInstance != null && !JobStatus.isDone(jobInstance.getStatus())) {
throw new TaskNotDoneException(Status.TASK_STATUS_IS_NOT_DONE.getMessage());
throw new BusException(Status.TASK_STATUS_IS_NOT_DONE.getMessage());
}
}

@@ -327,6 +325,9 @@ public JobResult restartTask(Integer id, String savePointPath) throws Exception

@Override
public boolean cancelTaskJob(TaskDTO task) {
if (Dialect.isCommonSql(task.getDialect())) {
return true;
}
JobInstance jobInstance = jobInstanceService.getById(task.getJobInstanceId());
Assert.notNull(jobInstance, Status.JOB_INSTANCE_NOT_EXIST.getMessage());
ClusterInstance clusterInstance = clusterInstanceService.getById(jobInstance.getClusterId());
@@ -23,15 +23,13 @@
import org.dinky.data.annotation.SupportDialect;
import org.dinky.data.dto.SqlDTO;
import org.dinky.data.dto.TaskDTO;
import org.dinky.data.result.ResultPool;
import org.dinky.data.result.SqlExplainResult;
import org.dinky.job.JobResult;
import org.dinky.service.DataBaseService;

import java.util.List;
import java.util.concurrent.TimeUnit;

import cn.hutool.cache.Cache;
import cn.hutool.cache.impl.TimedCache;
import cn.hutool.extra.spring.SpringUtil;
import lombok.extern.slf4j.Slf4j;

@@ -50,8 +48,6 @@
Dialect.PRESTO
})
public class CommonSqlTask extends BaseTask {
private static final Cache<Integer, JobResult> COMMON_SQL_SEARCH_CACHE =
new TimedCache<>(TimeUnit.MINUTES.toMillis(10));

public CommonSqlTask(TaskDTO task) {
super(task);
@@ -68,7 +64,9 @@ public JobResult execute() {
log.info("Preparing to execute common sql...");
SqlDTO sqlDTO = SqlDTO.build(task.getStatement(), task.getDatabaseId(), null);
DataBaseService dataBaseService = SpringUtil.getBean(DataBaseService.class);
return COMMON_SQL_SEARCH_CACHE.get(task.getId(), () -> dataBaseService.executeCommonSql(sqlDTO));
JobResult jobResult = dataBaseService.executeCommonSql(sqlDTO);
ResultPool.putCommonSqlCache(task.getId(), jobResult.getResult());
return jobResult;
}

@Override
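With the per-task TimedCache removed, every call to execute() now runs the SQL and publishes the result to the shared ResultPool, which getCommonSqlData in StudioServiceImpl later reads back (casting to JdbcSelectResult, as shown earlier in this commit). A minimal sketch of what such a pool could look like; the ten-minute expiry and the Object value type are assumptions for illustration, not Dinky's actual ResultPool API.

```java
import java.util.concurrent.TimeUnit;

import cn.hutool.cache.Cache;
import cn.hutool.cache.impl.TimedCache;

// Hypothetical stand-in for Dinky's ResultPool: a process-wide cache of
// common-SQL results keyed by task id, expiring entries after 10 minutes.
final class ResultPoolSketch {

    private static final Cache<Integer, Object> COMMON_SQL_CACHE =
            new TimedCache<>(TimeUnit.MINUTES.toMillis(10));

    static void putCommonSqlCache(Integer taskId, Object result) {
        COMMON_SQL_CACHE.put(taskId, result);
    }

    static Object getCommonSqlCache(Integer taskId) {
        // Returns null once the entry has expired or was never stored.
        return COMMON_SQL_CACHE.get(taskId);
    }
}
```

Moving the cache behind a shared pool means the task no longer decides whether a cached result is fresh; readers simply get the latest stored result for the task id.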
2 changes: 1 addition & 1 deletion dinky-admin/src/main/java/org/dinky/utils/MavenUtil.java
@@ -52,7 +52,7 @@
@Slf4j
public class MavenUtil {
static final String javaExecutor = FileUtil.file(
FileUtil.file(SystemUtil.getJavaRuntimeInfo().getHomeDir()).getParentFile(), "/bin/java")
FileUtil.file(SystemUtil.getJavaRuntimeInfo().getHomeDir()), "/bin/java")
.getAbsolutePath();
private static final String EXECTOR = SystemUtil.getOsInfo().isWindows() ? "mvn.cmd" : "mvn";

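The fix above removes a stray getParentFile() call: Hutool's SystemUtil.getJavaRuntimeInfo().getHomeDir() already returns the runtime home (the java.home property), so the launcher sits directly under <home>/bin/java. A minimal sketch of the corrected resolution; Windows .exe suffix handling is omitted.

```java
import java.io.File;

import cn.hutool.core.io.FileUtil;
import cn.hutool.system.SystemUtil;

public class JavaExecutorPath {

    public static void main(String[] args) {
        // java.home, e.g. /usr/lib/jvm/java-8-openjdk/jre on JDK 8.
        String home = SystemUtil.getJavaRuntimeInfo().getHomeDir();
        // <home>/bin/java -- resolving from the parent directory instead
        // breaks on JDK 9+ layouts, where java.home is already the
        // installation root and the parent has no bin/java.
        File java = FileUtil.file(FileUtil.file(home), "bin/java");
        System.out.println(java.getAbsolutePath());
    }
}
```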
@@ -1,20 +1,4 @@
<?xml version="1.0" encoding="UTF-8"?>
<!--
~ Licensed to the Apache Software Foundation (ASF) under one or more
~ contributor license agreements. See the NOTICE file distributed with
~ this work for additional information regarding copyright ownership.
~ The ASF licenses this file to You under the Apache License, Version 2.0
~ (the "License"); you may not use this file except in compliance with
~ the License. You may obtain a copy of the License at
~
~ http://www.apache.org/licenses/LICENSE-2.0
~
~ Unless required by applicable law or agreed to in writing, software
~ distributed under the License is distributed on an "AS IS" BASIS,
~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
~ See the License for the specific language governing permissions and
~ limitations under the License.
-->
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
@@ -24,11 +8,11 @@
<version>${revision}</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>dinky-app-1.13</artifactId>
<artifactId>dinky-app-1.18</artifactId>

<packaging>jar</packaging>

<name>Dinky : App 1.13</name>
<name>Dinky : App 1.18</name>

<properties>
<mainClass>org.dinky.app.MainApp</mainClass>
@@ -39,26 +23,15 @@
<groupId>org.dinky</groupId>
<artifactId>dinky-app-base</artifactId>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
</dependency>
<dependency>
<groupId>org.dinky</groupId>
<artifactId>dinky-client-1.13</artifactId>
</dependency>
<dependency>
<groupId>org.dinky</groupId>
<artifactId>dinky-flink-1.13</artifactId>
<artifactId>dinky-client-${dinky.flink.version}</artifactId>
<scope>${scope.runtime}</scope>
</dependency>
<dependency>
<groupId>org.dinky</groupId>
<artifactId>dinky-client-base</artifactId>
</dependency>
<dependency>
<groupId>org.dinky</groupId>
<artifactId>dinky-executor</artifactId>
<artifactId>dinky-flink-${dinky.flink.version}</artifactId>
<scope>${scope.runtime}</scope>
</dependency>
</dependencies>
