Fix groups api calls #61

Draft · wants to merge 71 commits into base: develop

Commits (71)
a516217
Added some build variations for M1 chip Mac
lukeswindale Feb 14, 2023
bffa555
Added a script which will setup and run a test of the RC
lukeswindale Mar 1, 2023
ceba4a3
Added the ability to provide some pre-requisite sql, which runs AFTER…
lukeswindale Apr 13, 2023
f63d18c
Reverting unintended commits
lukeswindale Apr 13, 2023
42d29ce
Fix groups api calls
edeati Apr 27, 2023
d50db64
Azure pipelines build file
edeati Apr 28, 2023
162b040
Set up CI with Azure Pipelines
edeati Apr 28, 2023
82c8dcf
Update azure-pipelines.yml for Azure Pipelines
edeati Apr 28, 2023
5705929
Push to acr
edeati Apr 28, 2023
bc51061
Add variable for the acr repo
edeati Apr 28, 2023
af2be2b
Revert "Reverting unintended commits"
lukeswindale Apr 28, 2023
92a61c4
Revert "Added the ability to provide some pre-requisite sql, which ru…
lukeswindale Apr 28, 2023
a9d213f
Revert "Added a script which will setup and run a test of the RC"
lukeswindale Apr 28, 2023
e8bb6de
Revert "Added some build variations for M1 chip Mac"
lukeswindale Apr 28, 2023
4e4d12d
-DalsoPush doesn't work
edeati Apr 28, 2023
ff6a174
Add docker login step
edeati Apr 28, 2023
b4d7e5b
Make a variable for the acr login too
edeati Apr 28, 2023
19aa5c5
Repository property is dockerfile.repository as per https://github.co…
edeati May 3, 2023
18279d3
docker.image.prefix is used for repository definition by the pom
edeati May 3, 2023
0517e9c
Support functions with sql statements
senjo-aehrc Jun 14, 2023
6f84a1e
Added support for pre-requisites sql
senjo-aehrc Jun 16, 2023
1079e09
Added support for pre-requisites sql
senjo-aehrc Jun 19, 2023
30cc621
Include support-stored-proc-file branch
edeati Jun 19, 2023
535545e
Include support-stored-proc-file branch
edeati Jun 19, 2023
d1c4eb0
Fix typo
senjo-aehrc Jun 19, 2023
a248e81
Fix regex compatibility
senjo-aehrc Jun 20, 2023
fc89311
Merge pull request #6 from aehrc/feature-support-functions
lukeswindale Jun 22, 2023
3f7c26f
Merge pull request #7 from aehrc/support-stored-proc-file
lukeswindale Jun 22, 2023
83a5d1c
Increasing the column size
senjo-aehrc Jun 26, 2023
01a555e
Provided filename mappings for currently unsupported files
lukeswindale Jun 27, 2023
deaa14b
Provided filename mappings for integer simple map refset
lukeswindale Jun 27, 2023
6384e06
Provided filename mappings for ccs and cci type refsets
lukeswindale Jun 27, 2023
a4842d8
Added the create table scripts for the new tables
lukeswindale Jun 27, 2023
7198ddc
Check mysql engine exists before appending
senjo-aehrc Jun 28, 2023
715fee9
Added crefset
lukeswindale Jun 28, 2023
caeb8c4
Merge branch 'support-stored-proc-file' of https://github.com/aehrc/r…
lukeswindale Jun 28, 2023
78ddbba
Ignore the failing tests (Need to come back)
senjo-aehrc Jun 28, 2023
f155f20
Merge pull request #8 from IHTSDO/master
edeati Jul 4, 2023
c5d19b4
Fix the failing test
senjo-aehrc Jul 4, 2023
6d56fc5
Merge pull request #9 from aehrc/support-stored-proc-file
lukeswindale Jul 4, 2023
5d18e1a
Merge branch 'master-ihtsdo'
lukeswindale Jul 24, 2023
9de6b0b
Merge branch 'master' into groups-api-fix
lukeswindale Jul 24, 2023
e2ebc56
Increased the term size for descriptions
lukeswindale Jul 31, 2023
e6f3684
Increased the term size for descriptions to 2048
lukeswindale Aug 1, 2023
32e63a6
reduced the varchar max size from 2048 to 1000
lukeswindale Aug 2, 2023
64b9314
reduced the varchar max size from 1000 to 333 (1000 bytes)
lukeswindale Aug 2, 2023
0d96d7c
if term is greater than the max length (333) replace with hash
lukeswindale Aug 2, 2023
9865b98
if term is greater than the max length (333) replace with hash
lukeswindale Aug 2, 2023
2028a1f
Fixed the conditional hash function
lukeswindale Aug 2, 2023
5197d6e
reverting my truncation handler in the load scripts. will solve anoth…
lukeswindale Aug 3, 2023
4bb698c
Fixing the text definition column length
lukeswindale Aug 10, 2023
766c3a5
doubled the max-file-size and max-request-size
lukeswindale Aug 12, 2023
fceb2ca
Merge branch 'IHTSDO:master' into master
dionmcm Sep 13, 2023
ca9d8f2
Merging from SI into our code
lukeswindale Sep 14, 2023
f320c99
test fix
dionmcm Sep 14, 2023
6a8890a
Fixed merge error
lukeswindale Sep 14, 2023
ae79644
fix java version
dionmcm Sep 14, 2023
5dcf9a7
Merge remote-tracking branch 'origin/groups-api-fix' into groups-api-fix
dionmcm Sep 14, 2023
e6e9476
updated runtime docker image to support jdk17
dionmcm Sep 21, 2023
013f1d3
shot at fixing over-replacement of the schema/table name
dionmcm Sep 22, 2023
899c5e9
added logging for the config map
dionmcm Sep 22, 2023
4df5a52
Fixing up the config properties in the code merged from SI
lukeswindale Oct 9, 2023
f75454f
Merge branch 'master-ihtsdo'
lukeswindale Oct 10, 2023
60311ac
Merge branch 'master' into groups-api-fix
lukeswindale Oct 10, 2023
d1159f0
Correcting droolsRulesGroups property so it's consistent
lukeswindale Oct 10, 2023
e02b9cc
Revert "Correcting droolsRulesGroups property so it's consistent"
lukeswindale Oct 11, 2023
24b6658
update ALL term varchar to be 4096
MattCordell May 8, 2024
f55d094
Add OWASP plugin to pom
MattCordell May 9, 2024
9643b9a
Revert "update ALL term varchar to be 4096"
MattCordell May 23, 2024
882022c
Revert "Add OWASP plugin to pom"
MattCordell May 24, 2024
8078e43
Add owasp back (again...) Won't run without...
MattCordell May 24, 2024
2 changes: 1 addition & 1 deletion Dockerfile
@@ -3,7 +3,7 @@ COPY . /usr/src/app
WORKDIR /usr/src/app
RUN mvn clean install -DskipTests=true

FROM adoptopenjdk/openjdk11:alpine
FROM aehrc/jre:openjdk-17
LABEL maintainer="SNOMED International <[email protected]>"

ARG SUID=1042
48 changes: 48 additions & 0 deletions azure-pipelines.yml
@@ -0,0 +1,48 @@
name: rvf-$(Date:yyyyMMdd)$(Rev:.r)

trigger:
  branches:
    include:
      - 'groups-api-fix'
      - 'support-stored-proc-file'
pool:
  vmImage: 'ubuntu-latest'

variables:
  mavenCache: $(Pipeline.Workspace)/.m2/repository
  mavenOptions: '-Dmaven.repo.local=$(mavenCache)'

stages:
  - stage: build
    displayName: Build
    jobs:
      - job: build
        displayName: Build
        steps:
          - task: Cache@2
            displayName: Cache Maven local repo
            inputs:
              key: 'maven | "$(Agent.OS)" | **/pom.xml'
              restoreKeys: |
                maven | "$(Agent.OS)"
                maven
              path: $(mavenCache)
          - task: MavenAuthenticate@0
            inputs:
              artifactsFeeds: 'mavenbuild'
          - task: Docker@2
            inputs:
              containerRegistry: $(acr_registry)
              command: 'login'
          - task: Maven@3
            displayName: Build and verify
            inputs:
              mavenPomFile: 'pom.xml'
              options: '-Ddocker.image.prefix=$(acr_repo)'
              mavenOptions: '$(mavenOptions)'
              javaHomeOption: 'JDKVersion'
              jdkVersionOption: '1.17'
              jdkArchitectureOption: 'x64'
              publishJUnitResults: true
              testResultsFiles: '**/surefire-reports/TEST-*.xml'
              goals: 'clean install dockerfile:build dockerfile:push'
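
For context: the dockerfile:build and dockerfile:push goals invoke the dockerfile plugin configured in the pom, -Ddocker.image.prefix=$(acr_repo) supplies the repository prefix the pom uses when naming the image (see the "docker.image.prefix is used for repository definition by the pom" commit), and the preceding Docker@2 login against $(acr_registry) is what lets the push to ACR succeed.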
6 changes: 3 additions & 3 deletions docker-compose.yml
@@ -7,7 +7,7 @@ volumes:

services:
db:
image: mysql:5.7
image: mysql:8.0.28
restart: always
environment:
- MYSQL_ROOT_PASSWORD=snomed
@@ -21,7 +21,7 @@ services:
volumes:
- mysql:/var/lib/mysql
command:
mysqld --sql_mode="NO_ENGINE_SUBSTITUTION,STRICT_TRANS_TABLES" --lower_case_table_names=1
mysqld --local-infile=ON --sql_mode="NO_ENGINE_SUBSTITUTION,STRICT_TRANS_TABLES" --lower_case_table_names=1
Review comment (Member): Good idea

rvf:
image: snomedinternational/release-validation-framework:latest
container_name: rvf
@@ -31,7 +31,7 @@
ports:
- 8081:8081
environment:
- SPRING_DATASOURCE_URL=jdbc:mysql://db:3306/?useSSL=false
- SPRING_DATASOURCE_URL=jdbc:mysql://db:3306/?useSSL=false&allowLoadLocalInfile=true&allowPublicKeyRetrieval=true
- SPRING_DATASOURCE_USERNAME=root
- SPRING_DATASOURCE_PASSWORD=snomed
- rvf.assertion.resource.local.path=./snomed-release-validation-assertions/
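
The new mysqld flag and the new JDBC URL parameters work as a pair: MySQL 8 ships with local_infile disabled on the server side, and Connector/J will not send local files unless the URL opts in, so LOAD DATA LOCAL INFILE only succeeds when both --local-infile=ON and allowLoadLocalInfile=true are present (allowPublicKeyRetrieval=true avoids the public-key error MySQL 8's default caching_sha2_password auth raises over non-SSL connections). A minimal sketch, assuming a hypothetical rvf_master schema, concept_s table and file path:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class LocalInfileCheck {
    public static void main(String[] args) throws Exception {
        // Both ends must opt in: the server via --local-infile=ON (compose command above),
        // the client via allowLoadLocalInfile=true on the JDBC URL.
        String url = "jdbc:mysql://db:3306/rvf_master?useSSL=false"
                + "&allowLoadLocalInfile=true&allowPublicKeyRetrieval=true";
        try (Connection con = DriverManager.getConnection(url, "root", "snomed");
             Statement st = con.createStatement()) {
            st.execute("LOAD DATA LOCAL INFILE '/tmp/sct2_Concept_Snapshot.txt' "
                    + "INTO TABLE concept_s LINES TERMINATED BY '\\n' IGNORE 1 LINES");
        }
    }
}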
12 changes: 12 additions & 0 deletions pom.xml
@@ -351,6 +351,18 @@
</execution>
</executions>
</plugin>

<plugin>
<groupId>org.owasp</groupId>
<artifactId>dependency-check-maven</artifactId>
<version>8.3.1</version>
<executions>
<execution>
<phase>none</phase>
</execution>
</executions>
</plugin>

</plugins>
</build>
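
Binding the plugin's single execution to phase none keeps the OWASP scan out of the regular build lifecycle; with this declaration in place it should only run when invoked explicitly, for example with mvn org.owasp:dependency-check-maven:check.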

@@ -18,14 +18,14 @@ public class ExecutionCommand {
@GeneratedValue
Long id;

@Column(columnDefinition = "text")
@Column(columnDefinition = "longtext")
String template = null;
@JsonBackReference
@OneToOne(fetch = FetchType.EAGER, mappedBy = "command")
Test test;
@ElementCollection(fetch = FetchType.EAGER)
@CollectionTable(name="command_statements", joinColumns=@JoinColumn(name="command_id"))
@Column(name="statement", columnDefinition = "text")
@Column(name="statement", columnDefinition = "longtext")
@OrderColumn(name="statement_index")
List<String> statements = new ArrayList<>();

@@ -1,8 +1,22 @@
package org.ihtsdo.rvf.core.service;

import java.sql.*;
import java.util.Map;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.regex.Pattern;
import java.util.stream.Collectors;

import javax.annotation.Resource;
import javax.naming.ConfigurationException;
import javax.naming.ConfigurationException;
import org.apache.commons.dbcp.BasicDataSource;
import org.ihtsdo.rvf.core.data.model.*;
import org.ihtsdo.rvf.core.service.config.MysqlExecutionConfig;
import org.ihtsdo.rvf.core.service.util.MySqlQueryTransformer;
import org.ihtsdo.rvf.importer.AssertionGroupImporter.ProductName;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@@ -208,7 +222,7 @@ private void executeCommand(final Assertion assertion, final MysqlExecutionConfi
else {
if (sqlStatement.startsWith("create table")){
// only add engine if we do not create using a like statement
if (!(sqlStatement.contains("like") || sqlStatement.contains("as"))) {
if (!sqlStatement.toUpperCase().contains(" ENGINE") && !(sqlStatement.contains("like") || sqlStatement.contains("as"))) {
sqlStatement = sqlStatement + " ENGINE = MyISAM";
}
}
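
A quick sketch of how the tightened guard behaves (statement and table names below are made up):

// Hypothetical statements illustrating the extra " ENGINE" check above.
public class EngineGuardExamples {
    public static void main(String[] args) {
        check("create table concept_d (id bigint not null)");        // gets " ENGINE = MyISAM" appended
        check("create table concept_d like concept_s");              // skipped: contains "like"
        check("create table concept_d as select * from concept_s");  // skipped: contains "as"
        check("create table concept_d (id bigint) ENGINE = InnoDB"); // skipped now: already names an ENGINE
    }

    static void check(String sqlStatement) {
        if (sqlStatement.startsWith("create table")
                && !sqlStatement.toUpperCase().contains(" ENGINE")
                && !(sqlStatement.contains("like") || sqlStatement.contains("as"))) {
            sqlStatement = sqlStatement + " ENGINE = MyISAM";
        }
        System.out.println(sqlStatement);
    }
}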
@@ -218,59 +232,10 @@ private void executeCommand(final Assertion assertion, final MysqlExecutionConfi
}

private List<String> transformSql(String[] parts, Assertion assertion, MysqlExecutionConfig config) throws ConfigurationException {
List<String> result = new ArrayList<>();
String defaultCatalog = dataSource.getDefaultCatalog();
String prospectiveSchema = config.getProspectiveVersion();
final String[] nameParts = config.getProspectiveVersion().split("_");
String defaultModuleId = StringUtils.hasLength(config.getDefaultModuleId()) ? config.getDefaultModuleId() : (nameParts.length >= 2 ? ProductName.toModuleId(nameParts[1]) : "NOT_SUPPLIED");
String includedModules = config.getIncludedModules().stream().collect(Collectors.joining(","));
String version = (nameParts.length >= 3 ? nameParts[2] : "NOT_SUPPLIED");

String previousReleaseSchema = config.getPreviousVersion();
String dependencyReleaseSchema = config.getExtensionDependencyVersion();

//We need both these schemas to exist
if (prospectiveSchema == null) {
throw new ConfigurationException (FAILED_TO_FIND_RVF_DB_SCHEMA + prospectiveSchema);
}

if (config.isReleaseValidation() && !config.isFirstTimeRelease() && previousReleaseSchema == null) {
throw new ConfigurationException (FAILED_TO_FIND_RVF_DB_SCHEMA + previousReleaseSchema);
}
for( String part : parts) {
if ((part.contains("<PREVIOUS>") && previousReleaseSchema == null)
|| (part.contains("<DEPENDENCY>") && dependencyReleaseSchema == null)) {
continue;
}

logger.debug("Original sql statement: {}", part);
// remove all SQL comments - //TODO might throw errors for -- style comments
final Pattern commentPattern = Pattern.compile("/\\*.*?\\*/", Pattern.DOTALL);
part = commentPattern.matcher(part).replaceAll("");
// replace all substitutions for exec
part = part.replaceAll("<RUNID>", String.valueOf(config.getExecutionId()));
part = part.replaceAll("<ASSERTIONUUID>", String.valueOf(assertion.getAssertionId()));
part = part.replaceAll("<MODULEID>", defaultModuleId);
part = part.replaceAll("<MODULEIDS>", includedModules);
part = part.replaceAll("<VERSION>", version);
// watch out for any 's that users might have introduced
part = part.replaceAll("qa_result", defaultCatalog+ "." + qaResulTableName);
part = part.replaceAll("<PROSPECTIVE>", prospectiveSchema);
part = part.replaceAll("<TEMP>", prospectiveSchema);
if (previousReleaseSchema != null) {
part = part.replaceAll("<PREVIOUS>", previousReleaseSchema);
}
if (dependencyReleaseSchema != null) {
part = part.replaceAll("<DEPENDENCY>", dependencyReleaseSchema);
}
part = part.replaceAll("<DELTA>", deltaTableSuffix);
part = part.replaceAll("<SNAPSHOT>", snapshotTableSuffix);
part = part.replaceAll("<FULL>", fullTableSuffix);
part.trim();
logger.debug("Transformed sql statement: {}", part);
result.add(part);
}
return result;
String qaResult = dataSource.getDefaultCatalog()+ "." + qaResulTableName;
MySqlQueryTransformer queryTransformer = new MySqlQueryTransformer();
Map configMap = Map.of("qa_result", qaResult, "<ASSERTIONUUID>", String.valueOf(assertion.getAssertionId()));
return queryTransformer.transformSql(parts, config, configMap);
}
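
For reference, a minimal sketch of the new call path (the schema name and assertion id are made up; sqlParts and executionConfig are assumed to be in scope as a String[] and a MysqlExecutionConfig):

// Mirrors the delegation above: per-run substitutions now travel in a config map.
String qaResult = "rvf_master.qa_result";   // defaultCatalog + "." + qaResulTableName (assumed schema name)
Map<String, String> configMap = Map.of(
        "qa_result", qaResult,
        "<ASSERTIONUUID>", "12345");        // assertion id as a string
List<String> transformed = new MySqlQueryTransformer().transformSql(sqlParts, executionConfig, configMap);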


@@ -0,0 +1,129 @@
package org.ihtsdo.rvf.core.service.util;

import com.facebook.presto.sql.parser.StatementSplitter;
import com.google.common.collect.ImmutableSet;
import org.apache.commons.dbcp.BasicDataSource;
import org.ihtsdo.otf.rest.exception.BusinessServiceException;
import org.ihtsdo.rvf.core.data.model.Assertion;
import org.ihtsdo.rvf.core.service.config.MysqlExecutionConfig;
import org.ihtsdo.rvf.importer.AssertionGroupImporter;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.util.StringUtils;

import javax.naming.ConfigurationException;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.regex.Pattern;
import java.util.stream.Collectors;

public class MySqlQueryTransformer {
private final Logger logger = LoggerFactory.getLogger(MySqlQueryTransformer.class);

private static final String FAILED_TO_FIND_RVF_DB_SCHEMA = "Failed to find rvf db schema for ";

private String deltaTableSuffix = "d";
private String snapshotTableSuffix = "s";
private String fullTableSuffix = "f";
private static final String DEFAULT_DELIMITER = ";";
private static final String DELIMITER_REGEX_PATTERN = "^[ ]*(delimiter|DELIMITER)";

public List<String> transformSql(String[] parts, MysqlExecutionConfig config, final Map<String, String> configMap)
throws ConfigurationException {

logger.info("Config Map contains " + configMap.entrySet().stream().map(e -> e.getKey() + " : " + e.getValue()).collect(Collectors.joining(",")));

List<String> result = new ArrayList<>();
String prospectiveSchema = config.getProspectiveVersion();
String previousReleaseSchema = config.getPreviousVersion();
String dependencyReleaseSchema = config.getExtensionDependencyVersion();

//We need both these schemas to exist
if (prospectiveSchema == null) {
throw new ConfigurationException (FAILED_TO_FIND_RVF_DB_SCHEMA + prospectiveSchema);
}

if (config.isReleaseValidation() && !config.isFirstTimeRelease() && previousReleaseSchema == null) {
throw new ConfigurationException (FAILED_TO_FIND_RVF_DB_SCHEMA + previousReleaseSchema);
}

final String[] nameParts = config.getProspectiveVersion().split("_");
String version = (nameParts.length >= 3 ? nameParts[2] : "NOT_SUPPLIED");
String includedModules = config.getIncludedModules().stream().collect(Collectors.joining(","));
String defaultModuleId = StringUtils.hasLength(config.getDefaultModuleId()) ? config.getDefaultModuleId() : (nameParts.length >= 2 ? AssertionGroupImporter.ProductName.toModuleId(nameParts[1]) : "NOT_SUPPLIED");
for( String part : parts) {
if ((part.contains("<PREVIOUS>") && previousReleaseSchema == null)
|| (part.contains("<DEPENDENCY>") && dependencyReleaseSchema == null)) {
continue;
}

logger.debug("Original sql statement: {}", part);
// remove all SQL comments - //TODO might throw errors for -- style comments
final Pattern commentPattern = Pattern.compile("/\\*.*?\\*/", Pattern.DOTALL);
part = commentPattern.matcher(part).replaceAll("");
// replace all substitutions for exec
part = part.replaceAll("<RUNID>", String.valueOf(config.getExecutionId()));
part = part.replaceAll("<ASSERTIONUUID>", configMap.get("<ASSERTIONUUID>"));
part = part.replaceAll("<MODULEID>", defaultModuleId);
part = part.replaceAll("<MODULEIDS>", includedModules);
part = part.replaceAll("<VERSION>", version);
// watch out for any 's that users might have introduced
part = part.replaceAll("qa_result", configMap.get("qa_result"));
part = part.replaceAll("<PROSPECTIVE>", prospectiveSchema);
part = part.replaceAll("<TEMP>", prospectiveSchema);
if (previousReleaseSchema != null) {
part = part.replaceAll("<PREVIOUS>", previousReleaseSchema);
}
if (dependencyReleaseSchema != null) {
part = part.replaceAll("<DEPENDENCY>", dependencyReleaseSchema);
}
part = part.replaceAll("<DELTA>", deltaTableSuffix);
part = part.replaceAll("<SNAPSHOT>", snapshotTableSuffix);
part = part.replaceAll("<FULL>", fullTableSuffix);
part = part.replaceAll(Pattern.quote("[[:<:]]"),"\\\\b" );
part = part.replaceAll(Pattern.quote("[[:>:]]"),"\\\\b" );
for(Map.Entry<String, String> configMapEntry: configMap.entrySet()){
if (configMapEntry.getKey().matches("^<[^>]+>$")) {
part = part.replaceAll(configMapEntry.getKey(), configMapEntry.getValue());
}
}
part.trim();
logger.debug("Transformed sql statement: {}", part);
result.add(part);
}
return result;
}
/**
* Convert given sql file content to multiple statements
* @param sqlFileContent
* @return
*/
public List<String> transformToStatements(String sqlFileContent) throws BusinessServiceException {
String delimiter = DEFAULT_DELIMITER;
List<String> result = new ArrayList<>();
String[] sqlChunks = sqlFileContent.trim().split(DELIMITER_REGEX_PATTERN, Pattern.MULTILINE);
for (int i = 0; i < sqlChunks.length; i++) {
String sqlChunk = sqlChunks[i].trim();
if (!sqlChunk.isEmpty()) {
if (i > 0) {
delimiter = sqlChunk.trim().replaceAll("(?s)^([^ \r\n]+).*$", "$1");
sqlChunk = sqlChunk.trim().replaceAll("(?s)^[^ \r\n]+(.*)$", "$1").trim();
}
if (!sqlChunk.isEmpty()) {
logger.debug("Executing pre-requisite SQL: " + sqlChunk);
final StatementSplitter splitter = new StatementSplitter(sqlChunk, ImmutableSet.of(delimiter));
if (splitter.getCompleteStatements() == null || splitter.getCompleteStatements().isEmpty()) {
String errorMsg = String.format("SQL statements not ending with %s %s",delimiter, sqlChunk);
logger.error( errorMsg);
throw new BusinessServiceException(errorMsg);
}
result= splitter.getCompleteStatements().stream().map(s -> s.statement()).collect(Collectors.toList());

}
}

}
return result;
}
}
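
A small usage sketch of transformToStatements for the simple case with the default ";" delimiter (the SQL content is hypothetical):

// Hypothetical pre-requisite SQL split into individual statements
// (throws BusinessServiceException if no statement ends with the active delimiter).
String sql = "drop table if exists temp_refset;\n"
        + "create table temp_refset (id bigint not null);";
List<String> statements = new MySqlQueryTransformer().transformToStatements(sql);
// Presto's StatementSplitter yields the two statements; a chunk introduced by a
// DELIMITER line is split with that custom delimiter instead of ";".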