[DPE-5324] Increase ruff rules #405

Merged: 8 commits, Oct 15, 2024
Changes from 6 commits
5 changes: 3 additions & 2 deletions lib/charms/pgbouncer_k8s/v0/pgb.py
@@ -32,7 +32,7 @@
LIBAPI = 0
# Increment this PATCH version before using `charmcraft publish-lib` or reset
# to 0 if you are raising the major API version
LIBPATCH = 9
LIBPATCH = 10

logger = logging.getLogger(__name__)

@@ -87,5 +87,6 @@

def get_hashed_password(username: str, password: str) -> str:
"""Creates an md5 hashed password for the given user, in the format postgresql expects."""
hash_password = md5((password + username).encode()).hexdigest()
# Should be handled in DPE-1430
hash_password = md5((password + username).encode()).hexdigest() # noqa: S324

Check failure: Code scanning / CodeQL

Use of a broken or weak cryptographic hashing algorithm on sensitive data (High)

Sensitive data (password) is used in a hashing algorithm (MD5) that is insecure for password hashing, since it is not a computationally expensive hash function.
Sensitive data (id) is used in a hashing algorithm (MD5) that is insecure.

Copilot Autofix AI 3 months ago

To fix the problem, we should replace the MD5 hashing algorithm with a stronger, modern cryptographic hash function. For password hashing, it is recommended to use algorithms like Argon2, bcrypt, or PBKDF2, which are designed to be computationally expensive and include a salt to protect against brute-force attacks.

The best way to fix this without changing existing functionality is to use the argon2-cffi library to hash the password. This library provides a secure and efficient way to hash passwords.

Steps to fix:

  1. Install the argon2-cffi library if it is not already installed.
  2. Import the PasswordHasher class from the argon2 module.
  3. Replace the MD5 hashing logic with the PasswordHasher class to hash the password.
Suggested changeset 2
lib/charms/pgbouncer_k8s/v0/pgb.py

Autofix patch
Run the following command in your local git repository to apply this patch
cat << 'EOF' | git apply
diff --git a/lib/charms/pgbouncer_k8s/v0/pgb.py b/lib/charms/pgbouncer_k8s/v0/pgb.py
--- a/lib/charms/pgbouncer_k8s/v0/pgb.py
+++ b/lib/charms/pgbouncer_k8s/v0/pgb.py
@@ -25,3 +25,3 @@
 import string
-from hashlib import md5
+from argon2 import PasswordHasher
 from typing import Dict
@@ -88,5 +88,5 @@
 def get_hashed_password(username: str, password: str) -> str:
-    """Creates an md5 hashed password for the given user, in the format postgresql expects."""
-    # Should be handled in DPE-1430
-    hash_password = md5((password + username).encode()).hexdigest()  # noqa: S324
-    return f"md5{hash_password}"
+    """Creates a hashed password for the given user using Argon2."""
+    ph = PasswordHasher()
+    hash_password = ph.hash(password + username)
+    return hash_password
EOF
pyproject.toml
Outside changed files

Autofix patch
Run the following command in your local git repository to apply this patch
cat << 'EOF' | git apply
diff --git a/pyproject.toml b/pyproject.toml
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -7,2 +7,3 @@
 [tool.poetry.dependencies]
+argon2-cffi = "23.1.0"
 python = "^3.10"
EOF
This fix introduces these dependencies
Package Version Security advisories
argon2-cffi (pypi) 23.1.0 None
Contributor Author

If only it were that easy.

Member

so funny how it intentionally removes the "in the format postgresql expects" part

return f"md5{hash_password}"
13 changes: 10 additions & 3 deletions pyproject.toml
@@ -97,7 +97,7 @@ pythonpath = [
# Formatting tools configuration
[tool.black]
line-length = 99
target-version = ["py38"]
target-version = ["py310"]

# Linting tools configuration
[tool.ruff]
@@ -109,7 +109,7 @@ line-length = 99

[tool.ruff.lint]
explicit-preview-rules = true
select = ["A", "E", "W", "F", "C", "N", "D", "I001", "CPY001"]
select = ["A", "E", "W", "F", "C", "N", "D", "I001", "B", "CPY", "RUF", "S", "SIM", "UP", "TCH"]
extend-ignore = [
"D203",
"D204",
@@ -128,12 +128,19 @@ extend-ignore = [
ignore = ["E501", "D107"]

[tool.ruff.lint.per-file-ignores]
"tests/*" = ["D100", "D101", "D102", "D103", "D104"]
"tests/*" = [
"D100", "D101", "D102", "D103", "D104",
# Asserts
"B011",
# Disable security checks for tests
"S",
]

[tool.ruff.lint.flake8-copyright]
# Check for properly formatted copyright header in each file
author = "Canonical Ltd."
notice-rgx = "Copyright\\s\\d{4}([-,]\\d{4})*\\s+"
min-file-size = 1
Contributor Author

Don't expect a copyright header for empty files.
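
For reference: a header of the form already used in this repository satisfies the configured notice-rgx, while min-file-size = 1 exempts zero-byte files (such as empty __init__.py modules) from the check. The regex also accepts year ranges or lists, e.g. "Copyright 2022-2024" or "Copyright 2022,2024". An illustrative compliant header:

# Copyright 2023 Canonical Ltd.
# See LICENSE file for licensing details.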


[tool.ruff.lint.mccabe]
max-complexity = 10
51 changes: 25 additions & 26 deletions src/charm.py
@@ -210,9 +210,7 @@ def _node_port(self, port_type: str) -> int:
# svc.spec.ports
# [ServicePort(port=6432, appProtocol=None, name=None, nodePort=31438, protocol='TCP', targetPort=6432)]
# Keeping this distinction, so we have similarity with mysql-router-k8s
if port_type == "rw":
port = self.config["listen_port"]
elif port_type == "ro":
if port_type == "rw" or port_type == "ro":
port = self.config["listen_port"]
else:
raise ValueError(f"Invalid {port_type=}")
@@ -361,7 +359,7 @@ def _init_config(self, container) -> bool:

self.render_pgb_config()
# Render the logrotate config
with open("templates/logrotate.j2", "r") as file:
with open("templates/logrotate.j2") as file:
template = Template(file.read())
container.push(
"/etc/logrotate.d/pgbouncer",
@@ -579,7 +577,7 @@ def reload_pgbouncer(self, restart: bool = False) -> None:
pgb_container = self.unit.get_container(PGB)
pebble_services = pgb_container.get_services()
for service in self._services:
if service["name"] not in pebble_services.keys():
if service["name"] not in pebble_services:
# pebble_ready event hasn't fired so pgbouncer has not been added to pebble config
raise ConnectionError
if restart or pebble_services[service["name"]].current != ServiceStatus.ACTIVE:
@@ -638,7 +636,7 @@ def check_pgb_running(self):
services.append(self._metrics_service)

for service in services:
if service not in pebble_services.keys():
if service not in pebble_services:
# pebble_ready event hasn't fired so pgbouncer layer has not been added to pebble
raise ConnectionError
pgb_service_status = pgb_container.get_services().get(service).current
@@ -799,25 +797,26 @@ def get_relation_databases(self) -> Dict[str, Dict[str, Union[str, bool]]]:
if "pgb_dbs_config" in self.peers.app_databag:
return json.loads(self.peers.app_databag["pgb_dbs_config"])
# Nothing set in the config peer data trying to regenerate based on old data in case of update.
elif not self.unit.is_leader():
if cfg := self.get_secret(APP_SCOPE, CFG_FILE_DATABAG_KEY):
try:
parser = ConfigParser()
parser.optionxform = str
parser.read_string(cfg)
old_cfg = dict(parser)
if databases := old_cfg.get("databases"):
databases.pop("DEFAULT", None)
result = {}
i = 1
for database in dict(databases):
if database.endswith("_standby") or database.endswith("_readonly"):
continue
result[str(i)] = {"name": database, "legacy": False}
i += 1
return result
except Exception:
logger.exception("Unable to parse legacy config format")
elif not self.unit.is_leader() and (
cfg := self.get_secret(APP_SCOPE, CFG_FILE_DATABAG_KEY)
):
try:
parser = ConfigParser()
parser.optionxform = str
parser.read_string(cfg)
old_cfg = dict(parser)
if databases := old_cfg.get("databases"):
databases.pop("DEFAULT", None)
result = {}
i = 1
for database in dict(databases):
if database.endswith("_standby") or database.endswith("_readonly"):
continue
result[str(i)] = {"name": database, "legacy": False}
i += 1
return result
except Exception:
logger.exception("Unable to parse legacy config format")
return {}

def generate_relation_databases(self) -> Dict[str, Dict[str, Union[str, bool]]]:
@@ -940,7 +939,7 @@ def render_pgb_config(self, reload_pgbouncer=False, restart=False) -> None:
min_pool_size = math.ceil(effective_db_connections / 4)
reserve_pool_size = math.ceil(effective_db_connections / 4)
service_ids = [service["id"] for service in self._services]
with open("templates/pgb_config.j2", "r") as file:
with open("templates/pgb_config.j2") as file:
template = Template(file.read())
databases = self._get_relation_config()
readonly_dbs = self._get_readonly_dbs(databases)
9 changes: 5 additions & 4 deletions src/constants.py
@@ -23,16 +23,17 @@

METRICS_PORT = 9127
PGB_LOG_DIR = "/var/log/pgbouncer"
MONITORING_PASSWORD_KEY = "monitoring_password"
AUTH_FILE_DATABAG_KEY = "auth_file"
CFG_FILE_DATABAG_KEY = "cfg_file"

EXTENSIONS_BLOCKING_MESSAGE = "bad relation request - remote app requested extensions, which are unsupported. Please remove this relation."
CONTAINER_UNAVAILABLE_MESSAGE = "PgBouncer container currently unavailable"

SECRET_LABEL = "secret"
SECRET_INTERNAL_LABEL = "internal-secret"
SECRET_DELETED_LABEL = "None"
# Labels are not confidential
SECRET_LABEL = "secret" # noqa: S105
MONITORING_PASSWORD_KEY = "monitoring_password" # noqa: S105
SECRET_INTERNAL_LABEL = "internal-secret" # noqa: S105
SECRET_DELETED_LABEL = "None" # noqa: S105
Comment on lines +32 to +36
Contributor Author

Security checks will inevitably produce some false positives, so we will have to noqa them. I propose adding a comment on top of noqa blocks, so that we explain why the check is disabled.


APP_SCOPE = "app"
UNIT_SCOPE = "unit"
9 changes: 6 additions & 3 deletions src/relations/backend_database.py
@@ -169,7 +169,8 @@ def auth_query(self) -> str:
"""Generate auth query."""
if not self.relation:
return ""
return f"SELECT username, password FROM {self.auth_user}.get_auth($1)"
# auth user is internally generated
return f"SELECT username, password FROM {self.auth_user}.get_auth($1)" # noqa: S608

@property
def postgres_databag(self) -> Dict:
@@ -415,7 +416,8 @@ def initialise_auth_function(self, dbs: List[str]):
psycopg2.Error if self.postgres isn't usable.
"""
logger.info("initialising auth function")
install_script = open("src/relations/sql/pgbouncer-install.sql", "r").read()
with open("src/relations/sql/pgbouncer-install.sql") as f:
install_script = f.read()

for dbname in dbs:
with self.postgres._connect_to_database(dbname) as conn, conn.cursor() as cursor:
@@ -436,7 +438,8 @@ def remove_auth_function(self, dbs: List[str]):
psycopg2.Error if self.postgres isn't usable.
"""
logger.info("removing auth function from backend relation")
uninstall_script = open("src/relations/sql/pgbouncer-uninstall.sql", "r").read()
with open("src/relations/sql/pgbouncer-uninstall.sql") as f:
uninstall_script = f.read()
for dbname in dbs:
with self.postgres._connect_to_database(dbname) as conn, conn.cursor() as cursor:
cursor.execute(uninstall_script.replace("auth_user", self.auth_user))
7 changes: 3 additions & 4 deletions src/relations/db.py
@@ -190,9 +190,8 @@ def _check_for_blocking_relations(self, relation_id: int) -> bool:
for relation in self.charm.model.relations.get(relname, []):
if relation.id == relation_id:
continue
for data in relation.data.values():
if not self._check_extensions(self._get_relation_extensions(relation)):
return
if not self._check_extensions(self._get_relation_extensions(relation)):
return
self.charm.unit.status = ActiveStatus()

def _on_relation_joined(self, join_event: RelationJoinedEvent):
@@ -516,6 +515,6 @@ def get_allowed_units(self, relation: Relation) -> str:

def get_external_app(self, relation):
"""Gets external application, as an Application object."""
for entry in relation.data.keys():
for entry in relation.data:
if isinstance(entry, Application) and entry != self.charm.app:
return entry
2 changes: 1 addition & 1 deletion src/relations/peers.py
@@ -27,7 +27,7 @@
│ │ ╰────────────────────╯ ╰───────────────────╯ ╰───────────────────╯ │
-----------------------------------------------------------------------------------------------------------------------

""" # noqa: W505
"""

import logging
from typing import Optional, Set
4 changes: 2 additions & 2 deletions src/relations/pgbouncer_provider.py
@@ -28,7 +28,7 @@
│ │ │ <empty> │ │ <empty> │ │ │ <empty> │ │ <empty> │ │ <empty> │ │
│ │ ╰──────────────────╯ ╰─────────────────╯ │ ╰───────────────────╯ ╰────────────────────╯ ╰───────────────────╯ │
└──────────────────┴───────────────────────────────────────────────────────────────────────────────────────────────┴────────────────────────────────────────────────────────────────────────────────────────────────┘
""" # noqa: W505
"""

import logging

@@ -283,6 +283,6 @@ def get_external_app(self, relation):

TODO this is stolen from the db relation - cleanup
"""
for entry in relation.data.keys():
for entry in relation.data:
if isinstance(entry, Application) and entry != self.charm.app:
return entry
17 changes: 8 additions & 9 deletions src/upgrade.py
@@ -50,7 +50,7 @@ def __init__(self, charm, model: BaseModel, **kwargs):

self.framework.observe(self.charm.on.upgrade_relation_changed, self._on_upgrade_changed)
self.framework.observe(
getattr(self.charm.on, "pgbouncer_pebble_ready"), self._on_pgbouncer_pebble_ready
self.charm.on.pgbouncer_pebble_ready, self._on_pgbouncer_pebble_ready
)

def _cluster_checks(self) -> None:
@@ -60,8 +60,10 @@ def _cluster_checks(self) -> None:
raise ClusterNotReadyError(
DEFAULT_MESSAGE, "Not all pgbouncer services are up yet."
)
except ConnectionError:
raise ClusterNotReadyError(DEFAULT_MESSAGE, "Not all pgbouncer services are missing.")
except ConnectionError as e:
raise ClusterNotReadyError(
DEFAULT_MESSAGE, "Not all pgbouncer services are missing."
) from e

if self.charm.backend.postgres and not self.charm.backend.ready:
raise ClusterNotReadyError(DEFAULT_MESSAGE, "Backend relation is still initialising.")
@@ -80,7 +82,7 @@ def pre_upgrade_check(self) -> None:
try:
self._set_rolling_update_partition(self.charm.app.planned_units() - 1)
except KubernetesClientError as e:
raise ClusterNotReadyError(e.message, e.cause)
raise ClusterNotReadyError(e.message, e.cause) from e

def _on_pgbouncer_pebble_ready(self, event: WorkloadEvent) -> None:
if not self.peer_relation:
@@ -139,8 +141,5 @@ def _set_rolling_update_partition(self, partition: int) -> None:
)
logger.debug(f"Kubernetes StatefulSet partition set to {partition}")
except ApiError as e:
if e.status.code == 403:
cause = "`juju trust` needed"
else:
cause = str(e)
raise KubernetesClientError("Kubernetes StatefulSet patch failed", cause)
cause = "`juju trust` needed" if e.status.code == 403 else str(e)
raise KubernetesClientError("Kubernetes StatefulSet patch failed", cause) from e
2 changes: 0 additions & 2 deletions tests/integration/__init__.py
@@ -1,2 +0,0 @@
# Copyright 2023 Canonical Ltd.
# See LICENSE file for licensing details.
2 changes: 0 additions & 2 deletions tests/integration/helpers/__init__.py
@@ -1,2 +0,0 @@
# Copyright 2023 Canonical Ltd.
# See LICENSE file for licensing details.
8 changes: 4 additions & 4 deletions tests/integration/helpers/ha_helpers.py
@@ -1,6 +1,6 @@
# Copyright 2022 Canonical Ltd.
# See LICENSE file for licensing details.
from typing import Dict, Tuple
from typing import Dict, Optional, Tuple

import psycopg2
import requests
@@ -16,7 +16,7 @@
from .helpers import CLIENT_APP_NAME


async def get_password(ops_test: OpsTest, app: str, down_unit: str = None) -> str:
async def get_password(ops_test: OpsTest, app: str, down_unit: Optional[str] = None) -> str:
"""Use the charm action to retrieve the password from provided application.

Returns:
@@ -57,7 +57,7 @@ async def check_writes(ops_test) -> int:
return total_expected_writes


async def are_writes_increasing(ops_test, down_unit: str = None) -> None:
async def are_writes_increasing(ops_test, down_unit: Optional[str] = None) -> None:
"""Verify new writes are continuing by counting the number of writes."""
writes, _ = await count_writes(ops_test, down_unit=down_unit)
for member, count in writes.items():
@@ -70,7 +70,7 @@ async def are_writes_increasing(ops_test, down_unit: str = None) -> None:


async def count_writes(
ops_test: OpsTest, down_unit: str = None
ops_test: OpsTest, down_unit: Optional[str] = None
) -> Tuple[Dict[str, int], Dict[str, int]]:
"""Count the number of writes in the database."""
app = "postgresql-k8s"
2 changes: 1 addition & 1 deletion tests/integration/helpers/helpers.py
@@ -86,7 +86,7 @@ async def get_endpoint_info(ops_test: OpsTest, unit_name: str, endpoint: str) ->
"--format=json",
)
relation_info = json.loads(get_databag[1])[unit_name]["relation-info"]
return list(filter(lambda x: x["endpoint"] == endpoint, relation_info))[0]["application-data"][
return next(filter(lambda x: x["endpoint"] == endpoint, relation_info))["application-data"][
"endpoints"
]

2 changes: 0 additions & 2 deletions tests/integration/relations/__init__.py
@@ -1,2 +0,0 @@
# Copyright 2023 Canonical Ltd.
# See LICENSE file for licensing details.
2 changes: 0 additions & 2 deletions tests/integration/relations/pgbouncer_provider/__init__.py
@@ -1,2 +0,0 @@
# Copyright 2023 Canonical Ltd.
# See LICENSE file for licensing details.
13 changes: 5 additions & 8 deletions tests/integration/relations/pgbouncer_provider/helpers.py
@@ -64,7 +64,7 @@ async def get_application_relation_data(
application_name: str,
relation_name: str,
key: str,
relation_id: str = None,
relation_id: Optional[str] = None,
) -> Optional[str]:
"""Get relation data for an application.

@@ -130,7 +130,7 @@ async def is_external_connectivity_set(
ops_test: OpsTest,
application_name: str,
relation_name: str,
relation_id: str = None,
relation_id: Optional[str] = None,
) -> Optional[str]:
data = await get_application_relation_data(
ops_test,
@@ -149,9 +149,9 @@ async def build_connection_string(
application_name: str,
relation_name: str,
*,
relation_id: str = None,
relation_id: Optional[str] = None,
read_only_endpoint: bool = False,
database: str = None,
database: Optional[str] = None,
) -> str:
"""Build a PostgreSQL connection string.

@@ -268,10 +268,7 @@ def check_exposed_connection(credentials, tls):
table_name = "expose_test"
smoke_val = str(uuid4())

if tls:
sslmode = "require"
else:
sslmode = "disable"
sslmode = "require" if tls else "disable"
if "uris" in credentials["postgresql"]:
uri = credentials["postgresql"]["uris"]
connstr = f"{uri}?connect_timeout=1&sslmode={sslmode}"