
Foreign key conflicts on merge sometimes #46

@bennuttall


I've experienced a problem that only happens when running code in AWS using sqlalchemy-aurora-data-api, and never when running against a real Postgres server locally.

I use SQLAlchemy's merge() to upsert a record, and I get a DatabaseError:

Traceback (most recent call last):
  File "/var/task/aurora_data_api/__init__.py", line 248, in execute
    res = self._client.execute_statement(**execute_statement_args)
  File "/var/runtime/botocore/client.py", line 530, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/var/runtime/botocore/client.py", line 960, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.errorfactory.BadRequestException: An error occurred (BadRequestException) when calling the ExecuteStatement operation: ERROR: duplicate key value violates unique constraint "episodes_pk"
  Detail: Key (episode_pid)=(m001q2mq) already exists.; SQLState: 23505

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/var/task/sqlalchemy/engine/base.py", line 1965, in _exec_single_context
    self.dialect.do_execute(
  File "/var/task/sqlalchemy/engine/default.py", line 921, in do_execute
    cursor.execute(statement, parameters)
  File "/var/task/aurora_data_api/__init__.py", line 258, in execute
    raise self._get_database_error(e) from e
aurora_data_api.exceptions.DatabaseError: An error occurred (BadRequestException) when calling the ExecuteStatement operation: ERROR: duplicate key value violates unique constraint "episodes_pk"
  Detail: Key (episode_pid)=(m001q2mq) already exists.; SQLState: 23505

This shouldn't be possible with merge: when the row already exists, it should simply run an UPDATE rather than an INSERT.
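
For reference, this is roughly the shape of the upsert. The Episode model, connection string, and values below are simplified placeholders rather than my actual code:

```python
from sqlalchemy import String, create_engine
from sqlalchemy.orm import DeclarativeBase, Mapped, Session, mapped_column


class Base(DeclarativeBase):
    pass


class Episode(Base):
    __tablename__ = "episodes"
    # Primary key backed by the episodes_pk constraint from the error above
    episode_pid: Mapped[str] = mapped_column(String, primary_key=True)
    title: Mapped[str] = mapped_column(String)


# Placeholder connection string; the real one points at the Aurora Data API cluster
engine = create_engine("postgresql+auroradataapi://:@/mydb")

with Session(engine) as session:
    # merge() looks the row up by primary key and is supposed to emit an
    # UPDATE when it already exists, and an INSERT only when it doesn't
    session.merge(Episode(episode_pid="m001q2mq", title="Example title"))
    session.commit()
```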

I've got one project that uses sqlalchemy-aurora-data-api where this problem almost never happens; in another project it happens on every single run. I even tried a get (SELECT) followed by either an add (INSERT) or a merge (UPDATE) depending on the result of the get, but that still ended up trying, and failing, to insert.
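
That fallback looked roughly like this (reusing the placeholder Episode model and engine from the sketch above):

```python
with Session(engine) as session:
    episode = Episode(episode_pid="m001q2mq", title="Example title")
    if session.get(Episode, "m001q2mq") is None:
        session.add(episode)    # not found: plain INSERT
    else:
        session.merge(episode)  # found: merge should turn this into an UPDATE
    session.commit()
```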

For context, this is:

  • AWS Lambda function (Python 3.9 runtime)
  • sqlalchemy 2.0.25
  • aurora_data_api 0.5.0
  • sqlalchemy-aurora-data-api 0.4.1
  • Aurora Serverless v1, PostgreSQL engine v11.21

Is anyone else experiencing this?

Is there anything I can try to avoid this problem?
