Require all packages to solve / compile and include all valid compilers in their metadata #669
base: master
@@ -234,11 +234,12 @@ For example:
All packages in the registry have an associated metadata file, located in the `metadata` directory of the `registry` repository under the package name. For example, the metadata for the `aff` package is located at https://github.com/purescript/registry/blob/main/metadata/aff.json. Metadata files are the source of truth on all published and unpublished versions of a particular package: what their content is and where the package is located. Metadata files are produced by the registry, not by package authors, though they take some information from package manifests.

-Each published version of a package records three fields:
+Each published version of a package records four fields:

 - `hash`: a [`Sha256`](#Sha256) of the compressed archive fetched by the registry for the given version
 - `bytes`: the size of the tarball in bytes
 - `publishedTime`: the time the package was published as an `ISO8601` string
+- `compilers`: compiler versions this package is known to work with. This field can be in one of two states: a single version indicates that the package worked with a specific compiler on upload but has not yet been tested with all compilers, whereas a non-empty array of versions indicates the package has been tested with all compilers the registry supports.
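
To make the two states of the `compilers` field concrete, here is what a published-version entry might look like in each state. This is illustrative only: the hash, byte count, and timestamp are invented, and the precise encoding should be checked against a real file such as `metadata/aff.json`.

```json
{
  "hash": "sha256-gx2HF6/L5bXkCNsSQ0EkCyN6MTqbzk1tUs2GNZAU7yc=",
  "bytes": 11234,
  "publishedTime": "2023-07-01T12:00:00.000Z",
  "compilers": "0.15.10"
}
```

After the full compiler matrix has run for the version, the same entry would carry a non-empty array instead:

```json
{
  "hash": "sha256-gx2HF6/L5bXkCNsSQ0EkCyN6MTqbzk1tUs2GNZAU7yc=",
  "bytes": 11234,
  "publishedTime": "2023-07-01T12:00:00.000Z",
  "compilers": ["0.15.10", "0.15.12"]
}
```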
Review thread on the `compilers` field:

Comment: Wouldn't it be tidier to only allow a non-empty array instead of several possible types? After all, the state with multiple compilers listed is going to be a superset of the first state.

Reply: The issue with the non-empty array is that it isn't clear whether an array of a single element represents one of:
- a package that was only compiled with the single compiler used at publishing and has not yet been matrix-tested, or
- a package that has been tested against all compilers and is compatible with just one.

Reply: When are we going to end up in a situation where we don't test the package against the whole set of compilers? My reading of the PR is that we always do. In any case, we'll always have packages that are not "tested against the full set of compilers": when a new compiler version comes out, all packages will need a retest, and if a package doesn't have the new compiler in the array then we don't know whether it's incompatible or simply hasn't been tested yet. Maybe we need another piece of state somewhere else?

Reply: Yes, as implemented here we just go ahead and test everything as soon as we've published. However, I split out the state because in our initial discussions we worried about how long the compiler builds take (they take publishing from N seconds to N minutes in some cases, for large libraries or ones that leverage a lot of type machinery). We'd originally talked about the compiler matrix being a cron job that runs later in the day; I just made it part of the publishing pipeline directly because it was simpler to implement. If we decide that it's OK for publishing to take a long time, then we can eliminate this state and just test the compilers immediately. In that case we'd have just a non-empty array.

Reply: Yea, that's a good point. You don't know if the metadata you're reading just hasn't been reached yet by an ongoing mass compiler build checking a new compiler. Off the top of my head I don't know a good place to put state about possible compiler support; the metadata files are not helpful if a new compiler comes out and we're redoing the build, since each file is only aware of its own package.

Reply: I'm cool with this if you are. We could either (a) say that the supported list of compilers for a package can potentially be missing the current compiler if the matrix is currently running, and not bother with extra state, or (b) put a JSON file or something in the metadata directory that indicates whether the compiler matrix is running, which consumers can then check. Personally, the matrix runs infrequently enough (just new compiler releases!) that I would rather opt for (a).

Reply: I pondered this for a few days and I think it's complicated. Since we're going towards a model where we'd only run one registry job at a time and queue the rest (to prevent concurrent pushes to the repo), I'm afraid that running the whole matrix at once would make publishing very slow. The approach detailed above implies that we're in a world where we do (a), i.e. the list of compilers is always potentially out of date, and that's fine.

Reply: Additional note about the above: since this introduces an "asynchronous matrix builder", we need to consider the dependency tree in our rebuilding. If a package A is published with compiler X, and a package B depending on it is published immediately after (a very common use case, since folks seem to publish their packages in batches), then we'd need to either make sure that matrix-build jobs for B always run after matrix-build jobs for A, or retry them somehow.
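
The ordering concern in that last comment (matrix jobs for a dependent must run after its dependencies' jobs) is essentially a topological scheduling problem. Below is a minimal sketch of one way to order queued jobs, assuming we know each queued package's in-registry dependencies. All names and types here are illustrative, not from this PR:

```purescript
module MatrixOrder where

import Prelude

import Data.Map (Map)
import Data.Map as Map
import Data.Set (Set)
import Data.Set as Set

-- Hypothetical stand-in; the registry has a proper PackageName type.
type PackageName = String

-- Emit every queued package whose queued dependencies have already been
-- emitted, then repeat. Dependencies outside the queue are ignored, and
-- the loop stops early on a cycle, leaving those packages for a retry.
orderMatrixJobs :: Map PackageName (Set PackageName) -> Array PackageName
orderMatrixJobs queue = go Set.empty (Map.keys queue) []
  where
  queued = Map.keys queue

  go done pending acc =
    let
      isReady pkg = case Map.lookup pkg queue of
        Nothing -> true
        Just deps -> Set.subset (Set.intersection deps queued) done
      ready = Set.filter isReady pending
    in
      if Set.isEmpty ready then acc
      else go (Set.union done ready) (Set.difference pending ready) (acc <> Set.toUnfoldable ready)
```

With an ordering like this, publishing A and then B in quick succession would schedule B's matrix build after A's; retrying failed jobs, as the comment notes, is the alternative.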

Each unpublished version of a package records three fields:
Large diffs are not rendered by default.
@@ -168,7 +168,6 @@ handleMemoryFs env = case _ of
      case inFs of
        Nothing -> pure $ reply Nothing
        Just entry -> do
-         Log.debug $ "Fell back to on-disk entry for " <> memory
Comment: These are just so noisy. Maybe we can introduce a

Reply: Is this log useful at all? I think it's ok to just remove it.

Reply: Yea, I think they're not really useful now that we're confident the cache works correctly. I had them in there from when I first developed it, when I would either sometimes see things I thought should be cached not get cached, or I wanted to make sure something I removed from the cache really was.
          putMemoryImpl env.ref unit (Key memory (Const entry))
          pure $ reply $ Just $ unCache entry
    Just cached ->

@@ -226,8 +225,7 @@ getMemoryImpl ref (Key id (Reply reply)) = do
    let (unCache :: CacheValue -> b) = unsafeCoerce
    cache <- Run.liftEffect $ Ref.read ref
    case Map.lookup id cache of
-     Nothing -> do
-       Log.debug $ "No cache entry found for " <> id <> " in memory."
+     Nothing ->
        pure $ reply Nothing
      Just cached ->
        pure $ reply $ Just $ unCache cached

@@ -236,7 +234,6 @@ putMemoryImpl :: forall x r a. CacheRef -> a -> MemoryEncoding Const a x -> Run
  putMemoryImpl ref next (Key id (Const value)) = do
    let (toCache :: x -> CacheValue) = unsafeCoerce
    Run.liftEffect $ Ref.modify_ (Map.insert id (toCache value)) ref
-   Log.debug $ "Wrote cache entry for " <> id <> " in memory."
    pure next

  deleteMemoryImpl :: forall x r a. CacheRef -> MemoryEncoding Ignore a x -> Run (LOG + EFFECT + r) a

@@ -275,7 +272,6 @@ getFsImpl cacheDir = case _ of
      let path = Path.concat [ cacheDir, safePath id ]
      Run.liftAff (Aff.attempt (FS.Aff.readFile path)) >>= case _ of
        Left _ -> do
-         Log.debug $ "No cache found for " <> id <> " at path " <> path
          pure $ reply Nothing
        Right buf -> do
          pure $ reply $ Just buf

@@ -284,7 +280,6 @@ getFsImpl cacheDir = case _ of
      let path = Path.concat [ cacheDir, safePath id ]
      Run.liftAff (Aff.attempt (FS.Aff.readTextFile UTF8 path)) >>= case _ of
        Left _ -> do
-         Log.debug $ "No cache file found for " <> id <> " at path " <> path
          pure $ reply Nothing
        Right content -> case Argonaut.Parser.jsonParser content of
          Left parseError -> do

@@ -307,7 +302,6 @@ putFsImpl cacheDir next = case _ of
          Log.warn $ "Failed to write cache entry for " <> id <> " at path " <> path <> " as a buffer: " <> Aff.message fsError
          pure next
        Right _ -> do
-         Log.debug $ "Wrote cache entry for " <> id <> " as a buffer at path " <> path
          pure next

    AsJson id codec (Const value) -> do

@@ -317,7 +311,6 @@ putFsImpl cacheDir next = case _ of
          Log.warn $ "Failed to write cache entry for " <> id <> " at path " <> path <> " as JSON: " <> Aff.message fsError
          pure next
        Right _ -> do
-         Log.debug $ "Wrote cache entry for " <> id <> " at path " <> path <> " as JSON."
          pure next

  deleteFsImpl :: forall a b r. FilePath -> FsEncoding Ignore a b -> Run (LOG + AFF + r) a
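
For context on what these hunks are trimming: `handleMemoryFs` implements a read-through cache, checking the in-memory map first and falling back to disk, re-populating memory on a disk hit (the `putMemoryImpl` call kept in the first hunk). A simplified sketch of that pattern, with plain `Aff`/`Ref` types standing in for the PR's `Run`-based effects:

```purescript
import Prelude

import Data.Map (Map)
import Data.Map as Map
import Data.Maybe (Maybe(..))
import Effect.Aff (Aff)
import Effect.Class (liftEffect)
import Effect.Ref (Ref)
import Effect.Ref as Ref

-- Check memory, fall back to disk, and write a disk hit back into
-- memory so the next read is cheap.
readThrough :: forall a. Ref (Map String a) -> (String -> Aff (Maybe a)) -> String -> Aff (Maybe a)
readThrough memory readDisk key = do
  inMemory <- liftEffect (Ref.read memory)
  case Map.lookup key inMemory of
    Just hit -> pure (Just hit)
    Nothing -> readDisk key >>= case _ of
      Nothing -> pure Nothing
      Just value -> do
        liftEffect (Ref.modify_ (Map.insert key value) memory)
        pure (Just value)
```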
@@ -6,6 +6,7 @@ import Data.Array.NonEmpty as NonEmptyArray
 import Data.Foldable (traverse_)
 import Data.Map as Map
 import Data.Set as Set
+import Data.Set.NonEmpty as NonEmptySet
 import Data.String as String
 import Data.String.NonEmpty as NonEmptyString
 import Effect.Aff as Aff

@@ -68,10 +69,39 @@ spec = do
          Assert.shouldEqual version (Utils.unsafeVersion "1.0.0")
          FS.Extra.remove tmp

Spec.describe "Finds compatible compilers from dependencies" do | ||
Spec.it "Finds intersect of single package" do | ||
Assert.Run.runBaseEffects do | ||
metadata <- Registry.readAllMetadataFromDisk $ Path.concat [ "app", "fixtures", "registry", "metadata" ] | ||
let expected = map Utils.unsafeVersion [ "0.15.10", "0.15.12" ] | ||
case API.compatibleCompilers metadata (Map.singleton (Utils.unsafePackageName "prelude") (Utils.unsafeVersion "6.0.1")) of | ||
Nothing -> Except.throw $ "Got no compatible compilers, but expected " <> Utils.unsafeStringify (map Version.print expected) | ||
Just set -> do | ||
let actual = NonEmptySet.toUnfoldable set | ||
unless (actual == expected) do | ||
Except.throw $ "Expected " <> Utils.unsafeStringify (map Version.print expected) <> " but got " <> Utils.unsafeStringify (map Version.print actual) | ||
|
||
Spec.it "Finds intersect of multiple packages" do | ||
Assert.Run.runBaseEffects do | ||
metadata <- Registry.readAllMetadataFromDisk $ Path.concat [ "app", "fixtures", "registry", "metadata" ] | ||
let | ||
expected = map Utils.unsafeVersion [ "0.15.10" ] | ||
resolutions = Map.fromFoldable $ map (bimap Utils.unsafePackageName Utils.unsafeVersion) | ||
[ Tuple "prelude" "6.0.1" | ||
, Tuple "type-equality" "4.0.1" | ||
] | ||
case API.compatibleCompilers metadata resolutions of | ||
Nothing -> Except.throw $ "Got no compatible compilers, but expected " <> Utils.unsafeStringify (map Version.print expected) | ||
Just set -> do | ||
let actual = NonEmptySet.toUnfoldable set | ||
unless (actual == expected) do | ||
Except.throw $ "Expected " <> Utils.unsafeStringify (map Version.print expected) <> " but got " <> Utils.unsafeStringify (map Version.print actual) | ||

    Spec.describe "API pipelines run correctly" $ Spec.around withCleanEnv do
      Spec.it "Publish a legacy-converted package with unused deps" \{ workdir, index, metadata, storageDir, githubDir } -> do
-       let testEnv = { workdir, index, metadata, username: "jon", storage: storageDir, github: githubDir }
-       Assert.Run.runTestEffects testEnv do
+       logs <- liftEffect (Ref.new [])
+       let testEnv = { workdir, logs, index, metadata, username: "jon", storage: storageDir, github: githubDir }
+       result <- Assert.Run.runTestEffects testEnv $ Except.runExcept do
          -- We'll publish effect@4.0.0 from the fixtures/github-packages
          -- directory, which has an unnecessary dependency on 'type-equality'
          -- inserted into it.

@@ -80,15 +110,15 @@ spec = do
            version = Utils.unsafeVersion "4.0.0"
            ref = "v4.0.0"
            publishArgs =
-             { compiler: Utils.unsafeVersion "0.15.9"
+             { compiler: Utils.unsafeVersion "0.15.10"
              , location: Just $ GitHub { owner: "purescript", repo: "purescript-effect", subdir: Nothing }
              , name
              , ref
              , resolutions: Nothing
              }

          -- First, we publish the package.
-         API.publish CurrentPackage publishArgs
+         API.publish publishArgs

          -- Then, we can check that it did make it to "Pursuit" as expected
          Pursuit.getPublishedVersions name >>= case _ of

@@ -117,12 +147,33 @@ spec = do
            , printJson (Internal.Codec.packageMap Range.codec) manifest.dependencies
            ]

+         -- We should verify the resulting metadata file is correct
+         Metadata effectMetadata <- Registry.readMetadata name >>= case _ of
+           Nothing -> Except.throw $ "Expected " <> PackageName.print name <> " to be in metadata."
+           Just m -> pure m
+
+         case Map.lookup version effectMetadata.published of
+           Nothing -> Except.throw $ "Expected " <> formatPackageVersion name version <> " to be in metadata."
+           Just published -> case published.compilers of
+             Left one -> Except.throw $ "Expected " <> formatPackageVersion name version <> " to have a compiler matrix but unfinished single version: " <> Version.print one
+             Right many -> do
+               let many' = NonEmptyArray.toArray many
+               let expected = map Utils.unsafeVersion [ "0.15.10", "0.15.12" ]
+               unless (many' == expected) do
+                 Except.throw $ "Expected " <> formatPackageVersion name version <> " to have a compiler matrix of " <> Utils.unsafeStringify (map Version.print expected) <> " but got " <> Utils.unsafeStringify (map Version.print many')

          -- Finally, we can verify that publishing the package again should fail
          -- since it already exists.
-         Except.runExcept (API.publish CurrentPackage publishArgs) >>= case _ of
+         Except.runExcept (API.publish publishArgs) >>= case _ of
            Left _ -> pure unit
            Right _ -> Except.throw $ "Expected publishing " <> formatPackageVersion name version <> " twice to fail."

+       case result of
+         Left err -> do
+           recorded <- liftEffect (Ref.read logs)
+           Assert.fail $ "Expected to publish effect@4.0.0 but got error: " <> err <> "\n\nLogs:\n" <> String.joinWith "\n" (map (\(Tuple _ msg) -> msg) recorded)
+         Right _ -> pure unit

  where
  withCleanEnv :: (PipelineEnv -> Aff Unit) -> Aff Unit
  withCleanEnv action = do

@@ -184,7 +235,7 @@ checkBuildPlanToResolutions = do
  Spec.it "buildPlanToResolutions produces expected resolutions file format" do
    Assert.shouldEqual generatedResolutions expectedResolutions
  where
-   dependenciesDir = "testDir"
+   installedResolutions = "testDir"

  resolutions = Map.fromFoldable
    [ Tuple (Utils.unsafePackageName "prelude") (Utils.unsafeVersion "1.0.0")

@@ -195,14 +246,14 @@ checkBuildPlanToResolutions = do
  generatedResolutions =
    API.formatPursuitResolutions
      { resolutions
-     , dependenciesDir
+     , installedResolutions
      }

  expectedResolutions = Map.fromFoldable do
    packageName /\ version <- (Map.toUnfoldable resolutions :: Array _)
    let
      bowerName = RawPackageName ("purescript-" <> PackageName.print packageName)
-     path = Path.concat [ dependenciesDir, PackageName.print packageName <> "-" <> Version.print version ]
+     path = Path.concat [ installedResolutions, PackageName.print packageName <> "-" <> Version.print version ]
    pure $ Tuple bowerName { path, version }

removeIgnoredTarballFiles :: Spec.Spec Unit
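
For readers unfamiliar with the Pursuit side: `expectedResolutions` above spells out the on-disk resolutions format this helper produces, keying each package by its Bower-style name with the install path and version. For the `prelude` entry in this test, the encoded file would look roughly like this (shape inferred from the test, not copied from Pursuit documentation):

```json
{
  "purescript-prelude": {
    "path": "testDir/prelude-1.0.0",
    "version": "1.0.0"
  }
}
```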
Comment: Surely we can make this more future proof 😄

Reply: Yea, I think you're right 😆