Merge branch 'version-4' of https://github.com/pnp/pnpjs into version-4
patrick-rodgers committed Oct 15, 2024
2 parents 79998b2 + fca9a91 commit e2e12b0
Showing 344 changed files with 519 additions and 470,447 deletions.
6 changes: 3 additions & 3 deletions .github/ISSUE_TEMPLATE/1-bug-report.yml
@@ -1,6 +1,6 @@
name: 🐞 Bug or Error Report
description: Submit a bug or error report.
labels: ["type: someting isn't working", "status: investigate"]
labels: ["type: something isn't working", "status: investigate"]

body:
- type: markdown
@@ -16,10 +16,10 @@ body:
label: Major Version
options:
- 4.x
- 3.x
- 3.x (No longer supported)
- 2.x (No longer supported)
- 1.x (No longer supported)
default: 1
default: 0
validations:
required: true
- type: input
4 changes: 2 additions & 2 deletions .github/ISSUE_TEMPLATE/1-question.yml
@@ -16,10 +16,10 @@ body:
label: What version of PnPjs library you are using
options:
- 4.x
- 3.x
- 3.x (No longer supported)
- 2.x (No longer supported)
- 1.x (No longer supported)
default: 1
default: 0
validations:
required: true
- type: input
17 changes: 17 additions & 0 deletions CHANGELOG.md
@@ -5,6 +5,23 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/)
and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.html).

## 4.6.0 - 2024-Oct-14

- Only documentation and package updates

## 4.5.0 - 2024-Sept-16

- Only documentation and package updates

## 4.4.0 - 2024-Aug-12

- sp
- Addresses #3091 - Update return types from Shares
- Addresses #3104 - Replaces an in-function await to just return the promise.

- graph
- Addresses #3083 - Adds the ability to pass in retrieveProperties to getAllChildrenAsTree. V2 and V3 had this functionality. Only supports Shared Custom Properties, not Local Custom Properties.

## 4.3.0 - 2024-July-15

- sp
2 changes: 1 addition & 1 deletion debug/spfx/package.json
@@ -23,7 +23,7 @@
"@pnp/logging": "^4.0.1",
"@pnp/sp": "^4.0.1",
"@pnp/sp-admin": "^4.0.1",
"tslib": "2.3.1"
"tslib": "2.7.0"
},
"devDependencies": {
"@microsoft/eslint-config-spfx": "1.18.2",
15 changes: 8 additions & 7 deletions docs/graph/files.md
@@ -305,8 +305,7 @@ import * as fs from "fs";
import { graphfi } from "@pnp/graph";
import "@pnp/graph/files";
import "@pnp/graph/users";
import {IFileUploadOptions} from "@pnp/graph/files";

import { DriveItem as DriveItemType } from "@microsoft/microsoft-graph-types";
const graph = graphfi(...);

const fileBuff = fs.readFileSync("C:\\MyDocs\\TestDocument.docx");
@@ -317,7 +316,7 @@ const fileUploadOptions: IResumableUploadOptions<DriveItemUploadableProperties>
},
};

// Create the upload session
// Create the upload session; you must first get the drive root folder id to call createUploadSession
const uploadSession = await graph.users.getById(userId).drive.getItemById(driveRoot.id).createUploadSession(fileUploadOptions);
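// (illustrative note: the collapsed lines above obtain driveRoot; one way to get it, assuming the
// user's default drive, would be: const driveRoot = await graph.users.getById(userId).drive.root();)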
// Get the status of the upload session
const status = await uploadSession.resumableUpload.status();
@@ -327,12 +326,14 @@ const upload = await uploadSession.resumableUpload.upload(fileBuff.length, fileB

// Upload a chunk of the file to the upload session
// Using a fragment size that is not an even multiple of 320 KiB results in errors when committing some files.
const chunkSize = 327680;
const chunkFactor = 1;
const chunkSize = 327680 * chunkFactor;
let startFrom = 0;
let driveItem: DriveItemType = null;
while (startFrom < fileBuff.length) {
const fileChunk = fileBuff.slice(startFrom, startFrom + chunkSize);
const contentLength = `bytes ${startFrom}-${startFrom + chunkSize}/${fileBuff.length}`
const uploadChunk = await uploadSession.resumableUpload.upload(chunkSize, fileChunk, contentLength);
const fileChunk = Uint8Array.prototype.slice.call(fileBuff, startFrom, startFrom + chunkSize);
const range = `bytes ${startFrom}-${(startFrom + fileChunk.length) - 1}/${fileBuff.length}`;
driveItem = await uploadSession.resumableUpload.upload(fileChunk.length, fileChunk, range);
startFrom += chunkSize;
}
```
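The 320 KiB constraint above is easy to trip over when tuning chunk sizes. A small helper along these lines (illustrative only, not part of @pnp/graph) keeps any target size on a valid boundary:

```TypeScript
// round a target size down to a 320 KiB multiple, since the Graph resumable
// upload API expects fragments in 320 KiB (327,680 byte) increments
const FRAGMENT_BASE = 327680;

function toGraphChunkSize(targetBytes: number): number {
    const factor = Math.max(1, Math.floor(targetBytes / FRAGMENT_BASE));
    return factor * FRAGMENT_BASE;
}

// e.g. aim for roughly 4 MB per request
const safeChunkSize = toGraphChunkSize(4 * 1024 * 1024); // 3,932,160 bytes = 12 × 320 KiB
```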
2 changes: 2 additions & 0 deletions docs/graph/shares.md
@@ -71,7 +71,9 @@ const shareLinkInfo = {
encodedSharingUrl: shareLink,
redeemSharingLink: false
};
// default shared drive item response (id, name)
const sharedDriveItem = await graph.shares.useSharingLink(shareLinkInfo);

```
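If more than the default id/name payload is needed, one option is to resolve the link to its full driveItem. The sketch below assumes the `graph.shares.getById(...).driveItem()` accessor shown elsewhere in these docs and uses the standard Graph "u!" share-id encoding; it is illustrative, not part of this change:

```TypeScript
// encode a sharing URL into a Graph share id ("u!" + base64url of the link)
const encodeSharingUrl = (url: string): string =>
    "u!" + Buffer.from(url).toString("base64").replace(/=+$/, "").replace(/\//g, "_").replace(/\+/g, "-");

// read the full driveItem behind the shared link
const sharedItem = await graph.shares.getById(encodeSharingUrl(shareLink)).driveItem();
```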

## Create Sharing Link
7 changes: 6 additions & 1 deletion docs/graph/sites.md
@@ -43,7 +43,12 @@ import "@pnp/graph/sites";
const graph = graphfi(...);
const sharepointHostName = "contoso.sharepoint.com";
const serverRelativeUrl = "/sites/teamsite1";
const siteInfo = await graph.sites.getByUrl(sharepointHostName, serverRelativeUrl)();
// Resolves to an ISite object that can then be called; the getByUrl call must be awaited.
const site: ISite = await graph.sites.getByUrl(sharepointHostName, serverRelativeUrl);
// Now use the ISite object to get drives
const drives = await site.drives();
// Now use the ISite object to get the site information
const siteInfo = await site();
```

### getAllSites
17 changes: 7 additions & 10 deletions docs/graph/taxonomy.md
@@ -203,7 +203,6 @@ This method will get all of a set's child terms in an ordered array. It is a cos
```TypeScript
import { graphfi } from "@pnp/graph";
import "@pnp/graph/taxonomy";
import { ITermInfo } from "@pnp/graph/taxonomy";
import { dateAdd, PnPClientStorage } from "@pnp/core";

const graph = graphfi(...);
@@ -276,12 +275,11 @@ Access term set information
```TypeScript
import { graphfi } from "@pnp/graph";
import "@pnp/graph/taxonomy";
import { ITermInfo } from "@pnp/graph/taxonomy";

import { TermStore } from '@microsoft/microsoft-graph-types';
const graph = graphfi(...);

// list all the terms that are direct children of this set
const infos: ITermInfo[] = await graph.termStore.groups.getById("338666a8-1111-2222-3333-f72471314e72").sets.getById("338666a8-1111-2222-3333-f72471314e72").children();
const infos: TermStore.Term[] = await graph.termStore.groups.getById("338666a8-1111-2222-3333-f72471314e72").sets.getById("338666a8-1111-2222-3333-f72471314e72").children();
```

### List (terms)
@@ -291,36 +289,35 @@ You can use the terms property to get a flat list of all terms in the set. These
```TypeScript
import { graphfi } from "@pnp/graph";
import "@pnp/graph/taxonomy";
import { ITermInfo } from "@pnp/graph/taxonomy";
import { TermStore } from '@microsoft/microsoft-graph-types';

const graph = graphfi(...);

// list all the terms available in this term set by group id then by term set id
const infos: ITermInfo[] = await graph.termStore.groups.getById("338666a8-1111-2222-3333-f72471314e72").sets.getById("338666a8-1111-2222-3333-f72471314e72").terms();
const infos: TermStore.Term[] = await graph.termStore.groups.getById("338666a8-1111-2222-3333-f72471314e72").sets.getById("338666a8-1111-2222-3333-f72471314e72").terms();

// list all the terms available in this term set by term set id
const infosByTermSetId: ITermInfo[] = await graph.termStore.sets.getById("338666a8-1111-2222-3333-f72471314e72").terms();
const infosByTermSetId: TermStore.Term[] = await graph.termStore.sets.getById("338666a8-1111-2222-3333-f72471314e72").terms();
```

### Get By Id

```TypeScript
import { graphfi } from "@pnp/graph";
import "@pnp/graph/taxonomy";
import { ITermInfo } from "@pnp/graph/taxonomy";
import { TermStore } from '@microsoft/microsoft-graph-types';

const graph = graphfi(...);

// get term set data
const info: ITermInfo = await graph.termStore.groups.getById("338666a8-1111-2222-3333-f72471314e72").sets.getById("338666a8-1111-2222-3333-f72471314e72").getTermById("338666a8-1111-2222-3333-f72471314e72")();
const info: TermStore.Term = await graph.termStore.groups.getById("338666a8-1111-2222-3333-f72471314e72").sets.getById("338666a8-1111-2222-3333-f72471314e72").getTermById("338666a8-1111-2222-3333-f72471314e72")();
```
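Because the result is typed as `TermStore.Term` from `@microsoft/microsoft-graph-types`, its standard shape is available directly; for example (a small sketch), reading the default label of the term retrieved above:

```TypeScript
// info is the TermStore.Term retrieved above; labels is part of the Graph term shape
const defaultLabel = info.labels?.find((l) => l.isDefault)?.name;
console.log(defaultLabel);
```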

### Add

```TypeScript
import { graphfi, SPFxToken, SPFx } from "@pnp/graph";
import "@pnp/graph/taxonomy";
import { ITermInfo } from "@pnp/graph/taxonomy";

const graph = graphfi(...);

2 changes: 1 addition & 1 deletion docs/msaljsclient/index.md
@@ -22,7 +22,7 @@ import "@pnp/sp/site-users/web";
const options: MSALOptions = {
configuration: {
auth: {
authority: "https://login.microsoftonline.com/{tanent_id}/",
authority: "https://login.microsoftonline.com/{tenant_id}/",
clientId: "{client id}",
},
cache: {
24 changes: 11 additions & 13 deletions docs/sp/files.md
@@ -156,9 +156,11 @@ if (file.size <= 10485760) {
result = await sp.web.getFolderByServerRelativePath("Shared Documents").files.addUsingPath(fileNamePath, file, { Overwrite: true });
} else {
// large upload
result = await sp.web.getFolderByServerRelativePath("Shared Documents").files.addChunked(fileNamePath, file, data => {
console.log(`progress`);
}, true);
result = await sp.web.getFolderByServerRelativePath("Shared Documents").files.addChunked(fileNamePath, file,
{ progress: data => { console.log(`progress`); },
Overwrite: true
}
);
}

console.log(`Result of file upload: ${JSON.stringify(result)}`);
@@ -186,7 +188,7 @@ const stream = createReadStream("c:/temp/file.txt");
// now add the stream as a new file
const sp = spfi(...);

const fr = await sp.web.lists.getByTitle("Documents").rootFolder.files.addChunked( "new.txt", stream, undefined, true );
const fileInfo = await sp.web.lists.getByTitle("Documents").rootFolder.files.addChunked("new.txt", stream, { progress: data => { console.log(`progress`); }, Overwrite: true });
```

### Setting Associated Item Values
@@ -200,7 +202,7 @@ import "@pnp/sp/files";
import "@pnp/sp/folders";

const sp = spfi(...);
const file = await sp.web.getFolderByServerRelativePath("/sites/dev/Shared%20Documents/test/").files.addUsingPath("file.name", "content", {Overwrite: true});
const fileInfo = await sp.web.getFolderByServerRelativePath("/sites/dev/Shared%20Documents/test/").files.addUsingPath("file.name", "content", {Overwrite: true});
const item = await file.file.getItem();
await item.update({
Title: "A Title",
@@ -331,18 +333,15 @@ Both the addChunked and setContentChunked methods support options beyond just su

A method that is called each time a chunk is uploaded and provides enough information to report progress or update a progress bar easily. The method has the signature:

`(data: ChunkedFileUploadProgressData) => void`
`(data: IFileUploadProgressData) => void`

The data interface is:

```typescript
export interface ChunkedFileUploadProgressData {
export interface IFileUploadProgressData {
uploadId: string;
stage: "starting" | "continue" | "finishing";
blockNumber: number;
totalBlocks: number;
chunkSize: number;
currentPointer: number;
fileSize: number;
offset: number;
}
```
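As a usage sketch (assuming `IFileUploadProgressData` is exported from `@pnp/sp/files`), a progress handler might look like this:

```TypeScript
import { IFileUploadProgressData } from "@pnp/sp/files";

const progress = (data: IFileUploadProgressData): void => {
    // blockNumber / totalBlocks gives a simple percentage for a progress bar
    const percent = Math.round((data.blockNumber / data.totalBlocks) * 100);
    console.log(`${data.uploadId} [${data.stage}] block ${data.blockNumber}/${data.totalBlocks} (${percent}%)`);
};

// passed through the options bag shown earlier in this article, e.g.
// await folder.files.addChunked("new.txt", content, { progress, Overwrite: true });
```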

@@ -560,4 +559,3 @@ import "@pnp/sp/files";
const sp = spfi(...);
const user = await sp.web.getFolderByServerRelativePath("{folder relative path}").files.getByUrl("name.txt").getLockedByUser();
```

5 changes: 3 additions & 2 deletions docs/sp/folders.md
@@ -532,7 +532,7 @@ const folder: IFolder = await sp.web.rootFolder.folders.getByUrl("SiteAssets").a

### getFolderById

You can get a folder by Id from a web.
You can get a folder by UniqueId from a web.

```TypeScript
import { spfi } from "@pnp/sp";
@@ -542,7 +542,8 @@ import { IFolder } from "@pnp/sp/folders";

const sp = spfi(...);

const folder: IFolder = sp.web.getFolderById("2b281c7b-ece9-4b76-82f9-f5cf5e152ba0");
const folderItem = await sp.web.lists.getByTitle("My List").items.getById(1).select("UniqueId")();
const folder: IFolder = sp.web.getFolderById(folderItem.UniqueId);
```

### getParentInfos
36 changes: 34 additions & 2 deletions docs/sp/items.md
@@ -367,11 +367,13 @@ await execute();

console.log("Done");
```

### Update Taxonomy field

Note: Updating a Taxonomy field for a File item should be handled differently. Instead of using update(), use validateUpdateListItem(). Please see below.

List Item
#### List Item

```TypeScript
import { spfi } from "@pnp/sp";
import "@pnp/sp/webs";
@@ -385,7 +387,9 @@ await sp.web.lists.getByTitle("Demo").items.getById(1).update({
});

```
File List Item

#### File List Item

```TypeScript
import { spfi } from "@pnp/sp";
import "@pnp/sp/webs";
Expand All @@ -407,6 +411,8 @@ _Based on [this excellent article](https://www.aerieconsulting.com/blog/update-u

As he says you must update a hidden field to get this to work via REST. My meta data field accepting multiple values is called "MultiMetaData".

#### List Item

```TypeScript
import { spfi } from "@pnp/sp";
import "@pnp/sp/webs";
@@ -433,7 +439,9 @@ await sp.web.lists.getByTitle("TestList").items.getById(newItem.Id).update(updat
```

#### File List Item

To update a multi-value taxonomy field on a file item, a different serialization is needed.

```TypeScript
import { spfi } from "@pnp/sp";
import "@pnp/sp/webs";
@@ -482,6 +490,30 @@ const update = await sp.web.lists.getByTitle("Price").items.getById(7).select('*
]);
```

### Update Location Field

This code shows how to update a location field's coordinates.

```TypeScript
import { spfi } from "@pnp/sp";
import "@pnp/sp/webs";
import "@pnp/sp/lists";
import "@pnp/sp/items";

const sp = spfi(...);
const coordinates = {
Latitude: 47.672082,
Longitude: -122.1409983
}

const projectId = 1;
const project = await sp.web.lists.getByTitle("My List").items.getById(projectId).select("Id", "ProjectLocation")();
const projectLocation = JSON.parse(project.ProjectLocation);
projectLocation.Coordinates = coordinates;
const ProjectLocation = JSON.stringify(projectLocation);
const update = await sp.web.lists.getByTitle("My List").items.getById(projectId).update({ ProjectLocation });
```

## Recycle

To send an item to the recycle bin use recycle.
2 changes: 1 addition & 1 deletion docs/sp/lists.md
@@ -70,7 +70,7 @@ console.log(r.Title);
});
```

You can also provide other (optional) parameters like description, template and enableContentTypes. If that is not enough for you, you can use the parameter named 'additionalSettings' which is just a TypedHash, meaning you can send whatever properties you'd like in the body (provided that the property is supported by the SharePoint API). You can find a [listing of list template codes](https://docs.microsoft.com/en-us/dotnet/api/microsoft.sharepoint.splisttemplatetype?view=sharepoint-server) in the official docs.
You can also provide other (optional) parameters like description, template and enableContentTypes. If that is not enough for you, you can use the parameter named 'additionalSettings' which is just a TypedHash, meaning you can send whatever properties you'd like in the body (provided that the property is supported by the SharePoint API). You can find a [listing of list template codes](https://learn.microsoft.com/en-us/openspecs/sharepoint_protocols/ms-wssts/8bf797af-288c-4a1d-a14b-cf5394e636cf) in the official docs.

```TypeScript
// this will create a list with template 101 (Document library), content types enabled and show it on the quick launch (using additionalSettings)
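// a sketch of such a call (names and values below are illustrative):
// title, description, template id, enableContentTypes, additionalSettings
const listAddResult = await sp.web.lists.add(
    "MyDocumentLibrary",
    "Documents with content types",
    101,
    true,
    { OnQuickLaunch: true });
```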