Replace other mentions of mrtk_dev with main
keveleigh committed May 6, 2021
1 parent 9888c2e commit 1eaf139
Showing 10 changed files with 24 additions and 21 deletions.
12 changes: 6 additions & 6 deletions mixed-reality-docs/mr-dev-docs/design/spatial-sound-design.md
@@ -47,21 +47,21 @@ Interaction types in mixed reality include gesture, direct manipulation, and voi
### Gesture interactions

In mixed reality, users may interact with buttons by using a mouse. Button actions generally occur when the user releases rather than presses the button to give the user a chance to cancel the interaction. Use sounds to reinforce these stages. To assist users in targeting distant buttons, also consider using a pointer-hover sound.
* Button-press sounds should be a short, tactile "click."<br/>Example: [MRTK_ButtonPress.wav](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/mrtk_development/Assets/MRTK/SDK/StandardAssets/Audio/MRTK_ButtonPress.wav)
* Button-"unpress" sounds should have a similar tactile feel. A higher pitch than the press sound reinforces the sense of completion.<br/>Example: [MRTK_ButtonUnpress.wav](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/mrtk_development/Assets/MRTK/SDK/StandardAssets/Audio/MRTK_ButtonUnpress.wav)
* Button-press sounds should be a short, tactile "click."<br/>Example: [MRTK_ButtonPress.wav](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/main/Assets/MRTK/SDK/StandardAssets/Audio/MRTK_ButtonPress.wav)
* Button-"unpress" sounds should have a similar tactile feel. A higher pitch than the press sound reinforces the sense of completion.<br/>Example: [MRTK_ButtonUnpress.wav](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/main/Assets/MRTK/SDK/StandardAssets/Audio/MRTK_ButtonUnpress.wav)
* For hover sounds, consider using a subtle and non-threatening sound, such as a low-frequency thud or bump. (A minimal wiring sketch for the press and unpress cues follows this list.)
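
As a rough illustration of wiring these cues up in Unity, the sketch below plays a press clip and an unpress clip from an `AudioSource`. The component is a sketch only: hook its two methods to whatever press/release events your button exposes (for example, MRTK's pressable button events); the clip names in the comments are the example files linked above.

```csharp
using UnityEngine;

// Minimal sketch: assign the press/unpress clips (e.g. MRTK_ButtonPress.wav and
// MRTK_ButtonUnpress.wav) in the Inspector, then call these methods from the
// button's press and release events.
[RequireComponent(typeof(AudioSource))]
public class ButtonAudioFeedback : MonoBehaviour
{
    [SerializeField] private AudioClip pressClip;
    [SerializeField] private AudioClip unpressClip;

    private AudioSource source;

    private void Awake() => source = GetComponent<AudioSource>();

    // Hook to the button's "pressed" event: a short, tactile click.
    public void OnButtonPressed() => source.PlayOneShot(pressClip);

    // Hook to the button's "released" event: a similar click, typically higher pitched.
    public void OnButtonReleased() => source.PlayOneShot(unpressClip);
}
```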

### Direct manipulation

On HoloLens 2, articulated hand tracking supports direct manipulation of user-interface elements. Sounds are important when there's no other physical feedback.

A *button press* sound is important because the user doesn't get any other indication when they reach the bottom of the keystroke. Visual indicators of key travel can be small, subtle, and occluded. As with gesture interactions, button presses should get a short, tactile sound like a click. Unpresses should have a similar click sound, but with a raised pitch.
* Example: [MRTK_ButtonPress.wav](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/mrtk_development/Assets/MRTK/SDK/StandardAssets/Audio/MRTK_ButtonPress.wav)
* Example: [MRTK_ButtonUnpress.wav](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/mrtk_development/Assets/MRTK/SDK/StandardAssets/Audio/MRTK_ButtonUnpress.wav)
* Example: [MRTK_ButtonPress.wav](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/main/Assets/MRTK/SDK/StandardAssets/Audio/MRTK_ButtonPress.wav)
* Example: [MRTK_ButtonUnpress.wav](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/main/Assets/MRTK/SDK/StandardAssets/Audio/MRTK_ButtonUnpress.wav)

It's difficult to visually confirm a grab or release action. The user's hand will often be in the way of any visual effect, and hard-bodied objects lack a real-world visual analog of "grabbing." Sounds can effectively communicate successful grab and release interactions. (A brief event-wiring sketch follows this list.)
* Grab actions should have a short, somewhat-muffled tactile sound that prompts the idea of fingers closing around an object. Sometimes there's also a "whoosh" sound that leads up to the grabbing sound to communicate the motion of the hand.<br/>Example: [MRTK_Move_Start.wav](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/mrtk_development/Assets/MRTK/SDK/StandardAssets/Audio/MRTK_Move_Start.wav)
* Release actions should get a similarly short and tactile sound. It's usually lower pitched than the grab sound and in reverse order, with an impact and then a "whoosh" to communicate that the object is settling into place.<br/>Example: [MRTK_Move_End.wav](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/mrtk_development/Assets/MRTK/SDK/StandardAssets/Audio/MRTK_Move_End.wav)
* Grab actions should have a short, somewhat-muffled tactile sound that prompts the idea of fingers closing around an object. Sometimes there's also a "whoosh" sound that leads up to the grabbing sound to communicate the motion of the hand.<br/>Example: [MRTK_Move_Start.wav](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/main/Assets/MRTK/SDK/StandardAssets/Audio/MRTK_Move_Start.wav)
* Release actions should get a similarly short and tactile sound. It's usually lower pitched than the grab sound and in reverse order, with an impact and then a "whoosh" to communicate that the object is settling into place.<br/>Example: [MRTK_Move_End.wav](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/main/Assets/MRTK/SDK/StandardAssets/Audio/MRTK_Move_End.wav)
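
If you're using MRTK's `ObjectManipulator` (or a similar manipulation component), one way to trigger these cues is to play the grab clip when manipulation starts and the release clip when it ends. The listener wiring below is a sketch under that assumption; adjust the event names to whatever manipulation component you actually use.

```csharp
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

// Sketch: plays a grab sound (e.g. MRTK_Move_Start.wav) when manipulation starts and a
// release sound (e.g. MRTK_Move_End.wav) when it ends. Assumes ObjectManipulator's
// OnManipulationStarted/OnManipulationEnded events; swap in your own component's events if needed.
[RequireComponent(typeof(AudioSource), typeof(ObjectManipulator))]
public class GrabAudioFeedback : MonoBehaviour
{
    [SerializeField] private AudioClip grabClip;
    [SerializeField] private AudioClip releaseClip;

    private void Awake()
    {
        var source = GetComponent<AudioSource>();
        var manipulator = GetComponent<ObjectManipulator>();

        manipulator.OnManipulationStarted.AddListener(_ => source.PlayOneShot(grabClip));
        manipulator.OnManipulationEnded.AddListener(_ => source.PlayOneShot(releaseClip));
    }
}
```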

A *drawing* interaction should get a persistent, looping sound with volume determined by the user's hand movement. It should be silent when the user's hand is still and loudest when the hand is moving quickly.
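
The volume mapping for such a drawing loop can be as simple as scaling a looping `AudioSource` by hand speed. The sketch below assumes you can hand it a transform that tracks the drawing hand or pointer; the speed-to-volume constant is an arbitrary illustration, not a recommended value.

```csharp
using UnityEngine;

// Sketch: a looping AudioSource whose volume follows hand speed.
// Silent when the hand is still, loudest when it moves quickly.
[RequireComponent(typeof(AudioSource))]
public class DrawingLoopVolume : MonoBehaviour
{
    [SerializeField] private Transform hand;                   // tracked hand or pointer transform (assumption)
    [SerializeField] private float speedForFullVolume = 1.0f;  // hand speed in m/s that maps to volume 1

    private AudioSource loopSource;
    private Vector3 lastPosition;

    private void Awake()
    {
        loopSource = GetComponent<AudioSource>();   // assign the drawing loop clip in the Inspector
        loopSource.loop = true;
        loopSource.volume = 0f;
        loopSource.Play();
        lastPosition = hand.position;
    }

    private void Update()
    {
        float speed = (hand.position - lastPosition).magnitude / Mathf.Max(Time.deltaTime, 1e-5f);
        loopSource.volume = Mathf.Clamp01(speed / speedForFullVolume);
        lastPosition = hand.position;
    }
}
```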

2 changes: 1 addition & 1 deletion mixed-reality-docs/mr-dev-docs/design/surface-magnetism.md
@@ -23,7 +23,7 @@ Surface magnetism lets you place holographic objects on real-world physical surf
**[MRTK](https://github.com/Microsoft/MixedRealityToolkit-Unity)** provides scripts and example scenes for the surface magnetism technique. You can use surface magnetism with various types of inputs such as hand-ray, eye gaze, and motion controllers. (A brief setup sketch follows the links below.)

* [MRTK - Surface magnetism solver](https://docs.microsoft.com/windows/mixed-reality/mrtk-unity/features/ux-building-blocks/solvers/solver#surfacemagnetism)
* [MRTK - Spatial awareness + Surface magnetism example scenes](https://github.com/microsoft/MixedRealityToolkit-Unity/blob/mrtk_development/Assets/MRTK/Examples/Demos/Solvers/Scenes/SurfaceMagnetismSpatialAwarenessExample.unity)
* [MRTK - Spatial awareness + Surface magnetism example scenes](https://github.com/microsoft/MixedRealityToolkit-Unity/blob/main/Assets/MRTK/Examples/Demos/Solvers/Scenes/SurfaceMagnetismSpatialAwarenessExample.unity)
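
For orientation, a solver setup typically means pairing a `SolverHandler` with the `SurfaceMagnetism` solver on the object you want to snap to surfaces. The sketch below assumes the MRTK 2.x class and property names (`SolverHandler`, `SurfaceMagnetism`, `TrackedTargetType`, `MagneticSurfaces`); check the solver documentation linked above for the exact API in your version.

```csharp
using Microsoft.MixedReality.Toolkit.Utilities;
using Microsoft.MixedReality.Toolkit.Utilities.Solvers;
using UnityEngine;

// Sketch: snap 'target' onto surfaces (e.g. the spatial mesh layer) hit by the head gaze ray.
public static class SurfaceMagnetismSetup
{
    public static void AddTo(GameObject target, LayerMask magneticSurfaces)
    {
        var handler = target.AddComponent<SolverHandler>();
        handler.TrackedTargetType = TrackedObjectType.Head;    // could also be ControllerRay or HandJoint

        var magnetism = target.AddComponent<SurfaceMagnetism>();
        magnetism.MagneticSurfaces = new LayerMask[] { magneticSurfaces };
    }
}
```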

<br>

11 changes: 7 additions & 4 deletions mixed-reality-docs/mr-dev-docs/develop/unity/text-in-unity.md
@@ -55,6 +55,7 @@ With Unity's Text Mesh Pro, you can secure the text rendering quality. It suppor
*Scaling values for the Unity 3D Text and UI*

## Recommended text size

As you might expect, the type sizes we use on a PC or a tablet device (typically between 12–32 pt) look small at a distance of 2 meters. It depends on the characteristics of each font, but in general the recommended minimum viewing angle and font height for legibility are around 0.35°–0.4° / 12.21–13.97 mm, based on our user research studies. That's about 35–40 pt with the scaling factor introduced above.

For near interaction at 0.45 m (45 cm), the minimum legible font's viewing angle and height are 0.4°–0.5° / 3.14–3.9 mm. That's about 9–12 pt with the scaling factor introduced above. (A worked conversion appears after the table below.)
@@ -63,13 +64,15 @@ For the near interaction at 0.45 m (45 cm), the minimum legible font's viewing a
*Content at near and far interaction range*

### The minimum legible font size

| Distance | Viewing angle | Text height | Font size |
|---------|---------|---------|---------|
| 45 cm (direct manipulation distance) | 0.4°-0.5° | 3.14–3.9 mm | 8.9–11.13 pt |
| 2 m | 0.35°-0.4° | 12.21–13.97 mm | 34.63–39.58 pt |
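
The numbers in this table follow directly from the geometry: text height is the span subtended by the viewing angle at the given distance, and a typographic point is 1/72 inch (about 0.3528 mm). The helper below reproduces the table values (for example, 0.35° at 2 m gives roughly 12.2 mm and 34.6 pt); the class and method names are illustrative only.

```csharp
using System;

public static class LegibleTextSize
{
    // Converts a viewing angle at a given distance into physical text height (mm)
    // and an approximate font size in points (1 pt = 1/72 inch ≈ 0.3528 mm).
    public static (double heightMm, double fontPt) FromViewingAngle(double distanceMeters, double angleDegrees)
    {
        double heightMm = 2.0 * distanceMeters * 1000.0 * Math.Tan(angleDegrees * Math.PI / 360.0);
        double fontPt = heightMm / 25.4 * 72.0;
        return (heightMm, fontPt);
    }
}

// Example: FromViewingAngle(2.0, 0.35) ≈ (12.2 mm, 34.6 pt), matching the 2 m row above.
```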


### The comfortably legible font size

| Distance | Viewing angle | Text height | Font size |
|---------|---------|---------|---------|
| 45 cm (direct manipulation distance) | 0.65°-0.8° | 5.1-6.3 mm | 14.47-17.8 pt |
@@ -84,14 +87,14 @@ Segoe UI (the default font for Windows) works well in most cases. However, avoid

### Sharp text rendering quality with proper dimension

Based on these scaling factors, we have created [text prefabs with UI Text and 3D Text Mesh](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/mrtk_development/Assets/MRTK/SDK/StandardAssets/Prefabs/Text). Developers can use these prefabs to get sharp text and consistent font size.
Based on these scaling factors, we have created [text prefabs with UI Text and 3D Text Mesh](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/main/Assets/MRTK/SDK/StandardAssets/Prefabs/Text). Developers can use these prefabs to get sharp text and consistent font size.

![Sharp text rendering quality with proper dimension](images/hug-text-06-1000px.png)<br>
*Sharp text rendering quality with proper dimension*

### Shader with occlusion support

Unity's default font material doesn't support occlusion. Because of this, you'll see the text behind the objects by default. We've included a simple [shader that supports the occlusion](https://github.com/microsoft/MixedRealityToolkit-Unity/blob/mrtk_development/Assets/MRTK/StandardAssets/Shaders/Text3DShader.shader). The image below shows the text with default font material (left) and the text with proper occlusion (right).
Unity's default font material doesn't support occlusion. Because of this, you'll see the text behind the objects by default. We've included a simple [shader that supports the occlusion](https://github.com/microsoft/MixedRealityToolkit-Unity/blob/main/Assets/MRTK/StandardAssets/Shaders/Text3DShader.shader). The image below shows the text with default font material (left) and the text with proper occlusion (right).

![Shader with occlusion support](images/hug-text-07-1000px.png)<br>
*Shader with occlusion support*
@@ -110,7 +113,7 @@ Or jump to Mixed Reality platform capabilities and APIs:
You can go back to the [Unity development checkpoints](unity-development-overview.md#2-core-building-blocks) at any time.


## See also
* [Text Prefab in the MRTK](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/mrtk_development/Assets/MRTK/SDK/StandardAssets/Prefabs/Text)

* [Text Prefab in the MRTK](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/main/Assets/MRTK/SDK/StandardAssets/Prefabs/Text)
* [Typography](../../design/typography.md)
@@ -294,8 +294,8 @@ You could also just start a KeywordRecognizer, which will restart the PhraseReco
## Voice input in Mixed Reality Toolkit

You can find MRTK examples for voice input in the following demo scenes:
* [Dictation](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/mrtk_development/Assets/MRTK/Examples/Demos/Input/Scenes/Dictation)
* [Speech](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/mrtk_development/Assets/MRTK/Examples/Demos/Input/Scenes/Speech)
* [Dictation](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/main/Assets/MRTK/Examples/Demos/Input/Scenes/Dictation)
* [Speech](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/main/Assets/MRTK/Examples/Demos/Input/Scenes/Speech)
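
The `KeywordRecognizer` mentioned above is Unity's built-in speech API (`UnityEngine.Windows.Speech`); the MRTK speech input services sit on top of it. A bare-bones, standalone sketch of keyword recognition looks like this (the keyword list and logging are placeholders):

```csharp
using UnityEngine;
using UnityEngine.Windows.Speech;

// Sketch: recognizes a fixed set of keywords and logs whichever phrase was heard.
public class SpeechCommands : MonoBehaviour
{
    private KeywordRecognizer keywordRecognizer;

    private void Start()
    {
        keywordRecognizer = new KeywordRecognizer(new[] { "select", "reset", "menu" });
        keywordRecognizer.OnPhraseRecognized += args => Debug.Log($"Heard: {args.text}");
        keywordRecognizer.Start();
    }

    private void OnDestroy()
    {
        if (keywordRecognizer != null && keywordRecognizer.IsRunning)
        {
            keywordRecognizer.Stop();
        }
        keywordRecognizer?.Dispose();
    }
}
```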

## Next Development Checkpoint

2 changes: 1 addition & 1 deletion mrtk-unity/contributing/pull-requests.md
@@ -19,7 +19,7 @@ A comment in the PR will let you know if you do.
## Creating a pull request

When you are ready to submit a pull request, [create a pull request](https://github.com/microsoft/MixedRealityToolkit-Unity/compare/mrtk_development...mrtk_development?expand=1) targeting the [mrtk_development](https://github.com/microsoft/mixedrealitytoolkit-unity/tree/mrtk_development) branch.
When you are ready to submit a pull request, [create a pull request](https://github.com/microsoft/MixedRealityToolkit-Unity/compare/main...main?expand=1) targeting the [main](https://github.com/microsoft/mixedrealitytoolkit-unity/tree/main) branch. For bug fixes during a release stabilization period, look for the latest `prerelease/*` branch. New features should always go into `main`.

Read the guidelines and ensure your pull request meets the guidelines.

2 changes: 1 addition & 1 deletion mrtk-unity/contributing/unit-tests.md
@@ -58,7 +58,7 @@ It's also possible to run the playmode tests multiple times via the `run_repeat_

MRTK's CI will build MRTK in all configurations and run all edit mode and play mode tests. CI can be triggered by posting the comment `/azp run mrtk_pr` on the GitHub PR if the user has sufficient rights. CI runs can be seen in the 'Checks' tab of the PR.

Only after all of the tests have passed successfully can the PR be merged into mrtk_development.
Only after all of the tests have passed successfully can the PR be merged into main.

### Stress tests / bulk tests

2 changes: 1 addition & 1 deletion mrtk-unity/features/ux-building-blocks/dialog.md
@@ -16,7 +16,7 @@ Dialog controls are UI overlays that provide contextual app information. They of
## Example scene

You can find examples in the **DialogExample** scene under:
[MRTK/Examples/Demo/UX/Dialog](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/mrtk_development/Assets/MRTK/Examples/Demos/UX/Dialog)
[MRTK/Examples/Demo/UX/Dialog](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/main/Assets/MRTK/Examples/Demos/UX/Dialog)
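
Before opening the scene, it can help to see roughly what launching a dialog looks like in code. The call below follows the `Dialog.Open` pattern used in MRTK 2.x examples, but the exact signature varies between versions, so treat the parameters as assumptions to verify against the Dialog API and the example scene.

```csharp
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

// Sketch: opens a two-button confirmation dialog from one of the Dialog prefabs.
public class ConfirmationPrompt : MonoBehaviour
{
    [SerializeField] private GameObject dialogPrefab;   // a Dialog prefab from the example folder (assumption)

    public void ShowPrompt()
    {
        // Parameters (prefab, buttons, title, message, near-interaction placement) are assumed
        // from MRTK 2.x samples; confirm against the Dialog API for your MRTK version.
        Dialog.Open(dialogPrefab, DialogButtonType.Yes | DialogButtonType.No,
            "Delete item?", "This action cannot be undone.", false);
    }
}
```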

## How to use Dialog control

4 changes: 2 additions & 2 deletions mrtk-unity/features/ux-building-blocks/hand-coach.md
@@ -27,12 +27,12 @@ The current interaction model represents a wide variety of gesture controls such
## Example scene

You can find examples in the **HandCoachExample** scene under:
[MixedRealityToolkit.Examples/Experimental/HandCoach/Scenes](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/mrtk_development/Assets/MRTK/Examples/Demos/HandCoach/Scenes)
[MixedRealityToolkit.Examples/Experimental/HandCoach/Scenes](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/main/Assets/MRTK/Examples/Demos/HandCoach/Scenes)

## Hand 3D Assets

You can find the assets under:
[MixedRealityToolkit.SDK/Experimental/HandCoach](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/mrtk_development/Assets/MRTK/Examples/Demos/HandCoach)
[MixedRealityToolkit.SDK/Experimental/HandCoach](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/main/Assets/MRTK/Examples/Demos/HandCoach)

## Quality

4 changes: 2 additions & 2 deletions mrtk-unity/features/ux-building-blocks/text-prefab.md
@@ -62,12 +62,12 @@ When adding a UI or canvas based Text element to a scene, the size disparity is

![Font size with scaling factors](../images/text-prefab/TextPrefabInstructions07.png)

### [Text3DSelawik.mat](https://github.com/microsoft/MixedRealityToolkit-Unity/blob/mrtk_development/Assets/MRTK/StandardAssets/Materials/)
### [Text3DSelawik.mat](https://github.com/microsoft/MixedRealityToolkit-Unity/blob/main/Assets/MRTK/StandardAssets/Materials/)

Material for 3DTextPrefab with occlusion support. Requires Text3DShader.shader.

![Default Font material vs 3DTextSegoeUI material](../images/text-prefab/TextPrefabInstructions06.png)

### [Text3DShader.shader](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/mrtk_development/Assets/MRTK/StandardAssets/Shaders)
### [Text3DShader.shader](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/main/Assets/MRTK/StandardAssets/Shaders)

Shader for 3DTextPrefab with occlusion support.
2 changes: 1 addition & 1 deletion mrtk-unity/index.md
@@ -60,7 +60,7 @@ Want to see what's going on under the hood?

| Branch | CI Status | Docs Status |
|---|---|---|
| `mrtk_development` |[![CI Status](https://dev.azure.com/aipmr/MixedRealityToolkit-Unity-CI/_apis/build/status/public/mrtk_CI?branchName=mrtk_development)](https://dev.azure.com/aipmr/MixedRealityToolkit-Unity-CI/_build/latest?definitionId=15)|[![Docs Status](https://dev.azure.com/aipmr/MixedRealityToolkit-Unity-CI/_apis/build/status/public/mrtk_docs?branchName=mrtk_development)](https://dev.azure.com/aipmr/MixedRealityToolkit-Unity-CI/_build/latest?definitionId=7)
| `main` |[![CI Status](https://dev.azure.com/aipmr/MixedRealityToolkit-Unity-CI/_apis/build/status/public/mrtk_CI?branchName=main)](https://dev.azure.com/aipmr/MixedRealityToolkit-Unity-CI/_build/latest?definitionId=15)|[![Docs Status](https://dev.azure.com/aipmr/MixedRealityToolkit-Unity-CI/_apis/build/status/public/mrtk_docs?branchName=main)](https://dev.azure.com/aipmr/MixedRealityToolkit-Unity-CI/_build/latest?definitionId=7)

## Feature areas
