diff --git a/.nojekyll b/.nojekyll new file mode 100644 index 00000000..e69de29b diff --git a/2018/05/grep-like-powershell-colorful-select-string.html b/2018/05/grep-like-powershell-colorful-select-string.html new file mode 100644 index 00000000..729abffe --- /dev/null +++ b/2018/05/grep-like-powershell-colorful-select-string.html @@ -0,0 +1,1469 @@ + + + + + + +Select-ColorString : A Unix’s grep-like Powershell Cmdlet Based On Select-String With Color - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 11 minute read + + + +

+ + +
+ + +
+ + + +

Update 2019-12-28 Powershell 7 Select-String default highlighting

+ +

Update 2019-12-28: It’s very exciting to see that since Powershell 7, the Select-String has highlighting (internal name: emphasis) by default. It uses similar way (index, length) to find and highlight the matches. The emphasis uses negative colors based on your PowerShell background and text colors. To disable the emphasis, use the -NoEmphasis switch. So I highly recommend everyone to switch to Powershell 7 (RC is supported by Microsoft), it has also many other new powerful features.

+ +

BTW, in Powershell 7, Select-String -AllMatches is set as $false by default. I think it would be nice to have an inverse switch -NoAllMatches just like -NoEmphasis, and let -AllMatches to be $true by default.

+ +

Update 2019-12-31: I just found a workaround here, by specifying $PSDefaultParameterValues['Select-String:AllMatches'] = $true in the Profile.ps1. I don’t know if you have the same feeling as the mine, this feature is killing, it will help me for many other things :)

+ +

Powershell 7 Select-String default highlighting demo:

+ +

+ +

The original post before the Emphasis has been introduced in Powershell 7:

+ +
+

Select-String in Powershell is a very powerful cmdlet to search a string or a pattern in input or files. It’s very much like the famous command-line grep in Unix. But from my personal point of view, it’s a little bit pity that Select-String doesn’t highlight the matching patterns, so I will show you in this post how to make it possible (more or less) with Select-ColorString.

+
+ +

Trace-Word

+ +

First of all, I must mention another powershell cmdlet Trace-Word that I read on Prateek Singh’s blog ridicurious.com.

+ +

Let me show you a screenshot of his Trace-Word to let you have an idea about what it can do:

+ +

+ +

Indeed, I was deeply impressed when I read his post, the color in Powershell string search results had been one of my most expected Powershell functionalities. Prateek Singh made it, thanks!

+ +

When I checked the code source of Trace-Word, I found the cmdlet logic is:

+ +
    +
  1. +

    Firstly reads the input content line by line:

    + +
    +
     $content | ForEach-Object {...}
    +
    +
  2. +
  3. And then splits each line by white-space: +
    +
     `$_.split() | Where-Object {
    +     -not [string]::IsNullOrWhiteSpace($_)
    + } | ForEach-Object{...}
    +
    +
  4. +
  5. At last checks each splitted token against the searching words: +
    +
     if($Token -like "*$Word*") {
    +     $before, $after = $Token -Split "$Word";
    +     ...
    + }
    +
    +
  6. +
  7. Now we have $before, $Word, $after, so just need to Write-Host $Word with color to highlight the wanted $Word.
  8. +
+ +

That’s done, pretty cool and quite straightforward, nothing complicated, I like it so much.

+ +

I contacted Prateek to ask if I can use his idea to write something similar but with another method, he said YES and that comes my Select-ColorString, thanks Prateek again.

+ +

Select-ColorString

+ +

Although Prateek Singh’s Trace-Word is wonderful enough, I still want a bit more capabilities: the regex and the customizable color choice.

+ +

The first thing that I thought about the regex is Select-String which I’m using almost everyday with sls.

+ +
+

Sometimes I was obliged to use the DOS command-line findstr due to that Select-String catches the input too earlier before it is been displayed a pure string on console screen. But findstr just finds what you want among what is shown on the screen. Although $input | Out-String | Select-String might solve the issue sometimes but it’s not sexy to use 2 cmdlets to do one single task and sometimes this workaround even doesn’t work.

+
+ +

Powershell Select-String returns some MatchInfo objects, from its MemberType, the Matches property is what I will use to color the matching patterns. The Index key gives the index of the first char of the matching pattern in a given line string, with that I know from where I could Write-Host with color.

+ +
PS> 'a is good, b is good too' | sls good -AllMatches | gm
+
+
+   TypeName:Microsoft.PowerShell.Commands.MatchInfo
+
+Name         MemberType Definition
+----         ---------- ----------
+Equals       Method     bool Equals(System.Object obj)
+GetHashCode  Method     int GetHashCode()
+GetType      Method     type GetType()
+RelativePath Method     string RelativePath(string directory)
+ToString     Method     string ToString(), string ToString(string directory)
+Context      Property   Microsoft.PowerShell.Commands.MatchInfoContext Context {get;set;}
+Filename     Property   string Filename {get;}
+IgnoreCase   Property   bool IgnoreCase {get;set;}
+Line         Property   string Line {get;set;}
+LineNumber   Property   int LineNumber {get;set;}
+Matches      Property   System.Text.RegularExpressions.Match[] Matches {get;set;}
+Path         Property   string Path {get;set;}
+Pattern      Property   string Pattern {get;set;}
+
+
+PS> 'a is good, b is good too' | sls good -AllMatches | % matches
+
+
+Groups   : {0}
+Success  : True
+Name     : 0
+Captures : {0}
+Index    : 5
+Length   : 4
+Value    : good
+
+Groups   : {0}
+Success  : True
+Name     : 0
+Captures : {0}
+Index    : 16
+Length   : 4
+Value    : good
+
+ +

So for my Select-ColorString, its logic is:

+ +
    +
  1. +

    Split the input content in lines.

    + +
    +
     foreach ($line in $Content) {...}
    +
    +
  2. +
  3. +

    Find all the matches in a given line.

    + +
    +
     $paramSelectString = @{
    +         Pattern       = $Pattern
    +         AllMatches    = $true
    +         CaseSensitive = $CaseSensitive
    + }
    + $matchList = $line | Select-String @paramSelectString
    +
    +
  4. +
  5. +

    Write without color for the string before the match.

    + +
    +
     $index = 0
    + foreach ($myMatch in $matchList.Matches) {
    +     $length = $myMatch.Index - $index
    +     Write-Host $line.Substring($index, $length) -NoNewline
    +     ...
    + }
    +
    +
  6. +
  7. +

    Right after, write the match with color.

    + +
    +
     foreach ($myMatch in $matchList.Matches) {
    +     ...
    +     $paramWriteHost = @{
    +         Object          = $line.Substring($myMatch.Index, $myMatch.Length)
    +         NoNewline       = $true
    +         ForegroundColor = $ForegroundColor
    +         BackgroundColor = $BackgroundColor
    +     }
    +     Write-Host @paramWriteHost
    +     ...
    + }
    +
    +
  8. +
  9. +

    Recalculate the index for the next match in the same line.

    + +
    +
     $index = 0
    + foreach ($myMatch in $matchList.Matches) {
    +     ...
    +     $index = $myMatch.Index + $myMatch.Length
    + }
    +
    +
  10. +
  11. +

    When there’s no more matches in the same line, just write without color all the rest.

    + +
    +
     $index = 0
    + foreach ($myMatch in $matchList.Matches) {
    +     ...
    +     $index = $myMatch.Index + $myMatch.Length
    + }
    + Write-Host $line.Substring($index)
    +
    +
  12. +
+ +

That’s all, let’s see a demo on Select-ColorString.

+ +

Select-ColorString demo

+ +

The demo reads in real-time a test file and use Select-ColorString to highlight the keyword warn

+ +

+ +

Select-String & -Split

+ +

In fact Powershell -split operator can also take regex pattern, and is as powerful as Select-String can do in terms of searching pattern. The reason that I chose Select-String instead of -split is because :

+ +
    +
  1. +

    Select-String makes sense to ‘port’ Unix grep on Powershell, they’re both for searching patterns and display them.

    +
  2. +
  3. +

    -split just splits the line by pattern, you still need to iterate on each splitted token and perform a -like or -match operation, which might take more time to display then Select-String does, as the later stores the matches already, it just needs to move the index and display the matches in color. But to be honest, I’ve never tested the execution duration difference between -split and Select-String, maybe -split is faster.

    +
  4. +
+ +

When I have time, I will write new function based on -split with regex to test its power.

+ +

Trace-Word & Select-ColorString

+ +

Both of them are in my toolkit, and I use them in different scenarios.

+ +
    +
  • When I only need to search patterns based on words, I will use Trace-Word, as it can display different colors on different words. A typical use case is monitoring the log files which have some keywords like info, warning, error, etc. The output is much more beautiful.
  • +
  • When I need to search patterns which include white space for example, I will use Select-ColorString as it takes regex and it doesn’t split the line by white space in advance
  • +
+ +

BTW, I also set an alias on each of them:

+ +
PS> Set-Alias tw Trace-Word
+PS> Set-Alias scs Select-ColorString
+
+ +

Update 2018-11-19 on new switch -MultiColorsForSimplePattern

+ +

I added a new switch -MultiColorsForSimplePattern last week. This switch enables the Select-ColorString to display the different keywords in different colors just like Trace-Word. This is very useful at least for me to search some keywords like error, warning in the log files.

+ +

+ +

There’s a limitation on this new switch that the multicolors only works for simple pattern which contains only keywords separated by “|” as shown in above screenshot. And it cannot be used with regex, this is because by using regex, the color selection will take much more time than the simple keywords. Maybe in the future I will add a new switch -MultiColorsForRegexPatternWithFastCpu.

+ +

Select-ColorString source code

+ +

Finally, you can find the the source code of Select-ColorString on Github.

+ +
+

As I forced to use only a few of the original Select-String’s parameters, Select-ColorString cannot do everything that Select-String does, that’s why I said more or less at the beginning of this post.

+ +

Some better ways that I think to archive the goal is whether use ValueFromRemainingArguments to send all the remaing non-handled Select-ColorString parameters to Select-String, whether let Microsoft Powershell team to modify directly the Types.ps1xml

+
+ +

+
+
+
1
+2
+3
+4
+5
+6
+7
+8
+9
+10
+11
+12
+13
+14
+15
+16
+17
+18
+19
+20
+21
+22
+23
+24
+25
+26
+27
+28
+29
+30
+31
+32
+33
+34
+35
+36
+37
+38
+39
+40
+41
+42
+43
+44
+45
+46
+47
+48
+49
+50
+51
+52
+53
+54
+55
+56
+57
+58
+59
+60
+61
+62
+63
+64
+65
+66
+67
+68
+69
+70
+71
+72
+73
+74
+75
+76
+77
+78
+79
+80
+81
+82
+83
+84
+85
+86
+87
+88
+89
+90
+91
+92
+93
+94
+95
+96
+97
+98
+99
+100
+101
+102
+103
+104
+105
+106
+107
+108
+109
+110
+111
+112
+113
+114
+115
+116
+117
+118
+119
+120
+121
+122
+123
+124
+125
+126
+127
+128
+129
+130
+131
+132
+133
+134
+135
+136
+137
+138
+139
+140
+141
+142
+143
+144
+145
+146
+147
+148
+149
+150
+151
+152
+153
+154
+155
+156
+157
+158
+159
+160
+161
+162
+163
+164
+165
+166
+167
+168
+169
+170
+171
+172
+173
+174
+175
+176
+177
+178
+179
+180
+181
+182
+183
+184
+185
+186
+187
+188
+189
+190
+191
+192
+193
+194
+195
+196
+197
+198
+199
+200
+201
+202
+203
+204
+205
+206
+207
+208
+209
+210
+211
+212
+213
+214
+215
+216
+217
+218
+219
+220
+221
+222
+
function Select-ColorString {
+     <#
+    .SYNOPSIS
+
+    Find the matches in a given content by the pattern and write the matches in color like grep.
+
+    .NOTES
+
+    inspired by: https://ridicurious.com/2018/03/14/highlight-words-in-powershell-console/
+
+    .EXAMPLE
+
+    > 'aa bb cc', 'A line' | Select-ColorString a
+
+    Both line 'aa bb cc' and line 'A line' are displayed as both contain "a" case insensitive.
+
+    .EXAMPLE
+
+    > 'aa bb cc', 'A line' | Select-ColorString a -NotMatch
+
+    Nothing will be displayed as both lines have "a".
+
+    .EXAMPLE
+
+    > 'aa bb cc', 'A line' | Select-ColorString a -CaseSensitive
+
+    Only line 'aa bb cc' is displayed with color on all occurrences of "a" case sensitive.
+
+    .EXAMPLE
+
+    > 'aa bb cc', 'A line' | Select-ColorString '\(a)|\(\sb)' -CaseSensitive -BackgroundColor White
+
+    Only line 'aa bb cc' is displayed with background color White on all occurrences of regex '\(a)|\(\sb)' case sensitive.
+
+    .EXAMPLE
+
+    > 'aa bb cc', 'A line' | Select-ColorString b -KeepNotMatch
+
+    Both line 'aa bb cc' and 'A line' are displayed with color on all occurrences of "b" case insensitive,
+    and for lines without the keyword "b", they will be only displayed but without color.
+
+    .EXAMPLE
+
+    > Get-Content app.log -Wait -Tail 100 | Select-ColorString \"error|warning|critical\" -MultiColorsForSimplePattern -KeepNotMatch
+
+    Search the 3 key words "error", "warning", and "critical" in the last 100 lines of the active file app.log and display the 3 key words in 3 colors.
+    For lines without the keys words, hey will be only displayed but without color.
+
+    .EXAMPLE
+
+    > Get-Content \"C:\Windows\Logs\DISM\dism.log\" -Tail 100 -Wait | Select-ColorString win
+
+    Find and color the keyword "win" in the last ongoing 100 lines of dism.log.
+
+    .EXAMPLE
+
+    > Get-WinEvent -FilterHashtable @{logname='System'; StartTime = \(Get-Date).AddDays\(-1)} | Select-Object time\*,level\*,message | Select-ColorString win
+
+    Find and color the keyword "win" in the System event log from the last 24 hours.
+    #>
+
+    [Cmdletbinding(DefaultParametersetName = 'Match')]
+    param(
+        [Parameter(
+            Position = 0)]
+        [ValidateNotNullOrEmpty()]
+        [String]$Pattern = $(throw "$($MyInvocation.MyCommand.Name) : " `
+                + "Cannot bind null or empty value to the parameter `"Pattern`""),
+
+        [Parameter(
+            ValueFromPipeline = $true,
+            HelpMessage = "String or list of string to be checked against the pattern")]
+        [String[]]$Content,
+
+        [Parameter()]
+        [ValidateSet(
+            'Black',
+            'DarkBlue',
+            'DarkGreen',
+            'DarkCyan',
+            'DarkRed',
+            'DarkMagenta',
+            'DarkYellow',
+            'Gray',
+            'DarkGray',
+            'Blue',
+            'Green',
+            'Cyan',
+            'Red',
+            'Magenta',
+            'Yellow',
+            'White')]
+        [String]$ForegroundColor = 'Black',
+
+        [Parameter()]
+        [ValidateSet(
+            'Black',
+            'DarkBlue',
+            'DarkGreen',
+            'DarkCyan',
+            'DarkRed',
+            'DarkMagenta',
+            'DarkYellow',
+            'Gray',
+            'DarkGray',
+            'Blue',
+            'Green',
+            'Cyan',
+            'Red',
+            'Magenta',
+            'Yellow',
+            'White')]
+        [ValidateScript( {
+                if ($Host.ui.RawUI.BackgroundColor -eq $_) {
+                    throw "Current host background color is also set to `"$_`", " `
+                        + "please choose another color for a better readability"
+                }
+                else {
+                    return $true
+                }
+            })]
+        [String]$BackgroundColor = 'Yellow',
+
+        [Parameter()]
+        [Switch]$CaseSensitive,
+
+        [Parameter(
+            HelpMessage = "Available only if the pattern is simple non-regex string " `
+                + "separated by '|', use this switch with fast CPU.")]
+        [Switch]$MultiColorsForSimplePattern,
+
+        [Parameter(
+            ParameterSetName = 'NotMatch',
+            HelpMessage = "If true, write only not matching lines; " `
+                + "if false, write only matching lines")]
+        [Switch]$NotMatch,
+
+        [Parameter(
+            ParameterSetName = 'Match',
+            HelpMessage = "If true, write all the lines; " `
+                + "if false, write only matching lines")]
+        [Switch]$KeepNotMatch
+    )
+
+    begin {
+        $paramSelectString = @{
+            Pattern       = $Pattern
+            AllMatches    = $true
+            CaseSensitive = $CaseSensitive
+        }
+        $writeNotMatch = $KeepNotMatch -or $NotMatch
+
+        [System.Collections.ArrayList]$colorList =  [System.Enum]::GetValues([System.ConsoleColor])
+        $currentBackgroundColor = $Host.ui.RawUI.BackgroundColor
+        $colorList.Remove($currentBackgroundColor.ToString())
+        $colorList.Remove($ForegroundColor)
+        $colorList.Reverse()
+        $colorCount = $colorList.Count
+
+        if ($MultiColorsForSimplePattern) {
+            # Get all the console foreground and background colors mapping display effet:
+            # https://gist.github.com/timabell/cc9ca76964b59b2a54e91bda3665499e
+            $patternToColorMapping = [Ordered]@{}
+            # Available only if the pattern is a simple non-regex string separated by '|', use this with fast CPU.
+            # We dont support regex as -Pattern for this switch as it will need much more CPU.
+            # This switch is useful when you need to search some words,
+            # for example searching "error|warn|crtical" these 3 words in a log file.
+            $expectedMatches = $Pattern.split("|")
+            $expectedMatchesCount = $expectedMatches.Count
+            if ($expectedMatchesCount -ge $colorCount) {
+                Write-Host "The switch -MultiColorsForSimplePattern is True, " `
+                    + "but there're more patterns than the available colors number " `
+                    + "which is $colorCount, so rotation color list will be used." `
+                    -ForegroundColor Yellow
+            }
+            0..\(\$expectedMatchesCount -1) | % {
+                $patternToColorMapping.($expectedMatches[$_]) = $colorList[$_ % $colorCount]
+            }
+
+        }
+    }
+
+    process {
+        foreach ($line in $Content) {
+            \$matchList = \$line | Select-String @paramSelectString
+
+            if (0 -lt $matchList.Count) {
+                if (-not $NotMatch) {
+                    $index = 0
+                    foreach ($myMatch in $matchList.Matches) {
+                        $length = $myMatch.Index - $index
+                        Write-Host $line.Substring($index, $length) -NoNewline
+
+                        $expectedBackgroupColor = $BackgroundColor
+                        if ($MultiColorsForSimplePattern) {
+                            $expectedBackgroupColor = $patternToColorMapping[$myMatch.Value]
+                        }
+
+                        $paramWriteHost = @{
+                            Object          = $line.Substring($myMatch.Index, $myMatch.Length)
+                            NoNewline       = $true
+                            ForegroundColor = $ForegroundColor
+                            BackgroundColor = $expectedBackgroupColor
+                        }
+                        Write-Host @paramWriteHost
+
+                        $index = $myMatch.Index + $myMatch.Length
+                    }
+                    Write-Host $line.Substring($index)
+                }
+            }
+            else {
+                if ($writeNotMatch) {
+                    Write-Host "$line"
+                }
+            }
+        }
+    }
+
+    end {
+    }
+}
+
+ + + +
+ +
+ + + + + + + +

+ Tags: + + + , + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2018/05/powershell-stop-parsing.html b/2018/05/powershell-stop-parsing.html new file mode 100644 index 00000000..5cb117f2 --- /dev/null +++ b/2018/05/powershell-stop-parsing.html @@ -0,0 +1,879 @@ + + + + + + +Powershell stop-parsing (--%) - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +
+ + +
+ + + + + +
+

A friend of mine told me about the Powershell stop-parsing (--%) last year, he said the stop-parsing tells powershell to treat the remaining characters in the line as a literal, but I’d never known where to use it. Recently working on git ssh made it happened.

+
+ +

The use case is I needed to git push by using a ssh key instead of the https wincred. So at first I needed to generate a ssh key pair. I used the ssh-keygen.exe provided by GitForWindows.

+ +

To generate a ssh key pair from Powershell :

+ +
> ssh-keygen.exe -t rsa -b 4096 -C "your_email@example.com"
+
+Eenerating public/private rsa key pair.
+Enter file in which to save the key (/c/Users/xiang/.ssh/id_rsa):
+nter passphrase (empty for no passphrase):
+Enter same passphrase again:
+Your identification has been saved in /c/Users/xiang/.ssh/id_rsa.
+Your public key has been saved in /c/Users/xiang/.ssh/id_rsa.pub.
+The key fingerprint is:
+SHA256:msbOTbVaHD2W3BNBmhxHkpJ7FWhdLhzFWj8Q0IDAiU0 xiang.zhu@outlook.com
+The key's randomart image is:
++---[RSA 4096]----+
+|      =Eo .+*BO=o|
+|     . + .o.+Xo+o|
+|           ++.=oo|
+|          .o.o.+.|
+|        S o.* o .|
+|     . o o + . . |
+|      = . +      |
+|     + o o       |
+|      o o        |
++----[SHA256]-----+
+
+ +

Press twice enter will create the key pair (id_rsa and id_rsa.public) without passphrase in the default Windows SSH keys’ location Join-Path $env:HOMEDRIVE $env:HOMEPATH | Join-Path -ChildPath .ssh which is at C:\Users\xiang\.ssh on my computer.

+ +
+

It is highly recommended to secure your SSH key by a passphrase : ssh-keygen -N 'yourPassphraseHere'.

+
+ +

Everything works well till now, and the command ssh-keygen is easy to use. But the real use case is to generate the ssh key pair on a remote Windows server. I thought it should be easy too. Just install GitForWindows on the remote Windows server, add the Git paths to the user’s env PATH (I’m not admin on the remote server), and run the same ssh-keygen command ? Imagination remains imagination, let’s see the real world :

+ +
[RemoteServer]: PS> ssh-keygen.exe -t rsa -b 4096 -C "your_email@example.com"
+Generating public/private rsa key pair.
+Enter file in which to save the key (/c/Users/Administrator/.ssh/id_rsa):
+[RemoteServer]: PS>
+[RemoteServer]: PS>Test-Path C:\Users\Administrator\.ssh
+False
+[RemoteServer]: PS>
+
+ +

Hmm… it seems that remote PsSession doesn’t support ssh-keygen’s interactive prompt dialog. It closed the prompt without giving me the chance to talk with ssh-keygen. Never mind, ssh-keygen --help shows me what is the one-line command without prompt :

+ +
# param --help doesnt exist
+> ssh-keygen --help
+ssh-keygen: unknown option -- -
+usage: ssh-keygen [-q] [-b bits] [-t dsa | ecdsa | ed25519 | rsa]
+                  [-N new_passphrase] [-C comment] [-f output_keyfile]
+       ssh-keygen -p [-P old_passphrase] [-N new_passphrase] [-f keyfile]
+       ...
+
+ +

The first one from the above help file seems good, let’s try it out :

+ +
[RemoteServer]: PS> ssh-keygen -q -t rsa -b 4096 -N '' -C 'xiang.zhu@outlook.com' -f C:\Users\xiang\.ssh\id_rsa
+
+Too many arguments.
+usage: ssh-keygen [-q] [-b bits] [-t dsa | ecdsa | ed25519 | rsa]
+                  [-N new_passphrase] [-C comment] [-f output_keyfile]
+       ...
+
+ +

Still failed, but this time it threw Too many arguments error, very strange, all the arguments are valid as per ssh-keygen’s help.

+ +

Searched on Google, finally found that someone raised already an issue on PowerShell/Win32-OpenSSH github repo. It is because Powershell thinks -f is a powershell native parameter for parsing.

+ +

For example, to parse a DateTime to a sortable string :

+ +
> '{0:s}' -f (Get-Date)
+2018-05-15T20:41:55
+
+ +

So I added the stop-parsing symbol --% just after ssh-keygen.exe, and my ssh keys are managed to be created :

+
[RemoteServer]: PS> ssh-keygen.exe --% -q -t rsa -b 4096 -N '' -C 'xiang.zhu@outlook.com' -f C:\Users\administrator\.ssh\id_rsa
+ssh-keygen.exe : Saving key "C:\\Users\\administrator\\.ssh\\id_rsa" failed: No such file or directory
+    + CategoryInfo          : NotSpecified: (Saving key "C:\...le or directory:String) [], RemoteException
+    + FullyQualifiedErrorId : NativeCommandError
+
+# I need to create the folder .ssh in advance
+[RemoteServer]: PS> md C:\Users\administrator\.ssh
+
+    Directory: C:\Users\administrator
+
+Mode                LastWriteTime         Length Name
+----                -------------         ------ ----
+d-----        5/15/2018   8:35 PM                .ssh
+
+[RemoteServer]: PS> ssh-keygen.exe --% -q -t rsa -b 4096 -N '' -C 'xiang.zhu@outlook.com' -f C:\Users\administrator\.ssh\id_rsa
+[RemoteServer]: PS> gci C:\Users\Administrator\.ssh
+
+    Directory: C:\Users\Administrator\.ssh
+
+Mode                LastWriteTime         Length Name
+----                -------------         ------ ----
+-a----        5/15/2018   8:36 PM           3243 id_rsa
+-a----        5/15/2018   8:36 PM            747 id_rsa.pub
+
+[RemoteServer]: PS>
+
+ +

Some references on stop-parsing (not many resources on Internet):

+ +
    +
  • https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_parsing?view=powershell-6
  • +
  • https://ss64.com/ps/stop-parsing.html
  • +
  • https://rkeithhill.wordpress.com/2012/01/02/powershell-v3-ctp2-provides-better-argument-passing-to-exes/
  • +
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2018/05/setting-up-github-pages-with-custom-domain-over-https.html b/2018/05/setting-up-github-pages-with-custom-domain-over-https.html new file mode 100644 index 00000000..17197bee --- /dev/null +++ b/2018/05/setting-up-github-pages-with-custom-domain-over-https.html @@ -0,0 +1,902 @@ + + + + + + +Setting up Github Pages With custom domain over HTTPS - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +
+ + +
+ + + +
+

With Github pages, we can create our blogs in our own domain over HTTPS completely free. Of course you should pay for your domain name at the Registrar.

+
+ +

Create Github pages on Github.com

+ +
    +
  1. On Github create a repo with name : githubUserName.github.io
  2. +
  3. Push a file index.html to branch master or gh-pages +
  4. +
  5. Now you can access your github page by going to githubUserName.github.io
  6. +
+ +

From now on, you’ve created a fully operational blog on http://githubUserName.github.io, you can also enable HTTPS on it by going to the repo’s settings menu, everything is free.

+ +

If you dont need to use a custom domain like http://yourname.com, you can stop here, but if you want it, please go ahead.

+ +

Register a custom domain

+ +

Register a custom domain on your preferred domain name registrar

+ +

Setup DNS on DNS registrar

+ +
    +
  1. +

    Add subdomain

    + +

    https://help.github.com/articles/setting-up-a-www-subdomain/

    + +
      +
    • Add a CNAME DNS record pointing www to copdips.github.io +
    • +
    • Add a CNAME DNS record pointing blog to copdips.github.io +
    • +
    +
  2. +
  3. +

    Add APEX domain

    + +

    My DNS registrar doesn’t support ALIAS nor ANAME, I should go with the A records :

    + +
      +
    • Add a A DNS record pointing @ to 185.199.108.153 +
    • +
    • Add a A DNS record pointing @ to 185.199.109.153 +
    • +
    • Add a A DNS record pointing @ to 185.199.110.153 +
    • +
    • Add a A DNS record pointing @ to 185.199.111.153 +
    • +
    +
  4. +
+ +

Enable custom domain on Github.com

+ +
    +
  1. +

    Go to github repo

    + +

    https://github.com/githubUserName/githubUserName.github.io

    +
  2. +
  3. Add your custom domain in : Settings -> Options -> GitHub Pages -> Custom domain +
      +
    • If you’ll just run a blog on your domain, I suggest to use APEX domain name here instead of subdomain, for example: yourdomain.com
    • +
    • This step creates implicitly a file named CNAME under the root of your git repo, the content of the file CNAME is just your custom domain.
    • +
    • The commit message is ‘Create CNAME’
    • +
    +
  4. +
  5. On the same page, the option Enable HTTPS serves to redirect HTTP traffic to HTTPS. The option is grayed out for the moment, because initially https://yourdomain.github.io is binded with a github’s certificate so as to https://youdomain.com. In order to secure correctly your new site https://youdomain.com, Github needs to ask LetsEncrypt to issue a new certificate where the CN is youdomain.com, than when people visit your web site, they will see a green padlock in the address bar. The generation of LetsEncryt certificate takes usually 1 or 2 days, be patient, once you see a green lock when you open https://youdomain.com, you can come back here and enable the option Enable HTTPS.
  6. +
+ +

Enable HTTPS for custom domain with Cloudflare

+ +
+

This solution is partially deprecated as Github supports natively HTTPS for custom domains now, but Github pages doesn’t provide the wildcard certificate yet. For a better compatibility, Cloudflare HTTPS solution is still one of the best choices.

+
+ +

Some tutorials : +tutorial 1 +, +tutorial 2

+ +

Simplified steps :

+ +
    +
  1. Sign up for a free Cloudflare Account
  2. +
  3. Follow the wizard, give your custom domain, Cloudflare should find all your CNAME and A records.
  4. +
  5. Cloudflare should ask you to change your custom domain’s default DNS servers given by your DNS registrar to the Cloudflare ones. +
      +
    • The change may take several hours to take effect
    • +
    • Cloudflare DNS example: vida.ns.cloudflare.com, zod.ns.cloudflare.com
    • +
    +
  6. +
  7. Go to Crypto tab, verify SSL is set to Full
  8. +
  9. Go to Page Rules tab, add a page rule : http://customdomain.com/ with Always Use HTTPS +
  10. +
+ +

If everything goes well, you can access your custom domain by HTTPS. And if you verify the HTTPS certificate, it should be signed by COMODO, the certificate’s CN is a cloudflare.com server and one of the SAN is your custom domain.

+ +

Enable HTTPS for custom domain With Github

+ +

Github announced very recently (on May 01, 2018) the support of HTTPS for custom domains, this is really a great feature. After the test, I found that the HTTPS certificate is signed by letsencrypt.org where the CN is your github.io’s CNAME, and everything is free. Thx Github and LetsEncrypt !

+ +

You can also enable the HTTP to HTTPS automatic redirection from here.

+ +

If you use subdomain (for ex: www.copdips.com), hereunder the HTTPS tests :

+ +
    +
  • typed http://copdips.com, redirected to https://www.copdips.com
  • +
  • typed http://www.copdips.com, redirected to https://www.copdips.com
  • +
  • +

    typed https://copdips.com, redirected to the same https://copdips.com with a certificate error, as LetsEncrypt only signed to www.copdips.com in the CN.

    + +
    +

    With Cloudflare’s HTTPS solution, there’s no such error, as Cloudflare signed a wildcard certificate to *.copdips.com in the SAN.

    +
    +
  • +
+ +

If you use APEX domain (for ex: copdips.com), hereunder the HTTPS tests :

+
    +
  • typed http://copdips.com, redirected to https://copdips.com
  • +
  • typed http://www.copdips.com, redirected to https://copdips.com
  • +
  • typed https://copdips.com, redirected https://copdips.com
  • +
  • +

    typed https://www.copdips.com, redirected to https://copdips.com

    + +
    +

    With APEX domain, everything is good on HTTPS with native Github solution, you dont need Cloudflare

    +
    +
  • +
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2018/05/setting-up-jekyll-with-minimal-mistakes-theme-on-windows.html b/2018/05/setting-up-jekyll-with-minimal-mistakes-theme-on-windows.html new file mode 100644 index 00000000..4b12d81a --- /dev/null +++ b/2018/05/setting-up-jekyll-with-minimal-mistakes-theme-on-windows.html @@ -0,0 +1,1066 @@ + + + + + + +Setting up Jekyll with Minimal Mistakes theme on Windows - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 7 minute read + + + +

+ + +
+ + +
+ + + +
+

Do you want to preview Jekyll blog locally on Windows before publishing it to Internet? Many online tutorials about setting up Jekyll on Windows are out of date, I will show you in this post the 2018 version and with the Minimal Mistakes theme.

+
+ +

Some online tutorials

+ + + +

Install Ruby and Devkit on Windows

+ +

Jekyll is writted in Ruby, to preview Jekyll blog content, we need to install Ruby and Ruby DevKit.

+ +
+

Which Development Kit?

+ +

rubyinstaller.org: Starting with Ruby 2.4.0 we use the MSYS2 toolchain as our development kit. When using the Ruby+Devkit installer version, it is a selectable component, so that no additional downloads/installs are required.

+ +

When using the Ruby without Devkit version, the MSYS2 Devkit can be installed separately by running ridk install. MSYS2 is required to build native C/C++ extensions for Ruby and is necessary for Ruby on Rails. Moreover it allows the download and usage of hundreds of Open Source libraries which Ruby gems can depend on.

+
+ +

Download and install the Ruby+DevKit from the with Devkit part of the following downloads page: +https://rubyinstaller.org/downloads/

+ +

Install Jekyll Ruby package and its dependencies

+ +

Ruby uses gem to install the Ruby packages.

+ +

Change gem source if default https://rubygems.org/ banned in China:

+ +
gem sources --add https://ruby.taobao.org/ --remove https://rubygems.org/
+
+ +

To install the basic Jekyll environment, open a Powershell console:

+ +
> gem install bundler
+> gem install jekyll
+...
+Done installing documentation for public_suffix, addressable, colorator, http_parser.rb, eventmachine, em-websocket, concurrent-ruby, i18n, rb-fsevent, ffi, rb-inotify, sass-listen, sass, jekyll-sass-converter, ruby_dep, listen, jekyll-watch, kramdown, liquid, mercenary, forwardable-extended, pathutil, rouge, safe_yaml, jekyll after 55 seconds
+25 gems installed
+
+ +

Choose a theme

+ +

Googling will give you many Jekyll theme, this blog is using the minimal-mistakes theme,

+ +

By using the procedure provided by the quick start guide of the minimal mistakes theme, we can install all the Jekyll dependencies

+ +

Customize the theme

+ +

The _config.yml file

+ +

All the global configurations are set here, this is your starting point

+ +

Add Disqus comment system

+ +
    +
  1. Create an account on https://disqus.com/ +
  2. +
  3. Create a shortname on : https://disqus.com/admin/create/ +
  4. +
  5. Edit file _config.yml +
  6. +
+ +
comments:
+  provider               : "disqus" # false (default), "disqus", "discourse", "facebook", "google-plus", "staticman", "staticman_v2" "custom"
+  disqus:
+    shortname            : "the shortname created in step 2"
+
+ +

If you want to enable comment system by default on all the blog posts, set comments in defaults part of _config.yml to true :

+
# Defaults
+defaults:
+  # _posts
+  - scope:
+      path: ""
+      type: posts
+    values:
+      layout: single
+      author_profile: true
+      read_time: true
+      comments: true
+      share: true
+      related: true
+
+ +

Default page layout

+ +

In _config.yml, I chose single as my post default layout style.

+ +

The layout can be found at : _layouts\single.html

+ +

Add update date in each post under the post title

+ +

Add last_modified_at: in the post headers.

+ +

Per page layout

+ +

On the top of the post, you can add your YAML Front Matter:

+
---
+layout: single
+title: "Setting Up Powershell Gallery And Nuget Gallery" # title shown in home page
+excerpt: "As like [pypi](https://pypi.org/) for Python, [npm](https://www.npmjs.com/) for Node.js, we also have [Powershell Gallery](https://www.powershellgallery.com/) for Powershell to add some extra Powershell modules, and [Nuget Gallery](https://www.nuget.org/) for Powershell to add some extra executables." # excerpt shown in home page under title
+permalink: # global permalink is set in_config.yml
+tags:
+  - nuget
+  - powershell
+  - powershell gallery
+  - proxy
+published: true
+comments: true
+author_profile: true
+# header:
+#   teaserlogo:
+#   teaser: ''
+#   image: ''
+#   caption:
+gallery:
+  - image_path: ''
+    url: ''
+    title: ''
+---
+
+ +

Homepage

+ +

The homepage is defined by : _layouts\home.html, and it uses _includes\archive-single.html as its default content

+ + + +

To customize the navigation bar on top of the blog: _data\navigation.yml, for example, I added the Home menu :

+ +
# main links
+main:
+  # - title: "Quick-Start Guide"
+  #   url: https://mmistakes.github.io/minimal-mistakes/docs/quick-start-guide/
+  # - title: "About"
+  #   url: https://mmistakes.github.io/minimal-mistakes/about/
+  # - title: "Sample Posts"
+  #   url: /year-archive/
+  # - title: "Sample Collections"
+  #   url: /collection-archive/
+  # - title: "Sitemap"
+  #   url: /sitemap/
+  - title: "Home"
+    url: /
+
+ +

The Search menu in the navigation bar is set by the search option in the global _config.yml file, the default value is false which disables the Search menu :

+ +
search                   : true # true, false (default)
+
+ +

Add notice (Primary, Info, Success, Warning, Danger)

+ +

Append a new line under the text bloc, and insert the notice tag there :

+ + +

Other external notice methods :

+ + +

Larger width

+ +

The $x-large size defined in the file _Variables.scss is set at 1280px, which is good as per the maintainer’s idea of in favor of the readability, but is still too narrow for me, I have large 34” screen, and I like the width https://docs.microsoft.com/, so just set $x-large: 1520px !default; to have similar size as Microsoft docs.

+ +

Write a post

+ +

All Jekyll posts should be written in markdown .md or HTML formats, and Jekyll uses Ruby’s Kramdown as its default markdown converter.

+ +
+

You can also use other formats for post files, but you should provide the corresponding convertor. If you want to host your Jekyll blog on the Github Pages, it is suggested to use Kramdown because Github Pages has its own white list of the Jekyll plugins, your convertor plugin might not be available on Github Pages, so your post won’t be displayed correctly as expected.

+
+ +

All post files should be put into the _posts folder, Jekyll requires blog post files to be named according to the following format:

+ +
YEAR-MONTH-DAY-title.MARKUP
+
+# examples:
+2011-12-31-new-years-eve-is-awesome.md
+2012-09-12-how-to-write-a-blog.md
+
+ +

You don’t need to put all the files under the root of _posts folder, you can also use year and month as the sub folder name :

+ +
> tree ./_posts /f
+
+D:\XIANG\GIT\COPDIPS.GITHUB.IO\_POSTS
+└─2018
+        2018-05-03-setting-up-github-pages-with-custom-domain-over-https.md
+        2018-05-07-setting-up-powershell-gallery-and-nuget-gallery-for-powershell.md
+        2018-05-16-powershell-stop-parsing.md
+
+ +

Write a draft

+ +

Jekyll draft files should be saved into _drafts folder. The files in this folder won’t be displayed.

+ +

Define the post url

+ +

The default post URL is https://yourdomain/post-name

+ +

If you want to custom it, edit permalink in the _config.xml file, I’m using the following format :

+ +
permalink: /:year/:month/:title.html
+
+ +

Change the post skin look

+ +

The Jekyll post is using the Minimal Mistake theme, so the post skin is defined by the minimal_mistakes_skin option in _config.yml file.

+ +

All skin look related files can be found in _sass folder, for example :

+ +
    +
  • _air.scss (This blog is using air skin)
  • +
  • _base.scss
  • +
  • _footer.scss
  • +
  • _sidebar.scss
  • +
  • etc.
  • +
+ +

Preview the blog locally on Windows

+ +

From Powershell console :

+ +
> bundle exec jekyll serve -w
+
+Configuration file: D:/xiang/git/copdips.github.io/_config.yml
+            Source: D:/xiang/git/copdips.github.io
+       Destination: D:/xiang/git/copdips.github.io/_site
+ Incremental build: disabled. Enable with --incremental
+      Generating...
+                    done in 6.534 seconds.
+  Please add the following to your Gemfile to avoid polling for changes:
+    gem 'wdm', '>= 0.1.0' if Gem.win_platform?
+ Auto-regeneration: enabled for 'D:/xiang/git/copdips.github.io'
+    Server address: http://127.0.0.1:4000
+  Server running... press ctrl-c to stop.
+
+ +

The outputs tell that you can visit your site from : http://127.0.0.1:4000

+ +

Except you modify the _config.yml file, all the other modifications can trigger automatically the regeneration of the blog pages, and just refresh your blog page from the navigator, you can read the new version right away. But any modification in _config.yml needs the relaunch of bundle exec jekyll serve -w command to see the result.

+ +

Add non-whitelisted plugins (gems)

+ +

GitHub Pages runs in safe mode and only allows a set of whitelisted plugins. To use the gem in GitHub Pages, one of the workarounds is to use CI (e.g. travis, github workflow) and deploy to your gh-pages branch like: jekyll-deploy-action, and I use this plugin: jekyll-spaceship in my github pages.

+ +

Using mermaid in github pages

+ +

Above jekyll-spaceship plugin can render the mermaid code but not very well as described here.

+ +

Currently, there’re two better solutions by using the mermaid javascript API.

+ +

The first solution is to use the mermaid API directly, it’s inspired by this post. You can refer to this commit to see how to use it. The steps are as follows:

+ +
    +
  1. create a file mermaid.html inside the folder _includes. The file content could be found on the mermaid js official website.
  2. +
  3. update the file _includes/head.html to include the new file mermaid.html with or without the condition on the var page.mermaid +
  4. +
  5. in post where we need to render the mermaid diagrams, just put the code in side a html div block by set the class to mermaid like: <div class="mermaid"></div>. If the step 2 has set a condition on the var page.mermaid, you need to aslo add a variable named mermaid and set its value to true in the post header.
  6. +
+ +

The seconde solution is to install the gem plugin jekyll-mermaid where the underlying implementation uses the mermaid API too, This is what I’m using as per this commit, it’s a little bitter easier than the first solution.

+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + , + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2018/05/setting-up-powershell-gallery-and-nuget-gallery-for-powershell.html b/2018/05/setting-up-powershell-gallery-and-nuget-gallery-for-powershell.html new file mode 100644 index 00000000..412c8b21 --- /dev/null +++ b/2018/05/setting-up-powershell-gallery-and-nuget-gallery-for-powershell.html @@ -0,0 +1,928 @@ + + + + + + +Setting Up Powershell gallery And Nuget gallery - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +
+ + +
+ + + +
+

As like pypi for Python, npm for Node.js, we also have Powershell Gallery for Powershell to add some extra Powershell modules, and Nuget Gallery for Powershell to add some extra executables.

+
+ +

Powershell version

+ +

All commands provided here are tested on Windows 10 with Windows Powershell v5.1.

+ +

Configure proxy in Powershell

+ +

Both Powershell Gallery et Nuget Gallery can be installed locally that we don’t need external Internet access to retrieve the packages from them, but setting up an internal Powershell Gallery or an internal Nuget Gallery is out of scope of this post.

+ +

To use the public Powershell Gallery or the public Nuget Gallery, you must have Internet access. If you’re at the office, your computer is probably behind a company proxy to access Internet. If your Internet Explorer’s proxy setting has already been configured, you can use the below command to tell Powershell to reuse the same proxy setting :

+ +
(New-Object -TypeName System.Net.WebClient).Proxy.Credentials = [System.Net.CredentialCache]::DefaultNetworkCredentials
+
+# Or batch version by using netsh (need admin privileges) :
+netsh winhttp show proxy
+netsh winhttp import proxy source=ie
+
+ +

I suggest to add the above command in your powershell profile, otherwise you should type it each time you open a new Powershell session.

+ +

Your Windows Powershell profile can be found at four locations:

+ +
$PROFILE | gm | ? membertype -eq noteproperty
+
+ +

The output of the above command :

+ +
# For Windows Powershell :
+
+   TypeName:System.String
+
+Name                   MemberType   Definition
+----                   ----------   ----------
+AllUsersAllHosts       NoteProperty string AllUsersAllHosts=C:\Windows\System32\WindowsPowerShell\v1.0\profile.ps1
+AllUsersCurrentHost    NoteProperty string AllUsersCurrentHost=C:\Windows\System32\WindowsPowerShell\v1.0\Microsoft.PowerShell_profile.ps1
+CurrentUserAllHosts    NoteProperty string CurrentUserAllHosts=d:\xiang\Documents\WindowsPowerShell\profile.ps1
+CurrentUserCurrentHost NoteProperty string CurrentUserCurrentHost=d:\xiang\Documents\WindowsPowerShell\Microsoft.PowerShell_profile.ps1
+
+ +

The two CurrentUser profile locations might differ on different computers, all depends on your MyDocuments location ( [Environment]::GetFolderPath("MyDocuments") ), and if you’re using Powershell Core, all the four locations are different than the ones in Windows Powershell. +I use usually CurrentUserAllHost because the change will only affect my profile, and even if I’m not the admin of the computer, I can still get it work. The profile location can be found at :

+ +
$PROFILE | % CurrentUserAllHosts
+
+ +

Add proxy setting in the end of your CurrentUserAllHost powershell profile :

+ +
Add-Content ($PROFILE | % CurrentUserAllHost) "`n(New-Object -TypeName System.Net.WebClient).Proxy.Credentials = [System.Net.CredentialCache]::DefaultNetworkCredentials`n"
+
+ +

As a best practice, it would be better to add the above line at the top of your profile.

+ +

Set up Powershell Gallery for Powershell

+ +

This is pretty easy for Powershell v5+ :

+ +
# I add the switch Trusted because I trust all the modules and scripts from Powershell Gallery
+Register-PSRepository -Default -InstallationPolicy Trusted
+
+ +

For Powershell with version less than v5:

+ +
Register-PSRepository -Name PSGallery -SourceLocation https://www.powershellgallery.com/api/v2/ -InstallationPolicy Trusted
+
+ +

Test :

+
> Get-PSRepository
+
+Name                      InstallationPolicy   SourceLocation
+----                      ------------------   --------------
+PSGallery                 Trusted              https://www.powershellgallery.com/api/v2/
+
+ +

Use Powershell Gallery

+ +
# Search a module which name is like poshrs*
+> find-module poshrs*
+
+Name                           Version          Source           Summary
+----                           -------          ------           -------
+PoshRSJob                      1.7.4.4          PSGallery        Provides an alternative to PSjobs with greater performance and less overhead to run commands in ...
+
+# Install the module without admin privileges
+> find-module poshrs* | install-module -Scope CurrentUser
+
+ +

Set up Nuget for Powershell

+ +

Nuget is well-known among the Windows developers.

+ +
# I also add the Trusted switch
+Register-PackageSource -Name Nuget -Location "http://www.nuget.org/api/v2" ProviderName Nuget -Trusted
+
+ +

My Nuget client is at v2, so I can only target at Nuget v2 API.

+ +
> Get-PackageProvider
+
+Name                     Version          DynamicOptions
+----                     -------          --------------
+msi                      3.0.0.0          AdditionalArguments
+msu                      3.0.0.0
+NuGet                    2.8.5.208        Destination, ExcludeVersion, Scope, SkipDependencies, Headers, FilterOnTag, ...
+PowerShellGet            1.0.0.1          PackageManagementProvider, Type, Scope, AllowClobber, SkipPublisherCheck, In...
+Programs                 3.0.0.0          IncludeWindowsInstaller, IncludeSystemComponent
+
+ +

Test :

+ +
> Get-PackageSource
+
+Name                             ProviderName     IsTrusted  Location
+----                             ------------     ---------  --------
+Nuget                            NuGet            True       http://www.nuget.org/api/v2
+PSGallery                        PowerShellGet    True       https://www.powershellgallery.com/api/v2/
+
+ +

Use Nuget

+ +
# install the latest version of GitForWindows without admin privileges
+find-package gitforwindows | install-package -Scope CurrentUser
+
+# install the latest version of Python without admin privileges
+find-package python | install-package -Scope CurrentUser
+
+# find the path of Python installation
+get-package python | % source
+
+# You need to add manually the package executable path to your USER PATH.
+# To get the current USER Path
+[System.Environment]::GetEnvironmentVariable('Path', 'User')
+
+# To set the current USER Path
+[System.Environment]::SetEnvironmentVariable('Path', $newPathInSingleStringSeparatedByColumn, 'User')
+
+ +
+

In fact, you can find out from the output of Get-PackageSource that Find-Package can search the packages and modules in both Nuget Gallery and Powershell Gallery.

+
+ +

Set up internal Powershell Gallery or Nuget Gallery

+ +

Some resources on setting up internal Powershell Gallery and Nuget Gallery:

+ +
    +
  1. Setting up an Internal PowerShellGet Repository
  2. +
  3. Powershell: Your first internal PSScript repository
  4. +
  5. PowerShell/PSPrivateGallery
  6. +
  7. Overview of Hosting Your Own NuGet Feeds
  8. +
  9. NuGet/NuGetGallery
  10. +
+ + +
+ + + + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2018/05/using-readline-in-python-repl-on-windows.html b/2018/05/using-readline-in-python-repl-on-windows.html new file mode 100644 index 00000000..d0829f76 --- /dev/null +++ b/2018/05/using-readline-in-python-repl-on-windows.html @@ -0,0 +1,816 @@ + + + + + + +Using Readline In Python REPL On Windows With PyReadline and PtPython - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +
+ + +
+ + + +
+

As an ex-sysadmin, I’m in love with the Readline. In Powershell, we have its variation PSReadline. In Python REPL on Windows OS, I’ll show you the PyReadline and the PtPython.

+
+ +

PyReadline

+ +

When you search on Internet, you will find many tutorials telling you to install a Python module called readline, but unfortunately, it’s not compatible on Windows OS :

+ +
> pip install readline
+
+Collecting readline
+  Using cached https://files.pythonhosted.org/packages/f4/01/2cf081af8d880b44939a5f1b446551a7f8d59eae414277fd0c303757ff1b/readline-6.2.4.1.tar.gz
+    Complete output from command python setup.py egg_info:
+    error: this module is not meant to work on Windows
+
+ +

On Windows, the counterpart is PyReadline, install it by :

+
pip install pyreadline
+
+ +

Here are the features of PyReadline :

+
    +
  • keyboard text selection and copy/paste
  • +
  • Shift-arrowkeys for text selection
  • +
  • Control-c can be used for copy activate with allow_ctrl_c(True) in config file
  • +
  • Double tapping ctrl-c will raise a KeyboardInterrupt, use ctrl_c_tap_time_interval(x) where x is your preferred tap time window, default - 0.3 s.
  • +
  • paste pastes first line of content on clipboard.
  • +
  • ipython_paste, pastes tab-separated data as list of lists or numpy array if all data is numeric
  • +
  • paste_mulitline_code pastes multi line code, removing any empty lines.
  • +
+ +

PyReadline was used by IPython, but since it hasn’t been maintained since 2015, IPython removed it, and replaced it by prompt_toolkit.

+ +

As PyReadline must be used inside Python REPL, you need to type import PyReadline from the very beginning of the Python REPL. To be a lazy devops, just add the import into a file and let Python to source it before the first prompt is displayed by using $env:PYTHONSTARTUP :

+ +
# In powershell console
+Add-Content $env:USERPROFILE/.pythonrc "`nimport PyReadline"
+$env:PYTHONSTARTUP = "$env:USERPROFILE/.pythonrc"
+
+ +

PtPython

+ +

Previous chapter mentioned that PyReadline is no more maintained, so here comes the PtPython.

+ +

PyPython is an interactive Python Shell, build on top of prompt_toolkit written by the same author Jonathan Slenders.

+ +

Install PtPython by :

+
pip install ptpython
+
+ +

Start it by typing simply : ptpython from the terminal, it will start a Python REPL with prompt_toolkit integrated, nothing to set on $env:USERPROFILE

+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2018/06/converting-python-json-list-to-csv-in-2-lines-of-code-by-pandas.html b/2018/06/converting-python-json-list-to-csv-in-2-lines-of-code-by-pandas.html new file mode 100644 index 00000000..6fcf76a3 --- /dev/null +++ b/2018/06/converting-python-json-list-to-csv-in-2-lines-of-code-by-pandas.html @@ -0,0 +1,796 @@ + + + + + + +Converting Python json dict list to csv file in 2 lines of code by pandas - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +
+ + +
+ + + +
+

Converting a Powershell object list to a csv file is quiet easy, for example :

+
+
6.0.2> gps | select name,id,path | ConvertTo-Csv | Out-File .\gps.csv ; ii .\gps.csv
+
+

I’ll show you in this post the Python way to convert a dict list to a csv file.

+
+ +

During my work, I got a result in Python dict list type, I needed to send it to other teams who are not some Python guys. One of the most commonly used sharing file type is the csv file. When I googled how to convert json to csv in Python, I found many ways to do that, but most of them need quiet a lot of code to accomplish this common task. I was a sysadmin, I don’t like to write many lines for a single task, and I also don’t like to reinvent the wheel. Finally, I found the Python pandas module which lets me to achieve this goal in only 2 lines of code.

+ +
+

pandas is an open source, BSD-licensed library providing high-performance, easy-to-use data structures and data analysis tools for the Python programming language.

+
+ +

Here’s the code :

+ +
>>> import json
+
+# first line of code: import the pandas module
+>>> import pandas
+
+# generate a python dict list
+>>> data= [{'name':'a', 'value':1}, {'name':'b', 'value':2}]
+
+# second line of code: convert the dict list to csv and save it into the file pandas.csv
+>>> pandas.read_json(json.dumps(data)).to_csv('pandas.csv')
+
+# verify the csv file content
+>>> with open('pandas.csv') as f:
+...     print(f.read())
+,name,value
+0,a,1
+1,b,2
+
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2018/06/git-untrack-submodule-from-git-status.html b/2018/06/git-untrack-submodule-from-git-status.html new file mode 100644 index 00000000..1bf4753a --- /dev/null +++ b/2018/06/git-untrack-submodule-from-git-status.html @@ -0,0 +1,789 @@ + + + + + + +Git untrack submodule from git status - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +
+ + +
+ + + +

When we have submodules in a git repo, even if we add the submodules’ folders into the .gitignore file, these submodules folders are still tracked from the git status output.

+ +

Method 1: .gitmodules file

+ +

There’re several methods to ignore it, one of them is in .gitmodules file, add following line ignore = dirty under each submodule, example :

+
[submodule "bundle/fugitive"]
+    path = bundle/fugitive
+    url = git://github.com/tpope/vim-fugitive.git
+    ignore = dirty
+
+ +

Method 2: switch –ignore-submodules=dirty

+ +

Another method is to use the swicth --ignore-submodules=dirty of git status (available from git version 1.7.2) and create an alias to shorten the typing.

+ +
+

–ignore-submodules[=]

+ +

Ignore changes to submodules when looking for changes. can be either "none", "untracked", "dirty" or "all", which is the default. Using "none" will consider the submodule modified when it either contains untracked or modified files or its HEAD differs from the commit recorded in the superproject and can be used to override any settings of the ignore option in git-config[1] or gitmodules[5]. When "untracked" is used submodules are not considered dirty when they only contain untracked content (but they are still scanned for modified content). Using "dirty" ignores all changes to the work tree of submodules, only changes to the commits stored in the superproject are shown (this was the behavior before 1.7.0). Using "all" hides all changes to submodules (and suppresses the output of submodule summaries when the config option status.submoduleSummary is set).

+
+ +
> git status --ignore-submodules=dirty
+
+# create the alias if you like
+> git config --glo alias.gst='status --ignore-submodules=dirty'
+
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2018/06/import-python-module-with-sys-path-when-without-init-file.html b/2018/06/import-python-module-with-sys-path-when-without-init-file.html new file mode 100644 index 00000000..9fd45dae --- /dev/null +++ b/2018/06/import-python-module-with-sys-path-when-without-init-file.html @@ -0,0 +1,841 @@ + + + + + + +Import Python module with sys.path variable when without __init__ file - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +
+ + +
+ + + +
+

We’re familiar to put a python file inside a folder, and create a __init__.py file under the same folder, then we can easily import the file by import the folder, as the folder is transformed to a python module. But if we don’t have the __init__.py, how can we import it?

+
+ +

Suppose that we have a Flask project, during the development of Flask, we need to use the function flask_ctx_get_request_id() in the file request_id.py from the repo https://github.com/Workable/flask-log-request-id.

+ +

Here is the current folder tree, there’s only one file flask.py:

+ +
D:\xiang\git\test\flask_project
+│  flask.py
+
+ +

I add the repo as a submodule:

+ +
> git submodule add https://github.com/Workable/flask-log-request-id.git
+
+ +

Then my folder tree is like this:

+ +
D:\xiang\git\test\flask_project
+│  .gitmodules
+│  flask.py
+│
+└─flask-log-request-id
+    │
+    ├─flask_log_request_id
+    │  │  ctx_fetcher.py
+    │  │  filters.py
+    │  │  parser.py
+    │  │  request_id.py
+    │  │  __init__.py
+    │  │
+    │  └─extras
+    │          celery.py
+    │          __init__.py
+    │
+    └─(more files and folders and ignored ...)
+
+ +

In flask.py, I try to import the function by importing the folder flask-log-request-id:

+
# flask.py
+from flask-log-request-id.flask_log_request_id.request_id import flask_ctx_get_request_id
+
+ +

Test the import:

+
> python .\flask.py
+  File ".\flask.py", line 1
+    from flask-log-request-id.flask_log_request_id.request_id import flask_ctx_get_request_id
+              ^
+SyntaxError: invalid syntax
+
+ +

The flask-log-request-id folder is not importable because it doesn’t contain the _init.py file. I don’t want to manually create it, it has no sense here. The workaround is to use the sys.path variable.

+ +
+

6.1.2. The Module Search Path

+ +

When a module named spam is imported, the interpreter first searches for a built-in module with that name. If not found, it then searches for a file named spam.py in a list of directories given by the variable sys.path. sys.path is initialized from these locations:

+ +
    +
  • The directory containing the input script (or the current directory when no file is specified).
  • +
  • +PYTHONPATH (a list of directory names, with the same syntax as the shell variable PATH).
  • +
  • The installation-dependent default.
  • +
+ +

On file systems which support symlinks, the directory containing the input script is calculated after the symlink is followed. In other words the directory containing the symlink is not added to the module search path.

+ +

After initialization, Python programs can modify sys.path. The directory containing the script being run is placed at the beginning of the search path, ahead of the standard library path. This means that scripts in that directory will be loaded instead of modules of the same name in the library directory. This is an error unless the replacement is intended. See section Standard Modules for more information.

+
+ +

As per the official doc, I could add the path of the flask-log-request-id folder to the sys.path variable, than the module flask_log_request_id will be directly searchable by python process.

+ +
# flask.py
+import sys
+sys.path.append(r"d:/xiang/git/test/flask_project/flask-log-request-id")
+
+from flask_log_request_id.request_id import flask_ctx_get_request_id
+
+ +

The import will still fail due to the file d:/xiang/git/test/flask_project/flask-log-request-id\flask_log_request_id\__init__.py, resolving this issue is out of scope of this blog, the adding path to the sys.path to find the flask_log_request_id module works well.

+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2018/06/install-python-on-windows-with-powershell-without-administrator-privileges.html b/2018/06/install-python-on-windows-with-powershell-without-administrator-privileges.html new file mode 100644 index 00000000..e9cf1a2e --- /dev/null +++ b/2018/06/install-python-on-windows-with-powershell-without-administrator-privileges.html @@ -0,0 +1,926 @@ + + + + + + +Install Python on Windows with Powershell without administrator privileges - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 4 minute read + + + +

+ + +
+ + +
+ + + +
+

As a Windows DevOps, I often use Powershell and Python, Powershell is installed by Windows out of box, but this is not for Python. And for my working environment, I don’t have the administrator privileges on some servers. I will show you in this post how to rapidly deploy Python on Windows as a standard user by using Powershell with Nuget.

+
+ +

Update 2019-12-30 Installing Python by Scoop

+ +

Installing Python on Windows by Scoop is the simplest way so far if you have Internet access.

+ +

To switch between different Python versions, please check this doc.

+ +

Finding Python packages

+ +

If you cannot use Find-Package to search pacakges in Nuget repository, please check my post on Setting Up Nuget for Powershell.

+ +

We will install python with version 3.6.5 and python2 with version 2.7.15.

+ +
> Find-Package python*
+Name                           Version          Source           Summary
+----                           -------          ------           -------
+python                         3.6.5            Nuget            Installs 64-bit Python for use in build scenarios.
+python-embed                   3.6.1.1          Nuget            Installs 64-bit Python for use in build scenarios a...
+python2x86                     2.7.15           Nuget            Installs 32-bit Python 2.7 for use in build scenarios.
+python2                        2.7.15           Nuget            Installs 64-bit Python 2.7 for use in build scenarios.
+Python35                       3.5.1.1          Nuget            Python 3.5 API
+Python36                       3.6.0            Nuget            Python 3.6 API
+pythonAndroid-2.7-x86_64-22... 1.0.0.7          Nuget            Python 2.7 android api version: 22.0.0 architecture... pythonAndroid-2.7-armeabi-v... 1.0.0.7          Nuget            Python 2.7 android api version: 22.0.0 architecture... pythonAndroid-2.7-x86_64-23... 1.0.0.7          Nuget            Python 2.7 android api version: 23.0.0 architecture...
+Python27Dev                    2.7.13           Nuget            Python 2.7 unofficial dev environment package
+pythonIOS-2.7-arm64-10.3       1.0.0.7          Nuget            Python 2.7 iOS api version: 10.3 architecture: arm64
+PythonPlotter                  0.2.15           Nuget            Package to allow use of matplotlib from .NET....
+Python.Runtime                 2.7.9            Nuget            Python 2.7.9 as a single, stand-alone executable wi...
+PythonLibs4CSharp              1.0.0            Nuget            A collection of Iron Python compiled libraries with...
+pythonx86                      3.6.5            Nuget            Installs 32-bit Python for use in build scenarios.
+pythonnet_py35_dotnet          2.3.0            Nuget            Python 3.5 and .NET Framework
+pythonnet_py27_dotnet          2.3.0            Nuget            Python 2.7 and .NET Framework
+Python27                       2.7.6            Nuget            Python 2.7 API
+PythonConsoleControl           1.0.1            Nuget            PythonConsole
+Python3                        3.6.3.2          PSGallery        Python3 interpreter
+PythonSelect                   1.0.0            PSGallery        Select a Python distribution to use within a PowerS...
+PythonConverter.dll            1.0.0            Nuget            Package description
+
+ +

Installing Python

+ +
# To install Python 3
+> Install-Package python -Scope CurrentUser
+
+# To install Python 2
+> Install-Package python2 -Scope CurrentUser
+
+ +

Note 2018-08-29: +Current Find-Package python* -AllVersion gives the lastest python version is v3.7.0, but this version doesn’t work, the last worked Nuget python version is v3.6.6

+ +

Adding Python to user path

+ +

I will show you the way to add Python3 into the user PATH, it will be the same way for Python2. +I use the user PATH because I’m not admin on the Windows server, I cannot modify the system PATH.

+ +
# Get python3 package info path
+> Get-Package python | % source
+C:\Users\xiang\AppData\Local\
+
+# For Nuget packages, the executable is always under the tools folder, and the tools folder is at the same level as .nupkg file.
+> ls C:\Users\xiang\AppData\Local\PackageManagement\NuGet\Packages\python.3.6.5\tools\
+
+    Directory: C:\Users\xiang\AppData\Local\PackageManagement\NuGet\Packages\python.3.6.5\tools
+
+Mode                LastWriteTime         Length Name
+----                -------------         ------ ----
+d-----       2018-06-26     00:15                DLLs
+d-----       2018-06-26     00:15                include
+d-----       2018-06-26     00:16                Lib
+d-----       2018-06-26     00:15                libs
+d-----       2018-06-26     00:49                Scripts
+d-----       2018-06-26     00:15                Tools
+-a----       2018-03-28     17:10         100504 python.exe
+-a----       2018-03-28     17:10          58520 python3.dll
+-a----       2018-03-28     17:10        3610776 python36.dll
+-a----       2018-03-28     17:10          98968 pythonw.exe
+-a----       2018-03-28     17:10          88752 vcruntime140.dll
+
+# python needs to add 2 paths to the user PATH, one is the root folder containing python.exe, another is the Sripts folder.
+> $pythonRootFolder = Join-Path (Split-Path (Get-Package python | % source)) "tools"
+> $pythonScriptsFolder = Join-Path $pythonRootFolder "Scripts"
+> $path = [System.Environment]::GetEnvironmentVariable('path', 'user')
+> $path += ";$pythonRootFolder"
+> $path += ";$pythonScriptsFolder;"
+> [System.Environment]::SetEnvironmentVariable('path', $path, 'user')
+
+ +

Reinstalling pip

+ +

The default pip3.exe and pip2.exe have some strange behavior that just don’t work :

+ +
> pip3
+Fatal error in launcher: Unable to create process using '"'
+
+> pip2
+Fatal error in launcher: Unable to create process using '"'
+
+ +

You can bypass the issue by using python -m pip, but I like to use pip directly without python -m, the trick is just reinstalling the pip:

+ +
> python -m pip uninstall pip -y
+> python -m ensurepip
+
+

Normally python -m ensurepip will install pip v9, if you want to install pip v10, just upgrade the v9:

+
> pip3  --version
+pip 9.0.3 from c:\users\xiang\appdata\local\packagemanagement\nuget\packages\python.3.6.5\tools\lib\site-packages (python 3.6)
+
+> python -m pip install -U pip
+Collecting pip
+  Using cached https://files.pythonhosted.org/packages/0f/74/ecd13431bcc456ed390b44c8a6e917c1820365cbebcb6a8974d1cd045ab4/pip-10.0.1-py2.py3-none-any.whl
+Installing collected packages: pip
+  Found existing installation: pip 9.0.3
+    Uninstalling pip-9.0.3:
+      Successfully uninstalled pip-9.0.3
+Successfully installed pip-10.0.1
+
+> pip3 --version
+pip 10.0.1 from c:\users\xiang\appdata\local\packagemanagement\nuget\packages\python.3.6.5\tools\lib\site-packages\pip (python 3.6)
+
+ +

And we can find that when installing pip v10, the pip.exe is installed too, while in pip v9, we only have pip3.exe.

+ +
> ls C:\Users\xiang\AppData\Local\PackageManagement\NuGet\Packages\python.3.6.5\tools\Scripts\
+
+    Directory: C:\Users\xiang\AppData\Local\PackageManagement\NuGet\Packages\python.3.6.5\tools\Scripts
+
+Mode                LastWriteTime         Length Name
+----                -------------         ------ ----
+-a----       2018-03-28     17:10          98187 easy_install-3.6.exe
+-a----       2018-06-26     00:49         102812 pip.exe
+-a----       2018-06-26     00:49         102812 pip3.6.exe
+-a----       2018-06-26     00:49         102812 pip3.exe
+-a----       2018-06-26     00:29          98224 ptipython.exe
+-a----       2018-06-26     00:29          98224 ptipython3.exe
+-a----       2018-06-26     00:29          98223 ptpython.exe
+-a----       2018-06-26     00:29          98223 ptpython3.exe
+-a----       2018-06-26     00:29          98207 pygmentize.exe
+
+ +

Update on 2018-07-27: +The pip version has been jumped from v10 to v18 directly, because PyPA switches the software versioning to CalVer

+ +

Configuring pip for PyPI

+ +

If you’re in enterprise environment, you may probably dont have access to the public Python packages repository https://pypi.org/, and in this case, your enterprise should have a local Artifactory which mirrors the public https://pypi.org/. So you need to add your enterprise Artifactory PyPI URL to you Python pip conf.

+ +

You can find all the pip configuration details here.

+ +

For JFrog Artifactory: +https://www.jfrog.com/confluence/display/RTF/PyPI+Repositories

+ + +
+ + + + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2018/07/convert-markdown-or-rst-to-atlassian-confluance-documentation-format.html b/2018/07/convert-markdown-or-rst-to-atlassian-confluance-documentation-format.html new file mode 100644 index 00000000..b0f6771c --- /dev/null +++ b/2018/07/convert-markdown-or-rst-to-atlassian-confluance-documentation-format.html @@ -0,0 +1,786 @@ + + + + + + +Convert markdown or rst to Atlassian Confluance documentation format - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +
+ + +
+ + + +
+

A recent working experience needed me to write doc on Atlassian Confluance documentation product. I will show you how to convert your markdown doc to Confluance version.

+
+ +

Convert markdown or rst to Confluance

+ +

Confluance’s web doc editor is very powerfull, but I a markdown guy, I write everything in markdown in pure text mode and versioning it. I need sth. to convert markdown to Confluance.

+ +

Checked on the official doc, it says that Confluence supports markdown import, but after a test, not really, at least not for table.

+ +

Solution:

+ +

Convert the markdown or rst files to a HTML file.

+ +

There’re many plugins on the net, I use VSCode editor, I choose the extension Markdown All in One, it has a method called “Markdown: Print current document to HTML”.

+ +

Once I get the HTML version, than just past the HTML content into Confluence directly. Done.

+ +

Here’s the tutorial on how to insert the HTML marco.

+ +

Convert mediawiki to Confluance

+ +

Checked on the official doc, it says that Confluence supports wiki import, but I haven’t tested yet.

+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2018/07/use-python-tabulate-module-to-create-tables.html b/2018/07/use-python-tabulate-module-to-create-tables.html new file mode 100644 index 00000000..b1526ea3 --- /dev/null +++ b/2018/07/use-python-tabulate-module-to-create-tables.html @@ -0,0 +1,833 @@ + + + + + + +Use python tabulate module to create tables - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +
+ + +
+ + + +
+

If you want to create some tables from a python list, you can use the tabulate module, it can generate the table easily in text mode and in many formats, than you can past it into markdown, wiki files or add the print version to your python CLI in order to give a beautiful output to the CLI users.

+
+ +

Install python tabulate module

+ +
> pip install tabulate
+
+ +

How to use tabulate

+ +

The official doc has already included nearly everything.

+ +

How to print in markdown, rst, wiki, html formats

+ +

For rst, wiki, html formats, the official doc has already clearly given it, but for markdown, it’s not mentioned. After the test, the "pipe" format from PHP Markdown Extra is compatible to markdown.

+ + + + + + + + + + + + + + + + + + + + + + + + + + +
filetabulate format (tablefmt)
rst“rst”
markdown“pipe”
mediawiki“mediawiki”
html“html”
+ +

Update 2019-04-23: When I tested the lastest tabulate version 0.8.3, it added support also github format.

+ +

Html code can be injected into Markdown file.

+ +

Visualize all the formats

+ +
from tabulate import _table_formats, tabulate
+
+
+format_list = list(_table_formats.keys())
+# current format list in tabulate version 0.8.3:
+# ['simple', 'plain', 'grid', 'fancy_grid', 'github', 'pipe', 'orgtbl', 'jira', 'presto', 'psql', 'rst', 'mediawiki', 'moinmoin', 'youtrack', 'html', 'latex', 'latex_raw', 'latex_booktabs', 'tsv', 'textile']
+
+
+# Each element in the table list is a row in the generated table
+table = [["spam",42], ["eggs", 451], ["bacon", 0]]
+headers = ["item", "qty"]
+
+for f in format_list:
+    print("\nformat: {}\n".format(f))
+    print(tabulate(table, headers, tablefmt=f))
+
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2018/07/use-pyvmomi-EventHistoryCollector-to-get-all-the-vcenter-events.html b/2018/07/use-pyvmomi-EventHistoryCollector-to-get-all-the-vcenter-events.html new file mode 100644 index 00000000..378dc3ec --- /dev/null +++ b/2018/07/use-pyvmomi-EventHistoryCollector-to-get-all-the-vcenter-events.html @@ -0,0 +1,930 @@ + + + + + + +Use pyVmomi EventHistoryCollector to get all the vCenter events - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 4 minute read + + + +

+ + +
+ + +
+ + + +
+

pyVmomi eventManager’s QueryEvents() method returns by default only the last 1000 events occurred on the vCenter. I will show you how to use another method CreateCollectorForEvents() to create an EventHistoryCollector object and then we use this object to collect all the events in a given time range by using its method ReadNextEvents().

+
+ +

An example of QueryEvents method from the eventManager object

+ +

Let’s see an example given by the pyVmomi samples community.

+ +
# ...some code ignored...
+byEntity = vim.event.EventFilterSpec.ByEntity(entity=vm, recursion="self")
+ids = ['VmRelocatedEvent', 'DrsVmMigratedEvent', 'VmMigratedEvent']
+filterSpec = vim.event.EventFilterSpec(entity=byEntity, eventTypeId=ids)
+# ...some code ignored...
+eventManager = si.content.eventManager
+events = eventManager.QueryEvent(filterSpec)
+# ...some code ignored...
+
+ +

From the above code example, we can find that the author wants to collect the vCenter events where the event types are limited to ids and the event entity is limited to byEntity. He creates a filterSpec based on these 2 limitations and creates an eventManager object, than passes the filterSpec to the method QueryEvent to collect the events.

+ +

The code works well, but you will find that in anyway, it will only returns maximum 1000 events. This is because eventManger uses the default event collector which pages all the events in a size of 1000 events (by default and also the maximum value) per page, and returns only the last page.

+ +

An example of Get-VIEvent from PowerCLI

+ +

In PowerCLI, we have the cmdlet Get-VIEvent + which can get all the events without the limitation of 1000 events.

+ +
Connect-VIServer -Server 10.23.113.41
+$events = Get-VIEvent -Start (Get-Date).AddDays(-1)
+
+ +

It works perfectly, but please take care of this site note:

+ +

“Calling Get-VIEvent without any parameters might result in significant delays depending on the total number of events on the server.”

+ +

This note tells us that the cmdlet might take a long time to finish if there’re too many events. In fact, that is also what I will show you in the below paragraph, Get-VIEvent uses the EventHistoryCollector to walk through all the events pages, and returns them all in the end.

+ +

Use EventHistoryCollector to collect all the events

+ +

Finally, here comes our protagonist, the EventHistoryCollector.

+ +

The EventHistoryCollector can be created by eventManger by using the CreateCollectorForEvents(filter) method. The EventHistoryCollector has a magic method: ReadNextEvents().

+ +
+

ReadNextEvents

+ +

Reads the ‘scrollable view’ from the current position. The scrollable position is moved to the next newer page after the read. No item is returned when the end of the collector is reached.

+
+ +

From it’s description, we can know that it reads all the events from the current page, than it jumps to the next page. EventHistoryCollector has also a ReadPreviousEvents() method that does exactly the same thing but jumps back to the previous page.

+ +

So, now we need to ensure from where (which event page) starting the EventHistoryCollector.

+ +

From the EventHistoryCollector doc, we find it inherits from HistoryCollector:

+ +
+

Managed Object - EventHistoryCollector(vim.event.EventHistoryCollector)

+ +

Returned by

+ +
+
CreateCollectorForEvents
+
+ +

Extends

+ +
+
HistoryCollector
+
+
+ +

A quick search on HistoryCollector, we find it has a method RewindCollector():

+ +
+

RewindCollector(rewind)

+ +

Moves the “scrollable view” to the oldest item. If you use ReadNextTasks or ReadNextEvents, all items are retrieved from the oldest item to the newest item. This is the default setting when the collector is created.

+
+ +

The last sentence tells us that the default starting page of the EventHistoryCollector is the oldest one, or we can call it the first page in an human readable manner, so we can use ReadNextEvents() to read all the events page by page.

+ +

If you want to set the EventHistoryCollector’s starting point to the newest page (the last page), you can use the ResetCollector(reset) method.

+ +

Finally, hereunder the sample code to collect all the vCenter events in the past hour:

+ +

+
+
+
1
+2
+3
+4
+5
+6
+7
+8
+9
+10
+11
+12
+13
+14
+15
+16
+17
+18
+19
+20
+21
+22
+23
+24
+25
+26
+27
+28
+29
+30
+31
+32
+33
+34
+35
+36
+37
+
from datetime import datetime, timedelta
+
+from pyVim.connect import SmartConnectNoSSL
+from pyVmomi import vim
+
+
+time_filter = vim.event.EventFilterSpec.ByTime()
+now = datetime.now()
+time_filter.beginTime = now - timedelta(hours=1)
+time_filter.endTime = now
+event_type_list = []
+# If you want to also filter on certain events, uncomment the below event_type_list.
+# The EventFilterSpec full params details:
+# https://pubs.vmware.com/vsphere-6-5/topic/com.vmware.wssdk.smssdk.doc/vim.event.EventFilterSpec.html
+# event_type_list = ['VmRelocatedEvent', 'DrsVmMigratedEvent', 'VmMigratedEvent']
+filter_spec = vim.event.EventFilterSpec(eventTypeId=event_type_list, time=time_filter)
+
+si = SmartConnectNoSSL(host=host, user=user, pwd=password, port=port)
+eventManager = si.content.eventManager
+event_collector = eventManager.CreateCollectorForEvents(filter_spec)
+page_size = 1000 # The default and also the max event number per page till vSphere v6.5, you can change it to a smaller value by SetCollectorPageSize().
+events = []
+
+while True:
+  # If there's a huge number of events in the expected time range, this while loop will take a while.
+  events_in_page = event_collector.ReadNextEvents(page_size)
+  num_event_in_page = len(events_in_page)
+  if num_event_in_page == 0:
+    break
+  events.extend(events_in_page) # or do other things on the collected events
+# Please note that the events collected are not ordered by the event creation time, you might find the first event in the third page for example.
+
+print(
+    "Got totally {} events in the given time range from {} to {}.".format(
+        len(events), time_filter.beginTime, time_filter.endTime
+    )
+)
+
+ + + +
+ +
+ + + + + + + +

+ Tags: + + + , + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2018/09/backup-and-restore-gitlab-in-docker.html b/2018/09/backup-and-restore-gitlab-in-docker.html new file mode 100644 index 00000000..601f420c --- /dev/null +++ b/2018/09/backup-and-restore-gitlab-in-docker.html @@ -0,0 +1,1144 @@ + + + + + + +Backup and restore Gitlab in docker - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 8 minute read + + + +

+ + +
+ + +
+ + + +
+

Gitlab hosts everything about the code including the docs and the pipeline data, etc. It’s crucial to back it up. You can also use restore to migrate the Gitlab to another server. This post will show you how to backup and restore the Gitlab-CE docker version.

+
+ +

Some docs on the Internet

+ +
    +
  1. Backing up and restoring Gitlab from docs.gitlab.com
  2. +
  3. Gitlab omnibus backup from docs.gitlab.com
  4. +
  5. Gitlab Backup from codereviewvideos.com
  6. +
  7. GitLab Backup Made Easy from icicletech.com
  8. +
+ +

Backup prerequisites

+ +

Tar version

+

The official doc says that the backup and restore tasks use tar with minimum version 1.3. Check the tar version by tar --version. The default tar version installed by Gitlab with docker (Gitlab-CE v10.8.3) is v1.28, after the test, the backup and restore both work well with tar in version v1.28. After the test, I find that the default tar v1.28 is also good.

+ +

VM snapshot

+ +

If your Gitlab is installed on a VM, you can create a snapshot before any action. Please note that snapshot is not a backup, you should delete it as soon as your backup or restore task is completed.

+ +

Gitlab version

+ +

Be aware that we can only restore to exactly the same version and type of Gitlab. The default backup file has the Gitlab version and type in the end of the file name which is in the format EPOCH_YYYY_MM_DD_GitLab_version.

+ +
+

https://docs.gitlab.com/ee/raketasks/backup_restore.html#backup-timestamp:

+ +

The backup archive will be saved in backup_path, which is specified in the config/gitlab.yml file. The filename will be [TIMESTAMP]_gitlab_backup.tar, where TIMESTAMP identifies the time at which each backup was created, plus the GitLab version. The timestamp is needed if you need to restore GitLab and multiple backups are available.

+ +

For example, if the backup name is 1493107454_2018_04_25_10.6.4-ce_gitlab_backup.tar, then the timestamp is 1493107454_2018_04_25_10.6.4-ce.

+
+ +

config/gitlab.yml is migrated to /etc/gitlab/gitlab.rb in newer Gitlab version

+ +

Backup Gitlab in docker

+ +

Locate backup path

+ +

gitlab_rails['backup_path'] is commented in the Gitlab configuration file gitlab.rb, its value is the default backup path which is at /var/opt/gitlab/backups.

+ +
# From Gitlab docker
+
+root@gitlab:/etc/gitlab# cat /etc/gitlab/gitlab.rb | grep backup_path
+# gitlab_rails['manage_backup_path'] = true
+# gitlab_rails['backup_path'] = "/var/opt/gitlab/backups"
+
+ +

Create the backup

+ +

You don’t need to stop anything before creating the backup.

+ +
# From Ubuntu host outside of the Gitlab docker
+
+xiang@ubuntu1804:~$ docker exec -it gitlab1083 gitlab-rake gitlab:backup:create
+Dumping database ...
+Dumping PostgreSQL database gitlabhq_production ... [DONE]
+done
+Dumping repositories ...
+ * win/flaskapi ... [DONE]
+ * win/flaskapi.wiki ...  [SKIPPED]
+ * xiang/flaskapi ... [DONE]
+ * xiang/flaskapi.wiki ...  [SKIPPED]
+done
+Dumping uploads ...
+done
+Dumping builds ...
+done
+Dumping artifacts ...
+done
+Dumping pages ...
+done
+Dumping lfs objects ...
+done
+Dumping container registry images ...
+[DISABLED]
+Creating backup archive: 1537738648_2018_09_23_10.8.3_gitlab_backup.tar ... done
+Uploading backup archive to remote storage  ... skipped
+Deleting tmp directories ... done
+done
+done
+done
+done
+done
+done
+done
+Deleting old backups ... skipping
+xiang@ubuntu1804:~$
+
+ +

The backup uses the Linux commands tar and gzip. This works fine in most cases, but can cause problems when data is rapidly changing. When data changes while tar is reading it, the error file changed as we read it may occur, and will cause the backup process to fail. In such case, you add the copy strategy to your backup command like docker exec -it gitlab1083 gitlab-rake gitlab:backup:create STRATEGY=copy.

+ +

Check the backup

+ +

In fact, I created twice the backup, so we can see two backups here with different timestamps: 1537738648_2018_09_23_10.8.3, 1537738690_2018_09_23_10.8.3.

+ +

Notice that the backup file names don’t contain the Gitlab type (ce for community edition), they only have the creation time (1537738648_2018_09_23 for the first backup file) and the Gitlab version (10.8.3).

+ +

We can also find that the backup account is git.

+ +
# From Gitlab docker
+
+root@gitlab:/etc/gitlab# ls -lart /var/opt/gitlab/backups
+total 644
+drwxr-xr-x 19 root root   4096 Sep 22 23:52 ..
+-rw-------  1 git  git  215040 Sep 23 21:37 1537738648_2018_09_23_10.8.3_gitlab_backup.tar
+-rw-------  1 git  git  215040 Sep 23 21:38 1537738690_2018_09_23_10.8.3_gitlab_backup.tar
+drwx------  2 git  root   4096 Sep 23 21:38 .
+
+ +

Backup configuration and secret files

+ +

Yes, the configuration and secret files are not backed up during the previous backup procedure. This is because the previous one encrypts the some Gitlab data by using the secret key in the configuration and secret files. If you save them to the same place, you’re just defeating the encryption.

+ +

So please also backup /etc/gitlab/gitlab.rb and /etc/gitlab/gitlab-secrets.json and save them to a secure place from other Gitlab backup data.

+ +

Upload backups to remote storage

+ +

I haven’t tested yet, here is the official doc.

+ +

Restore Gitlab

+ +

You can only restore the Gitlab backup to exactly the same Gitlab version and type. And you also need to have a working Gitlab instance.

+ +

Stop some Gitlab services

+ +
# From Gitlab docker
+
+gitlab-ctl reconfigure
+gitlab-ctl start
+gitlab-ctl stop unicorn
+gitlab-ctl stop sidekiq
+gitlab-ctl status
+ls -lart /var/opt/gitlab/backups
+
+ +

Start the restore

+ +

The backup file must can be found in the backup path, which is defined in the configuration file /etc/gitlab/gitlab.rb by the key gitlab_rails['backup_path'].

+ +
# From Ubuntu host outside of the Gitlab docker
+
+xiang@ubuntu1804:~$ docker exec -it gitlab1083 gitlab-rake gitlab:backup:restore --trace
+** Invoke gitlab:backup:restore (first_time)
+** Invoke gitlab_environment (first_time)
+** Invoke environment (first_time)
+** Execute environment
+** Execute gitlab_environment
+** Execute gitlab:backup:restore
+Unpacking backup ... done
+Before restoring the database, we will remove all existing
+tables to avoid future upgrade problems. Be aware that if you have
+custom tables in the GitLab database these tables and all data will be
+removed.
+
+Do you want to continue (yes/no)? yes
+Removing all tables. Press `Ctrl-C` within 5 seconds to abort
+(...)
+COPY 0
+ setval
+--------
+      1
+(1 row)
+
+COPY 0
+ setval
+--------
+      1
+(1 row)
+(...)
+ALTER TABLE
+ALTER TABLE
+(...)
+CREATE INDEX
+(...)
+ALTER TABLE
+ALTER TABLE
+(...)
+WARNING:  no privileges were granted for "public"
+GRANT
+[DONE]
+done
+** Invoke gitlab:backup:repo:restore (first_time)
+** Invoke gitlab_environment
+** Execute gitlab:backup:repo:restore
+Restoring repositories ...
+ * win/flaskapi ... [DONE]
+ * xiang/flaskapi ... [DONE]
+Put GitLab hooks in repositories dirs [DONE]
+done
+** Invoke gitlab:backup:uploads:restore (first_time)
+** Invoke gitlab_environment
+** Execute gitlab:backup:uploads:restore
+Restoring uploads ...
+done
+** Invoke gitlab:backup:builds:restore (first_time)
+** Invoke gitlab_environment
+** Execute gitlab:backup:builds:restore
+Restoring builds ...
+done
+** Invoke gitlab:backup:artifacts:restore (first_time)
+** Invoke gitlab_environment
+** Execute gitlab:backup:artifacts:restore
+Restoring artifacts ...
+done
+** Invoke gitlab:backup:pages:restore (first_time)
+** Invoke gitlab_environment
+** Execute gitlab:backup:pages:restore
+Restoring pages ...
+done
+** Invoke gitlab:backup:lfs:restore (first_time)
+** Invoke gitlab_environment
+** Execute gitlab:backup:lfs:restore
+Restoring lfs objects ...
+done
+** Invoke gitlab:shell:setup (first_time)
+** Invoke gitlab_environment
+** Execute gitlab:shell:setup
+This will rebuild an authorized_keys file.
+You will lose any data stored in authorized_keys file.
+Do you want to continue (yes/no)? yes
+
+** Invoke cache:clear (first_time)
+** Invoke cache:clear:redis (first_time)
+** Invoke environment
+** Execute cache:clear:redis
+** Execute cache:clear
+Deleting tmp directories ... done
+done
+done
+done
+done
+done
+done
+done
+xiang@ubuntu1804:~$
+
+ +

We can also add the param BACKUP to specify the backup file if there’re more than one backup tar file in the backup path. The value of the BACKUP is the backup file timestamp, for example : docker exec -it gitlab1083 gitlab-rake gitlab:backup:restore BACKUP=1537738690_2018_09_23_10.8.3 --trace.

+ +

Restart Gitlab with sanity check

+ +

Restart the Gitlab services by gitlab-ctl restart:

+ +
# From Gitlab docker
+
+root@gitlab:/# gitlab-ctl restart
+ok: run: alertmanager: (pid 2789) 1s
+ok: run: gitaly: (pid 2797) 0s
+ok: run: gitlab-monitor: (pid 2806) 0s
+ok: run: gitlab-workhorse: (pid 2811) 1s
+ok: run: logrotate: (pid 2827) 0s
+ok: run: nginx: (pid 2834) 1s
+ok: run: node-exporter: (pid 2839) 0s
+ok: run: postgres-exporter: (pid 2845) 1s
+ok: run: postgresql: (pid 2855) 0s
+ok: run: prometheus: (pid 2864) 0s
+ok: run: redis: (pid 2873) 1s
+ok: run: redis-exporter: (pid 2877) 0s
+ok: run: sidekiq: (pid 2957) 0s
+ok: run: sshd: (pid 2960) 0s
+ok: run: unicorn: (pid 2968) 1s
+
+ +

Launch the Gitlab sanity check by gitlab-rake gitlab:check SANITIZE=true:

+ +
root@gitlab:/# gitlab-rake gitlab:check SANITIZE=true
+Checking GitLab Shell ...
+
+GitLab Shell version >= 7.1.2 ? ... OK (7.1.2)
+Repo base directory exists?
+default... yes
+Repo storage directories are symlinks?
+default... no
+Repo paths owned by git:root, or git:git?
+default... yes
+Repo paths access is drwxrws---?
+default... yes
+hooks directories in repos are links: ...
+3/2 ... ok
+2/3 ... ok
+Running /opt/gitlab/embedded/service/gitlab-shell/bin/check
+Check GitLab API access: FAILED: Failed to connect to internal API
+gitlab-shell self-check failed
+  Try fixing it:
+  Make sure GitLab is running;
+  Check the gitlab-shell configuration file:
+  sudo -u git -H editor /opt/gitlab/embedded/service/gitlab-shell/config.yml
+  Please fix the error above and rerun the checks.
+
+Checking GitLab Shell ... Finished
+
+Checking Sidekiq ...
+
+Running? ... no
+  Try fixing it:
+  sudo -u git -H RAILS_ENV=production bin/background_jobs start
+  For more information see:
+  doc/install/installation.md in section "Install Init Script"
+  see log/sidekiq.log for possible errors
+  Please fix the error above and rerun the checks.
+
+Checking Sidekiq ... Finished
+
+Reply by email is disabled in config/gitlab.yml
+Checking LDAP ...
+
+LDAP is disabled in config/gitlab.yml
+
+Checking LDAP ... Finished
+
+Checking GitLab ...
+
+Git configured correctly? ... yes
+Database config exists? ... yes
+All migrations up? ... yesyes
+Database contains orphaned GroupMembers? ... nono
+GitLab config exists? ... yes
+GitLab config up to date? ... yes
+Log directory writable? ... yes
+Tmp directory writable? ... yes
+Uploads directory exists? ... yes
+Uploads directory has correct permissions? ... yes
+Uploads directory tmp has correct permissions? ... yes
+Init script exists? ... skipped (omnibus-gitlab has no init script)
+Init script up-to-date? ... skipped (omnibus-gitlab has no init script)
+Projects have namespace: ...
+3/2 ... yesyes
+2/3 ... yes
+Redis version >= 2.8.0? ... yes
+Ruby version >= 2.3.5 ? ... yes
+Ruby version >= 2.3.5 ? ... yes (2.3.7)
+Git version >= 2.9.5 ? ... yes (2.16.4)yes (2.3.7)
+Git version >= 2.9.5 ? ... yes (2.16.4)
+Git user has default SSH configuration? ... yes
+Active users: ... 2
+
+Checking GitLab ... Finished
+
+root@gitlab:/#
+
+ +

Verify the Gitlab container health by docker ps:

+ +
# From Ubuntu host outside of the Gitlab docker
+
+xiang@ubuntu1804:~$ docker ps
+CONTAINER ID        IMAGE                          COMMAND             CREATED             STATUS
+PORTS                                                            NAMES
+707439b39dd1        gitlab/gitlab-ce:10.8.3-ce.0   "/assets/wrapper"   2 weeks ago         Up 15 minutes (healthy)
+0.0.0.0:80->80/tcp, 0.0.0.0:443->443/tcp, 0.0.0.0:2222->22/tcp   gitlab1083
+
+ + +
+ + + + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2018/09/install-gitlab-ce-in-docker-on-ubuntu.html b/2018/09/install-gitlab-ce-in-docker-on-ubuntu.html new file mode 100644 index 00000000..225d2431 --- /dev/null +++ b/2018/09/install-gitlab-ce-in-docker-on-ubuntu.html @@ -0,0 +1,853 @@ + + + + + + +Install Gitlab-CE in Docker on Ubuntu - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +
+ + +
+ + + +
+

Gitlab-CE (Community Edition) is a completely free and powerful web-based Git-repository manager with wiki, issue-tracking and CI/CD pipeline features, using an open-source license, developed by GitLab Inc. There’re already many much better docs on the net, I’ve never worked with Docker and Linux before, so I wrote this post to save my way to install the Gitlab docker version on Ubuntu, the post is more or less for personal purpose.

+
+ +

Install Ubuntu server on Hyper-V

+ +
    +
  1. Enabled the free Hyper-V role on the Windows 10 PC.
  2. +
  3. Install Ubuntu server on the Hyper-V. (I used “Ubuntu 18.04.1 LTS”)
  4. +
+ +

Warning: Don’t install the snap version of Docker during the Ubuntu install, I failed to run the Docker image after. There’s an error saying that: “docker: Error response from daemon: error while creating mount source path ‘/srv/gitlab/logs’: mkdir /srv/gitlab: read-only file system.”. To remove the Docker snap: sudo snap remove docker.

+ +

Install Docker on Ubuntu

+ +

Here is the official doc for installing Docker on Ubuntu, just follow the procedure step by step.

+ +

The docker group is created but no users are added to it. You need to use sudo to run Docker commands. Continue to Linux postinstall to allow non-privileged users to run Docker commands and for other optional configuration steps.

+ +

To verify Docker is running fine, we can try to run a hello-world image :

+
xiang@ubuntu1804:~$ docker run hello-world
+Unable to find image 'hello-world:latest' locally
+latest: Pulling from library/hello-world
+9db2ca6ccae0: Pull complete
+Digest: sha256:4b8ff392a12ed9ea17784bd3c9a8b1fa3299cac44aca35a85c90c5e3c7afacdc
+Status: Downloaded newer image for hello-world:latest
+
+Hello from Docker!
+This message shows that your installation appears to be working correctly.
+
+To generate this message, Docker took the following steps:
+ 1. The Docker client contacted the Docker daemon.
+ 2. The Docker daemon pulled the "hello-world" image from the Docker Hub.
+    (amd64)
+ 3. The Docker daemon created a new container from that image which runs the
+    executable that produces the output you are currently reading.
+ 4. The Docker daemon streamed that output to the Docker client, which sent it
+    to your terminal.
+
+To try something more ambitious, you can run an Ubuntu container with:
+ $ docker run -it ubuntu bash
+
+Share images, automate workflows, and more with a free Docker ID:
+ https://hub.docker.com/
+
+For more examples and ideas, visit:
+ https://docs.docker.com/engine/userguide/
+
+ +

Install Gitlab CE in Docker

+ +

Here is the official Gitlab Docker doc, I really thank the Gitlab team, their doc system is one of the bests that I’ve ever seen. Another doc from IBM is also good. Run the following commands to install Gitlab-CE in Docker.

+ +
xiang@ubuntu1804:~$ docker run --detach \
+--hostname gitlab.copdips.local \
+--publish 443:443 --publish 80:80 --publish 2222:22 \
+--name gitlab \
+--restart always \
+--volume /srv/gitlab/config:/etc/gitlab \
+--volume /srv/gitlab/logs:/var/log/gitlab \
+--volume /srv/gitlab/data:/var/opt/gitlab \
+gitlab/gitlab-ce:latest
+
+xiang@ubuntu1804:~$ docker ps
+CONTAINER ID        IMAGE                          COMMAND             CREATED             STATUS                            PORTS                                                            NAMES
+707439b39dd1        gitlab/gitlab-ce:latest  "/assets/wrapper"   3 minutes ago       Up 3 minutes (health: starting)   0.0.0.0:80->80/tcp, 0.0.0.0:443->443/tcp, 0.0.0.0:2222->22/tcp   gitlab
+
+ +

Warning: I use --publish 2222:22 instead of --publish 22:22 which is given by the official Run the docker image doc, this is to avoid using the default SSH port (TCP 22) already binded to the Docker host, our Ubuntu server.

+ +

Warning: Do NOT use port 8080 otherwise there will be conflicts. This port is already used by Unicorn that runs internally in the container.

+ +

Note: There’s also a Docker compose way to install Gitlab-CE.

+ +

Check Gitlab

+ +

Open your browser, go to http://YourUbuntuServerIP/, you should see the Gitlab login page. On this page, you need to set the Gitlab root user initial password.

+ +

If you like to use HTTPS, you need to generate a SSL certificate and add it to Gitlab the config file.

+ +

Run Gitlab in Kubernetes

+ +

IBM has provided a doc about it.

+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + , + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2018/09/install-gitlab-runner-on-windows-by-powershell-psremoting.html b/2018/09/install-gitlab-runner-on-windows-by-powershell-psremoting.html new file mode 100644 index 00000000..3eab5f6e --- /dev/null +++ b/2018/09/install-gitlab-runner-on-windows-by-powershell-psremoting.html @@ -0,0 +1,897 @@ + + + + + + +Install Gitlab Runner on Windows by Powershell PsRemoting - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +
+ + +
+ + + +
+

Gitlab runner can be installed on Windows OS. For people like me who is more familiar with Windows, we would like to use Windows as a Gitlab runner. This post will give you a simplified procedure (winrm PsRemoting full command line) about its installation with some tips and tricks that I haven’t seen anywhere on the Internet.

+
+ +

Some docs on the Internet

+ +

The official doc is complete and clear enough.

+ +

Download Gitlab runner executable

+ +
# This command is runned from my Windows 10 desktop.
+$paramIwr = @{
+  Uri = "https://gitlab-runner-downloads.s3.amazonaws.com/latest/binaries/gitlab-runner-windows-amd64.exe";
+  OutFile = "D:\xiang\Downloads\gitlab-runner-windows-amd64.exe"
+}
+Invoke-WebRequest @paramIwr
+
+ +

Install Gitlab runner on Windows

+ +

Some official docs:

+
    +
  1. +Install gitlab runner on windows.
  2. +
  3. Gitlab-runner installation related commands
  4. +
+ +

My Gitlab runner is a fresh Windows server 2019 VM named 19S01.

+ +
# Use WinRM over HTTPS is the simplest way to connect to an out of the box workgroup Windows server in lab.
+$s19s01 = New-PSSession 19S01 -UseSSL -SessionOption (New-PSSessionOption -SkipCACheck) -Credential administrator
+
+# ntrights is in the Windows Server 2003 Resource Kit Tools
+# https://www.microsoft.com/en-us/Download/confirmation.aspx?id=17657
+Copy-Item D:\xiang\Dropbox\tools\windows\rktools\ntrights.exe c:/temp -ToSession $s19s01
+Copy-Item D:\xiang\Downloads\gitlab-runner-windows-amd64.exe c:/temp -ToSession $s19s01
+
+Enter-PSSession $s19S01
+
+# If you need to use a domain account to run the gitlab-runner server, this way is not recommended.
+# c:/temp/ntrights.exe ntrights +r SeServiceLogonRight -u Domain\DomainAccount
+
+New-Item d:/app/gitlab-runner -Type Directory -Force
+Copy-Item C:\temp\gitlab-runner-windows-amd64.exe D:\app\gitlab-runner
+Rename-Item D:\app\gitlab-runner\gitlab-runner-windows-amd64.exe gitlab-runner.exe
+
+# Install without any other params will install a windows service named gitlab-runner running under the built-in system account.
+Set-Location D:\app\gitlab-runner
+./gitlab-runner.exe install
+
+# If you need to bind a domain account to the gitlab runner service:
+# I encountered some issue when installing gitlab runner service with the full exe path : D:\app\gitlab-runner\gitlab-runner.exe install, so I firstly go to the gitlab-runner.exe folder, than run the exe directly from there.
+Set-Location D:\app\gitlab-runner
+./gitlab-runner install --user ENTER-YOUR-USERNAME --password ENTER-YOUR-PASSWORD
+
+D:\app\gitlab-runner\gitlab-runner.exe status
+
+ +

Register Gitlab runner on Windows

+ +

Some official docs:

+ +
    +
  1. Register gitlab-runner on windows
  2. +
  3. One-line registration commands
  4. +
  5. Gitlab-runner registration related commands
  6. +
+ +
Add-Content -Value "192.168.111.184`tgitlab.copdips.local" -Path C:\Windows\system32\drivers\etc\hosts
+
+# Add the gitlab self-signed certificate to runner's cert store.
+$gitlabUrl = "https://gitlab.copdips.local"
+$localCertPath = "$env:temp\$($gitlabUrl.Split('/')[2]).crt"
+$webRequest = [Net.WebRequest]::Create($gitlabUrl)
+try { $webRequest.GetResponse() } catch {} # try catch is useful if ssl cert is not valid. ServicePoint is always kept even for invalid ssl cert.
+$cert = $webRequest.ServicePoint.Certificate
+$bytes = $cert.Export("Cert")
+Set-content -value $bytes -encoding byte -path $localCertPath
+
+# https://docs.microsoft.com/en-us/windows/desktop/seccertenroll/about-certificate-directory
+Import-Certificate -FilePath $localCertPath -CertStoreLocation Cert:\LocalMachine\Root
+
+# Ensure the runner is stopped before the registration.
+D:\app\gitlab-runner\gitlab-runner.exe stop
+D:\app\gitlab-runner\gitlab-runner.exe status
+
+# Go to https://gitlab.copdips.local/win/flaskapi/settings/ci_cd and get the runner registration-token from this web site
+# Dont add quotes around the registration-token.
+# Pay attention to the usage of the stop-parsing symbol --% .
+# http://copdips.com/2018/05/powershell-stop-parsing.html
+D:\app\gitlab-runner\gitlab-runner.exe --% register -n --name 19s01 --url https://gitlab.copdips.local/ --registration-token Qdz3TyfnESrjSsmff6A9  --executor shell --shell powershell --tag-list 'windows,windows2016,flaskapi' --run-untagged true
+D:\app\gitlab-runner\gitlab-runner.exe start
+D:\app\gitlab-runner\gitlab-runner.exe status
+
+ +

Using Powershell Core pwsh.exe as a Windows Gitlab runner shell will be supported from the version 11.8

+ +

Check the Gitlab runner config from the runner server

+ +
# Dont be afraid of the error messages returned by gitlab-runner.exe list.
+# The Powershell PsRemoting session is not as powerfull as local Powershell console, and some external executables like gitlab-runner.exe or git.exe send their outputs to stderr by default.
+[19S01]: PS C:\temp> D:\app\gitlab-runner\gitlab-runner.exe list
+D:\app\gitlab-runner\gitlab-runner.exe : Listing configured runners
+ConfigFile=C:\Users\Administrator\Documents\config.toml
+    + CategoryInfo          : NotSpecified: (Listing configu...nts\config.toml:String) [], RemoteException
+    + FullyQualifiedErrorId : NativeCommandError
+
+19s01                                               Executor=shell
+Token=4a76cba042b1748e7546dad9f03458 URL=https://gitlab.copdips.local/
+
+[19S01]: PS C:\temp> Get-Content (gcim cim_service | ? name -eq gitlab-runner | % path*).split(" ")[5]
+concurrent = 1
+check_interval = 0
+
+[[runners]]
+  name = "19s01"
+  url = "https://gitlab.copdips.local/"
+  token = "4a76cba042b1748e7546dad9f03458"
+  executor = "shell"
+  shell = "powershell"
+  [runners.cache]
+
+ +

Check the Gitlab runner config from the Gitlab website

+ +

Go to the Gitlab web site hosted in my Ubuntu docker container. Then go to the repo where I got the runner registration token previously. Than go to Settings-> CI / CD Settings -> Runner Settings, check your runner setting here, especially the tag list which is not listed from the runner server local config.

+ +

+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2018/09/setup-https-for-gitlab.html b/2018/09/setup-https-for-gitlab.html new file mode 100644 index 00000000..a9d00708 --- /dev/null +++ b/2018/09/setup-https-for-gitlab.html @@ -0,0 +1,1030 @@ + + + + + + +Setup HTTPS for Gitlab - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 6 minute read + + + +

+ + +
+ + +
+ + + +
+

Gitlab-CE default installation goes with HTTPS disable. We need to generate a SSL certificate, and bind it to the HTTPS of Gitlab-CE.

+
+ +

Some docs on the Internet

+ +
    +
  1. Gitlab omnibus SSL settings
  2. +
  3. Gitlab omnibus enable HTTPS
  4. +
  5. Generate a self-signed certificate with openssl
  6. +
  7. How to install and configure Gitlab on Ubuntu 16.04
  8. +
  9. [Deprecated] How To Secure GitLab with Let’s Encrypt on Ubuntu 16.04
  10. +
+ +

Generate self-signed SSL certificate without SAN

+ +

Online docs for SSL certificate without SAN

+ +
    +
  1. Creating a Self-Signed SSL Certificate
  2. +
  3. How To Run Gitlab With Self Signed Ssl Certificate
  4. +
+ +

Generate SSL certificate private key

+ +
xiang@ubuntu1804:~/ssl$ sudo openssl genrsa -out "./gitlab.copdips.local.key" 2048
+Generating RSA private key, 2048 bit long modulus
+............+++
+..+++
+e is 65537 (0x010001)
+
+ +

Generate SSL certificate request

+ +

Without the switch -config, the generation of csr request will ask you some information about company, email, and passphrasem etc. If you dont want OpenSSL to ask you that, you need to prepare a config file and specify it by -config [YourConfigPath], and config example can be found in the paragraph Prepare the OpenSSL config file.

+ +
xiang@ubuntu1804:~/ssl$ sudo openssl req -new -key "gitlab.copdips.local.key" -out "gitlab.copdips.local.csr"
+You are about to be asked to enter information that will be incorporated
+into your certificate request.
+What you are about to enter is what is called a Distinguished Name or a DN.
+There are quite a few fields but you can leave some blank
+For some fields there will be a default value,
+If you enter '.', the field will be left blank.
+-----
+Country Name (2 letter code) [AU]:
+State or Province Name (full name) [Some-State]:
+Locality Name (eg, city) []:
+Organization Name (eg, company) [Internet Widgits Pty Ltd]:copdips
+Organizational Unit Name (eg, section) []:
+Common Name (e.g. server FQDN or YOUR name) []:gitlab.copdips.local
+Email Address []:
+
+Please enter the following 'extra' attributes
+to be sent with your certificate request
+A challenge password []:
+An optional company name []:
+
+ +

Generate SSL certificate

+ +

OpenSSL has the option to generate the certificate in one line, this post splits it into 3 steps (the private key, the request file, and the certificate) in order to get a clear understanding of the certificate generation procedure.

+ +
xiang@ubuntu1804:~/ssl$ sudo openssl x509 -req -days 1000 -in gitlab.copdips.local.csr -signkey gitlab.copdips.local.key -out gitlab.copdips.local.crt -extfile gitlab.copdips.local.cnf -extension v3_req
+Signature ok
+subject=C = AU, ST = Some-State, O = copdips, CN = gitlab.copdips.local
+Getting Private key
+
+ +

Review the SSL certificate content

+ +
xiang@ubuntu1804:~/ssl$ openssl x509 -in gitlab.copdips.local.crt -text -noout
+Certificate:
+    Data:
+        Version: 1 (0x0)
+        Serial Number:
+            b4:96:ba:89:62:7b:32:83
+    Signature Algorithm: sha256WithRSAEncryption
+        Issuer: C = AU, ST = Some-State, O = copdips, CN = gitlab.copdips.local
+        Validity
+            Not Before: Sep 13 22:05:40 2018 GMT
+            Not After : Jun  9 22:05:40 2021 GMT
+        Subject: C = AU, ST = Some-State, O = copdips, CN = gitlab.copdips.local
+        Subject Public Key Info:
+            Public Key Algorithm: rsaEncryption
+                Public-Key: (2048 bit)
+
+ +

DO NOT use password protected certificate key (in case the lack of the switch -nodes for no DES), to remove the password from the key: +openssl rsa -in certificate_before.key -out certificate_after.key

+ +

Generate self-signed SAN SSL certificate

+ +

Online docs for SSL certificate with SAN

+ +

I tested many methods found on the Internet, most of them don’t work. Finally, I followed the doc maintained by Citrix. This should be a trusted one as Netscaler is a key product in Citrix, the doc is always updated with the latest version of OpenSSL. +With time going by, the procedure might change, if below procedure doesn’t work, please go to check the Citrix online doc directly.

+ +

Prepare the OpenSSL config file

+ +

Prepare an OpenSSL config file. On Ubuntu 1804, an OpenSSL config example can be found at: /usr/lib/ssl/openssl.cnf. +Or You can find the path from the command: openssl version -a | grep OPENSSLDIR. You might need to change the config according to your actual environment.

+ +
xiang@ubuntu1804:~/ssl$ cat gitlab.copdips.local.cnf
+[req]
+prompt             = no
+default_bits       = 2048
+x509_extensions    = v3_req
+distinguished_name = req_distinguished_name
+
+[req_distinguished_name]
+organizationName        = copdips
+commonName              = gitlab.copdips.local
+
+[v3_req]
+subjectAltName = @alt_names
+
+[alt_names]
+DNS.1 = *.copdips.local
+DNS.2 = ubuntu1804
+DNS.3 = ubuntu1804.copdips.local
+
+ +

Be careful with the usage of the wildcard in [alt_names], the above OpenSSL config is just an example to show what are the DNS names can be added to SAN.

+ +

Generate the SAN SSL certificate content

+ +

Pay attention to -extensions v3_req in the end of the command, it’s the extension tag name in the gitlab.copdips.local.cnf file. If you dont specify it, the output certificate won’t have the extension part, so no SAN neither.

+ +
xiang@ubuntu1804:~/ssl$ sudo openssl req -x509 -days 1000 -nodes -out gitlab.copdips.local.crt -keyout gitlab.copdips.local.key -config gitlab.copdips.local.cnf -extensions v3_req
+Generating a 2048 bit RSA private key
+...................................................+++
+...............................+++
+writing new private key to 'gitlab.copdips.local.key'
+
+ +

DO NOT use password protected certificate key (in case the lack of the switch -nodes for no DES), to remove the password from the key: +openssl rsa -in certificate_before.key -out certificate_after.key

+ +

Review the SAN SSL certificate

+ +

The default Signature Algorithm has been already SHA256. Some online docs tell to add the switch -sha256 when using openssl req, but it’s deprecated with the new version of OpenSSL. BTW, the RSA private key default bits is 2048. My OpenSSL version on Ubuntu 1804 is OpenSSL 1.1.0g 2 Nov 2017

+ +
xiang@ubuntu1804:~/ssl$ openssl x509 -in gitlab.copdips.local.crt -noout -text
+Certificate:
+    Data:
+        Version: 3 (0x2)
+        Serial Number:
+            d3:2c:bb:1d:6c:7e:7b:98
+    Signature Algorithm: sha256WithRSAEncryption
+        Issuer: O = copdips, CN = gitlab.copdips.local
+        Validity
+            Not Before: Sep 15 22:00:55 2018 GMT
+            Not After : Jun 11 22:00:55 2021 GMT
+        Subject: O = copdips, CN = gitlab.copdips.local
+        Subject Public Key Info:
+            Public Key Algorithm: rsaEncryption
+                Public-Key: (2048 bit)
+                Modulus:
+                    [...]
+                Exponent: 65537 (0x10001)
+        X509v3 extensions:
+            X509v3 Subject Alternative Name:
+                DNS:*.copdips.local, DNS:ubuntu1804, DNS:ubuntu1804.copdips.local
+    Signature Algorithm: sha256WithRSAEncryption
+         [...]
+
+ +

Save the SSL certificate

+ +

Create the folder /etc/gitlab/ssl with following two commands, and copy the SSL certificate and key here with the name of [fqdn].crt and [fqnd].key.

+ +
root@gitlab:/# mkdir -p /etc/gitlab/ssl
+root@gitlab:/# chmod 700 /etc/gitlab/ssl
+xiang@ubuntu1804:~/ssl$ sudo cp ~/ssl/gitlab.copdips.local.key ~/ssl/gitlab.copdips.local.crt /srv/gitlab1083/config/ssl/
+
+ +

/srv/gitlab1083/ssl/ is the physical gitlab location on my Ubuntu server which is pointed to /etc/gitlab/ssl its docker container.

+ +

Configure HTTPS on Gitlab

+ +

Hereunder the content of uncommented lines in the Gitlab configuration file:

+ +
root@gitlab:/# grep "^[^#;]" /etc/gitlab/gitlab.rb
+ external_url 'https://gitlab.copdips.local'
+ nginx['redirect_http_to_https'] = true
+ nginx['ssl_certificate'] = "/etc/gitlab/ssl/gitlab.copdips.local.crt"
+ nginx['ssl_certificate_key'] = "/etc/gitlab/ssl/gitlab.copdips.local.key"
+
+ +

Update Gitlab config

+ +

When you changed the configuration file, to take effect:

+ +
root@gitlab:/# gitlab-ctl reconfigure
+
+ +

Check the website SSL certificate from the command line

+ +

By openssl for both Linux and Windows

+ +

For Linux :

+
openssl s_client -connect gitlab.copdips.local:443 < /dev/null 2>/dev/null | openssl x509 -text -in /dev/stdin -noout
+
+ +

For Windows with OpenSSL installed:

+
$null | openssl s_client -connect gitlab.copdips.local:443 | openssl x509 -text -noout
+
+ +

My OpenSSL is installed with GIT on Windows. GitForWindows installs also many other powerful Linux commands (grep, ssh, tail, and also vim, etc.) ported to Windows.

+ +

By certuil for Windows only

+ +

You should explicitly download the certificate at first, and then view the content locally, so this method is not cool. +Hope Powershell team can get this done by one single cmdlet in the future Powershell releases.

+ +
$url = "https://gitlab.copdips.local"
+$localCertPath = "$env:temp\$($url.Split('/')[2]).crt"
+$webRequest = [Net.WebRequest]::Create($url)
+try { $webRequest.GetResponse() } catch {} # try catch is useful if ssl cert is not valid. ServicePoint is always kept even for invalid ssl cert.
+$cert = $webRequest.ServicePoint.Certificate
+$bytes = $cert.Export("Cert")
+Set-content -value $bytes -encoding byte -path $localCertPath
+certutil.exe -dump $localCertPath
+
+ +

Or a nice cmdlet Test-WebServerSSL written by the MVP Vadims Podāns.

+ +

Update the certificate in case of renewal

+ +

Here is the official doc.

+ +

When you changed the SSL certificate, gitlab-ctl reconfigure won’t take it into effect as there’s nothing changed in the gitlab.rb configuration file. Use following command to update the certificate:

+ +
gitlab-ctl hup nginx
+
+ + +
+ + + + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2018/09/terminate-powershell-script-or-session.html b/2018/09/terminate-powershell-script-or-session.html new file mode 100644 index 00000000..9a0319e6 --- /dev/null +++ b/2018/09/terminate-powershell-script-or-session.html @@ -0,0 +1,1211 @@ + + + + + + +Terminate Powershell script or session - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 4 minute read + + + +

+ + +
+ + +
+ + + +
+

I always asked myself how to terminate a Powershell script or session, each time I needed to do some tests by myself and also searched on Google. But I could never remember it. So I would like to take this post to note it down, the next time I need to terminate, just need to come back to here.

+
+ +

Terminate the current Powershell script

+ +

Way 1 - Exit

+ +

Exit without exit code

+ +
5.1> Get-Content .\test.ps1
+function foo {
+    Write-Output beginFunction
+    1..3 | % {
+        Write-Output $_
+        exit
+    }
+    Write-Output endFunction
+}
+
+Write-Output beginScript
+foo
+Write-Output endScript
+
+5.1> .\test.ps1
+beginScript
+beginFunction
+1
+
+5.1> Write-Host "Last execution status: $?" ; Write-Host "Last exit code: $LASTEXITCODE"
+Last execution status: True
+Last exit code: 0
+
+5.1>
+
+ +

Exit with code 0

+ +
5.1> Get-Content .\test.ps1
+function foo {
+    Write-Output beginFunction
+    1..3 | % {
+        Write-Output $_
+        exit 0
+    }
+    Write-Output endFunction
+}
+
+Write-Output beginScript
+foo
+Write-Output endScript
+
+5.1> .\test.ps1
+beginScript
+beginFunction
+1
+
+5.1> Write-Host "Last execution status: $?" ; Write-Host "Last exit code: $LASTEXITCODE"
+Last execution status: True
+Last exit code: 0
+
+5.1>
+
+ +

Exit with code 1

+ +
5.1> Get-Content .\test.ps1
+function foo {
+    Write-Output beginFunction
+    1..3 | % {
+        Write-Output $_
+        exit 1
+    }
+    Write-Output endFunction
+}
+
+Write-Output beginScript
+foo
+Write-Output endScript
+
+5.1> .\test.ps1
+beginScript
+beginFunction
+1
+
+5.1> Write-Host "Last execution status: $?" ; Write-Host "Last exit code: $LASTEXITCODE"
+Last execution status: False
+Last exit code: 0
+
+5.1>
+
+ +

Way 2 - Break

+ +

Break with an UnknownLabel terminates the script directly

+ +
5.1> Get-Content .\test.ps1
+function foo {
+    Write-Output beginFunction
+    1..3 | % {
+        Write-Output $_
+        break foobar
+    }
+    Write-Output endFunction
+}
+
+Write-Output beginScript
+foo
+Write-Output endScript
+
+5.1> .\test.ps1
+beginScript
+beginFunction
+1
+
+5.1> Write-Host "Last execution status: $?" ; Write-Host "Last exit code: $LASTEXITCODE"
+Last execution status: True
+Last exit code: 0
+
+5.1>
+
+ +

But it terminates also the caller script

+ +
5.1> Get-Content .\test.ps1
+function foo {
+    Write-Output beginFunction
+    1..3 | % {
+        Write-Output $_
+        break foobar
+    }
+    Write-Output endFunction
+}
+
+Write-Output beginScript
+foo
+Write-Output endScript
+
+5.1> Get-Content .\call-test.ps1
+Write-Output 'Call test.ps1'
+./test.ps1
+Write-Output 'End call test.ps'
+Write-Output "call-test.ps1: Last execution status: $?"
+Write-Output "call-test.ps1: Last exit code: $LASTEXITCODE"
+
+5.1> .\call-test.ps1
+Call test.ps1
+beginScript
+beginFunction
+1
+
+5.1>
+
+ +

Never use break UnknownLabel to terminate the script. Break does’t raise error, the caller script cannot catch its output.

+ +

Terminate the current Powershell session

+ +

Way 1 - System.Environment.Exit

+ +

https://docs.microsoft.com/en-us/dotnet/api/system.environment.exit

+ +

Environment.Exit with code 0 and started by powershell.exe

+ +
5.1> Get-Content .\test.ps1
+function foo {
+    Write-Output beginFunction
+    1..3 | % {
+        Write-Output $_
+        [Environment]::Exit(0)
+    }
+    Write-Output endFunction
+}
+
+Write-Output beginScript
+foo
+Write-Output endScript
+
+5.1> powershell -noprofile .\test.ps1
+beginScript
+beginFunction
+1
+
+5.1> Write-Host "Last execution status: $?" ; Write-Host "Last exit code: $LASTEXITCODE"
+Last execution status: True
+Last exit code: 0
+
+5.1>
+
+ +

Environment.Exit with code 1 and started by powershell.exe

+ +
5.1> Get-Content .\test.ps1
+function foo {
+    Write-Output beginFunction
+    1..3 | % {
+        Write-Output $_
+        [Environment]::Exit(1)
+    }
+    Write-Output endFunction
+}
+
+Write-Output beginScript
+foo
+Write-Output endScript
+
+5.1> powershell -noprofile .\test.ps1
+beginScript
+beginFunction
+1
+
+5.1> Write-Host "Last execution status: $?" ; Write-Host "Last exit code: $LASTEXITCODE"
+Last execution status: False
+Last exit code: 0
+
+5.1>
+
+ +

Environment.Exit with code 0 and started by Start-Process

+ +
5.1> Get-Content .\test.ps1
+function foo {
+    Write-Output beginFunction
+    1..3 | % {
+        Write-Output $_
+        [Environment]::Exit(0)
+    }
+    Write-Output endFunction
+}
+
+Write-Output beginScript
+foo
+Write-Output endScript
+
+5.1> Start-Process .\test.ps1
+
+
+5.1> Write-Host "Last execution status: $?" ; Write-Host "Last exit code: $LASTEXITCODE"
+Last execution status: True
+Last exit code: 0
+
+5.1>
+
+ +

Environment.Exit with code 1 and started by Start-Process

+ +
5.1> Get-Content .\test.ps1
+function foo {
+    Write-Output beginFunction
+    1..3 | % {
+        Write-Output $_
+        [Environment]::Exit(1)
+    }
+    Write-Output endFunction
+}
+
+Write-Output beginScript
+foo
+Write-Output endScript
+
+5.1> Start-Process .\test.ps1
+
+
+5.1> Write-Host "Last execution status: $?" ; Write-Host "Last exit code: $LASTEXITCODE"
+Last execution status: True
+Last exit code: 0
+
+5.1>
+
+ +

Way 2 - Stop-Process

+ +

Powershell has an automatic variable called $PID which refers to the process ID that is hosting the current PowerShell session.

+ +

Stop-Process started by powershell.exe

+ +
5.1> Get-Content .\test.ps1
+function foo {
+    Write-Output beginFunction
+    1..3 | % {
+        Write-Output $_
+        Write-Output "Kill process $PID"
+        Stop-Process $PID
+    }
+    Write-Output endFunction
+}
+
+Write-Output beginScript
+foo
+Write-Output endScript
+
+5.1> powershell -NoProfile .\test.ps1
+beginScript
+beginFunction
+1
+Kill process 12348
+
+5.1> Write-Host "Last execution status: $?" ; Write-Host "Last exit code: $LASTEXITCODE"
+Last execution status: False
+Last exit code: 0
+
+5.1>
+
+ +

Stop-Process started by Start-Process

+ +
5.1> Get-Content .\test.ps1
+function foo {
+    Write-Output beginFunction
+    1..3 | % {
+        Write-Output $_
+        Write-Output "Kill process $PID"
+        Stop-Process $PID
+    }
+    Write-Output endFunction
+}
+
+Write-Output beginScript
+foo
+Write-Output endScript
+
+5.1> Start-Process .\test.ps1
+
+
+5.1> Write-Host "Last execution status: $?" ; Write-Host "Last exit code: $LASTEXITCODE"
+Last execution status: True
+Last exit code: 0
+
+5.1>
+
+ +

Conclusion

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
GoalTerminate MethodLast execution status: $?Last exit code: $LASTEXITCODEComment
Terminate ScriptexitTrue0 
Terminate Scriptexit 0True0 
Terminate Scriptexit 1False0 
Terminate Scriptbreak UnknownLabelTrue0Never use it
Terminate ProcessTrue0 
Terminate ProcessFalse0 
Terminate ProcessTrue0 
Terminate ProcessTrue0 
Terminate ProcessStop-Process started by powershell.exeFalse0 
Terminate ProcessStop-Process started by Start-ProcessTrue0 
+ + +
+ +
+ + + + + + + +

+ Tags: + + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2018/09/windows-scheduled-task-by-powershell.html b/2018/09/windows-scheduled-task-by-powershell.html new file mode 100644 index 00000000..b15f9202 --- /dev/null +++ b/2018/09/windows-scheduled-task-by-powershell.html @@ -0,0 +1,987 @@ + + + + + + +Use Powershell to manage Windows Scheduled Task - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 6 minute read + + + +

+ + +
+ + +
+ + + +
+

A recent project made me to use the Windows scheduled task to execute periodically some python scripts. After the project, I find using Powershell to manage the Windows scheduled task is not so straightforward, that’s why I opened this post to share my experience on some common usage, and hope this can save your time if you also need to use it.

+
+ +

Scheduled task Powershell cmdlets

+ +

From the official Windows scheduled task powershell doc, we can find the ScheduledTasks module provides many cmdlets:

+ +
    +
  • Disable-ScheduledTask
  • +
  • Enable-ScheduledTask
  • +
  • Export-ScheduledTask
  • +
  • Get-ScheduledTask
  • +
  • Get-ScheduledTaskInfo
  • +
  • New-ScheduledTask
  • +
  • New-ScheduledTaskAction
  • +
  • New-ScheduledTaskPrincipal
  • +
  • New-ScheduledTaskSettingsSet
  • +
  • New-ScheduledTaskTrigger
  • +
  • Register-ScheduledTask
  • +
  • Set-ScheduledTask
  • +
  • Start-ScheduledTask
  • +
  • Stop-ScheduledTask
  • +
  • Unregister-ScheduledTask
  • +
+ +

Guess what is the cmdlet to create the task? New-ScheduledTask? Wrong, it’s Register-ScheduledTask.

+ +

Create scheduled task folder

+ +

By default, all the scheduled tasks are created under the root “\” folder, if you have many tasks here, from the taskschd.msc GUI, it might take time to display all of them. So I suggest to create your tasks to some custom task folders. And withthat, you can easily filter on (Get-ScheduledTask -TaskPath) only your interested tasks especially if some other tasks have the similar name as the yours.

+ +
# Create a task folder named 'project1' under the root path \
+$taskFolderName = 'project1'
+$taskPath = "\$taskFolderName"
+$scheduleObject = New-Object -ComObject schedule.service
+$scheduleObject.connect()
+$taskRootFolder = $scheduleObject.GetFolder("\")
+$taskRootFolder.CreateFolder($taskPath)
+
+ +

Disable disabledomaincreds

+ +

In some corporate networks, the Windows or security admins might enable the security policy : Network access: Do not allow storage of passwords and credentials for network authentication. If this policy is enabled, we will not be able to use Register-ScheduledTask with the -User param. Its registry setting can be found from the Microsoft official excel file for Group Policy Settings Reference for Windows and Windows Server, There’s also a online variant here: (http://gpsearch.azurewebsites.net/).

+ +
# Set the key 'disabledomaincreds' to value 0 to disable it.
+$regPath = 'HKLM:\SYSTEM\CurrentControlSet\Control\Lsa'
+$regName = 'disabledomaincreds'
+$regValue = 0
+Set-ItemProperty -Path $regPath -Name $regName -Value $regValue
+
+ +

Create scheduled task

+ +

Suppose we need to :

+
    +
  • Run the script ‘d:/scripts/job1.ps1 arg1’ every 30 minutes from 2018-09-05T18:00:00.
  • +
  • The script should be stopped if it runs more then 15 minutes.
  • +
  • The script should be executed under the account ‘user1’ with the password ‘password1’.
  • +
  • The task should be in the ‘project1’ task folder.
  • +
  • The task name is ‘task1’.
  • +
+ +
$taskName = 'task1'
+$taskFolderName = 'project1'
+$taskPath = "\$taskFolderName"
+$taskUser = 'user1'
+$taskPassword = 'password1' # $taskPassword is given by un-secure clear string, it's only for demo. In addition, if you use clear string, please careful with the single-quoted because some passwords might contain the char $ which can be evaluated if you use the double-quoted string. https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_quoting_rules?view=powershell-6
+$taskAction = New-ScheduledTaskAction -Execute powershell.exe -Argument 'd:/scripts/job1.ps1 arg1'
+
+# some online docs give also the param -RepetitionDuration ([TimeSpan]::MaxValue),
+# but this doesn't work for windows server 2016, it's a bug, this is also why I wrote this post.
+# On windows server 2016, there's another bug is that we must use -Once -RepetitionInterval to run a repeated action,
+# not -daily with 'Repeat task every 30 minutes' option in the advanced settings which is an option available only from taskschd.msc GUI,
+# otherwise, the scheduled task will also be automatically triggered by the system reboot.
+$taskTrigger = New-ScheduledTaskTrigger -Once -RepetitionInterval (New-TimeSpan -Minutes 30) -At (get-date '2018-09-05T18:00:00')
+
+# the task is scheduled for every 30 minutes,
+# but if it is running more than 15 minutes,
+# I consider the task is hanging or failed, and I want to stop it.
+$taskSetting= New-ScheduledTaskSettingsSet -ExecutionTimeLimit (New-TimeSpan -Minutes 15)
+
+
+# finally create the task
+$registerScheduledTaskParam = @{
+  TaskName = $taskName
+  TaskPath = $taskPath
+  User = $taskUser
+  Password = $taskPassword
+  Action = $taskAction
+  Trigger = $taskTrigger
+  Settings = $taskSetting
+}
+Register-ScheduledTask @registerScheduledTaskParam
+
+TaskPath                                       TaskName                          State
+--------                                       --------                          -----
+\project1\                                     task1                             Ready
+
+ +

Get scheduled task info

+ +
$taskFolderName = 'project1'
+# when using Register-SchduledTask, one of its params $taskPath = "\$taskFolderName"
+# doesn't contain the the ending "\", but for Get-ScheduledTask,
+# we must add it. This is a bug on Windows Server 2016 at least.
+$taskPath = "\$taskFolderName\"
+Get-ScheduledTask $taskName | fl *
+
+State                 : Ready
+Actions               : {MSFT_TaskExecAction}
+Author                :
+Date                  :
+Description           :
+Documentation         :
+Principal             : MSFT_TaskPrincipal2
+SecurityDescriptor    :
+Settings              : MSFT_TaskSettings3
+Source                :
+TaskName              : task1
+TaskPath              : \project1\
+Triggers              : {MSFT_TaskTimeTrigger}
+URI                   : \project1\task1
+Version               :
+PSComputerName        :
+CimClass              : Root/Microsoft/Windows/TaskScheduler:MSFT_ScheduledTask
+CimInstanceProperties : {Actions, Author, Date, Description...}
+CimSystemProperties   : Microsoft.Management.Infrastructure.CimSystemProperties
+
+# The result shown above is not straightforward for admin,
+# we need to deep into many sub properties to get some scheduled task basic information.
+# Hereunder some ways to archive that.
+
+# Way 1: oneliner by using the custom properties
+$taskPath = "\project1\" ; Get-ScheduledTask -TaskPath $taskPath | Select-Object `
+TaskName, State, `
+@{n='TaskEnabled'; e={$_.Settings.Enabled}}, `
+@{n='TriggerEnabled'; e={$_.Triggers.Enabled}}, `
+@{n='User'; e={$_.Principal.UserID}}, `
+@{n='TriggerStartBoundary'; e={$_.Triggers.StartBoundary}}, `
+@{n='TriggerInterval'; e={$_.Triggers.Repetition.Interval}}, `
+@{n='ExecutionTimeLimit'; e={$_.Settings.ExecutionTimeLimit}},`
+@{n='LastRunTime'; e={$_ |  Get-ScheduledTaskInfo | % LastRunTime}}, `
+@{n='LastTaskResult'; e={$_ |  Get-ScheduledTaskInfo | % LastTaskResult}}, `
+@{n='NextRunTime'; e={$_ |  Get-ScheduledTaskInfo | % NextRunTime}}, `
+@{n='Action'; e={$_.Actions.Execute + ' ' + $_.Actions.Arguments}}
+
+# Way 2: Export the task config to XML and view the XML content directly
+$taskPath = "\project1\" ; Get-ScheduledTask -TaskPath $taskPath | % {Write-Host "`nTask: $($_.TaskName)" -BackgroundColor Red ;  Export-ScheduledTask $_ ; $_ | Get-ScheduledTaskInfo}
+
+ +

Get scheduled task log

+ +

It seems that there’s no cmdlet to get the task log from the *-ScheduledTask cmdlets list. Yes, you’re right, the task log is saved directly to the standard windows event log. You can use Get-WinEvent (Get-EventLog is an old way) to get it.

+ +
# if you're not admin on the server,
+# you might get some error when running below Get-WinEvent command,
+# you can set $ErrorActionPreference = "SilentlyContinue" to hide it.
+> Get-WinEvent -ListLog * | ? logname -match task
+
+LogMode   MaximumSizeInBytes RecordCount LogName
+-------   ------------------ ----------- -------
+Circular             1052672          32 Microsoft-Windows-BackgroundTaskInfrastructure/Operational
+Circular             1052672           0 Microsoft-Windows-Mobile-Broadband-Experience-Parser-Task/Operational
+Circular             1052672           0 Microsoft-Windows-Shell-Core/LogonTasksChannel
+Circular             1052672         636 Microsoft-Windows-TaskScheduler/Maintenance
+Circular            10485760             Microsoft-Windows-TaskScheduler/Operational
+
+# No RecordCount for Microsoft-Windows-TaskScheduler/Operational
+# because the log history is disabled by default, enable it by wevtutil.
+> wevtutil set-log Microsoft-Windows-TaskScheduler/Operational /enabled:true
+> wevtutil get-log Microsoft-Windows-TaskScheduler/Operational
+> Get-WinEvent -ListLog * | ? logname -match task
+
+LogMode   MaximumSizeInBytes RecordCount LogName
+-------   ------------------ ----------- -------
+Circular             1052672          32 Microsoft-Windows-BackgroundTaskInfrastructure/Operational
+Circular             1052672get-           0 Microsoft-Windows-Mobile-Broadband-Experience-Parser-Task/Operational
+Circular             1052672           0 Microsoft-Windows-Shell-Core/LogonTasksChannel
+Circular             1052672         636 Microsoft-Windows-TaskScheduler/Maintenance
+Circular            10485760          12 Microsoft-Windows-TaskScheduler/Operational
+
+# All the logs of all the tasks are saved in the same place,
+# and the event object doesn't have a task name property,
+# this is why when we view the task history from the taskschd.msc GUI,
+# it's too slow to display, not cool /_\.
+# So if we want to see the logs of a single task, there's still something to do.
+> Get-WinEvent -FilterHashtable @{logname="Microsoft-Windows-TaskScheduler/Operational"; StartTime=$(get-date).AddDays(-2)} | so -fir 1 | fl *
+
+
+Message              : Task Scheduler launched "{F14F3BF1-DAA7-4286-93BF-1BB1EE3B2C0C}" instance of task "\project1\task1"  for user "user1" .
+Id                   : 110
+Version              : 0
+Qualifiers           :
+Level                : 4
+Task                 : 110
+Opcode               : 0
+Keywords             : -9223372036854775808
+RecordId             : 6
+ProviderName         : Microsoft-Windows-TaskScheduler
+ProviderId           : de7b24ea-73c8-4a09-985d-5bdadcfa9017
+LogName              : microsoft-windows-taskscheduler/operational
+ProcessId            : 1612
+ThreadId             : 10152
+MachineName          : DELL-ZX
+UserId               : S-1-5-18
+TimeCreated          : 2018-09-05 00:48:37
+ActivityId           : f14f3bf1-daa7-4286-93bf-1bb1ee3b2c0c
+RelatedActivityId    :
+ContainerLog         : microsoft-windows-taskscheduler/operational
+MatchedQueryIds      : {}
+Bookmark             : System.Diagnostics.Eventing.Reader.EventBookmark
+LevelDisplayName     : Information
+OpcodeDisplayName    : Info
+TaskDisplayName      : Task triggered by user
+KeywordsDisplayNames : {}
+Properties           : {System.Diagnostics.Eventing.Reader.EventProperty, System.Diagnostics.Eventing.Reader.EventProperty, System.Diagnostics.Eventing.Reader.EventProperty}
+
+> Get-WinEvent -FilterHashtable @{logname="Microsoft-Windows-TaskScheduler/Operational"; StartTime=$(get-date).AddDays(-2)} | ? message -match "\\project1\\task1"
+
+ + +
+ + + + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2018/10/migrate-gitlab-in-docker.html b/2018/10/migrate-gitlab-in-docker.html new file mode 100644 index 00000000..8e3800be --- /dev/null +++ b/2018/10/migrate-gitlab-in-docker.html @@ -0,0 +1,940 @@ + + + + + + +Migrate Gitlab in docker - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 4 minute read + + + +

+ + +
+ + +
+ + + +
+

This post will walk you through the steps to migrate Gitlab from one docker container to another. The steps need you to know how to install a new Gitlab container and how to backup and restore Gitlab container, because the migration is just a restoration of a backup to another container.

+
+ +

Some docs on the Internet

+ +
    +
  1. Migrate GitLab Instance to new Host
  2. +
+ +

Backup before anything

+ +

We must backup the Gitlab before everything. I’ve already written a post on how to backup up Gitlab docker version. For a double insurance, you can also at first create a VM snapshot/checkpoint, but don’t forget to delete it as soon as the migration is successfully finished.

+ +

In this post, we’ll also use this backup to migrate the date to the new Gitlab container. The backup file name is 1538845523_2018_10_06_11.3.1_gitlab_backup.tar.

+ +

Backup host key (optional)

+ +

If rebuilding the machine and keeping the same IP, to avoid having to delete the host key entry in the ~/.ssh/know_hosts file, run the following to backup the SSH host keys.

+ +
# From gitlab docker container
+
+root@gitlab:/# tar -cvf $(date "+hostkeys-%s.tar") $(grep HostKey /etc/ssh/sshd_config | grep -v ^'#' | awk '{print $2}')
+
+ +

Install a new Gitlab in docker with the same version

+ +

I’ve already written a post on how to install Gitlab-CE in docker. Be aware that for this container installation inside the same Ubuntu VM, we should map some new volumes and provide a new container name. If you install Gitlab container in another VM, of course you can reuse the same volume name and container name.

+ +

Verify the new Gitlab SSL certificate before the migration

+ +

Depends on your client OS (Linux or Windows), you can use the commands here to verify the SSL certificate. The mine is a Windows 10. Note that the certificate’s serial number is 8c:87:45b9:04:b0:ae.

+ +
6.1.0> $null | openssl s_client -connect gitlab.copdips.local:443 | openssl x509 -text -noout
+depth=0 O = copdips, CN = gitlab.copdips.local
+verify error:num=18:self signed certificate
+verify return:1
+depth=0 O = copdips, CN = gitlab.copdips.local
+verify return:1
+DONE
+Certificate:
+    Data:
+        Version: 3 (0x2)
+        Serial Number:
+            8c:87:45:ab:b9:04:b0:ae
+    Signature Algorithm: sha256WithRSAEncryption
+        Issuer: O=copdips, CN=gitlab.copdips.local
+        Validity
+            Not Before: Oct  2 21:00:13 2018 GMT
+            Not After : Jun 28 21:00:13 2021 GMT
+        Subject: O=copdips, CN=gitlab.copdips.local
+(...)
+
+ +

Transfer the backup

+ +

Copy the backup file 1538845523_2018_10_06_11.3.1_gitlab_backup.tar to the new Gitlab. From the backup name, we know the old gitlab version is at v11.3.1, this version must be exaclty the same as the new Gitlab.

+ +

To verify current Gitlab docker version:

+ +
# From gitlab docker container
+
+root@gitlab:/# gitlab-rake gitlab:env:info | grep "GitLab information" -A2
+GitLab information
+Version:        11.3.1
+Revision:       32cb452
+
+ +

Transfer the backup file:

+ +
# From Ubuntu host outside of the Gitlab docker container
+
+xiang@ubuntu1804:~$ sudo cp \
+ /srv/gitlab1083/data/backups/1538845523_2018_10_06_11.3.1_gitlab_backup.tar \
+ /srv/gitlab-new/data/backups/
+
+ +

Check the backup permission

+ +

The backup file must be owned by git account. The previous copy make the file’s owner as root:root, so we need to change it.

+ +
# From gitlab docker container
+
+root@gitlab:/# ls -lart /var/opt/gitlab/backups
+total 344
+-rw-------  1 git  git   81920 Oct  2 21:33 1538516038_2018_10_02_10.8.3_gitlab_backup.tar
+drwx------  8 git  git    4096 Oct  2 21:40 tmp
+-rw-------  1 root root 256000 Oct  8 21:00 1538845523_2018_10_06_11.3.1_gitlab_backup.tar
+drwx------  3 git  root   4096 Oct  8 21:00 .
+drwxr-xr-x 20 root root   4096 Oct  8 21:06 ..
+root@gitlab:/# chown -v git:git /var/opt/gitlab/backups/1538845523_2018_10_06_11.3.1_gitlab_backup.tar
+changed ownership of '/var/opt/gitlab/backups/1538845523_2018_10_06_11.3.1_gitlab_backup.tar' from root:root to git:git
+
+ +

Migrate by restoring from the backup

+ +

For docker version of Gitlab, the migration is just a standard restoration procedure.

+ +

Stop unicorn and sidekiq

+
# From gitlab docker container
+
+root@gitlab:/# gitlab-ctl reconfigure
+gitlab-ctl start
+gitlab-ctl stop unicorn
+gitlab-ctl stop sidekiq
+gitlab-ctl status
+ls -lart /var/opt/gitlab/backups
+
+ +

Start restore

+ +
# From Ubuntu host outside of the Gitlab docker container
+
+xiang@ubuntu1804:~$ docker exec -it gitlab gitlab-rake gitlab:backup:restore BACKUP=1538845523_2018_10_06_11.3.1 --trace
+
+ +

Start Gitlab

+ +
# From Gitlab docker container
+
+root@gitlab:/# gitlab-ctl restart
+root@gitlab:/# gitlab-rake gitlab:check SANITIZE=true
+
+ +

Verify

+ +

Verify the config file gitlab.rb

+ +

The config file is not replaced by the backup. If you want to use the config from the old container, just copy the file, and restart Gitlab by gitlab-ctl reconfigure from the docker container or docker restart [container name] from the docker host. To locate the config file, you can refer to this post.

+ +

Verify SSL certificate

+ +

By rechecking the SSL certificate, the SSL certificate is not replaced. If you want to keep the old certificate especially if your certificate is self-signed, you need to copy it from the old container’s volume. You can check this post to locate the SSL certificate.

+ +

Verify local user accounts

+ +

The local user accounts are replaced by the backup. Good.

+ +

Verify repositories

+ +

The repositories are replaced by the backup. Good.

+ +

Verify Gitlab runner

+ +

The Gitlab runner are replaced by the backup. Good.

+ +

But if the Gitlab SSL certificate is self-signed, and you dont want to restore the old one from the old container, you need to import the new self-signed SSL certificate to all Gitlab runners’s cert store, at least for Windows runners, Linux runners are not tested because I’m still a novice on Linux.

+ +

Please take a look at the line starting by Import-Certificate from this post to learn how to import the certificate to the Trusted Root Certification Authorities logical store in the Windows certificate store.

+ + +
+ + + + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2018/10/update-gitlab-in-docker.html b/2018/10/update-gitlab-in-docker.html new file mode 100644 index 00000000..bfe490f5 --- /dev/null +++ b/2018/10/update-gitlab-in-docker.html @@ -0,0 +1,857 @@ + + + + + + +Update Gitlab in docker - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +
+ + +
+ + + +
+

Gitlab has several methods to update to newer version depending on the type of the original installation and the Gitlab version. This post will show you the way for docker version of Gitlab, which is the simplest among others.

+
+ +

Some docs on the Internet

+ +

This post will follow the official doc for updating docker version of Gitlab.

+ +

If you installed the Gitlab with docker compose, please follow this official procedure.

+ +

And hereunder some docs for the non docker version update if you are interested:

+
    +
  1. Official global Gitlab update doc
  2. +
  3. Official doc for upgrading without downtime
  4. +
  5. Official doc for updating Gitlab installed from source
  6. +
  7. Official doc for patching between minor feature versions
  8. +
  9. Official doc for restoring from backup after a failed upgrade
  10. +
+ +

Backup before anything

+ +

We must backup the Gitlab before everything. I’ve already written a post on how to backup up Gitlab docker version.

+ +

Verify the docker container volumes

+ +

The update procedure will remove the current Gitlab container, so the data must be kept somewhere to be reused by the update. As I wrote in a previous post on how to install Gitlab in docker, we used the docker run --volume to mount the docker host volumes to Gitlab container. So even the Gitlab container is removed, the data are still kept in the docker host.

+ +

To verify the mounted volumes:

+ +
xiang@ubuntu1804:~$ docker ps
+CONTAINER ID        IMAGE                          COMMAND             CREATED             STATUS                 PORTS                                                            NAMES
+707439b39dd1        gitlab/gitlab-ce:10.8.3-ce.0   "/assets/wrapper"   3 weeks ago         Up 2 hours (healthy)   0.0.0.0:80->80/tcp, 0.0.0.0:443->443/tcp, 0.0.0.0:2222->22/tcp   gitlab
+xiang@ubuntu1804:~$
+xiang@ubuntu1804:~$ docker container inspect -f "{{ json .HostConfig.Binds }}" gitlab | python3 -m json.tool
+[
+    "/srv/gitlab/config:/etc/gitlab",
+    "/srv/gitlab/logs:/var/log/gitlab",
+    "/srv/gitlab/data:/var/opt/gitlab"
+]
+
+ +

Ok, I see there’re three volumes mounted in the Gitlab container, it’s good.

+ +

Update docker version of Gitlab

+ +

Exactly the same procedure as the official one. I will update the current gitlab-ce:10.8.3-ce.0 to gitlab-ce:latest

+ +
    +
  1. +

    Pull the new image:

    + +

    To pull other version, change the lastest by the tag name which can be found from the docker hub.

    + +
    +
     docker pull gitlab/gitlab-ce:latest
    +
    +
    +
  2. +
  3. Stop the running container called gitlab: +
    +
     docker stop gitlab
    +
    +
  4. +
  5. Remove existing container: +
    +
     docker rm gitlab
    +
    +
  6. +
  7. Create the container once again with previously specified options: +
    +
     docker run --detach \
    + --hostname gitlab.copdips.local \
    + --publish 443:443 --publish 80:80 --publish 2222:22 \
    + --name gitlab \
    + --restart always \
    + --volume /srv/gitlab/config:/etc/gitlab \
    + --volume /srv/gitlab/logs:/var/log/gitlab \
    + --volume /srv/gitlab/data:/var/opt/gitlab \
    + gitlab/gitlab-ce:latest
    +
    +
  8. +
+ +

That’s all, go to take a coffee, GitLab will reconfigure and update itself, the procedure is pretty simple.

+ +

If you take a look at the procedure for Gitlab installed from the source, you will thank yourself for choosing to install Gitlab in docker, because you chose the zen.

+ + +
+ + + + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2018/10/using-gitlab-integrated-cicd-for-python-project-on-windows.html b/2018/10/using-gitlab-integrated-cicd-for-python-project-on-windows.html new file mode 100644 index 00000000..d6001863 --- /dev/null +++ b/2018/10/using-gitlab-integrated-cicd-for-python-project-on-windows.html @@ -0,0 +1,1010 @@ + + + + + + +Using Gitlab integrated CICD for Python project on Windows - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 4 minute read + + + +

+ + +
+ + +
+ + + +
+

Gitlab ships with its own free CICD which works pretty well. This post will give you an example of the CICD file .gitlab-ci.yml for a Python project running on Gitlab Windows runner.

+
+ +

Some docs on the Internet

+ +
    +
  1. Official GitLab Continuous Integration (GitLab CI/CD)
  2. +
  3. Official Configuration of your jobs with .gitlab-ci.yml
  4. +
  5. Official Gitlab Pipelines settings
  6. +
  7. Official Publish code coverage report with GitLab Pages
  8. +
  9. introduction-gitlab-ci
  10. +
  11. Rubular: a Ruby regular expression editor and tester
  12. +
+ +

Code Coverage

+ +

The official doc on how to use coverage is not very clear.

+ +

My coverage tool’s output (from pytest --cov=) is something like :

+ +
----------- coverage: platform win32, python 3.7.0-final-0 -----------
+Name                                      Stmts   Miss  Cover
+-------------------------------------------------------------
+python_project\__init__.py                    6      0   100%
+python_project\ctx_fetcher.py                15      0   100%
+python_project\extras\__init__.py             0      0   100%
+python_project\extras\celery.py              18     18     0%
+python_project\filters.py                     6      2    67%
+python_project\parser.py                     26      0   100%
+python_project\request_id.py                 42      1    98%
+-------------------------------------------------------------
+TOTAL                                       113     21    81%
+
+ +

In my example .gitlab-ci.yml, the coverage is configured as:

+ +
coverage: '/^TOTAL.*\s+(\d+\%)$/'
+
+ +

This regex will find the coverage which is at 81%.

+ +

Be aware that:

+ +
    +
  1. The coverage only use regular expression to find the coverage percentage from coverage tool’s output.
  2. +
  3. The regular expression must be surrounded by single quote ', double quote is not allowed.
  4. +
  5. Inside the single quotes, must be surrounded by /.
  6. +
  7. You can use http://rubular.com to test your regex.
  8. +
  9. The overage regex returns the last catch group value from the output. Even if it is not in the last line, or if the regex catches more than one values among all the lines.
  10. +
+ +

.gitlab-ci.yml example for Python project on a Windows runner

+ +

.gitlab-ci.yml file content

+ +

I cloned the project flask_log_request_id and try to run CICD over it.

+ +

I’m still working on this CICD .gitlab-ci.yml file, the example given here will be updated as long as I add new things inside.

+ +
stages:
+    - venv
+    - test
+    - build
+    - deploy
+
+before_script:
+  - $gitApiUrl = 'https://gitlab.copdips.local/api/v4'
+  # will save git api token more securely later.
+  - $gitApiToken = $env:GitApiToken
+  - $gitApiHeader = @{"PRIVATE-TOKEN" = $gitApiToken}
+  - $cicdReportsFolderPath = Join-Path (Get-Location) "cicd_reports"
+  - $venvPath = "$env:temp/venv/$($env:CI_PROJECT_NAME)"
+  - >
+    function Set-SecurityProtocolType {
+        # [Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
+        # $AllProtocols = [System.Net.SecurityProtocolType]'Ssl3,Tls,Tls11,Tls12'
+        $AllProtocols = [System.Net.SecurityProtocolType]'Tls12'
+        [System.Net.ServicePointManager]::SecurityProtocol = $AllProtocols
+    }
+  - >
+    function Write-PythonPath {
+        $pythonPath = $(Get-Command python | % source)
+        Write-Output "The python path is at: '$pythonPath'"
+    }
+  - >
+    function Get-UpstreamProject {
+        $apiParam = @{
+          Headers = $gitApiHeader
+          Uri = "$gitApiUrl/projects?search=$($env:CI_PROJECT_NAME)"
+        }
+        if ($PSVersionTable.PSVersion.Major -gt 5) {
+          $apiParam.SkipCertificateCheck = $true
+        }
+        $projectList = Invoke-RestMethod @apiParam
+        $upstreamProject = $projectList | ? forked_from_project -eq $null
+        return $upstreamProject
+    }
+  - >
+    function Get-UpstreamProjectId {
+        $upstreamProject = Get-UpstreamProject
+        return $upstreamProject.id
+    }
+
+  - >
+    function Test-CreateVenv {
+        param($VenvPath, $GitCommitSHA)
+        $gitShowCommand = "git show $GitCommitSHA --name-only"
+        $gitShowResult = Invoke-Expression $gitShowCommand
+        Write-Host "$gitShowCommand`n"
+        $gitShowResult | ForEach-Object {Write-Host $_}
+        $changedFiles = Invoke-Expression "git diff-tree --no-commit-id --name-only -r $GitCommitSHA"
+        $requirementsFiles = @()
+        $requirementsFiles += "requirements.txt"
+        foreach ($requirements in $requirementsFiles) {
+            if ($requirements -in $changedFiles) {
+                Write-Host "`nFound $requirements in the changed files, need to create venv."
+                return $True
+            }
+        }
+        if (-not (Test-Path $VenvPath)) {
+            Write-Host "`nCannot found venv at $VenvPath, need to create venv."
+            return $True
+        }
+
+        Write-Host "`nNo need to create venv."
+        return $False
+    }
+  - >
+    function Enable-Venv {
+        param($VenvPath)
+
+        Invoke-Expression (Join-Path $VenvPath "Scripts/activate.ps1")
+        Write-Host "venv enabled at: $VenvPath"
+        Write-PythonPath
+    }
+  - >
+    function Create-Venv {
+        param($VenvPath)
+
+        Write-Output "Creating venv at $venvPath ."
+        python -m venv $VenvPath
+        Write-Output "venv created at $venvPath ."
+    }
+  - >
+    function Install-PythonRequirements {
+        param($VenvPath)
+
+        Enable-Venv $VenvPath
+        python -m pip install -U pip setuptools wheel
+        pip install -r requirements.txt
+    }
+  - >
+    function Remove-Venv {
+        param($VenvPath)
+
+        if (Test-Path $VenvPath) {
+            Remove-Item $VenvPath -Recurse -Force
+            Write-Host "venv removed from: $VenvPath"
+        } else {
+            Write-Host "venv not found at: $VenvPath"
+        }
+    }
+  - Get-Location
+  - git --version
+  - python --version
+  - Write-PythonPath
+  - $PSVersionTable | ft -a
+  - Get-ChildItem env:\ | Select-Object Name, Value | ft -a
+
+venv:
+  stage: venv
+  script:
+    - >
+      if (Test-CreateVenv $venvPath $env:CI_COMMIT_SHA) {
+          Remove-Venv $venvPath
+          Create-Venv $venvPath
+      }
+      Install-PythonRequirements $venvPath
+
+pytest:
+  stage: test
+  script:
+    - $reportFolder = Join-Path $cicdReportsFolderPath "pytest"
+    - New-Item -Path $reportFolder -Type Directory -Force
+    - $upstreamProjectId = Get-UpstreamProjectId
+    - Write-Output "upstreamProjectId = $upstreamProjectId"
+    # TODO: add check master last commit coverage
+    - Enable-Venv $venvPath
+    - pytest --cov=flask_log_request_id --cov-report=html:$reportFolder
+    - $coverageLine = (Get-Content (Join-Path $reportFolder index.html) | Select-String "pc_cov").line
+    - $coverageString = ($coverageLine -replace "<[^>]*>", "").trim()
+    - Write-Output "Total Coverage = $coverageString"
+  coverage: '/^(?i)(TOTAL).*\s+(\d+\%)$/'
+
+
+nosetests:
+  stage: test
+  script:
+    - Enable-Venv $venvPath
+    - nosetests.exe
+  coverage: '/^TOTAL.*\s+(\d+\%)$/'
+
+flake8:
+  stage: test
+  script:
+    - Enable-Venv $venvPath
+    - flake8.exe .\flask_log_request_id
+
+mypy:
+  stage: test
+  script:
+    - Enable-Venv $venvPath
+    - $reportFolder = Join-Path $cicdReportsFolderPath "mypy"
+    - New-Item -Path $reportFolder -Type Directory -Force
+    - $mypyResult = mypy ./flask_log_request_id --ignore-missing-imports --html-report $reportFolder --xml-report $reportFolder
+    - Write-Output "MyPy result = `""
+    - $mypyResult | % { Write-Output $_}
+    - Write-Output "`"`nEnd of MyPy result."
+    - if ($mypyResult.count -gt 2) {
+          return $False
+      }
+
+ +

.gitlab-ci.yml results from pipeline view

+ +

.gitlab-ci.yml results from pipeline view

+ +

.gitlab-ci.yml results from job view

+ +

.gitlab-ci.yml results from job view

+ +

.gitlab-ci.yml results from merge_request view

+ +

.gitlab-ci.yml results from merge_request view

+ + +
+ + + + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2018/11/creating-multiple-redis-instance-services-on-windows.html b/2018/11/creating-multiple-redis-instance-services-on-windows.html new file mode 100644 index 00000000..e3f06dbf --- /dev/null +++ b/2018/11/creating-multiple-redis-instance-services-on-windows.html @@ -0,0 +1,831 @@ + + + + + + +Creating Multiple Redis Instance Services On Windows - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +
+ + +
+ + + +
+

Even Salvatore Sanfilippo (creator of Redis) thinks it’s a bad idea to use multiple DBs in Redis. So we can install as many Redis instances as the number of DBs we need. This post will show you how to create multiple Redis instance as Windows service on the same Windows server.

+
+ +

Choose Redis Windows port version

+ +

As mentioned by the official doc, due to the lack of fork on Windows system, Redis is not officially supported on Windows. For Windows port version of Redis, we can use the one from : https://github.com/MicrosoftArchive/redis , currently the latest version is v3.2.100 which was released on Jul 1, 2016.

+ +

Create single Redis service on Windows

+ +

The official doc is good enough to get the job done. You can create the service by a simple command:

+ +
> redis-server --service-install
+
+ +

Or if you want to use a customized configuration:

+ +
> redis-server --service-install redis.windows.conf --loglevel verbose
+
+ +

BTW, if you want to use Redis in the Windows Subsystem for Linux (WSL) on Windows 10 or on Windows Server 2019, you can refer to this official doc.

+ +

Create multiple Redis services on Windows

+ +

There’s no many docs on the Internet telling you how to achieve that, in fact the doc from the Github gives the answer. We should use the magic --service-name.

+ +
# Create redis service which name is redis_6381 and listens to the port tcp 6381
+> redis-server --service-install --service-name redis_6381 --port 6381
+
+# Create redis service which name is redis_6382 and listens to the port tcp 6382
+> redis-server --service-install --service-name redis_6382 --port 6382
+
+ +

We just created 2 Redis server services on Windows, the only difference between them is the ports they listen to. All the other configurations are the default ones. This provokes a problem. That is the rdb dump file. The default configure set the rdb file name to dump.rdb, so both the redis services are using the same dump.rdb file which creates the file conflict in case of SAVE command or BGSAVE command.

+ +

Due to above problem, we need to set each redis service uses its own rdb file. +In redis config, there’re two configurations to control the rdb file.

+ +
    +
  1. +

    rbd file folder

    + +
    +
     # from redis-cli
    + config get dir
    + config set dir [new dir path]
    +
    +
  2. +
  3. +

    rdb file name

    + +
    +
     # from redis-cli
    + config get dbfilename
    + config set dbfilename [new db file name]
    +
    +
  4. +
+ +

Don’t forget to set also the maxmemory and maxmemory-policy in order to avoid the out of memory issue. Redis’ default maxmemory is set to 0 which means no limitation on used memory, and the default maxmemory-policy is set to noeviction, which means the Redis server returns errors when the memory limit was reached and the client is trying to execute commands that could result in more memory to be used.

+ +

To get Redis memory usage, use :

+
# from redis-cli
+info memory
+
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2018/11/setting-pwsh-invoke-webrequest-proxy.html b/2018/11/setting-pwsh-invoke-webrequest-proxy.html new file mode 100644 index 00000000..10fdb866 --- /dev/null +++ b/2018/11/setting-pwsh-invoke-webrequest-proxy.html @@ -0,0 +1,778 @@ + + + + + + +Setting Pwsh Invoke-WebRequest Proxy - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +
+ + +
+ + + +
+

Different than Windows Powershell, Powershell Core doesn’t use the system proxy setting on Windows. This post will show you an one-line command to set Powershell Core web cmdlets proxy.

+
+ +

My office working environment is behind an Internet proxy, and I use Scoop to install many dev tools on my Windows desktop.

+ +

Scoop is a Chocolatey-like Windows package management tool but its package sources are all on the Internet, there’s no possibility to mirror the packages to a local repository. So I need to use the company Internet proxy to use the Scoop.

+ +

In fact, there’s one possibility to install packages by using the local source control repo, I’ve never tested, it should be technically worked, and seems not very difficult to set up, but it needs to be maintained.

+ +

Scoop uses mainly the Invoke-WebRequest cmdlet to download the package sources from the Internet, and it has already generously given a wiki on how to configure proxy, but I’ve switched to Powershell Core (pwsh.exe) since a while, and none of the methods given by the wiki works.

+ +

After some googling, I finally find the issue 3122 from the official Powershell Github repository, the collaborator @markekraus gave a solution:

+ +
$PSDefaultParameterValues["invoke-webrequest:proxy"] = 'http://username:password@proxyserver:port'
+
+ +

When giving the password as a plain text in a string, always use the single quotes to create the string, as some special characters ($, `, etc.) in the password might be evaluated by the string created by the double quotes. Otherwise pass the password as a variable into a double quoted string to convert it to a plain text. On Linux bash, we can see the same thing.

+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2019/04/creating-custom-python-request-auth-class.html b/2019/04/creating-custom-python-request-auth-class.html new file mode 100644 index 00000000..a4e190ff --- /dev/null +++ b/2019/04/creating-custom-python-request-auth-class.html @@ -0,0 +1,886 @@ + + + + + + +Creating Custom Python Request Auth Class - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +
+ + +
+ + + +
+

When you need to use a complicated, or a non-standard API authentication method, +or your dev and prd environments don’t use the same API authentication method, +it might be better to create a Python requests auth method to reduce your work.

+
+ +

Create the class MyAuth

+ +

Suppose you have an API url at: https://httpbin.org/, +its authentication method is by request’s headers where headers["Authorization"] = username_password.

+ +

So the class MyAuth could be as following:

+ +

The most important is the __call__() method.

+ +
import requests
+
+
+def auth_header_value(username, password):
+    return "{}_{}".format(username, password)
+
+
+class MyAuth(requests.auth.AuthBase):
+    # http://docs.python-requests.org/en/master/user/authentication/#new-forms-of-authentication
+    def __init__(self, username, password):
+        self.username = username
+        self.password = password
+
+    def __call__(self, r: requests.Request):
+        # Implement my authentication
+        # http://docs.python-requests.org/en/master/_modules/requests/auth/
+        r.headers["Authorization"] = auth_header_value(self.username, self.password)
+        return r
+
+# unittest
+def test_myauth():
+    username = "u1"
+    password = "p1"
+    auth = MyAuth(username, password)
+    url = "https://httpbin.org/"
+    # http://docs.python-requests.org/en/master/user/advanced/
+    prepared_request = requests.Request("GET", url)
+    prepared_request_with_auth = auth.__call__(prepared_request)
+    assert prepared_request_with_auth.headers["Authorization"] == "{}_{}".format(
+        username, password
+    )
+
+ +

Use the class MyAuth

+ +

Without the class MyAuth

+ +

We should directly provide the headers object in the request:

+ +
import requests
+
+username = "u1"
+password = "p1"
+url = "https://httpbin.org/"
+headers = {"Authorization": "{}_{}".format(username, password)}
+requests.get(url, headers=headers)
+
+ +

With the class MyAuth

+ +

We just need pass it to the param auth:

+ +
import requests
+import MyAuth
+
+username = "u1"
+password = "p1"
+url = "https://httpbin.org/"
+auth = MyAuth(username, password)
+requests.get(url, auth=auth)
+
+ +

Conditional MyAuth

+ +

You may not find the power of the MyAuth from te above examples. True. +But suppose if your dev API uses HTTPBasicAuth, +and your prd API uses a special key (“token”) in the request’s headers. +And suppose you have many APIs to target in this manner. +What would you do without the class MyAuth ? Adding if..else.. block everywhere ?

+ +

With the class MyAuth, we just need to add only once if..else.. block in the __call__() method.

+ +

For example:

+ +
import requests
+
+
+def auth_header_value(username, password):
+    return "{}_{}".format(username, password)
+
+
+class MyAuth(requests.auth.AuthBase):
+    # http://docs.python-requests.org/en/master/user/authentication/#new-forms-of-authentication
+    def __init__(self, username, password, token, env):
+        # we must specify all the possible auth credentials here,
+        # and the variables (env) which allows to select the credential to use.
+        self.username = username
+        self.password = password
+        self.token = token
+        self.env = env
+
+    def __call__(self, r: requests.Request):
+        # Implement my authentication
+        if env == "dev":
+            # http://docs.python-requests.org/en/master/_modules/requests/auth/
+            r.headers['Authorization'] = requests.auth._basic_auth_str(
+                self.username, self.password
+            )
+        elif env == "prd":
+            r.headers["token"] = self.token
+        return r
+
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2019/05/using-python-sqlalchemy-session-in-multithreading.html b/2019/05/using-python-sqlalchemy-session-in-multithreading.html new file mode 100644 index 00000000..6a60d0df --- /dev/null +++ b/2019/05/using-python-sqlalchemy-session-in-multithreading.html @@ -0,0 +1,904 @@ + + + + + + +Using Python SQLAlchemy session in multithreading - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +
+ + +
+ + + +
+

SQLAlchemy DB session is not thread safe. In this post, I will show you 2 ways to use it in a multithreading context.

+
+ +

Way 1 - Using contextmanager to create a session per thread

+ +

Below is an example given by the official doc to show how to use the contextmanager to construct, commit and close a SQLAlchemy session.

+ +
### another way (but again *not the only way*) to do it ###
+
+from contextlib import contextmanager
+
+
+@contextmanager
+def session_scope():
+    """Provide a transactional scope around a series of operations."""
+    session = Session()
+    try:
+        yield session
+        session.commit()
+    except:
+        session.rollback()
+        raise
+    finally:
+        session.close()
+
+
+def run_my_program():
+    with session_scope() as session:
+        ThingOne().go(session)
+        ThingTwo().go(session)
+
+ +

Suppose we have a function called f1 which does something with the session. And we need to call f1 in a multithreading context. +All we need to do is to add the session_scope() around the f1:

+ +
from contextlib import contextmanager
+from multiprocessing.dummy import Pool as ThreadPool
+
+# db_utils is a python file that creats the Session by using the factory sessionmaker(),
+# not shown here.
+from db_utils import Session
+
+
+@contextmanager
+def session_scope():
+    """Provide a transactional scope around a series of operations."""
+    session = Session()
+    try:
+        yield session
+        session.commit()
+    except:
+        session.rollback()
+        raise
+    finally:
+        session.close()
+
+
+def f1(session, number):
+    # do something around the session and the number...
+
+
+def thread_worker(number):
+    # We're using the session context here.
+    with session_scope() as session:
+        f1(session, number)
+
+
+def work_parallel(numbers, thread_number=4):
+    pool = ThreadPool(thread_number)
+    results = pool.map(thread_worker, numbers)
+    # If you don't care about the results, just comment the following 3 lines.
+    # pool.close()
+    # pool.join()
+    # return results
+
+
+if __name__ == "__main__":
+    numbers = [1, 2, 3]
+    work_parallel(numbers, 8)
+
+ +

Way 2 - Using scoped_session to create a thread-local variable

+ +

https://docs.sqlalchemy.org/en/13/orm/contextual.html#contextual-thread-local-sessions

+ +
+

The scoped_session object is a very popular and useful object used by many SQLAlchemy applications. However, it is important to note that it presents only one approach to the issue of Session management. If you’re new to SQLAlchemy, and especially if the term “thread-local variable” seems strange to you, we recommend that if possible you familiarize first with an off-the-shelf integration system such as Flask-SQLAlchemy or zope.sqlalchemy.

+
+ +
from multiprocessing.dummy import Pool as ThreadPool
+
+from sqlalchemy.orm import scoped_session
+from sqlalchemy.orm import sessionmaker
+
+
+def f1(number):
+    # now all calls to Session() will create a thread-local session.
+    # If we call upon the Session registry a second time, we get back the same Session.
+    session = Session()
+    # do something around the session and the number...
+
+    # You can even directly use Session to perform DB actions.
+    # See: https://docs.sqlalchemy.org/en/13/orm/contextual.html#implicit-method-access
+    # when methods are called on the Session object, they are proxied to the underlying Session being maintained by the registry.
+
+
+def thread_worker(number):
+    f1(number)
+
+
+def work_parallel(numbers, thread_number=4):
+    pool = ThreadPool(thread_number)
+    results = pool.map(thread_worker, numbers)
+    # If you don't care about the results, just comment the following 3 lines.
+    # pool.close()
+    # pool.join()
+    # return results
+
+
+if __name__ == "__main__":
+    engine = create_engine("postgresql://scott:tiger@localhost/mydatabase")
+    session_factory = sessionmaker(bind=engine)
+
+    # The Session object created here will be used by the function f1 directly.
+    Session = scoped_session(session_factory)
+
+    numbers = [1, 2, 3]
+    work_parallel(numbers, 8)
+
+    Session.remove()
+
+ +

Bonus - How the Python web frameworks work with SQLAlchemy thread local scope

+ +

https://docs.sqlalchemy.org/en/13/orm/contextual.html#using-thread-local-scope-with-web-applications

+ + +
+ + + + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2019/06/git-cheat-sheet.html b/2019/06/git-cheat-sheet.html new file mode 100644 index 00000000..bc0a57b6 --- /dev/null +++ b/2019/06/git-cheat-sheet.html @@ -0,0 +1,1024 @@ + + + + + + +Git Cheat Sheet - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 5 minute read + + + +

+ + +
+ + +
+ + + +
+

This is not a complete Git cheat sheet for everyone, this is just a personal cheat sheet for some often forgotten git commands.

+ +
+ +

Alias

+ +

User level alias

+ +

Edit ~/.gitconfig

+ +
git config --global alias.st status
+git config --global alias.lga log --graph --decorate --oneline --all
+git config --global alias.co checkout
+git config --global alias.last log -1 HEAD
+git config --global alias.ci commit
+git config --global alias.unstage reset HEAD
+git config --global alias.ll "log --graph --all --pretty=format:'%C(auto)%h%Creset %an: git config --global %s - %Creset %C(auto)%d%Creset %C(bold black)(%cr)%Creset %C(bold git config --global black)(%ci)%Creset' --no-abbrev-commit"
+git config --global alias.sh show
+git config --global alias.df diff
+git config --global alias.br branch
+git config --global alias.cm checkout main
+git config --global alias.cd checkout dev
+git config --global alias.rum pull --rebase upstream main
+git config --global alias.rud pull --rebase upstream dev
+git config --global alias.rom pull --rebase origin main
+git config --global alias.rod pull --rebase origin dev
+
+ +

~/.bashrc

+ +
alias gitpush='git ci -am $GIT_BRANCH ; git push origin $GIT_BRANCH'
+alias gitamendpush='git add . ; git amend ; git push origin $GIT_BRANCH -f'
+alias gitrebasemain='git cm ; git rom ; git fetch origin --prune ; git br -d $GIT_BRANCH'
+alias gitrebasedev='git cd ; git rod ; git fetch origin --prune ; git br -d $GIT_BRANCH'
+
+ +

Restore

+ +

Restore a file to an old version

+ +
git restore --source [old_commit_hash] [file_name]
+
+ +

Restore a deleted branch

+ +
git reflog
+git branch [branch_name] [commit_hash_that_preceded_the_delete_commit]
+
+ +

Undo

+ +

+ +

Discard changes in working directory

+ +
# discard changes to a file in working directory
+git checkout <filename or wildcard>
+
+# discard changes to all files in working directory
+git checkout .
+# or
+git checkout *
+
+ +

Untracked files cannot be discarded by checkout.

+ +

Discard last commit (completely remove)

+ +
# better to show git log history before using --hard for rollback purpose.
+git reset --hard HEAD~
+
+ +

We can recover the commit discarded by --hard with the git cherry-pick [commit number] if we displayed or saved it before. Whatever you can also use git reflog to get the commit number too.

+ +

Unstage from staging area

+ +

StackOverflow: How do I undo git add before commit?

+ +
# unstage a file from staging area
+git reset <filename or wildcard>
+
+# unstage all files from staging area
+git reset
+
+ +

No more need to add HEAD like git reset HEAD <file> and git reset HEAD since git v1.8.2.

+ +

Do not use git rm --cached <filename> to unstage, it works only for newly created file to remove them from the staging area. But if you specify a existing file, it will delete it from cache, even if it is not staged.

+ +

Undo commit to working directory

+ +

StackOverflow: How do I undo the most recent local commits in Git?

+ +
+

You should readd the files if you want to commit them, as they’re in the working directory now, they’re unstaged too.

+
+ +
# Undo last commit to working directory
+git reset HEAD~
+# same as to
+git reset HEAD~1
+
+# Undo last 2 commits to working directory
+git reset HEAD~2
+
+# Undo till a special commit to working directory,
+# the special commit and every commits before are still committed.
+git reset <commit number>
+
+ +

git reset HEAD will do nothing, as the HEAD is already at the last commit.

+ +

git reset HEAD~1 <file> will create a delete file index in staging area. Normally we don’t need this command.

+ +

Undo commit to staging area

+ +

StackOverflow: How do I undo the most recent local commits in Git?

+ +

Add --soft to git reset to undo commit to staging area.

+ +

Undo staging to working directory

+ +
# used after a git add
+git restore --staged <file>
+git reset
+
+ +

Authentication

+ +

With bearer token

+ +
# https://learn.microsoft.com/en-us/azure/devops/integrate/get-started/authentication/service-principal-managed-identity?view=azure-devops#q-can-i-use-a-service-principal-to-do-git-operations-like-clone-a-repo
+git -c http.extraheader="AUTHORIZATION: bearer $ServicePrincipalAadAccessToken" clone https://dev.azure.com/{yourOrgName}/{yourProjectName}/_git/{yourRepoName}
+
+ +

Branch

+ +

Force local branch to the same with remote branch

+ +
git reset --hard upstream/master
+or
+git checkout -B master origin/master # sometimes this one might not work
+
+ +

get last commit of another local branch

+ +
git cherry-pick another_local_branch
+
+ +

get all commits of another local other_branch

+ +
get rebase another_local_branch
+
+ +

Show diff

+ +

show content in staging area

+ +
git diff --cached
+
+ +

show content in the last commit local repository

+ +
git show
+git show HEAD
+
+ +

show content in the second last commit in local repository

+ +
git show HEAD~
+git show HEAD~1
+
+ +

Disable host key checking

+ +

Sometimes during CICD, we need to use git to do something, if the remote repository is accessed by SSH, the first time when you use git (git clone for example), you need to accept the remote host key. This might be a problem for CICD as it cannot type Y for you as you do in an interactive session. To let git to disable the host key checking or precisely accept automatically the remote host key, you need to add the following line in git config:

+ +
> git config --global core.sshcommand 'ssh -i [YouPrivateKeyPath] -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -F /dev/null'
+
+ +

You may need to use git config --system to set the config at system level.

+ +

Proxy

+ +

Usually, in an enterprise environment, we need to use a proxy to connect to the Internet resources. +And from Powershell, we can ask Powershell to inherit the IE proxy settings.

+ +

With this proxy setting in Powershell, we should be able to use git clone to connect to the external www.github.com or to the internally hosted for example github.your_enterprise_local_domain.

+ +

But trust me, some enterprises’ proxy settings (often for those who use a .pac file) are so complicated that Powershell cannot use the proxy the same way as IE.

+ +

In such case, fortunately, git has its own proxy setting. I think the official doc doesn’t explain very well how to set the proxy. But this gist gives some good examples.

+ +

So, normally, you just need to set this config to ask git to use the $internet_proxy only for the url github.com, and all the other urls, git won’t use the proxy.

+ +
git config --global http.https://github.com.proxy $internet_proxy
+
+ +

GUI

+ +

GitForWindows ships with a GUI tool, very cool.

+ +
# start git gui tool
+git gui
+
+ +

git-gui

+ +

Pull Requests with Rebase

+ +

Azure devops doc: https://devblogs.microsoft.com/devops/pull-requests-with-rebase/

+ +

Moving Git repository content to another repository preserving history

+ +
# https://stackoverflow.com/a/55907198/5095636
+# this keeps all commits history and git tags
+$ git clone --bare https://github.com/exampleuser/old-repository.git
+$ cd old-repository.git
+$ git push --mirror https://github.com/exampleuser/new-repository.git
+$ cd -
+$ rm -rf old-repository.git
+
+ + +
+ +
+ + + + + + + +

+ Tags: + + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2019/07/filtering-pandas-dataframe.html b/2019/07/filtering-pandas-dataframe.html new file mode 100644 index 00000000..3e6f96a2 --- /dev/null +++ b/2019/07/filtering-pandas-dataframe.html @@ -0,0 +1,1030 @@ + + + + + + +Filtering In Pandas Dataframe - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 5 minute read + + + +

+ + +
+ + +
+ + + +
+

Pandas dataframe is like a small database, +we can use it to inject some data and do some in-memory filtering without any external SQL. +This post is much like a summary of this StackOverflow thread.

+
+ +

Building dataframe

+ +
In [1]: import pandas as pd
+   ...: import numpy as np
+   ...: df = pd.DataFrame({'A': 'foo bar foo bar foo bar foo foo'.split(),
+   ...:                    'B': 'one one two three two two one three'.split(),
+   ...:                    'C': np.arange(8), 'D': np.arange(8) * 2})
+
+In [2]: df
+Out[2]:
+     A      B  C   D
+0  foo    one  0   0
+1  bar    one  1   2
+2  foo    two  2   4
+3  bar  three  3   6
+4  foo    two  4   8
+5  bar    two  5  10
+6  foo    one  6  12
+7  foo  three  7  14
+
+ +

Some basic filtering conditions

+ +

Filtering by A = ‘foo’

+ +
In [3]: df[df.A == 'foo']
+Out[3]:
+     A      B  C   D
+0  foo    one  0   0
+2  foo    two  2   4
+4  foo    two  4   8
+6  foo    one  6  12
+7  foo  three  7  14
+
+ +

Filtering by A = ‘foo’ and B = ‘one’

+ +
In [4]: df[(df.A == 'foo') & (df.B == 'one')]
+Out[4]:
+     A    B  C   D
+0  foo  one  0   0
+6  foo  one  6  12
+
+ +

Filtering by A = ‘foo’ or B = ‘one’

+ +
In [5]: df[(df.A == 'foo') | (df.B == 'one')]
+Out[5]:
+     A      B  C   D
+0  foo    one  0   0
+1  bar    one  1   2
+2  foo    two  2   4
+4  foo    two  4   8
+6  foo    one  6  12
+7  foo  three  7  14
+
+ +

Different ways to achieve the same filtering

+ +
+

Let’s take the example for filtering by A = 'foo' and B = 'one'

+
+ +

Column as dataframe property

+ +
In [4]: df[(df.A == 'foo') & (df.B == 'one')]
+Out[4]:
+     A    B  C   D
+0  foo  one  0   0
+6  foo  one  6  12
+
+ +

Column as dataframe dict key

+ +
In [7]: df[(df['A'] == 'foo') & (df['B'] == 'one')]
+Out[7]:
+     A    B  C   D
+0  foo  one  0   0
+6  foo  one  6  12
+
+ +

Using multiple single filters

+ +
In [16]: df[df.A == 'foo'][df.B == 'one']
+C:\Users\xiang\AppData\Local\PackageManagement\NuGet\Packages\python.3.7.0\tools\Scripts\ipython:1: UserWarning: Boolean Series key will be reindexed to match DataFrame index.
+Out[16]:
+     A    B  C   D
+0  foo  one  0   0
+6  foo  one  6  12
+
+ +

Using numpy array

+ +
In [24]: df[(df.A.values == 'foo') & (df.B.values == 'one')]
+Out[24]:
+     A    B  C   D
+0  foo  one  0   0
+6  foo  one  6  12
+
+ +

Using isin function

+ +
In [9]: df[( df['A'].isin(['foo']) ) & ( df['B'].isin(['one']) )]
+Out[9]:
+     A    B  C   D
+0  foo  one  0   0
+6  foo  one  6  12
+
+ +

Using underlying numpy in1d function

+ +
In [25]: df[(np.in1d(df['A'].values, ['foo'])) & (np.in1d(df['B'].values, ['one']))]
+Out[25]:
+     A    B  C   D
+0  foo  one  0   0
+6  foo  one  6  12
+
+ +

Using query API (developer friendly)

+ +
In [10]: df.query("(A == 'foo') & (B == 'one')")
+Out[10]:
+     A    B  C   D
+0  foo  one  0   0
+6  foo  one  6  12
+
+ +

Using numpy where function and dataframe iloc positional indexing

+ +
In [20]: df.iloc[np.where( (df.A.values=='foo') & (df.B.values=='one') )]
+Out[20]:
+     A    B  C   D
+0  foo  one  0   0
+6  foo  one  6  12
+
+ +

Using xs label indexing

+ +

The Syntax is too complicated.

+ +

Developer friendly filtering

+ +

As mentioned previously, the query API method is a developer friendly filtering method.

+ +

Why? All the other methods must include the original df object in the filter. If we have a dynamic filter conditions, it will be difficult to generate the filters (pandas Series) with the df object. I haven’t found the solution to build this kind of filter by looping over a Python dict.

+ +

For example:

+ +

The filter is based on a Python dict, the key of the dict corresponds to the dataframe column, and the value of the dict corresponds to the value to dataframe column value to filter. One more context, if the value is None, don’t filter on the corresponding key (column).

+ +

Suppose the filter dict is like this one:

+ +
filter_dict = {'A': 'foo', 'B': 'one', 'C': None, 'D': None}
+
+ +

By using df object in the filter, we should see something like this:

+ +
df[(df['A'] == 'foo') & (df['B'] == 'one')]
+
+ +

It’s easy to type manually the filter directly from a shell (ipython or jupyter as you like), but how you build the same filter from a Python script ? Not simple.

+ +
+

Please let me know if you have any suggestions :)

+
+ +

But with the query API, we just need to convert the filter_dict to a string like: "(A == 'foo') & (B == 'one')". This is pretty easy in pure Python:

+ +
In [32]: filter_dict = {'A': 'foo', 'B': 'one', 'C': None, 'D': None}
+
+In [33]: filter_string = " & ".join(["{} == '{}'".format(k,v) for k,v in filter_dict.items() if v is not None])
+
+In [34]: filter_string
+Out[34]: "A == 'foo' & B == 'one'"}
+
+ +

Benchmark

+ +

You can get the benchmark from this StackOverflow thread.

+ +

Generally speaking, except for the query API and the xs label indexing methods, all the others are fast.

+ +

But for a large quantity of data, the query API becomes pretty fast.

+ +

Some benchmarks I tested from my laptop:

+ +

For 8 lines of data

+ +
In [35]: import pandas as pd
+   ...: import numpy as np
+   ...: df = pd.DataFrame({'A': 'foo bar foo bar foo bar foo foo'.split(),
+   ...:                    'B': 'one one two three two two one three'.split(),
+   ...:                    'C': np.arange(8), 'D': np.arange(8) * 2})
+
+In [36]: %timeit df.query("(A == 'foo') & (B == 'one')")
+1.48 ms ± 35.1 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)
+
+In [37]: %timeit df[df.A == 'foo'][df.B == 'one']
+1.01 ms ± 33.7 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)
+
+In [38]: %timeit df[(df.A == 'foo') & (df.B == 'one')]
+688 µs ± 48.3 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)
+
+In [39]: %timeit df[(df.A.values == 'foo') & (df.B.values == 'one')]
+248 µs ± 15 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)
+
+In [40]: %timeit df.iloc[np.where( (df.A.values=='foo') & (df.B.values=='one') )]
+287 µs ± 20.8 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)
+
+ +

For 30k lines of data

+ +
In [51]: import pandas as pd
+    ...: import numpy as np
+    ...: df = pd.DataFrame({'A': ('foo bar ' * 15000).split(),
+    ...:                    'B': ('one one two two three three ' * 5000).split(),
+    ...:                    'C': np.arange(30000), 'D': np.arange(30000) * 2})
+
+In [52]: %timeit df.query("(A == 'foo') & (B == 'one')")
+2.83 ms ± 373 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)
+
+In [53]: %timeit df[df.A == 'foo'][df.B == 'one']
+6.51 ms ± 230 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)
+
+In [54]: %timeit df[(df.A == 'foo') & (df.B == 'one')]
+5.58 ms ± 480 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)
+
+In [55]: %timeit df[(df.A.values == 'foo') & (df.B.values == 'one')]
+1.47 ms ± 58 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)
+
+In [56]: %timeit df.iloc[np.where( (df.A.values=='foo') & (df.B.values=='one') )]
+1.5 ms ± 38.5 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)
+
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2019/07/troubleshooting-python-twine-cannot-upload-package-on-windows.html b/2019/07/troubleshooting-python-twine-cannot-upload-package-on-windows.html new file mode 100644 index 00000000..dd9fada6 --- /dev/null +++ b/2019/07/troubleshooting-python-twine-cannot-upload-package-on-windows.html @@ -0,0 +1,929 @@ + + + + + + +Troubleshooting Python Twine Cannot Upload Package On Windows - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +
+ + +
+ + + +

Python has several tools to upload packages to PyPi or some private Artifactory locations. The mostly used one should be twine. Although twine is not a Python originate tool, but it’s officially recommended by Python.org.

+ +

Building the package

+ +

Just a quick callback on how to build the pacakge. We need to create a file named setup.py at the root of the app. Use another file named MANIFEST.IN to include the non-code files to the package. Don’t forget to set include_package_data=True in setup.py

+ +
+

Wheel

+ +

A Built Distribution format introduced by PEP 427, which is intended to replace the Egg format. Wheel is currently supported by pip.

+
+ +

Before the build, ensure that version key in setup.py is well defined.

+ +
# to build a python wheel package
+# sdist will generate a .tar.gz file in dist/
+# bdist_wheel will generate a .whl file in dist/
+python setup.py sdist bdist_wheel
+
+ +

Upload built package to PyPi or private Artifactory.

+ +

We use twine to upload the Python packages. Before using it, we need to create a file name .pypirc in ~/.

+ +

There’s an example from jfrog for .pypirc.

+ +

Then, we can upload the package by:

+ +
# -r dev, dev is a repo defined in the ~/.pypirc file.
+6.2.0> twine upload dist/* -r dev --cert [path_of_artifactory_site_cert_bundle_full_chain_in_pem_format_it_seems_that_no_param_to_ignore_ssl_error_with_twine]
+
+ +

.pypirc path error

+ +

Unfortunately, on Windows OS, you might get following error message:

+ +
6.2.0> twine upload dist/* --cert [artifactory_site_cert_full_chain_in_pem_format] -r dev
+
+InvalidConfiguration: Missing 'dev' section from the configuration file or not a complete URL in --repository-url.
+Maybe you have a out-dated '~/.pypirc' format?
+more info: https://docs.python.org/distutils/packageindex.html#pypirc
+
+ +

This error is too generic, one of the reasons is because twine cannot find the file ~/.pypirc, but if you check by get-content ~/.pypirc, it exits.

+ +

The reason for this error is that if you’re on Windows, and $env:HOME exists and doesn’t point to the same location as $env:USERPROFILE.

+ +

twine uses $env:HOME as ~/ as per os.path.expanduser(), but Windows powershell uses $env:USERPROFILE as ~/. $env:HOME is not set by Windows by default. And Windows administrators often use $env:HOME to redirect the user roaming profile.

+ +

.pypirc path error reason

+ +
    +
  1. +

    Firstly, I set $env:HOME to a temp file, so it is differnet than $env:USERPROFILE

    + +
    +
     # Initially $env:HOME doesn't exist
    + 6.2.0> Get-ChildItem env: | Out-String -st | Select-String 'userpro|home'
    +
    + ANDROID_SDK_HOME               C:\Android
    + HOMEDRIVE                      C:
    + HOMEPATH                       \Users\xiang
    + USERPROFILE                    C:\Users\xiang
    +
    + 6.2.0> $env:HOME = 'c:/temp'
    +
    + # now, we have $env:HOME which is different than $env:USERPROFILE
    + 6.2.0> Get-ChildItem env: | Out-String -st | Select-String 'userpro|home'
    +
    + ANDROID_SDK_HOME               C:\Android
    + HOME                           c:/temp
    + HOMEDRIVE                      C:
    + HOMEPATH                       \Users\xiang
    + USERPROFILE                    C:\Users\xiang
    +
    +
  2. +
  3. Check ~/ in Python +
    +
     In [1]: import os
    +
    + In [2]: os.path.expanduser('~/')
    + Out[2]: 'c:/temp/'
    +
    +
  4. +
  5. Check ~/ in Powershell +
    +
     6.2.0> Resolve-Path ~/
    +
    + Path
    + ----
    + C:\Users\xiang
    +
    +
  6. +
+ +

So if we created the .pypirc file in ~/ in Powershell, twine won’t find it.

+ +

Why os.path.expanduser() doesn’t resolve the same ~/ as Powershell

+ +

As shown previsouly, Windows Powershell resolves ~/ as $env:USERPROFILE. How about os.path.expanduser()? Let’s check its source code by the inspect module.

+ +
In [1]: import os ; print(inspect.getsource(os.path.expanduser))
+def expanduser(path):
+    """Expand ~ and ~user constructs.
+
+    If user or $HOME is unknown, do nothing."""
+    path = os.fspath(path)
+    if isinstance(path, bytes):
+        tilde = b'~'
+    else:
+        tilde = '~'
+    if not path.startswith(tilde):
+        return path
+    i, n = 1, len(path)
+    while i < n and path[i] not in _get_bothseps(path):
+        i += 1
+
+    if 'HOME' in os.environ:
+        userhome = os.environ['HOME']
+    elif 'USERPROFILE' in os.environ:
+        userhome = os.environ['USERPROFILE']
+    elif not 'HOMEPATH' in os.environ:
+        return path
+    else:
+        try:
+            drive = os.environ['HOMEDRIVE']
+        except KeyError:
+            drive = ''
+        userhome = join(drive, os.environ['HOMEPATH'])
+
+    if isinstance(path, bytes):
+        userhome = os.fsencode(userhome)
+
+    if i != 1: #~user
+        userhome = join(dirname(userhome), path[1:i])
+
+    return userhome + path[i:]
+
+In [2]:
+
+ +

From the source code, obviously, if $env:HOME exists, expanduser() will return its value. If $env:HOME doesn’t exists, it falls back to $env:USERPROFILE, if not again, it falls back to $env:HOMEDRIVE/$env:HOMEPATH.

+ +

Solutions

+ +

We have 3 solutions.

+ +
    +
  1. +

    use twine --config-file to manually specify the .pypirc config file.

    +
  2. +
  3. +

    if $env:HOME exists, copy the .pypirc file to $env:HOME, otherwise to $env:USERPROFILE.

    +
  4. +
  5. +

    declare all the upload params as environment variables.

    +
  6. +
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2019/09/fast-tcp-port-check-in-powershell.html b/2019/09/fast-tcp-port-check-in-powershell.html new file mode 100644 index 00000000..4f1b1e19 --- /dev/null +++ b/2019/09/fast-tcp-port-check-in-powershell.html @@ -0,0 +1,891 @@ + + + + + + +A fast way to check TCP port in Powershell - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +
+ + +
+ + + +

The Test-NetConnection cmdlet is great and verbose but too slow if the remote port to check is not opened. This is due to its timeout setting and cannot be modified. In this port, I will show you a custom function that leverages the power of System.Net.Sockets.TcpClient to accelerate the port test.

+ +

Update 2019-12-31: I didn’t mention Test-Connection previously because although it has the parameter -TimeoutSeconds, its output only has True or False. What a pity. But things are going to be changed, as per this github issue, @jackdcasey is preparing a pull request to make Test-Connection’s output verbose enough.

+ +

Test-NetConnection is slow if the port is not opened

+ +

If the port is opened, it’s OK.

+ +
# if the port is opened
+6.2.2> Measure-Command {Test-NetConnection www.google.fr -Port 80} | % TotalSeconds
+0,2015152
+
+ +

But if the port is not opened, it would be better to take a coffee to wait for the result.

+ +
# if the port is not opened
+6.2.2> Measure-Command {Test-NetConnection www.google.fr -Port 123} | % TotalSeconds
+WARNING: TCP connect to (2a00:1450:4007:805::2003 : 123) failed
+WARNING: TCP connect to (172.217.18.195 : 123) failed
+42,5026257
+
+ +

For most of the cases, we only need to test a TCP port in a fast network (often LAN), waiting for 42 seconds is ridiculous, but unfortunately, Test-NetConnection doesn’t provide a parameter to decrease the timeout.

+ +

System.Net.Sockets.TcpClient is fast

+ +
+

“Talk is cheap. Show me the code.”

+
+ +

Test-Port demos

+ +
# if the port is opened
+6.2.2> Measure-Command {Test-Port www.google.fr 80} | % TotalSeconds
+0,0648323
+
+# if the port is not opened
+6.2.2> Measure-Command {Test-Port www.google.fr 123} | % TotalSeconds
+1,0072371
+
+# it works with pipeline too
+6.2.2> Measure-Command {"www.google.fr:80", "www.orange.fr:123", "www.free.fr" | Test-Port} | % TotalSeconds
+2,0201628
+
+# the output of the Test-Port, the default port to check is TCP 5985
+6.2.2> "www.google.fr:80", "www.orange.fr:123", "www.free.fr" | Test-Port | ft -a
+
+RemoteHostname RemotePort PortOpened TimeoutInMillisecond SourceHostname OriginalComputerName
+-------------- ---------- ---------- -------------------- -------------- --------------------
+www.google.fr  80               True                 1000 DELL-ZX        www.google.fr:80
+www.orange.fr  123             False                 1000 DELL-ZX        www.orange.fr:123
+www.free.fr    5985            False                 1000 DELL-ZX        www.free.fr
+
+ +

Test-Port source code

+ +

The code is still in POC, there’re still many parts to improve. For example, validating the given $ComputerName by resolving its IP, and error handling, etc.

+ +
function Test-Port {
+    [CmdletBinding()]
+    param (
+        [Parameter(ValueFromPipeline = $true, HelpMessage = 'Could be suffixed by :Port')]
+        [String[]]$ComputerName,
+
+        [Parameter(HelpMessage = 'Will be ignored if the port is given in the param ComputerName')]
+        [Int]$Port = 5985,
+
+        [Parameter(HelpMessage = 'Timeout in millisecond. Increase the value if you want to test Internet resources.')]
+        [Int]$Timeout = 1000
+    )
+
+    begin {
+        $result = [System.Collections.ArrayList]::new()
+    }
+
+    process {
+        foreach ($originalComputerName in $ComputerName) {
+            $remoteInfo = $originalComputerName.Split(":")
+            if ($remoteInfo.count -eq 1) {
+                # In case $ComputerName in the form of 'host'
+                $remoteHostname = $originalComputerName
+                $remotePort = $Port
+            } elseif ($remoteInfo.count -eq 2) {
+                # In case $ComputerName in the form of 'host:port',
+                # we often get host and port to check in this form.
+                $remoteHostname = $remoteInfo[0]
+                $remotePort = $remoteInfo[1]
+            } else {
+                $msg = "Got unknown format for the parameter ComputerName: " `
+                    + "[$originalComputerName]. " `
+                    + "The allowed formats is [hostname] or [hostname:port]."
+                Write-Error $msg
+                return
+            }
+
+            $tcpClient = New-Object System.Net.Sockets.TcpClient
+            $portOpened = $tcpClient.ConnectAsync($remoteHostname, $remotePort).Wait($Timeout)
+
+            $null = $result.Add([PSCustomObject]@{
+                RemoteHostname       = $remoteHostname
+                RemotePort           = $remotePort
+                PortOpened           = $portOpened
+                TimeoutInMillisecond = $Timeout
+                SourceHostname       = $env:COMPUTERNAME
+                OriginalComputerName = $originalComputerName
+                })
+        }
+    }
+
+    end {
+        return $result
+    }
+}
+
+ +

Test-Port in parallel

+ +

Although the timeout in Test-Port is 1000 milliseconds, if we have 100 hosts to check and if all the ports are not opened, Test-Port will be slow too, because it runs the check in serial.

+ +

I don’t prefer to implement the parallel inside Test-Port, as we have already some pure powershell parallel solutions by using the RunspacePool (PoshRSJob, Invoke-Parallel, etc.). And Microsoft is releasing its home-born parallel mechanism ForEach-Object -Parallel for Powershell 7.

+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2019/09/sqlalchemy-mixin-in-method.html b/2019/09/sqlalchemy-mixin-in-method.html new file mode 100644 index 00000000..11c1c06e --- /dev/null +++ b/2019/09/sqlalchemy-mixin-in-method.html @@ -0,0 +1,804 @@ + + + + + + +SQLAlchemy mixin in method - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +
+ + +
+ + + +

If I’m not wrong, the SQLAlchemy official doc provides some examples to explain how to share a set of common columns, some common table options, or other mapped properties, across many classes. But I cannot find how to share common methods (e.g. your customized to_dict() method). This post will just show you a POC to achieve this goal by using Python Mixin.

+ +

Share the common method to_dict() across two SQLAlchemy models

+ +
from sqlalchemy import Column, Integer, String
+from sqlalchemy.ext.declarative import declarative_base
+
+Base = declarative_base()
+
+
+class ModelMixin(object):
+
+    def to_dict(self):
+        return {c.name: getattr(self, c.name) for c in self.__table__.columns}
+
+
+class ModelA(Base, ModelMixin):
+    __tablename__ = "model_a"
+
+    model_id = Column(Integer, primary_key=True)
+    name = Column(String)
+
+
+class ModelB(Base, ModelMixin):
+    __tablename__ = "model_b"
+
+    model_id = Column(Integer, primary_key=True)
+    name = Column(String)
+
+ +

Test:

+ +
# to_dict() method from ModelMixin is shared between ModelA and ModelB
+
+>>> a = ModelA(model_id=11, name='a1')
+>>> a.to_dict()
+{'model_id': 11, 'name': 'a1'}
+
+>>> b = ModelB(model_id=22, name='b1')
+>>> b.to_dict()
+{'model_id': 22, 'name': 'b1'}
+
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2019/10/installing-python3-on-ubuntu.html b/2019/10/installing-python3-on-ubuntu.html new file mode 100644 index 00000000..4a89bc28 --- /dev/null +++ b/2019/10/installing-python3-on-ubuntu.html @@ -0,0 +1,870 @@ + + + + + + +Install Python3 on Ubuntu - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +
+ + +
+ + + +

Most of tutorials on the Internet about installing Python3.6 on Ubuntu are by using 3rd party PPA repositories. If for any reason, you cannot use them, hereunder a quick tutorial for installing it from the Python official source, you should in advance download the source to the Ubuntu.

+ +

Installing Python3.6 on Ubuntu 16.04

+ +

Disabling IPv6

+ +

IPv6 is enabled by default on Ubuntu 16.04, in some cases, your Ubuntu network connection might be very low due to IPv6. Use ip a | grep inet6 to check if IPv6 is enabled.

+ +

Ref: How to disable ipv6 address on ubuntu 18 04 bionic beaver linux

+ +

To disable IPv6 in a persist way, add following 2 lines in the file /etc/sysctl.conf and reload the sysctl by sudo sysctl --system or reboot the server:

+ +
net.ipv6.conf.all.disable_ipv6=1
+net.ipv6.conf.default.disable_ipv6=1
+
+ +

Installing build packages

+ +
sudo apt install -y build-essential zlib1g-dev libssl-dev
+
+ +

without libssl-dev package, pip install will throw TLS/SSL error.

+ +

From this point of view, installing Python on Windows by Scoop is much more pleasant :)

+ +

Installing Python3.6 from official source

+ +

The latest Python3.6 version at the time of this writing is 3.6.9.

+ +
# You may download the Python source to a local shared location (S3 or Artifactory, etc.) if you need to deploy Python to many servers.
+wget https://www.python.org/ftp/python/3.6.9/Python-3.6.9.tgz
+tar xzvf Python-3.6.9.tgz
+cd Python-3.6.9
+sudo ./configure --prefix=/opt/python3.6
+make -j $(nproc)
+sudo make install
+sudo ln -s /opt/python3.6/bin/python3.6 /usr/bin/python3.6
+
+ +

Python3.5 is preinstalled by default on Ubuntu 16.04, python3 -V gives Python 3.5.2, many system tools rely on it, please DO NOT bind python3 to any versions other than Python3.5, otherwise your system might have unexpected problems.

+ +

For a general Python installation not only for this Python3.6, if you have gcc v8+, you can add the flag --enable-optimizations to ./configure to gain an extra runtime speed, otherwise you might encounter Could not import runpy module error

+ +

Using Python3.6 pip

+ +
python3.6 -m pip install [a python module]
+
+ +

Prevent pip install without an active venv

+ +
echo 'export PIP_REQUIRE_VIRTUALENV=true' >> ~/.bashrc
+
+ +

Installing Python3.7 on Ubuntu 16.04

+ +

Just tested installing Python3.7.5 with the same procedure, all works.

+ +

Installing Python3.10.10 with sqlite3 on Ubuntu 20.04 in WSL

+ +
# install build packages
+sudo apt update
+sudo apt install -y build-essential zlib1g-dev libssl-dev libffi-dev
+
+# install sqlite3 from source, if you need a specific sqlite3 version in Python, you must install it before compiling Python, because the compilation needs the lib libsqlite3.so
+mkdir ~/src
+cd ~/src/
+wget https://www.sqlite.org/2021/sqlite-autoconf-3400100.tar.gz
+tar xvf sqlite-autoconf-3400100.tar.gz
+cd sqlite-autoconf-3400100/
+./configure --prefix=/usr/local
+make -j $(nproc)
+sudo make install
+make clean
+ll /usr/local/bin/sqlite*
+ll /usr/local/lib/*sqlite*
+
+# let below Python compilation to use the newly installed sqlite3 lib
+export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH
+
+# install python3.10.10 from source
+cd ~/src/
+wget https://www.python.org/ftp/python/3.10.10/Python-3.10.10.tgz
+tar xvf Python-3.10.10.tgz
+cd Python-3.10.10/
+
+# ubuntu 20.04 has gcc v9, so you can add the flag --enable-optimizations to ./configure
+# --with-bz2 is for pandas, otherwise modulenotfounderror: no module named '_bz2' pandas
+./configure --prefix=$HOME/opt/python3.10 --with-bz2
+make -j $(nproc)
+sudo make install
+make clean
+sudo ln -s ~/opt/python3.10/bin/python3.10 /usr/bin/python3.10
+ll $(which python3.10)
+echo -e '\nexport PIP_REQUIRE_VIRTUALENV=true' >> ~/.bashrc
+python3.10 -c 'import sqlite3 ; print(sqlite3.sqlite_version)'
+
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2019/12/Using-Powershell-to-retrieve-latest-package-url-from-github-releases.html b/2019/12/Using-Powershell-to-retrieve-latest-package-url-from-github-releases.html new file mode 100644 index 00000000..428f869e --- /dev/null +++ b/2019/12/Using-Powershell-to-retrieve-latest-package-url-from-github-releases.html @@ -0,0 +1,793 @@ + + + + + + +Using Powershell To Retrieve Latest Package Url From Github Releases - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +
+ + +
+ + + +
+

Github can host package releases, I will show you how to use Powershell to retrieve the latest release download url.

+
+ +

Download latest Powershell release for Windows x64 zip version

+ +

The goal of this demo is to convert the static url:

+ + + +

to the real download url (latest version on 2019/12/29):

+ + + +
> $url = 'https://github.com/PowerShell/PowerShell/releases/latest'
+> $request = [System.Net.WebRequest]::Create($url)
+> $response = $request.GetResponse()
+> $realTagUrl = $response.ResponseUri.OriginalString
+> $version = $realTagUrl.split('/')[-1].Trim('v')
+> $version
+6.2.3
+> $fileName = "PowerShell-$version-win-x64.zip"
+> $realDownloadUrl = $realTagUrl.Replace('tag', 'download') + '/' + $fileName
+> $realDownloadUrl
+https://github.com/PowerShell/PowerShell/releases/download/v6.2.3/PowerShell-6.2.3-win-x64.zip
+> Invoke-WebRequest -Uri $realDownloadUrl -OutFile $env:TEMP/$fileName
+
+ +

The same method can be applied to retrieve other urls on other sites.

+ +

The powershell pre-release doesn’t have a static url, so I cannot retrieve the latest v7.0.0-rc.1 download url.

+ + +
+ +
+ + + + + + + +

+ Tags: + + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2019/12/Using-Scoop-On-Windows.html b/2019/12/Using-Scoop-On-Windows.html new file mode 100644 index 00000000..7cda8c1b --- /dev/null +++ b/2019/12/Using-Scoop-On-Windows.html @@ -0,0 +1,867 @@ + + + + + + +Using Scoop On Windows - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +
+ + +
+ + + +
+

I’ve been using Scoop for setting up my personal and professional Windows development desktops since nearly 2 years. +For me, it’s much more useful than another famous Windows package management tool Chocolatey, because with Scoop, everything is run & installed without any administrator privileges. +This is very important in an enterprise environment, that all the enterprise Windows administrators are trying their best to prevent you from installing anything on Windows. This post will share my ways to use it especially in such an enterprise environment. BTW, Scoop is completely a Powershell open source project and free for use.

+
+ +

Using external 7Zip

+ +

7Zip is a prerequisite for Scoop which is used for decompress many tools (git, conemu, etc.). +By default, Scoop will download 7Zip from its official website https://7-zip.org/a/7z1900-x64.msi. +Unfortunately, this website is probably excluded by some enterprises’ security gateway/tool.

+ +

But, fortunately, 7Zip is often already installed by enterprises’ deployment tool by default.

+ +

So, in order to let Scoop to use this external 7Zip pre-installed by enterprise admin rather than $env:SCOOP\apps\7zip, we need to set following config:

+ +
scoop config '7ZIPEXTRACT_USE_EXTERNAL' $true
+
+ +

This tips is not documented yet in the Scoop Wiki.

+ +

BTW: Maybe coping manually the 7Zip files to $env:SCOOP\apps\7zip will work too, but I haven’t tested yet.

+ +

Scoop TLS/SSL support

+ +

Scoop uses following methods to support different TLS/SSL versions:

+ +

Previously:

+ +
# https://github.com/lukesampson/scoop/issues/2040#issuecomment-368298352
+
+function set_https_protocols($protocols) {
+    try {
+        [System.Net.ServicePointManager]::SecurityProtocol = [System.Net.SecurityProtocolType] $protocols
+    } catch {
+        [System.Net.ServicePointManager]::SecurityProtocol = "Tls,Tls11,Tls12"
+    }
+}
+
+function use_any_https_protocol() {
+    $original = "$([System.Net.ServicePointManager]::SecurityProtocol)"
+    $available = [string]::join(', ', [Enum]::GetNames([System.Net.SecurityProtocolType]))
+
+    # use whatever protocols are available that the server supports
+    set_https_protocols $available
+
+    return $original
+}
+
+function do_dl($url, $to, $cookies) {
+    $original_protocols = use_any_https_protocol
+    $progress = [console]::isoutputredirected -eq $false
+
+    try {
+        $url = handle_special_urls $url
+        dl $url $to $cookies $progress
+    } catch {
+        $e = $_.exception
+        if($e.innerexception) { $e = $e.innerexception }
+        throw $e
+    } finally {
+        set_https_protocols $original_protocols
+    }
+}
+
+ +

Now:

+ +
# https://github.com/lukesampson/scoop/blob/48bb96a3d80ed722317a88afbae126c40ee205e8/lib/core.ps1#L1
+
+function Optimize-SecurityProtocol {
+    # .NET Framework 4.7+ has a default security protocol called 'SystemDefault',
+    # which allows the operating system to choose the best protocol to use.
+    # If SecurityProtocolType contains 'SystemDefault' (means .NET4.7+ detected)
+    # and the value of SecurityProtocol is 'SystemDefault', just do nothing on SecurityProtocol,
+    # 'SystemDefault' will use TLS 1.2 if the webrequest requires.
+    $isNewerNetFramework = ([System.Enum]::GetNames([System.Net.SecurityProtocolType]) -contains 'SystemDefault')
+    $isSystemDefault = ([System.Net.ServicePointManager]::SecurityProtocol.Equals([System.Net.SecurityProtocolType]::SystemDefault))
+
+    # If not, change it to support TLS 1.2
+    if (!($isNewerNetFramework -and $isSystemDefault)) {
+        # Set to TLS 1.2 (3072), then TLS 1.1 (768), and TLS 1.0 (192). Ssl3 has been superseded,
+        # https://docs.microsoft.com/en-us/dotnet/api/system.net.securityprotocoltype?view=netframework-4.5
+        [System.Net.ServicePointManager]::SecurityProtocol = 3072 -bor 768 -bor 192
+    }
+}
+
+ +

We can reuse it elsewhere.

+ +

Scoop aria2 skip certificate check

+ +

To use aria2 within Scoop to download packages in multithreading:

+ +
scoop config aria2-enabled true
+
+ +

But aria2 by default checks the certificate, to skip the check, use aria2-options:

+ +
scoop config aria2-options @('--check-certificate=false')
+
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2019/12/elastic-painless-scripted-field-on-null-or-mssing-value.html b/2019/12/elastic-painless-scripted-field-on-null-or-mssing-value.html new file mode 100644 index 00000000..fd38c737 --- /dev/null +++ b/2019/12/elastic-painless-scripted-field-on-null-or-mssing-value.html @@ -0,0 +1,786 @@ + + + + + + +Elastic Painless Scripted Field On Null/Missing Value - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +
+ + +
+ + + +

This post shows how to use elastic painless language in scripted field to work on documents’ keys which might not exist in some documents.

+ +

Parsing analyzed field in Painless

+ +

Suppose we have following 2 documents in elastic:

+ +
[{
+    "kye1": "value1",
+    "key2": {
+        "key22": "value22"
+    }
+}, {
+    "key1": "valuex"
+}]
+
+ +

The key key22 in the first document can be accessed by doc['key2.key22'].value. If we use this script in the scripted field, we will see a null value for all the documents. This is because the second document doesn’t have the key key22, painless language will throw an error. This github issue is discussing how to return a default value if it is missing.

+ +

To workaround this, I found a solution from this github issue. We should check the null value each time.

+ +

The script should be:

+ +
(params._source.key2 == null) ? '' : ( (params._source.key2.key22 == null) ? '' : (params._source.key2.key22) ))
+
+ +

Parsing documents by params._source is very slow. It’s not cached, and is calculated in real-time each time.

+ +

The fields calculated by the scripted field is not searchable.

+ + +
+ +
+ + + + + + + +

+ Tags: + + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2020/02/setting-up-wsl.html b/2020/02/setting-up-wsl.html new file mode 100644 index 00000000..37ce1b83 --- /dev/null +++ b/2020/02/setting-up-wsl.html @@ -0,0 +1,792 @@ + + + + + + +Setting up WSL - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +
+ + +
+ + + +

Cleaning up manually the WSL instance

+ +

For any reason you failed to install WSL from Microsoft store, you might need to clean up manually the downloaded WSL instance, the default location is at: $env:LOCALAPPDATA\Packages

+ +

For example, Ubuntu v1804 is at: C:\Users\xiang\AppData\Local\Packages\CanonicalGroupLimited.UbuntuonWindows_79rhkp1fndgsc\

+ +

Just delete the folder then reinstall it from Microsoft store.

+ +

Changing the default ls output directory color

+ +

https://github.com/microsoft/vscode/issues/7556

+ +

https://askubuntu.com/a/466203

+ +
# add to ~/.bashrc
+export LS_COLORS="ow=0;36;40"
+
+
+

ow = (OTHER_WRITABLE) Directory that is other-writable (o+w) and not sticky

+
+ +

Installing Python3.7 on Ubuntu 1804

+ +

Installing Python3.7 will automatically prompt you to update libssl.

+ +
sudo apt update
+sudo apt install python3.7 python3.7-venv python3-venv
+
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2020/03/flattening-nested-dict-in-python.html b/2020/03/flattening-nested-dict-in-python.html new file mode 100644 index 00000000..30117d8b --- /dev/null +++ b/2020/03/flattening-nested-dict-in-python.html @@ -0,0 +1,839 @@ + + + + + + +Flattening nested dict in Python - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +
+ + +
+ + + +

Problem

+ +

Given a nested dict with list as some keys’ value, we want to flatten the dict to a list.

+ +

For example, given a dict as like:

+ +
nested_data = {
+  "env": ["prd", "dev"],
+  "os": ["win", "unx"],
+  "msg": "ok"
+}
+
+ +

we want to convert it to a list as like:

+ +
{'msg': 'ok', 'env': 'prd', 'os': 'win'}
+{'msg': 'ok', 'env': 'prd', 'os': 'unx'}
+{'msg': 'ok', 'env': 'dev', 'os': 'win'}
+{'msg': 'ok', 'env': 'dev', 'os': 'unx'}
+
+ +

Solution

+ +
from copy import deepcopy
+import itertools
+
+nested_data = {
+  "env": ["prd", "dev"],
+  "os": ["win", "unx"],
+  "msg": "ok"
+}
+
+base_data = {}
+non_base_data = []
+
+for k, v in nested_data.items():
+    if isinstance(v, list):
+        non_base_data.append([{k: single_v} for single_v in v])
+    else:
+        base_data.update({k: v})
+
+print("base_data:", base_data)
+print("non_base_data:", non_base_data)
+
+flatted_list = list(itertools.product(*tuple(non_base_data)))
+
+for l in flatted_list:
+    print(l)
+print(len(flatted_list))
+
+
+flatted_data = []
+for one_combination in flatted_list:
+    line = deepcopy(base_data)
+    for column in one_combination:
+        line.update(column)
+    flatted_data.append(line)
+
+for l in flatted_data:
+    print(l)
+print(len(flatted_data))
+
+
+# base_data: {'msg': 'ok'}
+# non_base_data: [[{'env': 'prd'}, {'env': 'dev'}], [{'os': 'win'}, {'os': 'unx'}]]
+# ({'env': 'prd'}, {'os': 'win'})
+# ({'env': 'prd'}, {'os': 'unx'})
+# ({'env': 'dev'}, {'os': 'win'})
+# ({'env': 'dev'}, {'os': 'unx'})
+# 4
+# {'msg': 'ok', 'env': 'prd', 'os': 'win'}
+# {'msg': 'ok', 'env': 'prd', 'os': 'unx'}
+# {'msg': 'ok', 'env': 'dev', 'os': 'win'}
+# {'msg': 'ok', 'env': 'dev', 'os': 'unx'}
+# 4
+
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2020/04/fixing-ipython-on-Windows10-ConEmu-mouse-event-bug.html b/2020/04/fixing-ipython-on-Windows10-ConEmu-mouse-event-bug.html new file mode 100644 index 00000000..54136d5d --- /dev/null +++ b/2020/04/fixing-ipython-on-Windows10-ConEmu-mouse-event-bug.html @@ -0,0 +1,886 @@ + + + + + + +Fixing an ipython Windows ConEmu only bug on ‘MouseEventType.MOUSE_DOWN’ - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +
+ + +
+ + + +

Problem

+ +

Previously I updated the python version, the ipython version and maybe ConEmu on my Windows 10 (I don’t remember which one exactly), I got an error when I wanted to copy some text from ipython repl in ConEmu console by the right mouse click:

+ +
ps.7.0.0 | py.3.8.2 ipython
+Python 3.8.2 (tags/v3.8.2:7b3ab59, Feb 25 2020, 23:03:10) [MSC v.1916 64 bit (AMD64)]
+Type 'copyright', 'credits' or 'license' for more information
+IPython 7.13.0 -- An enhanced Interactive Python. Type '?' for help.
+
+
+Unhandled exception in event loop:
+  File "d:\xiang\tools\scoop\apps\python\3.8.2\lib\asyncio\events.py", line 81, in _run
+    self._context.run(self._callback, *self._args)
+  File "d:\xiang\tools\scoop\apps\python\3.8.2\lib\site-packages\prompt_toolkit\input\win32.py", line 512, in ready
+    callback()
+  File "d:\xiang\tools\scoop\apps\python\3.8.2\lib\site-packages\prompt_toolkit\application\application.py", line 653, in read_from_input
+    self.key_processor.process_keys()
+  File "d:\xiang\tools\scoop\apps\python\3.8.2\lib\site-packages\prompt_toolkit\key_binding\key_processor.py", line 274, in process_keys
+    self._process_coroutine.send(key_press)
+  File "d:\xiang\tools\scoop\apps\python\3.8.2\lib\site-packages\prompt_toolkit\key_binding\key_processor.py", line 186, in _process
+    self._call_handler(matches[-1], key_sequence=buffer[:])
+  File "d:\xiang\tools\scoop\apps\python\3.8.2\lib\site-packages\prompt_toolkit\key_binding\key_processor.py", line 329, in _call_handler
+    handler.call(event)
+  File "d:\xiang\tools\scoop\apps\python\3.8.2\lib\site-packages\prompt_toolkit\key_binding\key_bindings.py", line 101, in call
+    self.handler(event)
+  File "d:\xiang\tools\scoop\apps\python\3.8.2\lib\site-packages\prompt_toolkit\key_binding\bindings\mouse.py", line 128, in _mouse
+    event_type = MouseEventType(pieces[0])
+  File "d:\xiang\tools\scoop\apps\python\3.8.2\lib\enum.py", line 304, in __call__
+    return cls.__new__(cls, value)
+  File "d:\xiang\tools\scoop\apps\python\3.8.2\lib\enum.py", line 595, in __new__
+    raise exc
+  File "d:\xiang\tools\scoop\apps\python\3.8.2\lib\enum.py", line 579, in __new__
+    result = cls._missing_(value)
+  File "d:\xiang\tools\scoop\apps\python\3.8.2\lib\enum.py", line 608, in _missing_
+    raise ValueError("%r is not a valid %s" % (value, cls.__name__))
+
+Exception 'MouseEventType.MOUSE_DOWN' is not a valid MouseEventType
+Press ENTER to continue...
+
+ +

Root cause

+ +

From the error stack, we can identify that it should be this line from the prompt_toolkit which throws the error:

+ +
  File "d:\xiang\tools\scoop\apps\python\3.8.2\lib\site-packages\prompt_toolkit\key_binding\bindings\mouse.py", line 128, in _mouse
+    event_type = MouseEventType(pieces[0])
+
+ +

And hereunder is the ipython and prompt_toolkit version installed on my Windows 10.

+ +
ps.7.0.0 | py.3.8.2 pip list | sls ipython, prompt
+
+ipython          7.13.0
+ipython-genutils 0.2.0
+prompt-toolkit   3.0.4
+
+ +

Let’s check the source code of the prompt_toolkit:

+ +
@key_bindings.add(Keys.WindowsMouseEvent)
+def _mouse(event: E) -> None:
+    """
+    Handling of mouse events for Windows.
+    """
+    assert is_windows()  # This key binding should only exist for Windows.
+
+    # Parse data.
+    pieces = event.data.split(";")
+
+    event_type = MouseEventType(pieces[0])
+
+ +

And let’s add some simple debug code by using print():

+ +
@key_bindings.add(Keys.WindowsMouseEvent)
+def _mouse(event: E) -> None:
+    """
+    Handling of mouse events for Windows.
+    """
+    assert is_windows()  # This key binding should only exist for Windows.
+
+    # Parse data.
+    pieces = event.data.split(";")
+
+    # start debug
+    for met in MouseEventType:
+          print("met:", met)
+    print("pieces[0]:", pieces[0])
+    # end debug
+
+    event_type = MouseEventType(pieces[0])
+
+ +

Reproduce the error in ipython, I got the print info:

+ +
met: MouseEventType.MOUSE_UP
+met: MouseEventType.MOUSE_DOWN
+met: MouseEventType.SCROLL_UP
+met: MouseEventType.SCROLL_DOWN
+pieces[0]: MouseEventType.MOUSE_DOWN
+
+ +

Visually it seems that pieces[0] is in the MouseEventType, but as MouseEventType is an Enum type, the correct syntax is that pieces[0] should not be prefixed by the enum class name MouseEventType, instead we can use the string format of the type, so called programmatic access: MouseEventType["MOUSE_DOWN"]

+ +

Solution

+ +

Adding a split on pieces[0] object can workaround the issue, but to fix it definitively, in fact, the author already fixed it a couple of weeks ago:

+ +

https://github.com/prompt-toolkit/python-prompt-toolkit/issues/1099

+ +

https:/ +/pull/1105/commits/d2e7da3be5e46a5c8b432f67f78b662541b957de

+ +
# prompt_toolkit/input/win32.py
+# On a key press, generate both the mouse down and up event.
+for event_type in [MouseEventType.MOUSE_DOWN, MouseEventType.MOUSE_UP]:
+    data = ";".join(
+-       [str(event_type), str(ev.MousePosition.X), str(ev.MousePosition.Y)]
++       [event_type.value, str(ev.MousePosition.X), str(ev.MousePosition.Y)]
+    )
+    result.append(KeyPress(Keys.WindowsMouseEvent, data))
+
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2020/04/making-isort-compatible-with-black.html b/2020/04/making-isort-compatible-with-black.html new file mode 100644 index 00000000..48ed5e8c --- /dev/null +++ b/2020/04/making-isort-compatible-with-black.html @@ -0,0 +1,867 @@ + + + + + + +Making isort compatible with black - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +
+ + +
+ + + +

Update 2020-12-06, thanks to Christian Jauvin’s comment, since isort v5, it has introduced --profile=black option, so the life is much easier now:)

+ +
+

Both isort and black are a must have in my python life, but with their default settings, I will get different imports formats.

+
+ +

multi_line_output, include_trailing_comma and line_length

+ +

The main difference between isort and black are on there points:

+ +
    +
  1. the multi line mode
  2. +
  3. the trailing comma of the last import
  4. +
  5. the max line length
  6. +
+ +

Personally, I prefer making isort compatible with black, so the settings to be used with isort is: isort -m 3 -tc

+ +

As per isort settings wiki:

+ +
    +
  • +

    -m 3 standards for multi line mode 3, which is Vertical Hanging Indent

    + +
    +
    from third_party import (
    +    lib1,
    +    lib2,
    +    lib3,
    +    lib4,
    +)
    +
    +
  • +
  • +

    -tc standards for adding trailing comma for each import including the last one

    +
  • +
+ +

There’s also a param -w 88 to set the max line length to 88, but with multi line mode 3, we rarely need it.

+ +

There’s also a param -rc to recursively sort on all files in the project.

+ +

We can also use isort custom profile to overwrite the default settings as shown here. And to use the custom profile in VSCode:

+
# https://github.com/microsoft/vscode/issues/83586#issuecomment-557334564
+"python.sortImports.args": [
+    "--settings-path=${workspaceFolder}/setup.cfg"
+]
+
+ +

isort with VSCode

+ +

isort v5-:

+ +

https://pycqa.github.io/isort/docs/configuration/profiles/

+ +
{
+  "editor.formatOnSave":true,
+  "python.sortImports.path": "isort",
+  "python.sortImports.args":[
+    "-m 3",
+    "-tc",
+  ],
+  "[python]":{
+    "editor.codeActionsOnSave":{
+         # it was `"source.organizeImports": true` in my first version of this post,
+         # see below comment for explanation.
+        "source.organizeImports.python": true
+    }
+  }
+}
+
+ +

isort v5+:

+ +
{
+  "editor.formatOnSave":true,
+  "python.sortImports.path": "isort",
+  "python.sortImports.args":[
+    "--profile=black",
+  ],
+  "[python]":{
+    "editor.codeActionsOnSave":{
+         # it was `"source.organizeImports": true` in my first version of this post,
+         # see below comment for explanation.
+        "source.organizeImports.python": true
+    }
+  }
+}
+
+ +

After some days of using above settings, I found a very frustrating behavior that when I pressed Ctrl+S multiple times to save manually a python file, the imports part changed upon each save, and sometimes it even deleted some imports… +Digged in github, people have already reported the issue. See issues/83586, and issues/9889 +The solution (workaround) is here. Replace "source.organizeImports":true by source.organizeImports.python to allow codeActionsOnSave to specify which extension to use for a given on save action, the way editor.defaultFormatter or python.formatting.provider work.

+ +

isort with git hook

+ +

Just in case you’re interested in git hook, the settings is here.

+ +

Update 2021-03-28: using git pre-commit.

+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2020/05/using-python-contextmanager-to-create-a-timer-decorator.html b/2020/05/using-python-contextmanager-to-create-a-timer-decorator.html new file mode 100644 index 00000000..2c5a9ab4 --- /dev/null +++ b/2020/05/using-python-contextmanager-to-create-a-timer-decorator.html @@ -0,0 +1,819 @@ + + + + + + +Using Python Contextmanager To Create A Timer Decorator - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +
+ + +
+ + + +

This stackoverflow post has already given an example on how to use contextmanager to create a timer decorator:

+ +
from contextlib import contextmanager
+from timeit import default_timer
+
+@contextmanager
+def elapsed_timer():
+    start = default_timer()
+    elapser = lambda: default_timer() - start
+    yield lambda: elapser()
+    end = default_timer()
+    elapser = lambda: end-start
+
+ +

It works well, but flake8 linter warns me that: [E731]: do not assign a lambda expression, use a def.

+ +

So hereunder the lambda free version:

+ +
from contextlib import contextmanager
+from timeit import default_timer
+
+@contextmanager
+def elapsed_timer():
+    start_time = default_timer()
+
+    class _Timer():
+      start = start_time
+      end = default_timer()
+      duration = end - start
+
+    yield _Timer
+
+    end_time = default_timer()
+    _Timer.end = end_time
+    _Timer.duration = end_time - start_time
+
+ +

Test:

+ +
In [67]: from time import sleep
+    ...:
+    ...: def sleep_1s():
+    ...:     sleep(1)
+    ...:
+    ...: with elapsed_timer() as t:
+    ...:     sleep_1s()
+    ...:
+
+In [68]: t.start
+Out[68]: 4583.4985535
+
+In [69]: t.end
+Out[69]: 4584.4983676
+
+In [70]: t.duration
+Out[70]: 0.9998141000005489
+
+# the duration is less than 1s, it's default_timer of timeit.
+
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2020/06/compiling-sqlalchemy-query-to-nearly-real-raw-sql-query.html b/2020/06/compiling-sqlalchemy-query-to-nearly-real-raw-sql-query.html new file mode 100644 index 00000000..58c0c40f --- /dev/null +++ b/2020/06/compiling-sqlalchemy-query-to-nearly-real-raw-sql-query.html @@ -0,0 +1,938 @@ + + + + + + +Compiling SQLAlchemy query to nearly real raw sql query - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +
+ + +
+ + + + + +
    +
  1. https://stackoverflow.com/questions/5631078/sqlalchemy-print-the-actual-query
  2. +
  3. https://docs.sqlalchemy.org/en/13/faq/sqlexpressions.html?highlight=literal_bind#rendering-bound-parameters-inline
  4. +
  5. https://docs.sqlalchemy.org/en/13/core/engines.html#configuring-logging
  6. +
+ +

Query to compile

+ +

Suppose we have a table called Movie, and a column release_date in the table Movie.

+ +
> from datetime import date
+
+> from sqlalchemy import create_engine, sessionmaker
+
+> engine = create_engine('sqlite:///moive_example.db')
+> Session = sessionmaker(bind=engine)
+> session = Session()
+
+> filter1 = Movie.release_date > date(2015, 1, 1)
+
+> filter1
+<sqlalchemy.sql.elements.BinaryExpression object at 0x000001FFA56E6BE0>
+
+> str(filter1)
+'movies.release_date > :release_date_1'
+
+> query1 = session.query(Movie).filter(Movie.release_date > date(2015, 1, 1)).limit(2)
+
+> query1
+<sqlalchemy.orm.query.Query object at 0x0000015A4700A2E0>>
+
+> str(query1)
+'SELECT movies.id AS movies_id, movies.title AS movies_title, movies.release_date AS movies_release_date \nFROM movies \nWHERE movies.release_date > ?\n LIMIT ? OFFSET ?'
+
+ +

Compiling to ORM sql query

+ +

As per the method given by Rendering Bound Parameters Inline:

+ +

Compiling filter1 to ORM sql query

+ +
> filter1.compile()
+<sqlalchemy.sql.compiler.StrSQLCompiler object at 0x000001FFA5706AC0>
+
+> str(filter1.compile())
+'movies.release_date > :release_date_1'
+
+> str(filter1.compile().params)
+"{'release_date_1': datetime.date(2015, 1, 1)}"
+
+> filter1.compile(compile_kwargs={"literal_binds": True})
+<sqlalchemy.sql.compiler.StrSQLCompiler object at 0x000001FFA572EEE0>
+
+> str(filter1.compile(compile_kwargs={"literal_binds": True}))
+"movies.release_date > '2015-01-01'"
+
+ +

Compiling query1 to ORM sql query

+ +
> str(query1.statement.compile())
+'SELECT movies.id, movies.title, movies.release_date \nFROM movies \nWHERE movies.release_date > :release_date_1\n LIMIT :param_1'
+
+> str(query1.statement.compile().params)
+"{'release_date_1': datetime.date(2015, 1, 1), 'param_1': 2}"
+
+> str(query1.statement.compile(compile_kwargs={"literal_binds": True}))
+"SELECT movies.id, movies.title, movies.release_date \nFROM movies \nWHERE movies.release_date > '2015-01-01'\n LIMIT 2"
+
+ +

As given by the paragraph name, the above compiled query is not the real raw sql query sent to the database, it’s an ORM one. But it’s more or less enough for debugging or logging purpose. See below paragraph to get how to compile to real raw sql query.

+ +

Compiling to nearly real raw sql query

+ +

SQLAlchemy doesn’t provide an out of the box function to compile a statement to the real raw sql query, and as per some issues’ comments, it seems that the authors wouldn’t like to implement it. There’s no official way, this part is based on some solutions provided by the community.

+ +

If you want to compile to real raw sql query, we should add the corresponding dialect, but be aware that it compiles only some simple types like Integer, String, etc. For complex types like Date, we need to use TypeDecorator to tell SQLAlchemy how to literal render these complex types. Using TypeDecorator means to modify your DB models, which is sometimes not a comfortable way.

+ +

Below 2 examples (by using engine or using dialect show the error message on Date type:

+ +
# using engine
+> str(filter1.compile(
+    engine,
+    compile_kwargs={"literal_binds": True},
+  ))
+NotImplementedError: Don't know how to literal-quote value datetime.date(2015, 1, 1)
+
+ +
# using dialect
+> from sqlalchemy.dialects import postgresql
+> str(query1.statement.compile(
+    compile_kwargs={"literal_binds": True},
+    dialect=postgresql.dialect(),
+  ))
+NotImplementedError: Don't know how to literal-quote value datetime.date(2015, 1, 1)
+
+ +

render_query()

+ +

Base on this stackoverflow example, I changed the param dialect to session, and removed the python2 part, hereunder the modified one:

+ + +
from datetime import date, datetime, timedelta
+from sqlalchemy.orm import Query
+
+def render_query(statement, db_session):
+    """
+    Generate an SQL expression string with bound parameters rendered inline
+    for the given SQLAlchemy statement.
+    WARNING: This method of escaping is insecure, incomplete, and for debugging
+    purposes only. Executing SQL statements with inline-rendered user values is
+    extremely insecure.
+    Based on http://stackoverflow.com/questions/5631078/sqlalchemy-print-the-actual-query
+    """
+    if isinstance(statement, Query):
+        statement = statement.statement
+    dialect = db_session.bind.dialect
+
+    class LiteralCompiler(dialect.statement_compiler):
+        def visit_bindparam(
+            self, bindparam, within_columns_clause=False, literal_binds=False, **kwargs
+        ):
+            return self.render_literal_value(bindparam.value, bindparam.type)
+
+        def render_array_value(self, val, item_type):
+            if isinstance(val, list):
+                return "{}".format(
+                    ",".join([self.render_array_value(x, item_type) for x in val])
+                )
+            return self.render_literal_value(val, item_type)
+
+        def render_literal_value(self, value, type_):
+            if isinstance(value, int):
+                return str(value)
+            elif isinstance(value, (str, date, datetime, timedelta)):
+                return "'{}'".format(str(value).replace("'", "''"))
+            elif isinstance(value, list):
+                return "'{{{}}}'".format(
+                    ",".join(
+                        [self.render_array_value(x, type_.item_type) for x in value]
+                    )
+                )
+            return super(LiteralCompiler, self).render_literal_value(value, type_)
+
+    return LiteralCompiler(dialect, statement).process(statement)
+
+ + +

Using the render_query()

+ +

The results in sqlite dialect:

+ +
> render_query(filter1, session)
+"movies.release_date > '2015-01-01'"
+
+> render_query(query1, session)
+"SELECT movies.id, movies.title, movies.release_date \nFROM movies \nWHERE movies.release_date > '2015-01-01'\n LIMIT 2 OFFSET 0"
+
+ +

With render_query(), it renders the query with dialect syntax, but please be aware that the values rendered are the ones translated by render_literal_value(), which might not be the ones really passed to SQL database. That’s also why I named this post as nearly real raw sql query.

+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2020/07/rolling-back-from-flask-restplus-reqparse-to-native-flask-request-to-parse-inputs.html b/2020/07/rolling-back-from-flask-restplus-reqparse-to-native-flask-request-to-parse-inputs.html new file mode 100644 index 00000000..3b28f70a --- /dev/null +++ b/2020/07/rolling-back-from-flask-restplus-reqparse-to-native-flask-request-to-parse-inputs.html @@ -0,0 +1,815 @@ + + + + + + +Rolling back from flask-restplus reqparse to native flask request to parse inputs - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +
+ + +
+ + + +

flask-restplus’ (or flask-restx) reqparse module is deprecated, so I decided to use the native flask request object to parse the incoming inputs.

+ +

After the try, I noticed some points to take care of. Before listing these points, I will show you how to use native flask request to parse the inputs.

+ +

The flask-restplus official doc suggests to use marshmallow to replace reqparse.

+ +

Parsing inputs with the native flask request

+ +

The native Flask Request object has many attributes. To parse the incoming inputs, we can mainly use:

+ +
from flask import request
+request.args
+request.json
+request.data
+request.form
+request.headers
+request.authorization
+
+ +

request is a global object always available in any active request contexts.

+ +

Point 1. Smart boolean type

+ +

flask-restplus’s boolean type is actually a smart boolean type, which can convert bool True, or string “True”, “tRue”, “1” etc., or int 1 to True, so as to False. This is very smart.

+ +
parser.add_argument('flag', type=inputs.boolean)
+
+ +

When I rolled back to using the flask.request, there’s no such smartness, so be careful how the API parsed the inputs with flask-restplus previously. If it accepted for example the string ‘false’ as smart boolean, which will be converted to boolean False with flask-restplus, once migrated to the native flask.request.json, the string ‘false’ is considered as a boolean True.

+ +
>>> bool("false")
+True
+
+ +

So maybe as a quick backward compatible workaround, we can reuse the smart boolean source code.

+ +

Point 2. Optional inputs

+ +

flask-restplus can define an optional input like this:

+ +
parser.add_argument('name', required=False, help="Name cannot blank!")
+
+ +

If user doesn’t provide name in the inputs, the reqparse will render it as {"name": None}, which means the optional input has None as its default value.

+ +

But in the native flask.request.json, we won’t see this input at all if it was not provided. So if the API backend must need the input name, we must add some protection.

+ +

Tests

+ +

In the end, I would just like to suggest everyone to write as many tests as we can to cover all the use cases.

+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2020/11/my-powerline.html b/2020/11/my-powerline.html new file mode 100644 index 00000000..dea612a2 --- /dev/null +++ b/2020/11/my-powerline.html @@ -0,0 +1,908 @@ + + + + + + +My Powerline setup and configuration - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +
+ + +
+ + + +

If you’re working in an enterprise environment, and you don’t have the admin rights on your Windows desktop to install additional fonts, or your enterprise admin cannot do that, then I suggest you to ignore this post, powerline will be installed, but very ugly. If you have a Linux desktop, all will be OK, installing fonts doesn’t need to be root.

+ + + +

Installing powerline-status from pip

+ +
pip3 install powerline-status --user
+pip3 show powerline-status
+user_python_site_packages=$(pip3 show powerline-status | grep Location: | awk '{print $2}')
+powerline_global_config_files_path="$user_python_site_packages/powerline/config_files"
+mkdir -p ~/.config/powerline
+cp -r $powerline_global_config_files_path/. ~/.config/powerline
+
+ +

Installing fonts

+ +

https://powerline.readthedocs.io/en/latest/installation/linux.html#fonts-installation

+ +
apt install fontconfig
+mkdir -p ~/.local/share/fonts/
+mkdir -p ~/.config/fontconfig/conf.d/
+wget https://github.com/powerline/powerline/raw/develop/font/PowerlineSymbols.otf
+wget https://github.com/powerline/powerline/raw/develop/font/10-powerline-symbols.conf
+mv PowerlineSymbols.otf ~/.local/share/fonts/
+fc-cache -vf ~/.local/share/fonts/
+mv 10-powerline-symbols.conf ~/.config/fontconfig/conf.d/
+
+ +

Installing additional fonts

+ +

https://github.com/powerline/fonts#quick-installation

+ +

Adding VIM support

+ + + +

If Python support is absent then Vim needs to be compiled with it. To do this use --enable-pythoninterp ./configure flag (Python 3 uses --enable-python3interp flag instead). Note that this also requires the related Python headers to be installed. Please consult distribution’s documentation for details on how to compile and install packages.

+ +

Check VIM with python support:

+ +
vim --version | grep +python
+
+ +

if you don’t have +python or +python3, you can install VIM from source by enable python support: https://github.com/ycm-core/YouCompleteMe/wiki/Building-Vim-from-source

+ +

Add following lines to $HOME/.vimrc:

+ +
python3 from powerline.vim import setup as powerline_setup
+python3 powerline_setup()
+python3 del powerline_setup
+
+ +

Adding Ipython support

+ +

https://powerline.readthedocs.io/en/latest/usage/other.html#ipython-prompt

+ +

Doesn’t work for ipython v7+: https://github.com/powerline/powerline/issues/1953

+ +

Adding PBD support

+ +

https://powerline.readthedocs.io/en/latest/usage/other.html#pdb-prompt

+ +

Adding Bash support

+ +

https://powerline.readthedocs.io/en/latest/usage/shell-prompts.html#bash-prompt

+ +

Add following lines to ~/.bashrc:

+ +
+

python path must be available before powerline-daemon -q

+
+ +
powerline-daemon -q
+POWERLINE_BASH_CONTINUATION=1
+POWERLINE_BASH_SELECT=1
+. {repository_root}/powerline/bindings/bash/powerline.sh
+
+ +

Adding Git support

+ +

https://github.com/jaspernbrouwer/powerline-gitstatus

+ +
pip3 install powerline-gitstatus
+
+ +

Add to ~/.config/powerline/colorschemes/default.json:

+ +
{
+  "groups": {
+    "gitstatus":                 { "fg": "gray8",           "bg": "gray2", "attrs": [] },
+    "gitstatus_branch":          { "fg": "gray8",           "bg": "gray2", "attrs": [] },
+    "gitstatus_branch_clean":    { "fg": "green",           "bg": "gray2", "attrs": [] },
+    "gitstatus_branch_dirty":    { "fg": "gray8",           "bg": "gray2", "attrs": [] },
+    "gitstatus_branch_detached": { "fg": "mediumpurple",    "bg": "gray2", "attrs": [] },
+    "gitstatus_tag":             { "fg": "darkcyan",        "bg": "gray2", "attrs": [] },
+    "gitstatus_behind":          { "fg": "gray10",          "bg": "gray2", "attrs": [] },
+    "gitstatus_ahead":           { "fg": "gray10",          "bg": "gray2", "attrs": [] },
+    "gitstatus_staged":          { "fg": "green",           "bg": "gray2", "attrs": [] },
+    "gitstatus_unmerged":        { "fg": "brightred",       "bg": "gray2", "attrs": [] },
+    "gitstatus_changed":         { "fg": "mediumorange",    "bg": "gray2", "attrs": [] },
+    "gitstatus_untracked":       { "fg": "brightestorange", "bg": "gray2", "attrs": [] },
+    "gitstatus_stashed":         { "fg": "darkblue",        "bg": "gray2", "attrs": [] },
+    "gitstatus:divider":         { "fg": "gray8",           "bg": "gray2", "attrs": [] }
+  }
+}
+
+ +

Add to ~/.config/powerline/themes/shell/default.json:

+ +
{
+    "function": "powerline_gitstatus.gitstatus",
+    "priority": 40
+}
+
+ +

Add to ~/.config/powerline/themes/shell/__main__.json:

+ +
"gitstatus": {
+    "args": {
+        "show_tag": "exact"
+    }
+}
+
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2021/01/python-lint-and-format.html b/2021/01/python-lint-and-format.html new file mode 100644 index 00000000..02ff0ad9 --- /dev/null +++ b/2021/01/python-lint-and-format.html @@ -0,0 +1,1478 @@ + + + + + + +Python Lint And Format - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 11 minute read + + + +

+ + +
+ + +
+ + + +

Azure SDK Python Guidelines

+ +

https://azure.github.io/azure-sdk/python_implementation.html

+ +

Lint

+ +

Update 2023-05-21: Replaced flake8, pylint, and isort by ruff. When replacing pylint, should add check by mypy.

+ +

ruff

+ +
ruff .
+ruff check .  # check is the default command so can be ignore
+
+# show ignored ruff alerts
+ruff . --ignore-noqa --exit-zero
+
+ +

pylint

+ +

Could be replaced by ruff.

+ +

As pylint has too many options, it’s recommended to use the pylint config file:

+ +
# file ~/.pylintrc, can be generated by pylint --generate-rcfile
+
+[MASTER]
+
+[MESSAGES CONTROL]
+disable=
+    C0116, # Missing function or method docstring (missing-function-docstring)
+    W1203, # Use lazy % formatting in logging functions (logging-fstring-interpolation)
+
+[format]
+max-line-length = 88
+
+[MISCELLANEOUS]
+# List of note tags to take in consideration, separated by a comma.
+notes=FIXME
+
+[VARIABLES]
+
+# List of additional names supposed to be defined in builtins. Remember that
+# you should avoid defining new builtins when possible.
+additional-builtins=
+    spark
+
+ +

But we can also ignore some warnings directly in the pylint command:

+ +
pylint . -j 0 --disable=C0116,W1203
+
+ +

To show all the inline ignored pylint alerts: pylint --enable=suppressed-message

+ +

Ignore Unused Argument given a Function Name Expression

+ +

Use dummy variable to ignore the Pylint warning on unused-argument.

+ +

flake8

+ +

Could be replaced by ruff.

+ +
# ignore W503 because of black format. BTW, flake8 also has W504 which is in contrary to W503.
+# ignore E501, line too long because we have the same check at Pylint side already.
+flake8 . \
+  --exclude=venv \
+  --extend-ignore=E203,E501,W503, \
+  --max-complexity=7 \
+  --show-source \
+  --statistics \
+  --count \
+  --jobs=auto
+
+flake8 [a_file_path]
+
+ +

To show all the inline ignored flake8 alerts: flake8 --disable-noqa || true

+ +

There’s a very nice flake8 plugin called flake8-cognitive-complexity which checks the Cognitive Complexity in addition to the Cyclomatic Complexity provided by flake8 out of the box. We dont need to add extra parameter to use the Cognitive Complexity in flake8, it’s set to --max-cognitive-complexity=7 by default once the plugin is installed. By the way, Sonar sets the Cognitive Complexity threshold to 15 by default.

+ +

To fix imported but not used error in __init__.py file, could by all attribute (the most elegant) or by –per-file-ignores.

+ +

bandit

+ +

The bandit config file format is not well documented, I passed a lot of time to test the config.

+ +
$ cat .bandit
+# https://github.com/PyCQA/bandit/issues/400
+exclude_dirs:
+  - "./venv/*"
+
+# https://github.com/PyCQA/bandit/pull/633
+assert_used:
+  skips:
+    - "*/*_test.py"
+    - "*/test_*.py"
+
+ +
# without specifying -c ./bandit, it doesn't work
+$ bandit . -r -c ./.bandit
+
+ +

ossaudit

+ +

ossaudit uses Sonatype OSS Index to audit Python packages for known vulnerabilities.

+ +

It can check installed packages and/or packages specified in dependency files. The following formats are supported with dparse:

+ +
    +
  • PIP requirement files
  • +
  • Pipfile
  • +
  • Pipfile.lock
  • +
  • tox.ini
  • +
  • conda.yml
  • +
+ +
# check installed packages and packages listed in two requirements files
+$ ossaudit --installed --file requirements.txt --file requirements-dev.txt
+Found 0 vulnerabilities in 214 packages
+
+ +

Github has already provided, free of charge, the vulnerable dependencies alert.

+ +

pyright

+ +

faster than mypy.

+ +
pyproject.toml:
+
+```toml
+[tool.pyright]
+reportUnnecessaryTypeIgnoreComment = true
+include = []
+exclude = []
+
+ +

running pyright:

+ +
# scan pathes specified in pyproject.toml include, exclude
+pyright
+
+# scan current folder and subfolders in spite of pyproject.toml include, exclude
+pyright .
+
+ +

mypy

+ +

For projects having sqlalchemy, we often install the sqlalchemy-stubs plugin as sqlalchemy uses some dynamic classes.

+ +

And also django-stubs, pandas-stubs, types-setuptools, types-requests etc.

+ +

mypy config file:

+ +
[mypy]
+ignore_missing_imports = True # We recommend using this approach only as a last resort: it's equivalent to adding a # type: ignore to all unresolved imports in your codebase.
+plugins = sqlmypy # sqlalchemy-stubs
+exclude = (?x)(
+    ^venv
+    | ^build
+  )
+
+ +

running mypy:

+ +
mypy .
+mypy . --exclude [a regular expression that matches file path]
+mypy . --exclude venv[//] # exclude venv folder under the root
+
+ +

When using mypy, it would be better to use mypy against to all files in the project, but ont some of them,

+ +

ignore lint error in one line

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
linterignore in one line
ruff(2 spaces)# noqa: {errorIdentifier}
pylint(2 spaces)# pylint: disable={errorIdentifier}
flake8(2 spaces)# noqa: {errorIdentifier}
bandit(2 spaces)# nosec
pyright(2 spaces)# pyright: ignore [reportOptionalMemberAccess, reportGeneralTypeIssues]
mypy(2 spaces)# type: ignore
multiple linters(2 spaces)# type: ignore # noqa: {errorIdentifier} # pylint: disable={errorIdentifier}
+ +

To ignore Pylint within a code block

+ +
# https://stackoverflow.com/a/48836605/5095636
+import sys
+sys.path.append("xx/xx")
+
+# pylint: disable=wrong-import-position
+from x import (  # noqa: E402
+    a,
+    b,
+)
+from y import c  # noqa: E402
+
+# pylint: enable=wrong-import-position
+
+ +

Format

+ +

isort

+ +

Could be replaced by ruff.

+ +
isort . --profile=black --virtual-env=venv --recursive --check-only
+isort . --profile=black --virtual-env=venv --recursive
+isort [a_file_path]
+
+ +

Be very careful with isort, it’s not uncompromising, especially for some codes that dynamically import some modules inside a function instead of from the beginning of a file. People use often this to avoid circular import problem. Always run the tests after the isort.

+ +

black

+ +
black . --check
+black .
+black [a_file_path]
+
+ +

Using black with other tools: https://black.readthedocs.io/en/stable/guides/using_black_with_other_tools.html

+ +

VSCode

+ +

Just my 2 cents, try the errorlens extension in VSCode, it will lint all the warnings/errors on live when coding, it’s really cool.

+ +

And don’t forget to install the official SonarLint extension, it will give you extra lint. It eats a lot of memory with its java processes nevertheless.

+ +
  "python.formatting.provider": "none",
+  "[python]": {
+    "editor.defaultFormatter": "ms-python.black-formatter",
+    "editor.formatOnSave": true,
+    "editor.codeActionsOnSave": {
+      // "source.organizeImports": true
+    },
+  },
+  "python.linting.banditEnabled": true,
+  "python.linting.banditArgs": [
+    "-r",
+    "-c",
+    "~/pyproject.toml"
+  ],
+  "python.linting.ignorePatterns": [
+    ".vscode/*.py",
+    "**/site-packages/**/*.py",
+    "venv/"
+  ],
+  "python.linting.mypyEnabled": true,
+  "python.linting.mypyArgs": [
+    "--follow-imports=silent",
+    "--ignore-missing-imports",
+    "--show-column-numbers",
+    "--no-pretty",
+    "--warn-return-any",
+    "--warn-unused-configs",
+    "--show-error-codes"
+  ],
+  "sonarlint.connectedMode.connections.sonarqube": [
+    {
+      "serverUrl": "https://sonar.xxx",
+      "connectionId": "sonar.xxx"
+    }
+  ],
+  "[json]": {
+    "editor.defaultFormatter": "esbenp.prettier-vscode",
+    // "editor.defaultFormatter": "esbenp.prettier-vscode",
+    "editor.formatOnSave": true
+  },
+  "[jsonc]": {
+    "editor.defaultFormatter": "vscode.json-language-features"
+  },
+
+ +

pyproject.toml

+ +

pyproject.toml is the new standard in Python introduced by PEP 518 (2016) for build system requirements, PEP 621 (2020) for project metadata, and PEP 660 (2021) for wheel based editable installs.

+ +

It’s fun to know why Python authority chose this name, and very interesting to understand their POV of different file formats :smile:.

+ +

All the the major tools (setuptools, pip-tools, poetry) support this new standard, and the repo awesome-pyproject maintains a list of Python tools which are compatible to pyproject.toml.

+ +

We cannot officially declare flake8 config in pyproject.toml.

+ +

Hereunder an example of its content for the lint part.

+ +
[tool.ruff]
+fix = true
+show-fixes = true
+select = [
+    "ALL",
+    # "E",  # pycodestyle errors
+    # "W",  # pycodestyle warnings
+    # "F",  # pyflakes
+    # "I",  # isort
+    # "C",  # flake8-comprehensions
+    # "B",  # flake8-bugbear
+]
+ignore = [
+    # https://beta.ruff.rs/docs/rules/
+    "D",  # pydocstyle
+    "E501",  # line too long, handled by black
+    "B008",  # do not perform function calls in argument defaults
+    "ANN",  # flake8-annotations
+    # "C901",  # too complex
+    # "PTH123", # pathlib-open - this would force pathlib usage anytime open or with open was used.
+]
+
+[tool.ruff.isort]
+# Combine multiple `from foo import bar as baz` statements with the same source
+# (`foo`) into a single statement.
+combine-as-imports = true
+
+# Imports of the form `from foo import bar as baz` show one `import bar as baz`
+# per line. Useful for __init__.py files that just re-export symbols.
+force-wrap-aliases = true
+
+[tool.ruff.per-file-ignores]
+# Don't format docstrings in alembic migrations.
+"**/alembic/versions/*.py" = ["D"]
+"tests/**/*.py" = [
+    "S101", # asserts allowed in tests...
+    "ARG", # Unused function args -> fixtures nevertheless are functionally relevant...
+    "FBT", # Don't care about booleans as positional arguments in tests, e.g. via @pytest.mark.parametrize()
+]
+[tool.ruff.pep8-naming]
+classmethod-decorators = ["pydantic.validator"]
+
+[tool.pyright]
+reportUnnecessaryTypeIgnoreComment = true
+
+# mypy not used in favor of pyright
+# [tool.mypy]
+# incremental = true
+# ignore_missing_imports = true
+# warn_return_any = true
+# warn_unused_configs = true
+# # disallow_untyped_defs = true
+# exclude = [
+#     "^.venv/",
+#     "^build/",
+#     "^_local_test/",
+# ]
+
+[tool.bandit]
+exclude_dirs = [".venv", "_local_test"]
+skips = ["B101"]
+# tests = ["B201", "B301"]
+
+# replaced by ruff with mypy
+# [tool.pylint.main]
+# # ! type to use pyspark-stubs
+# # extension-pkg-allow-list = ["pyspark"]
+# # ignored-modules = ["pyspark"]
+# jobs = 0
+# # [tool.pylint.typecheck]
+# # # ! type to use pyspark-stubs
+# # generated-members = ["pyspark.sql.functions"]
+# [tool.pylint.variables]
+# # List of additional names supposed to be defined in builtins. Remember that
+# # you should avoid defining new builtins when possible.
+# # additional-builtins = ["spark"]
+# [tool.pylint."messages control"]
+# disable = [
+#     "missing-class-docstring",
+#     "missing-module-docstring",
+#     "missing-function-docstring",
+#     "logging-fstring-interpolation",
+# ]
+# [tool.pylint.miscellaneous]
+# notes = ["FIXME"]
+# [tool.pylint.format]
+# max-line-length = 88
+# expected-line-ending-format = "LF"
+# ignore-long-lines = "^\\s*(# )?<?https?://\\S+>?$"
+
+[tool.pytest.ini_options]
+testpaths=["tests/unit"]
+addopts="""
+    -v -s
+    --junitxml=junit/test-results.xml
+    --cov app
+    --cov-report=html
+    --cov-report=xml
+    --cov-report=term-missing:skip-covered
+    --cov-fail-under=95
+    """
+
+ +

Git pre-commit

+ +

https://pre-commit.com/

+ +
+

“Git hook scripts are useful for identifying simple issues before submission to code review. We run our hooks on every commit to automatically point out issues in code such as missing semicolons, trailing whitespace, and debug statements. By pointing these issues out before code review, this allows a code reviewer to focus on the architecture of a change while not wasting time with trivial style nitpicks.”

+
+ +
pip install pre-commit
+pre-commit install
+
+# install the script along with the hook environments in one command
+# https://pre-commit.com/index.html#pre-commit-install-hooks
+pre-commit install --install-hooks
+
+# Auto-update pre-commit config to the latest repos' versions.
+pre-commit autoupdate
+
+# Clean out cached pre-commit files.
+pre-commit clean
+
+# Clean unused cached repos.
+pre-commit gc
+
+# Run single check
+pre-commit run black
+
+# continuous integration
+# https://pre-commit.com/index.html#usage-in-continuous-integration
+pre-commit run --all-files
+# check only files which have changed
+pre-commit run --from-ref origin/HEAD --to-ref HEAD
+
+# Azure pipeline example with cache
+https://pre-commit.com/index.html#azure-pipelines-example
+
+# automatically enabling pre-commit on repositories
+# https://pre-commit.com/index.html#automatically-enabling-pre-commit-on-repositories
+git config --global init.templateDir ~/.git-template
+pre-commit init-templatedir ~/.git-template
+
+ +

Online examples

+ +

pylint github pre-commit-config.yaml

+ +

Create a file named .pre-commit-config.yaml to the root of your project

+ +

Although each lint has its own config to exclude some files from checking, pre-commit also has the key exclude with list value or regex to exclude file from sending to linter.

+ +

language: system means using the executables from the same environment of current Python interpreter.

+ +

When using mypy in pre-commit, it would be better run pre-commit run --all-files, mypy doesn’t work well with only diff files sent by pre-commit run --from-ref origin/${pullrequest_target_branch_name} --to-ref HEAD.

+ +
# Installation:
+# pip install pre-commit
+# pre-commit install
+repos:
+  - repo: https://github.com/pre-commit/pre-commit-hooks
+    rev: v4.3.0
+    hooks:
+      - id: check-json
+        exclude: devcontainer.json
+      - id: check-yaml
+      - id: check-toml
+      - id: end-of-file-fixer
+      - id: trailing-whitespace
+      - id: debug-statements
+      - id: requirements-txt-fixer
+      - id: detect-private-key
+      - id: mixed-line-ending
+        args: ["--fix=lf"]
+      - id: check-added-large-files
+      - id: no-commit-to-branch
+  - repo: https://github.com/Lucas-C/pre-commit-hooks
+    rev: v1.3.1
+    hooks:
+      - id: forbid-crlf
+      - id: remove-crlf
+      - id: forbid-tabs
+      - id: remove-tabs
+  - repo: https://github.com/pre-commit/mirrors-prettier
+    rev: v3.0.0-alpha.1
+    hooks:
+      - id: prettier
+  - repo: https://github.com/pre-commit/pygrep-hooks
+    rev: v1.9.0
+    hooks:
+      - id: python-check-blanket-type-ignore
+      - id: python-check-mock-methods
+      - id: python-no-log-warn
+      - id: python-use-type-annotations
+  - repo: https://github.com/asottile/pyupgrade
+    rev: v3.1.0
+    hooks:
+      - id: pyupgrade
+  - repo: local
+    hooks:
+      - id: bandit
+        name: bandit
+        entry: bandit
+        language: system
+        types: [python]
+        args:
+          - -c
+          - pyproject.toml
+      - id: ruff
+        name: ruff
+        entry: ruff
+        language: system
+        types: [python]
+        args:
+          - "."
+      - id: black
+        name: black
+        entry: black
+        language: system
+        types: [python]
+      - id: pyright
+        name: pyright
+        language: system
+        entry: pyright
+        types: [python]
+      - id: pytest
+        name: pytest
+        types: [python]
+        entry: pytest
+        language: system
+        pass_filenames: false
+        always_run: true
+
+
+
+ +

Be aware that especially in a local environment, we often use venv, in such case, it would be better to use above system level lint executables instead of below public ones, the checks will be more accurate.

+ +
# example of using online linters
+# Installation:
+# pip install pre-commit
+# pre-commit install
+repos:
+  - repo: https://github.com/pre-commit/pre-commit-hooks
+    rev: v4.4.0
+    hooks:
+      - id: check-json
+        exclude: devcontainer.json
+      - id: check-yaml
+      - id: check-toml
+      - id: end-of-file-fixer
+      - id: trailing-whitespace
+      - id: debug-statements
+      - id: requirements-txt-fixer
+      - id: detect-private-key
+      - id: mixed-line-ending
+        args: ["--fix=lf"]
+      - id: check-added-large-files
+      - id: no-commit-to-branch
+  - repo: https://github.com/Lucas-C/pre-commit-hooks
+    rev: v1.5.1
+    hooks:
+      - id: forbid-crlf
+      - id: remove-crlf
+      - id: forbid-tabs
+      - id: remove-tabs
+  - repo: https://github.com/pre-commit/mirrors-prettier
+    rev: v3.0.0-alpha.9-for-vscode
+    hooks:
+      - id: prettier
+        exclude: ".md"
+  - repo: https://github.com/pre-commit/pygrep-hooks
+    rev: v1.10.0
+    hooks:
+      - id: python-check-blanket-type-ignore
+      - id: python-check-mock-methods
+      - id: python-no-log-warn
+      - id: python-use-type-annotations
+  - repo: local
+    hooks:
+      - id: bandit
+        name: bandit
+        entry: bandit
+        language: system
+        types: [python]
+        args:
+          - -c
+          - pyproject.toml
+      - id: ruff
+        name: ruff
+        entry: ruff
+        language: system
+        types: [python]
+        args:
+          - --fix
+      - id: black
+        name: black
+        entry: black
+        language: system
+        types: [python]
+      - id: mypy
+        name: mypy
+        language: system
+        entry: mypy
+        types: [python]
+        args:
+          # - --strict
+          - --show-error-codes
+      - id: pytest
+        name: pytest
+        types: [python]
+        entry: pytest
+        language: system
+        pass_filenames: false
+        always_run: true
+
+ +

Install the git hook scripts

+ +
$ pre-commit install
+pre-commit installed at .git/hooks/pre-commit
+
+$ pre-commit install --hook-type post-merge
+pre-commit installed at .git/hooks/post-merge
+
+$ pre-commit install --hook-type pre-merge-commit
+pre-commit installed at .git/hooks/pre-merge-commit
+
+ +

You could also run pre-commit install --hook-type pre-push to register pre-push hooks.

+ +

Run against all the files

+ +
+

“it’s usually a good idea to run the hooks against all of the files when adding new hooks (usually pre-commit will only run on the changed files during git hooks)”

+
+ +
pre-commit run --all-files
+
+ +

Run for changed files only in CI

+ +

Please check also this official doc.

+ +
git fetch origin
+pre-commit run --from-ref origin/${pullrequest_target_branch_name} --to-ref HEAD
+
+ +

When using mypy, it would be better to use mypy against to all files in the project, but not the changed one only.

+ +

Git commit

+ +

Each time we use git commit to stage some files, these files will be sent to pre-commit to be checked against to the hooks defined in .pre-commit-config.yaml.

+ +

Temporarily disabling hooks

+ +

The official doc gives the example how to disable explicitly hooks by hooks’ ids: SKIP=flake8 git commit -m "foo", but if you want to disable completely all the hooks, an easy way might be found here by using git commit --no-verify or its shortcut git commit -n. If you use pre-commit during push, you can disable pre-commit during push by git push --no-verify or git push -n.

+ +

Automatically enabling pre-commit on repositories

+ +

https://pre-commit.com/#automatically-enabling-pre-commit-on-repositories

+ +

Usage in continuous integration

+ +

https://pre-commit.com/#usage-in-continuous-integration

+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2021/01/python-requests-with-retry.html b/2021/01/python-requests-with-retry.html new file mode 100644 index 00000000..99ed42d1 --- /dev/null +++ b/2021/01/python-requests-with-retry.html @@ -0,0 +1,835 @@ + + + + + + +Python Requests With Retry - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +
+ + +
+ + + +

There’re several solutions to retry a HTTP request with Requests module, some of them are:

+ +
    +
  1. Native Requests’ retry based on urllib3’s HTTPAdapter.
  2. +
  3. Third party module: backoff.
  4. +
  5. Third party module: tenacity.
  6. +
+ +

The native HTTPAdapter is not easy to use. The tenacity module is very powerful, but is also more or less overkill because it’s a general Python retry utility, and doesn’t throw the same exception requests.exceptions.HTTPError raised by raise_for_status() of Requests. Using tenacity to an ongoing project might involve some code refactoring. So this post will just show some snippets to make retry with the backoff module.

+ +

Usually, we should only retry on idempotent verbs, we can get the same thing twice but we don’t want to create the same thing twice. On the other hand, sometimes the specific environment that we’re working on might have a POST as idempotent too, so make sure of that before using the retry.

+ +

Using backoff to retry

+ +
import logging
+from logging import Logger
+
+import backoff
+import requests
+from requests.exceptions import HTTPError
+import urllib3
+
+urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
+
+
+# in an internal enterprise environment, we often need to disable the proxy and ignore the ssl check. Of course, if you don't trust the target, then verify the ssl.
+NO_PROXY = {"http": None, "https": None}
+COMMON_REQUESTS_PARAMS = {"verify": False, "proxies": NO_PROXY}
+
+
+# This snippet only retries on the response return code >= 500
+def fatal_code(e):
+    return 400 <= e.response.status_code < 500
+
+
+BACKOFF_RETRY_ON_EXCEPTION_PARAMS = {
+    # expo: [1, 2, 4, 8, etc.] https://github.com/litl/backoff/blob/master/backoff/_wait_gen.py#L6
+    "wait_gen": backoff.expo,
+    # HTTPError raised by raise_for_status()
+    # HTTPError code list: https://github.com/psf/requests/blob/master/requests/models.py#L943
+    "exception": (HTTPError,),
+    "max_tries": 4,
+    "max_time": 50,  # nginx closes a session at 60' second by default
+    "giveup": fatal_code,
+}
+
+
+@backoff.on_exception(**BACKOFF_RETRY_ON_EXCEPTION_PARAMS)
+def request_with_retry(
+    should_log: bool = False,
+    logger: Logger = logging.getLogger(),
+    logger_level: str = "info",
+    **request_params
+):
+    full_params = COMMON_REQUESTS_PARAMS | request_params
+    requests_params_keys_to_log = ["data", "json", "params"]
+    if should_log:
+        params_message = ""
+        for key in requests_params_keys_to_log:
+            if key in request_params:
+                params_message += " with {} {}".format(key, request_params[key])
+        log_message = "[{}] {} with params{}.".format(
+            full_params["method"], full_params["url"], params_message
+        )
+        getattr(logger, logger_level.lower())(log_message)
+    response = requests.request(**full_params)
+    response.raise_for_status()
+    return response
+
+# how to use:
+request_params = {"method": "get", "url": "http://localhost"}
+response = request_with_retry(**request_params)
+
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2021/03/trying-python-pipreqs-and-pip-tools.html b/2021/03/trying-python-pipreqs-and-pip-tools.html new file mode 100644 index 00000000..13deafba --- /dev/null +++ b/2021/03/trying-python-pipreqs-and-pip-tools.html @@ -0,0 +1,902 @@ + + + + + + +Trying Python pipreqs and pip-tools - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +
+ + +
+ + + +
+

Relative to pipenv, and poetry, if you’re searching for some lightweight python package managers for a small project, I will introduce 2 handy tools for you: pipreqs and pip-tools.

+
+ +

pipreqs

+ +

pipreqs github

+ +

Suppose you are onboarded to an existing project where only pip is used. The requirements.txt file is generated by pip freeze. and it contains more than 30 lines of requirements, in addition the team cannot remember the original basic requirements anymore. One of the simple ways to rebuild the basic requirements (the project dependency but not all the underlying dependencies) is to use pipreqs.

+ +

How to use:

+ +
# First of all, just backup your current requirements.txt
+$ mv requirements.txt{,.bck}
+
+# at this moment, there's no more requirements.txt, then run pipreqs
+$ pipreqs /home/project/location
+Successfully saved requirements file in /home/project/location/requirements.txt
+
+ +

Let’s use the --debug option to see what it does in background

+ +
# I'm running pipreqs from the root path of a Flask project
+$ pipreqs . --debug
+DEBUG: Found packages: {'json', 'flask_webtest', 'time', 'requests', 'webtest', 'sys', 'flask', 'os', 'pathlib', 'setuptools', 'unittest', 'werkzeug'}
+DEBUG: Found imports: Flask, flask_webtest, Requests, WebTest, Werkzeug
+DEBUG: Getting packages information from Local/PyPI
+DEBUG: Starting new HTTPS connection (1): pypi.python.org:443
+DEBUG: https://pypi.python.org:443 "GET /pypi/flask_webtest/json HTTP/1.1" 301 122
+DEBUG: Starting new HTTPS connection (1): pypi.org:443
+DEBUG: https://pypi.org:443 "GET /pypi/flask_webtest/json HTTP/1.1" 301 221
+DEBUG: https://pypi.org:443 "GET /pypi/Flask-WebTest/json HTTP/1.1" 200 2155DEBUG: Starting new HTTPS connection (1): pypi.python.org:443
+DEBUG: https://pypi.python.org:443 "GET /pypi/WebTest/json HTTP/1.1" 301 122DEBUG: Starting new HTTPS connection (1): pypi.org:443
+DEBUG: https://pypi.org:443 "GET /pypi/WebTest/json HTTP/1.1" 200 12870
+
+$ cat ./requirements.txt
+Werkzeug==1.0.1
+Flask==1.1.2
+requests==2.25.1
+flask_webtest==0.0.9
+WebTest==2.0.35
+
+ +

pipreqs has also some other useful options(–no-pin, –force).

+ +

pip-tools

+ +

pip-tools github

+ +

Another missing feature of the native pip is that pip freeze doesn’t provide the packages dependencies. All the packages installed in the venv are listed in a single requirements.txt file, in the same top level with only the version info. Pipenv and Poetry resolve this issue, and introduce some lock system. But they’re not the native requirements.txt way. By using pip-tools, we can resolve this issue too and at the same time keep using requirements.txt. I found this tool occasionally by checking Flask project requirements.

+ +

The idea of pip-tools is to maintain a project basic dependency in a file called requirements.in, then use pip-tools to generate the requirements.txt file with all the dependencies including the underlying dependencies info inside but in the comments part.

+ +

Please be aware that: pip-tools = pip-compile + pip-sync

+ +

pip-compile

+ +

I’m running a small Flask project, the only package I need is just a single Flask. Let’s see an example of pip-compile by using requirements.in file without setup.py,

+ +
(venv)$ cat requirements.in
+Flask
+
+(venv)$ pip-compile requirements.in
+
+(venv)$ cat requirements.txt
+#
+# This file is autogenerated by pip-compile
+# To update, run:
+#
+#    pip-compile requirements.in
+#
+click==7.1.2
+    # via flask
+flask==1.1.2
+    # via -r r.in
+itsdangerous==1.1.0
+    # via flask
+jinja2==2.11.3
+    # via flask
+markupsafe==1.1.1
+    # via jinja2
+werkzeug==1.0.1
+    # via flask
+
+ +

The Solution for with setup.py is here.

+ +

pip-sync

+ +

Let’s see an example of pip-sync:

+ +
# Upgrade Werkzeug to a rc version that is newer that the once listed by requirements.txt
+(venv)$ pip install Werkzeug==2.0.0rc2
+Collecting Werkzeug==2.0.0rc2
+  Downloading Werkzeug-2.0.0rc2-py3-none-any.whl (284 kB)
+     |████████████████████████████████| 284 kB 3.3 MB/s
+Installing collected packages: werkzeug
+  Attempting uninstall: werkzeug
+    Found existing installation: Werkzeug 1.0.1
+    Uninstalling Werkzeug-1.0.1:
+      Successfully uninstalled Werkzeug-1.0.1
+Successfully installed werkzeug-2.0.0rc2
+
+# Use pip-sync to downgrade Werkzeug in order to make the venv to have exactly the same version of dependencies listed in requirements.txt
+(venv)$ pip-sync
+Collecting Werkzeug==1.0.1
+  Using cached Werkzeug-1.0.1-py2.py3-none-any.whl (298 kB)
+Installing collected packages: Werkzeug
+  Attempting uninstall: Werkzeug
+    Found existing installation: Werkzeug 2.0.0rc2
+    Uninstalling Werkzeug-2.0.0rc2:
+      Successfully uninstalled Werkzeug-2.0.0rc2
+Successfully installed Werkzeug-1.0.1
+
+ +

pipdeptree for dependency tree

+ +

The pip-tools’ github page introduces also a dependency tree generation tool: pipdeptree which is also very nice:

+ +
(venv)$ pipdeptree
+Flask==1.0
+  - click [required: >=5.1, installed: 7.1.2]
+  - itsdangerous [required: >=0.24, installed: 1.1.0]
+  - Jinja2 [required: >=2.10, installed: 2.11.3]
+    - MarkupSafe [required: >=0.23, installed: 1.1.1]
+  - Werkzeug [required: >=0.14, installed: 1.0.1]
+pip-tools==5.5.0
+  - click [required: >=7, installed: 7.1.2]
+  - pip [required: >=20.1, installed: 20.2.3]
+pipdeptree==2.0.0
+  - pip [required: >=6.0.0, installed: 20.2.3]
+setuptools==49.2.1
+
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2021/06/python-datetime-utc-now.html b/2021/06/python-datetime-utc-now.html new file mode 100644 index 00000000..935f92e2 --- /dev/null +++ b/2021/06/python-datetime-utc-now.html @@ -0,0 +1,782 @@ + + + + + + +Python datetime utcnow - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +
+ + +
+ + + +

Previously, when I needed a real UTC now with ISO 8601 format, I used to use the strftime function or the pytz module. But recently I just found that Python at least since v3.5 has already provide it with built-in module: datetime.now(timezone.utc), and this is also the preferred method over datatime.utcnow()

+ +

PS: datetime.fromisoformat() is release with python v3.7

+ +
>>> from datetime import datetime, timezone
+
+>>> datetime.utcnow()
+datetime.datetime(2021, 6, 27, 17, 31, 14, 410011)
+>>> datetime.utcnow().isoformat()
+'2021-06-27T17:31:14.410200'
+>>> datetime.fromisoformat(datetime.utcnow().isoformat())
+datetime.datetime(2021, 6, 27, 17, 31, 14, 415153)
+
+>>> datetime.now(timezone.utc)
+datetime.datetime(2021, 6, 27, 17, 31, 14, 419667, tzinfo=datetime.timezone.utc)
+>>> datetime.now(timezone.utc).isoformat()
+'2021-06-27T17:31:14.425507+00:00'
+>>> datetime.fromisoformat(datetime.now(timezone.utc).isoformat())
+datetime.datetime(2021, 6, 27, 17, 31, 14, 431368, tzinfo=datetime.timezone.utc)
+
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2021/06/python-unittest-cheet-sheet.html b/2021/06/python-unittest-cheet-sheet.html new file mode 100644 index 00000000..f96e3d27 --- /dev/null +++ b/2021/06/python-unittest-cheet-sheet.html @@ -0,0 +1,1297 @@ + + + + + + +Python Unittest Cheet Sheet - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 10 minute read + + + +

+ + +
+ + +
+ + + +
+

Python unittest and Pytest is a big deal, this post just gives some small & quick examples on how to use Python unittest framwork, especially with Pytest framework. This post is not finished yet.

+
+ +

pytest in Makefile

+ +
# Makefile
+# https://github.com/databrickslabs/dbx/blob/main/Makefile
+
+SHELL=/bin/bash
+VENV_NAME := $(shell [ -d venv ] && echo venv || echo .venv)
+PYTHON=${VENV_NAME}/bin/python
+FOLDER_FOR_COV=module1_folder module2_folder
+COVERAGE_THRESHOLD=80
+
+test:
+	$(PYTHON) -m pytest tests/unit/ -v -s -n auto --cov ${FOLDER_FOR_COV} \
+		--cov-report=html \
+		--cov-report=term-missing:skip-covered \
+        --cov-fail-under=$(COVERAGE_THERSHOLD)
+
+ +

pytest --pdb

+ +

https://docs.pytest.org/en/stable/usage.html#dropping-to-pdb-python-debugger-on-failures

+ +

This will invoke the Python debugger on every failure (or KeyboardInterrupt).

+ +

pytest --pdb --pdbcls=IPython.terminal.debugger:TerminalPdb

+ +

https://docs.pytest.org/en/stable/usage.html#using-the-builtin-breakpoint-function

+ +
$ pytest --help | grep -i ipython
+                        --pdbcls=IPython.terminal.debugger:TerminalPdb
+
+ +

--pdbcls=IPython.terminal.debugger:Pdb also opens a ipython session, but without tab completion (readline).

+ +

This will use ipdb instead of pdb. Can also be set by default in pytest.ini:

+ +
[pytest]
+addopts = --pdbcls=IPython.terminal.debugger:Pdb
+
+ +

PS: an alternatif: pdbpp (successor of pytest-ipdb) at: https://github.com/pdbpp/pdbpp

+ +

export PYTHONBREAKPOINT=ipdb.set_trace

+ +

Another way to using ipdb in debugger is to set export PYTHONBREAKPOINT=ipdb.set_trace, and set a break point with breakpoint() (introduce in Python 3.7), then run test with pytest -s.

+ +

import pdb; pdb.set_trace() won’t drop in to ipdb session with this way.

+ +

jupyter notebook #%% Debug Cell (VSCode only)

+ +

Aadd the #%% marker on a line, you will see a Debug Cell code lens. Should install the module jupyter at first.

+ +

Although we get the Debug Cell, it seems that it doesn’t work in test, should do more research later.

+ +

sys.last_value, sys.last_type and sys.last_traceback

+ +

https://docs.pytest.org/en/stable/usage.html#dropping-to-pdb-python-debugger-on-failures

+ +
+

Note that on any failure the exception information is stored on sys.last_value, sys.last_type and sys.last_traceback. In interactive use, this allows one to drop into postmortem debugging with any debug tool. One can also manually access the exception information, for example:

+
+ +
# when pytest --pdb is stopping at a failure
+>>> import sys
+
+>>> sys.last_traceback.tb_lineno
+1641
+
+>>> sys.last_traceback.tb_frame
+<frame at 0x7fcca5f89a00, file '/home/xiang/git/myPython/venv/lib/python3.8/site-packages/_pytest/python.py', line 1641, code runtest>
+
+>>> sys.last_value
+AssertionError('assert result == "ok"',)
+
+>>> sys.last_type
+<class 'AssertionError'>
+
+ +

pytest --trace

+ +

https://docs.pytest.org/en/stable/usage.html#dropping-to-pdb-python-debugger-at-the-start-of-a-test

+ +
+

allows one to drop into the PDB prompt immediately at the start of each test via a command line option.

+
+ +

pytest --disable-socket

+ +

This is using a third party plugin pytest-socket to disable all network calls flowing through Python’s socket interface. Unit test should not have any network calls, even any local file operations.

+ +

To work with async: pytest --disable-socket --allow-unix-socket

+ +

To allow specific hosts: pytest --disable-socket --allow-hosts=127.0.0.1,8.8.8.8 +Not easy with IPs other than 127.0.0.1, as you might need to open sockets to more IPs for intermediate connections. So normally just –allow-hosts=127.0.0.1 if you have a local service (database for e.g.) for the unit tests.

+ +

Pay extra attention to this cavet. If you create another fixture that creates a socket usage that has a “higher” instantiation order, such as at the module/class/session, then the higher order fixture will be resolved first, and won’t be disabled during the tests.

+ +

@pytest.mark

+ +

https://docs.pytest.org/en/stable/example/markers.html

+ +

We can use @pytest.mark.foo decorator to add a marker (label) on any test, and use pytest -m foo to run the tests only with mark name is foo. +This method is often used by the pytest extensions to for example enable or disable the extension on some specific tests. Like @pytest.mark.enable_socket for the pytest-socket extension

+ +

Some people also use markers to categorize the tests, like @pytest.mark.unit for unit tests, and @pytest.mark.integration for integration tests, etc. +Personally, I dont like this because it forces to add the markers on every tests, it will be a very heavy work, and once you forget to add the markers, your tests wont be runned, and you will never discover it. The common usage (maybe I’m wrong) that I saw on github is just to put different categories’ tests in different folders.

+ +

We can also marking the whole classs or modules.

+ +

pytest -k expr

+ +

https://docs.pytest.org/en/stable/example/markers.html#using-k-expr-to-select-tests-based-on-their-name

+ +
+

You can use the -k command line option to specify an expression which implements a substring match on the test names or class names or file names instead of the exact match on markers that -m provides. This makes it easy to select tests based on their names.

+
+ +

You can use and, or, and not.

+ +
$ pytest -k "send_http" -v
+$ pytest -k "not send_http" -v
+$ pytest -k "send_http or quick" -v
+
+ +

@pytest.mark.xfail(strict=True, reason=””)

+ +

https://docs.pytest.org/en/reorganize-docs/new-docs/user/xfail.html#strict-parameter

+ +
+

Having the xfail marker will still run the test but won’t report a traceback once it fails. Instead terminal reporting will list it in the “expected to fail” (XFAIL) section. If the test doesn’t fail it will be reported as “unexpectedly passing” (XPASS). set strict=True to ensure XPASS (unexpectedly passing) causes the tests to be recorded as a failure.

+
+ +

+@pytest.mark.xfail(strict=True, reason="")
+def test_function():
+    ...
+
+ +

@pytest.mark.parametrize

+ +

https://docs.pytest.org/en/stable/example/parametrize.html

+ +

I put @pytest.mark.parametrize out of @pytest.mark because they’re really different. In fact, I discovered pytest from this functionnality.

+ +
@pytest.mark.parametrize(
+    "a, b, expected",
+    [
+        (1, 2, 3),
+        (3, 3, 6),
+    ],
+)
+def test_sum(a, b, expected):
+    total = a + b
+    assert total == expected
+
+ +

Apply indirect on particular arguments

+ +

https://docs.pytest.org/en/stable/example/parametrize.html#apply-indirect-on-particular-arguments

+ +
+

Very often parametrization uses more than one argument name. There is opportunity to apply indirect parameter on particular arguments. It can be done by passing list or tuple of arguments’ names to indirect. In the example below there is a function test_indirect which uses two fixtures: x and y. Here we give to indirect the list, which contains the name of the fixture x. The indirect parameter will be applied to this argument only, and the value a will be passed to respective fixture function.

+
+ +

if indirect=True, both x and y fixtures will be used, if only indirect=["x"], then only the fixture x will be used, and y will be considered as a standard var name.

+ +
# content of test_indirect_list.py
+
+import pytest
+
+
+@pytest.fixture(scope="function")
+def x(request):
+    return request.param * 3
+
+
+@pytest.fixture(scope="function")
+def y(request):
+    return request.param * 2
+
+
+@pytest.mark.parametrize("x, y", [("a", "b")], indirect=["x"])
+def test_indirect(x, y):
+    assert x == "aaa"
+    assert y == "b"
+
+ +

side_effect functions and iterables

+ +

https://docs.python.org/3/library/unittest.mock-examples.html#side-effect-functions-and-iterables

+ +

We used to use side_effect to force a mock object to raise an exception. But we can also use side_effect to define different return values. This is useful when we have a same mock function used multiple times in a testing function, and this mock function should return different values.

+ +

functions:

+ +
>>> vals = {(1, 2): 1, (2, 3): 2}
+>>> def side_effect(*args):
+...    return vals[args]
+...
+>>> mock = MagicMock(side_effect=side_effect)
+>>> mock(1, 2)
+1
+>>> mock(2, 3)
+2
+
+ +

iterables:

+ +
>>> mock = MagicMock(side_effect=[4, 5, 6])
+>>> mock()
+4
+>>> mock()
+5
+>>> mock()
+6
+
+

mock any class with Mock

+ +
from dataclasses import dataclass
+from unittest.mock import Mock
+
+
+@dataclass
+class A:
+    name: str
+
+
+@dataclass
+class B:
+    name: str
+
+
+@dataclass
+class InventoryItem:
+    a: A
+    b: B
+
+
+def test_class_inventory_item():
+    mock_inventory_item = InventoryItem(*[Mock() for _ in range(2)])
+
+    # or using inspect to get dynamically the class parameters count
+    from inspect import signature
+    mock_inventory_item = InventoryItem(*[Mock() for _ in range(len(signature(InventoryItem).parameters))])
+
+ +

monkeypatch

+ +

monkeypatch is a pytest native fixture, all modifications will be undone after the requesting test function or fixture has finished.

+ +

Monkeypatching functions or the property of a class

+ +

https://docs.pytest.org/en/stable/monkeypatch.html#simple-example-monkeypatching-functions

+ +

Very similar to Python standard lib unittest.mock.patch decorator since Python 3, but monkeypatch is a fixture. Some people find monkeypatch is less effort to write than unittest.mock.patch. Ref. https://github.com/pytest-dev/pytest/issues/4576

+ +

To use the native unittest.mock.patch, use the wraps parameter:

+ +
# replace function bar of module x by another function fake_bar with unittest.mock.patch
+# we can assert the mocked function with mock_bar
+from unittest.mock import patch
+
+def foo(arg1, arg2):
+    r = bar(arg1)
+
+def test_foo():
+   with patch("x.bar", wraps=fake_bar) as mock_bar:
+      actual = foo(arg1, arg2)
+      assert actual == expected
+      mock_bar.assert_called_once_with(arg1)
+
+ +
# replace function bar of module x by another function fake_bar with monkeypatch
+# we cannot assert the mocked function, but we dont need to give the x module in full string format.
+
+def foo(arg1, arg2):
+    r = bar(arg1)
+
+def test_foo(monkeypatch):
+    monkeypatch.setattr(x, "bar", fake_bar)
+
+ +
# replace function bar of module x by another function fake_bar with pytest-mock
+# we assert the mocked function
+
+def foo(arg1, arg2):
+    r = bar(arg1)
+
+def test_foo(monkeypatch):
+    mock_bar = mocker.patch("x.bar", wraps=fake_bar)
+
+ +

There’s also a plugin pytest-mock, which provides spy and stub utilities.

+ +

The wraps parameter in the native unittest.mock.patch can also be used to spy function, if you don’t want to use pytest-mock.spy.

+ +
monkeypatch.setattr(obj, name, value, raising=True)
+monkeypatch.delattr(obj, name, raising=True)
+
+ +

Monkeypatching environment variables

+ +

https://docs.pytest.org/en/stable/monkeypatch.html#monkeypatching-environment-variables

+ +

Can be replaced by python native unittest.mock @patch.dict(‘os.environ’, {‘newkey’: ‘newvalue’})

+ +
# contents of our test file e.g. test_code.py
+import pytest
+
+
+@pytest.fixture
+def mock_env_user(monkeypatch):
+    monkeypatch.setenv("USER", "TestingUser")
+
+
+@pytest.fixture
+def mock_env_missing(monkeypatch):
+    monkeypatch.delenv("USER", raising=False)
+
+
+# notice the tests reference the fixtures for mocks
+def test_upper_to_lower(mock_env_user):
+    assert get_os_user_lower() == "testinguser"
+
+
+def test_raise_exception(mock_env_missing):
+    with pytest.raises(OSError):
+        _ = get_os_user_lower()
+
+

monkeypatch with parametrize

+ +

As said above monkeypatch is a fixture, so we can use pytest-lazy-fixture to parametrize the fixtures. I cannot remember where is the link, in fact on one page from pytest official doc, it says that pytest cannot do it for the moment, that’s why pytest-lazy-fixture is introduced here. +###

+ +

It is worth saying that following monkeypatch on env won’t work:

+ +
# file a.py
+TEST_USER = os.getenv("TEST_USER")
+
+def get_test_user():
+    return(TEST_USER)
+
+
+# file test_a.py
+import pytest
+
+from a import get_test_user
+
+@pytest.fixture
+def mock_env_user(monkeypatch):
+    monkeypatch.setenv("TEST_USER", "TestingUser")
+
+def test_get_test_user(mock_env_user):
+    assert get_test_user() == "testinguser"
+
+ +

The test will fail, because the line TEST_USER = os.getenv("TEST_USER") in the file a.py is always imported before mock_env_user by test_a.py, from a import get_test_user is at the beginning of the test file. During the import, at this moment, the env var TEST_USER doesn’t exist yet in os, it will always have the value None. To fix this problem, we need to put the os.getenv into get_test_user like:

+ +
# file a.py
+
+def get_test_user():
+    TEST_USER = os.getenv("TEST_USER")
+    return(TEST_USER)
+
+ +

Monkeypatching dictionaries

+ +

Can be replaced by python native unittest.mock @patch.dict()

+ +
# patch one key at each patch
+monkeypatch.setitem(app.DEFAULT_CONFIG, "user", "test_user")
+monkeypatch.setitem(app.DEFAULT_CONFIG, "database", "test_db")
+monkeypatch.delitem(app.DEFAULT_CONFIG, "name", raising=False)
+
+

Modifying sys.path

+ +
monkeypatch.syspath_prepend(path)
+
+ +

Changing the context of the current working directory during a test

+ +
monkeypatch.chdir(path)
+
+ +

pytest-xdist to run tests in parallel

+ +

https://github.com/pytest-dev/pytest-xdist

+ +

Especially useful when your tests are unit tests for exmaple, which dont have dependencies from one with each other, and don’t share any changing data, which means your tests should be stateless.

+ +
# run on 4 CPUs
+pytest -n 4
+
+# run on a number of CPUs calculated automatically by the python built-in multiprocessing module
+pytest -n auto
+
+# run on a number of CPUs calculated automatically by the module psutil, you need such module if you have logical cpus as well as certain imposed limitations (like container runtimes with cpu limits)
+# ref. https://stackoverflow.com/a/14840102/5095636
+# ref. https://docs.python.org/3/library/multiprocessing.html#multiprocessing.cpu_count
+pip install pytest-xdist[psutil]
+pytest -n auto
+
+ +

There’s another module pytest-parallel, the author says his module can run the tests in concurrency, and very efficient in integration tests, which tests might be stateful or sequential. I haven’t tested yet, so cannot say anything here.

+ +

speccing

+ +

https://docs.python.org/3/library/unittest.mock.html#autospeccing

+ +

mock.patch returns a mock object, a mock object can have whatever atrributes and methods.

+ +

mock.asssert_called_once_with(4, 5, 6) doesn’t fail as shown as follows:

+ +
>>> mock = Mock(name='Thing', return_value=None)
+>>> mock(1, 2, 3)
+>>> mock.asssert_called_once_with(4, 5, 6)
+<Mock name='Thing.asssert_called_once_with()' id='140160334650144'>
+
+ +

simple speccing

+ +
>>> from urllib import request
+>>> mock = Mock()
+>>> mock.asssert_called_with()
+<Mock name='mock.asssert_called_with()' id='140160336271776'>
+
+# using simple speccing, mock.asssert_called_with() is detected as an error
+>>> mock = Mock(spec=request.Request)
+>>> mock.asssert_called_with()
+---------------------------------------------------------------------------
+AttributeError                            Traceback (most recent call last)
+...
+AttributeError: Mock object has no attribute 'asssert_called_with'
+
+# still using simple speccing, mock.data.asssert_called_with() is detected as an mocked method, no errors.
+# so simple speccing doesnt' work for nested objects
+>>> mock.data.asssert_called_with()
+<Mock name='mock.data.asssert_called_with()' id='140160336027120'>
+
+ +

auto-speccing

+ +

Using patch(autospec=True)

+ +
>>> from urllib import request
+>>> patcher = patch('__main__.request', autospec=True)
+>>> mock_request = patcher.start()
+
+>>> request is mock_request
+True
+
+# mock_request.Request has the spec='Request' now
+>>> mock_request.Request
+<MagicMock name='request.Request' spec='Request' id='...'>
+
+# the real request object doesn't have the static data attribute, so autospecced object doesn't have it neither.
+>>> mock_request.data
+Traceback (most recent call last):
+  File "<stdin>", line 1, in <module>
+  File "/usr/lib/python3.8/unittest/mock.py", line 637, in __getattr__
+    raise AttributeError("Mock object has no attribute %r" % name)
+AttributeError: Mock object has no attribute 'data'
+
+ +

Using create_autospec()

+ +
>>> from urllib import request
+>>> mock_request = create_autospec(request)
+>>> mock_request.Request('foo', 'bar')
+<NonCallableMagicMock name='mock.Request()' spec='Request' id='...'>
+
+ +

autospec works well on methods and static attributes, but a serious problem is that it is common for instance attributes to be created in the init() method and not to exist on the class at all. autospec can’t know about any dynamically created attributes and restricts the api to visible attributes. This is why autospeccing is not the patch default behaviour. Search the above phrase in the python official doc to get more details and solutions.

+ +

unittest.mock.ANY

+ +

https://docs.python.org/3/library/unittest.mock.html#any

+ +
>>> from unittest.mock import Mock
+>>> mock = Mock(return_value=None)
+>>> mock('foo', bar=object())
+>>> mock.assert_called_once_with('foo', bar=ANY)
+
+ +
>>> from unittest.mock import Mock, call
+>>> m = Mock(return_value=None)
+>>> m(1)
+>>> m(1, 2)
+>>> m(object())
+>>> m.mock_calls == [call(1), call(1, 2), ANY]
+
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2021/09/python-asyncio.html b/2021/09/python-asyncio.html new file mode 100644 index 00000000..5675e765 --- /dev/null +++ b/2021/09/python-asyncio.html @@ -0,0 +1,765 @@ + + + + + + +Python Asyncio Study notes - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +
+ + +
+ + + +

concurrent.futures

+ +

The concurrent.futures is a high-level abstraction for the threading and multiprocessing modules.

+ +

+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2022/01/azure-pipeline-predefined-variables.html b/2022/01/azure-pipeline-predefined-variables.html new file mode 100644 index 00000000..4bd54f00 --- /dev/null +++ b/2022/01/azure-pipeline-predefined-variables.html @@ -0,0 +1,893 @@ + + + + + + +Azure pipeline predefined variables - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +
+ + +
+ + + +

The official doc gives an explanation of all the predefined variables, but it lacks of some concret examples. Hereunder some examples for my preferred variables.

+ +

Access the predefined variables

+ +

To access the variables value in YAML pipeline, we can use 2 methods:

+ +
    +
  1. +$(System.PullRequest.SourceBranch) : the standard way to access pipeline variables.
  2. +
  3. +$SYSTEM_PULLREQUEST_SOURCEBRANCH : most of the pipeline variables are mapped to the pipeline machine environment variables in upper snake case.
  4. +
+ +

Variables upon Git events

+ +

Suppose we create a new branch named new_branch, and create a pull request (with id 123) from the new branch new_branch to the main branch. +During the pipeline, we can see following predefined variables in different GIT events.

+ +

Check here for variables upon git events in Github Actions.

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
variable name \ git actionon pushon pull requeston mergeon manual trigger
Build.SourceBranchrefs/heads/new_branchrefs/pull/123/mergerefs/heads/mainrefs/heads/new_branch
Build.SourceBranchNamenew_branchmergemainnew_branch
Build.SourceVersionMessage{the last commit message}Merge pull request 123 from new_branch into mainMerged PR 123: {pull request title}
- It’s a way to determin this merge is from which PR
- We can also change the default message when merging the PR +
{the last commit message}
Build.ReasonIndividualCIPullRequestIndividualCIManual
System.Pullrequest.SourceBranchVAR_NOT_EXISTSrefs/heads/new_branchVAR_NOT_EXISTSVAR_NOT_EXISTS
System.Pullrequest.TargetBranchVAR_NOT_EXISTSrefs/heads/mainVAR_NOT_EXISTSVAR_NOT_EXISTS
System.Pullrequest.PullRequestIdVAR_NOT_EXISTS123VAR_NOT_EXISTSVAR_NOT_EXISTS
System.PullRequest.SourceCommitIdVAR_NOT_EXISTSthe last commit number in pull requestVAR_NOT_EXISTSVAR_NOT_EXISTS
+ +

Variables not varying upon triggering Git action

+ +

System.AccessToken

+ +

System.AccessToken is a SecretVariable, which is in fact a PAT token with limited 1 hour of lifetime by default, and is about to be expired 5 minutes before the end of the lifetime.

+ +
    +
  • +User name +The access token is bound to a build service account, which name should be in this format: {projectName} Build Service ({organizationName}). So it’s required to set necessary permissions on this account. For example, to be able to publish a Python wheel package to Azure Artifacts, it needs the AddPackage permission, we can set the build service account as a contributor to the corresponding Artifacts feed’s permission tab to get this permission.
  • +
  • +Basic auth +If we need to use this PAT to create the base64 string, the user name for this PAT should be kept empty, which is in the format of :$(System.AccessToken), to convert it to base64, use: printf "%s"":$(System.AccessToken)" | base64, or echo -n ":$(System.AccessToken)" | base64 with -n. When using with curl, it should be something like curl -u :$(System.AccessToken), the user name part is empty. or user an basic auth header like {"Authorization": "Basic {:$(System.AccessToken) in base64 format}"}.
  • +
  • +OAtuh +Besides the above basic auth (it’s secure as the password is a PAT with limited lifetime, not a real clear password), we can also use OAuth, with header {"Authorization": "Bearer $(System.AccessToken)"}, it’s not enabled by defaut, we should enable the OAuth by checking the box Allow scripts to access OAuth token from Realeses / Tasks / Agent job (Run on Agent) or from Pipelines / Tasks / Agent job (Run on Agent). And we need to create a task in advance in order to see the Tasks menu. If we don’t enbale the option, and use Bearer header directly, we will get an API resposne code 203, with the reason Non-Authoritative Information.
  • +
  • See also job access token.
  • +
+ +

Agent.OS

+ +

Agent.OS: Just to check which OS running the pipeline.

+ +

Variables to be set by user

+ +

System.Debug

+ +

Add a new variable with the name System.Debug and value true for debugging.

+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2022/02/azure-pipeline-checkout-repository-from-another-project.html b/2022/02/azure-pipeline-checkout-repository-from-another-project.html new file mode 100644 index 00000000..e0b51936 --- /dev/null +++ b/2022/02/azure-pipeline-checkout-repository-from-another-project.html @@ -0,0 +1,816 @@ + + + + + + +Azure pipeline checkout repository from another project - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +
+ + +
+ + + +

Context

+ +

This post can be an extend to my previous post on variables and templates reuse

+ +

In fact, in addition to the variables and templates, I also need to reuse some non native Azure pipeline yaml files, for example some Python scripts defined in the shared template. If we use the same technic shown by the previous blog, the pipeline will throw error saying that it cannot find the Python script. This is because we need to checkout the remote repository at first before using the native pipeline yaml files.

+ +

Checkout repository from another project

+ +

By default, each pipeline run has a temproray token of the project build service account, this account name should be in the format of: [Project name] Build Service ([Organization name]), we want to use this token to checkout the remote repository.

+ +

We can also use a third account PAT to perform the checkout, but I won’t explain here because we need to save the PAT somewhere which is less convenient than the default build service account. We should use the build service account as much as possible.

+ +

If we do nothing, just add the checkout step in the pipeline as shown here, the pipeline run will throw the error like below:

+ +
+

“remote: TF401019: The Git repository with name or identifier {remote repository name} does not exist or you do not have permissions for the operation you are attempting.”

+
+ +

There might be many reasons that can trigger this error, but for this case, this is because since May 2020, all the new projects have the option Limit job authorization scope to current project for non-release pipelines enabled. Which means by default the built-in build service account of a project A cannot access anything inside the project B. The cross-project access is denied by default.

+ +

Disable this option makes the checkout of remote repository worked, but it opens also a very big security issue. So we should NEVER disable it.

+ +

Readers group to the whole target project

+ +

My first try was going the security tab of the remote repository (projectB’s Project Settings -> Repos -> Repositories -> sharedRepo), and grant the source project build service account the read permission on it, I got the same error. Then, I granted the same permission at all repositories level, same error. Finally, I added the source project build service account into the Readers group of the shared project (Project Settings -> Permissions -> Groups -> Readers), and this time, it worked.

+ +

So the whole blog can be summarized to the above phrase by using the Readers group. But please be aware that, as it’s read-only access to the whole target project, which means the source project build service account has the read access to all the repositories inside the target project. If you want to grant read access only to a single repository, you need to add the source project build service account to all the other repositories security tabs and set the Read permission to Deny. As said in above first try, the inverse way doesn’t work as the time of writing this blog.

+ +

Create read access to the target repository

+ +

This method is shown here in the last use case, which is:

+
    +
  • If the scope is project? => Yes
  • +
  • Is the repo in the same project as the pipeline? => No
  • +
  • Is your pipeline in a public project? => No
  • +
+ +

You need to take additional steps to grant access. Let us say that your pipeline exists in project A and that your repository exists in project B.

+
    +
  1. Go to the project settings of the project in which the repository exists (B). Select Repos -> Repositories -> specific repository.
  2. +
  3. Add your-project-name Build Service (your-collection-name) to the list of users, where your-project-name is the name of the project in which your pipeline exists (A).
  4. +
  5. Give Read access to the account.
  6. +
  7. (Update 2022-09-16)(refer to this doc) Create a new group in the target project permissions tab. Add the source project build service account into this group. Grant this new group with the View project-level information permission, or you can also grant this permission only to the added source project build service account.
  8. +
+ +

In fact, the default Readers group has also this permission, but it’s not straightforward to figure out that we must need this permission in addition to the Read permission at the repository level.

+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2022/02/azure-pipeline-reuse-variables-in-template-from-another-repository.html b/2022/02/azure-pipeline-reuse-variables-in-template-from-another-repository.html new file mode 100644 index 00000000..25016936 --- /dev/null +++ b/2022/02/azure-pipeline-reuse-variables-in-template-from-another-repository.html @@ -0,0 +1,854 @@ + + + + + + +Azure pipeline reuse variables in template from another repository - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +
+ + +
+ + + +

Context

+ +

In my project, I have several Azure pipelines that share some same variables, instead of declaring them in each pipeline, I would like to refactor it by using some central places to store the shared variables.

+ +

I can split the variables into 3 groups:

+ +
    +
  1. organization level variables: +organization name, tenant id, etc.
  2. +
  3. project level variables: +project name, resouces group name, keyvault name, project support email, etc.
  4. +
  5. repository level variables: +module name, repository support email, etc.
  6. +
+ +

Suppose I’m writing an Azure pipeline called cicd.yml for the repositoryA located at: organizationA/projectA/repositoryA, I will save the above 3 groups of variables to 3 places:

+ +
    +
  1. organization level variables -> to a new repository outside of the project, for e.g. organizationA/sharedProject/sharedRepository +
  2. +
  3. project level variables -> to a new repository inside the same project, for e.g. organizationA/projectA/sharedRepository +
  4. +
  5. repository level variables -> to the same repository: organizationA/projectA/repositoryA +
  6. +
+ +

By checking following two official docs (in fact in the same doc :-)) : Variable reuse, Use other repositories, the file content of each variable group could be:

+ +

organization level variables

+ +
# file: organizationA/sharedProject/sharedRepository/.azure-pipelines/variables/organization_variables.yml
+
+variables:
+  organizationName: xxx
+
+ +

project level variables

+ +
# file: organizationA/projectA/sharedRepository/.azure-pipelines/variables/project_variables.yml
+
+variables:
+  - template: .azure-pipelines/variables/organization_variables.yml@sharedProject_sharedRepository
+  - name: myProjectVar
+    value: $(organizationName)_abc
+
+ +

repository level variables

+ +
# file: organizationA/projectA/repositoryA/.azure-pipelines/variables/repository_variables.yml
+
+variables:
+  - template: .azure-pipelines/variables/project_variables.yml@projectA_sharedRepository
+  - name: myRepositoryVar
+    value: $(myProjectVar)_abc
+
+ +

root cicd file

+ +
# file: organizationA/projectA/repositoryA/.azure-pipelines/cicd.yml
+
+# repository type = git means Azure DevOps repository as per https://docs.microsoft.com/en-us/azure/devops/pipelines/repos/multi-repo-checkout?view=azure-devops#specify-multiple-repositories
+
+resources:
+  repositories:
+    - repository: sharedProject_sharedRepository
+      type: git
+      name: sharedProject/sharedRepository
+    - repository: projectA_sharedRepository
+      type: git
+      name: projectA/sharedRepository
+
+trigger: xxx
+
+variables:
+  - template: variables/repository_variables.yml
+  - name: myRepositoryVar
+    value: xxx
+
+pool:
+  vmImage: xxx
+
+steps:
+  - script: |
+      echo $(myRepositoryVar)
+    displayName: test repositry level variables
+
+ +

Note: We cannot put the resources part elsewhere, it must be declared in the root pipeline file. Otherwise, the pipeline might throw the Unexpected value 'resources' error. There’s some black magic that the variables templates defined in other repositories (for e.g. project_variables.yml) recognize well the sharedProject_sharedRepository repository resource defined in the repository hosting the cicd.yml file.

+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2022/03/azure-pipeline-variables-and-parameters.html b/2022/03/azure-pipeline-variables-and-parameters.html new file mode 100644 index 00000000..d5936e50 --- /dev/null +++ b/2022/03/azure-pipeline-variables-and-parameters.html @@ -0,0 +1,890 @@ + + + + + + +Azure pipeline variables and parameters - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +
+ + +
+ + + +

Variable

+ +

Variable scope

+ +

When we set variables from a script, the new variable is only available from the next step, not the step where the variable is defined.

+ +
variables:
+  sauce: orange
+steps:
+# Create a variable
+- bash: |
+    echo "##vso[task.setvariable variable=sauce]crushed tomatoes" # remember to use double quotes
+    echo inside the same step, sauce: $(sauce)
+
+# Use the variable
+# "$(sauce)" is replaced by the contents of the `sauce` variable by Azure Pipelines
+# before handing the body of the script to the shell.
+- bash: |
+    echo from the next step, sauce: $(sauce)
+
+ +

The result will be:

+ +
inside the same step, sauce: orange
+from the next step, sauce: crushed tomatoes
+
+ +

Json Variable

+ +

Parameter can have object type like dict in Python, but not the case for variable. The workaround is to assign a raw json string to variable, and using tools like jq to handle it during runtime. The json string variable must follow some special format, the double quotes must be escaped, and the whole string must be enclosed by the single quote.

+ +
aJsonVar: '{ \"dev\": \"foo\", \"prd\": \"bar\" }'
+
+ +

Parameter

+ +

String parameter

+ +

For string parameter with an empty string "" as default value, in bash script task, we can use if [[ -n $VAR_NAME ]]; then to handle it.

+ +

-n in Linux returns true (0) if exists, and not empty.

+ + +
parameters:
+  - name: paramName
+    type: string
+    default: ""
+
+steps:
+  - scripts: |
+      if [[ -n $PARAM_NAME ]]; then
+        echo PARAM_NAME is set with a value: $PARAM_NAME
+      fi
+    displayName: check paramName
+    failOnStderr: true
+    env:
+      PARAM_NAME: ${{ parameters.paramName }}
+
+ + +

Boolean parameter

+ +
parameters:
+- name: myBoolean
+  type: boolean
+  default: true
+
+ +
    +
  • In pipeline YAML syntax, we compare the value by YAML’s Boolean type true or false +
  • +
  • In bash script, we should compare it with string format of True or False +
  • +
+ +

Object parameter

+ +

Parameter has a type of object which can take any YAML structure. If it’s related to a array/list type, we can use ${{ each element in paramters.elements}} to loop through it, but if it’s related to a mapping/dict type, it will not be easy as Microsoft hasn’t provided any official docs (and this one) on how to use complex parameter with the pipeline native syntax, and my tests with different approaches failed too. Hopefully, for mapping/dict object type of parameter, we can workaround it by doing some transformation in a script task with convertToJson like: echo '${{ convertToJson(parameters.elements) }}'

+ +

Must use single quotes around the convetToJson expression. If we use double quotes, the output will remove the double quotes from the json data.

+ +

Loop through parameters

+ +

We can loop through parameters with:

+ + +
steps:
+- ${{ each parameter in parameters }}:
+  - script: echo ${{ parameter.Key }}
+  - script: echo ${{ parameter.Value }}
+
+ + +

The above example provided by the official doc loops through the parameters script by script. +In the pipeline, we will see as many tasks as the number of parameters which looks a bit heavy, hereunder how to iterate all the parameters in a single script.

+ + +
# suppose the blow pipeline is defined in a template which takes the parameter with name `parameters`, so we can reuse it in any other pipelines.
+parameters:
+  - name: parameters
+    displayName: parameters
+    type: object
+
+steps:
+  - script: |
+      parameters_in_json=$(echo '${{ convertToJson(parameters.parameters) }}' | jq -c)
+      echo "##vso[task.logissue type=warning]parameters: $parameters_in_json"
+    displayName: echo parameters
+
+ + +

The above example uses only one script to iterate all the parameters and pipe it to jq, as long as jq can handle the parameters, we can handle everything. +Here, we use jq -c to convert all the parameters into a single line json, which will be better displayed by ##vso[task.logissue type=warning], as it takes only one line.

+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2022/03/manage-azure-databricks-service-principal.html b/2022/03/manage-azure-databricks-service-principal.html new file mode 100644 index 00000000..c3a5a4b2 --- /dev/null +++ b/2022/03/manage-azure-databricks-service-principal.html @@ -0,0 +1,838 @@ + + + + + + +Manage Azure Databricks Service Principal - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +
+ + +
+ + + +

Most of Databricks management can be done from the GUI or CLI, but for Azure Service Principal, we can only manage it by the SCIM API. There’s an open PR for adding support of SCIM API in Databricks CLI, but the lastest update is back to the beginning of 2021.

+ +

This post is to add some tips that not covered by the official API docs.

+ +

Patch Service Principal

+ +

The official docs gives op add, remove, in fact, if you want to for example, update the displayName field of a SP, the op should be add:

+ +
{
+    "schemas": [
+        "urn:ietf:params:scim:api:messages:2.0:PatchOp"
+    ],
+    "Operations": [
+        {
+            "op": "add",
+            "path": "displayName",
+            "value": "{newServicePrincipalName}"
+        }
+    ]
+}
+
+ +

Consistant fields across workspaces

+ +

We could link multiple Databricks workspaces together. Below screenshot is an example of 3 linked workspaces.

+ +

azure-databricks-multiple-workspaces

+ +

Please be aware that each workspace has its own API url.

+ +

Let’s see the example of the output of the GET Service Principal endpoint, where the applicationId is 11111111-0000-0000-0000-111111111111:

+ +
{
+  "displayName": "foo",
+  "externalId": "22222222-0000-0000-0000-222222222222",
+  "groups": [
+    {
+      "display": "group1",
+      "type": "direct",
+      "value": "111",
+      "$ref": "Groups/111"
+    },
+    {
+      "display": "group2",
+      "type": "indirect",
+      "value": "222",
+      "$ref": "Groups/222"
+    }
+  ],
+  "id": "123456789",
+  "entitlements": [
+    {
+      "value": "allow-cluster-create"
+    },
+    {
+      "value": "allow-instance-pool-create"
+    },
+    {
+      "value": "workspace-access"
+    }
+  ],
+  "applicationId": "11111111-0000-0000-0000-111111111111",
+  "active": true
+}
+
+ +

Although we have 3 different workspaces, the same Service Principal (applicationId) defined in these workspace shares some fields:

+ +
    +
  • displayName
  • +
  • id
  • +
  • applicationId
  • +
+ +

And among these 3 fields, you can only update the displayName field, the id and applicationId fileds are immutable. Which means if we change the displayName in one of the workspaces by using the PATCH SCIM API, we will get the the updated displayName in other workspaces by using the GET SCIM API. We can not change id and applicationId fields, and both of them are the same across workspaces.

+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2022/04/azure-pipeline-checkout-multiple-repositories.html b/2022/04/azure-pipeline-checkout-multiple-repositories.html new file mode 100644 index 00000000..f16177ae --- /dev/null +++ b/2022/04/azure-pipeline-checkout-multiple-repositories.html @@ -0,0 +1,1740 @@ + + + + + + +Azure Pipeline Checkout Multiple Repositories - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 13 minute read + + + +

+ + +
+ + +
+ + + +

This post will talk about some Azure pipeline predefined variables’ values in a multiple repositories checkout situation. The official doc is here.

+ +

The examples given in this post is using Azure DevOps repositories and Azure pipeline Ubuntu agent.

+ +

Default Pipeline workspace structure

+ +

When a pipeline starts, something is created inside the folder defined in the predefined variable $(Pipeline.Workspace), this variable has the same value as $(Agent.BuildDirectory), For example, when using the default Azure pipeline Ubuntu agent, the value is /home/vsts/work/1.

+ +

At the very beginning of a pipeline run, you should the folder constructed like below:

+ +
pwd
+/home/vsts/work/1/s
+
+ls -lart /home/vsts/work/1
+total 24
+drwxr-xr-x 2 vsts docker 4096 Apr  3 12:52 b
+drwxr-xr-x 2 vsts docker 4096 Apr  3 12:52 a
+drwxr-xr-x 2 vsts docker 4096 Apr  3 12:52 TestResults
+drwxr-xr-x 6 vsts docker 4096 Apr  3 12:52 .
+drwxr-xr-x 4 vsts docker 4096 Apr  3 12:52 s
+drwxr-xr-x 7 vsts root   4096 Apr  3 12:52 ..
+
+ +
    +
  • Folder /home/vsts/work/1 for Pipeline.Workspace, Agent.BuildDirectory.
  • +
  • Folder /home/vsts/work/1/a for Build.ArtifactStagingDirectory, Build.StagingDirectory.
  • +
  • Folder /home/vsts/work/1/b for Build.BinariesDirectory.
  • +
  • Folder //home/vsts/work/1s for System.DefaultWorkingDirectory or sometimes for Build.SourcesDirectory, Build.Repository.LocalPath.
  • +
  • Folder /home/vsts/work/1/TestResults for Common.TestResultsDirectory +
  • +
+ +

The value of Build.SourcesDirectory, Build.Repository.LocalPath could change upon checkout policies, so pay attention when using these two variables.

+ +

System.DefaultWorkingDirectory is very important too because its value will never change in whatever situation, and this is the default working directory when running the script task, we can confirm it by checking the result of the above pwd command.

+ +

I will show these variables’ value within different steps of 5 different pipelines:

+ +
    +
  1. With self checkout and external repository checkout (most common)
  2. +
  3. Single self checkout with default path
  4. +
  5. Single self checkout with custom path
  6. +
  7. No self checkout but single external checkout with default path
  8. +
  9. No self checkout but single external checkout with custom path
  10. +
  11. No self checkout but multiple external checkout
  12. +
+ +

With self checkout and external repository checkout

+ +
resources:
+  repositories:
+    - repository: another_repo
+      type: git
+      name: AzureDevOpsProjectName/another_repo
+
+steps:
+  - checkout: self
+    persistCredentials: true  # persists cred to perform some git remote commands like git push --tags
+    path: $(Build.Repository.Name)
+
+  - checkout: another_repo
+    path: another_repo
+
+  - script: |
+      cp "$BUILD_REPOSITORY_LOCALPATH/." "$SYSTEM_DEFAULTWORKINGDIRECTORY" -r
+    displayName: Copy $(Build.Repository.Name) content to default workding directroy
+
+ +

Declare repository resources

+ +

Suppose the self (primary) repository name is cicd, and in the pipeline file, we declare a repository resource to the repository found at AzureDevOpsProjectName/another_repo.

+ +
resources:
+  repositories:
+    - repository: another_repo
+      type: git
+      name: AzureDevOpsProjectName/another_repo
+
+ +

From the very beginning of the pipeline line, the another_repo repository and the self repository will be automatically checked out at /home/vsts/work/1/s

+ +
ls -lart /home/vsts/work/1
+total 24
+drwxr-xr-x 2 vsts docker 4096 Apr  3 12:52 b
+drwxr-xr-x 2 vsts docker 4096 Apr  3 12:52 a
+drwxr-xr-x 2 vsts docker 4096 Apr  3 12:52 TestResults
+drwxr-xr-x 6 vsts docker 4096 Apr  3 12:52 .
+drwxr-xr-x 4 vsts docker 4096 Apr  3 12:52 s
+drwxr-xr-x 7 vsts root   4096 Apr  3 12:52 ..
+
+ls -lart /home/vsts/work/1/s
+total 16
+drwxr-xr-x 6 vsts docker 4096 Apr  3 12:52 ..
+drwxr-xr-x 2 vsts docker 4096 Apr  3 12:52 cicd
+drwxr-xr-x 2 vsts docker 4096 Apr  3 12:52 another_repo
+drwxr-xr-x 4 vsts docker 4096 Apr  3 12:52 .
+
+
+ +

At this point, the following variables having following values:

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
Predefined variable nameValueWhen
Pipeline.Workspace/home/vsts/work/1Beginning of the pipeline
Agent.BuildDirectory/home/vsts/work/1Beginning of the pipeline
Build.ArtifactStagingDirectory/home/vsts/work/1/aBeginning of the pipeline
Build.StagingDirectory/home/vsts/work/1/aBeginning of the pipeline
Build.BinariesDirectory/home/vsts/work/1/bBeginning of the pipeline
System.DefaultWorkingDirectory/home/vsts/work/1/sBeginning of the pipeline
Build.SourcesDirectory/home/vsts/work/1/sBeginning of the pipeline
Build.Repository.LocalPath/home/vsts/work/1/s/cicdBeginning of the pipeline
Common.TestResultsDirectory/home/vsts/work/1/TestResultsBeginning of the pipeline
PWD/home/vsts/work/1/sBeginning of the pipeline
+ +

We see both the self repository (cicd) and the external repository (another_repo) is saved to /home/vsts/work/1/s, this is because during the compiling time, the pipeline found that we will checkout both the repositories, but if there wouldn’t have been the checkout out of the external repository, the /home/vsts/work/1/s directory will be empty at this step.

+ +

Checkout self to its repository name

+ +
- checkout: self
+  persistCredentials: true
+  path: $(Build.Repository.Name)
+
+ +
pwd
+/home/vsts/work/1/s
+
+ls -lart /home/vsts/work/1
+total 28
+drwxr-xr-x 2 vsts docker 4096 Apr  1 08:51 b
+drwxr-xr-x 2 vsts docker 4096 Apr  1 08:51 a
+drwxr-xr-x 2 vsts docker 4096 Apr  1 08:51 TestResults
+drwxr-xr-x 7 vsts root   4096 Apr  1 08:51 ..
+drwxr-xr-x 3 vsts docker 4096 Apr  1 08:51 s
+drwxr-xr-x 7 vsts docker 4096 Apr  1 08:51 .
+drwxr-xr-x 4 vsts docker 4096 Apr  1 08:51 cicd
+
+ls -lart /home/vsts/work/1/s
+total 12
+drwxr-xr-x 2 vsts docker 4096 Apr  1 08:51 another_repo
+drwxr-xr-x 3 vsts docker 4096 Apr  1 08:51 .
+drwxr-xr-x 7 vsts docker 4096 Apr  1 08:51 ..
+
+ +

At this point, the following variables having following values:

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
Predefined variable nameValueWhen
Pipeline.Workspace/home/vsts/work/1After checking out self to its repo name
Agent.BuildDirectory/home/vsts/work/1After checking out self to its repo name
Build.ArtifactStagingDirectory/home/vsts/work/1/aAfter checking out self to its repo name
Build.StagingDirectory/home/vsts/work/1/aAfter checking out self to its repo name
Build.BinariesDirectory/home/vsts/work/1/bAfter checking out self to its repo name
System.DefaultWorkingDirectory/home/vsts/work/1/sAfter checking out self to its repo name
Build.SourcesDirectory/home/vsts/work/1/sAfter checking out self to its repo name
Build.Repository.LocalPath/home/vsts/work/1/cicdAfter checking out self to its repo name
Common.TestResultsDirectory/home/vsts/work/1/TestResultsAfter checking out self to its repo name
PWD/home/vsts/work/1/sAfter checking out self to its repo name
+ +

Checkout another repository to its repository name

+ +
- checkout: another_repo
+  path: another_repo
+
+ +
pwd
+/home/vsts/work/1/s
+
+ls -lart /home/vsts/work/1
+total 32
+drwxr-xr-x 2 vsts docker 4096 Apr  3 12:52 b
+drwxr-xr-x 2 vsts docker 4096 Apr  3 12:52 a
+drwxr-xr-x 2 vsts docker 4096 Apr  3 12:52 TestResults
+drwxr-xr-x 7 vsts root   4096 Apr  3 12:52 ..
+drwxr-xr-x 4 vsts docker 4096 Apr  3 12:52 cicd
+drwxr-xr-x 2 vsts docker 4096 Apr  3 12:52 s
+drwxr-xr-x 8 vsts docker 4096 Apr  3 12:52 .
+drwxr-xr-x 4 vsts docker 4096 Apr  3 12:52 another_repo
+
+ls -lart /home/vsts/work/1/s
+total 8
+drwxr-xr-x 2 vsts docker 4096 Apr  3 12:52 .
+drwxr-xr-x 8 vsts docker 4096 Apr  3 12:52 ..
+
+ +

At this point, nothing exists anymore in the /home/vsts/work/1/s folder, remember there was the folder another_repo in the previous step. The checkout step moved /home/vsts/work/1/s/another_repo to /home/vsts/work/1/another_repo.

+ +

At this point, the following variables having following values:

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
Predefined variable nameValueWhen
Pipeline.Workspace/home/vsts/work/1After checking out another_repo to its repo name
Agent.BuildDirectory/home/vsts/work/1After checking out another_repo to its repo name
Build.ArtifactStagingDirectory/home/vsts/work/1/aAfter checking out another_repo to its repo name
Build.StagingDirectory/home/vsts/work/1/aAfter checking out another_repo to its repo name
Build.BinariesDirectory/home/vsts/work/1/bAfter checking out another_repo to its repo name
System.DefaultWorkingDirectory/home/vsts/work/1/sAfter checking out another_repo to its repo name
Build.SourcesDirectory/home/vsts/work/1/sAfter checking out another_repo to its repo name
Build.Repository.LocalPath/home/vsts/work/1/cicdAfter checking out another_repo to its repo name
Common.TestResultsDirectory/home/vsts/work/1/TestResultsAfter checking out another_repo to its repo name
PWD/home/vsts/work/1/sAfter checking out another_repo to its repo name
+ +

Move self to System.DefaultWorkingDirectory

+ +

Once we have multi-checkout repositories in a pipeline, the source code of the self (primary) repository won’t be saved in /home/vsts/work/1/s, where is pointed by the System.DefaultWorkingDirectory variable, but System.DefaultWorkingDirectory is the default working directory of the script task, we can add workingDirectory: parameter to the script task to change the path, but if we have many script tasks, and even they’re declared in some shared templates, it would be difficult to change it. So we need to manually move the source repository content back to /home/vsts/work/1/s:

+ +
- script: |
+    cp "$BUILD_REPOSITORY_LOCALPATH/." "$SYSTEM_DEFAULTWORKINGDIRECTORY" -r
+  displayName: Copy $(Build.Repository.Name) content to default workding directroy
+
+ +
pwd
+/home/vsts/work/1/s
+
+ls -lart /home/vsts/work/1
+total 32
+drwxr-xr-x 2 vsts docker 4096 Apr  1 08:51 b
+drwxr-xr-x 2 vsts docker 4096 Apr  1 08:51 a
+drwxr-xr-x 2 vsts docker 4096 Apr  1 08:51 TestResults
+drwxr-xr-x 7 vsts root   4096 Apr  1 08:51 ..
+drwxr-xr-x 4 vsts docker 4096 Apr  1 08:51 cicd
+drwxr-xr-x 4 vsts docker 4096 Apr  1 08:51 s
+drwxr-xr-x 8 vsts docker 4096 Apr  1 08:51 .
+drwxr-xr-x 4 vsts docker 4096 Apr  1 08:51 another_repo
+
+ls -lart /home/vsts/work/1/s
+total 20
+-rw-r--r-- 1 vsts docker    0 Apr  1 08:51 repo_cicd.md
+-rw-r--r-- 1 vsts docker  985 Apr  1 08:51 README.md
+drwxr-xr-x 8 vsts docker 4096 Apr  1 08:51 .git
+drwxr-xr-x 3 vsts docker 4096 Apr  1 08:51 .azure-pipelines
+drwxr-xr-x 4 vsts docker 4096 Apr  1 08:51 .
+drwxr-xr-x 8 vsts docker 4096 Apr  1 08:51 ..
+
+ls -lart
+total 20
+-rw-r--r-- 1 vsts docker    0 Apr  1 08:51 repo_cicd.md
+-rw-r--r-- 1 vsts docker  985 Apr  1 08:51 README.md
+drwxr-xr-x 8 vsts docker 4096 Apr  1 08:51 .git
+drwxr-xr-x 3 vsts docker 4096 Apr  1 08:51 .azure-pipelines
+drwxr-xr-x 4 vsts docker 4096 Apr  1 08:51 .
+drwxr-xr-x 8 vsts docker 4096 Apr  1 08:51 ..
+
+ +

Single self checkout with default path

+ +
resources:
+  repositories:
+    - repository: another_repo
+      type: git
+      name: AzureDevOpsProjectName/another_repo
+
+steps:
+  - checkout: self
+    persistCredentials: true
+
+ +

Before checkout

+ +
pwd
+/home/vsts/work/1/s
+
+ls -lart /home/vsts/work/1
+total 24
+drwxr-xr-x 2 vsts docker 4096 Apr  3 21:14 s
+drwxr-xr-x 2 vsts docker 4096 Apr  3 21:14 b
+drwxr-xr-x 2 vsts docker 4096 Apr  3 21:14 a
+drwxr-xr-x 2 vsts docker 4096 Apr  3 21:14 TestResults
+drwxr-xr-x 6 vsts docker 4096 Apr  3 21:14 .
+drwxr-xr-x 7 vsts root   4096 Apr  3 21:14 ..
+
+ls -lart /home/vsts/work/1/s
+total 8
+drwxr-xr-x 6 vsts docker 4096 Apr  3 21:14 ..
+drwxr-xr-x 2 vsts docker 4096 Apr  3 21:14 .
+
+ + + + + + + + + + + + + + + + + + + + + + + + + + +
Predefined variable nameValueWhen
System.DefaultWorkingDirectory/home/vsts/work/1/sbefore checkout
Build.SourcesDirectory/home/vsts/work/1/sbefore checkout
Build.Repository.LocalPath/home/vsts/work/1/sbefore checkout
+ +

After checkout

+ +
pwd
+/home/vsts/work/1/s
+
+ls -lart /home/vsts/work/1
+total 24
+drwxr-xr-x 2 vsts docker 4096 Apr  3 21:14 b
+drwxr-xr-x 2 vsts docker 4096 Apr  3 21:14 a
+drwxr-xr-x 2 vsts docker 4096 Apr  3 21:14 TestResults
+drwxr-xr-x 7 vsts root   4096 Apr  3 21:14 ..
+drwxr-xr-x 6 vsts docker 4096 Apr  3 21:14 .
+drwxr-xr-x 4 vsts docker 4096 Apr  3 21:14 s
+
+ls -lart /home/vsts/work/1/s
+total 20
+drwxr-xr-x 6 vsts docker 4096 Apr  3 21:14 ..
+-rw-r--r-- 1 vsts docker    0 Apr  3 21:14 repo_cicd.md
+-rw-r--r-- 1 vsts docker  985 Apr  3 21:14 README.md
+drwxr-xr-x 3 vsts docker 4096 Apr  3 21:14 .azure-pipelines
+drwxr-xr-x 4 vsts docker 4096 Apr  3 21:14 .
+drwxr-xr-x 8 vsts docker 4096 Apr  3 21:14 .git
+
+ + + + + + + + + + + + + + + + + + + + + + + + + + +
Predefined variable nameValueWhen
System.DefaultWorkingDirectory/home/vsts/work/1/safter checkout
Build.SourcesDirectory/home/vsts/work/1/safter checkout
Build.Repository.LocalPath/home/vsts/work/1/safter checkout
+ +

Single self checkout with custom path

+ +
resources:
+  repositories:
+    - repository: another_repo
+      type: git
+      name: AzureDevOpsProjectName/another_repo
+
+steps:
+  - checkout: self
+    persistCredentials: true
+    path: $(Build.Repository.Name)
+
+ +

Before checkout

+ +
pwd
+/home/vsts/work/1/s
+
+ls -lart /home/vsts/work/1
+total 24
+drwxr-xr-x 2 vsts docker 4096 Apr  3 21:10 s
+drwxr-xr-x 2 vsts docker 4096 Apr  3 21:10 b
+drwxr-xr-x 2 vsts docker 4096 Apr  3 21:10 a
+drwxr-xr-x 2 vsts docker 4096 Apr  3 21:10 TestResults
+drwxr-xr-x 6 vsts docker 4096 Apr  3 21:10 .
+drwxr-xr-x 7 vsts root   4096 Apr  3 21:10 ..
+
+ls -lart /home/vsts/work/1/s
+total 8
+drwxr-xr-x 6 vsts docker 4096 Apr  3 21:10 ..
+drwxr-xr-x 2 vsts docker 4096 Apr  3 21:10 .
+
+ + + + + + + + + + + + + + + + + + + + + + + + + + +
Predefined variable nameValueWhen
System.DefaultWorkingDirectory/home/vsts/work/1/sbefore checkout
Build.SourcesDirectory/home/vsts/work/1/sbefore checkout
Build.Repository.LocalPath/home/vsts/work/1/sbefore checkout
+ +

After checkout

+ +
pwd
+/home/vsts/work/1/cicd
+
+ls -lart /home/vsts/work/1
+total 24
+drwxr-xr-x 2 vsts docker 4096 Apr  3 21:10 b
+drwxr-xr-x 2 vsts docker 4096 Apr  3 21:10 a
+drwxr-xr-x 2 vsts docker 4096 Apr  3 21:10 TestResults
+drwxr-xr-x 7 vsts root   4096 Apr  3 21:10 ..
+drwxr-xr-x 6 vsts docker 4096 Apr  3 21:10 .
+drwxr-xr-x 4 vsts docker 4096 Apr  3 21:10 cicd
+
+ls -lart /home/vsts/work/1/s
+ls: cannot access '/home/vsts/work/1/s': No such file or directory
+
+ + + + + + + + + + + + + + + + + + + + + + + + + + +
Predefined variable nameValueWhen
System.DefaultWorkingDirectory/home/vsts/work/1/cicdafter checkout
Build.SourcesDirectory/home/vsts/work/1/cicdafter checkout
Build.Repository.LocalPath/home/vsts/work/1/cicdafter checkout
+ +

No self checkout but single external checkout with default path

+ +
resources:
+  repositories:
+    - repository: another_repo
+      type: git
+      name: AzureDevOpsProjectName/another_repo
+
+steps:
+  - checkout: another_repo
+
+ +

Before checkout

+ +
pwd
+/home/vsts/work/1/s
+
+ls -lart /home/vsts/work/1
+total 24
+drwxr-xr-x 2 vsts docker 4096 Apr  3 21:25 s
+drwxr-xr-x 2 vsts docker 4096 Apr  3 21:25 b
+drwxr-xr-x 2 vsts docker 4096 Apr  3 21:25 a
+drwxr-xr-x 2 vsts docker 4096 Apr  3 21:25 TestResults
+drwxr-xr-x 6 vsts docker 4096 Apr  3 21:25 .
+drwxr-xr-x 7 vsts root   4096 Apr  3 21:25 ..
+
+ls -lart /home/vsts/work/1/s
+total 8
+drwxr-xr-x 6 vsts docker 4096 Apr  3 21:25 ..
+drwxr-xr-x 2 vsts docker 4096 Apr  3 21:25 .
+
+ + + + + + + + + + + + + + + + + + + + + + + + + + +
Predefined variable nameValueWhen
System.DefaultWorkingDirectory/home/vsts/work/1/sbefore checkout
Build.SourcesDirectory/home/vsts/work/1/sbefore checkout
Build.Repository.LocalPath/home/vsts/work/1/sbefore checkout
+ +

After checkout

+ +
pwd
+/home/vsts/work/1/s
+
+ls -lart /home/vsts/work/1
+total 24
+drwxr-xr-x 2 vsts docker 4096 Apr  3 21:25 b
+drwxr-xr-x 2 vsts docker 4096 Apr  3 21:25 a
+drwxr-xr-x 2 vsts docker 4096 Apr  3 21:25 TestResults
+drwxr-xr-x 7 vsts root   4096 Apr  3 21:25 ..
+drwxr-xr-x 6 vsts docker 4096 Apr  3 21:25 .
+drwxr-xr-x 4 vsts docker 4096 Apr  3 21:25 s
+
+ls -lart /home/vsts/work/1/s
+total 40
+drwxr-xr-x 6 vsts docker 4096 Apr  3 21:25 ..
+-rw-r--r-- 1 vsts docker  947 Apr  3 21:25 README.md
+drwxr-xr-x 8 vsts docker 4096 Apr  3 21:25 .git
+drwxr-xr-x 5 vsts docker 4096 Apr  3 21:25 repo_another_repo
+drwxr-xr-x 4 vsts docker 4096 Apr  3 21:25 .
+
+ + + + + + + + + + + + + + + + + + + + + + + + + + +
Predefined variable nameValueWhen
System.DefaultWorkingDirectory/home/vsts/work/1/safter checkout
Build.SourcesDirectory/home/vsts/work/1/safter checkout
Build.Repository.LocalPath/home/vsts/work/1/safter checkout
+ +

No self checkout but single external checkout with custom path

+ +

Please see following pipeline example, we define an external repository called another_repo, but we don’t checkout the self repository, and we only checkout this external repository.

+ +
resources:
+  repositories:
+    - repository: another_repo
+      type: git
+      name: AzureDevOpsProjectName/another_repo
+
+steps:
+  - checkout: another_repo
+    path: another_repo
+
+ +

Before checkout

+ +
pwd
+/home/vsts/work/1/s
+
+ls -lart /home/vsts/work/1
+total 24
+drwxr-xr-x 2 vsts docker 4096 Apr  3 20:52 s
+drwxr-xr-x 2 vsts docker 4096 Apr  3 20:52 b
+drwxr-xr-x 2 vsts docker 4096 Apr  3 20:52 a
+drwxr-xr-x 2 vsts docker 4096 Apr  3 20:52 TestResults
+drwxr-xr-x 6 vsts docker 4096 Apr  3 20:52 .
+drwxr-xr-x 7 vsts root   4096 Apr  3 20:52 ..
+
+ls -lart /home/vsts/work/1/s
+total 8
+drwxr-xr-x 6 vsts docker 4096 Apr  3 20:52 ..
+drwxr-xr-x 2 vsts docker 4096 Apr  3 20:52 .
+
+ + + + + + + + + + + + + + + + + + + + + + + + + + +
Predefined variable nameValueWhen
System.DefaultWorkingDirectory/home/vsts/work/1/sbefore checkout
Build.SourcesDirectory/home/vsts/work/1/sbefore checkout
Build.Repository.LocalPath/home/vsts/work/1/sbefore checkout
+ +

After checkout

+ +
pwd
+/home/vsts/work/1/another_repo
+
+ls -lart /home/vsts/work/1
+total 24
+drwxr-xr-x 2 vsts docker 4096 Apr  3 20:52 b
+drwxr-xr-x 2 vsts docker 4096 Apr  3 20:52 a
+drwxr-xr-x 2 vsts docker 4096 Apr  3 20:52 TestResults
+drwxr-xr-x 7 vsts root   4096 Apr  3 20:52 ..
+drwxr-xr-x 6 vsts docker 4096 Apr  3 20:53 .
+drwxr-xr-x 4 vsts docker 4096 Apr  3 20:53 another_repo
+
+ls -lart /home/vsts/work/1/s
+ls: cannot access '/home/vsts/work/1/s': No such file or directory
+
+ + + + + + + + + + + + + + + + + + + + + + + + + + +
Predefined variable nameValueWhen
System.DefaultWorkingDirectory/home/vsts/work/1/another_repoafter checkout
Build.SourcesDirectory/home/vsts/work/1/another_repoafter checkout
Build.Repository.LocalPath/home/vsts/work/1/another_repoafter checkout
+ +

No self checkout but multiple external checkout

+ +
resources:
+  repositories:
+    - repository: another_repo1
+      type: git
+      name: AzureDevOpsProjectName/another_repo1
+    - repository: another_repo2
+      type: git
+      name: AzureDevOpsProjectName/another_repo2
+
+steps:
+  - checkout: another_repo1
+    path: another_repo1
+  - checkout: another_repo2
+    path: another_repo2
+
+ +

Before checkout

+ +
pwd
+/home/vsts/work/1/s
+
+ls -lart /home/vsts/work/1
+total 24
+drwxr-xr-x 5 vsts docker 4096 Apr  3 20:59 s
+drwxr-xr-x 2 vsts docker 4096 Apr  3 20:59 b
+drwxr-xr-x 2 vsts docker 4096 Apr  3 20:59 a
+drwxr-xr-x 2 vsts docker 4096 Apr  3 20:59 TestResults
+drwxr-xr-x 6 vsts docker 4096 Apr  3 20:59 .
+drwxr-xr-x 7 vsts root   4096 Apr  3 20:59 ..
+
+ls -lart /home/vsts/work/1/s
+total 20
+drwxr-xr-x 2 vsts docker 4096 Apr  3 20:59 cicd
+drwxr-xr-x 2 vsts docker 4096 Apr  3 20:59 another_repo1
+drwxr-xr-x 2 vsts docker 4096 Apr  3 20:59 another_repo2
+drwxr-xr-x 6 vsts docker 4096 Apr  3 20:59 ..
+drwxr-xr-x 5 vsts docker 4096 Apr  3 20:59 .
+
+ + + + + + + + + + + + + + + + + + + + + + + + + + +
Predefined variable nameValueWhen
System.DefaultWorkingDirectory/home/vsts/work/1/sbefore checkout
Build.SourcesDirectory/home/vsts/work/1/sbefore checkout
Build.Repository.LocalPath/home/vsts/work/1/sbefore checkout
+ +

After checkout

+ +
pwd
+/home/vsts/work/1/s
+
+ls -lart /home/vsts/work/1
+total 32
+drwxr-xr-x 2 vsts docker 4096 Apr  3 20:59 b
+drwxr-xr-x 2 vsts docker 4096 Apr  3 20:59 a
+drwxr-xr-x 2 vsts docker 4096 Apr  3 20:59 TestResults
+drwxr-xr-x 7 vsts root   4096 Apr  3 20:59 ..
+drwxr-xr-x 4 vsts docker 4096 Apr  3 20:59 another_repo1
+drwxr-xr-x 3 vsts docker 4096 Apr  3 20:59 s
+drwxr-xr-x 8 vsts docker 4096 Apr  3 20:59 .
+drwxr-xr-x 4 vsts docker 4096 Apr  3 20:59 another_repo2
+
+ls -lart /home/vsts/work/1/s
+total 12
+drwxr-xr-x 2 vsts docker 4096 Apr  3 20:59 cicd
+drwxr-xr-x 3 vsts docker 4096 Apr  3 20:59 .
+drwxr-xr-x 8 vsts docker 4096 Apr  3 20:59 ..
+
+ + + + + + + + + + + + + + + + + + + + + + + + + + +
Predefined variable nameValueWhen
System.DefaultWorkingDirectory/home/vsts/work/1/sbefore checkout
Build.SourcesDirectory/home/vsts/work/1/sbefore checkout
Build.Repository.LocalPath/home/vsts/work/1/sbefore checkout
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2022/06/using-databricks-connect-inside-a-container.html b/2022/06/using-databricks-connect-inside-a-container.html new file mode 100644 index 00000000..9da6c30d --- /dev/null +++ b/2022/06/using-databricks-connect-inside-a-container.html @@ -0,0 +1,963 @@ + + + + + + +Using Databricks Connect inside a container - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 5 minute read + + + +

+ + +
+ + +
+ + + +

Why use Databricks Connect

+ +

From the very beginning of the Databricks Connect official doc, it says already that Databricks Connect has some limitations and is more or less deprecated in favor of dbx. But for some usages like local IDE live debug, Databricks Connect is still a very good tool where as dbx cannot do it at all. At the time of writing, dbx is mainly a Databricks jobs’ API wrapper to deploy and run Databricks jobs.

+ +

Note 2022-10-15, recently a Databricks blog introduced the project Spark Connect which aims to do very similar thing as Databricks Connect. This project is still in development, and is a part of Apache Spark, not Databricks specific.

+ +

A very important point to be taken into account is that if we plan to deploy production ready Databricks workflows, it’s recommended to use dbx. currently it’s not official supported by Databricks (version number starts with 0), but it’s good enough to use, I’ve already used it since several months. And as it’s a deployment tool, even if it bugs, it will be much less dangerous for production.

+ +

Just a quick helper information of dbx:

+ +
$ dbx --version
+[dbx][2022-10-15 22:43:24.265] 🧱Databricks eXtensions aka dbx, version ~> 0.7.6
+
+$ dbx --help
+
+ Usage: dbx [OPTIONS] COMMAND [ARGS]...
+
+ 🧱Databricks eXtensions aka dbx. Please find the main docs page here.
+
+╭─ Options ──────────────────────────────────────────────────────────────────────────╮
+│ --version                                                                          │
+│ --install-completion          Install completion for the current shell.            │
+│ --show-completion             Show completion for the current shell, to copy it or │
+│                               customize the installation.                          │
+│ --help                        Show this message and exit.                          │
+╰────────────────────────────────────────────────────────────────────────────────────╯
+╭─ Commands ─────────────────────────────────────────────────────────────────────────╮
+│ configure  🔧 Configures project environment in the current folder.                │
+│ deploy     📦 Deploy project to artifact storage.                                  │
+│ destroy    🚮 Delete defined workflows and relevant assets.                        │
+│ execute    🔥 Executes chosen workload on the interactive cluster.                 │
+│ init       💎 Generates new project from the template.                             │
+│ launch     🚀 Launch the workflow on a job cluster.                                │
+│ sync       🔄 Sync local files to Databricks and watch for changes.                │
+╰────────────────────────────────────────────────────────────────────────────────────╯
+
+ +

Using Databricks Connect outside a container

+ +

Just follow the official guide.

+ +

Using Databricks Connect inside a container

+ +

VSCode has a very nice feature that enable us to develop inside a container. As Databricks Connect needs some setup, we can leverage this feature to prepare a container that having everything pre-configured. When we need to do a live debug, just connect VSCode to the container, then set some breakpoints and start the debug.

+ +

We need following folder and files to use VSCode remote container:

+ +
.databricks-connect.template
+.devcontainer/
+├── Dockerfile
+└── devcontainer.json
+databricks_demo_job.py
+
+ +

And the content of each files:

+ +

.databricks-connect.template

+ +
// to find the config values: https://docs.databricks.com/dev-tools/databricks-connect.html#step-2-configure-connection-properties
+
+// inside the container, we can use `databricks-connect configure` to create this file, but it takes time, that's why we pre-created this file before container build.
+
+{
+  "host": "https://aaa.azuredatabricks.net/",
+  "token": "replacetoken",
+  "cluster_id": "abc",
+  "org_id": "111111",
+  "port": "15001"
+}
+
+ +

Dockerfile

+ +

My test is run in a Databricks cluster with the runtime 10.4, which is bound to Python 3.8. At the time of writing, Databricks only releases a beta version for the runtime 10.4: databricks-connect==10.4.0b0. In the future, as per the official doc, it would be better to use the convention databricks-connect==10.4.*.

+ +

The official doc says also that only OpenJDK 8 JRE is supported by the Databricks Connect client. But default-jre installed in the Dockerfile is for python:3.8, which is bound to 3.8-bullseye, which means the JRE version is v11. If we encounter some bugs when using Databricks Connect, we might need to install OpenJDK 8 JRE.

+ +

ENV SPARK_HOME is tested from my Python:3.8 image, once in the container, run the command databricks-connect get-spark-home to check if it’s the same. If not, update the Dockerfile.

+ +
# https://github.com/microsoft/vscode-dev-containers/blob/main/containers/python-3/.devcontainer/Dockerfile
+
+# [Choice] Python version (use -bullseye variants on local arm64/Apple Silicon): 3, 3.10, 3.9, 3.8, 3.7, 3.6, 3-bullseye, 3.10-bullseye, 3.9-bullseye, 3.8-bullseye, 3.7-bullseye, 3.6-bullseye, 3-buster, 3.10-buster, 3.9-buster, 3.8-buster, 3.7-buster, 3.6-buster
+
+ARG VARIANT="3.8"
+FROM mcr.microsoft.com/vscode/devcontainers/python:${VARIANT}
+
+ARG DEV_DATABRICKS_TOKEN
+
+COPY .databricks-connect.template /home/vscode/.databricks-connect
+
+RUN && sudo apt update \
+    && sudo apt-get install -y default-jre \
+    && pip install databricks-connect==10.4.0b0 \
+    && pip install -U pip \
+    && sed -i "s/replacetoken/${DEV_DATABRICKS_TOKEN}/g" /home/vscode/.databricks-connect
+
+ENV SPARK_HOME /usr/local/lib/python3.8/site-packages/pyspark
+
+ +

devcontainer.json

+ +
// Config options: https://aka.ms/devcontainer.json
+
+// File example: https://github.com/microsoft/vscode-dev-containers/blob/main/containers/python-3/.devcontainer/devcontainer.json
+{
+
+  "name": "Python 3",
+  "build": {
+    "dockerfile": "Dockerfile",
+    "context": "..",
+    "args": {
+      // Update 'VARIANT' to pick a Python version: 3, 3.10, 3.9, 3.8, 3.7, 3.6
+      // Append -bullseye or -buster to pin to an OS version.
+      // Use -bullseye variants on local on arm64/Apple Silicon.
+      "VARIANT": "3.10-bullseye",
+      "DEV_DATABRICKS_TOKEN": "${localEnv:DEV_DATABRICKS_TOKEN}"
+    }
+  },
+
+  // Configure tool-specific properties.
+  "customizations": {
+    // Configure properties specific to VS Code.
+    "vscode": {
+      // Set *default* container specific settings.json values on container create.
+      "settings": {
+        "python.defaultInterpreterPath": "/usr/local/bin/python",
+        "python.linting.enabled": true,
+        "python.linting.pylintEnabled": true,
+        "python.formatting.autopep8Path": "/usr/local/py-utils/bin/autopep8",
+        "python.formatting.blackPath": "/usr/local/py-utils/bin/black",
+        "python.formatting.yapfPath": "/usr/local/py-utils/bin/yapf",
+        "python.linting.banditPath": "/usr/local/py-utils/bin/bandit",
+        "python.linting.flake8Path": "/usr/local/py-utils/bin/flake8",
+        "python.linting.mypyPath": "/usr/local/py-utils/bin/mypy",
+        "python.linting.pycodestylePath": "/usr/local/py-utils/bin/pycodestyle",
+        "python.linting.pydocstylePath": "/usr/local/py-utils/bin/pydocstyle",
+        "python.linting.pylintPath": "/usr/local/py-utils/bin/pylint"
+      },
+
+      // Add the IDs of extensions you want installed when the container is created.
+      "extensions": ["ms-python.python", "ms-python.vscode-pylance"]
+    }
+  },
+
+  // Use 'forwardPorts' to make a list of ports inside the container available locally.
+  // "forwardPorts": [],
+
+  // Use 'postCreateCommand' to run commands after the container is created.
+  // "postCreateCommand": "pip3 install --user -r requirements.txt",
+
+  // Comment out to connect as root instead. More info: https://aka.ms/vscode-remote/containers/non-root.
+  "remoteUser": "vscode"
+}
+
+ +

databricks_demo_job.py

+ +
# example taken from: https://docs.databricks.com/dev-tools/databricks-connect.html#access-dbutils
+
+from pyspark.sql import SparkSession
+from pyspark.dbutils import DBUtils
+
+spark = SparkSession.builder.getOrCreate()
+
+dbutils = DBUtils(spark)
+print(dbutils.fs.ls("dbfs:/"))
+print(dbutils.secrets.listScopes())
+
+ +

env var DEV_DATABRICKS_TOKEN

+ +

As you can see, in the file .databricks-connect.template, there’s a line "token": "replacetoken",. +In fact, during the build of the Dockerfile, it will replace the string replacetoken by the value of the env var DEV_DATABRICKS_TOKEN. So we need to create this env var in advance.

+ +

Test

+ +
    +
  1. From VSCode, type F1, choose Remote-Containers: Reopen in Container, VSCode will open a new instance. If you check the lower left corner of VSCode, you’ll see Dev Container: Python 3.
  2. +
  3. Run cat ~/.databricks-connect, you should see the correct config.
  4. +
  5. Run databricks-connect test, it should not raise any error, and might have the phrase * All tests passed. in the end. If the cluster is not started yet, it could take some time during this step.
  6. +
  7. Set a breakpoint in the file databricks_demo_job.py, type F5, have fun.
  8. +
+ + +
+ + + + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2022/07/azure-pipeline-conditions.html b/2022/07/azure-pipeline-conditions.html new file mode 100644 index 00000000..5358e375 --- /dev/null +++ b/2022/07/azure-pipeline-conditions.html @@ -0,0 +1,825 @@ + + + + + + +Azure pipeline conditions - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +
+ + +
+ + + +

Azure pipeline has two kinds of conditions:

+ +
    +
  1. With keyword condition +
  2. +
  3. With jinja like format ${{if elseif else}} +
  4. +
+ +

In both syntax, we have use parameters and variables, but there’s a big difference between them which makes DevOps frustrated.

+ +

Conditions with keyword $

+ +

With ${{if elseif else}} condition, the using parameters and variables’ values are calculated during the compilation/parsing/loading time, which means:

+ +
    +
  • Even if you define a variable before the ${{if elseif else}} block, but the condition is always evaluated to false if you use this variable in the condition, as it considers the value doesn’t exist yet during the compilation, so if you have a - ${{ else }} block, it will always be executed.
  • +
  • In a template, unless the parameters’ values can be calculated from the loading time, otherwise they’re always evaluated to its default value, if the default value is not defined, Azure pipeline will not raise any error, the condition check just returns always false, so the pipeline will never run into it except for the - ${{ else }} block.
  • +
  • But in a root pipeline out of template, it’s the real parameter value being evaluated in the ${{if elseif else}} block.
  • +
  • Some predefined variables cannot be used in ${{if elseif else}} neither, check the column Available in templates? in the Use predefined variables doc, which means these values are always evaluated to null.
  • +
  • When evaluated to false, the tasks, scripts, etc. wont even be shown as skipped in the Azure pipelines UI, they’re just not shown.
  • +
  • The official doc calls the parameters as runtime parameters, but in fact they’re runtime only when they’re not in a template.
  • +
+ +

Conditions with keyword condition

+ +

The official doc puts the condition keyword format in the Jobs and stages level, but in fact, we can also use it in tasks or scripts level.

+ +
    +
  • Same as to ${{if elseif else}} condition, if you use parameters in condition keyword conditions, it’s value is calculated in the compilation time, so be careful with their usages.
  • +
  • +Variables in the conditions are evaluated in real time, this is the only point that make DevOps happy.
  • +
  • If you really want to evaluate in real time the parameters, the workaround is to add a script task in advance that define some variables taking the values of parameters, and then use these variables in the conditions with condition keyword.
  • +
  • When evaluated to false, the tasks, scripts, etc. bound by the conditions will be shown as skipped in the Azure pipelines UI.
  • +
  • As condition keyword is bound to a single task, script, jobs, stages, etc., if you want to for example run 3 tasks under the same condition, you need to add the same condition to the 3 tasks respectively, whereas with ${{if elseif else}}, we can group the 3 tasks under the same condition, but as explained above, the values of compared parameters or variables referenced in the ${{if elseif else}} format conditions are evaluated during the compilation/loading time, so ${{if elseif else}} will not work for all the use cases, this is the biggest pity of Azure Pipeline from my point of view.
  • +
  • We can add condition to jobs, and inside the jobs, we can have multiple tasks, this could be a workaround of above pity if we do not want add condition to each task with the same condition.
  • +
+ +

A table to sum up

+ + + + + + + + + + + + + + + + + + + + + +
Inputs \ Conditions$ keywordcondition keyword
parametercompilation/parsing/loading timecompilation/parsing/loading time
variablecompilation/parsing/loading timereal time (except for some predefined variables)
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2022/07/databricks-job-context.html b/2022/07/databricks-job-context.html new file mode 100644 index 00000000..1848e529 --- /dev/null +++ b/2022/07/databricks-job-context.html @@ -0,0 +1,843 @@ + + + + + + +Databricks job/task context - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +
+ + +
+ + + +

Suppose we’re running following job/task in a Azure Databricks workspace:

+ +
jobId: "1111"
+jobRunId: "2222"
+taskRunId: "3333"
+jobName: "ths job name"
+taskName: "first-task"
+databricksWorkspaceUrl: https://adb-4444444444.123.azuredatabricks.net/
+
+ +

Run below command in a Databricks job (task precisely):

+ +
dbutils.notebook.entry_point.getDbutils().notebook().getContext().toJson()
+
+ +

We will get following json:

+ +
{
+    "rootRunId": null,
+    "currentRunId": null,
+    "jobGroup": "7777777777777777777_8888888888888888888_job-1111-run-3333-action-9999999999999999",
+    "tags": {
+        "jobId": "1111", # job id
+        "jobName": "ths job name",
+        "jobClusterKey": "ths job name",
+        "multitaskParentRunId": "2222", # this is the job run id
+        "taskKey": "first-task", # task name
+        "jobRunOriginalAttempt": "3333", # first task run id
+        "jobRunAttempt": "3333",
+        "idInJob": "3333",
+        "runId": "3333", # current task run id, could be different to `jobRunOriginalAttempt` if retry on failure
+        "jobOwnerId": "01010101010101",
+        "opId": "ServerBackend-5fe4478cdfb206ba",
+        "jobFallbackOndemand": "true",
+        "opTarget": "com.databricks.backend.common.rpc.InternalDriverBackendMessages$StartRepl",
+        "taskDependencies": "[]",
+        "eventName": "runExecution",
+        "serverBackendName": "com.databricks.backend.daemon.driver.DriverCorral",
+        "projectName": "driver",
+        "jobClusterNumContainers": "1",
+        "jobMiscMessage": "In run",
+        "jobTriggerTime": "1659015591689",
+        "buildHash": "a2e5769182f120d638a865bc99430452da7670de",
+        "effectiveSparkVersion": "",
+        "sparkVersion": "",
+        "userProvidedSparkVersion": "10.4.x-cpu-ml-scala2.12",
+        "jobTriggerSource": "DbScheduler",
+        "host": "1.2.3.4",
+        "clusterId": "0728-133953-i3676wgl",
+        "hostName": "0728-133953-i3676wgl-1-2-3-4",
+        "jettyRpcJettyVersion": "9",
+        "orgId": "4444444444", # the id in the Databricks workspace url https://adb-{orgId}.{randomNumber}.azuredatabricks.net/
+        "jobType": "NORMAL",
+        "jobTimeoutSec": "0",
+        "maxConcurrentRuns": "10",
+        "rootOpId": "ServiceMain-1ffca09fcc660002",
+        "jobClusterType": "job_cluster",
+        "executorName": "ActiveRunMonitor-job-run-pool",
+        "jobUseSpot": "true",
+        "jobTerminalState": "Running",
+        "userId": "01010101010101", # user id in Databricks, same as jobOwnerId in this example as the job is running by the job owner
+        "jobTriggerId": "0",
+        "opType": "ServerBackend",
+        "jobTriggerType": "manual",
+        "jobTaskType": "python",
+        "isGitRun": "false",
+        "user": "00000000-0000-0000-0000-000000000000", # user name or sp id, or etc.
+        "parentOpId": "RPCClient-1ffca09fcc6602f4",
+        "jettyRpcType": "InternalDriverBackendMessages$DriverBackendRequest"
+    },
+    "extraContext": {
+        "notebook_path": "dbfs:/dbx/my_repo_unique_name/f80372effd494fd79d3831d69fb5d3cd/artifacts/repo_name/tasks/first/entrypoint.py",
+        "api_url": "https://westeurope.azuredatabricks.net", # ! This is not the Databricks workspace URL where the job is running, I find nowhere having the full Databricks workspace URL, `orgId` is not enough, as there's a random number right after it in the URL.
+        "api_token": "[REDACTED]",
+        "non_uc_api_token": ""
+    },
+    "credentialKeys": [
+        "adls_aad_token",
+        "adls_gen2_aad_token",
+        "synapse_aad_token"
+    ]
+}
+
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2022/08/azure-pipeline-jobs.html b/2022/08/azure-pipeline-jobs.html new file mode 100644 index 00000000..4f8e1036 --- /dev/null +++ b/2022/08/azure-pipeline-jobs.html @@ -0,0 +1,849 @@ + + + + + + +Azure pipeline jobs - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +
+ + +
+ + + +

Traditional jobs vs deployment jobs

+ +
    +
  • +traditional jobs run in parallel,
  • +
  • +deployment jobs run in sequence, save the deployment history to a environment and a resource, and can also be applied with deployment strategy (runOnce, rolling, and the canary)
  • +
+ +

Deployment jobs

+ +

Tracking deployment history

+ +

As per example given here: we can use RunOnce deployment strategy to create some environments with empty resources and use that as an abstract shell to record deployment history, as the deployment history is across pipelines, down to a specific resource and status of the deployments for auditing.

+ +

Sharing output variables

+ +

The syntax is here.

+ +

Be careful that we must provide the <lifecycle-hookname> in the outputs part. In the below example we can see that the deployement A is specified twice: $[ dependencies.A.outputs['A.setvarStep.myOutputVar'] ]

+ +
# Set an output variable in a lifecycle hook of a deployment job executing runOnce strategy.
+- deployment: A
+  pool:
+    vmImage: 'ubuntu-latest'
+  environment: staging
+  strategy:
+    runOnce:
+      deploy:
+        steps:
+        - bash: echo "##vso[task.setvariable variable=myOutputVar;isOutput=true]this is the deployment variable value"
+          name: setvarStep
+        - bash: echo $(setvarStep.myOutputVar)
+          name: echovar
+
+# Map the variable from the job.
+- job: B
+  dependsOn: A
+  pool:
+    vmImage: 'ubuntu-latest'
+  variables:
+    myVarFromDeploymentJob: $[ dependencies.A.outputs['A.setvarStep.myOutputVar'] ]
+  steps:
+  - script: "echo $(myVarFromDeploymentJob)"
+    name: echovar
+
+ +

When you output a variable from a deployment job, referencing it from the next job uses different syntax depending on if you want to set a variable or use it as a condition for the stage.

+ +
stages:
+- stage: StageA
+  jobs:
+  - job: A1
+    steps:
+      - pwsh: echo "##vso[task.setvariable variable=RunStageB;isOutput=true]true"
+        name: setvarStep
+      - bash: echo $(System.JobName)
+
+- stage: StageB
+  dependsOn:
+    - StageA
+
+  # when used in a condition, job name `A1` is included in variable path.
+  condition: eq(dependencies.StageA.outputs['A1.setvarStep.RunStageB'], 'true')
+
+  # when use to set a variable, jon name `A1` is not included in the variable path.
+  variables:
+    myOutputVar: $[stageDependencies.StageA.A1.outputs['setvarStep.RunStageB']]
+  jobs:
+  - deployment: B1
+    pool:
+      vmImage: 'ubuntu-latest'
+    environment: envB
+    strategy:
+      runOnce:
+        deploy:
+          steps:
+          - bash: echo $(myOutputVar)
+
+ +

Here is the doc for defining variables.

+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2022/09/adding-data-files-to-python-package-with-setup-py.html b/2022/09/adding-data-files-to-python-package-with-setup-py.html new file mode 100644 index 00000000..55478f86 --- /dev/null +++ b/2022/09/adding-data-files-to-python-package-with-setup-py.html @@ -0,0 +1,925 @@ + + + + + + +Adding data files to Python package with setup.py - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +
+ + +
+ + + +

setup.py vs pyproject.toml

+ +

pyproject.toml is the new Python project metadata specification standard since PEP 621. As per PEP 517, and as per one of the comments of this StackOverflow thread, in some rare cases, we might have a chicken and egg problem when using setup.py if it needs to import something from the package it’s building. The only thing that pyproject.toml cannot achieve for the moment is the installation in editable mode, where we must use setup.py. Another advantage of setup.py is that we can compute some variables dynamically during the build time as it’s a Python file.

+ +

Nevertheless, setup.py is still a widely used solid tool to build Python package. This post will discuss how to add data files (non Python files) to a Python wheel package built by setup.py, the source distribution files (sdist .tar.gz files, .zip for Windows) are not covered by this post.

+ +

Adding data files

+ +

With parameter package_data for files inside a package

+ +

Official doc: https://docs.python.org/3/distutils/setupscript.html#installing-package-data

+ +

package_data accepts wildcard, but from the given example, the data files must exist inside a Python module folder (coexist with file __init__.py), you cannot use package_data to include files from non module folders, for e.g. the folder conf where there’s no __init__.py file inside.

+ +
setup.py
+conf/
+    conf.json
+src/
+    mypkg/
+        __init__.py
+        module.py
+        data/
+            tables.dat
+            spoons.dat
+            forks.dat
+
+ +
setup(...,
+      packages=['mypkg'],
+      package_dir={'mypkg': 'src/mypkg'},
+      package_data={'mypkg': ['data/*.dat']},
+      )
+
+ +

With parameter data_files for any files

+ +

official doc: https://docs.python.org/3/distutils/setupscript.html#installing-additional-files

+ +

distutils is deprecated, and will be remove in Python 3.12 as per PEP 632, the migration path is to simply use setuptools.

+ +
setup(...,
+    data_files=[
+        ('bitmaps', ['bm/b1.gif', 'bm/b2.gif']),
+        ('config', ['cfg/data.cfg']),
+        ({dest_folder_path_in_wheel}, [{source_file_path_relative_to_setup.py_script}]),
+    ],
+)
+
+ +

From the above example, we can see that:

+ +
    +
  1. +data_files accepts any files from any folder, in contrast to package_data which accepts files inside a package folder.
  2. +
  3. +data_files takes files one by one, we can not use the wildcard like * to specify a set of source files.
  4. +
  5. after build, there’s a .whl wheel file generated, the source_file_path_relative_to_setup will be added to the path {package_name}-{package_version}.data/data/{dest_folder_path_in_wheel}/{source_file_name}, and the Python files are added to {module_name}/{python_package_original_path}. If you want to put the data files at the original path, you need to replace {dest_folder_path_in_wheel} with ../../{data_files_original_path}, the first two .. is just to escape two folder levels from {package_name}-{package_version}.data/data/.
  6. +
+ +

With file MANIFEST.in

+ +

From my understanding and tests, MANIFEST.in file is only for sdist, so out of the scope of this post which talks about bdist wheel package only.

+ +

Parameter zip_safe

+ +

If you’re using old-fashion egg file, to reference data files inside package, should put zie_safe=False during built. Otherwise, for modern Python packaging, this parameter is obsolete.

+ +

Loading data files

+ +

A very good sum-up can be found in this StackOverflow thread.

+ +

Loading data files packaged by package_data

+ +
    +
  • +

    With importlib.resources, importlib.metadata or their backports importlib_resources importlib_metadata.

    + +
    +
    # to read file from module_a/folder_b/file.json
    +import importlib.resources
    +import json
    +
    +# open_text is deprecated in Python3.11 as only support files in Python modules
    +# see below example how to use `importlib.resources.files`
    +json.load(importlib.resources.open_text("module_a.folder_b", "file.json"))
    +
    + +

    Check this doc for migration from pkg_resources.

    +
  • +
  • +

    With deprecated pkg_resources from setuptools of pypa.io, and some examples from here or here.

    + +

    pkg_resources is deprecated due to some performance issue, and also need to install third-party setuptools for the run which should only be used during the build.

    + +
    +
    # to read file from module_a/folder_b/file.json
    +import json
    +import pkg_resources
    +
    +json.load(pkg_resources.resource_stream("module_a", "folder_b/file.json"))
    +
    +
  • +
+ +

Loading data files packaged by data_files

+ +

As data files packaged by data_files parameter could be in any folder, not necessarily inside a Python module with __init__ file, in such case the new importlib.resources.open_textcan not be used anymore, and indeed marked as deprecated in Python 3.11.

+ +
    +
  • +

    Use stdlib importlib.resources.files to read file from module_a/folder_b/file.json

    + +

    This method can also be used to load data files packaged by package_data

    + +
    +
    try:
    +    # new stdlib in Python3.9
    +    from importlib.resources import files
    +except ImportError:
    +    # third-party package, backport for Python3.9-,
    +    # need to add importlib_resources to requirements
    +    from importlib_resources import files
    +import json
    +
    +# with `data_files` in `setup.py`,
    +# we can specify where to put the files in the wheel package,
    +# so inside the module_a for example
    +with open(files(module_a).joinpath("folder_b/file.json")) as f:
    +    print(json.load(f))
    +
    +
  • +
  • +

    Use deprecated third-party pkg_resources to read file from module_a/folder_b/file.json

    + +
    +
    import json
    +import pkg_resources
    +
    +# use `data_files` in `setup.py`, we can specify where to put the files,
    +# so inside the module_a for example
    +json.load(pkg_resources.resource_stream("module_a", "folder_b/file.json"))
    +
    +
  • +
  • +

    Use stdlib pkgtuil.get_data

    + +

    You can find an example in this StackOverflow thread. All the answers and the comments are worth reading. Be aware that pkgutil.get_date() could be deprecated too one day.

    +
  • +
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2022/09/azure-pipeline-system-access-token-in-shared-pipeline.html b/2022/09/azure-pipeline-system-access-token-in-shared-pipeline.html new file mode 100644 index 00000000..e4cf5e30 --- /dev/null +++ b/2022/09/azure-pipeline-system-access-token-in-shared-pipeline.html @@ -0,0 +1,799 @@ + + + + + + +Azure pipeline System.AccessToken in shared pipeline - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +
+ + +
+ + + +

Var $(System.AccessToken)

+ +

System.AccessToken is a special variable that carries the security token used by the running build. If you check the doc of job authorization scope, you might think the var $(System.AccessToken) has by default the access to all the repositories in the same project where hosts the calling Azure pipeline. But unfortunately, it’s only partially right.

+ +

Problem

+ +

Suppose following situation:

+ +

ProjectA.RepoOne.PipelineOne: The Azure DevOps repository RepoOne in the Azure DevOps project ProjectA has a pipeline PipelineOne, the PipelineOne just gets the repositoryId by repositoryName, behind the scenes, it calls the Azure DevOps API. We need to provide an access token to call this API, in our case, we use the built-in $(System.AccessToken).

+ +

After the test, if we give RepoOne as the repository name, the pipeline works well, and returns the repositoryId of the repository RepoOne. But if we give another repository name (for e.g. RepoTwo), which is in the same project ProjectA, you will get an error something like:

+ +
401 Client Error: Unauthorized for url: https://almsearch.dev.azure.com/...
+
+ +

Root cause

+ +

This is because although the $(System.AccessToken) is designed to have access to all the repositories in the same project, there’s still another level of security control which blocks the API call, which is the pipeline level permission.

+ +

Solution

+ +

To fix this, one of the solutions is to add the target repository as repositories resource in the PipelineOne yaml file:

+ +
resources:
+  repositories:
+    - repository: RepoTwo
+      type: git
+      name: ProjectA/RepoTwo
+
+ +

When we re-run PipelineOne, this time the pipeline will be in pending for asking for the permission to access to the RepoTwo repository, we need to manually click on the permit button to grant this access, and then the pipeline will succeed as expected.

+ +

The repositories resource does not accept variables in the repository and name values which makes the pipeline authoring a little bit sticky. We must write letter by letter the project name and repository name in string, so we need to declare as many repositories resources as the repositories in the same project on which we want to apply the PipelineOne.

+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2022/09/databricks-cluster-access-mode.html b/2022/09/databricks-cluster-access-mode.html new file mode 100644 index 00000000..6b55a398 --- /dev/null +++ b/2022/09/databricks-cluster-access-mode.html @@ -0,0 +1,1020 @@ + + + + + + +Databricks cluster access mode - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 4 minute read + + + +

+ + +
+ + +
+ + + +

What is cluster access mode

+ +

Just a copy from Azure Databricks official doc:

+ +

Amazon Databricks official doc has less info on access mode.

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
Access ModeVisible to userUC SupportSupported LanguagesNotes
Single UserAlwaysYesPython, SQL, Scala, RCan be assigned to and used by a single user only. Dynamic views are not supported. Credential passthrough is not supported.
SharedAlways (Premium plan required)YesPython, SQLInit scripts, third-party libraries, and JARS are not supported. Credential passthrough is not supported.
No Isolation SharedHidden (Enforce User Isolation required)NoPython, SQL, Scala, RAdmin console configuration required to enforce user isolation
CustomHidden (For all new clusters)NoPython, SQL, Scala, RThis option is shown only if you have existing clusters without a specified access mode.
+ +

Single User mode is easy to understand, the cluster is reserved to a single user, other user cannot use it.

+ +

Custom mode is often seen in job cluster, which means cluster created by a job running in a cluster pool for example, because when creating a cluster pool, there’s no option for access mode.

+ +

This post will talk about Shared and No Isolation Shared access modes.

+ +

All the below examples were tested on a cluster with Databricks runtime v10.4 LTS (Scala 2.12 Spark 3.2.1).

+ +

+Shared access mode

+ +

From two different users, running the same command python -m site, I got two different results.

+ +
    +
  • in a notebook from user1, the mapped user is spark-6166cfd7-9154-4017-b0ff-89:
  • +
+ +
%%sh
+whoami
+echo ======
+which python
+echo ======
+python -m site
+
+# outputs:
+spark-6166cfd7-9154-4017-b0ff-89
+======
+/databricks/python3/bin/python
+======
+sys.path = [
+    '/home/spark-6166cfd7-9154-4017-b0ff-89',
+    '/databricks/spark/python',
+    '/databricks/spark/python/lib/py4j-0.10.9.1-src.zip',
+    '/databricks/jars/spark--driver--driver-spark_3.2_2.12_deploy.jar',
+    '/WSFS_NOTEBOOK_DIR',
+    '/databricks/python_shell',
+    '/usr/lib/python38.zip',
+    '/usr/lib/python3.8',
+    '/usr/lib/python3.8/lib-dynload',
+    '/databricks/python3/lib/python3.8/site-packages',
+    '/usr/local/lib/python3.8/dist-packages',
+    '/usr/lib/python3/dist-packages',
+]
+USER_BASE: '/home/spark-6166cfd7-9154-4017-b0ff-89/.local' (exists)
+USER_SITE: '/home/spark-6166cfd7-9154-4017-b0ff-89/.local/lib/python3.8/site-packages' (doesn't exist)
+ENABLE_USER_SITE: True
+
+ +
    +
  • in a notebook from user2, the mapped user is spark-5a9eefa7-49d3-4176-9805-1e:
  • +
+ +
%%sh
+whoami
+echo ======
+which python
+echo ======
+python -m site
+
+# outputs:
+spark-5a9eefa7-49d3-4176-9805-1e
+======
+/databricks/python3/bin/python
+======
+sys.path = [
+    '/home/spark-5a9eefa7-49d3-4176-9805-1e',
+    '/databricks/spark/python',
+    '/databricks/spark/python/lib/py4j-0.10.9.1-src.zip',
+    '/databricks/jars/spark--driver--driver-spark_3.2_2.12_deploy.jar',
+    '/WSFS_NOTEBOOK_DIR',
+    '/databricks/python_shell',
+    '/usr/lib/python38.zip',
+    '/usr/lib/python3.8',
+    '/usr/lib/python3.8/lib-dynload',
+    '/databricks/python3/lib/python3.8/site-packages',
+    '/usr/local/lib/python3.8/dist-packages',
+    '/usr/lib/python3/dist-packages',
+]
+USER_BASE: '/home/spark-5a9eefa7-49d3-4176-9805-1e/.local' (exists)
+USER_SITE: '/home/spark-5a9eefa7-49d3-4176-9805-1e/.local/lib/python3.8/site-packages' (doesn't exist)
+ENABLE_USER_SITE: True
+
+ +
    +
  • Pip install a third party Python module will fail
  • +
+ +

Below example demonstrates the phrase “Init scripts, third-party libraries, and JARS are not supported” in the above table.

+ +
%%sh
+pip install requests==2.26.0
+# same error message for: `python -m pip install requests==2.26.0 --user`,
+# except for there's no the first phrase: "Defaulting to user installation because normal site-packages is not writeable"
+
+Defaulting to user installation because normal site-packages is not writeable
+Looking in indexes: https://[REDACTED]:****@[REDACTED]/_packaging/[REDACTED]/pypi/simple/
+Collecting requests==2.26.0
+  Downloading https://[REDACTED]/_packaging/daa86ee5-06b8-417b-bc88-e64e3e2eef29/pypi/download/requests/2.26/requests-2.26.0-py2.py3-none-any.whl (62 kB)
+Requirement already satisfied: certifi>=2017.4.17 in /databricks/python3/lib/python3.8/site-packages (from requests==2.26.0) (2020.12.5)
+Requirement already satisfied: urllib3<1.27,>=1.21.1 in /databricks/python3/lib/python3.8/site-packages (from requests==2.26.0) (1.25.11)
+Requirement already satisfied: idna<4,>=2.5 in /databricks/python3/lib/python3.8/site-packages (from requests==2.26.0) (2.10)
+Collecting charset-normalizer~=2.0.0
+  Downloading https://[REDACTED]/_packaging/daa86ee5-06b8-417b-bc88-e64e3e2eef29/pypi/download/charset-normalizer/2.0.12/charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
+ERROR: Will not install to the user site because it will lack sys.path precedence to requests in /databricks/python3/lib/python3.8/site-packages
+WARNING: You are using pip version 21.0.1; however, version 22.2.2 is available.
+You should consider upgrading via the '/databricks/python3/bin/python -m pip install --upgrade pip' command.
+CalledProcessError: Command 'b'pip install requests==2.26.0\n'' returned non-zero exit status 1.
+
+ +

+No Isolation Shared access mode

+ +

Update 2023-02-01, I retested the No Isolation Shared access mode today, it seems that something has been changed at Databricks level.

+ +

Hereunder the new behavior:

+ +
    +
  1. The user is still root, but the Python binary is not a system one, instead an isolated venv is used, and pip install occurs in the venv too.
  2. +
  3. For the same user, each time we re-attach to the cluster, the venv path is changed. And therefore, previous pip install is discarded.
  4. +
+ +
%%sh
+whoami
+echo ======
+which python
+echo ======
+python -m site
+
+# outputs:
+root
+======
+/local_disk0/.ephemeral_nfs/envs/pythonEnv-76eac499-b8f2-451c-ac6a-88f9a68fcae7/bin/python
+======
+sys.path = [
+    '/databricks/driver',
+    '/databricks/spark/python',
+    '/databricks/spark/python/lib/py4j-0.10.9.5-src.zip',
+    '/databricks/jars/spark--driver--driver-spark_3.3_2.12_deploy.jar',
+    '/WSFS_NOTEBOOK_DIR',
+    '/databricks/jars/spark--maven-trees--ml--11.x--graphframes--org.graphframes--graphframes_2.12--org.graphframes__graphframes_2.12__0.8.2-db1-spark3.2.jar',
+    '/databricks/python_shell',
+    '/usr/lib/python39.zip',
+    '/usr/lib/python3.9',
+    '/usr/lib/python3.9/lib-dynload',
+    '/local_disk0/.ephemeral_nfs/envs/pythonEnv-76eac499-b8f2-451c-ac6a-88f9a68fcae7/lib/python3.9/site-packages',
+    '/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.9/site-packages',
+    '/databricks/python/lib/python3.9/site-packages',
+    '/usr/local/lib/python3.9/dist-packages',
+    '/usr/lib/python3/dist-packages',
+    '/databricks/.python_edge_libs',
+]
+USER_BASE: '/root/.local' (exists)
+USER_SITE: '/root/.local/lib/python3.9/site-packages' (doesn't exist)
+ENABLE_USER_SITE: False
+
+ +

Below is the test result on 2022-09-20:

+ +

In contrast to Shared mode, within the No Isolation Shared mode, running the same commands, I got the same results from two different users. +We can find that all the users are logged as root account.

+ +
%%sh
+whoami
+echo ======
+which python
+echo ======
+python -m site
+
+# outputs:
+root
+======
+/databricks/python3/bin/python
+======
+sys.path = [
+    '/databricks/driver',
+    '/databricks/spark/python',
+    '/databricks/spark/python/lib/py4j-0.10.9-src.zip',
+    '/databricks/jars/spark--driver--driver-spark_3.1_2.12_deploy.jar',
+    '/WSFS_NOTEBOOK_DIR',
+    '/databricks/python_shell',
+    '/usr/lib/python38.zip',
+    '/usr/lib/python3.8',
+    '/usr/lib/python3.8/lib-dynload',
+    '/databricks/python3/lib/python3.8/site-packages',
+    '/usr/local/lib/python3.8/dist-packages',
+    '/usr/lib/python3/dist-packages',
+]
+USER_BASE: '/root/.local' (exists)
+USER_SITE: '/root/.local/lib/python3.8/site-packages' (doesn't exist)
+ENABLE_USER_SITE: True
+
+ +
    +
  • Pip install a third party Python module will succeed
  • +
+ +

Conclusion

+ +
    +
  • +Shared access mode maps different users to different user space, their environments are isolated, but they cannot install any additional packages or modules.
  • +
  • +No Isolation Shared access mode maps all the users to the root account, everything is shared, they can install anything, but the changes imply to all users. After cluster restart, all the additional installations are purged. So maybe one project per cluster is a choice.
  • +
  • Another good choice is to use the non-interactive job cluster with a cluster pool, where the cluster pool is shared, but any user can install anything (can be limited by cluster policy), and the installation is isolated at job level. Which means even two jobs are created by the same user, the two jobs will use different environments (VMs with Databricks runtime container re-deployed in the cluster pool after each job run).
  • +
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2022/11/azure-pipeline-delete-blobs-from-blob-storage.html b/2022/11/azure-pipeline-delete-blobs-from-blob-storage.html new file mode 100644 index 00000000..ead3f3d0 --- /dev/null +++ b/2022/11/azure-pipeline-delete-blobs-from-blob-storage.html @@ -0,0 +1,871 @@ + + + + + + +Azure pipeline delete blobs from blob storage - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +
+ + +
+ + + +

The example given by this post is for Azure Pipeline with the latest Ubuntu agent, for AzCli from local machine, removing the --auth-mode login part should work.

+ +

As it’s a Linux pipeline agent, the pipeline task AzureFileCopy can not be used, it’s written in Powershell, we should use the AzureCLI task instead.

+ +

Working example

+ +

Suppose we have following use case:

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
typevalue
storage account namesto
container namecon
blob 1 path in blob storagefolder/sub_folder/blob1
blob 2 path in blob storagefolder/sub_folder/blob2
blob 1 path in local machinelocal_folder/local_sub_folder/blob1
blob 2 path in local machinelocal_folder/local_sub_folder/blob2
+ +

The virtual folder folder/sub_folder/ has only 2 blobs as shown in the above table.

+ +

Hereunder the Azure Pipeline code to delete existing files from folder/sub_folder/ in the Azure blob storage and than upload all the local files from local_folder/local_sub_folder/ to folder/sub_folder/:

+ +
- task: AzureCLI@2
+  displayName: Az File Copy to Storage
+  inputs:
+    azureSubscription: $(serviceConnection)
+    scriptType: bash
+    scriptLocation: inlineScript
+    inlineScript: |
+      az config set extension.use_dynamic_install=yes_without_prompt
+      folder_path="folder/sub_folder"
+
+      echo "##[command]Getting existing_files"
+      existing_files=$(az storage fs file list \
+        --auth-mode login \
+        -f con \
+        --account-name sto \
+        --path $folder_path | jq)
+      echo -e "existing_files:\n$existing_files"
+
+      echo "##[command]Stating delete"
+      echo $existing_files | jq .[].name -r | while read file ; do \
+        az storage blob delete \
+        --auth-mode login \
+        -c con \
+        --account-name sto \
+        -n "$file" ; \
+        done
+
+      echo "##[command]Starting update-batch"
+      az storage blob upload-batch \
+        --auth-mode login \
+        --destination con \
+        --account-name sto \
+        --destination-path $folder_path \
+        --source "local_folder/local_sub_folder"
+
+      echo "##[command]Listing files after upload"
+      az storage fs file list \
+        --auth-mode login \
+        -f con \
+        --account-name sto \
+        --path $folder_path
+
+ +

Should not use failOnStandardError: true with AzureCLI as the commands az config set and az storage blob upload-batch send both messages to stderr.

+ +

Failed with az storage azcopy blob delete +

+ +

The best way to delete bunch of blobs is az storage azcopy blob delete -c con --account-name sto -t folder/subfolder --recursive. But if you use --account-key for auth, it’s currently not available as az storage account keys list --account-name sto with current version (v2.41.0) of azure-cli delivered by Azure Pipeline agent has a bug like this: AttributeError: module ‘azure.mgmt.storage.v2022_05_01.models’ has no attribute ‘ActiveDirectoryPropertiesAccountType’ or this: AttributeError: module ‘azure.mgmt.storage.v2022_05_01.models’ has no attribute ‘ListKeyExpand’. So we should use other auth methods like SAS token or connection string pre-populated in KeyVault.

+ +

Downgrading the azure-cli version inside AzureCLI during Azure pipeline might work, but not tested.

+ +

az storage azcopy blob delete --account-key works from local machine if it’s not the buggy version installed.

+ +

Failed with az storage blob delete-batch +

+ +

az storage blob delete-batch -s con --account-name sto --delete-snapshots include --dryrun --pattern "folder/subfolder/* could work only in case there’re not many blobs inside the container con, otherwise this command using --pattern (with Python fnmatch behind the scenes) will pending for a long time.

+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2022/11/azure-pipeline-windows-agent-UnicodeEncodeError.html b/2022/11/azure-pipeline-windows-agent-UnicodeEncodeError.html new file mode 100644 index 00000000..a57c6839 --- /dev/null +++ b/2022/11/azure-pipeline-windows-agent-UnicodeEncodeError.html @@ -0,0 +1,772 @@ + + + + + + +Azure pipeline Windows agent UnicodeEncodeError - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +
+ + +
+ + + +

For people who encounter UnicodeEncodeError when using Windows Azure Pipeline agent, the issue might be here.

+ +

As per above link, or this email, the solutions could be:

+ +
    +
  • You can override just sys.std* to UTF-8 by setting +the environment variable PYTHONIOENCODING=UTF-8.
  • +
  • You can override all I/O to use UTF-8 by setting PYTHONUTF8=1, or by passing the +command-line option -X utf8.
  • +
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2022/11/using-ast-and-cst-to-change-python-code.html b/2022/11/using-ast-and-cst-to-change-python-code.html new file mode 100644 index 00000000..5e7e2901 --- /dev/null +++ b/2022/11/using-ast-and-cst-to-change-python-code.html @@ -0,0 +1,847 @@ + + + + + + +Using ast and cst to change Python code - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +
+ + +
+ + + +

Difference between AST and CST

+ +

A brief comparison could be found in the libcst doc. Generally speaking, CST could keep the original source code format including the comments.

+ +

Using AST to change Python code

+ +

Since Python 3.9, the helper ast.unparse has been introduced, so we have both ast.parse and ast.unparse in our hands, everything is ready, finally we have an official way to change Python code.

+ +

For example, I have a the file setup.py as belows:

+ +
"""setup.py file
+"""
+from pkg_resources import parse_requirements
+from setuptools import setup
+
+with open("requirements.txt", encoding="utf-8") as f:
+    install_requires = [str(req) for req in parse_requirements(f)]
+
+setup(
+    name="foo",
+    install_requires=install_requires,
+)
+
+ +

I want to change the line install_requires=install_requires, by install_requires=["a==1", "b==2"],.

+ +

Since Python3.9, I can achieve it like this:

+ +
import ast
+import json
+
+new_install_requires = ["a==1", "b==2"]
+
+setup_file = open("setup.py").read()
+setup = ast.parse(setup_file)
+
+print("\n***Before change\n")
+print(ast.unparse(setup))
+
+for body in setup.body:
+    try:
+        if hasattr(body, "value") and hasattr(body.value, "keywords"):
+            for kw in body.value.keywords:
+                if kw.arg == "install_requires":
+                    kw.value = ast.parse(json.dumps(new_install_requires)).body[0]
+    except Exception as err:
+        print(err)
+
+print("\n***After change\n")
+print(ast.unparse(setup))
+
+ +

Result from the console:

+ +
$ python3.9 change_setup.py
+
+***Before change
+
+"""setup.py file
+"""
+from pkg_resources import parse_requirements
+from setuptools import setup
+with open('requirements.txt', encoding='utf-8') as f:
+    install_requires = [str(req) for req in parse_requirements(f)]
+setup(name='foo', install_requires=install_requires)
+
+***After change
+
+"""setup.py file
+"""
+from pkg_resources import parse_requirements
+from setuptools import setup
+with open('requirements.txt', encoding='utf-8') as f:
+    install_requires = [str(req) for req in parse_requirements(f)]
+setup(name='foo', install_requires=
+['a==1', 'b==2'])
+
+ +

You will notice that, the ast.parse discards all the comments. And if need to format the code, black could be a good choice.

+ +

Using CST to change Python code

+ +

An example can be found the repo hauntsaninja/no_implicit_optional that uses the libcst from Instagram

+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2022/12/python-difference-on-subprocess-run-call-check-call-check-output.html b/2022/12/python-difference-on-subprocess-run-call-check-call-check-output.html new file mode 100644 index 00000000..eb554003 --- /dev/null +++ b/2022/12/python-difference-on-subprocess-run-call-check-call-check-output.html @@ -0,0 +1,861 @@ + + + + + + +Python difference on subprocess run(), call(), check_call(), check_output() - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +
+ + +
+ + + +

Difference on subprocess run(), call(), check_call(), check_output()

+ +

Since Python 3.5, the official doc explains that:

+ +
+

Prior to Python 3.5, these three functions (subprocess.call(), subprocess.check_call(), subprocess.check_output()) comprised the high level API to subprocess. You can now use subprocess.run() in many cases, but lots of existing code calls these functions.

+
+ +

subprocess.run common parameters

+ +
    +
  • +

    subprocess.run default behavior accepts arguments in list

    + +
    +
    subprocess.run(["ls", "-l"])
    +
    +
  • +
  • +

    shell=True (default False) to send arguments in string

    + +
    +
    subprocess.run("ls -l", shell=True)
    +
    +
  • +
  • +

    capture_output=True (default False) to save output in a var

    + +
    +
    res = subprocess.run("ls -l", shell=True, capture_output=True)
    +res.stdout
    +
    +
  • +
  • +

    encoding="utf-8" (default None) to save var in string instead of bytes.

    +
  • +
  • +

    check=True (default False) to raise subprocess.CalledProcessError: if command returned non-zero exit code. But if the command executable doesn’t exist for exampel missspellm you will get the error FileNotFoundError

    +
  • +
  • +

    Popen() is for advanced usage. For example, replacing the shell pipeline.

    + +

    shell command:

    + +
    +
    output=$(dmesg | grep hda)
    +
    + +

    with Popen, becomes:

    + +
    +
    p1 = Popen(["dmesg"], stdout=PIPE)
    +p2 = Popen(["grep", "hda"], stdin=p1.stdout, stdout=PIPE)
    +p1.stdout.close()  # Allow p1 to receive a SIGPIPE if p2 exits.
    +output = p2.communicate()[0]
    +
    +
  • +
  • +

    default params

    + +
    +
    import subprocess
    +
    +default_run_params = dict(
    +    capture_output=True,
    +    encoding="utf-8",
    +    check=True
    +)
    +# command = ["unknown_command", "-l"]
    +# command = ["python", "-askjd"]
    +command = ["ls", "-l"]
    +
    +try:
    +    # output type is subprocess.CompletedProcess
    +    output = subprocess.run(command, **default_run_params)
    +
    +    # print in pure string in one line
    +    print(output)
    +
    +    # print with new line just as launching from shell
    +    print(output.stdout)
    +
    +    # as we catch error with `check=True`,
    +    # output.stderr is always an empty string.
    +    # and output.returncode is always 0 in this case.
    +except FileNotFoundError as exc:
    +    print(f"{type(exc).__name__}: {exc}")
    +    raise
    +except subprocess.CalledProcessError as exc:
    +    print(exc)  # no error details will given by print(exc)
    +    print(exc.__dict__)  # print all
    +    print(exc.returncode)
    +    print(exc.stderr)  # print error message only
    +    # exc.stdout should be empty
    +    raise
    +
    +
  • +
+ + +
+ +
+ + + + + + + +

+ Tags: + + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2022/12/syncing-repository-from-github-to-gitee.html b/2022/12/syncing-repository-from-github-to-gitee.html new file mode 100644 index 00000000..98a1b160 --- /dev/null +++ b/2022/12/syncing-repository-from-github-to-gitee.html @@ -0,0 +1,777 @@ + + + + + + +Syncing repository from github to gitee - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +
+ + +
+ + + +

I need to sync github repository (files and commits only) https://github.com/copdips/copdips.github.io to gitee repository https://gitee.com/copdips/copdips.github.io.

+ +
    +
  1. In gitee: create an empty repository, normal the same name as the one you want to sync from github. For example for this blog repository: https://gitee.com/copdips/copdips.github.io
  2. +
  3. In gitee: create a PAT in gitee with necessary permissions (all or projects). +The sync needs to run two commands against to gitee: +
      +
    • git push --all --force gitee
    • +
    • git push --tags --force gitee
    • +
    +
  4. +
  5. In github repository: create 2 secrets: GITEE_USERNAME=copdips, and GITEE_PAT={PAT_created_in_the_previous_step} +
  6. +
  7. In github repository: create a github workflow, such as: .github/workflows/sync-to-gitee.yml +
  8. +
  9. In github repository: push the above github workflow file to github, it will automatically trigger the first sync. And from now on, all the pushes to the main branch will trigger a such sync too. main is my default branch to trigger the sync, could be changed in the workflow file.
  10. +
+ +

Github action within github free personal plan has a time limit at 2000 minutes per month, which should be enough if you don’t have many repositories and many pushes.

+ + +
+ +
+ + + + + + + +

+ Tags: + + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2023/01/calling-azure-rest-api.html b/2023/01/calling-azure-rest-api.html new file mode 100644 index 00000000..18906160 --- /dev/null +++ b/2023/01/calling-azure-rest-api.html @@ -0,0 +1,852 @@ + + + + + + +Calling Azure REST API - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +
+ + +
+ + + +

This blog Calling Azure REST API via curl is pretty good. Just two more things.

+ +

Auth token in curl

+ +

We can use curl -X GET -u :$token instead of curl -X GET -H "Authorization: Bearer $token"

+ +

Azure DevOps API resource id for OAuth

+ +

when using az rest to call Azure DevOps API, you will get a similar error as follows:

+ +
+

Can’t derive appropriate Azure AD resource from –url to acquire an access token. If access token is required, use –resource to specify the resource.

+
+ +

This is because Azure DevOps API base url: https://dev.azure.com/ or https://vssps.dev.azure.com/, etc. are not an Azure cloud endpoint.

+ +
$ az rest --help
+Command
+    az rest : Invoke a custom request.
+        This command automatically authenticates using the logged-in credential: If Authorization
+        header is not set, it attaches header `Authorization: Bearer <token>`, where `<token>` is
+        retrieved from AAD. The target resource of the token is derived from --url if --url starts
+        with an endpoint from `az cloud show --query endpoints`. You may also use --resource for a
+        custom resource.
+        If Content-Type header is not set and --body is a valid JSON string, Content-Type header
+        will default to application/json.
+    Arguments
+        [...redacted]
+        --resource : Resource url for which CLI should acquire a token from AAD
+                     in order to access the service. The token will be placed in
+                     the Authorization header. By default, CLI can figure this
+                     out based on --url argument, unless you use ones not in the
+                     list of "az cloud show --query endpoints".
+        [...redacted]
+
+ +
$ az cloud show --query endpoints
+{
+  "activeDirectory": "https://login.microsoftonline.com",
+  "activeDirectoryDataLakeResourceId": "https://datalake.azure.net/",
+  "activeDirectoryGraphResourceId": "https://graph.windows.net/",
+  "activeDirectoryResourceId": "https://management.core.windows.net/",
+  "appInsightsResourceId": "https://api.applicationinsights.io",
+  "appInsightsTelemetryChannelResourceId": "https://dc.applicationinsights.azure.com/v2/track",
+  "attestationResourceId": "https://attest.azure.net",
+  "azmirrorStorageAccountResourceId": null,
+  "batchResourceId": "https://batch.core.windows.net/",
+  "gallery": "https://gallery.azure.com/",
+  "logAnalyticsResourceId": "https://api.loganalytics.io",
+  "management": "https://management.core.windows.net/",
+  "mediaResourceId": "https://rest.media.azure.net",
+  "microsoftGraphResourceId": "https://graph.microsoft.com/",
+  "ossrdbmsResourceId": "https://ossrdbms-aad.database.windows.net",
+  "portal": "https://portal.azure.com",
+  "resourceManager": "https://management.azure.com/",
+  "sqlManagement": "https://management.core.windows.net:8443/",
+  "synapseAnalyticsResourceId": "https://dev.azuresynapse.net",
+  "vmImageAliasDoc": "https://raw.githubusercontent.com/Azure/azure-rest-api-specs/master/arm-compute/quickstart-templates/aliases.json"
+}
+
+ +

So we need to find the resource url for Azure DevOps API. Hopefully, we can find it from this github issue, or from the official Azure DevOps doc, we can use 499b84ac-1321-427f-aa17-267ca6975798 as the value of --resource to call az rest:

+ +
az rest \
+    --resource 499b84ac-1321-427f-aa17-267ca6975798 \
+    --url <url>
+
+ +

When running az rest within Azure pipeline, we also need to add the authorization, as the SPN injected by azureSubscription cannot be recognized by Azure DevOps API, it’s not a user account. The SPN support is in Azure DevOps road map, and planned to be released in 2023 Q1. I’ll update this post once I’ve tested it.

+ +
- task: AzureCLI@2
+  displayName: Az rest
+  inputs:
+    azureSubscription: $(azureResourceServiceConnection)
+    scriptType: bash
+    scriptLocation: inlineScript
+    inlineScript: |
+      az rest \
+          --headers "Authorization=Bearer $SYSTEM_ACCESSTOKEN" \
+          --resource 499b84ac-1321-427f-aa17-267ca6975798 \
+          --url <url>
+    failOnStandardError: true
+  env:
+    SYSTEM_ACCESSTOKEN: $(System.AccessToken)
+
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2023/01/python-aiohttp-rate-limit.html b/2023/01/python-aiohttp-rate-limit.html new file mode 100644 index 00000000..9ab18829 --- /dev/null +++ b/2023/01/python-aiohttp-rate-limit.html @@ -0,0 +1,880 @@ + + + + + + +Python aiohttp rate limit - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +
+ + +
+ + + +

HTTP rate limit is often the max requests in a limited time period, and sometimes could also be the max concurrent requests.

+ +

Max requests in a limited time period

+ +
from aiolimiter import AsyncLimiter
+
+RATE_LIMIT_IN_SECOND = 20
+# 1.0 for time period during 1 second
+rate_limit = AsyncLimiter(RATE_LIMIT_IN_SECOND, 1.0)
+
+async with rate_limit:
+    await my_aiohttp_request()
+
+ +

Max concurrent requests

+ +

Official doc: Limiting connection pool size

+ +
import aiohttp
+
+MAX_CONCURRENT = 10
+
+async def main():
+  # The default limit is 100
+  connector = aiohttp.TCPConnector(limit=MAX_CONCURRENT)
+
+  async with aiohttp.ClientSession(connector=connector) as session:
+      await my_aiohttp_request()
+
+if __name__ == "__main__":
+    asyncio.run(main())
+
+ +

The object connector from connector = aiohttp.TCPConnector(limit=MAX_CONCURRENT) must be created within an async function.

+ +

Example

+ +

We can borrow the official example on asyncio queues.

+ +

The below example shows how to send GET method to https://httpbin.org/get with a rate limit of 20 requests per second and max 10 concurrent requests.

+ +
import asyncio
+import random
+import time
+
+import aiohttp
+from aiolimiter import AsyncLimiter
+
+MAX_CONCURRENT = 10
+RATE_LIMIT_IN_SECOND = 20
+rate_limit = AsyncLimiter(RATE_LIMIT_IN_SECOND, 1.0)
+
+
+async def my_aiohttp_request(session, name):
+    response = await session.get("https://httpbin.org/get")
+    response.raise_for_status()
+    json_response = await response.json()
+    print(f"{name} finished aiohttp request with response: {json_response}")
+    # do something on reponse here
+
+
+async def worker(name, queue, session):
+    while True:
+        # Get a "work item" out of the queue.
+        sleep_for = await queue.get()
+
+        # Sleep for the "sleep_for" seconds.
+        await asyncio.sleep(sleep_for)
+
+        async with rate_limit:
+            await my_aiohttp_request(session, name)
+
+        # Notify the queue that the "work item" has been processed.
+        queue.task_done()
+
+        print(f"{name} has slept for {sleep_for:.2f} seconds")
+
+
+async def main():
+    connector = aiohttp.TCPConnector(limit=MAX_CONCURRENT)
+    async with aiohttp.ClientSession(connector=connector) as session:
+        # Create a queue that we will use to store our "workload".
+        queue = asyncio.Queue()
+
+        # Generate random timings and put them into the queue.
+        total_sleep_time = 0
+        for _ in range(20):
+            sleep_for = random.uniform(0.05, 1.0)
+            total_sleep_time += sleep_for
+            queue.put_nowait(sleep_for)
+
+        # Create three worker tasks to process the queue concurrently.
+        tasks = [
+            asyncio.create_task(worker(f"worker-{idx}", queue, session))
+            for idx in range(MAX_CONCURRENT)
+        ]
+        # Wait until the queue is fully processed.
+        started_at = time.monotonic()
+        await queue.join()
+        total_slept_for = time.monotonic() - started_at
+
+        # Cancel our worker tasks.
+        for task in tasks:
+            task.cancel()
+        # Wait until all worker tasks are cancelled.
+        await asyncio.gather(*tasks, return_exceptions=True)
+
+        print("====")
+        print(f"3 workers slept in parallel for {total_slept_for:.2f} seconds")
+        print(f"total expected sleep time: {total_sleep_time:.2f} seconds")
+
+
+asyncio.run(main())
+
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2023/01/sonarcloud-github-action.html b/2023/01/sonarcloud-github-action.html new file mode 100644 index 00000000..f5ac3911 --- /dev/null +++ b/2023/01/sonarcloud-github-action.html @@ -0,0 +1,860 @@ + + + + + + +Sonarcloud Github Action - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +
+ + +
+ + + +

Sonarcloud Github Action doesn’t work by default with Python pytest coverage.xml file, hereunder a working example.

+ +

file .github/workflows/ci.yml +

+ + +
# file: .github/workflows/ci.yml
+
+# irrelevant part is removed
+env:
+  repo_name: repo
+  app_folder_name: app
+  coverage_percent: 90
+  build_number: ${{ github.run_number }}
+  pytest_coverage_commentator_filename: pytest_coverage_commentator.txt
+  pytest_coverage_xml_file_name: coverage.xml
+
+- name: Test with pytest
+  run: |
+    pytest -v -s \
+      --cov=$app_folder_name \
+      --cov-fail-under=$coverage_percent \
+      --cov-report=xml:$pytest_coverage_xml_file_name \
+      --cov-report=term-missing:skip-covered
+
+# Codecov is a nice tool so given here too
+- name: Upload coverage to Codecov
+  uses: codecov/codecov-action@v3
+  with:
+    token: ${{ secrets.CODECOV_TOKEN }}
+    env_vars: OS,PYTHON
+    fail_ci_if_error: true
+    flags: unittests
+    name: codecov-repo_name
+    files: coverage.xml
+    verbose: true
+
+- name: Test pytest with pytest-coverage-commentator
+  run: |
+    pytest --cache-clear --cov=$app_folder_name > $pytest_coverage_commentator_filename
+
+- name: Comment PR with coverage
+  uses: coroo/pytest-coverage-commentator@v1.0.2
+  with:
+    pytest-coverage: ${{ env.pytest_coverage_commentator_filename }}
+
+- name: Override Coverage Source Path for Sonar
+  # https://community.sonarsource.com/t/code-coverage-doesnt-work-with-github-action/16747/7
+  # we should convert '<source>/home/runner/work/pr/repo/app</source>' to '<source>/github/workspace//app</source>'
+  # be careful DOUBLE slashes in the later part, and the app in the later part is retrieved from sonar.sources from sonar-project.properties
+  run: |
+    echo "GITHUB_WORKSPACE=$GITHUB_WORKSPACE"
+    echo 'coverage.xml before:'
+    head $GITHUB_WORKSPACE/$pytest_coverage_xml_file_name
+    sed -i 's@'$GITHUB_WORKSPACE'@/github/workspace/@g' $GITHUB_WORKSPACE/$pytest_coverage_xml_file_name
+    echo 'coverage.xml after:'
+    head $GITHUB_WORKSPACE/$pytest_coverage_xml_file_name
+
+- name: SonarCloud Scan
+  uses: sonarsource/sonarcloud-github-action@master
+  env:
+    GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+    SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
+
+ + +

file sonar-project.properties +

+ +

Hereunder an example of the file sonar-project.properties

+ +
# https://github.com/pbrod/numdifftools/blob/master/sonar-project.properties
+# https://github.com/pbrod/numdifftools/blob/master/sonar-project_readme.txt
+# https://github.com/SonarSource/sonarcloud-github-action
+
+sonar.organization=copdips
+sonar.projectKey=copdips_reponame
+
+# relative paths to source directories. More details and properties are described
+# in https://sonarcloud.io/documentation/project-administration/narrowing-the-focus/
+sonar.sources=folder_name
+
+# sonar.exclusions must specify till the file extension,
+# whether *.py for python or * for any files. `folder_name/notebooks/` doesn't work.
+sonar.exclusions=folder_name/notebooks/*.py
+
+sonar.projectVersion=${env.build_number}
+# sonar.python.pylint_config=.pylintrc
+sonar.python.version=3.8, 3.9, 3.10
+
+# https://docs.sonarqube.org/latest/analysis/coverage/
+# https://docs.sonarqube.org/latest/analysis/analysis-parameters/
+sonar.tests=tests
+sonar.python.coverage.reportPaths=${env.pytest_coverage_xml_file_name}
+
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2023/07/python-asyncio-unittest.html b/2023/07/python-asyncio-unittest.html new file mode 100644 index 00000000..6ee1ec85 --- /dev/null +++ b/2023/07/python-asyncio-unittest.html @@ -0,0 +1,913 @@ + + + + + + +Python Asyncio Unittest - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +
+ + +
+ + + +
+

Unittest based on Pytest framework not embedded unittest.

+
+ +

Mocking async http client aiohttp.ClientSession

+ +

Source code

+ +
# file path: root/module_name/foo.py
+# pip install aiohttp
+import aiohttp
+
+
+class ClassFoo:
+    def __init__(self, access_token: str):
+        self.access_token = access_token
+        self.auth_header = {"Authorization": f"Bearer {self.access_token}"}
+        self.base_url = "https://foo.bar.com/api/v1"
+
+    async def get_foo(self, foo_id: str) -> dict:
+        url = f"{self.base_url}/{foo_id}"
+        async with aiohttp.ClientSession(headers=self.auth_header) as session:
+            async with session.get(url) as resp:
+                resp.raise_for_status()
+                return await resp.json()
+
+ +

Unittest with pytest-asyncio

+ +
# file path: root/tests/module_name/test_foo.py
+# pip install pytest pytest-asyncio
+
+from typing import Any
+import pytest
+from unittest.mock import MagicMock, patch, AsyncMock
+from module_name import foo as test_module
+
+TEST_MODULE_PATH = test_module.__name__
+
+
+@pytest.fixture
+def mock_session():
+    with patch(f"{TEST_MODULE_PATH}.aiohttp.ClientSession") as mock_client_session:
+        session = MagicMock()
+        mock_client_session.return_value.__aenter__.return_value = session
+        yield session
+
+
+@pytest.fixture
+def mock_service():
+    access_token = "bar"
+    yield test_module.ApplicationsService(access_token=access_token)
+
+
+@pytest.mark.asyncio  # could be removed if asyncio_mode = "auto"
+async def test_get_foo(mock_session, mock_service):
+    foo_id = "foo"
+    mock_json_response = {"key": "value"}
+
+    mock_response = AsyncMock()
+    mock_response.json.return_value = mock_json_response
+    mock_response.raise_for_status.return_value = None
+
+    mock_session.get.return_value.__aenter__.return_value = mock_response
+
+    response = await mock_service.get_foo(foo_id=foo_id)
+
+    mock_session.get.assert_called_once_with(f"{mock_service.base_url}/{foo_id}")
+    assert response == mock_json_response
+
+ +

If you set asyncio_mode = "auto" (defaults to strict) to your config (pyproject.toml, setup.cfg or pytest.ini) there is no need for the @pytest.mark.asyncio marker.

+ +

Above unittest will success but also raise a warning:

+ +
============================= warnings summary ==============================
+tests/module_name/test_foo.py::test_get_foo
+  root/module_name/test_foo.py:15: RuntimeWarning: coroutine 'AsyncMockMixin._execute_mock_call' was never awaited
+    resp.raise_for_status()
+  Enable tracemalloc to get traceback where the object was allocated.
+  See https://docs.pytest.org/en/stable/how-to/capture-warnings.html#resource-warnings for more info.
+
+ +

This is because resp is an AsyncMock object, resp.raise_for_status() will be an AsyncMockMixin object. But in fact, raise_for_status() is a traditional sync function, it will not be awaited. So we need to mock it with a MagicMock object:

+ +
In [1]: from unittest.mock import AsyncMock, MagicMock
+
+In [2]: a = AsyncMock()
+
+In [3]: a
+Out[3]: <AsyncMock id='140698543883888'>
+
+In [4]: a.raise_for_status()
+Out[4]: <coroutine object AsyncMockMixin._execute_mock_call at 0x7ff6ef43d2a0>
+
+In [5]: a.raise_for_status = MagicMock()
+
+In [6]: a.raise_for_status()
+Out[6]: <MagicMock name='mock.raise_for_status()' id='140698512592176'>
+
+ +

To fix the warning, we need to change the line:

+ +
# replace line:
+mock_response.raise_for_status.return_value = None
+
+# by:
+mock_response.raise_for_status = MagicMock()
+
+ +

Pytest fixture with session scope

+ +

Say I need a session scope fixture to perform a cleanup before all tests and after all tests:

+ +
@pytest.fixture(scope="session", autouse=True)
+async def _clean_up():
+    await pre_tests_function()
+    yield
+    await post_tests_function()
+
+ +

This session scope fixture will be called automatically before all tests and after all tests. But when you run the tests, you will get an error:

+ +
+

ScopeMismatch: You tried to access the ‘function’ scoped fixture ‘event_loop’ with a ‘session’ scoped request object, involved factories

+
+ +

This is because pytest-asyncio create by default a new function scope event loop, but the async fixture _clean_up is session scoped and is using the event loop fixture, where the ScopeMismatch in the error message. To fix this, we need to create a new session scope event loop for the fixture _clean_up:

+ +
@pytest.fixture(scope="session")
+def event_loop():
+    loop = asyncio.get_event_loop()
+    yield loop
+    loop.close()
+
+@pytest.fixture(scope="session", autouse=True)
+async def _clean_up():
+    await pre_tests_function()
+    yield
+    await post_tests_function()
+
+ + +
+ + + + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2023/09/databricks-python-pip-authentication.html b/2023/09/databricks-python-pip-authentication.html new file mode 100644 index 00000000..bd1a2d89 --- /dev/null +++ b/2023/09/databricks-python-pip-authentication.html @@ -0,0 +1,769 @@ + + + + + + +Databricks Python pip authentication - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +
+ + +
+ + + +

Before the Databricks Unit Catalog’s release, we used init scripts to generate the pip.conf file during cluster startup, allowing each cluster its unique auth token. But with init scripts no longer available in the Unit Catalog’s shared mode, an alternative approach is required.

+ +

A workaround involves placing a prepared pip.conf in the Databricks workspace and setting the PIP_CONFIG_FILE environment variable to point to this file. This method, however, presents security concerns: the pip.conf file, containing the auth token, becomes accessible to the entire workspace, potentially exposing it to all users and clusters. See here to check this workaround.

+ +

In contrast, the Unit Catalog’s single mode retains init script availability. Here, the pip auth token is stored securely in a vault and accessed via the Databricks secret scope. Upon cluster startup, the init script fetches the token from the vault, generating the pip.conf file. This approach is considerably more secure than the shared mode alternative.

+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + , + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2023/09/different-ssh-keys-for-different-github.com-accounts.html b/2023/09/different-ssh-keys-for-different-github.com-accounts.html new file mode 100644 index 00000000..11acca10 --- /dev/null +++ b/2023/09/different-ssh-keys-for-different-github.com-accounts.html @@ -0,0 +1,788 @@ + + + + + + +Different ssh keys for different github.com accounts - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +
+ + +
+ + + +

It might be a common case that you have multiple github.com accounts (personal and professional), and you want to use different ssh keys for different github accounts, as github.com does not allow same ssh key for different accounts with “Key is already in use” error.

+ +

To achieve this, you could follow this tutorial:

+ +
    +
  1. Generate ssh keys for each github.com account. For e.g. ~/.ssh/id_rsa and ~/.ssh/id_rsa_pro.
  2. +
  3. +

    Create a ~/.ssh/config file to specify which ssh key to use for which github account.

    + +
    +
     Host github.com
    + HostName github.com
    + IdentityFile ~/.ssh/id_rsa
    + User copdips
    +
    + # The HostName is still github.com, but the host here is github.com-pro, this is the key point.
    + # You can change it to whatever you want
    + Host github.com-pro
    + HostName github.com
    + IdentityFile ~/.ssh/id_rsa_pro
    + User copdips-pro
    +
    +
  4. +
  5. Git clone the repositories by replacing github.com in the git clone ssh url with the ssh alias defined in ~/.ssh/config. +Say the pro ssh clone url is: git@github.com:my-company/repo.git, than you need to rewrite it to git@github.com-pro:my-company/repo.git to be able to use the ssh key ~/.ssh/id_rsa_pro defined in ~/.ssh/config.
  6. +
+ +

In Chrome (so as to Edge), there’s an extension called MultiLogin that allows you to use multiple accounts (for e.g. personal and professional github.com accounts) across different tabs in the same browser instance. So you do not need to keep two browser instances opened at the same time. In Firefox, you even have a better extension called Firefox Multi-Account Containers.

+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2023/09/github-actions-cache.html b/2023/09/github-actions-cache.html new file mode 100644 index 00000000..c0aec8cb --- /dev/null +++ b/2023/09/github-actions-cache.html @@ -0,0 +1,859 @@ + + + + + + +Github Actions - Cache - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +
+ + +
+ + + +

Life span

+ +

Github Actions cache has a life span of 7 days, and the total size of all caches in a repository is limited to 10 GB.

+ +

Standard Cache

+ +

Cache key should be as specific as possible, so that the post cache restore installation can be reduced or skipped.

+ +

For Python pip install, we could use the following cache key:

+ +
- name: Get pip cache dir
+  run: |
+    os_version=$(cat /etc/os-release | grep -i "version=" | cut -c9- | tr -d '"' | tr ' ' '_')
+    github_workflow_full_path="${GITHUB_WORKFLOW_REF%@*}"
+    python_full_version=$(python -c 'import platform; print(platform.python_version())')
+    node_major_version=$(node --version | cut -d'.' -f1 | tr -d 'v')
+    echo "os_version=$os_version" >> $GITHUB_ENV
+    echo "github_workflow_full_path=$github_workflow_full_path" >> $GITHUB_ENV
+    echo "python_full_version=$python_full_version" >> $GITHUB_ENV
+    echo "PIP_CACHE_DIR=$(pip cache dir)" >> $GITHUB_ENV
+
+- name: cache pip
+  uses: actions/cache@v3
+  with:
+    # path: ${{ env.PIP_CACHE_DIR }}
+    path: ${{ env.pythonLocation }}
+    key: ${{ env.github_workflow_full_path}}-${{ env.os_version }}-${{ env.python_full_version }}-${{ env.node_major_version}}-${{ hashFiles('requirements/*.txt') }}
+
+ +

The cache action repository provides also some Python caching examples.

+ +

pip cache dir vs pip install dir

+ +

The path parameter in actions/cache@v3 could be:

+ +
    +
  • +${{ env.PIP_CACHE_DIR }} if you only want to cache the pip cache dir, so you can skip the Python package download step, but you still need to install the packages.
  • +
  • +${{ env.pythonLocation }} if you want to cache the whole python installation dir, this is useful when you want to cache the site-packages dir, so that the pip install step can be reduced or skipped, this is also why we must use the ${{ env.os_version }}, ${{ env.python_full_version }} in the cache key. In most of cases, this is the best choice.
  • +
+ +

hashFiles

+ +

In Azure Pipelines, there’s similar thing as hashFiles() function, it should be in the form of glob pattern, like requirements/*.txt, but without double quotes, otherwise treated as a static string.

+ +
# Azure Pipelines
+- task: Cache@2
+  inputs:
+    key: 'python | "$(pythonFullVersion)" | "$(osVersion)" | "$(System.TeamProject)" | "$(Build.DefinitionName)" | "$(Agent.JobName)" | requirements/*.txt'
+    path: ...
+  displayName: ...
+
+ +

Otherwise, we can also achieve the same result by some pure bash commands:

+ +
# suppose parameters.requirementsFilePathList is a list of file paths
+- script: |
+    echo REQUIREMENTS_FILE_PATH_LIST_STRING: $REQUIREMENTS_FILE_PATH_LIST_STRING
+    all_files_in_one_line=$(echo $REQUIREMENTS_FILE_PATH_LIST_STRING | jq  '. | join(" ")' -r)
+    echo all_files_in_one_line: $all_files_in_one_line
+    all_files_md5sum=$(cat $all_files_in_one_line | md5sum | awk '{print $1}')
+    echo all_files_md5sum: $all_files_md5sum
+    echo "##vso[task.setvariable variable=pythonRequirementsFilesHash;]$all_files_md5sum"
+  displayName: Set pythonRequirementsFilesHash
+  env:
+    REQUIREMENTS_FILE_PATH_LIST_STRING: "${{ convertToJson(parameters.requirementsFilePathList) }}"
+
+ +

Cache with actions/setup-python

+ +

The action actions/setup-python has built-in functionality for caching and restoring dependencies with cache key. This cache method can only cache the pip cache dir to reduce the Python packages download time like path: ${{ env.PIP_CACHE_DIR }} in above example, but still need to install the packages, which is much slower than caching the package installation location. As the time of writing, the cache source dir (which is the pip cache dir) is generated by the action itself, and cannot be customized.

+ +

The cache key is something like: setup-python-Linux-22.04-Ubuntu-python-3.10.13-pip-308f89683977de8773e433ddf87c874b6bd931347b779ef0ab18f37ecc4fa914 (copied from workflow run log), which is generated as per this answer.

+ +
steps:
+- uses: actions/checkout@v4
+- uses: actions/setup-python@v4
+  with:
+    python-version: '3.10'
+    cache: 'pip' # caching pip dependencies, could be pip, pipenv, or poetry
+    cache-dependency-path: requirements/*.txt
+- run: pip install -r requirements.txt
+
+ +

If cache-dependency-path is not specified, and if the cache type is pip, it will try to find all the requirements.txt files in the repo and hash them to generate the cache key. For cache type with pipenv or poetry, I didn’t test them.

+ + +
+ + + + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2023/09/github-actions-custom-actions.html b/2023/09/github-actions-custom-actions.html new file mode 100644 index 00000000..2b8a97ac --- /dev/null +++ b/2023/09/github-actions-custom-actions.html @@ -0,0 +1,830 @@ + + + + + + +Github Actions - Custom Actions - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +
+ + +
+ + + +

Actions checkout location in workflow

+ +

Actions are automatically checked out by Github Action from the beginning of a workflow run, the checkout path could be found by: env var $GITHUB_ACTION_PATH, github context ${{ github.action_path }}. This is very useful when you need to reference some files or scripts saved in the same repository as the actions.

+ +

+
+```bash
+
+Actions in workflow:
+
+```yaml
+- name: Check out repository code
+  uses: actions/checkout@v4
+
+- name: Use action in the version of the main branch
+  uses:{org_name}/{repo_name}/actions/{action_path}@main
+
+- name: Use action in the version of v1
+  uses:{org_name}/{repo_name}/actions/{action_path}@v1
+
+ +

Actions checkout location:

+ +
../../_actions/actions/checkout
+├── v4
+│   ├── CHANGELOG.md
+│   ├── CODEOWNERS
+│   ├── ...
+
+../../_actions/{org_name}/{repo_name}
+├── main
+│   ├── README.md
+│   └── actions
+│   └── ...
+├── main.completed
+├── v1
+│   ├── README.md
+│   └── actions
+│   └── ...
+└── v1.completed
+
+ +

Multiple actions in single repository

+ +

You can save multiple actions inside a single repository, and use them in the form of uses: org/repo/folder_path@git_ref in a workflow.

+ +

azure/CLI

+ +

Benefits of using azure/CLI over run task:

+ +
    +
  1. azure/CLI runs az commands in an isolated docker container.
  2. +
  3. azure/CLI can choose the CLI version.
  4. +
  5. For some self-hosted runner, may not have “az cli” pre-installed, the Azure/CLI action eliminates the need for complex installation steps.
  6. +
+ +

Can also set shared variables inside a job to be used outside the azure/CLI step, even it’s run inside a docker container.

+ +

Drawbacks:

+ +
    +
  1. slowness: azure/CLI is much slower (around 20s to bootstrap on a ubuntu-latest-4core runner) than standard run step, because it needs to pull the docker image and run the container.
  2. +
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2023/09/github-actions-environment.html b/2023/09/github-actions-environment.html new file mode 100644 index 00000000..74ee8184 --- /dev/null +++ b/2023/09/github-actions-environment.html @@ -0,0 +1,794 @@ + + + + + + +Github Actions - Environment - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +
+ + +
+ + + +

Dynamic environment

+ +

environment is set at job level (not at step level), so we should use the $GITHUB_OUTPUT context to set the environment name dynamically, see here to learn how to pass data between jobs.

+ +

Standard usage for static value is like this:

+ +
jobs:
+  deployment:
+    runs-on: ubuntu-latest
+    environment: production
+    steps:
+      - name: deploy
+        # ...deployment-specific steps
+
+ +

For advanced usage with dynamic value should be like this:

+ +
# call reusable workflow set_target_env.yml to set the target_env
+jobs:
+  set_target_env:
+    uses: ./.github/workflows/set_target_env.yml
+  deployment:
+    runs-on: ubuntu-latest
+    needs: [set_target_env]
+    environment:
+      name: ${{ needs.set_target_env.outputs.workflow_output_target_env }}
+    env:
+      TARGET_ENV: ${{ needs.set_target_env.outputs.workflow_output_target_env }}
+    steps:
+      - run: |
+          echo "TARGET_ENV: $TARGET_ENV"
+      # ...other deployment-specific steps based on $TARGET_ENV
+
+ + + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2023/09/github-actions-error-handling.html b/2023/09/github-actions-error-handling.html new file mode 100644 index 00000000..4c6f7a7a --- /dev/null +++ b/2023/09/github-actions-error-handling.html @@ -0,0 +1,779 @@ + + + + + + +Github Actions - Error handling - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +
+ + +
+ + + +

continue-on-error vs fail-fast

+ +

The doc explains that continue-on-error applies to a single job or single step which defines whether a job or step can continue on its error, while fail-fast applies to the entire matrix which means if the failure of a job in the matrix can stop other running jobs in the matrix. For example:

+ +
    +
  • if fail-fast is set to true, the entire matrix will stop running when one job fails. But if the failed job has continue-on-error set to true, the matrix will continue running, as the failed job is not considered as a failure.
  • +
  • if fail-fast is set to false, all the jobs triggered by the matrix are considered independent, so the failed job will not affect other jobs.
  • +
+ +

When setting continue-on-error at job level only, and no set at step level, if one of the steps fails, the remaining steps wont be executed, the job will get a red failure badge in the Github Actions UI, but the job status will be considered as success.

+ +

Status check functions

+ +

We can also use status check functions if ${{ success() }}, if: ${{ always() }}, if: ${{ cancelled() }}, if: ${{ failure() }} to check the previous step (or job) status.

+ +

In if expression, we can skip the double curly brackets ${{}}, for example: if: success() instead of if: ${{ success() }}

+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2023/09/github-actions-python.html b/2023/09/github-actions-python.html new file mode 100644 index 00000000..0beb4191 --- /dev/null +++ b/2023/09/github-actions-python.html @@ -0,0 +1,910 @@ + + + + + + +Github Actions - Python - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +
+ + +
+ + + +

Setting up pip authentication

+ +

PIP_INDEX_URL vs PIP_EXTRA_INDEX_URL

+ +

In most cases, when setting up private Python package artifacts (like Azure DevOps Artifacts, JFrog Artifactory, etc.) are configured to mirror the public PyPi. In such scenarios, we only need to use PIP_INDEX_URL to point to these private artifacts.

+ +

However, some people might use PIP_INDEX_URL point to the public PyPi, and PIP_EXTRA_INDEX_URL to point to the private artifacts. This approach is not recommended, as it results in the public PyPi searched first, followed by the private artifacts. This poses a security risk where a malicious actor can publish a package with the same name as your private one on the public PyPi.

+ +

Auth for Azure DevOps Artifacts

+ +

Auth by Azure SPN crendentials

+ +

In March 2023, there was a great news that Azure Service Principal was been introduced in Azure DevOps, eliminating the use of service account.

+ +
    +
  1. Create a service principal in Azure Active Directory.
  2. +
  3. Add the service principal to the Azure DevOps Artifacts feed with Contributor role. Package publishing (twine upload) needs Contributor role, but package installation (pip install) only needs Reader role.
  4. +
  5. +

    Add SPN credentials to Github Secrets with name AZURE_CREDENTIALS, and value in JSON format:

    + +
    +
     {
    +   "clientId": "xxxxx",
    +   "clientSecret": "xxxxx",
    +   "subscriptionId": "xxxxx",
    +   "tenantId": "xxxxx"
    + }
    +
    +
  6. +
  7. +

    Create env var PIP_INDEX_URL in the workflow, and set it to the Azure DevOps Artifacts feed URL.

    + +
    +
     - uses: actions/checkout@v4
    +
    +
    + - name: Setup Python
    +   uses: actions/setup-python@v4
    +   with:
    +     python-version: ${{ matrix.python-version }}
    +     # see below post of a faster Python cache:
    +     # https://copdips.com/2023/09/github-actions-cache.html#pip-cache-dir-vs-pip-install-dir
    +     cache: pip
    +     cache-dependency-path: requirements/*.txt
    +
    + - name: Azure Login
    +   uses: azure/login@v1
    +   with:
    +     creds: ${{ secrets.AZURE_CREDENTIALS }}
    +
    + - name: Setup Python package feed
    +   run: |
    +     access_token=$(az account get-access-token | jq .accessToken -r)
    +
    +     # setup pip auth
    +     echo "PIP_INDEX_URL=https://:$access_token@pkgs.dev.azure.com/{azdo_org_name}/_packaging/{azdo_artifacts_feed_name}/pypi/simple/" >> $GITHUB_ENV
    +
    +     # setup twine auth
    +     cat > ~/.pypirc <<EOF
    +     [distutils]
    +     index-servers={azdo_artifacts_feed_name}
    +     [{azdo_artifacts_feed_name}]
    +     repository=https://pkgs.dev.azure.com/{azdo_org_name}/_packaging/{azdo_artifacts_feed_name}/pypi/upload
    +     username=build
    +     password=$access_token
    +     EOF
    +
    +     # setup access token for action pypa/gh-action-pypi-publish
    +     echo "ACCESS_TOKEN=$access_token" >> $GITHUB_ENV
    +
    + - name: Install dependencies
    +   run: |
    +     pip install -U pip
    +     pip install -r requirements/requirements.txt
    +
    + - name: Build Python package
    +   run: |
    +     # need to install wheel in advance
    +     python setup.py sdist bdist_wheel
    +     # modern Python uses `python -m build` instead
    +
    + # alternative Python package build and check
    + - name: Build and Check Package
    +   uses: hynek/build-and-inspect-python-package@v1.5
    +
    + - name: Publish Python package by twine
    +   run: |
    +     # need to install twine in advance
    +     twine upload -r {azdo_artifacts_feed_name} dist/*.whl
    +
    + # alternative Python package publish
    + - name: Publish Python package by action
    +   # does not need to install twine in advance
    +   uses: pypa/gh-action-pypi-publish@release/v1
    +   with:
    +     repository-url: "https://pkgs.dev.azure.com/{azdo_org_name}/_packaging/{azdo_artifacts_feed_name}/pypi/upload"
    +     password: ${{ env.ACCESS_TOKEN }}
    +
    + - name: Cleanup secret envs
    +   run: |
    +     echo "PIP_INDEX_URL=" >> $GITHUB_ENV
    +     echo "ACCESS_TOKEN=" >> $GITHUB_ENV
    +
    +
  8. +
+ +

Auth by Azure OpenID Connect (OIDC)

+ +

We can also setup OpenID Connect (OIDC) between Github Action and Azure. It’s practical because we do not need to worry about Azure SPN secret rotation. However, a drawback is that when setting up OIDC, we must add a filter (subject field in the credential.json). This could be a branch name, tag name, pull request, or environment name, we can not use wildcards in the filter, so we have to set up OIDC for each branch, tag, pull request or environment as needed. This is not very practical. For AWS, there’s no such limitation.

+ +

To use Azure OIDC with Github Action, we need to add the following to the workflow:

+ +
...
+permissions:
+  id-token: write
+  contents: read
+
+jobs:
+  a_job:
+    ...
+    steps:
+      - name: Azure login by OIDC
+        uses: azure/login@v1
+        with:
+          # Official doc puts these 3 fields in secrets, but it's not necessary,
+          # as `subject` field in the credential.json prevents other repos from
+          # using the same credential. And these are not sensitive info neither.
+          tenant-id: ${{ vars.AZURE_TENANT_ID }}
+          subscription-id: ${{ vars.AZURE_SUBSCRIPTION_ID }}
+          client-id: ${{ vars.AZURE_CLIENT_ID }}
+
+ + + +
+ + + + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2023/09/github-actions-variables.html b/2023/09/github-actions-variables.html new file mode 100644 index 00000000..78b0192b --- /dev/null +++ b/2023/09/github-actions-variables.html @@ -0,0 +1,1212 @@ + + + + + + +Github Actions - Variables - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 6 minute read + + + +

+ + +
+ + +
+ + + +

Variables upon Git events

+ +

Suppose we create a new branch named new_branch, and create a pull request (with id 123) from the new branch new_branch to the main branch. +During the pipeline, we can see following predefined variables in different GIT events.

+ +

Check here for variables upon git events in Azure Pipelines.

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
variable name \ git actionon pushon pull requeston merge (after merge, a push event will be triggered)on manual trigger
$GITHUB_REFrefs/heads/new_branchrefs/pull/123/mergerefs/heads/mainrefs/heads/new_branch
$GITHUB_REF_NAMEnew_branch132/mergemainnew_branch
$GITHUB_EVENT_NAMEpushpull_requestpull_request_targetworkflow_dispatch
$GITHUB_REF_TYPEbranchbranchbranchbranch
$GITHUB_SHAlast commit in branchworkflow commit (not merge commit)merge commitlast commit in branch
${{ github.event.head_commit.message }}last commit messageVAR_NOT_EXISTSVAR_NOT_EXISTSVAR_NOT_EXISTS
${{ github.event.pull_request.merge_commit_sha }}VAR_NOT_EXISTSmerge commitmerge commitVAR_NOT_EXISTS
${{ github.event.pull_request.head.sha }}VAR_NOT_EXISTSlast commit in PR (not merge commit)last commit in PR (not merge commit)VAR_NOT_EXISTS
${{ github.event.pull_request.number }}VAR_NOT_EXISTS123123VAR_NOT_EXISTS
${{ github.event.number }}VAR_NOT_EXISTS123123VAR_NOT_EXISTS
${{ github.event.pull_request.merged }}VAR_NOT_EXISTSfalsetrueVAR_NOT_EXISTS
${{ github.event.pull_request.merged_by.login }}VAR_NOT_EXISTSnulluser loginVAR_NOT_EXISTS
${{ github.event.pull_request.merged_by.type }}VAR_NOT_EXISTSnullUser, etcVAR_NOT_EXISTS
${{ github.event.pull_request.title }}VAR_NOT_EXISTSnull or pr titlenull or pr titleVAR_NOT_EXISTS
${{ github.event.pull_request.body}}VAR_NOT_EXISTSnull or pr bodynull or pr bodVAR_NOT_EXISTS
${{ github.event.after }}last SHA in commitlast commit in PR (not merge commit)VAR_NOT_EXISTSVAR_NOT_EXISTS
${{ github.event.action}}VAR_NOT_EXISTSopened, synchronize, edited, reopned, etc..closedVAR_NOT_EXISTS
${{ github.head_ref }}VAR_NOT_EXISTSnew_branchnew_branchVAR_NOT_EXISTS
${{ github.base_ref }}nullmainmainVAR_NOT_EXISTS
+ +

Setting environment variables by Python

+ +

Same approach applies to other languages:

+ +
- name: Create new env vars by Python
+  shell: python
+  run: |
+    import os
+    with open(os.environ["GITHUB_ENV"], "a") as f:
+      f.write("ENV_VAR_1=value_1\nENV_VAR_2=value_2\n")
+
+ +

JSON Variables

+ +

JSON variables with GITHUB_OUTPUT

+ +

When setting a JSON variable in string as $GITHUB_OUTPUT, and using it in a subsequent step, we should use the Github actions expressions syntax. However, the method of using this syntax can vary based on its context. Consider the following example on a Github Ubuntu runner with a bash shell:

+ +
- name: Write json outputs
+  id: write-json-outputs
+  run: |
+    json_raw='{"name":"foo"}'
+    json_quotes_escaped="{\"name\":\"foo\"}"
+    json_quotes_backslash_escaped="{\\\"name\\\":\\\"foo\\\"}"
+    json_ascii="{\x22name\x22: \x22foo\x22}"
+
+    echo "json_raw=$json_raw" >> $GITHUB_OUTPUT
+    echo "json_quotes_escaped=$json_quotes_escaped" >> $GITHUB_OUTPUT
+    echo "json_quotes_backslash_escaped=$json_quotes_backslash_escaped" >> $GITHUB_OUTPUT
+    echo -e "json_ascii=$json_ascii" >> $GITHUB_OUTPUT
+
+    echo "GITHUB_OUTPUT content:"
+    cat $GITHUB_OUTPUT
+
+- name: Show json outputs
+  run: |
+    json_raw_wo_quotes=${{ steps.write-json-outputs.outputs.json_raw }}
+    json_raw="${{ steps.write-json-outputs.outputs.json_raw }}"
+    json_quotes_escaped="${{ steps.write-json-outputs.outputs.json_quotes_escaped }}"
+    json_quotes_backslash_escaped="${{ steps.write-json-outputs.outputs.json_quotes_backslash_escaped }}"
+    json_ascii="${{ steps.write-json-outputs.outputs.json_ascii }}"
+
+    # echo vars from templating inside bash
+    echo "json_raw_wo_quotes: $json_raw_wo_quotes"
+    echo "json_raw: $json_raw"
+    echo "json_quotes_escaped: $json_quotes_escaped"
+    echo "json_quotes_backslash_escaped: $json_quotes_backslash_escaped"
+    echo "json_ascii: $json_ascii"
+
+    # echo vars from env variables
+    echo "JSON_RAW: $JSON_RAW"
+    echo "JSON_QUOTES_ESCAPED: $JSON_QUOTES_ESCAPED"
+    echo "JSON_QUOTES_BACKSLASH_ESCAPED: $JSON_QUOTES_BACKSLASH_ESCAPED"
+    echo "JSON_QUOTES_BACKSLASH_ESCAPED_TO_JSON: $JSON_QUOTES_BACKSLASH_ESCAPED_TO_JSON"
+    echo "JSON_QUOTES_BACKSLASH_ESCAPED_WITH_QUOTES: $JSON_QUOTES_BACKSLASH_ESCAPED_WITH_QUOTES"
+    echo "JSON_ASCII: $JSON_ASCII"
+  env:
+    JSON_RAW: ${{ steps.write-json-outputs.outputs.json_raw }}
+    JSON_QUOTES_ESCAPED: ${{ steps.write-json-outputs.outputs.json_quotes_escaped }}
+    JSON_QUOTES_BACKSLASH_ESCAPED: ${{ steps.write-json-outputs.outputs.json_quotes_backslash_escaped }}
+    JSON_QUOTES_BACKSLASH_ESCAPED_TO_JSON: ${{ toJson(steps.write-json-outputs.outputs.json_quotes_backslash_escaped) }}
+    JSON_QUOTES_BACKSLASH_ESCAPED_WITH_QUOTES: "${{ steps.write-json-outputs.outputs.json_quotes_backslash_escaped }}"
+    JSON_ASCII: ${{ steps.write-json-outputs.outputs.json_ascii }}
+
+ +

When creating the json string, it would be better to not use blank spaces between keys and values, json_raw='{"name":"foo"}' instead of json_raw='{"name": "foo"}, in order to prevent from bash variable mangling issue.

+ +

We have the following output:

+ +
Write json outputs
+  GITHUB_OUTPUT content:
+  json_raw={"name":"foo"}
+  json_quotes_escaped={"name":"foo"}
+  json_quotes_backslash_escaped={\"name\":\"foo\"}
+  json_ascii={"name":"foo"}
+
+Show json outputs
+  json_raw_wo_quotes: {name:foo}
+  json_raw: {name:foo}
+  json_quotes_escaped: {name:foo}
+  json_quotes_backslash_escaped: {"name":"foo"}
+  json_ascii: {name:foo}
+  JSON_RAW: {"name":"foo"}
+  JSON_QUOTES_ESCAPED: {"name":"foo"}
+  JSON_QUOTES_BACKSLASH_ESCAPED: {\"name\":\"foo\"}
+  JSON_QUOTES_BACKSLASH_ESCAPED_TO_JSON: "{\\\"name\\\":\\\"foo\\\"}"
+  JSON_QUOTES_BACKSLASH_ESCAPED_WITH_QUOTES: {\"name\":\"foo\"}
+  JSON_ASCII: {"name":"foo"}
+
+ +

From the output we can see that there’re two ways to have a valid json string in the show step:

+ +
- name: Show json outputs
+  run: |
+    json_quotes_backslash_escaped="${{ steps.write-json-outputs.outputs.json_quotes_backslash_escaped }}"
+    echo "json_quotes_backslash_escaped: $json_quotes_backslash_escaped"
+
+    # echo vars from env
+    echo "JSON_RAW: $JSON_RAW"
+    echo "JSON_QUOTES_ESCAPED: $JSON_QUOTES_ESCAPED"
+    echo "JSON_ASCII: $JSON_ASCII"
+  env:
+    JSON_RAW: ${{ steps.write-json-outputs.outputs.json_raw }}
+    JSON_QUOTES_ESCAPED: ${{ steps.write-json-outputs.outputs.json_quotes_escaped }}
+    JSON_ASCII: ${{ steps.write-json-outputs.outputs.json_ascii }}
+
+ +

Creating a JSON string in GITHUB_OUTPUT without escaping backslashes, like json_quotes_escaped="{\"name\":\"foo\"}", is more concise than "{\\\"name\\\":\\\"foo\\\"}". However, when using ${{ <expression> }} in a bash shell within GitHub Actions, it’s not a valid JSON string. This is because the expressions are processed before the bash shell runs the script, replacing the expression with its value and discarding double quotes. This results in an output like json_raw: {name:foo}. To address this, the toJson function can be used to convert the string into valid JSON.

+ +
- name: Show json outputs
+  run: |
+    # use toJson() to parse the string to a valid json string
+    json_raw="${{ toJson(steps.write-json-outputs.outputs.json_raw) }}"
+    json_quotes_escaped="${{ toJson(steps.write-json-outputs.outputs.json_quotes_escaped) }}"
+    json_ascii="${{ toJson(steps.write-json-outputs.outputs.json_ascii) }}"
+
+    # echo vars from templating inside bash
+    echo "json_raw: $json_raw"
+    echo "json_quotes_escaped: $json_quotes_escaped"
+    echo "json_ascii: $json_ascii"
+
+    # echo vars from env variables
+    echo "JSON_RAW: $JSON_RAW"
+    echo "JSON_QUOTES_ESCAPED: $JSON_QUOTES_ESCAPED"
+    echo "JSON_ASCII: $JSON_ASCII"
+  env:
+    JSON_RAW: ${{ steps.write-json-outputs.outputs.json_raw }}
+    JSON_QUOTES_ESCAPED: ${{ steps.write-json-outputs.outputs.json_quotes_escaped }}
+    JSON_ASCII: ${{ steps.write-json-outputs.outputs.json_ascii }}
+
+ +

Check also the fromJson function to see how to parse json string to object.

+ +

Do not create JSON secrets

+ +

When creating a secret, we should not create a JSON secret. For e.g. the Github action Azure/Login provides an example how to pass creds inputs with a JSON secret:

+ +
- uses: azure/login@v1
+  with:
+    creds: ${{ secrets.AZURE_CREDENTIALS }}
+
+ +

This works but the drawback is that as the curly brackets are stored in the JSON secret, so whenever we want to show { or } in the Github action logs, they will be replaced by ***, as Github actions considers the curly brackets are secret chars. This doesn’t block the successful run of github workflows, but it’s not convenient for debugging.

+ +

A better usage of Azure/Login is also provided in its documentation here:

+ +
- uses: Azure/login@v1
+    with:
+      creds: '{"clientId":"${{ secrets.CLIENT_ID }}","clientSecret":"${{ secrets.CLIENT_SECRET }}","subscriptionId":"${{ secrets.SUBSCRIPTION_ID }}","tenantId":"${{ secrets.TENANT_ID }}"}'
+
+ +

Parsing variables

+ +

Parsing variables with object type

+ +
- run: |
+    echo "github.event: ${{ github.event }}"
+    echo "github.event toJson: $GITHUB_EVENT"
+  env:
+    GITHUB_EVENT: ${{ toJson(github.event) }}
+
+# output:
+github.event: Object
+github.event toJson: {
+  after: 9da8166fcc52c437871a2e903b3e200a35c09a1e,
+  base_ref: null,
+  before: 1448cfbf10fc149b7d200d0a0e15493f41cc8896,
+  ...
+}
+
+ +

echo "github.event toJson: ${{ toJSON(github.event) }}" will raise error, must parse the variable to environment variable $GITHUB_EVENT at first. So when using toJson method to parse object type variable, it is recommended to send the value to an environment variable first.

+ +

Parsing variables with boolean type

+ +

Check with if:

+ +
on:
+  workflow_dispatch:
+    inputs:
+      print_tags:
+        description: 'True to print to STDOUT'
+        required: true
+        type: boolean
+
+jobs:
+  print-tag:
+    runs-on: ubuntu-latest
+    # all the 4 syntaxes below are valid
+    if: inputs.print_tags
+    if: ${{ inputs.print_tags }}
+    if: inputs.print_tags == true
+    if: ${{ inputs.print_tags == true}}
+    steps:
+      - name: Print the input tag to STDOUT
+        run: echo The tags are ${{ inputs.tags }}
+      - name: Print the input tag to STDOUT
+        # in bash, compare boolean with string value
+        run: |
+          if [[ "${{ inputs.print_tags }}" == "true" ]]; then
+            echo The tags are ${{ inputs.tags }}
+          else
+            echo "print_tags is false"
+          fi
+          if [[ "$PRINT_TAGS" == "true" ]]; then
+            echo The tags are ${{ inputs.tags }}
+          else
+            echo "print_tags is false"
+          fi
+        env:
+          PRINT_TAGS: ${{ inputs.print_tags }}
+
+ +

Never use if: ${{ inputs.print_tags }} == false with == outside of {{}}, it will always be true.

+ +

Passing variables

+ +

Passing data between steps inside a job

+ +

Passing by $GITHUB_ENV between steps

+ +

You can make an environment variable available to any subsequent steps in a workflow job by defining or updating the environment variable and writing this to the GITHUB_ENV environment file.

+ +
- run: echo "var_1=value1" >> $GITHUB_ENV
+- run: echo "var_1: $var1"
+
+ +

Passing by $GITHUB_OUTPUT between steps

+ +

Sets a step’s output parameter. Note that the step will need an id to be defined to later retrieve the output value

+ +

Passing data between jobs inside a workflow

+ +

Passing by artifacts between jobs

+ +

You can use the upload-artifact and download-artifact actions to share data (in the forms of a file) between jobs in a workflow.

+ +

To share variables, you can save the variables in a file with format:

+ +
VAR_1=value1
+VAR_2=value2
+
+ +

Then download the file from another job and source it to load the variables:

+ +
- run: |
+    sed "" {downloaded_file_path} >> $GITHUB_ENV
+  shell: bash
+
+ +

Passing by $GITHUB_OUTPUT between jobs

+ +

https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idoutputs

+ +

Passing data between caller workflow and called (reusable) workflow

+ +

Use on.workflow_call.outputs, called workflow outputs are available to all downstream jobs in the caller workflow.

+ +

Passing data between irrelevant workflows

+ + + + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2023/09/github-actions-workflows.html b/2023/09/github-actions-workflows.html new file mode 100644 index 00000000..3b6b2249 --- /dev/null +++ b/2023/09/github-actions-workflows.html @@ -0,0 +1,782 @@ + + + + + + +Github Actions - Workflows - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +
+ + +
+ + + +

Reusable workflows

+ +

Re-run a reusable workflow

+ +

If reusable workflow is not referenced by SHA, for example a branch name, when re-run a workflow, it will not use the latest version of the workflow in that branch, but the same commit SHA of the first attempt. Which means, if you use the git amend push to overwrite the old commit history, the workflow re-run will fail as it cannot find the specific SHA version of the workflow.

+ +

In contrary, if an action is referenced by branch name, it will always use the latest version of the action in that branch upon re-run.

+ +

Cancelling a workflow

+ +

To cancel the current workflow run inside the run itself:

+ +
- name: cancelling
+  uses: andymckay/cancel-action@0.3
+
+ +

We can use if: cancelled() or if: always() to bypass the workflow cancel signal.

+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2023/09/python-asyncio.html b/2023/09/python-asyncio.html new file mode 100644 index 00000000..4fb18064 --- /dev/null +++ b/2023/09/python-asyncio.html @@ -0,0 +1,1167 @@ + + + + + + +Python Asyncio - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 6 minute read + + + +

+ + +
+ + +
+ + + +

This is not a Python asyncio tutorial. Just some personal quick tips here, and could be updated from time to time.

+ +

greenlet vs gevent

+ +
    +
  • greenlet needs manual event switch.
  • +
  • gevent is based on greenlet. gevent has gevent.monkey.patch_all().
  • +
+ +

@asyncio.coroutine

+ +

From Python 3.8, async def deprecates @asyncio.coroutine

+ +

yield from

+ +

From Python 3.5, await deprecates yield from

+ +

scope of await

+ +

await can only be used in async def except in ipython

+ +

asyncio with queue

+ +

https://copdips.com/2023/01/python-aiohttp-rate-limit.html#example

+ +

aiohttp with rate limit

+ +

https://copdips.com/2023/01/python-aiohttp-rate-limit.html#example

+ +

get_running_loop vs get_event_loop

+ +
    +
  • +get_running_loop raises error if there’s no running loop.
  • +
  • +get_event_loop return running loop if exists, otherwise create one and return it.
  • +
+ +

Awaitable vs Future vs Task vs Coroutine

+ +
    +
  • +Awaitable is an object can be used in an await expression. There are three main types of awaitable objects: coroutines, Tasks, and Futures.
  • +
  • +Coroutine is declared with the async/await syntax is the preferred way of writing asyncio applications. Coroutines can await on Future objects until they either have a result or an exception set, or until they are cancelled. Python coroutines are awaitables and therefore can be awaited from other coroutines
  • +
  • +Future is an awaitable object. A Future represents an eventual result of an asynchronous operation. Not thread-safe.
  • +
  • +Task is subclass of Future that runs a Python coroutine. Not thread-safe. Tasks are used to schedule coroutines concurrently. When a coroutine is wrapped into a Task with functions like asyncio.create_task() the coroutine is automatically scheduled to run soon
  • +
+ +

ensure_future vs create_task

+ +
    +
  • +create_task is high-level introduced in Python 3.7 and accepts only coroutines, returns a Task object which is subclass of Future. create_task must be called inside a running event loop.
  • +
  • +ensure_future is low-level and accepts both coroutines and Futures. Task is subclass of Future. If ensure_future gets a Task, it will return the input Task itself, as Future is ensured. If ensure_future gets a coroutine, it will call create_task to wrap the input coroutine to a Task, then return it.
  • +
  • +create_task must be called inside an event loop, ensure_future can create an event loop if not exists.
  • +
  • +create_task can name the task.
  • +
+ +

create_task source code, ensure_future source code.

+ +

Warning on ensure_future: +Deprecated since version 3.10: Deprecation warning is emitted if obj is not a Future-like object and loop is not specified and there is no running event loop. Coroutine is not a Future-like object.

+ +

await vs await asyncio.wait_for() vs asyncio.shield()

+ +

Almost the same. but wait_for() can set timeout, and shield() can protect a task from being cancelled.

+ +
await task
+
+# throw TimeoutError if timeout
+await asyncio.wait_for(task, timeout)
+
+# still throw TimeoutError if timeout, but task.cancelled()
+# inside of try/catch asyncio.TimeoutError block will be ignored, a
+# nd task continues to run.
+await asyncio.wait_for(asyncio.shield(task), 1)
+
+ +
import asyncio
+
+async def delay(seconds):
+    print(f"start sleep {seconds}")
+    await asyncio.sleep(seconds)
+    print(f"end sleep")
+    return seconds
+
+async def main():
+    delay_task = asyncio.create_task(delay(2))
+    try:
+        result = await asyncio.wait_for(asyncio.shield(delay_task), 1)
+        print("return value:", result)
+    except asyncio.TimeoutError:
+        # shield() does not protect from timeout, so it throws TimeoutError
+        print("timeout")
+        # shield() does protect from being cancelled
+        print("whether the task is cancelled:", delay_task.cancelled())
+        # from where it throws TimeoutError, continue to run, and wait for it to finish
+        result = await delay_task
+        print("return value:", result)
+
+asyncio.run(main())
+
+"""
+start sleep 2
+timeout
+whether the task is cancelled: False
+end sleep
+return value: 2
+"""
+
+ +

simple aiohttp download demo

+ +
import asyncio
+import os
+
+import aiohttp
+
+
+async def download_img(session, url):
+    file_name = os.path.basename(url)
+    print(f"Downloading:{file_name}")
+    response = await session.get(url, ssl=False)
+    content = await response.content.read()
+    with open(file_name, mode="wb") as file:
+        file.write(content)
+    print(f"Done:{file_name}")
+
+
+async def main():
+    urls = [
+        "https://tenfei05.cfp.cn/creative/vcg/800/new/VCG41560336195.jpg",
+        "https://tenfei03.cfp.cn/creative/vcg/800/new/VCG41688057449.jpg",
+    ]
+    async with aiohttp.ClientSession() as session:
+        # download_img(session, url) returns a coroutine
+        tasks = [asyncio.create_task(download_img(session, url)) for url in urls]
+        await asyncio.wait(tasks)
+
+
+# loop = asyncio.get_event_loop()
+# loop.run_until_complete(main())
+
+# above commented 2 lines are low level API and could be replaced by
+# below asyncio.run() introduced by python 3.7.
+# asyncio.get_event_loop() creates new event loop if doesn't exist.
+# asyncio.run() raises exception if already in a event loop.
+# This function always creates a new event loop and closes it at the end.
+# It should be used as a main entry point for asyncio programs, and should
+# ideally only be called once.
+asyncio.run(main())
+
+ +

aiohttp rate limit example

+ +

https://copdips.com/2023/01/python-aiohttp-rate-limit.html

+ +

run coroutines concurrently as asyncio Tasks

+ +

await coroutines directly will run the coroutines sequentially, so 2 sleeps of 2s takes 4s:

+ +
import asyncio
+import time
+
+print(f"started at {time.strftime('%X')}")
+await asyncio.sleep(2)
+await asyncio.sleep(2)
+print(f"started at {time.strftime('%X')}")
+
+# output, duration 4s
+started at 23:48:19
+started at 23:48:23
+
+ +

Wrap the coroutines into tasks to run concurrently, 2 sleeps of 2s takes 2s:

+ +
import asyncio
+import time
+
+print(f"started at {time.strftime('%X')}")
+
+# create_task() must be inside a running event loop,
+# often created by asyncio.run()
+task1 = asyncio.create_task(asyncio.sleep(2))
+task2 = asyncio.create_task(asyncio.sleep(2))
+
+await task1
+await task2
+# or: await asyncio.wait([task1, task2])
+
+print(f"started at {time.strftime('%X')}")
+
+# output, duration 2s
+started at 23:49:08
+started at 23:49:10
+
+ +

schedule task without asyncio.create_task

+ +

The popular asyncio tasks usage is :

+ +
import asyncio
+import time
+
+async def main()
+    start = time.time()
+    tasks = [
+        asyncio.create_task(asyncio.sleep(2)),
+        asyncio.create_task(asyncio.sleep(2)),
+    ]
+    await asyncio.wait(tasks)
+    print(time.time() - start)
+
+asyncio.run(main())
+
+# output
+2.0010249614715576
+
+ +

asyncio.create_task() must be run inside a event loop, which is created by asyncio.run(). We can also not use asyncio.create_task() to create tasks too:

+ +
import asyncio
+import time
+
+coroutines = [
+    asyncio.sleep(2),
+    asyncio.sleep(2)
+]
+
+start = time.time()
+
+# asyncio.run() creates an event loop,
+# then asyncio.wait() wraps the coroutines into tasks.
+asyncio.run(asyncio.wait(coroutines))
+
+print(time.time() - start)
+
+# output
+2.0026962757110596
+
+ +

wait vs gather

+ +
    +
  • +wait is a low-level api, gather is a high-level api.
  • +
  • +wait has more options than gather: +
      +
    • async def wait(fs, *, loop=None, timeout=None, return_when=ALL_COMPLETED):
    • +
    • def gather(*coros_or_futures, loop=None, return_exceptions=False):
    • +
    +
  • +
  • +wait accepts lists of coroutines/Futures (asyncio.wait(tasks)), gather accepts each element a coroutine/Futures (asyncio.gather(*tasks)).
  • +
  • +wait returns two futures in a tuple: (done, pending), it’s a coroutine async def. To get the wait results: [d.result() for d in done], gather returns the results directly, it’s a standard def.
  • +
  • +

    gather can group tasks, and can also cancel groups of tasks:

    + +
    +
    async def main():
    +  group1 = asyncio.gather(f1(), f1())
    +  group2 = asyncio.gather(f2(), f2())
    +  group1.cancel()
    +  # if return_exceptions=False, `asyncio.exceptions.CancelledError` will be raised,
    +  # if return_exceptions=True, the exception will be returned in the results.
    +  # return_exceptions default value is False
    +  all_groups = await asyncio.gather(group1, group2, return_exceptions=True)
    +  print(all_groups)
    +
    +
  • +
  • If the wait task is cancelled, it simply throws an CancelledError and the waited tasks remain intact. Need to call task.cancel() to cancel the remaining tasks. If gather is cancelled, all submitted awaitables (that have not completed yet) are also cancelled. https://stackoverflow.com/a/64370162 +
  • +
+ +

task.add_done_callback

+ +
import asyncio
+from asyncio import Future
+from functools import partial
+
+
+async def f1():
+    await asyncio.sleep(2)
+    return "f1"
+
+
+def callback1(future: Future):
+    print(future.result())
+    print("this is callback1")
+
+
+def callback2(t1, future: Future):
+    print(t1)
+    print(future.result())
+
+
+async def main():
+
+    task1 = asyncio.create_task(f1())
+
+    # bind callback1 to task1
+    task1.add_done_callback(callback1)
+
+    # bind callback2 to task2 with param
+    task1.add_done_callback(partial(callback2, "this is param t1"))
+
+    # await task1
+    tasks = [task1]
+    await asyncio.wait(tasks)
+
+
+asyncio.run(main())
+
+ +

run_until_complete vs run_forever

+ +

run_until_complete is run_forever with _run_until_complete_cb as callback.

+ +
def _run_until_complete_cb(fut):
+    if not fut.cancelled():
+        exc = fut.exception()
+        if isinstance(exc, (SystemExit, KeyboardInterrupt)):
+            # Issue #22429: run_forever() already finished, no need to
+            # stop it.
+            return
+    futures._get_loop(fut).stop()
+
+ +

run_in_executor (or to_thread) to run un-asyncable functions

+ +

to_thread() calls loop = events.get_running_loop() and loop.run_in_executor() internally, source code here:

+ +
import asyncio
+import time
+from concurrent.futures import ThreadPoolExecutor
+
+
+# non asyncable function, will be wrapped into async task by loop.run_in_executor()
+def download_img(url):
+    print(f"Downloading:{url}")
+    time.sleep(1)
+    print(f"Downloaded:{url}")
+
+
+async def main():
+    executor = ThreadPoolExecutor(2)
+
+    loop = asyncio.get_running_loop()
+    tasks = []
+    for i in range(10):
+        # ThreadPoolExecutor is also the default executor, set None to use it.
+        # t = loop.run_in_executor(None, download_img, i)
+        t = loop.run_in_executor(executor, download_img, i)
+        tasks.append(t)
+
+    await asyncio.wait(tasks)
+
+
+asyncio.run(main())
+
+ +

run_in_executor() calls [ThreadPoolExecutor by default], and can also use ProcessPoolExecutor, source code here:

+ +
# asyncio.base_events.py
+def run_in_executor(self, executor, func, *args):
+    self._check_closed()
+    if self._debug:
+        self._check_callback(func, 'run_in_executor')
+    if executor is None:
+        executor = self._default_executor
+        # Only check when the default executor is being used
+        self._check_default_executor()
+        if executor is None:
+            executor = concurrent.futures.ThreadPoolExecutor(
+                thread_name_prefix='asyncio'
+            )
+            self._default_executor = executor
+    return futures.wrap_future(
+        executor.submit(func, *args), loop=self)
+
+ + +
+ +
+ + + + + + + +

+ Tags: + + + , + + + + +

+ + + + + + +

Updated:

+ +
+ + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2023/10/github-actions-get-azure-keyvault-secrets-action.html b/2023/10/github-actions-get-azure-keyvault-secrets-action.html new file mode 100644 index 00000000..5757eac9 --- /dev/null +++ b/2023/10/github-actions-get-azure-keyvault-secrets-action.html @@ -0,0 +1,807 @@ + + + + + + +Github Actions - copdips/get-azure-keyvault-secrets-action - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +
+ + +
+ + + +

Recently, I began a new project that requires migrating some process from Azure Pipelines to Github Actions. One of the tasks involves retrieving secrets from Azure Key Vault.

+ +

In Azure Pipelines, we have an official task called AzureKeyVault@2 designed for this purpose. However, its official counterpart in Github Actions, Azure/get-keyvault-secrets@v1, has been deprecated. The recommended alternative is Azure CLI. While Azure CLI is a suitable option, it operates in a bash shell without multithreading. If numerous secrets need to be fetched, this can be time-consuming.

+ +

Over the past weekend, I decided to write my own action using Python, leveraging asyncio. I avoided any additional third party Python modules like requests, aiohttp, or httpx, so no pip install needed.

+ +

As anticipated, the pure Python solution is notably faster than using the Azure CLI, and even surpasses the speed of the Azure Pipelines task AzureKeyVault@2. In my tests, it was able to retrieve the all the secrets from an Azure Key Vault within seconds.

+ +

The source code is at: copdips/get-azure-keyvault-secrets-action

+ +

And hereunder is the usage:

+ +
# in the calling workflow, user should first login to Azure
+- uses: Azure/login@v1
+  with:
+    # creds: ${{secrets.AZURE_CREDENTIALS}} is not recommended due to json secrets security concerns.
+    creds: '{"clientId":"${{ secrets.CLIENT_ID }}","clientSecret":"${{ secrets.CLIENT_SECRET }}","subscriptionId":"${{ secrets.SUBSCRIPTION_ID }}","tenantId":"${{ secrets.TENANT_ID }}"}'
+
+- name: Get Azure KeyVault secrets
+  id: get-azure-keyvault-secrets
+  uses: copdips/get-azure-keyvault-secrets-action@v1
+  with:
+    keyvault: {your_azure_keyvault_name}
+
+# Suppose there's a secret named client-secret in the Azure Key Vault,
+# so an env var named CLIENT_SECRET should be created by the action.
+# You won't see the secret value in the workflow log as it's masked by Github automatically.
+- name: Use secrets from env var
+  run: |
+    echo $CLIENT_SECRET
+    echo ${{ env.CLIENT_SECRET }}
+
+- name: Use secrets from output
+  run: |
+    echo $JSON_SECRETS | jq .CLIENT_SECRET -r
+  env:
+    JSON_SECRETS: ${{ steps.get-azure-keyvault-secrets.outputs.json }}
+
+ + + +
+ + + + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/2023/10/hashing-files.html b/2023/10/hashing-files.html new file mode 100644 index 00000000..5e3bd42c --- /dev/null +++ b/2023/10/hashing-files.html @@ -0,0 +1,821 @@ + + + + + + +Hashing files - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + + + + + + + + +
+ + + + + +
+ + + + + +
+ +
+

+ +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +
+ + +
+ + + +

During CI/CD processes, and particularly during CI, we frequently hash dependency files to create cache keys (referred to as key input in Github Action actions/cache and key parameter in Azure pipelines Cache@2 task). However, the default hash functions come with certain limitations like this comment. To address this, we can use the following pure Bash shell command to manually generate the hash value.

+ +

For Github Actions, we can use following snippet:

+ +
# github actions example
+inputs:
+  req-files:
+    description: >
+      requirements files separated by comma or space, glob pattern is allowed.
+      e.g. "requirements/*.txt, requirements.txt"
+    required: true
+runs:
+  using: "composite"
+  steps:
+    - name: Compute hash key
+      shell: bash
+      env:
+        REQ_FILES: ${{ inputs.req-files }}
+      run: |
+        files=$(echo "$REQ_FILES" | tr "," " ")
+        files_sep_by_space=""
+        for file in $files; do
+            files_sep_by_space="$files_sep_by_space $(ls $file | tr '\n' ' ')"
+        done
+        files_sep_by_space=$(echo $files_sep_by_space | tr ' ' '\n' | sort | uniq | tr '\n' ' ')
+        files_hash=$(cat $files_sep_by_space | md5sum | awk '{print $1}')
+        echo "files_hash: $files_hash"
+
+ +

For Azure pipelines, the process is nearly identical to the above Github Action example. The only difference is that we first need to convert the reqFiles parameter from an object to a string. But if you set the parameter type to string (as in the Github Action), the process becomes identical.

+ +
# azure pipelines example
+parameters:
+  - name: reqFiles
+    displayName: >
+      requirements files, glob pattern is allowed.
+      e.g.:
+      - requirements/*.txt
+      - requirements.txt
+    type: object
+  steps:
+    - script: |
+        files=$(echo "$REQ_FILES_JSON" | jq  '. | join(" ")' -r)
+        files_sep_by_space=""
+        for file in $files; do
+            files_sep_by_space="$files_sep_by_space $(ls $file | tr '\n' ' ')"
+        done
+        files_sep_by_space=$(echo $files_sep_by_space | tr ' ' '\n' | sort | uniq | tr '\n' ' ')
+        files_hash=$(cat $files_sep_by_space | md5sum | awk '{print $1}')
+        echo "files_hash: $files_hash"
+      displayName: Compute hash key
+      env:
+        REQ_FILES_JSON: "${{ convertToJson(parameters.reqFiles) }}"
+
+ +

When creating the cache key, we also need to include os version, the one provided by Github action and Azure pipelines environment vars are not precise enough, they do not give patch version number. We can generate the full os version by the following command cat /etc/os-release | grep -i "version=" | cut -c9- | tr -d '"' | tr ' ' '_'

+ + +
+ + + + + + + +
+ + + + + +
+ + +
+ + +

Leave a comment

+
+ +
+ + +
+ + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/CNAME b/CNAME new file mode 100644 index 00000000..0cae7cad --- /dev/null +++ b/CNAME @@ -0,0 +1 @@ +copdips.com \ No newline at end of file diff --git a/assets/css/main.css b/assets/css/main.css new file mode 100644 index 00000000..f1dba25c --- /dev/null +++ b/assets/css/main.css @@ -0,0 +1,5 @@ +.page__content .notice,.page__content .notice--primary,.page__content .notice--info,.page__content .notice--warning,.page__content .notice--success,.page__content .notice--danger{color:#222831}.page__footer{color:#fff !important}.page__footer-follow .social-icons .svg-inline--fa{color:inherit}/*! + * Minimal Mistakes Jekyll Theme 4.24.0 by Michael Rose + * Copyright 2013-2020 Michael Rose - mademistakes.com | @mmistakes + * Licensed under MIT (https://github.com/mmistakes/minimal-mistakes/blob/master/LICENSE) +*/.mfp-counter{font-family:Georgia,Times,serif}.mfp-bg{top:0;left:0;width:100%;height:100%;z-index:1042;overflow:hidden;position:fixed;background:#000;opacity:.8;filter:alpha(opacity=80)}.mfp-wrap{top:0;left:0;width:100%;height:100%;z-index:1043;position:fixed;outline:none !important;-webkit-backface-visibility:hidden}.mfp-container{text-align:center;position:absolute;width:100%;height:100%;left:0;top:0;padding:0 8px;-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}.mfp-container:before{content:"";display:inline-block;height:100%;vertical-align:middle}.mfp-align-top .mfp-container:before{display:none}.mfp-content{position:relative;display:inline-block;vertical-align:middle;margin:0 auto;text-align:left;z-index:1045}.mfp-inline-holder .mfp-content,.mfp-ajax-holder .mfp-content{width:100%;cursor:auto}.mfp-ajax-cur{cursor:progress}.mfp-zoom-out-cur,.mfp-zoom-out-cur .mfp-image-holder .mfp-close{cursor:-moz-zoom-out;cursor:-webkit-zoom-out;cursor:zoom-out}.mfp-zoom{cursor:pointer;cursor:-webkit-zoom-in;cursor:-moz-zoom-in;cursor:zoom-in}.mfp-auto-cursor .mfp-content{cursor:auto}.mfp-close,.mfp-arrow,.mfp-preloader,.mfp-counter{-webkit-user-select:none;-moz-user-select:none;user-select:none}.mfp-loading.mfp-figure{display:none}.mfp-hide{display:none !important}.mfp-preloader{color:#ccc;position:absolute;top:50%;width:auto;text-align:center;margin-top:-0.8em;left:8px;right:8px;z-index:1044}.mfp-preloader a{color:#ccc}.mfp-preloader a:hover{color:#fff}.mfp-s-ready .mfp-preloader{display:none}.mfp-s-error .mfp-content{display:none}button.mfp-close,button.mfp-arrow{overflow:visible;cursor:pointer;background:rgba(0,0,0,0);border:0;-webkit-appearance:none;display:block;outline:none;padding:0;z-index:1046;-webkit-box-shadow:none;box-shadow:none}button::-moz-focus-inner{padding:0;border:0}.mfp-close{width:44px;height:44px;line-height:44px;position:absolute;right:0;top:0;text-decoration:none;text-align:center;opacity:1;filter:alpha(opacity=100);padding:0 0 18px 10px;color:#fff;font-style:normal;font-size:28px;font-family:Georgia,Times,serif}.mfp-close:hover,.mfp-close:focus{opacity:1;filter:alpha(opacity=100)}.mfp-close:active{top:1px}.mfp-close-btn-in .mfp-close{color:#fff}.mfp-image-holder .mfp-close,.mfp-iframe-holder .mfp-close{color:#fff;right:-6px;text-align:right;padding-right:6px;width:100%}.mfp-counter{position:absolute;top:0;right:0;color:#ccc;font-size:12px;line-height:18px}.mfp-arrow{position:absolute;opacity:1;filter:alpha(opacity=100);margin:0;top:50%;margin-top:-55px;padding:0;width:90px;height:110px;-webkit-tap-highlight-color:rgba(0,0,0,0)}.mfp-arrow:active{margin-top:-54px}.mfp-arrow:hover,.mfp-arrow:focus{opacity:1;filter:alpha(opacity=100)}.mfp-arrow:before,.mfp-arrow:after,.mfp-arrow .mfp-b,.mfp-arrow .mfp-a{content:"";display:block;width:0;height:0;position:absolute;left:0;top:0;margin-top:35px;margin-left:35px;border:medium inset rgba(0,0,0,0)}.mfp-arrow:after,.mfp-arrow .mfp-a{border-top-width:13px;border-bottom-width:13px;top:8px}.mfp-arrow:before,.mfp-arrow .mfp-b{border-top-width:21px;border-bottom-width:21px;opacity:.7}.mfp-arrow-left{left:0}.mfp-arrow-left:after,.mfp-arrow-left .mfp-a{border-right:17px solid #fff;margin-left:31px}.mfp-arrow-left:before,.mfp-arrow-left .mfp-b{margin-left:25px;border-right:27px solid #fff}.mfp-arrow-right{right:0}.mfp-arrow-right:after,.mfp-arrow-right .mfp-a{border-left:17px solid #fff;margin-left:39px}.mfp-arrow-right:before,.mfp-arrow-right .mfp-b{border-left:27px solid #fff}.mfp-iframe-holder{padding-top:40px;padding-bottom:40px}.mfp-iframe-holder .mfp-content{line-height:0;width:100%;max-width:900px}.mfp-iframe-holder .mfp-close{top:-40px}.mfp-iframe-scaler{width:100%;height:0;overflow:hidden;padding-top:56.25%}.mfp-iframe-scaler iframe{position:absolute;display:block;top:0;left:0;width:100%;height:100%;box-shadow:0 0 8px rgba(0,0,0,.6);background:#000}img.mfp-img{width:auto;max-width:100%;height:auto;display:block;line-height:0;-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box;padding:40px 0 40px;margin:0 auto}.mfp-figure{line-height:0}.mfp-figure:after{content:"";position:absolute;left:0;top:40px;bottom:40px;display:block;right:0;width:auto;height:auto;z-index:-1;box-shadow:0 0 8px rgba(0,0,0,.6);background:#444}.mfp-figure small{color:#bdbdbd;display:block;font-size:12px;line-height:14px}.mfp-figure figure{margin:0}.mfp-figure figcaption{margin-top:0;margin-bottom:0}.mfp-bottom-bar{margin-top:-36px;position:absolute;top:100%;left:0;width:100%;cursor:auto}.mfp-title{text-align:left;line-height:18px;color:#f3f3f3;word-wrap:break-word;padding-right:36px}.mfp-image-holder .mfp-content{max-width:100%}.mfp-gallery .mfp-image-holder .mfp-figure{cursor:pointer}@media screen and (max-width: 800px)and (orientation: landscape),screen and (max-height: 300px){.mfp-img-mobile .mfp-image-holder{padding-left:0;padding-right:0}.mfp-img-mobile img.mfp-img{padding:0}.mfp-img-mobile .mfp-figure:after{top:0;bottom:0}.mfp-img-mobile .mfp-figure small{display:inline;margin-left:5px}.mfp-img-mobile .mfp-bottom-bar{background:rgba(0,0,0,.6);bottom:0;margin:0;top:auto;padding:3px 5px;position:fixed;-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}.mfp-img-mobile .mfp-bottom-bar:empty{padding:0}.mfp-img-mobile .mfp-counter{right:5px;top:3px}.mfp-img-mobile .mfp-close{top:0;right:0;width:35px;height:35px;line-height:35px;background:rgba(0,0,0,.6);position:fixed;text-align:center;padding:0}}@media all and (max-width: 900px){.mfp-arrow{-webkit-transform:scale(0.75);transform:scale(0.75)}.mfp-arrow-left{-webkit-transform-origin:0;transform-origin:0}.mfp-arrow-right{-webkit-transform-origin:100%;transform-origin:100%}.mfp-container{padding-left:6px;padding-right:6px}}.mfp-ie7 .mfp-img{padding:0}.mfp-ie7 .mfp-bottom-bar{width:600px;left:50%;margin-left:-300px;margin-top:5px;padding-bottom:5px}.mfp-ie7 .mfp-container{padding:0}.mfp-ie7 .mfp-content{padding-top:44px}.mfp-ie7 .mfp-close{top:0;right:0;padding-top:0}button:focus,a:focus{outline:thin dotted #666;outline:5px auto #666;outline-offset:-2px}*{box-sizing:border-box}html{box-sizing:border-box;background-color:#eee;font-size:16px;-webkit-text-size-adjust:100%;-ms-text-size-adjust:100%}@media(min-width: 48em){html{font-size:18px}}@media(min-width: 64em){html{font-size:20px}}@media(min-width: 95em){html{font-size:22px}}body{margin:0}::-moz-selection{color:#fff;background:#000}::selection{color:#fff;background:#000}article,aside,details,figcaption,figure,footer,header,hgroup,main,nav,section{display:block}audio,canvas,video{display:inline-block;*display:inline;*zoom:1}audio:not([controls]){display:none}a{color:#f16334}a:hover,a:active{outline:0}sub,sup{position:relative;font-size:75%;line-height:0;vertical-align:baseline}sup{top:-0.5em}sub{bottom:-0.25em}img{max-width:100%;width:auto\9 ;height:auto;vertical-align:middle;border:0;-ms-interpolation-mode:bicubic}#map_canvas img,.google-maps img{max-width:none}button,input,select,textarea{margin:0;font-size:100%;vertical-align:middle}button,input{*overflow:visible;line-height:normal}button::-moz-focus-inner,input::-moz-focus-inner{padding:0;border:0}button,html input[type=button],input[type=reset],input[type=submit]{-webkit-appearance:button;cursor:pointer}label,select,button,input[type=button],input[type=reset],input[type=submit],input[type=radio],input[type=checkbox]{cursor:pointer}input[type=search]{box-sizing:border-box;-webkit-appearance:textfield}input[type=search]::-webkit-search-decoration,input[type=search]::-webkit-search-cancel-button{-webkit-appearance:none}textarea{overflow:auto;vertical-align:top}html{position:relative;min-height:100%}body{margin:0;padding:0;color:#222831;font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;line-height:1.5}body.overflow--hidden{overflow:hidden}h1,h2,h3,h4,h5,h6{margin:2em 0 .5em;line-height:1.2;font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;font-weight:bold}h1{margin-top:0;font-size:1.563em}h2{font-size:1.25em}h3{font-size:1.125em}h4{font-size:1.0625em}h5{font-size:1.03125em}h6{font-size:1em}small,.small{font-size:.75em}p{margin-bottom:1.3em}u,ins{text-decoration:none;border-bottom:1px solid #222831}u a,ins a{color:inherit}del a{color:inherit}p,pre,blockquote,ul,ol,dl,figure,table,fieldset{orphans:3;widows:3}abbr[title],abbr[data-original-title]{text-decoration:none;cursor:help;border-bottom:1px dotted #222831}blockquote{margin:2em 1em 2em 0;padding-left:1em;padding-right:1em;font-style:italic;border-left:.25em solid #666}blockquote cite{font-style:italic}blockquote cite:before{content:"—";padding-right:5px}a:visited{color:#f7a185}a:hover{color:#b54a27;outline:0}tt,code,kbd,samp,pre{font-family:Monaco,Consolas,"Lucida Console",monospace}pre{overflow-x:auto}p>code,a>code,li>code,figcaption>code,td>code{margin:0;padding:0;line-height:1.5;font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;font-size:1em;color:#c72557;background:#f9f2f4;border-radius:4px}p>code:before,p>code:after,a>code:before,a>code:after,li>code:before,li>code:after,figcaption>code:before,figcaption>code:after,td>code:before,td>code:after{letter-spacing:-0.2em;content:" "}hr{display:block;margin:1em 0;border:0;border-top:1px solid #cecfd1}ul li,ol li{margin-bottom:.5em}li ul,li ol{margin-top:.5em}figure{display:-webkit-box;display:flex;-webkit-box-pack:justify;justify-content:space-between;-webkit-box-align:start;align-items:flex-start;flex-wrap:wrap;margin:2em 0}figure img,figure iframe,figure .fluid-width-video-wrapper{margin-bottom:1em}figure img{width:100%;border-radius:4px;-webkit-transition:all .2s ease-in-out;transition:all .2s ease-in-out}figure>a{display:block}@media(min-width: 37.5em){figure.half>a,figure.half>img{width:calc(50% - .5em)}}figure.half figcaption{width:100%}@media(min-width: 37.5em){figure.third>a,figure.third>img{width:calc(33.3333% - .5em)}}figure.third figcaption{width:100%}figcaption{margin-bottom:.5em;color:#393e46;font-family:Georgia,Times,serif;font-size:.75em}figcaption a{-webkit-transition:all .2s ease-in-out;transition:all .2s ease-in-out}figcaption a:hover{color:#b54a27}svg:not(:root){overflow:hidden}nav ul{margin:0;padding:0}nav li{list-style:none}nav a{text-decoration:none}nav ul li,nav ol li{margin-bottom:0}nav li ul,nav li ol{margin-top:0}b,i,strong,em,blockquote,p,q,span,figure,img,h1,h2,header,input,a,tr,td,form button,input[type=submit],.btn,.highlight,.archive__item-teaser{-webkit-transition:all .2s ease-in-out;transition:all .2s ease-in-out}form{margin:0 0 5px 0;padding:1em;background-color:#f2f3f3}form fieldset{margin-bottom:5px;padding:0;border-width:0}form legend{display:block;width:100%;margin-bottom:10px;*margin-left:-7px;padding:0;color:#222831;border:0;white-space:normal}form p{margin-bottom:2.5px}form ul{list-style-type:none;margin:0 0 5px 0;padding:0}form br{display:none}label,input,button,select,textarea{vertical-align:baseline;*vertical-align:middle}input,button,select,textarea{box-sizing:border-box;font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif}label{display:block;margin-bottom:.25em;color:#222831;cursor:pointer}label small{font-size:.75em}label input,label textarea,label select{display:block}input,textarea,select{display:inline-block;width:100%;padding:.25em;margin-bottom:.5em;color:#222831;background-color:#eee;border:#cecfd1;border-radius:4px;box-shadow:0 1px 1px rgba(0,0,0,.125)}.input-mini{width:60px}.input-small{width:90px}input[type=image],input[type=checkbox],input[type=radio]{width:auto;height:auto;padding:0;margin:3px 0;*margin-top:0;line-height:normal;cursor:pointer;border-radius:0;border:0 \9 ;box-shadow:none}input[type=checkbox],input[type=radio]{box-sizing:border-box;padding:0;*width:13px;*height:13px}input[type=image]{border:0}input[type=file]{width:auto;padding:initial;line-height:initial;border:initial;background-color:rgba(0,0,0,0);background-color:initial;box-shadow:none}input[type=button],input[type=reset],input[type=submit]{width:auto;height:auto;cursor:pointer;*overflow:visible}select,input[type=file]{*margin-top:4px}select{width:auto;background-color:#fff}select[multiple],select[size]{height:auto}textarea{resize:vertical;height:auto;overflow:auto;vertical-align:top}input[type=hidden]{display:none}.form{position:relative}.radio,.checkbox{padding-left:18px;font-weight:normal}.radio input[type=radio],.checkbox input[type=checkbox]{float:left;margin-left:-18px}.radio.inline,.checkbox.inline{display:inline-block;padding-top:5px;margin-bottom:0;vertical-align:middle}.radio.inline+.radio.inline,.checkbox.inline+.checkbox.inline{margin-left:10px}input[disabled],select[disabled],textarea[disabled],input[readonly],select[readonly],textarea[readonly]{opacity:.5;cursor:not-allowed}input:focus,textarea:focus{border-color:#666;outline:0;outline:thin dotted \9 ;box-shadow:inset 0 1px 3px rgba(34,40,49,.06),0 0 5px rgba(102,102,102,.7)}input[type=file]:focus,input[type=radio]:focus,input[type=checkbox]:focus,select:focus{box-shadow:none}.help-block,.help-inline{color:#393e46}.help-block{display:block;margin-bottom:1em;line-height:1em}.help-inline{display:inline-block;vertical-align:middle;padding-left:5px}.form-group{margin-bottom:5px;padding:0;border-width:0}.form-inline input,.form-inline textarea,.form-inline select{display:inline-block;margin-bottom:0}.form-inline label{display:inline-block}.form-inline .radio,.form-inline .checkbox,.form-inline .radio{padding-left:0;margin-bottom:0;vertical-align:middle}.form-inline .radio input[type=radio],.form-inline .checkbox input[type=checkbox]{float:left;margin-left:0;margin-right:3px}.form-search input,.form-search textarea,.form-search select{display:inline-block;margin-bottom:0}.form-search .search-query{padding-left:14px;padding-right:14px;margin-bottom:0;border-radius:14px}.form-search label{display:inline-block}.form-search .radio,.form-search .checkbox,.form-inline .radio{padding-left:0;margin-bottom:0;vertical-align:middle}.form-search .radio input[type=radio],.form-search .checkbox input[type=checkbox]{float:left;margin-left:0;margin-right:3px}.form--loading:before{content:""}.form--loading .form__spinner{display:block}.form:before{position:absolute;top:0;left:0;width:100%;height:100%;background-color:rgba(255,255,255,.7);z-index:10}.form__spinner{display:none;position:absolute;top:50%;left:50%;z-index:11}table{display:block;margin-bottom:1em;width:100%;font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;font-size:.75em;border-collapse:collapse;overflow-x:auto}table+table{margin-top:1em}thead{background-color:#cecfd1;border-bottom:2px solid #9b9b9d}th{padding:.5em;font-weight:bold;text-align:left}td{padding:.5em;border-bottom:1px solid #9b9b9d}tr,td,th{vertical-align:middle}@-webkit-keyframes intro{0%{opacity:0}100%{opacity:1}}@keyframes intro{0%{opacity:0}100%{opacity:1}}.btn{display:inline-block;margin-bottom:.25em;padding:.5em 1em;font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;font-size:.75em;font-weight:bold;text-align:center;text-decoration:none;border-width:0;border-radius:4px;cursor:pointer}.btn .icon{margin-right:.5em}.btn .icon+.hidden{margin-left:-0.5em}.btn--primary{background-color:#666;color:#fff}.btn--primary:visited{background-color:#666;color:#fff}.btn--primary:hover{background-color:#525252;color:#fff}.btn--inverse{background-color:#fff;color:#3d4144;border:1px solid #cecfd1}.btn--inverse:visited{background-color:#fff;color:#3d4144}.btn--inverse:hover{background-color:#ccc;color:#3d4144}.btn--light-outline{background-color:rgba(0,0,0,0);color:#fff;border:1px solid #fff}.btn--light-outline:visited{background-color:rgba(0,0,0,0);color:#fff}.btn--light-outline:hover{background-color:rgba(0,0,0,.2);color:#fff}.btn--success{background-color:#27ae60;color:#fff}.btn--success:visited{background-color:#27ae60;color:#fff}.btn--success:hover{background-color:#1f8b4d;color:#fff}.btn--warning{background-color:#e67e22;color:#fff}.btn--warning:visited{background-color:#e67e22;color:#fff}.btn--warning:hover{background-color:#b8651b;color:#fff}.btn--danger{background-color:#c0392b;color:#fff}.btn--danger:visited{background-color:#c0392b;color:#fff}.btn--danger:hover{background-color:#9a2e22;color:#fff}.btn--info{background-color:#03a9f4;color:#fff}.btn--info:visited{background-color:#03a9f4;color:#fff}.btn--info:hover{background-color:#0287c3;color:#fff}.btn--facebook{background-color:#3b5998;color:#fff}.btn--facebook:visited{background-color:#3b5998;color:#fff}.btn--facebook:hover{background-color:#2f477a;color:#fff}.btn--twitter{background-color:#55acee;color:#fff}.btn--twitter:visited{background-color:#55acee;color:#fff}.btn--twitter:hover{background-color:#448abe;color:#fff}.btn--linkedin{background-color:#007bb6;color:#fff}.btn--linkedin:visited{background-color:#007bb6;color:#fff}.btn--linkedin:hover{background-color:#006292;color:#fff}.btn--block{display:block;width:100%}.btn--block+.btn--block{margin-top:.25em}.btn--disabled{pointer-events:none;cursor:not-allowed;filter:alpha(opacity=65);box-shadow:none;opacity:.65}.btn--x-large{font-size:1.25em}.btn--large{font-size:1em}.btn--small{font-size:.6875em}.notice{margin:2em 0 !important;padding:1em;color:#222831;font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;font-size:.75em !important;text-indent:initial;background-color:#e4e5e6;border-radius:4px;box-shadow:0 1px 1px rgba(189,193,196,.25)}.notice h4{margin-top:0 !important;margin-bottom:.75em;line-height:inherit}.page__content .notice h4{margin-bottom:0;font-size:1em}.notice p:last-child{margin-bottom:0 !important}.notice h4+p{margin-top:0;padding-top:0}.notice a{color:#aaaeb0}.notice a:hover{color:#5f6162}blockquote.notice{border-left-color:#aaaeb0}.notice code{background-color:#e9eaea}.notice pre code{background-color:inherit}.notice ul:last-child{margin-bottom:0}.notice--primary{margin:2em 0 !important;padding:1em;color:#222831;font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;font-size:.75em !important;text-indent:initial;background-color:#d3d3d3;border-radius:4px;box-shadow:0 1px 1px rgba(102,102,102,.25)}.notice--primary h4{margin-top:0 !important;margin-bottom:.75em;line-height:inherit}.page__content .notice--primary h4{margin-bottom:0;font-size:1em}.notice--primary p:last-child{margin-bottom:0 !important}.notice--primary h4+p{margin-top:0;padding-top:0}.notice--primary a{color:#5c5c5c}.notice--primary a:hover{color:#333}blockquote.notice--primary{border-left-color:#5c5c5c}.notice--primary code{background-color:#e0e0e0}.notice--primary pre code{background-color:inherit}.notice--primary ul:last-child{margin-bottom:0}.notice--info{margin:2em 0 !important;padding:1em;color:#222831;font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;font-size:.75em !important;text-indent:initial;background-color:#bfe0ef;border-radius:4px;box-shadow:0 1px 1px rgba(3,169,244,.25)}.notice--info h4{margin-top:0 !important;margin-bottom:.75em;line-height:inherit}.page__content .notice--info h4{margin-bottom:0;font-size:1em}.notice--info p:last-child{margin-bottom:0 !important}.notice--info h4+p{margin-top:0;padding-top:0}.notice--info a{color:#0398dc}.notice--info a:hover{color:#02557a}blockquote.notice--info{border-left-color:#0398dc}.notice--info code{background-color:#d7e7ef}.notice--info pre code{background-color:inherit}.notice--info ul:last-child{margin-bottom:0}.notice--warning{margin:2em 0 !important;padding:1em;color:#222831;font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;font-size:.75em !important;text-indent:initial;background-color:#ecd8c5;border-radius:4px;box-shadow:0 1px 1px rgba(230,126,34,.25)}.notice--warning h4{margin-top:0 !important;margin-bottom:.75em;line-height:inherit}.page__content .notice--warning h4{margin-bottom:0;font-size:1em}.notice--warning p:last-child{margin-bottom:0 !important}.notice--warning h4+p{margin-top:0;padding-top:0}.notice--warning a{color:#cf711f}.notice--warning a:hover{color:#733f11}blockquote.notice--warning{border-left-color:#cf711f}.notice--warning code{background-color:#ede3da}.notice--warning pre code{background-color:inherit}.notice--warning ul:last-child{margin-bottom:0}.notice--success{margin:2em 0 !important;padding:1em;color:#222831;font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;font-size:.75em !important;text-indent:initial;background-color:#c6e1d2;border-radius:4px;box-shadow:0 1px 1px rgba(39,174,96,.25)}.notice--success h4{margin-top:0 !important;margin-bottom:.75em;line-height:inherit}.page__content .notice--success h4{margin-bottom:0;font-size:1em}.notice--success p:last-child{margin-bottom:0 !important}.notice--success h4+p{margin-top:0;padding-top:0}.notice--success a{color:#239d56}.notice--success a:hover{color:#145730}blockquote.notice--success{border-left-color:#239d56}.notice--success code{background-color:#dae8e0}.notice--success pre code{background-color:inherit}.notice--success ul:last-child{margin-bottom:0}.notice--danger{margin:2em 0 !important;padding:1em;color:#222831;font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;font-size:.75em !important;text-indent:initial;background-color:#e5cac7;border-radius:4px;box-shadow:0 1px 1px rgba(192,57,43,.25)}.notice--danger h4{margin-top:0 !important;margin-bottom:.75em;line-height:inherit}.page__content .notice--danger h4{margin-bottom:0;font-size:1em}.notice--danger p:last-child{margin-bottom:0 !important}.notice--danger h4+p{margin-top:0;padding-top:0}.notice--danger a{color:#ad3327}.notice--danger a:hover{color:#601d16}blockquote.notice--danger{border-left-color:#ad3327}.notice--danger code{background-color:#e9dcdb}.notice--danger pre code{background-color:inherit}.notice--danger ul:last-child{margin-bottom:0}.masthead{position:relative;border-bottom:1px solid #cecfd1;-webkit-animation:intro .3s both;animation:intro .3s both;-webkit-animation-delay:.15s;animation-delay:.15s;z-index:20}.masthead__inner-wrap{clear:both;margin-left:auto;margin-right:auto;padding:1em;max-width:100%;display:-webkit-box;display:-ms-flexbox;display:flex;-webkit-box-pack:justify;-ms-flex-pack:justify;justify-content:space-between;font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif}.masthead__inner-wrap::after{clear:both;content:"";display:table}@media(min-width: 95em){.masthead__inner-wrap{max-width:1520px}}.masthead__inner-wrap nav{z-index:10}.masthead__inner-wrap a{text-decoration:none}.site-logo img{max-height:2rem}.site-title{display:-webkit-box;display:-ms-flexbox;display:flex;-ms-flex-item-align:center;align-self:center;font-weight:bold}.site-subtitle{display:block;font-size:.625em}.masthead__menu{float:left;margin-left:0;margin-right:0;width:100%;clear:both}.masthead__menu .site-nav{margin-left:0}@media(min-width: 37.5em){.masthead__menu .site-nav{float:right}}.masthead__menu ul{margin:0;padding:0;clear:both;list-style-type:none}.masthead__menu-item{display:block;list-style-type:none;white-space:nowrap}.masthead__menu-item--lg{padding-right:2em;font-weight:700}.breadcrumbs{clear:both;margin:0 auto;max-width:100%;padding-left:1em;padding-right:1em;font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;-webkit-animation:intro .3s both;animation:intro .3s both;-webkit-animation-delay:.3s;animation-delay:.3s}.breadcrumbs::after{clear:both;content:"";display:table}@media(min-width: 95em){.breadcrumbs{max-width:1520px}}.breadcrumbs ol{padding:0;list-style:none;font-size:.75em}@media(min-width: 64em){.breadcrumbs ol{float:right;width:calc(100% - 200px)}}@media(min-width: 95em){.breadcrumbs ol{width:calc(100% - 300px)}}.breadcrumbs li{display:inline}.breadcrumbs .current{font-weight:bold}.pagination{clear:both;float:left;margin-top:1em;padding-top:1em;width:100%}.pagination::after{clear:both;content:"";display:table}.pagination ul{margin:0;padding:0;list-style-type:none;font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif}.pagination li{display:block;float:left;margin-left:-1px}.pagination li a{display:block;margin-bottom:.25em;padding:.5em 1em;font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;font-size:14px;font-weight:bold;line-height:1.5;text-align:center;text-decoration:none;color:#393e46;border:1px solid #9b9b9d;border-radius:0}.pagination li a:hover{color:#b54a27}.pagination li a.current,.pagination li a.current.disabled{color:#fff;background:#666}.pagination li a.disabled{color:rgba(57,62,70,.5);pointer-events:none;cursor:not-allowed}.pagination li:first-child{margin-left:0}.pagination li:first-child a{border-top-left-radius:4px;border-bottom-left-radius:4px}.pagination li:last-child a{border-top-right-radius:4px;border-bottom-right-radius:4px}.pagination--pager{display:block;padding:1em 2em;float:left;width:50%;font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;font-size:1em;font-weight:bold;text-align:center;text-decoration:none;color:#393e46;border:1px solid #9b9b9d;border-radius:4px}.pagination--pager:hover{background-color:#393e46;color:#fff}.pagination--pager:first-child{border-top-right-radius:0;border-bottom-right-radius:0}.pagination--pager:last-child{margin-left:-1px;border-top-left-radius:0;border-bottom-left-radius:0}.pagination--pager.disabled{color:rgba(57,62,70,.5);pointer-events:none;cursor:not-allowed}.page__content+.pagination,.page__meta+.pagination,.comment__date+.pagination,.page__share+.pagination,.page__comments+.pagination{margin-top:2em;padding-top:2em;border-top:1px solid #cecfd1}.greedy-nav{position:relative;display:-webkit-box;display:-ms-flexbox;display:flex;-webkit-box-align:center;-ms-flex-align:center;align-items:center;min-height:2em;background:#eee}.greedy-nav a{display:block;margin:0 1rem;color:#222831;text-decoration:none;-webkit-transition:none;transition:none}.greedy-nav a:hover{color:#222831}.greedy-nav a.site-logo{margin-left:0;margin-right:.5rem}.greedy-nav a.site-title{margin-left:0}.greedy-nav img{-webkit-transition:none;transition:none}.greedy-nav__toggle{-ms-flex-item-align:center;align-self:center;height:2rem;border:0;outline:none;background-color:rgba(0,0,0,0);cursor:pointer}.greedy-nav .visible-links{display:-webkit-box;display:-ms-flexbox;display:flex;-webkit-box-pack:end;-ms-flex-pack:end;justify-content:flex-end;-webkit-box-flex:1;-ms-flex:1;flex:1;overflow:hidden}.greedy-nav .visible-links li{-webkit-box-flex:0;-ms-flex:none;flex:none}.greedy-nav .visible-links a{position:relative}.greedy-nav .visible-links a:before{content:"";position:absolute;left:0;bottom:0;height:4px;background:#666;width:100%;-webkit-transition:all .2s ease-in-out;transition:all .2s ease-in-out;-webkit-transform:scaleX(0) translate3d(0, 0, 0);transform:scaleX(0) translate3d(0, 0, 0)}.greedy-nav .visible-links a:hover:before{-webkit-transform:scaleX(1);-ms-transform:scaleX(1);transform:scaleX(1)}.greedy-nav .hidden-links{position:absolute;top:100%;right:0;margin-top:15px;padding:5px;border:1px solid #cecfd1;border-radius:4px;background:#eee;-webkit-box-shadow:0 2px 4px 0 rgba(0,0,0,.16),0 2px 10px 0 rgba(0,0,0,.12);box-shadow:0 2px 4px 0 rgba(0,0,0,.16),0 2px 10px 0 rgba(0,0,0,.12)}.greedy-nav .hidden-links.hidden{display:none}.greedy-nav .hidden-links a{margin:0;padding:10px 20px;font-size:1em}.greedy-nav .hidden-links a:hover{color:#222831;background:#d3d4d6}.greedy-nav .hidden-links:before{content:"";position:absolute;top:-11px;right:10px;width:0;border-style:solid;border-width:0 10px 10px;border-color:#cecfd1 rgba(0,0,0,0);display:block;z-index:0}.greedy-nav .hidden-links:after{content:"";position:absolute;top:-10px;right:10px;width:0;border-style:solid;border-width:0 10px 10px;border-color:#eee rgba(0,0,0,0);display:block;z-index:1}.greedy-nav .hidden-links li{display:block;border-bottom:1px solid #cecfd1}.greedy-nav .hidden-links li:last-child{border-bottom:none}.no-js .greedy-nav .visible-links{-ms-flex-wrap:wrap;flex-wrap:wrap;overflow:visible}.nav__list{margin-bottom:1.5em}.nav__list input[type=checkbox],.nav__list label{display:none}@media(max-width: 63.9375em){.nav__list label{position:relative;display:inline-block;padding:.5em 2.5em .5em 1em;color:#7a8288;font-size:.75em;font-weight:bold;border:1px solid #bdc1c4;border-radius:4px;z-index:20;-webkit-transition:.2s ease-out;transition:.2s ease-out;cursor:pointer}.nav__list label:before,.nav__list label:after{content:"";position:absolute;right:1em;top:1.25em;width:.75em;height:.125em;line-height:1;background-color:#7a8288;-webkit-transition:.2s ease-out;transition:.2s ease-out}.nav__list label:after{-webkit-transform:rotate(90deg);-ms-transform:rotate(90deg);transform:rotate(90deg)}.nav__list label:hover{color:#fff;border-color:#7a8288;background-color:#333}.nav__list label:hover:before,.nav__list label:hover:after{background-color:#fff}.nav__list input:checked+label{color:#fff;background-color:#333}.nav__list input:checked+label:before,.nav__list input:checked+label:after{background-color:#fff}.nav__list label:hover:after{-webkit-transform:rotate(90deg);-ms-transform:rotate(90deg);transform:rotate(90deg)}.nav__list input:checked+label:hover:after{-webkit-transform:rotate(0);-ms-transform:rotate(0);transform:rotate(0)}.nav__list ul{margin-bottom:1em}.nav__list a{display:block;padding:.25em 0}}@media(max-width: 63.9375em)and (min-width: 64em){.nav__list a{padding-top:.125em;padding-bottom:.125em}}@media(max-width: 63.9375em){.nav__list a:hover{text-decoration:underline}}.nav__list .nav__items{margin:0;font-size:1.25rem}.nav__list .nav__items a{color:inherit}.nav__list .nav__items .active{margin-left:-0.5em;padding-left:.5em;padding-right:.5em;font-weight:bold}@media(max-width: 63.9375em){.nav__list .nav__items{position:relative;max-height:0;opacity:0%;overflow:hidden;z-index:10;-webkit-transition:.3s ease-in-out;transition:.3s ease-in-out;-webkit-transform:translate(0, 10%);-ms-transform:translate(0, 10%);transform:translate(0, 10%)}}@media(max-width: 63.9375em){.nav__list input:checked~.nav__items{-webkit-transition:.5s ease-in-out;transition:.5s ease-in-out;max-height:9999px;overflow:visible;opacity:1;margin-top:1em;-webkit-transform:translate(0, 0);-ms-transform:translate(0, 0);transform:translate(0, 0)}}.nav__title{margin:0;padding:.5rem .75rem;font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;font-size:1em;font-weight:bold}.nav__sub-title{display:block;margin:.5rem 0;padding:.25rem 0;font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;font-size:.75em;font-weight:bold;text-transform:uppercase;border-bottom:1px solid #cecfd1}.toc{font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;color:#7a8288;background-color:#eee;border:1px solid #cecfd1;border-radius:4px;-webkit-box-shadow:0 1px 1px rgba(0,0,0,.125);box-shadow:0 1px 1px rgba(0,0,0,.125)}.toc .nav__title{color:#fff;font-size:.75em;background:#666;border-top-left-radius:4px;border-top-right-radius:4px}.toc .active a{background-color:#e0e0e0;color:#3d4144}.toc__menu{margin:0;padding:0;width:100%;list-style:none;font-size:.75em}@media(min-width: 64em){.toc__menu{font-size:.6875em}}.toc__menu a{display:block;padding:.25rem .75rem;color:#393e46;font-weight:bold;line-height:1.5;border-bottom:1px solid #cecfd1}.toc__menu a:hover{color:#222831}.toc__menu li ul>li a{padding-left:1.25rem;font-weight:normal}.toc__menu li ul li ul>li a{padding-left:1.75rem}.toc__menu li ul li ul li ul>li a{padding-left:2.25rem}.toc__menu li ul li ul li ul li ul>li a{padding-left:2.75rem}.toc__menu li ul li ul li ul li ul li ul>li a{padding-left:3.25rem}.page__footer{clear:both;float:left;margin-left:0;margin-right:0;width:100%;margin-top:3em;color:#393e46;-webkit-animation:intro .3s both;animation:intro .3s both;-webkit-animation-delay:.45s;animation-delay:.45s;background-color:#666}.page__footer::after{clear:both;content:"";display:table}.page__footer footer{clear:both;margin-left:auto;margin-right:auto;margin-top:2em;max-width:100%;padding:0 1em 2em}.page__footer footer::after{clear:both;content:"";display:table}@media(min-width: 95em){.page__footer footer{max-width:1520px}}.page__footer a{color:inherit;text-decoration:none}.page__footer a:hover{text-decoration:underline}.page__footer .fas,.page__footer .fab,.page__footer .far,.page__footer .fal{color:#393e46}.page__footer-copyright{font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;font-size:.6875em}.page__footer-follow ul{margin:0;padding:0;list-style-type:none}.page__footer-follow li{display:inline-block;padding-top:5px;padding-bottom:5px;font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;font-size:.75em;text-transform:uppercase}.page__footer-follow li+li:before{content:"";padding-right:5px}.page__footer-follow a{padding-right:10px;font-weight:bold}.page__footer-follow .social-icons a{white-space:nowrap}.layout--search .archive__item-teaser{margin-bottom:.25em}.search__toggle{margin-left:1rem;margin-right:1rem;height:2rem;border:0;outline:none;color:#666;background-color:rgba(0,0,0,0);cursor:pointer;-webkit-transition:.2s;transition:.2s}.search__toggle:hover{color:#4d4d4d}.search-icon{width:100%;height:100%}.search-content{display:none;visibility:hidden;padding-top:1em;padding-bottom:1em}.search-content__inner-wrap{width:100%;margin-left:auto;margin-right:auto;padding-left:1em;padding-right:1em;-webkit-animation:intro .3s both;animation:intro .3s both;-webkit-animation-delay:.15s;animation-delay:.15s}@media(min-width: 95em){.search-content__inner-wrap{max-width:1520px}}.search-content__form{background-color:rgba(0,0,0,0)}.search-content .search-input{display:block;margin-bottom:0;padding:0;border:none;outline:none;box-shadow:none;background-color:rgba(0,0,0,0);font-size:1.563em}@media(min-width: 64em){.search-content .search-input{font-size:1.953em}}@media(min-width: 95em){.search-content .search-input{font-size:2.441em}}.search-content.is--visible{display:block;visibility:visible}.search-content.is--visible::after{content:"";display:block}.search-content .results__found{margin-top:.5em;font-size:.75em}.search-content .archive__item{margin-bottom:2em}@media(min-width: 64em){.search-content .archive__item{width:75%}}@media(min-width: 95em){.search-content .archive__item{width:50%}}.search-content .archive__item-title{margin-top:0}.search-content .archive__item-excerpt{margin-bottom:0}.ais-search-box{max-width:100% !important;margin-bottom:2em}.archive__item-title .ais-Highlight{color:#666;font-style:normal;text-decoration:underline}.archive__item-excerpt .ais-Highlight{color:#666;font-style:normal;font-weight:bold}div.highlighter-rouge,figure.highlight{position:relative;margin-bottom:1em;background:#263238;color:#eff;font-family:Monaco,Consolas,"Lucida Console",monospace;font-size:.75em;line-height:1.8;border-radius:4px}div.highlighter-rouge>pre,div.highlighter-rouge pre.highlight,figure.highlight>pre,figure.highlight pre.highlight{margin:0;padding:1em}.highlight table{margin-bottom:0;font-size:1em;border:0}.highlight table td{padding:0;width:calc(100% - 1em);border:0}.highlight table td.gutter,.highlight table td.rouge-gutter{padding-right:1em;width:1em;color:#b2ccd6;border-right:1px solid #b2ccd6;text-align:right}.highlight table td.code,.highlight table td.rouge-code{padding-left:1em}.highlight table pre{margin:0}.highlight pre{width:100%}.highlight .hll{background-color:#eff}.highlight .c{color:#b2ccd6}.highlight .err{color:#f07178}.highlight .k{color:#c792ea}.highlight .l{color:#f78c6c}.highlight .n{color:#eff}.highlight .o{color:#89ddff}.highlight .p{color:#eff}.highlight .cm{color:#b2ccd6}.highlight .cp{color:#b2ccd6}.highlight .c1{color:#b2ccd6}.highlight .cs{color:#b2ccd6}.highlight .gd{color:#f07178}.highlight .ge{font-style:italic}.highlight .gh{color:#eff;font-weight:bold}.highlight .gi{color:#c3e88d}.highlight .gp{color:#b2ccd6;font-weight:bold}.highlight .gs{font-weight:bold}.highlight .gu{color:#89ddff;font-weight:bold}.highlight .kc{color:#c792ea}.highlight .kd{color:#c792ea}.highlight .kn{color:#89ddff}.highlight .kp{color:#c792ea}.highlight .kr{color:#c792ea}.highlight .kt{color:#ffcb6b}.highlight .ld{color:#c3e88d}.highlight .m{color:#f78c6c}.highlight .s{color:#c3e88d}.highlight .na{color:#82aaff}.highlight .nb{color:#eff}.highlight .nc{color:#ffcb6b}.highlight .no{color:#f07178}.highlight .nd{color:#89ddff}.highlight .ni{color:#eff}.highlight .ne{color:#f07178}.highlight .nf{color:#82aaff}.highlight .nl{color:#eff}.highlight .nn{color:#ffcb6b}.highlight .nx{color:#82aaff}.highlight .py{color:#eff}.highlight .nt{color:#89ddff}.highlight .nv{color:#f07178}.highlight .ow{color:#89ddff}.highlight .w{color:#eff}.highlight .mf{color:#f78c6c}.highlight .mh{color:#f78c6c}.highlight .mi{color:#f78c6c}.highlight .mo{color:#f78c6c}.highlight .sb{color:#c3e88d}.highlight .sc{color:#eff}.highlight .sd{color:#b2ccd6}.highlight .s2{color:#c3e88d}.highlight .se{color:#f78c6c}.highlight .sh{color:#c3e88d}.highlight .si{color:#f78c6c}.highlight .sx{color:#c3e88d}.highlight .sr{color:#c3e88d}.highlight .s1{color:#c3e88d}.highlight .ss{color:#c3e88d}.highlight .bp{color:#eff}.highlight .vc{color:#f07178}.highlight .vg{color:#f07178}.highlight .vi{color:#f07178}.highlight .il{color:#f78c6c}.gist th,.gist td{border-bottom:0}.hidden,.is--hidden{display:none;visibility:hidden}.load{display:none}.transparent{opacity:0}.visually-hidden,.screen-reader-text,.screen-reader-text span,.screen-reader-shortcut{position:absolute !important;clip:rect(1px, 1px, 1px, 1px);height:1px !important;width:1px !important;border:0 !important;overflow:hidden}body:hover .visually-hidden a,body:hover .visually-hidden input,body:hover .visually-hidden button{display:none !important}.screen-reader-text:focus,.screen-reader-shortcut:focus{clip:auto !important;height:auto !important;width:auto !important;display:block;font-size:1em;font-weight:bold;padding:15px 23px 14px;background:#fff;z-index:100000;text-decoration:none;box-shadow:0 0 2px 2px rgba(0,0,0,.6)}.skip-link{position:fixed;z-index:20;margin:0;font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;white-space:nowrap}.skip-link li{height:0;width:0;list-style:none}.text-left{text-align:left}.text-center{text-align:center}.text-right{text-align:right}.text-justify{text-align:justify}.text-nowrap{white-space:nowrap}.task-list{padding:0}.task-list li{list-style-type:none}.task-list .task-list-item-checkbox{margin-right:.5em;opacity:1}.task-list .task-list{margin-left:1em}.cf{clear:both}.wrapper{margin-left:auto;margin-right:auto;width:100%}.align-left{display:block;margin-left:auto;margin-right:auto}@media(min-width: 37.5em){.align-left{float:left;margin-right:1em}}.align-right{display:block;margin-left:auto;margin-right:auto}@media(min-width: 37.5em){.align-right{float:right;margin-left:1em}}.align-center{display:block;margin-left:auto;margin-right:auto}@media(min-width: 64em){.full{margin-right:-20.3389830508% !important}}.icon{display:inline-block;fill:currentColor;width:1em;height:1.1em;line-height:1;position:relative;top:-0.1em;vertical-align:middle}.social-icons .fas,.social-icons .fab,.social-icons .far,.social-icons .fal{color:#222831}.social-icons .fa-behance,.social-icons .fa-behance-square{color:#1769ff}.social-icons .fa-bitbucket{color:#205081}.social-icons .fa-dribbble,.social-icons .fa-dribble-square{color:#ea4c89}.social-icons .fa-facebook,.social-icons .fa-facebook-square,.social-icons .fa-facebook-f{color:#3b5998}.social-icons .fa-flickr{color:#ff0084}.social-icons .fa-foursquare{color:#0072b1}.social-icons .fa-github,.social-icons .fa-github-alt,.social-icons .fa-github-square{color:#171516}.social-icons .fa-gitlab{color:#e24329}.social-icons .fa-instagram{color:#517fa4}.social-icons .fa-keybase{color:#ef7639}.social-icons .fa-lastfm,.social-icons .fa-lastfm-square{color:#d51007}.social-icons .fa-linkedin,.social-icons .fa-linkedin-in{color:#007bb6}.social-icons .fa-mastodon,.social-icons .fa-mastodon-square{color:#2b90d9}.social-icons .fa-pinterest,.social-icons .fa-pinterest-p,.social-icons .fa-pinterest-square{color:#cb2027}.social-icons .fa-reddit{color:#ff4500}.social-icons .fa-rss,.social-icons .fa-rss-square{color:#fa9b39}.social-icons .fa-soundcloud{color:#f30}.social-icons .fa-stack-exchange,.social-icons .fa-stack-overflow{color:#fe7a15}.social-icons .fa-tumblr,.social-icons .fa-tumblr-square{color:#32506d}.social-icons .fa-twitter,.social-icons .fa-twitter-square{color:#55acee}.social-icons .fa-vimeo,.social-icons .fa-vimeo-square,.social-icons .fa-vimeo-v{color:#1ab7ea}.social-icons .fa-vine{color:#00bf8f}.social-icons .fa-youtube{color:#b00}.social-icons .fa-xing,.social-icons .fa-xing-square{color:#006567}.navicon{position:relative;width:1.5rem;height:.25rem;background:#666;margin:auto;-webkit-transition:.3s;transition:.3s}.navicon:before,.navicon:after{content:"";position:absolute;left:0;width:1.5rem;height:.25rem;background:#666;-webkit-transition:.3s;transition:.3s}.navicon:before{top:-0.5rem}.navicon:after{bottom:-0.5rem}.close .navicon{background:rgba(0,0,0,0)}.close .navicon:before,.close .navicon:after{-webkit-transform-origin:50% 50%;-ms-transform-origin:50% 50%;transform-origin:50% 50%;top:0;width:1.5rem}.close .navicon:before{-webkit-transform:rotate3d(0, 0, 1, 45deg);transform:rotate3d(0, 0, 1, 45deg)}.close .navicon:after{-webkit-transform:rotate3d(0, 0, 1, -45deg);transform:rotate3d(0, 0, 1, -45deg)}@supports(pointer-events: none){.greedy-nav__toggle:before{content:"";position:fixed;top:0;left:0;width:100%;height:100%;opacity:0;background-color:#eee;-webkit-transition:all .2s ease-in-out;transition:all .2s ease-in-out;pointer-events:none}}.greedy-nav__toggle.close:before{opacity:.9;-webkit-transition:all .2s ease-in-out;transition:all .2s ease-in-out;pointer-events:auto}.greedy-nav__toggle:hover .navicon,.greedy-nav__toggle:hover .navicon:before,.greedy-nav__toggle:hover .navicon:after{background:#4d4d4d}.greedy-nav__toggle:hover.close .navicon{background:rgba(0,0,0,0)}@media(min-width: 64em){.sticky{clear:both;position:-webkit-sticky;position:sticky;top:2em}.sticky::after{clear:both;content:"";display:table}.sticky>*{display:block}}.well{min-height:20px;padding:19px;margin-bottom:20px;background-color:#f5f5f5;border:1px solid #e3e3e3;border-radius:4px;box-shadow:inset 0 1px 1px rgba(0,0,0,.05)}.show-modal{overflow:hidden;position:relative}.show-modal:before{position:absolute;content:"";top:0;left:0;width:100%;height:100%;z-index:999;background-color:rgba(255,255,255,.85)}.show-modal .modal{display:block}.modal{display:none;position:fixed;width:300px;top:50%;left:50%;margin-left:-150px;margin-top:-150px;min-height:0;z-index:9999;background:#fff;border:1px solid #cecfd1;border-radius:4px;box-shadow:0 1px 1px rgba(0,0,0,.125)}.modal__title{margin:0;padding:.5em 1em}.modal__supporting-text{padding:0 1em .5em 1em}.modal__actions{padding:.5em 1em;border-top:1px solid #cecfd1}.footnote{color:#9ba1a6;text-decoration:none}.footnotes{color:#9ba1a6}.footnotes ol,.footnotes li,.footnotes p{margin-bottom:0;font-size:.75em}a.reversefootnote{color:#7a8288;text-decoration:none}a.reversefootnote:hover{text-decoration:underline}.required{color:#c0392b;font-weight:bold}.gsc-control-cse table,.gsc-control-cse tr,.gsc-control-cse td{border:0}.responsive-video-container{position:relative;margin-bottom:1em;padding-bottom:56.25%;height:0;overflow:hidden;max-width:100%}.responsive-video-container iframe,.responsive-video-container object,.responsive-video-container embed{position:absolute;top:0;left:0;width:100%;height:100%}:-webkit-full-screen-ancestor .masthead,:-webkit-full-screen-ancestor .page__footer{position:static}#main{clear:both;margin-left:auto;margin-right:auto;padding-left:1em;padding-right:1em;-webkit-animation:intro .3s both;animation:intro .3s both;max-width:100%;-webkit-animation-delay:.15s;animation-delay:.15s}#main::after{clear:both;content:"";display:table}@media(min-width: 95em){#main{max-width:1520px}}body{display:-webkit-box;display:-ms-flexbox;display:flex;min-height:100vh;-webkit-box-orient:vertical;-webkit-box-direction:normal;-ms-flex-direction:column;flex-direction:column}.initial-content,.search-content{flex:1 0 auto}@media(min-width: 64em){.page{float:right;width:calc(100% - 200px);padding-right:200px}}@media(min-width: 95em){.page{width:calc(100% - 300px);padding-right:300px}}.page .page__inner-wrap{float:left;margin-top:1em;margin-left:0;margin-right:0;width:100%;clear:both}.page .page__inner-wrap .page__content,.page .page__inner-wrap .page__meta,.page .page__inner-wrap .comment__date,.page .page__inner-wrap .page__share{position:relative;float:left;margin-left:0;margin-right:0;width:100%;clear:both}.page__title{margin-top:0;line-height:1}.page__title a{color:#222831;text-decoration:none}.page__title+.page__meta,.page__title+.comment__date{margin-top:-0.5em}.page__lead{font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;font-size:1.25em}.page__content h2{padding-bottom:.5em;border-bottom:1px solid #cecfd1}.page__content h1 .header-link,.page__content h2 .header-link,.page__content h3 .header-link,.page__content h4 .header-link,.page__content h5 .header-link,.page__content h6 .header-link{position:relative;left:.5em;opacity:0;font-size:.8em;-webkit-transition:opacity .2s ease-in-out .1s;-moz-transition:opacity .2s ease-in-out .1s;-o-transition:opacity .2s ease-in-out .1s;transition:opacity .2s ease-in-out .1s}.page__content h1:hover .header-link,.page__content h2:hover .header-link,.page__content h3:hover .header-link,.page__content h4:hover .header-link,.page__content h5:hover .header-link,.page__content h6:hover .header-link{opacity:1}.page__content p,.page__content li,.page__content dl{font-size:1em}.page__content p{margin:0 0 1.3em}.page__content a:not(.btn):hover{text-decoration:underline}.page__content a:not(.btn):hover img{box-shadow:0 0 10px rgba(0,0,0,.25)}.page__content :not(pre)>code{padding-top:.1rem;padding-bottom:.1rem;font-size:.8em;background:#f9f2f4;border-radius:4px}.page__content :not(pre)>code::before,.page__content :not(pre)>code::after{letter-spacing:-0.2em;content:" "}.page__content dt{margin-top:1em;font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;font-weight:bold}.page__content dd{margin-left:1em;font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;font-size:.75em}.page__content .small{font-size:.75em}.page__content blockquote+.small{margin-top:-1.5em;padding-left:1.25rem}.page__hero{position:relative;margin-bottom:2em;clear:both;-webkit-animation:intro .3s both;animation:intro .3s both;-webkit-animation-delay:.25s;animation-delay:.25s}.page__hero::after{clear:both;content:"";display:table}.page__hero--overlay{position:relative;margin-bottom:2em;padding:3em 0;clear:both;background-size:cover;background-repeat:no-repeat;background-position:center;-webkit-animation:intro .3s both;animation:intro .3s both;-webkit-animation-delay:.25s;animation-delay:.25s}.page__hero--overlay::after{clear:both;content:"";display:table}.page__hero--overlay a{color:#fff}.page__hero--overlay .wrapper{padding-left:1em;padding-right:1em}@media(min-width: 95em){.page__hero--overlay .wrapper{max-width:1520px}}.page__hero--overlay .page__title,.page__hero--overlay .page__meta,.page__hero--overlay .comment__date,.page__hero--overlay .page__lead,.page__hero--overlay .btn{color:#fff;text-shadow:1px 1px 4px rgba(0,0,0,.5)}.page__hero--overlay .page__lead{max-width:768px}.page__hero--overlay .page__title{font-size:1.953em}@media(min-width: 37.5em){.page__hero--overlay .page__title{font-size:2.441em}}.page__hero-image{width:100%;height:auto;-ms-interpolation-mode:bicubic}.page__hero-caption{position:absolute;bottom:0;right:0;margin:0 auto;padding:2px 5px;color:#fff;font-family:Georgia,Times,serif;font-size:.6875em;background:#000;text-align:right;z-index:5;opacity:.5;border-radius:4px 0 0 0}@media(min-width: 64em){.page__hero-caption{padding:5px 10px}}.page__hero-caption a{color:#fff;text-decoration:none}.page__share{margin-top:2em;padding-top:1em;border-top:1px solid #cecfd1}@media(max-width: 37.5em){.page__share .btn span{border:0;clip:rect(0 0 0 0);height:1px;margin:-1px;overflow:hidden;padding:0;position:absolute;width:1px}}.page__share-title{margin-bottom:10px;font-size:.75em;text-transform:uppercase}.page__meta,.comment__date{margin-top:2em;color:#393e46;font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;font-size:.75em}.page__meta p,.comment__date p{margin:0}.page__meta a,.comment__date a{color:inherit}.page__meta-title{margin-bottom:10px;font-size:.75em;text-transform:uppercase}.page__meta-sep::before{content:"•";padding-left:.5em;padding-right:.5em}.page__taxonomy .sep{display:none}.page__taxonomy strong{margin-right:10px}.page__taxonomy-item{display:inline-block;margin-right:5px;margin-bottom:8px;padding:5px 10px;text-decoration:none;border:1px solid #9b9b9d;border-radius:4px}.page__taxonomy-item:hover{text-decoration:none;color:#b54a27}.taxonomy__section{margin-bottom:2em;padding-bottom:1em}.taxonomy__section:not(:last-child){border-bottom:solid 1px #cecfd1}.taxonomy__section .archive__item-title{margin-top:0}.taxonomy__section .archive__subtitle{clear:both;border:0}.taxonomy__section+.taxonomy__section{margin-top:2em}.taxonomy__title{margin-bottom:.5em;color:#393e46}.taxonomy__count{color:#393e46}.taxonomy__index{display:grid;grid-column-gap:2em;grid-template-columns:repeat(2, 1fr);margin:1.414em 0;padding:0;font-size:.75em;list-style:none}@media(min-width: 64em){.taxonomy__index{grid-template-columns:repeat(3, 1fr)}}.taxonomy__index a{display:-webkit-box;display:-ms-flexbox;display:flex;padding:.25em 0;-webkit-box-pack:justify;-ms-flex-pack:justify;justify-content:space-between;color:inherit;text-decoration:none;border-bottom:1px solid #cecfd1}.back-to-top{display:block;clear:both;color:#393e46;font-size:.6em;text-transform:uppercase;text-align:right;text-decoration:none}.page__comments{float:left;margin-left:0;margin-right:0;width:100%;clear:both}.page__comments-title{margin-top:2rem;margin-bottom:10px;padding-top:2rem;font-size:.75em;border-top:1px solid #cecfd1;text-transform:uppercase}.page__comments-form{-webkit-transition:all .2s ease-in-out;transition:all .2s ease-in-out}.page__comments-form.disabled input,.page__comments-form.disabled button,.page__comments-form.disabled textarea,.page__comments-form.disabled label{pointer-events:none;cursor:not-allowed;filter:alpha(opacity=65);box-shadow:none;opacity:.65}.comment{clear:both;margin:1em 0}.comment::after{clear:both;content:"";display:table}.comment:not(:last-child){border-bottom:1px solid #cecfd1}.comment__avatar-wrapper{float:left;width:60px;height:60px}@media(min-width: 64em){.comment__avatar-wrapper{width:100px;height:100px}}.comment__avatar{width:40px;height:40px;border-radius:50%}@media(min-width: 64em){.comment__avatar{width:80px;height:80px;padding:5px;border:1px solid #cecfd1}}.comment__content-wrapper{float:right;width:calc(100% - 60px)}@media(min-width: 64em){.comment__content-wrapper{width:calc(100% - 100px)}}.comment__author{margin:0}.comment__author a{text-decoration:none}.comment__date{margin:0}.comment__date a{text-decoration:none}.page__related{clear:both;float:left;margin-top:2em;padding-top:1em;border-top:1px solid #cecfd1}.page__related::after{clear:both;content:"";display:table}@media(min-width: 64em){.page__related{float:right;width:calc(100% - 200px)}}@media(min-width: 95em){.page__related{width:calc(100% - 300px)}}.page__related a{color:inherit;text-decoration:none}.page__related-title{margin-bottom:10px;font-size:.75em;text-transform:uppercase}@media(min-width: 64em){.wide .page{padding-right:0}}@media(min-width: 95em){.wide .page{padding-right:0}}@media(min-width: 64em){.wide .page__related{padding-right:0}}@media(min-width: 95em){.wide .page__related{padding-right:0}}.archive{margin-top:1em;margin-bottom:2em}@media(min-width: 64em){.archive{float:right;width:calc(100% - 200px);padding-right:200px}}@media(min-width: 95em){.archive{width:calc(100% - 300px);padding-right:300px}}.archive__item{position:relative}.archive__item a{position:relative;z-index:10}.archive__item a[rel=permalink]{position:static}.archive__subtitle{margin:1.414em 0 .5em;padding-bottom:.5em;font-size:1em;color:#393e46;border-bottom:1px solid #cecfd1}.archive__subtitle+.list__item .archive__item-title{margin-top:.5em}.archive__item-title{margin-bottom:.25em;font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;line-height:initial;overflow:hidden;text-overflow:ellipsis}.archive__item-title a[rel=permalink]::before{content:"";position:absolute;left:0;top:0;right:0;bottom:0}.archive__item-title a+a{opacity:.5}.page__content .archive__item-title{margin-top:1em;border-bottom:none}.archive__item-excerpt{margin-top:0;font-size:.75em}.archive__item-excerpt+p{text-indent:0}.archive__item-excerpt a{position:relative}.archive__item-teaser{position:relative;border-radius:4px;overflow:hidden}.archive__item-teaser img{width:100%}.archive__item-caption{position:absolute;bottom:0;right:0;margin:0 auto;padding:2px 5px;color:#fff;font-family:Georgia,Times,serif;font-size:.625em;background:#000;text-align:right;z-index:5;opacity:.5;border-radius:4px 0 0 0}@media(min-width: 64em){.archive__item-caption{padding:5px 10px}}.archive__item-caption a{color:#fff;text-decoration:none}.list__item .page__meta,.list__item .comment__date{margin:0 0 4px;font-size:.6em}@media(min-width: 64em){.archive .grid__wrapper{margin-right:-200px}}@media(min-width: 95em){.archive .grid__wrapper{margin-right:-300px}}.grid__item{margin-bottom:2em}@media(min-width: 37.5em){.grid__item{float:left;width:48.9795918367%}.grid__item:nth-child(2n+1){clear:both;margin-left:0}.grid__item:nth-child(2n+2){clear:none;margin-left:2.0408163265%}}@media(min-width: 48em){.grid__item{margin-left:0;margin-right:0;width:23.7288135593%}.grid__item:nth-child(2n+1){clear:none}.grid__item:nth-child(4n+1){clear:both}.grid__item:nth-child(4n+2){clear:none;margin-left:1.6949152542%}.grid__item:nth-child(4n+3){clear:none;margin-left:1.6949152542%}.grid__item:nth-child(4n+4){clear:none;margin-left:1.6949152542%}}.grid__item .page__meta,.grid__item .comment__date{margin:0 0 4px;font-size:.6em}.grid__item .page__meta-sep{display:block}.grid__item .page__meta-sep::before{display:none}.grid__item .archive__item-title{margin-top:.5em;font-size:1em}.grid__item .archive__item-excerpt{display:none}@media(min-width: 48em){.grid__item .archive__item-excerpt{display:block;font-size:.75em}}@media(min-width: 37.5em){.grid__item .archive__item-teaser{max-height:200px}}@media(min-width: 48em){.grid__item .archive__item-teaser{max-height:120px}}.feature__wrapper{clear:both;margin-bottom:2em;border-bottom:1px solid #cecfd1}.feature__wrapper::after{clear:both;content:"";display:table}.feature__wrapper .archive__item-title{margin-bottom:0}.feature__item{position:relative;margin-bottom:2em;font-size:1.125em}@media(min-width: 37.5em){.feature__item{float:left;margin-bottom:0;width:32.2033898305%}.feature__item:nth-child(3n+1){clear:both;margin-left:0}.feature__item:nth-child(3n+2){clear:none;margin-left:1.6949152542%}.feature__item:nth-child(3n+3){clear:none;margin-left:1.6949152542%}.feature__item .feature__item-teaser{max-height:200px;overflow:hidden}}.feature__item .archive__item-body{padding-left:1.6949152542%;padding-right:1.6949152542%}.feature__item a.btn::before{content:"";position:absolute;left:0;top:0;right:0;bottom:0}.feature__item--left{position:relative;float:left;margin-left:0;margin-right:0;width:100%;clear:both;font-size:1.125em}.feature__item--left .archive__item{float:left}.feature__item--left .archive__item-teaser{margin-bottom:2em}.feature__item--left a.btn::before{content:"";position:absolute;left:0;top:0;right:0;bottom:0}@media(min-width: 37.5em){.feature__item--left .archive__item-teaser{float:left;width:40.6779661017%}.feature__item--left .archive__item-body{float:right;padding-left:1.6949152542%;padding-right:1.6949152542%;width:57.6271186441%}}.feature__item--right{position:relative;float:left;margin-left:0;margin-right:0;width:100%;clear:both;font-size:1.125em}.feature__item--right .archive__item{float:left}.feature__item--right .archive__item-teaser{margin-bottom:2em}.feature__item--right a.btn::before{content:"";position:absolute;left:0;top:0;right:0;bottom:0}@media(min-width: 37.5em){.feature__item--right{text-align:right}.feature__item--right .archive__item-teaser{float:right;width:40.6779661017%}.feature__item--right .archive__item-body{float:left;width:57.6271186441%;padding-left:1.6949152542%;padding-right:1.6949152542%}}.feature__item--center{position:relative;float:left;margin-left:0;margin-right:0;width:100%;clear:both;font-size:1.125em}.feature__item--center .archive__item{float:left;width:100%}.feature__item--center .archive__item-teaser{margin-bottom:2em}.feature__item--center a.btn::before{content:"";position:absolute;left:0;top:0;right:0;bottom:0}@media(min-width: 37.5em){.feature__item--center{text-align:center}.feature__item--center .archive__item-teaser{margin:0 auto;width:40.6779661017%}.feature__item--center .archive__item-body{margin:0 auto;width:57.6271186441%}}.archive .feature__wrapper .archive__item-title{margin-top:.25em;font-size:1em}.archive .feature__item,.archive .feature__item--left,.archive .feature__item--center,.archive .feature__item--right{font-size:1em}@media(min-width: 64em){.wide .archive{padding-right:0}}@media(min-width: 95em){.wide .archive{padding-right:0}}.layout--single .feature__wrapper{display:inline-block}.sidebar{clear:both}.sidebar::after{clear:both;content:"";display:table}@media(min-width: 64em){.sidebar{float:left;width:calc(200px - 1em);opacity:.75;-webkit-transition:opacity .2s ease-in-out;transition:opacity .2s ease-in-out}.sidebar:hover{opacity:1}.sidebar.sticky{overflow-y:auto;max-height:calc(100vh - 2em - 2em)}}@media(min-width: 95em){.sidebar{width:calc(300px - 1em)}}.sidebar>*{margin-top:1em;margin-bottom:1em}.sidebar h2,.sidebar h3,.sidebar h4,.sidebar h5,.sidebar h6{margin-bottom:0;font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif}.sidebar p,.sidebar li{font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;font-size:.75em;line-height:1.5}.sidebar img{width:100%}.sidebar img.emoji{width:20px;height:20px}.sidebar__right{margin-bottom:1em}@media(min-width: 64em){.sidebar__right{position:absolute;top:0;right:0;width:200px;margin-right:-200px;padding-left:1em;z-index:10}.sidebar__right.sticky{clear:both;position:-webkit-sticky;position:sticky;top:2em;float:right}.sidebar__right.sticky::after{clear:both;content:"";display:table}.sidebar__right.sticky .toc .toc__menu{overflow-y:auto;max-height:calc(100vh - 7em)}}@media(min-width: 95em){.sidebar__right{width:300px;margin-right:-300px}}@media(min-width: 64em){.splash .sidebar__right{position:relative;float:right;margin-right:0}}@media(min-width: 95em){.splash .sidebar__right{margin-right:0}}.author__avatar{display:table-cell;vertical-align:top;width:36px;height:36px}@media(min-width: 64em){.author__avatar{display:block;width:auto;height:auto}}.author__avatar img{max-width:110px;border-radius:50%}@media(min-width: 64em){.author__avatar img{padding:5px;border:1px solid #cecfd1}}.author__content{display:table-cell;vertical-align:top;padding-left:15px;padding-right:25px;line-height:1}@media(min-width: 64em){.author__content{display:block;width:100%;padding-left:0;padding-right:0}}.author__content a{color:inherit;text-decoration:none}.author__name{margin:0}@media(min-width: 64em){.author__name{margin-top:10px;margin-bottom:10px}}.sidebar .author__name{font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;font-size:1em}.author__bio{margin:0}@media(min-width: 64em){.author__bio{margin-top:10px;margin-bottom:20px}}.author__urls-wrapper{position:relative;display:table-cell;vertical-align:middle;font-family:-apple-system,BlinkMacSystemFont,"Roboto","Segoe UI","Helvetica Neue","Lucida Grande",Arial,sans-serif;z-index:20;cursor:pointer}.author__urls-wrapper li:last-child a{margin-bottom:0}.author__urls-wrapper .author__urls span.label{padding-left:5px}@media(min-width: 64em){.author__urls-wrapper{display:block}}.author__urls-wrapper button{position:relative;margin-bottom:0}@supports(pointer-events: none){.author__urls-wrapper button:before{content:"";position:fixed;top:0;left:0;width:100%;height:100%;pointer-events:none}}.author__urls-wrapper button.open:before{pointer-events:auto}@media(min-width: 64em){.author__urls-wrapper button{display:none}}.author__urls{display:none;position:absolute;right:0;margin-top:15px;padding:10px;list-style-type:none;border:1px solid #cecfd1;border-radius:4px;background:#eee;box-shadow:0 2px 4px 0 rgba(0,0,0,.16),0 2px 10px 0 rgba(0,0,0,.12);cursor:default}.author__urls.is--visible{display:block}@media(min-width: 64em){.author__urls{display:block;position:relative;margin:0;padding:0;border:0;background:rgba(0,0,0,0);box-shadow:none}}.author__urls:before{display:block;content:"";position:absolute;top:-11px;left:calc(50% - 10px);width:0;border-style:solid;border-width:0 10px 10px;border-color:#cecfd1 rgba(0,0,0,0);z-index:0}@media(min-width: 64em){.author__urls:before{display:none}}.author__urls:after{display:block;content:"";position:absolute;top:-10px;left:calc(50% - 10px);width:0;border-style:solid;border-width:0 10px 10px;border-color:#eee rgba(0,0,0,0);z-index:1}@media(min-width: 64em){.author__urls:after{display:none}}.author__urls ul{padding:10px;list-style-type:none}.author__urls li{white-space:nowrap}.author__urls a{display:block;margin-bottom:5px;padding-right:5px;padding-top:2px;padding-bottom:2px;color:inherit;font-size:1em;text-decoration:none}.author__urls a:hover{text-decoration:underline}.wide .sidebar__right{margin-bottom:1em}@media(min-width: 64em){.wide .sidebar__right{position:initial;top:initial;right:initial;width:initial;margin-right:initial;padding-left:initial;z-index:initial}.wide .sidebar__right.sticky{float:none}}@media(min-width: 95em){.wide .sidebar__right{width:initial;margin-right:initial}}@media print{[hidden]{display:none}*{-moz-box-sizing:border-box;-webkit-box-sizing:border-box;box-sizing:border-box}html{margin:0;padding:0;min-height:auto !important;font-size:16px}body{margin:0 auto;background:#fff !important;color:#000 !important;font-size:1rem;line-height:1.5;-moz-osx-font-smoothing:grayscale;-webkit-font-smoothing:antialiased;text-rendering:optimizeLegibility}h1,h2,h3,h4,h5,h6{color:#000;line-height:1.2;margin-bottom:.75rem;margin-top:0}h1{font-size:2.5rem}h2{font-size:2rem}h3{font-size:1.75rem}h4{font-size:1.5rem}h5{font-size:1.25rem}h6{font-size:1rem}a,a:visited{color:#000;text-decoration:underline;word-wrap:break-word}table{border-collapse:collapse}thead{display:table-header-group}table,th,td{border-bottom:1px solid #000}td,th{padding:8px 16px}img{border:0;display:block;max-width:100% !important;vertical-align:middle}hr{border:0;border-bottom:2px solid #bbb;height:0;margin:2.25rem 0;padding:0}dt{font-weight:bold}dd{margin:0;margin-bottom:.75rem}abbr[title],acronym[title]{border:0;text-decoration:none}table,blockquote,pre,code,figure,li,hr,ul,ol,a,tr{page-break-inside:avoid}h2,h3,h4,p,a{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid;page-break-inside:avoid}h1+p,h2+p,h3+p{page-break-before:avoid}img{page-break-after:auto;page-break-before:auto;page-break-inside:avoid}pre{white-space:pre-wrap !important;word-wrap:break-word}a[href^="http://"]:after,a[href^="https://"]:after,a[href^="ftp://"]:after{content:" (" attr(href) ")";font-size:80%}abbr[title]:after,acronym[title]:after{content:" (" attr(title) ")"}#main{max-width:100%}.page{margin:0;padding:0;width:100%}.page-break,.page-break-before{page-break-before:always}.page-break-after{page-break-after:always}.no-print{display:none}a.no-reformat:after{content:""}abbr[title].no-reformat:after,acronym[title].no-reformat:after{content:""}.page__hero-caption{color:#000 !important;background:#fff !important;opacity:1}.page__hero-caption a{color:#000 !important}.masthead,.toc,.page__share,.page__related,.pagination,.ads,.page__footer,.page__comments-form,.author__avatar,.author__content,.author__urls-wrapper,.nav__list,.sidebar,.adsbygoogle{display:none !important;height:1px !important}}/*# sourceMappingURL=main.css.map */ \ No newline at end of file diff --git a/assets/css/main.css.map b/assets/css/main.css.map new file mode 100644 index 00000000..23a44309 --- /dev/null +++ b/assets/css/main.css.map @@ -0,0 +1 @@ +{"version":3,"sourceRoot":"","sources":["../../_sass/minimal-mistakes/skins/_air.scss","../../_sass/minimal-mistakes.scss","../../_sass/minimal-mistakes/vendor/magnific-popup/_settings.scss","../../_sass/minimal-mistakes/_variables.scss","../../_sass/minimal-mistakes/vendor/magnific-popup/_magnific-popup.scss","../../_sass/minimal-mistakes/_mixins.scss","../../_sass/minimal-mistakes/_reset.scss","../../_sass/minimal-mistakes/vendor/breakpoint/_breakpoint.scss","../../_sass/minimal-mistakes/_base.scss","../../_sass/minimal-mistakes/_forms.scss","../../_sass/minimal-mistakes/_tables.scss","../../_sass/minimal-mistakes/_animations.scss","../../_sass/minimal-mistakes/_buttons.scss","../../_sass/minimal-mistakes/_notices.scss","../../_sass/minimal-mistakes/_masthead.scss","../../_sass/minimal-mistakes/_navigation.scss","../../_sass/minimal-mistakes/_footer.scss","../../_sass/minimal-mistakes/_search.scss","../../_sass/minimal-mistakes/_syntax.scss","../../_sass/minimal-mistakes/_utilities.scss","../../_sass/minimal-mistakes/_page.scss","../../_sass/minimal-mistakes/_archive.scss","../../_sass/minimal-mistakes/_sidebar.scss","../../_sass/minimal-mistakes/_print.scss"],"names":[],"mappings":"CA6BE,mLAME,MA5BS,QAgCb,cACE,sBAGF,mDACE,cC5CF;AAAA;AAAA;AAAA;AAAA,EC0CA,yBC3BQ,oBC8DR,QACE,MACA,OACA,WACA,YACA,aACA,gBACA,eAEA,WFjFoC,KEkFpC,QFjFoC,GEmFlC,yBAKJ,UACE,MACA,OACA,WACA,YACA,aACA,eACA,wBACA,mCAIF,eACE,kBACA,kBACA,WACA,YACA,OACA,MACA,cACA,8BACA,2BACA,sBAKA,sBACE,WACA,qBACA,YACA,sBAOA,qCACE,aAMN,aACE,kBACA,qBACA,sBACA,cACA,gBACA,aAIA,8DACE,WACA,YAKJ,cACE,gBAGA,iEACE,qBACA,wBACA,gBAGJ,UACE,eACA,uBACA,oBACA,eAGA,8BACE,YAIJ,kDAIE,yBACA,sBACA,iBAKA,wBACE,aAkBF,UACE,wBAUJ,eACE,MFvMoC,KEwMpC,kBACA,QACA,WACA,kBACA,kBACA,SACA,UACA,aACA,iBACE,MFjNkC,KEkNlC,uBACE,MFlNgC,KEyNpC,4BACE,aAMF,0BACE,aAMF,kCAEE,iBACA,eACA,yBACA,SACA,wBACA,cACA,aACA,UACA,aACA,wBACA,gBAEF,yBACI,UACA,SAMN,WACE,WACA,YACA,iBAEA,kBACA,QACA,MACA,qBACA,kBACA,QF5QoC,EE8QlC,0BAEF,sBACA,MFhRoC,KEkRpC,kBACA,eACA,YDvRM,oBCyRN,kCAEE,UAEE,0BAIJ,kBACE,QAIF,6BACE,MFlSkC,KEuSpC,2DACE,MF1SkC,KE2SlC,WACA,iBACA,kBACA,WAKJ,aACE,kBACA,MACA,QACA,MFpToC,KEqTpC,eACA,iBAKA,WACE,kBACA,QFjUkC,EEmUhC,0BAEF,SACA,QACA,iBACA,UACA,WACA,aACA,0CACA,kBACE,iBAEF,kCAEE,UAEE,0BAGJ,uEAIE,WACA,cACA,QACA,SACA,kBACA,OACA,MACA,gBACA,iBACA,kCAGF,mCAGE,sBACA,yBACA,QAGF,oCAEE,sBACA,yBACA,WAKJ,gBACE,OAEA,6CAEE,6BACA,iBAEF,8CAEE,iBACA,6BAIJ,iBACE,QACA,+CAEE,4BACA,iBAEF,gDAEE,4BASJ,mBACE,YF/YkC,KEgZlC,eFhZkC,KEiZlC,gCACE,cACA,WACA,UFlZgC,MEoZlC,8BACE,UAGJ,mBACE,WACA,SACA,gBACA,mBACA,0BACE,kBACA,cACA,MACA,OACA,WACA,YACA,WF1bgC,uBE2bhC,WFtagC,KEkblC,YACE,WACA,eACA,YACA,cACA,cACA,8BACA,2BACA,sBACA,oBACA,cAKJ,YACE,cACA,kBACE,WACA,kBACA,OACA,IFhcgC,KEichC,OFhcgC,KEichC,cACA,QACA,WACA,YACA,WACA,WFnegC,uBEoehC,WFzcgC,KE2clC,kBACE,MFrcgC,QEschC,cACA,eACA,iBAEF,mBACE,SAEF,uBACE,aACA,gBAGJ,gBACE,iBACA,kBACA,SACA,OACA,WACA,YAEF,WACE,gBACA,iBACA,MF9dkC,QE+dlC,qBACA,mBAIA,+BACE,eAMA,2CACE,eAOJ,gGAKI,kCACE,eACA,gBAGA,4BACE,UAKF,kCACE,MACA,SAEF,kCACE,eACA,gBAGJ,gCACE,0BACA,SACA,SACA,SACA,gBACA,eACA,8BACA,2BACA,sBACA,sCACE,UAGJ,6BACE,UACA,QAEF,2BACE,MACA,QACA,WACA,YACA,iBACA,0BACA,eACA,kBACA,WAUV,kCACE,WACE,8BACA,sBAEF,gBACE,2BACA,mBAEF,iBACE,8BACA,sBAEF,eACE,aF5lBkC,IE6lBlC,cF7lBkC,KEumBlC,kBACE,UAEF,yBACE,YACA,SACA,mBACA,eACA,mBAEF,wBACE,UAEF,sBACE,iBAEF,oBACE,MACA,QACA,cCjoBN,qBAEE,yBAEA,sBACA,oBCLF,wBAEA,KAEE,sBACA,iBNHiB,KMIjB,eAcA,8BACA,0BCuCE,wBD1DJ,KAOI,gBCmDA,wBD1DJ,KAWI,gBC+CA,wBD1DJ,KAeI,gBASJ,cAIA,iBACE,WACA,gBAGF,YACE,WACA,gBAKF,8EAWE,cAKF,mBAGE,qBACA,gBACA,QAKF,sBACE,aAGF,EACE,MNhEW,QM2Eb,iBAEE,UAKF,QAEE,kBACA,cACA,cACA,wBAGF,IACE,WAGF,IACE,eAKF,IAEE,eACA,cACA,YAEA,sBACA,SACA,+BAKF,iCAEE,eAKF,6BAIE,SACA,eACA,sBAGF,aAEE,kBACA,mBAGF,iDAEE,UACA,SAGF,oEAII,0BACA,eAGJ,mHAQI,eAGJ,mBACE,sBACA,6BAGF,+FAEE,wBAGF,SACE,cACA,mBErLF,KAEE,kBACA,gBAGF,KACE,SACA,UACA,MRNW,QQOX,YLEW,uGKDX,gBAEA,sBAEE,gBAIJ,kBAME,kBACA,gBACA,YLfW,uGKgBX,iBAGF,GACE,aACA,ULSS,QKNX,GACE,ULMS,OKHX,GACE,ULGS,WKCT,mBAGF,GACE,ULHS,UKMX,GACE,ULNS,IKSX,aAEE,ULrBY,MKwBd,EACE,oBAGF,MAEE,qBACA,gCACA,UACE,cAIJ,MACE,cAKF,gDASE,UACA,SAKF,sCAEE,qBACA,YACA,iCAKF,WACE,qBACA,iBACA,kBACA,kBACA,6BAEA,gBACE,kBAEA,uBACE,YACA,kBAYJ,UACE,MR3G0B,QQ8G5B,QACE,MRhH0B,QQiH1B,UAYJ,qBAKE,YLzIU,2CK4IZ,IACE,gBAGF,8CAKE,SACA,UACA,gBACA,YL1JW,uGK2JX,ULnIY,IKoIZ,MR1J4B,QQ2J5B,WR1J4B,QQ2J5B,cLNc,IKQd,6JAEE,sBACA,YAMJ,GACE,cACA,aACA,SACA,6BAKF,YAEE,mBAGF,YAEE,gBASF,OACE,oBACA,aACA,yBACA,8BACA,wBACA,uBACA,eACA,aAEA,2DAGE,kBAGF,WACE,WACA,cL5DY,IK6DZ,mBLvDgB,oBKwDhB,WLxDgB,oBK2DlB,SACE,cD1KA,0BC8KA,8BAGI,wBAIJ,uBACE,WDtLF,0BC2LA,gCAGI,6BAIJ,wBACE,WAON,WACE,mBACA,MRpQiB,QQqQjB,YL9PM,oBK+PN,ULrOY,MKuOZ,aACE,mBLnGgB,oBKoGhB,WLpGgB,oBKsGhB,mBACE,MR5PwB,QQmQ9B,eACE,gBAqBA,OACE,SACA,UAGF,OACE,gBAGF,MACE,qBAIF,oBAEE,gBAGF,oBAEE,aAQJ,6IAsBE,mBLvLkB,oBKwLlB,WLxLkB,oBM1KpB,KACE,iBACA,YACA,iBNsDa,QMpDb,cACE,kBACA,UACA,eAGF,YACE,cACA,WACA,mBACA,kBACA,UACA,MTdS,QSeT,SACA,mBAGF,OACE,oBAGF,QACE,qBACA,iBACA,UAGF,QACE,aAIJ,mCAKE,wBACA,uBAGF,6BAIE,sBACA,YNvCW,uGM0Cb,MACE,cACA,oBACA,MTtDW,QSuDX,eAEA,YACE,UNxBU,MM2BZ,wCAGE,cAIJ,sBAGE,qBACA,WACA,cACA,mBACA,MT3EW,QS4EX,iBT7EiB,KS8EjB,OTzEa,QS0Eb,cNmFc,IMlFd,WNmFW,2BMhFb,YACE,WAGF,aACE,WAGF,yDAGE,WACA,YACA,UACA,aACA,cACA,mBACA,eACA,gBACA,aACA,gBAGF,uCAEE,sBACA,UACA,YACA,aAGF,kBACE,SAGF,iBACE,WACA,gBACA,oBACA,eACA,+BACA,yBACA,gBAGF,wDAGE,WACA,YACA,eACA,kBAGF,wBAEE,gBAGF,OACE,WACA,sBAGF,8BAEE,YAGF,SACE,gBACA,YACA,cACA,mBAGF,mBACE,aAGF,MACE,kBAGF,iBAEE,kBACA,mBAGF,wDAEE,WACA,kBAGF,+BAEE,qBACA,gBACA,gBACA,sBAGF,8DAEE,iBAOF,wGAME,WACA,mBAOF,2BAEE,aThNc,KSiNd,UACA,wBACA,2EAIF,uFAIE,gBAOF,yBAEE,MTtOiB,QSyOnB,YACE,cACA,kBACA,gBAGF,aACE,qBACA,sBACA,iBAOF,YACE,kBACA,UACA,eAOF,6DAGE,qBACA,gBAGF,mBACE,qBAGF,+DAGE,eACA,gBACA,sBAGF,kFAEE,WACA,cACA,iBAOF,6DAGE,qBACA,gBAGF,2BACE,kBACA,mBACA,gBACA,mBAGF,mBACE,qBAGF,+DAGE,eACA,gBACA,sBAGF,kFAEE,WACA,cACA,iBAOF,sBACE,WAGF,8BACE,cAGF,aACE,kBACA,MACA,OACA,WACA,YACA,sCACA,WAGF,eACE,aACA,kBACA,QACA,SACA,WCjWF,MACE,cACA,kBACA,WACA,YPQW,uGOPX,UPgCY,MO/BZ,yBACA,gBAEA,YACE,eAIJ,MACE,iBVRa,QUSb,gCAGF,GACE,aACA,iBACA,gBAGF,GACE,aACA,gCAGF,SAGE,sBCjCF,yBACE,GACE,UAEF,KACE,WAIJ,iBACE,GACE,UAEF,KACE,WCVJ,KAEE,qBACA,oBACA,iBACA,YTGW,uGSFX,UT2BY,MS1BZ,iBACA,kBACA,qBACA,eACA,cTqJc,ISpJd,eAEA,WACE,kBAGF,mBACE,mBAiBA,cP6CF,iBOzDA,KP0DA,WOrCI,sBPoCJ,iBOzDA,KP0DA,WOjCI,oBPgCJ,iBO/B8B,QPgC9B,WO9CE,cP6CF,iBOzDA,KP0DA,cO3CM,yBAMF,sBPoCJ,iBOzDA,KP0DA,cOjCI,oBPgCJ,iBO/B8B,KPgC9B,cO9CE,oBP6CF,iBOzDA,cP0DA,WOxCM,sBAGF,4BPoCJ,iBOzDA,cP0DA,WOjCI,0BPgCJ,iBO/B8B,ePgC9B,WO9CE,cP6CF,iBOzDA,QP0DA,WOrCI,sBPoCJ,iBOzDA,QP0DA,WOjCI,oBPgCJ,iBO/B8B,QPgC9B,WO9CE,cP6CF,iBOzDA,QP0DA,WOrCI,sBPoCJ,iBOzDA,QP0DA,WOjCI,oBPgCJ,iBO/B8B,QPgC9B,WO9CE,aP6CF,iBOzDA,QP0DA,WOrCI,qBPoCJ,iBOzDA,QP0DA,WOjCI,mBPgCJ,iBO/B8B,QPgC9B,WO9CE,WP6CF,iBOzDA,QP0DA,WOrCI,mBPoCJ,iBOzDA,QP0DA,WOjCI,iBPgCJ,iBO/B8B,QPgC9B,WO9CE,eP6CF,iBOzDA,QP0DA,WOrCI,uBPoCJ,iBOzDA,QP0DA,WOjCI,qBPgCJ,iBO/B8B,QPgC9B,WO9CE,cP6CF,iBOzDA,QP0DA,WOrCI,sBPoCJ,iBOzDA,QP0DA,WOjCI,oBPgCJ,iBO/B8B,QPgC9B,WO9CE,eP6CF,iBOzDA,QP0DA,WOrCI,uBPoCJ,iBOzDA,QP0DA,WOjCI,qBPgCJ,iBO/B8B,QPgC9B,WO1BA,YACE,cACA,WAEA,wBACE,iBAKJ,eACE,oBACA,mBACA,yBACA,gBACA,YAIF,cACE,UT7CU,OSiDZ,YACE,UTjDU,ISqDZ,YACE,UTpDU,QUkCd,QA/DE,wBACA,YACA,MbRW,QaSX,mHACA,2BACA,oBACA,yBACA,cVoJc,IUnJd,2CAEA,WACE,wBACA,oBACA,oBAGO,0BAEP,gBACA,cAIA,qBACE,2BAIJ,aAEE,aACA,cAGF,UACE,cAEA,gBACE,cAIK,kBACP,0BAGF,aACE,yBAGH,iBACC,yBAIE,sBACE,gBAaN,iBArEE,wBACA,YACA,MbRW,QaSX,mHACA,2BACA,oBACA,yBACA,cVoJc,IUnJd,2CAEA,oBACE,wBACA,oBACA,oBAGO,mCAEP,gBACA,cAIA,8BACE,2BAIJ,sBAEE,aACA,cAGF,mBACE,cAEA,yBACE,WAIK,2BACP,0BAGF,sBACE,yBAGH,0BACC,yBAIE,+BACE,gBAmBN,cA3EE,wBACA,YACA,MbRW,QaSX,mHACA,2BACA,oBACA,yBACA,cVoJc,IUnJd,yCAEA,iBACE,wBACA,oBACA,oBAGO,gCAEP,gBACA,cAIA,2BACE,2BAIJ,mBAEE,aACA,cAGF,gBACE,cAEA,sBACE,cAIK,wBACP,0BAGF,mBACE,yBAGH,uBACC,yBAIE,4BACE,gBAyBN,iBAjFE,wBACA,YACA,MbRW,QaSX,mHACA,2BACA,oBACA,yBACA,cVoJc,IUnJd,0CAEA,oBACE,wBACA,oBACA,oBAGO,mCAEP,gBACA,cAIA,8BACE,2BAIJ,sBAEE,aACA,cAGF,mBACE,cAEA,yBACE,cAIK,2BACP,0BAGF,sBACE,yBAGH,0BACC,yBAIE,+BACE,gBA+BN,iBAvFE,wBACA,YACA,MbRW,QaSX,mHACA,2BACA,oBACA,yBACA,cVoJc,IUnJd,yCAEA,oBACE,wBACA,oBACA,oBAGO,mCAEP,gBACA,cAIA,8BACE,2BAIJ,sBAEE,aACA,cAGF,mBACE,cAEA,yBACE,cAIK,2BACP,0BAGF,sBACE,yBAGH,0BACC,yBAIE,+BACE,gBAqCN,gBA7FE,wBACA,YACA,MbRW,QaSX,mHACA,2BACA,oBACA,yBACA,cVoJc,IUnJd,yCAEA,mBACE,wBACA,oBACA,oBAGO,kCAEP,gBACA,cAIA,6BACE,2BAIJ,qBAEE,aACA,cAGF,kBACE,cAEA,wBACE,cAIK,0BACP,0BAGF,qBACE,yBAGH,yBACC,yBAIE,8BACE,gBCjEN,UACE,kBACA,gCACA,kBXwKiB,eWvKjB,UXuKiB,eWtKjB,6BACA,qBACA,WAEA,sBTgCA,WS9BE,iBACA,kBACA,YACA,eACA,oBACA,oBACA,aACA,yBACA,sBACA,8BACA,YXTS,uGE+BX,6BACE,WACA,WACA,cEcA,wBOnDF,sBAeI,UX6HI,QW1HN,0BACE,WAGF,wBACE,qBAKN,eACE,gBAGF,YACE,oBACA,oBACA,aACA,2BACA,kBACA,iBAIF,eACE,cACA,UXdY,OWiBd,gBACE,WACA,cACA,eACA,WACA,WAEA,0BACE,cPJA,0BOGF,0BAII,aAIJ,mBACE,SACA,UACA,WACA,qBAIJ,qBACE,cACA,qBACA,mBAEA,yBACE,kBACA,gBClFJ,aVqCE,WUnCA,cACA,eACA,iBACA,kBACA,YZEW,uGYDX,kBZgKiB,eY/JjB,UZ+JiB,eY9JjB,4BACA,oBV6BA,oBACE,WACA,WACA,cEcA,wBQxDJ,aAaI,UZoIM,QYjIR,gBACE,UACA,gBACA,UZcU,MIuBV,wBQxCF,gBAMI,YACA,0BRiCF,wBQxCF,gBAWI,0BAIJ,gBACE,eAGF,sBACE,iBAQJ,YVPE,WUSA,WACA,eACA,gBACA,WVVA,mBACE,WACA,WACA,cUSF,eACE,SACA,UACA,qBACA,YZ/CS,uGYkDX,eACE,cACA,WACA,iBAEA,iBACE,cACA,oBACA,iBACA,YZ3DO,uGY4DP,eACA,iBACA,gBACA,kBACA,qBACA,MfzEa,Qe0Eb,yBACA,gBAEA,uBACE,Mf7DsB,QegExB,2DAEE,WACA,WflFQ,KeqFV,0BACE,wBACA,oBACA,mBAIJ,2BACE,cAEA,6BACE,uBZ8DQ,IY7DR,0BZ6DQ,IYxDV,4BACE,wBZuDQ,IYtDR,2BZsDQ,IYhDd,mBACE,cACA,gBACA,WACA,UACA,YZ7GS,uGY8GT,UZtFU,IYuFV,iBACA,kBACA,qBACA,Mf1He,Qe2Hf,yBACA,cZoCY,IYlCZ,yBV7CF,iBLjFiB,QKkFjB,WUgDE,+BACE,0BACA,6BAGF,8BACE,iBACA,yBACA,4BAGF,4BACE,wBACA,oBACA,mBAKN,mIAIE,eACA,gBACA,6BAOF,YACE,kBACA,oBACA,oBACA,aACA,yBACA,sBACA,mBACA,WZRW,IYSX,Wf7KiB,Ke+KjB,cACE,cACA,cACA,MfjLS,QekLT,qBACA,wBACA,gBAEA,oBACE,MfvLO,Qe0LT,wBACE,cACA,mBAGF,yBACE,cAIJ,gBACE,wBACA,gBAGF,oBACE,2BACA,kBACA,OZxCgB,KYyChB,SACA,aACA,+BACA,eAGF,2BACE,oBACA,oBACA,aACA,qBACA,kBACA,yBACA,mBACA,WACA,OACA,gBAEA,8BACE,mBACA,cACA,UAGF,6BACE,kBAEA,oCACE,WACA,kBACA,OACA,SACA,WACA,Wf3OQ,Ke4OR,WACA,mBZzEY,oBY0EZ,WZ1EY,oBY2EZ,iDACA,yCAGF,0CACE,4BACA,wBACA,oBAKN,0BACE,kBACA,SACA,QACA,gBACA,YACA,yBACA,cZpGY,IYqGZ,WfvQe,KewQf,4EAEA,oEAEA,iCACE,aAGF,4BACE,SACA,kBACA,UZjPQ,IYmPR,kCACE,MfrRK,QesRL,Wf7QmB,QeiRvB,iCACE,WACA,kBACA,UACA,WACA,QACA,mBACA,yBACA,mCACA,cACA,UAGF,gCACE,WACA,kBACA,UACA,WACA,QACA,mBACA,yBACA,gCACA,cACA,UAGF,6BACE,cACA,gCAEA,wCACE,mBAQJ,kCACE,mBACA,eACA,iBASN,WACE,oBAEA,iDAEE,aRzRA,6BQ6RA,iBACE,kBACA,qBACA,4BACA,MZxSC,QYySD,UZzTQ,MY0TR,iBACA,yBACA,cZ7LU,IY8LV,WACA,gCACA,wBACA,eAEA,+CAEE,WACA,kBACA,UACA,WACA,YACA,cACA,cACA,iBZ3TD,QY4TC,gCACA,wBAGF,uBACE,gCACA,4BACA,wBAGF,uBACE,WACA,aZxUD,QYyUC,sBAEA,2DAEE,sBAMN,+BACE,WACA,sBAEA,2EAEE,sBAKJ,6BACE,gCACA,4BACA,wBAGF,2CACE,4BACA,wBACA,oBAGF,cACE,kBAGF,aACE,cACA,iBRzWF,kDQuWA,aAKI,mBACA,uBR7WJ,6BQgXE,mBACE,2BAMR,uBACE,SACA,kBAEA,yBACE,cAGF,+BACE,mBACA,kBACA,mBACA,iBRnYA,6BQuXJ,uBAgBI,kBACA,aACA,WACA,gBACA,WACA,mCACA,2BACA,oCACA,gCACA,6BRhZA,6BQqZF,qCACE,mCACA,2BACA,kBACA,iBACA,UACA,eACA,kCACA,8BACA,2BAIJ,YACE,SACA,qBACA,YZrdW,uGYsdX,UZ9bY,IY+bZ,iBAGF,gBACE,cACA,eACA,iBACA,YZ9dW,uGY+dX,UZtcY,MYucZ,iBACA,yBACA,gCAOF,KACE,YZ1eW,uGY2eX,MZlcK,QYmcL,iBftfiB,KeufjB,yBACA,cZtVc,IYuVd,mBZtVW,2BYuVX,WZvVW,2BYyVX,iBACE,WACA,UZ3dU,MY4dV,Wf3fY,Ke4fZ,uBZ9VY,IY+VZ,wBZ/VY,IYmWd,eVlbA,iBFTa,QEUb,cUsbF,WACE,SACA,UACA,WACA,gBACA,UZ5eY,MIuBV,wBQgdJ,WAQI,UZ9eU,SYifZ,aACE,cACA,sBACA,MfthBe,QeuhBf,iBACA,gBACA,gCAEA,mBACE,Mf7hBO,QeiiBX,sBACE,qBACA,mBAGF,4BACE,qBAGF,kCACE,qBAGF,wCACE,qBAGF,8CACE,qBCtjBJ,cXyCE,WWvCA,WACA,cACA,eACA,WACA,eACA,MhBHiB,QgBIjB,kBbmKiB,ealKjB,UbkKiB,eajKjB,6BACA,qBACA,iBhBNc,KKqCd,qBACE,WACA,WACA,cWhCF,qBX2BA,WWzBE,iBACA,kBACA,eACA,eACA,kBXuBF,4BACE,WACA,WACA,cEcA,wBS9CF,qBASI,Ub8HI,Qa1HR,gBACE,cACA,qBAEA,sBACE,0BAIJ,4EAIE,MhBpCe,QgBwCnB,wBACE,YbjCW,uGakCX,UbRY,QaYZ,wBACE,SACA,UACA,qBAGF,wBACE,qBACA,gBACA,mBACA,YbhDS,uGaiDT,UbxBU,MayBV,yBAGF,kCACE,WACA,kBAGF,uBACE,mBACA,iBAIA,qCACE,mBC5EJ,sCACE,oBAIJ,gBACE,iBACA,kBACA,Od8JkB,Kc7JlB,SACA,aACA,MjBNc,KiBOd,+BACA,eACA,uBACA,eAEA,sBACE,cAIJ,aACE,WACA,YAGF,gBACE,aACA,kBACA,gBACA,mBAEA,4BACE,WACA,iBACA,kBACA,iBACA,kBACA,kBdmIe,eclIf,UdkIe,ecjIf,6BACA,qBViBA,wBU1BF,4BAYI,UduGI,QclGR,sBACE,+BAGF,8BACE,cACA,gBACA,UACA,YACA,aACA,gBACA,+BACA,Ud7BU,QI0BV,wBULF,8BAWI,UdjCQ,SI2BV,wBULF,8BAeI,UdtCQ,Sc0CZ,4BACE,cACA,mBAEA,mCACE,WACA,cAIJ,gCACE,gBACA,UdjDU,McoDZ,+BACE,kBV9BA,wBU6BF,+BAII,WVjCF,wBU6BF,+BAQI,WAIJ,qCACE,aAGF,uCACE,gBAMJ,gBACE,0BACA,kBAGF,oCACE,MjBhHc,KiBiHd,kBACA,0BAGF,sCACE,MjBtHc,KiBuHd,kBACA,iBC9HF,uCAEE,kBACA,kBACA,WfwHO,QevHP,Mf4HO,Ke3HP,YfQU,2CePV,Uf8BY,Me7BZ,gBACA,cf2Jc,IezJd,kHAEE,SACA,YAIJ,iBACE,gBACA,cACA,SAEA,oBACE,UACA,uBACA,SAGA,4DAEE,kBACA,UACA,Mf+FG,Qe9FH,+BACA,iBAIF,wDAEE,iBAIJ,qBACE,SAIJ,eACE,WAGF,gBACE,iBf2EO,KexEP,cAEE,MfoEK,QelEP,gBAEE,MfoEK,QelEP,cAEE,MfsEK,QepEP,cAEE,Mf6DK,Qe3DP,cAEE,MfqDK,KenDP,cAEE,MfwDK,QetDP,cAEE,Mf6CK,Ke3CP,eAEE,MfwCK,QetCP,eAEE,MfoCK,QelCP,eAEE,MfgCK,Qe9BP,eAEE,Mf4BK,Qe1BP,eAEE,Mf4BK,Qe1BP,eAEE,kBAEF,eAEE,MfiBK,KehBL,iBAEF,eAEE,MfkBK,QehBP,eAEE,MfOK,QeNL,iBAEF,eAEE,iBAEF,eAEE,MfMK,QeLL,iBAEF,eAEE,MfGK,QeDP,eAEE,MfDK,QeGP,eAEE,MfPK,QeSP,eAEE,MfTK,QeWP,eAEE,MfbK,QeeP,eAEE,MfrBK,QeuBP,eAEE,MfxBK,Qe0BP,cAEE,Mf9BK,QegCP,cAEE,MfhCK,QekCP,eAEE,MflCK,QeoCP,eAEE,Mf9CK,KegDP,eAEE,Mf7CK,Qe+CP,eAEE,MfnDK,QeqDP,eAEE,MfnDK,QeqDP,eAEE,Mf9DK,KegEP,eAEE,Mf/DK,QeiEP,eAEE,Mf9DK,QegEP,eAEE,Mf1EK,Ke4EP,eAEE,MfzEK,Qe2EP,eAEE,Mf1EK,Qe4EP,eAEE,MftFK,KewFP,eAEE,MfnFK,QeqFP,eAEE,Mf3FK,Qe6FP,eAEE,Mf3FK,Qe6FP,cAEE,MftGK,KewGP,eAEE,MftGK,QewGP,eAEE,Mf1GK,Qe4GP,eAEE,Mf9GK,QegHP,eAEE,MflHK,QeoHP,eAEE,MfpHK,QesHP,eAEE,Mf9HK,KegIP,eAEE,MfnIK,QeqIP,eAEE,MfhIK,QekIP,eAEE,MftIK,QewIP,eAEE,MfxIK,Qe0IP,eAEE,Mf9IK,QegJP,eAEE,MfhJK,QekJP,eAEE,MfpJK,QesJP,eAEE,MfxJK,Qe0JP,eAEE,Mf5JK,Qe8JP,eAEE,MftKK,KewKP,eAEE,MfvKK,QeyKP,eAEE,Mf3KK,Qe6KP,eAEE,Mf/KK,QeiLP,eAEE,MflLK,QeuLP,kBACE,gBCvTJ,oBAEE,aACA,kBAKF,MACE,aAGF,aACE,UAKF,sFAIE,6BACA,8BACA,sBACA,qBACA,oBACA,gBAGF,mGAGE,wBAKF,wDAEE,qBACA,uBACA,sBACA,cACA,cACA,iBACA,uBACA,gBACA,eACA,qBACA,sCAOF,WACE,eACA,WACA,SACA,YhBvDW,uGgBwDX,mBAGF,cACE,SACA,QACA,gBAOF,WACE,gBAGF,aACE,kBAGF,YACE,iBAGF,cACE,mBAGF,aACE,mBAOF,WACE,UAEA,cACE,qBAGF,oCACE,kBACA,UAIJ,sBACE,gBASF,IACE,WAGF,SACE,iBACA,kBACA,WASF,YACE,cACA,iBACA,kBZvFE,0BYoFJ,YAMI,WACA,kBAMJ,aACE,cACA,iBACA,kBZpGE,0BYiGJ,aAMI,YACA,iBAMJ,cACE,cACA,iBACA,kBZjHE,wBYsHJ,MAEI,yCAQJ,MACE,qBACA,kBACA,UACA,aACA,cACA,kBACA,WACA,sBAMA,4EAIE,MnB3MS,QmB8MX,2DAEE,MhB9HY,QgBiId,4BACE,MhBjIc,QgBoIhB,4DAEE,MhBrIa,QgBwIf,0FAGE,MhB1Ia,QgB6If,yBACE,MhB7IW,QgBgJb,6BACE,MhBhJe,QgBmJjB,sFAGE,MhBrJW,QgBwJb,yBACE,MhBxJW,QgB2Jb,4BACE,MhB3Jc,QgB8JhB,0BACE,MhB9JY,QgBiKd,yDAEE,MhBlKW,QgBqKb,yDAEE,MhBtKa,QgByKf,6DAEE,MhB1Ka,QgB6Kf,6FAGE,MhB/Kc,QgBkLhB,yBACE,MhBlLW,QgBqLb,mDAEE,MhBtLQ,QgByLV,6BACE,MhBzLe,KgB4LjB,kEAEE,MhB7LkB,QgBgMpB,yDAEE,MhBjMW,QgBoMb,2DAEE,MhBrMY,QgBwMd,iFAGE,MhB1MU,QgB6MZ,uBACE,MhB7MS,QgBgNX,0BACE,MhBhNY,KgBmNd,qDAEE,MhBpNS,QgB4Nb,SACE,kBACA,MhBlKc,OgBmKd,OhBlKe,OgBmKf,WnBtUc,KmBuUd,YACA,uBACA,eAEA,+BAEE,WACA,kBACA,OACA,MhB9KY,OgB+KZ,OhB9Ka,OgB+Kb,WnBlVY,KmBmVZ,uBACA,eAGF,gBACE,YAGF,eACE,eAIJ,gBAEE,yBAGA,6CAEE,iCACA,6BACA,yBACA,MACA,MhBzMY,OgB6Md,uBACE,2CACA,mCAEF,sBACE,4CACA,oCAMA,gCADF,2BAEI,WACA,eACA,MACA,OACA,WACA,YACA,UACA,iBnBvYa,KmBwYb,mBhBhOc,oBgBiOd,WhBjOc,oBgBkOd,qBAKF,iCACE,WACA,mBhBzOc,oBgB0Od,WhB1Oc,oBgB2Od,oBAMJ,sHAGE,mBAIA,yCACE,yBZvWF,wBYgXJ,QdnYE,WcsYE,wBACA,gBACA,QdtYF,eACE,WACA,WACA,ccqYA,UACE,eASN,MACE,gBACA,aACA,mBACA,yBACA,yBACA,chB/Rc,IgBgSd,2CAOF,YACE,gBACA,kBAEA,mBACE,kBACA,WACA,MACA,OACA,WACA,YACA,YACA,uCAGF,mBACE,cAIJ,OACE,aACA,eACA,YACA,QACA,SACA,mBACA,kBACA,aACA,aACA,gBACA,yBACA,chBvUc,IgBwUd,WhBvUW,2BgByUX,cACE,SACA,iBAGF,wBACE,uBAGF,gBACE,iBACA,6BAQJ,UACE,cACA,qBAGF,WACE,cAEA,yCAGE,gBACA,UhBxeU,MgB4ed,kBACE,MhB7dK,QgB8dL,qBAEA,wBACE,0BAQJ,UACE,MnB5gB4B,QmB6gB5B,iBAQA,+DAGE,SAQJ,4BACE,kBACA,kBACA,sBACA,SACA,gBACA,eAEA,wGAGE,kBACA,MACA,OACA,WACA,YAMF,oFAEE,gBC1kBJ,MfyCE,WevCA,iBACA,kBACA,iBACA,kBACA,kBjBqKiB,eiBpKjB,UjBoKiB,eiBnKjB,eACA,6BACA,qBfiCA,aACE,WACA,WACA,cEcA,wBa5DJ,MAaI,UjBwIM,QiBpIV,KACE,oBACA,oBACA,aACA,iBACA,4BACA,6BACA,0BACQ,sBAGV,iCAEE,cb8BE,wBa3BJ,MAEI,YACA,yBACA,cjBuHyB,OIhGzB,wBa3BJ,MAQI,yBACA,cjBmHkB,OiBhHpB,wBACE,WACA,eACA,cACA,eACA,WACA,WAEA,uJAGE,kBACA,WACA,cACA,eACA,WACA,WAKN,aACE,aACA,cAEA,eACE,MpBpES,QoBqET,qBAGF,qDACE,kBAIJ,YACE,YjBrEW,uGiBsEX,UjB/CY,OiBmDZ,kBACE,oBACA,gCAIF,0LACC,kBACA,UACA,UACA,eACA,+CACA,4CACA,0CACA,uCAGD,8NACC,UAID,qDAGE,cAIF,iBACE,iBAYA,iCACE,0BAEA,qCACE,oCAKN,8BACE,kBACA,qBACA,eACA,WpB9H0B,QoB+H1B,cjBsBY,IiBpBZ,2EAEE,sBACA,YAIJ,kBACE,eACA,YjB7IS,uGiB8IT,iBAGF,kBACE,gBACA,YjBnJS,uGiBoJT,UjB3HU,MiB8HZ,sBACE,UjB/HU,MiBmIZ,iCACE,kBACA,qBAIJ,YACE,kBACA,kBfvIA,WeyIA,kBjBPiB,eiBQjB,UjBRiB,eiBSjB,6BACA,qBf1IA,mBACE,WACA,WACA,ceyIF,qBACE,kBACA,kBACA,cfjJF,WemJE,sBACA,4BACA,2BACA,kBjBpBe,eiBqBf,UjBrBe,eiBsBf,6BACA,qBfvJF,4BACE,WACA,WACA,cesJA,uBACE,WAGF,8BACE,iBACA,kBb9IF,wBa4IA,8BAKI,UjBxDE,QiB4DN,kKAIE,WACA,uCAGF,iCACE,UjBxEG,MiB2EL,kCACE,UjB7LQ,QI2BV,0BaiKA,kCAII,UjBjMM,SiBuMd,kBACE,WACA,YACA,+BAGF,oBACE,kBACA,SACA,QACA,cACA,gBACA,WACA,YjBzOM,oBiB0ON,UjB/MY,QiBgNZ,gBACA,iBACA,UACA,WACA,wBb9LE,wBaiLJ,oBAgBI,kBAGF,sBACE,WACA,qBAQJ,aACE,eACA,gBACA,6BbjNE,0BaoNA,uBACE,SACA,mBACA,WACA,YACA,gBACA,UACA,kBACA,WAKN,mBACE,mBACA,UjB1PY,MiB2PZ,yBAOF,2BACE,eACA,MpBrSiB,QoBsSjB,YjB9RW,uGiB+RX,UjBtQY,MiBwQZ,+BACE,SAGF,+BACE,cAIJ,kBACE,mBACA,UjBnRY,MiBoRZ,yBAGF,wBACE,YACA,kBACA,mBAQA,qBACE,aAGF,uBACE,kBAIJ,qBACE,qBACA,iBACA,kBACA,iBACA,qBACA,yBACA,cjBnLc,IiBqLd,2BACE,qBACA,MpBtU0B,QoB0U9B,mBACE,kBACA,mBAEA,oCACE,gCAGF,wCACE,aAGF,sCACE,WACA,SAGF,sCACE,eAIJ,iBACE,mBACA,MpBnXiB,QoBsXnB,iBACE,MpBvXiB,QoB0XnB,iBACE,aACA,oBACA,qCACA,iBACA,UACA,gBACA,gBbzUE,wBakUJ,iBAUI,sCAGF,mBACE,oBACA,oBACA,aACA,gBACA,yBACA,sBACA,8BACA,cACA,qBACA,gCAIJ,aACE,cACA,WACA,MpBxZiB,QoByZjB,eACA,yBACA,iBACA,qBAOF,gBACE,WACA,cACA,eACA,WACA,WAGF,sBACE,gBACA,mBACA,iBACA,UjB9YY,MiB+YZ,6BACA,yBAGF,qBACE,mBjB/QkB,oBiBgRlB,WjBhRkB,oBiBmRhB,oJAIE,oBACA,mBACA,yBACA,gBACA,YAKN,SfjaE,WemaA,afjaA,gBACE,WACA,WACA,cegaF,0BACE,gCAIJ,yBACE,WACA,WACA,Yb1ZE,wBauZJ,yBAMI,YACA,cAIJ,iBACE,WACA,YACA,kBbraE,wBakaJ,iBAMI,WACA,YACA,YACA,0BAIJ,0BACE,YACA,wBbjbE,wBa+aJ,0BAKI,0BAIJ,iBACE,SAEA,mBACE,qBAIJ,eAEE,SAEA,iBACE,qBAQJ,efheE,WekeA,WACA,eACA,gBACA,6BfneA,sBACE,WACA,WACA,cEcA,wBa6cJ,eAQI,YACA,0BbtdA,wBa6cJ,eAaI,0BAGF,iBACE,cACA,qBAIJ,qBACE,mBACA,UjB5fY,MiB6fZ,yBbteE,wBa8eF,YAEI,iBbhfF,wBa8eF,YAMI,iBbpfF,wBawfF,qBAEI,iBb1fF,wBawfF,qBAMI,iBC1jBN,SACE,eACA,kBd0DE,wBc5DJ,SAKI,YACA,yBACA,clBqJyB,OIhGzB,wBc5DJ,SAWI,yBACA,clBiJkB,OkB7ItB,eACE,kBAEA,iBACE,kBACA,WAGF,gCACE,gBAIJ,mBACE,sBACA,oBACA,UlBIY,IkBHZ,MrB7BiB,QqB8BjB,gCAEA,oDACE,gBAIJ,qBACE,oBACA,YlB/BW,uGkBgCX,oBACA,gBACA,uBAEA,8CACE,WACA,kBACA,OACA,MACA,QACA,SAGF,yBACE,WAMF,oCACE,eACA,mBAIJ,uBACE,aACA,UlBnCY,MkBqCZ,yBACE,cAGF,yBACE,kBAIJ,sBACE,kBACA,clB+Ec,IkB9Ed,gBAEA,0BACE,WAIJ,uBACE,kBACA,SACA,QACA,cACA,gBACA,WACA,YlBzFM,oBkB0FN,UlB9DY,OkB+DZ,gBACA,iBACA,UACA,WACA,wBd9CE,wBciCJ,uBAgBI,kBAGF,yBACE,WACA,qBASF,mDACE,eACA,edjEA,wBc0EF,wBAII,qBd9EF,wBc0EF,wBAQI,qBAKN,YACE,kBdxFE,0BcuFJ,YAII,WACA,qBAEA,4BACE,WACA,cAGF,4BACE,WACA,2BdrGF,wBcuFJ,YAmBI,cACA,eACA,qBAEA,4BACE,WAGF,4BACE,WAGF,4BACE,WACA,0BAGF,4BACE,WACA,0BAGF,4BACE,WACA,2BAIJ,mDACE,eACA,eAGF,4BACE,cAEA,oCACE,aAIJ,iCACE,gBACA,UlB7KU,IkBgLZ,mCACE,adzJA,wBcwJF,mCAII,cACA,UlBpLQ,OIuBV,0BciKF,kCAEI,kBdnKF,wBciKF,kCAMI,kBASN,kBhBnME,WgBqMA,kBACA,gChBpMA,yBACE,WACA,WACA,cgBmMF,uCACE,gBAIJ,eACE,kBACA,kBACA,kBd7LE,0Bc0LJ,eAMI,WACA,gBACA,qBAEA,+BACE,WACA,cAGF,+BACE,WACA,0BAGF,+BACE,WACA,0BAGF,qCACE,iBACA,iBAIJ,mCACE,2BACA,4BAGF,6BACE,WACA,kBACA,OACA,MACA,QACA,SAGF,qBACE,kBACA,WACA,cACA,eACA,WACA,WACA,kBAEA,oCACE,WAGF,2CACE,kBAGF,mCACE,WACA,kBACA,OACA,MACA,QACA,Sd9PF,0BckQE,2CACE,WACA,qBAGF,yCACE,YACA,2BACA,4BACA,sBAKN,sBACE,kBACA,WACA,cACA,eACA,WACA,WACA,kBAEA,qCACE,WAGF,4CACE,kBAGF,oCACE,WACA,kBACA,OACA,MACA,QACA,SdvSF,0BcgRF,sBA2BI,iBAEA,4CACE,YACA,qBAGF,0CACE,WACA,qBACA,2BACA,6BAKN,uBACE,kBACA,WACA,cACA,eACA,WACA,WACA,kBAEA,sCACE,WACA,WAGF,6CACE,kBAGF,qCACE,WACA,kBACA,OACA,MACA,QACA,SdnVF,0Bc2TF,uBA4BI,kBAEA,6CACE,cACA,qBAGF,2CACE,cACA,sBAUJ,gDACE,iBACA,cAIJ,qHAIE,cdpXA,wBc6XF,eAEI,iBd/XF,wBc6XF,eAMI,iBAQL,kCACC,qBCpcF,SjBqCE,WAEA,gBACE,WACA,WACA,cEcA,wBexDJ,SAWI,WACA,wBACA,YACA,2CACA,mCAEA,eACE,UAGF,gBACE,gBAIA,oCf8BF,wBexDJ,SA+BI,yBAGF,WACE,eACA,kBAGF,4DAKE,gBACA,YnBrCS,uGmBwCX,uBAEE,YnB1CS,uGmB2CT,UnBlBU,MmBmBV,gBAGF,aACE,WAEA,mBACE,WACA,YAKN,gBACE,kBfVE,wBeSJ,gBAII,kBACA,MACA,QACA,MnBgFyB,MmB/EzB,oBACA,iBACA,WAEA,uBjBxCF,WiB0CI,wBACA,gBACA,QACA,YjB3CJ,8BACE,WACA,WACA,ciB2CI,uCACE,gBACA,8Bf/BN,wBeSJ,gBA6BI,MnB2DkB,MmB1DlB,qBfvCA,wBe2CJ,wBAEI,kBACA,YACA,gBf/CA,wBe2CJ,wBAQI,gBAQJ,gBACE,mBACA,mBACA,WACA,Yf/DE,wBe2DJ,gBAOI,cACA,WACA,aAGF,oBACE,gBACA,kBfzEA,wBeuEF,oBAKI,YACA,0BAKN,iBACE,mBACA,mBACA,kBACA,mBACA,cfvFE,wBekFJ,iBAQI,cACA,WACA,eACA,iBAGF,mBACE,cACA,qBAIJ,cACE,SfvGE,wBesGJ,cAII,gBACA,oBAGJ,uBACE,YnB/JW,uGmBgKX,UnBxIY,ImB2Id,aACE,SfpHE,wBemHJ,aAII,gBACA,oBAIJ,sBACE,kBACA,mBACA,sBACA,YnBhLW,uGmBiLX,WACA,eAGE,sCACE,gBAKF,+CACE,iBf5IF,wBe4HJ,sBAqBI,eAGF,6BACE,kBACA,gBAGE,gCADF,oCAEI,WACA,eACA,MACA,OACA,WACA,YACA,qBAKF,yCACE,oBftKJ,wBeoJF,6BAuBI,cAKN,cACE,aACA,kBACA,QACA,gBACA,aACA,qBACA,yBACA,cnBhFc,ImBiFd,WtBnPiB,KsBoPjB,oEACA,eAEA,0BACE,cf9LA,wBegLJ,cAkBI,cACA,kBACA,SACA,UACA,SACA,yBACA,iBAGF,qBACE,cACA,WACA,kBACA,UACA,sBACA,QACA,mBACA,yBACA,mCACA,UfrNA,wBe2MF,qBAaI,cAIJ,oBACE,cACA,WACA,kBACA,UACA,sBACA,QACA,mBACA,yBACA,gCACA,UftOA,wBe4NF,oBAaI,cAIJ,iBACE,aACA,qBAGF,iBACE,mBAGF,gBACE,cACA,kBACA,kBACA,gBACA,mBACA,cACA,UnBrRU,ImBsRV,qBAEA,sBACE,0BASN,sBACE,kBf3QE,wBe0QJ,sBAII,iBACA,YACA,cACA,cACA,qBACA,qBACA,gBAEA,6BACE,YfvRF,wBe0QJ,sBAkBI,cACA,sBCzVJ,aAEE,SACE,aAGF,EACE,2BACA,8BACA,sBAGF,KACE,SACA,UACA,2BACA,eAGF,KACE,cACA,2BACA,sBACA,eACA,gBACA,kCACA,mCACA,kCAGF,kBAME,WACA,gBACA,qBACA,aAGF,GACE,iBAGF,GACE,eAGF,GACE,kBAGF,GACE,iBAGF,GACE,kBAGF,GACE,eAGF,YAEE,WACA,0BACA,qBAGF,MACE,yBAGF,MACE,2BAGF,YAGE,6BAGF,MAEE,iBAGF,IACE,SACA,cACA,0BACA,sBAGF,GACE,SACA,6BACA,SACA,iBACA,UAGF,GACE,iBAGF,GACE,SACA,qBAGF,2BAEE,SACA,qBAGF,kDAWE,wBAGF,aAKE,UACA,SAGF,kBAME,uBACA,wBAGF,eAGE,wBAGF,IACE,sBACA,uBACA,wBAGF,IACE,gCACA,qBAGF,2EAGE,4BACA,cAGF,uCAEE,6BAGF,MACE,eAGF,MACE,SACA,UACA,WAGF,+BAEE,yBAGF,kBACE,wBAGF,UACE,aAGF,oBACE,WAGF,+DAEE,WAGF,oBACE,sBACA,2BACA,UAEA,sBACE,sBAQJ,uLAcE,wBACA","sourcesContent":["/* ==========================================================================\n Air skin\n ========================================================================== */\n\n/* Colors */\n\n$background-color: #eeeeee !default;\n$text-color: #222831 !default;\n$muted-text-color: #393e46 !default;\n// $primary-color: #000 !default;\n$primary-color: #666666 !default;\n$border-color: mix(#fff, #393e46, 75%) !default;\n$footer-background-color: $primary-color !default;\n$link-color: #F16334 !default;\n$masthead-link-color: $text-color !default;\n$masthead-link-color-hover: $text-color !default;\n$navicon-link-color-hover: mix(#fff, $text-color, 80%) !default;\n\n$code-color : #C72557 !default;\n$code-background-color : #F9F2F4 !default;\n$code-background-color-dark : #F9F2F4 !default;\n$success-color : #27ae60 !default;\n$warning-color : #e67e22 !default;\n$danger-color : #c0392b !default;\n$info-color : #03a9f4 !default;\n$link-color-hover : mix(#000, $link-color, 25%) !default;\n$link-color-visited : mix(#fff, $link-color, 40%) !default;\n\n.page__content {\n .notice,\n .notice--primary,\n .notice--info,\n .notice--warning,\n .notice--success,\n .notice--danger {\n color: $text-color;\n }\n}\n\n.page__footer {\n color: #fff !important; // override\n}\n\n.page__footer-follow .social-icons .svg-inline--fa {\n color: inherit;\n}\n","/*!\n * Minimal Mistakes Jekyll Theme 4.24.0 by Michael Rose\n * Copyright 2013-2020 Michael Rose - mademistakes.com | @mmistakes\n * Licensed under MIT (https://github.com/mmistakes/minimal-mistakes/blob/master/LICENSE)\n*/\n\n/* Variables */\n@import \"minimal-mistakes/variables\";\n\n/* Mixins and functions */\n@import \"minimal-mistakes/vendor/breakpoint/breakpoint\";\n@include breakpoint-set(\"to ems\", true);\n@import \"minimal-mistakes/vendor/magnific-popup/magnific-popup\"; // Magnific Popup\n@import \"minimal-mistakes/vendor/susy/susy\";\n@import \"minimal-mistakes/mixins\";\n\n/* Core CSS */\n@import \"minimal-mistakes/reset\";\n@import \"minimal-mistakes/base\";\n@import \"minimal-mistakes/forms\";\n@import \"minimal-mistakes/tables\";\n@import \"minimal-mistakes/animations\";\n\n/* Components */\n@import \"minimal-mistakes/buttons\";\n@import \"minimal-mistakes/notices\";\n@import \"minimal-mistakes/masthead\";\n@import \"minimal-mistakes/navigation\";\n@import \"minimal-mistakes/footer\";\n@import \"minimal-mistakes/search\";\n@import \"minimal-mistakes/syntax\";\n\n/* Utility classes */\n@import \"minimal-mistakes/utilities\";\n\n/* Layout specific */\n@import \"minimal-mistakes/page\";\n@import \"minimal-mistakes/archive\";\n@import \"minimal-mistakes/sidebar\";\n@import \"minimal-mistakes/print\";\n","////////////////////////\n// Settings //\n////////////////////////\n\n// overlay\n$mfp-overlay-color: #000; // Color of overlay screen\n$mfp-overlay-opacity: 0.8; // Opacity of overlay screen\n$mfp-shadow: 0 0 8px rgba(0, 0, 0, 0.6); // Shadow on image or iframe\n\n// spacing\n$mfp-popup-padding-left: 8px; // Padding from left and from right side\n$mfp-popup-padding-left-mobile: 6px; // Same as above, but is applied when width of window is less than 800px\n\n$mfp-z-index-base: 1040; // Base z-index of popup\n\n// controls\n$mfp-include-arrows: true; // Include styles for nav arrows\n$mfp-controls-opacity: 1; // Opacity of controls\n$mfp-controls-color: #fff; // Color of controls\n$mfp-controls-border-color: #fff; // Border color of controls\n$mfp-inner-close-icon-color: #fff; // Color of close button when inside\n$mfp-controls-text-color: #ccc; // Color of preloader and \"1 of X\" indicator\n$mfp-controls-text-color-hover: #fff; // Hover color of preloader and \"1 of X\" indicator\n$mfp-IE7support: true; // Very basic IE7 support\n\n// Iframe-type options\n$mfp-include-iframe-type: true; // Enable Iframe-type popups\n$mfp-iframe-padding-top: 40px; // Iframe padding top\n$mfp-iframe-background: #000; // Background color of iframes\n$mfp-iframe-max-width: 900px; // Maximum width of iframes\n$mfp-iframe-ratio: 9/16; // Ratio of iframe (9/16 = widescreen, 3/4 = standard, etc.)\n\n// Image-type options\n$mfp-include-image-type: true; // Enable Image-type popups\n$mfp-image-background: #444 !default;\n$mfp-image-padding-top: 40px; // Image padding top\n$mfp-image-padding-bottom: 40px; // Image padding bottom\n$mfp-include-mobile-layout-for-image: true; // Removes paddings from top and bottom\n\n// Image caption options\n$mfp-caption-title-color: #f3f3f3; // Caption title color\n$mfp-caption-subtitle-color: #bdbdbd; // Caption subtitle color\n.mfp-counter { font-family: $serif; } // Caption font family\n\n// A11y\n$mfp-use-visuallyhidden: false;","/* ==========================================================================\n Variables\n ========================================================================== */\n\n/*\n Typography\n ========================================================================== */\n\n$doc-font-size: 16 !default;\n\n/* paragraph indention */\n$paragraph-indent: false !default; // true, false (default)\n$indent-var: 1.3em !default;\n\n/* system typefaces */\n$serif: Georgia, Times, serif !default;\n$sans-serif: -apple-system, BlinkMacSystemFont, \"Roboto\", \"Segoe UI\",\n \"Helvetica Neue\", \"Lucida Grande\", Arial, sans-serif !default;\n$monospace: Monaco, Consolas, \"Lucida Console\", monospace !default;\n\n/* sans serif typefaces */\n$sans-serif-narrow: $sans-serif !default;\n$helvetica: Helvetica, \"Helvetica Neue\", Arial, sans-serif !default;\n\n/* serif typefaces */\n$georgia: Georgia, serif !default;\n$times: Times, serif !default;\n$bodoni: \"Bodoni MT\", serif !default;\n$calisto: \"Calisto MT\", serif !default;\n$garamond: Garamond, serif !default;\n\n$global-font-family: $sans-serif !default;\n$header-font-family: $sans-serif !default;\n$caption-font-family: $serif !default;\n\n/* type scale */\n$type-size-1: 2.441em !default; // ~39.056px\n$type-size-2: 1.953em !default; // ~31.248px\n$type-size-3: 1.563em !default; // ~25.008px\n$type-size-4: 1.25em !default; // ~20px\n$type-size-5: 1em !default; // ~16px\n$type-size-6: 0.75em !default; // ~12px\n$type-size-7: 0.6875em !default; // ~11px\n$type-size-8: 0.625em !default; // ~10px\n\n/* headline scale */\n$h-size-1: 1.563em !default; // ~25.008px\n$h-size-2: 1.25em !default; // ~20px\n$h-size-3: 1.125em !default; // ~18px\n$h-size-4: 1.0625em !default; // ~17px\n$h-size-5: 1.03125em !default; // ~16.5px\n$h-size-6: 1em !default; // ~16px\n\n/*\n Colors\n ========================================================================== */\n\n$gray: #7a8288 !default;\n$dark-gray: mix(#000, $gray, 50%) !default;\n$darker-gray: mix(#000, $gray, 60%) !default;\n$light-gray: mix(#fff, $gray, 50%) !default;\n$lighter-gray: mix(#fff, $gray, 90%) !default;\n\n$code-color: #0066cc !default;\n\n$background-color: #fff !default;\n$code-background-color: #fafafa !default;\n$code-background-color-dark: $light-gray !default;\n$text-color: $dark-gray !default;\n$muted-text-color: mix(#fff, $text-color, 20%) !default;\n$border-color: $lighter-gray !default;\n$form-background-color: $lighter-gray !default;\n$footer-background-color: $lighter-gray !default;\n\n$primary-color: #6f777d !default;\n$success-color: #3fa63f !default;\n$warning-color: #d67f05 !default;\n$danger-color: #ee5f5b !default;\n$info-color: #3b9cba !default;\n$focus-color: $primary-color !default;\n$active-color: mix(#fff, $primary-color, 80%) !default;\n\n/* YIQ color contrast */\n$yiq-contrasted-dark-default: $dark-gray !default;\n$yiq-contrasted-light-default: #fff !default;\n$yiq-contrasted-threshold: 175 !default;\n$yiq-debug: false !default;\n\n/* brands */\n$behance-color: #1769ff !default;\n$bitbucket-color: #205081 !default;\n$dribbble-color: #ea4c89 !default;\n$facebook-color: #3b5998 !default;\n$flickr-color: #ff0084 !default;\n$foursquare-color: #0072b1 !default;\n$github-color: #171516 !default;\n$gitlab-color: #e24329 !default;\n$instagram-color: #517fa4 !default;\n$keybase-color: #ef7639 !default;\n$lastfm-color: #d51007 !default;\n$linkedin-color: #007bb6 !default;\n$mastodon-color: #2b90d9 !default;\n$pinterest-color: #cb2027 !default;\n$reddit-color: #ff4500 !default;\n$rss-color: #fa9b39 !default;\n$soundcloud-color: #ff3300 !default;\n$stackoverflow-color: #fe7a15 !default;\n$tumblr-color: #32506d !default;\n$twitter-color: #55acee !default;\n$vimeo-color: #1ab7ea !default;\n$vine-color: #00bf8f !default;\n$youtube-color: #bb0000 !default;\n$xing-color: #006567 !default;\n$reddit-color: #e55820 !default;\n\n/* links */\n$link-color: mix(#000, $info-color, 20%) !default;\n$link-color-hover: mix(#000, $link-color, 25%) !default;\n$link-color-visited: mix(#fff, $link-color, 15%) !default;\n$masthead-link-color: $primary-color !default;\n$masthead-link-color-hover: mix(#000, $primary-color, 25%) !default;\n$navicon-link-color-hover: mix(#fff, $primary-color, 75%) !default;\n\n/* notices */\n$notice-background-mix: 80% !default;\n$code-notice-background-mix: 90% !default;\n\n/* syntax highlighting (base16) */\n$base00: #263238 !default;\n$base01: #2e3c43 !default;\n$base02: #314549 !default;\n$base03: #546e7a !default;\n$base04: #b2ccd6 !default;\n$base05: #eeffff !default;\n$base06: #eeffff !default;\n$base07: #ffffff !default;\n$base08: #f07178 !default;\n$base09: #f78c6c !default;\n$base0a: #ffcb6b !default;\n$base0b: #c3e88d !default;\n$base0c: #89ddff !default;\n$base0d: #82aaff !default;\n$base0e: #c792ea !default;\n$base0f: #ff5370 !default;\n\n/*\n Breakpoints\n ========================================================================== */\n\n$small: 600px !default;\n$medium: 768px !default;\n$medium-wide: 900px !default;\n$large: 1024px !default;\n$x-large: 1520px !default;\n$max-width: $x-large !default;\n\n/*\n Grid\n ========================================================================== */\n\n$right-sidebar-width-narrow: 200px !default;\n$right-sidebar-width: 300px !default;\n$right-sidebar-width-wide: 400px !default;\n\n/*\n Other\n ========================================================================== */\n\n$border-radius: 4px !default;\n$box-shadow: 0 1px 1px rgba(0, 0, 0, 0.125) !default;\n$nav-height: 2em !default;\n$nav-toggle-height: 2rem !default;\n$navicon-width: 1.5rem !default;\n$navicon-height: 0.25rem !default;\n$global-transition: all 0.2s ease-in-out !default;\n$intro-transition: intro 0.3s both !default;\n","/* Magnific Popup CSS */\n\n@import \"settings\";\n\n////////////////////////\n//\n// Contents:\n//\n// 1. Default Settings\n// 2. General styles\n// - Transluscent overlay\n// - Containers, wrappers\n// - Cursors\n// - Helper classes\n// 3. Appearance\n// - Preloader & text that displays error messages\n// - CSS reset for buttons\n// - Close icon\n// - \"1 of X\" counter\n// - Navigation (left/right) arrows\n// - Iframe content type styles\n// - Image content type styles\n// - Media query where size of arrows is reduced\n// - IE7 support\n//\n////////////////////////\n\n\n\n////////////////////////\n// 1. Default Settings\n////////////////////////\n\n$mfp-overlay-color: #0b0b0b !default;\n$mfp-overlay-opacity: 0.8 !default;\n$mfp-shadow: 0 0 8px rgba(0, 0, 0, 0.6) !default; // shadow on image or iframe\n$mfp-popup-padding-left: 8px !default; // Padding from left and from right side\n$mfp-popup-padding-left-mobile: 6px !default; // Same as above, but is applied when width of window is less than 800px\n\n$mfp-z-index-base: 1040 !default; // Base z-index of popup\n$mfp-include-arrows: true !default; // include styles for nav arrows\n$mfp-controls-opacity: 0.65 !default;\n$mfp-controls-color: #FFF !default;\n$mfp-controls-border-color: #3F3F3F !default;\n$mfp-inner-close-icon-color: #333 !default;\n$mfp-controls-text-color: #CCC !default; // Color of preloader and \"1 of X\" indicator\n$mfp-controls-text-color-hover: #FFF !default;\n$mfp-IE7support: true !default; // Very basic IE7 support\n\n// Iframe-type options\n$mfp-include-iframe-type: true !default;\n$mfp-iframe-padding-top: 40px !default;\n$mfp-iframe-background: #000 !default;\n$mfp-iframe-max-width: 900px !default;\n$mfp-iframe-ratio: 9/16 !default;\n\n// Image-type options\n$mfp-include-image-type: true !default;\n$mfp-image-background: #444 !default;\n$mfp-image-padding-top: 40px !default;\n$mfp-image-padding-bottom: 40px !default;\n$mfp-include-mobile-layout-for-image: true !default; // Removes paddings from top and bottom\n\n// Image caption options\n$mfp-caption-title-color: #F3F3F3 !default;\n$mfp-caption-subtitle-color: #BDBDBD !default;\n\n// A11y\n$mfp-use-visuallyhidden: false !default; // Hide content from browsers, but make it available for screen readers\n\n\n\n////////////////////////\n// 2. General styles\n////////////////////////\n\n// Transluscent overlay\n.mfp-bg {\n top: 0;\n left: 0;\n width: 100%;\n height: 100%;\n z-index: $mfp-z-index-base + 2;\n overflow: hidden;\n position: fixed;\n\n background: $mfp-overlay-color;\n opacity: $mfp-overlay-opacity;\n @if $mfp-IE7support {\n filter: unquote(\"alpha(opacity=#{$mfp-overlay-opacity*100})\");\n }\n}\n\n// Wrapper for popup\n.mfp-wrap {\n top: 0;\n left: 0;\n width: 100%;\n height: 100%;\n z-index: $mfp-z-index-base + 3;\n position: fixed;\n outline: none !important;\n -webkit-backface-visibility: hidden; // fixes webkit bug that can cause \"false\" scrollbar\n}\n\n// Root container\n.mfp-container {\n text-align: center;\n position: absolute;\n width: 100%;\n height: 100%;\n left: 0;\n top: 0;\n padding: 0 $mfp-popup-padding-left;\n -webkit-box-sizing: border-box;\n -moz-box-sizing: border-box;\n box-sizing: border-box;\n}\n\n// Vertical centerer helper\n.mfp-container {\n &:before {\n content: '';\n display: inline-block;\n height: 100%;\n vertical-align: middle;\n }\n}\n\n// Remove vertical centering when popup has class `mfp-align-top`\n.mfp-align-top {\n .mfp-container {\n &:before {\n display: none;\n }\n }\n}\n\n// Popup content holder\n.mfp-content {\n position: relative;\n display: inline-block;\n vertical-align: middle;\n margin: 0 auto;\n text-align: left;\n z-index: $mfp-z-index-base + 5;\n}\n.mfp-inline-holder,\n.mfp-ajax-holder {\n .mfp-content {\n width: 100%;\n cursor: auto;\n }\n}\n\n// Cursors\n.mfp-ajax-cur {\n cursor: progress;\n}\n.mfp-zoom-out-cur {\n &, .mfp-image-holder .mfp-close {\n cursor: -moz-zoom-out;\n cursor: -webkit-zoom-out;\n cursor: zoom-out;\n }\n}\n.mfp-zoom {\n cursor: pointer;\n cursor: -webkit-zoom-in;\n cursor: -moz-zoom-in;\n cursor: zoom-in;\n}\n.mfp-auto-cursor {\n .mfp-content {\n cursor: auto;\n }\n}\n\n.mfp-close,\n.mfp-arrow,\n.mfp-preloader,\n.mfp-counter {\n -webkit-user-select:none;\n -moz-user-select: none;\n user-select: none;\n}\n\n// Hide the image during the loading\n.mfp-loading {\n &.mfp-figure {\n display: none;\n }\n}\n\n// Helper class that hides stuff\n@if $mfp-use-visuallyhidden {\n // From HTML5 Boilerplate https://github.com/h5bp/html5-boilerplate/blob/v4.2.0/doc/css.md#visuallyhidden\n .mfp-hide {\n border: 0 !important;\n clip: rect(0 0 0 0) !important;\n height: 1px !important;\n margin: -1px !important;\n overflow: hidden !important;\n padding: 0 !important;\n position: absolute !important;\n width: 1px !important;\n }\n} @else {\n .mfp-hide {\n display: none !important;\n }\n}\n\n\n////////////////////////\n// 3. Appearance\n////////////////////////\n\n// Preloader and text that displays error messages\n.mfp-preloader {\n color: $mfp-controls-text-color;\n position: absolute;\n top: 50%;\n width: auto;\n text-align: center;\n margin-top: -0.8em;\n left: 8px;\n right: 8px;\n z-index: $mfp-z-index-base + 4;\n a {\n color: $mfp-controls-text-color;\n &:hover {\n color: $mfp-controls-text-color-hover;\n }\n }\n}\n\n// Hide preloader when content successfully loaded\n.mfp-s-ready {\n .mfp-preloader {\n display: none;\n }\n}\n\n// Hide content when it was not loaded\n.mfp-s-error {\n .mfp-content {\n display: none;\n }\n}\n\n// CSS-reset for buttons\nbutton {\n &.mfp-close,\n &.mfp-arrow {\n overflow: visible;\n cursor: pointer;\n background: transparent;\n border: 0;\n -webkit-appearance: none;\n display: block;\n outline: none;\n padding: 0;\n z-index: $mfp-z-index-base + 6;\n -webkit-box-shadow: none;\n box-shadow: none;\n }\n &::-moz-focus-inner {\n padding: 0;\n border: 0\n }\n}\n\n\n// Close icon\n.mfp-close {\n width: 44px;\n height: 44px;\n line-height: 44px;\n\n position: absolute;\n right: 0;\n top: 0;\n text-decoration: none;\n text-align: center;\n opacity: $mfp-controls-opacity;\n @if $mfp-IE7support {\n filter: unquote(\"alpha(opacity=#{$mfp-controls-opacity*100})\");\n }\n padding: 0 0 18px 10px;\n color: $mfp-controls-color;\n\n font-style: normal;\n font-size: 28px;\n font-family: $serif;\n\n &:hover,\n &:focus {\n opacity: 1;\n @if $mfp-IE7support {\n filter: unquote(\"alpha(opacity=#{1*100})\");\n }\n }\n\n &:active {\n top: 1px;\n }\n}\n.mfp-close-btn-in {\n .mfp-close {\n color: $mfp-inner-close-icon-color;\n }\n}\n.mfp-image-holder,\n.mfp-iframe-holder {\n .mfp-close {\n color: $mfp-controls-color;\n right: -6px;\n text-align: right;\n padding-right: 6px;\n width: 100%;\n }\n}\n\n// \"1 of X\" counter\n.mfp-counter {\n position: absolute;\n top: 0;\n right: 0;\n color: $mfp-controls-text-color;\n font-size: 12px;\n line-height: 18px;\n}\n\n// Navigation arrows\n@if $mfp-include-arrows {\n .mfp-arrow {\n position: absolute;\n opacity: $mfp-controls-opacity;\n @if $mfp-IE7support {\n filter: unquote(\"alpha(opacity=#{$mfp-controls-opacity*100})\");\n }\n margin: 0;\n top: 50%;\n margin-top: -55px;\n padding: 0;\n width: 90px;\n height: 110px;\n -webkit-tap-highlight-color: rgba(0,0,0,0);\n &:active {\n margin-top: -54px;\n }\n &:hover,\n &:focus {\n opacity: 1;\n @if $mfp-IE7support {\n filter: unquote(\"alpha(opacity=#{1*100})\");\n }\n }\n &:before,\n &:after,\n .mfp-b,\n .mfp-a {\n content: '';\n display: block;\n width: 0;\n height: 0;\n position: absolute;\n left: 0;\n top: 0;\n margin-top: 35px;\n margin-left: 35px;\n border: medium inset transparent;\n }\n\n &:after,\n .mfp-a {\n\n border-top-width: 13px;\n border-bottom-width: 13px;\n top:8px;\n }\n\n &:before,\n .mfp-b {\n border-top-width: 21px;\n border-bottom-width: 21px;\n opacity: 0.7;\n }\n\n }\n\n .mfp-arrow-left {\n left: 0;\n\n &:after,\n .mfp-a {\n border-right: 17px solid $mfp-controls-color;\n margin-left: 31px;\n }\n &:before,\n .mfp-b {\n margin-left: 25px;\n border-right: 27px solid $mfp-controls-border-color;\n }\n }\n\n .mfp-arrow-right {\n right: 0;\n &:after,\n .mfp-a {\n border-left: 17px solid $mfp-controls-color;\n margin-left: 39px\n }\n &:before,\n .mfp-b {\n border-left: 27px solid $mfp-controls-border-color;\n }\n }\n}\n\n\n\n// Iframe content type\n@if $mfp-include-iframe-type {\n .mfp-iframe-holder {\n padding-top: $mfp-iframe-padding-top;\n padding-bottom: $mfp-iframe-padding-top;\n .mfp-content {\n line-height: 0;\n width: 100%;\n max-width: $mfp-iframe-max-width;\n }\n .mfp-close {\n top: -40px;\n }\n }\n .mfp-iframe-scaler {\n width: 100%;\n height: 0;\n overflow: hidden;\n padding-top: $mfp-iframe-ratio * 100%;\n iframe {\n position: absolute;\n display: block;\n top: 0;\n left: 0;\n width: 100%;\n height: 100%;\n box-shadow: $mfp-shadow;\n background: $mfp-iframe-background;\n }\n }\n}\n\n\n\n// Image content type\n@if $mfp-include-image-type {\n\n /* Main image in popup */\n img {\n &.mfp-img {\n width: auto;\n max-width: 100%;\n height: auto;\n display: block;\n line-height: 0;\n -webkit-box-sizing: border-box;\n -moz-box-sizing: border-box;\n box-sizing: border-box;\n padding: $mfp-image-padding-top 0 $mfp-image-padding-bottom;\n margin: 0 auto;\n }\n }\n\n /* The shadow behind the image */\n .mfp-figure {\n line-height: 0;\n &:after {\n content: '';\n position: absolute;\n left: 0;\n top: $mfp-image-padding-top;\n bottom: $mfp-image-padding-bottom;\n display: block;\n right: 0;\n width: auto;\n height: auto;\n z-index: -1;\n box-shadow: $mfp-shadow;\n background: $mfp-image-background;\n }\n small {\n color: $mfp-caption-subtitle-color;\n display: block;\n font-size: 12px;\n line-height: 14px;\n }\n figure {\n margin: 0;\n }\n figcaption {\n margin-top: 0;\n margin-bottom: 0; // reset for bottom spacing\n }\n }\n .mfp-bottom-bar {\n margin-top: -$mfp-image-padding-bottom + 4;\n position: absolute;\n top: 100%;\n left: 0;\n width: 100%;\n cursor: auto;\n }\n .mfp-title {\n text-align: left;\n line-height: 18px;\n color: $mfp-caption-title-color;\n word-wrap: break-word;\n padding-right: 36px; // leave some space for counter at right side\n }\n\n .mfp-image-holder {\n .mfp-content {\n max-width: 100%;\n }\n }\n\n .mfp-gallery {\n .mfp-image-holder {\n .mfp-figure {\n cursor: pointer;\n }\n }\n }\n\n\n @if $mfp-include-mobile-layout-for-image {\n @media screen and (max-width: 800px) and (orientation:landscape), screen and (max-height: 300px) {\n /**\n * Remove all paddings around the image on small screen\n */\n .mfp-img-mobile {\n .mfp-image-holder {\n padding-left: 0;\n padding-right: 0;\n }\n img {\n &.mfp-img {\n padding: 0;\n }\n }\n .mfp-figure {\n // The shadow behind the image\n &:after {\n top: 0;\n bottom: 0;\n }\n small {\n display: inline;\n margin-left: 5px;\n }\n }\n .mfp-bottom-bar {\n background: rgba(0,0,0,0.6);\n bottom: 0;\n margin: 0;\n top: auto;\n padding: 3px 5px;\n position: fixed;\n -webkit-box-sizing: border-box;\n -moz-box-sizing: border-box;\n box-sizing: border-box;\n &:empty {\n padding: 0;\n }\n }\n .mfp-counter {\n right: 5px;\n top: 3px;\n }\n .mfp-close {\n top: 0;\n right: 0;\n width: 35px;\n height: 35px;\n line-height: 35px;\n background: rgba(0, 0, 0, 0.6);\n position: fixed;\n text-align: center;\n padding: 0;\n }\n }\n }\n }\n}\n\n\n\n// Scale navigation arrows and reduce padding from sides\n@media all and (max-width: 900px) {\n .mfp-arrow {\n -webkit-transform: scale(0.75);\n transform: scale(0.75);\n }\n .mfp-arrow-left {\n -webkit-transform-origin: 0;\n transform-origin: 0;\n }\n .mfp-arrow-right {\n -webkit-transform-origin: 100%;\n transform-origin: 100%;\n }\n .mfp-container {\n padding-left: $mfp-popup-padding-left-mobile;\n padding-right: $mfp-popup-padding-left-mobile;\n }\n}\n\n\n\n// IE7 support\n// Styles that make popup look nicier in old IE\n@if $mfp-IE7support {\n .mfp-ie7 {\n .mfp-img {\n padding: 0;\n }\n .mfp-bottom-bar {\n width: 600px;\n left: 50%;\n margin-left: -300px;\n margin-top: 5px;\n padding-bottom: 5px;\n }\n .mfp-container {\n padding: 0;\n }\n .mfp-content {\n padding-top: 44px;\n }\n .mfp-close {\n top: 0;\n right: 0;\n padding-top: 0;\n }\n }\n}\n","/* ==========================================================================\n MIXINS\n ========================================================================== */\n\n%tab-focus {\n /* Default*/\n outline: thin dotted $focus-color;\n /* Webkit*/\n outline: 5px auto $focus-color;\n outline-offset: -2px;\n}\n\n/*\n em function\n ========================================================================== */\n\n@function em($target, $context: $doc-font-size) {\n @return ($target / $context) * 1em;\n}\n\n\n/*\n Bourbon clearfix\n ========================================================================== */\n\n/*\n * Provides an easy way to include a clearfix for containing floats.\n * link http://cssmojo.com/latest_new_clearfix_so_far/\n *\n * example scss - Usage\n *\n * .element {\n * @include clearfix;\n * }\n *\n * example css - CSS Output\n *\n * .element::after {\n * clear: both;\n * content: \"\";\n * display: table;\n * }\n*/\n\n@mixin clearfix {\n clear: both;\n\n &::after {\n clear: both;\n content: \"\";\n display: table;\n }\n}\n\n/*\n Compass YIQ Color Contrast\n https://github.com/easy-designs/yiq-color-contrast\n ========================================================================== */\n\n@function yiq-is-light(\n $color,\n $threshold: $yiq-contrasted-threshold\n) {\n $red: red($color);\n $green: green($color);\n $blue: blue($color);\n\n $yiq: (($red*299)+($green*587)+($blue*114))/1000;\n\n @if $yiq-debug { @debug $yiq, $threshold; }\n\n @return if($yiq >= $threshold, true, false);\n}\n\n@function yiq-contrast-color(\n $color,\n $dark: $yiq-contrasted-dark-default,\n $light: $yiq-contrasted-light-default,\n $threshold: $yiq-contrasted-threshold\n) {\n @return if(yiq-is-light($color, $threshold), $yiq-contrasted-dark-default, $yiq-contrasted-light-default);\n}\n\n@mixin yiq-contrasted(\n $background-color,\n $dark: $yiq-contrasted-dark-default,\n $light: $yiq-contrasted-light-default,\n $threshold: $yiq-contrasted-threshold\n) {\n background-color: $background-color;\n color: yiq-contrast-color($background-color, $dark, $light, $threshold);\n}","/* ==========================================================================\n STYLE RESETS\n ========================================================================== */\n\n* { box-sizing: border-box; }\n\nhtml {\n /* apply a natural box layout model to all elements */\n box-sizing: border-box;\n background-color: $background-color;\n font-size: 16px;\n\n @include breakpoint($medium) {\n font-size: 18px;\n }\n\n @include breakpoint($large) {\n font-size: 20px;\n }\n\n @include breakpoint($x-large) {\n font-size: 22px;\n }\n\n -webkit-text-size-adjust: 100%;\n -ms-text-size-adjust: 100%;\n}\n\n/* Remove margin */\n\nbody { margin: 0; }\n\n/* Selected elements */\n\n::-moz-selection {\n color: #fff;\n background: #000;\n}\n\n::selection {\n color: #fff;\n background: #000;\n}\n\n/* Display HTML5 elements in IE6-9 and FF3 */\n\narticle,\naside,\ndetails,\nfigcaption,\nfigure,\nfooter,\nheader,\nhgroup,\nmain,\nnav,\nsection {\n display: block;\n}\n\n/* Display block in IE6-9 and FF3 */\n\naudio,\ncanvas,\nvideo {\n display: inline-block;\n *display: inline;\n *zoom: 1;\n}\n\n/* Prevents modern browsers from displaying 'audio' without controls */\n\naudio:not([controls]) {\n display: none;\n}\n\na {\n color: $link-color;\n}\n\n/* Apply focus state */\n\na:focus {\n @extend %tab-focus;\n}\n\n/* Remove outline from links */\n\na:hover,\na:active {\n outline: 0;\n}\n\n/* Prevent sub and sup affecting line-height in all browsers */\n\nsub,\nsup {\n position: relative;\n font-size: 75%;\n line-height: 0;\n vertical-align: baseline;\n}\n\nsup {\n top: -0.5em;\n}\n\nsub {\n bottom: -0.25em;\n}\n\n/* img border in anchor's and image quality */\n\nimg {\n /* Responsive images (ensure images don't scale beyond their parents) */\n max-width: 100%; /* part 1: Set a maximum relative to the parent*/\n width: auto\\9; /* IE7-8 need help adjusting responsive images*/\n height: auto; /* part 2: Scale the height according to the width, otherwise you get stretching*/\n\n vertical-align: middle;\n border: 0;\n -ms-interpolation-mode: bicubic;\n}\n\n/* Prevent max-width from affecting Google Maps */\n\n#map_canvas img,\n.google-maps img {\n max-width: none;\n}\n\n/* Consistent form font size in all browsers, margin changes, misc */\n\nbutton,\ninput,\nselect,\ntextarea {\n margin: 0;\n font-size: 100%;\n vertical-align: middle;\n}\n\nbutton,\ninput {\n *overflow: visible; /* inner spacing ie IE6/7*/\n line-height: normal; /* FF3/4 have !important on line-height in UA stylesheet*/\n}\n\nbutton::-moz-focus-inner,\ninput::-moz-focus-inner { /* inner padding and border oddities in FF3/4*/\n padding: 0;\n border: 0;\n}\n\nbutton,\nhtml input[type=\"button\"], // avoid the WebKit bug in Android 4.0.* where (2) destroys native `audio` and `video` controls\ninput[type=\"reset\"],\ninput[type=\"submit\"] {\n -webkit-appearance: button; /* corrects inability to style clickable `input` types in iOS*/\n cursor: pointer; /* improves usability and consistency of cursor style between image-type `input` and others*/\n}\n\nlabel,\nselect,\nbutton,\ninput[type=\"button\"],\ninput[type=\"reset\"],\ninput[type=\"submit\"],\ninput[type=\"radio\"],\ninput[type=\"checkbox\"] {\n cursor: pointer; /* improves usability and consistency of cursor style between image-type `input` and others*/\n}\n\ninput[type=\"search\"] { /* Appearance in Safari/Chrome*/\n box-sizing: border-box;\n -webkit-appearance: textfield;\n}\n\ninput[type=\"search\"]::-webkit-search-decoration,\ninput[type=\"search\"]::-webkit-search-cancel-button {\n -webkit-appearance: none; /* inner-padding issues in Chrome OSX, Safari 5*/\n}\n\ntextarea {\n overflow: auto; /* remove vertical scrollbar in IE6-9*/\n vertical-align: top; /* readability and alignment cross-browser*/\n}","//////////////////////////////\n// Default Variables\n//////////////////////////////\n$Breakpoint-Settings: (\n 'default media': all,\n 'default feature': min-width,\n 'default pair': width,\n\n 'force all media type': false,\n 'to ems': false,\n 'transform resolutions': true,\n\n 'no queries': false,\n 'no query fallbacks': false,\n\n 'base font size': 16px,\n\n 'legacy syntax': false\n);\n\n$breakpoint: () !default;\n\n//////////////////////////////\n// Imports\n//////////////////////////////\n@import \"settings\";\n@import \"context\";\n@import \"helpers\";\n@import \"parsers\";\n@import \"no-query\";\n\n@import \"respond-to\";\n\n@import \"legacy-settings\";\n\n//////////////////////////////\n// Breakpoint Mixin\n//////////////////////////////\n\n@mixin breakpoint($query, $no-query: false) {\n @include legacy-settings-warning;\n\n // Reset contexts\n @include private-breakpoint-reset-contexts();\n\n $breakpoint: breakpoint($query, false);\n\n $query-string: map-get($breakpoint, 'query');\n $query-fallback: map-get($breakpoint, 'fallback');\n\n $private-breakpoint-context-holder: map-get($breakpoint, 'context holder') !global;\n $private-breakpoint-query-count: map-get($breakpoint, 'query count') !global;\n\n // Allow for an as-needed override or usage of no query fallback.\n @if $no-query != false {\n $query-fallback: $no-query;\n }\n\n @if $query-fallback != false {\n $context-setter: private-breakpoint-set-context('no-query', $query-fallback);\n }\n\n // Print Out Query String\n @if not breakpoint-get('no queries') {\n @media #{$query-string} {\n @content;\n }\n }\n\n @if breakpoint-get('no query fallbacks') != false or breakpoint-get('no queries') == true {\n\n $type: type-of(breakpoint-get('no query fallbacks'));\n $print: false;\n\n @if ($type == 'bool') {\n $print: true;\n }\n @else if ($type == 'string') {\n @if $query-fallback == breakpoint-get('no query fallbacks') {\n $print: true;\n }\n }\n @else if ($type == 'list') {\n @each $wrapper in breakpoint-get('no query fallbacks') {\n @if $query-fallback == $wrapper {\n $print: true;\n }\n }\n }\n\n // Write Fallback\n @if ($query-fallback != false) and ($print == true) {\n $type-fallback: type-of($query-fallback);\n\n @if ($type-fallback != 'bool') {\n #{$query-fallback} & {\n @content;\n }\n }\n @else {\n @content;\n }\n }\n }\n\n @include private-breakpoint-reset-contexts();\n}\n\n\n@mixin mq($query, $no-query: false) {\n @include breakpoint($query, $no-query) {\n @content;\n }\n}\n","/* ==========================================================================\n BASE ELEMENTS\n ========================================================================== */\n\nhtml {\n /* sticky footer fix */\n position: relative;\n min-height: 100%;\n}\n\nbody {\n margin: 0;\n padding: 0;\n color: $text-color;\n font-family: $global-font-family;\n line-height: 1.5;\n\n &.overflow--hidden {\n /* when primary navigation is visible, the content in the background won't scroll */\n overflow: hidden;\n }\n}\n\nh1,\nh2,\nh3,\nh4,\nh5,\nh6 {\n margin: 2em 0 0.5em;\n line-height: 1.2;\n font-family: $header-font-family;\n font-weight: bold;\n}\n\nh1 {\n margin-top: 0;\n font-size: $h-size-1;\n}\n\nh2 {\n font-size: $h-size-2;\n}\n\nh3 {\n font-size: $h-size-3;\n}\n\nh4 {\n font-size: $h-size-4;\n}\n\nh5 {\n font-size: $h-size-5;\n}\n\nh6 {\n font-size: $h-size-6;\n}\n\nsmall,\n.small {\n font-size: $type-size-6;\n}\n\np {\n margin-bottom: 1.3em;\n}\n\nu,\nins {\n text-decoration: none;\n border-bottom: 1px solid $text-color;\n a {\n color: inherit;\n }\n}\n\ndel a {\n color: inherit;\n}\n\n/* reduce orphans and widows when printing */\n\np,\npre,\nblockquote,\nul,\nol,\ndl,\nfigure,\ntable,\nfieldset {\n orphans: 3;\n widows: 3;\n}\n\n/* abbreviations */\n\nabbr[title],\nabbr[data-original-title] {\n text-decoration: none;\n cursor: help;\n border-bottom: 1px dotted $text-color;\n}\n\n/* blockquotes */\n\nblockquote {\n margin: 2em 1em 2em 0;\n padding-left: 1em;\n padding-right: 1em;\n font-style: italic;\n border-left: 0.25em solid $primary-color;\n\n cite {\n font-style: italic;\n\n &:before {\n content: \"\\2014\";\n padding-right: 5px;\n }\n }\n}\n\n/* links */\n\na {\n &:focus {\n @extend %tab-focus;\n }\n\n &:visited {\n color: $link-color-visited;\n }\n\n &:hover {\n color: $link-color-hover;\n outline: 0;\n }\n}\n\n/* buttons */\n\nbutton:focus {\n @extend %tab-focus;\n}\n\n/* code */\n\ntt,\ncode,\nkbd,\nsamp,\npre {\n font-family: $monospace;\n}\n\npre {\n overflow-x: auto; /* add scrollbars to wide code blocks*/\n}\n\np > code,\na > code,\nli > code,\nfigcaption > code,\ntd > code {\n margin: 0;\n padding: 0;\n line-height: 1.5;\n font-family: $global-font-family;\n font-size: $type-size-5;\n color: $code-color;\n background: $code-background-color;\n border-radius: $border-radius;\n\n &:before,\n &:after {\n letter-spacing: -0.2em;\n content: \"\\00a0\"; /* non-breaking space*/\n }\n}\n\n/* horizontal rule */\n\nhr {\n display: block;\n margin: 1em 0;\n border: 0;\n border-top: 1px solid $border-color;\n}\n\n/* lists */\n\nul li,\nol li {\n margin-bottom: 0.5em;\n}\n\nli ul,\nli ol {\n margin-top: 0.5em;\n}\n\n/*\n Media and embeds\n ========================================================================== */\n\n/* Figures and images */\n\nfigure {\n display: -webkit-box;\n display: flex;\n -webkit-box-pack: justify;\n justify-content: space-between;\n -webkit-box-align: start;\n align-items: flex-start;\n flex-wrap: wrap;\n margin: 2em 0;\n\n img,\n iframe,\n .fluid-width-video-wrapper {\n margin-bottom: 1em;\n }\n\n img {\n width: 100%;\n border-radius: $border-radius;\n -webkit-transition: $global-transition;\n transition: $global-transition;\n }\n\n > a {\n display: block;\n }\n\n &.half {\n > a,\n > img {\n @include breakpoint($small) {\n width: calc(50% - 0.5em);\n }\n }\n\n figcaption {\n width: 100%;\n }\n }\n\n &.third {\n > a,\n > img {\n @include breakpoint($small) {\n width: calc(33.3333% - 0.5em);\n }\n }\n\n figcaption {\n width: 100%;\n }\n }\n}\n\n/* Figure captions */\n\nfigcaption {\n margin-bottom: 0.5em;\n color: $muted-text-color;\n font-family: $caption-font-family;\n font-size: $type-size-6;\n\n a {\n -webkit-transition: $global-transition;\n transition: $global-transition;\n\n &:hover {\n color: $link-color-hover;\n }\n }\n}\n\n/* Fix IE9 SVG bug */\n\nsvg:not(:root) {\n overflow: hidden;\n}\n\n/*\n Navigation lists\n ========================================================================== */\n\n/**\n * Removes margins, padding, and bullet points from navigation lists\n *\n * Example usage:\n * \n */\n\nnav {\n ul {\n margin: 0;\n padding: 0;\n }\n\n li {\n list-style: none;\n }\n\n a {\n text-decoration: none;\n }\n\n /* override white-space for nested lists */\n ul li,\n ol li {\n margin-bottom: 0;\n }\n\n li ul,\n li ol {\n margin-top: 0;\n }\n}\n\n/*\n Global animation transition\n ========================================================================== */\n\nb,\ni,\nstrong,\nem,\nblockquote,\np,\nq,\nspan,\nfigure,\nimg,\nh1,\nh2,\nheader,\ninput,\na,\ntr,\ntd,\nform button,\ninput[type=\"submit\"],\n.btn,\n.highlight,\n.archive__item-teaser {\n -webkit-transition: $global-transition;\n transition: $global-transition;\n}\n","/* ==========================================================================\n Forms\n ========================================================================== */\n\nform {\n margin: 0 0 5px 0;\n padding: 1em;\n background-color: $form-background-color;\n\n fieldset {\n margin-bottom: 5px;\n padding: 0;\n border-width: 0;\n }\n\n legend {\n display: block;\n width: 100%;\n margin-bottom: 5px * 2;\n *margin-left: -7px;\n padding: 0;\n color: $text-color;\n border: 0;\n white-space: normal;\n }\n\n p {\n margin-bottom: (5px / 2);\n }\n\n ul {\n list-style-type: none;\n margin: 0 0 5px 0;\n padding: 0;\n }\n\n br {\n display: none;\n }\n}\n\nlabel,\ninput,\nbutton,\nselect,\ntextarea {\n vertical-align: baseline;\n *vertical-align: middle;\n}\n\ninput,\nbutton,\nselect,\ntextarea {\n box-sizing: border-box;\n font-family: $sans-serif;\n}\n\nlabel {\n display: block;\n margin-bottom: 0.25em;\n color: $text-color;\n cursor: pointer;\n\n small {\n font-size: $type-size-6;\n }\n\n input,\n textarea,\n select {\n display: block;\n }\n}\n\ninput,\ntextarea,\nselect {\n display: inline-block;\n width: 100%;\n padding: 0.25em;\n margin-bottom: 0.5em;\n color: $text-color;\n background-color: $background-color;\n border: $border-color;\n border-radius: $border-radius;\n box-shadow: $box-shadow;\n}\n\n.input-mini {\n width: 60px;\n}\n\n.input-small {\n width: 90px;\n}\n\ninput[type=\"image\"],\ninput[type=\"checkbox\"],\ninput[type=\"radio\"] {\n width: auto;\n height: auto;\n padding: 0;\n margin: 3px 0;\n *margin-top: 0;\n line-height: normal;\n cursor: pointer;\n border-radius: 0;\n border: 0 \\9;\n box-shadow: none;\n}\n\ninput[type=\"checkbox\"],\ninput[type=\"radio\"] {\n box-sizing: border-box;\n padding: 0;\n *width: 13px;\n *height: 13px;\n}\n\ninput[type=\"image\"] {\n border: 0;\n}\n\ninput[type=\"file\"] {\n width: auto;\n padding: initial;\n line-height: initial;\n border: initial;\n background-color: transparent;\n background-color: initial;\n box-shadow: none;\n}\n\ninput[type=\"button\"],\ninput[type=\"reset\"],\ninput[type=\"submit\"] {\n width: auto;\n height: auto;\n cursor: pointer;\n *overflow: visible;\n}\n\nselect,\ninput[type=\"file\"] {\n *margin-top: 4px;\n}\n\nselect {\n width: auto;\n background-color: #fff;\n}\n\nselect[multiple],\nselect[size] {\n height: auto;\n}\n\ntextarea {\n resize: vertical;\n height: auto;\n overflow: auto;\n vertical-align: top;\n}\n\ninput[type=\"hidden\"] {\n display: none;\n}\n\n.form {\n position: relative;\n}\n\n.radio,\n.checkbox {\n padding-left: 18px;\n font-weight: normal;\n}\n\n.radio input[type=\"radio\"],\n.checkbox input[type=\"checkbox\"] {\n float: left;\n margin-left: -18px;\n}\n\n.radio.inline,\n.checkbox.inline {\n display: inline-block;\n padding-top: 5px;\n margin-bottom: 0;\n vertical-align: middle;\n}\n\n.radio.inline + .radio.inline,\n.checkbox.inline + .checkbox.inline {\n margin-left: 10px;\n}\n\n/*\n Disabled state\n ========================================================================== */\n\ninput[disabled],\nselect[disabled],\ntextarea[disabled],\ninput[readonly],\nselect[readonly],\ntextarea[readonly] {\n opacity: 0.5;\n cursor: not-allowed;\n}\n\n/*\n Focus & active state\n ========================================================================== */\n\ninput:focus,\ntextarea:focus {\n border-color: $primary-color;\n outline: 0;\n outline: thin dotted \\9;\n box-shadow: inset 0 1px 3px rgba($text-color, 0.06),\n 0 0 5px rgba($primary-color, 0.7);\n}\n\ninput[type=\"file\"]:focus,\ninput[type=\"radio\"]:focus,\ninput[type=\"checkbox\"]:focus,\nselect:focus {\n box-shadow: none;\n}\n\n/*\n Help text\n ========================================================================== */\n\n.help-block,\n.help-inline {\n color: $muted-text-color;\n}\n\n.help-block {\n display: block;\n margin-bottom: 1em;\n line-height: 1em;\n}\n\n.help-inline {\n display: inline-block;\n vertical-align: middle;\n padding-left: 5px;\n}\n\n/*\n .form-group\n ========================================================================== */\n\n.form-group {\n margin-bottom: 5px;\n padding: 0;\n border-width: 0;\n}\n\n/*\n .form-inline\n ========================================================================== */\n\n.form-inline input,\n.form-inline textarea,\n.form-inline select {\n display: inline-block;\n margin-bottom: 0;\n}\n\n.form-inline label {\n display: inline-block;\n}\n\n.form-inline .radio,\n.form-inline .checkbox,\n.form-inline .radio {\n padding-left: 0;\n margin-bottom: 0;\n vertical-align: middle;\n}\n\n.form-inline .radio input[type=\"radio\"],\n.form-inline .checkbox input[type=\"checkbox\"] {\n float: left;\n margin-left: 0;\n margin-right: 3px;\n}\n\n/*\n .form-search\n ========================================================================== */\n\n.form-search input,\n.form-search textarea,\n.form-search select {\n display: inline-block;\n margin-bottom: 0;\n}\n\n.form-search .search-query {\n padding-left: 14px;\n padding-right: 14px;\n margin-bottom: 0;\n border-radius: 14px;\n}\n\n.form-search label {\n display: inline-block;\n}\n\n.form-search .radio,\n.form-search .checkbox,\n.form-inline .radio {\n padding-left: 0;\n margin-bottom: 0;\n vertical-align: middle;\n}\n\n.form-search .radio input[type=\"radio\"],\n.form-search .checkbox input[type=\"checkbox\"] {\n float: left;\n margin-left: 0;\n margin-right: 3px;\n}\n\n/*\n .form--loading\n ========================================================================== */\n\n.form--loading:before {\n content: \"\";\n}\n\n.form--loading .form__spinner {\n display: block;\n}\n\n.form:before {\n position: absolute;\n top: 0;\n left: 0;\n width: 100%;\n height: 100%;\n background-color: rgba(255, 255, 255, 0.7);\n z-index: 10;\n}\n\n.form__spinner {\n display: none;\n position: absolute;\n top: 50%;\n left: 50%;\n z-index: 11;\n}\n","/* ==========================================================================\n TABLES\n ========================================================================== */\n\ntable {\n display: block;\n margin-bottom: 1em;\n width: 100%;\n font-family: $global-font-family;\n font-size: $type-size-6;\n border-collapse: collapse;\n overflow-x: auto;\n\n & + table {\n margin-top: 1em;\n }\n}\n\nthead {\n background-color: $border-color;\n border-bottom: 2px solid mix(#000, $border-color, 25%);\n}\n\nth {\n padding: 0.5em;\n font-weight: bold;\n text-align: left;\n}\n\ntd {\n padding: 0.5em;\n border-bottom: 1px solid mix(#000, $border-color, 25%);\n}\n\ntr,\ntd,\nth {\n vertical-align: middle;\n}","/* ==========================================================================\n ANIMATIONS\n ========================================================================== */\n\n@-webkit-keyframes intro {\n 0% {\n opacity: 0;\n }\n 100% {\n opacity: 1;\n }\n}\n\n@keyframes intro {\n 0% {\n opacity: 0;\n }\n 100% {\n opacity: 1;\n }\n}","/* ==========================================================================\n BUTTONS\n ========================================================================== */\n\n/*\n Default button\n ========================================================================== */\n\n.btn {\n /* default */\n display: inline-block;\n margin-bottom: 0.25em;\n padding: 0.5em 1em;\n font-family: $sans-serif;\n font-size: $type-size-6;\n font-weight: bold;\n text-align: center;\n text-decoration: none;\n border-width: 0;\n border-radius: $border-radius;\n cursor: pointer;\n\n .icon {\n margin-right: 0.5em;\n }\n\n .icon + .hidden {\n margin-left: -0.5em; /* override for hidden text*/\n }\n\n /* button colors */\n $buttoncolors:\n (primary, $primary-color),\n (inverse, #fff),\n (light-outline, transparent),\n (success, $success-color),\n (warning, $warning-color),\n (danger, $danger-color),\n (info, $info-color),\n (facebook, $facebook-color),\n (twitter, $twitter-color),\n (linkedin, $linkedin-color);\n\n @each $buttoncolor, $color in $buttoncolors {\n &--#{$buttoncolor} {\n @include yiq-contrasted($color);\n @if ($buttoncolor == inverse) {\n border: 1px solid $border-color;\n }\n @if ($buttoncolor == light-outline) {\n border: 1px solid #fff;\n }\n\n &:visited {\n @include yiq-contrasted($color);\n }\n\n &:hover {\n @include yiq-contrasted(mix(#000, $color, 20%));\n }\n }\n }\n\n /* fills width of parent container */\n &--block {\n display: block;\n width: 100%;\n\n + .btn--block {\n margin-top: 0.25em;\n }\n }\n\n /* disabled */\n &--disabled {\n pointer-events: none;\n cursor: not-allowed;\n filter: alpha(opacity=65);\n box-shadow: none;\n opacity: 0.65;\n }\n\n /* extra large button */\n &--x-large {\n font-size: $type-size-4;\n }\n\n /* large button */\n &--large {\n font-size: $type-size-5;\n }\n\n /* small button */\n &--small {\n font-size: $type-size-7;\n }\n}\n","/* ==========================================================================\n NOTICE TEXT BLOCKS\n ========================================================================== */\n\n/**\n * Default Kramdown usage (no indents!):\n *
\n * #### Headline for the Notice\n * Text for the notice\n *
\n */\n\n@mixin notice($notice-color) {\n margin: 2em 0 !important; /* override*/\n padding: 1em;\n color: $text-color;\n font-family: $global-font-family;\n font-size: $type-size-6 !important;\n text-indent: initial; /* override*/\n background-color: mix($background-color, $notice-color, $notice-background-mix);\n border-radius: $border-radius;\n box-shadow: 0 1px 1px rgba($notice-color, 0.25);\n\n h4 {\n margin-top: 0 !important; /* override*/\n margin-bottom: 0.75em;\n line-height: inherit;\n }\n\n @at-root .page__content #{&} h4 {\n /* using at-root to override .page-content h4 font size*/\n margin-bottom: 0;\n font-size: 1em;\n }\n\n p {\n &:last-child {\n margin-bottom: 0 !important; /* override*/\n }\n }\n\n h4 + p {\n /* remove space above paragraphs that appear directly after notice headline*/\n margin-top: 0;\n padding-top: 0;\n }\n\n a {\n color: mix(#000, $notice-color, 10%);\n\n &:hover {\n color: mix(#000, $notice-color, 50%);\n }\n }\n\n @at-root #{selector-unify(&, \"blockquote\")} {\n border-left-color: mix(#000, $notice-color, 10%);\n }\n\n code {\n background-color: mix($background-color, $notice-color, $code-notice-background-mix)\n }\n\n\tpre code {\n\t\tbackground-color: inherit;\n\t}\n\n ul {\n &:last-child {\n margin-bottom: 0; /* override*/\n }\n }\n}\n\n/* Default notice */\n\n.notice {\n @include notice($light-gray);\n}\n\n/* Primary notice */\n\n.notice--primary {\n @include notice($primary-color);\n}\n\n/* Info notice */\n\n.notice--info {\n @include notice($info-color);\n}\n\n/* Warning notice */\n\n.notice--warning {\n @include notice($warning-color);\n}\n\n/* Success notice */\n\n.notice--success {\n @include notice($success-color);\n}\n\n/* Danger notice */\n\n.notice--danger {\n @include notice($danger-color);\n}\n","/* ==========================================================================\n MASTHEAD\n ========================================================================== */\n\n.masthead {\n position: relative;\n border-bottom: 1px solid $border-color;\n -webkit-animation: $intro-transition;\n animation: $intro-transition;\n -webkit-animation-delay: 0.15s;\n animation-delay: 0.15s;\n z-index: 20;\n\n &__inner-wrap {\n @include clearfix;\n margin-left: auto;\n margin-right: auto;\n padding: 1em;\n max-width: 100%;\n display: -webkit-box;\n display: -ms-flexbox;\n display: flex;\n -webkit-box-pack: justify;\n -ms-flex-pack: justify;\n justify-content: space-between;\n font-family: $sans-serif-narrow;\n\n @include breakpoint($x-large) {\n max-width: $max-width;\n }\n\n nav {\n z-index: 10;\n }\n\n a {\n text-decoration: none;\n }\n }\n}\n\n.site-logo img {\n max-height: 2rem;\n}\n\n.site-title {\n display: -webkit-box;\n display: -ms-flexbox;\n display: flex;\n -ms-flex-item-align: center;\n align-self: center;\n font-weight: bold;\n // z-index: 20;\n}\n\n.site-subtitle {\n display: block;\n font-size: $type-size-8;\n}\n\n.masthead__menu {\n float: left;\n margin-left: 0;\n margin-right: 0;\n width: 100%;\n clear: both;\n\n .site-nav {\n margin-left: 0;\n\n @include breakpoint($small) {\n float: right;\n }\n }\n\n ul {\n margin: 0;\n padding: 0;\n clear: both;\n list-style-type: none;\n }\n}\n\n.masthead__menu-item {\n display: block;\n list-style-type: none;\n white-space: nowrap;\n\n &--lg {\n padding-right: 2em;\n font-weight: 700;\n }\n}\n","/* ==========================================================================\n NAVIGATION\n ========================================================================== */\n\n/*\n Breadcrumb navigation links\n ========================================================================== */\n\n.breadcrumbs {\n @include clearfix;\n margin: 0 auto;\n max-width: 100%;\n padding-left: 1em;\n padding-right: 1em;\n font-family: $sans-serif;\n -webkit-animation: $intro-transition;\n animation: $intro-transition;\n -webkit-animation-delay: 0.3s;\n animation-delay: 0.3s;\n\n @include breakpoint($x-large) {\n max-width: $x-large;\n }\n\n ol {\n padding: 0;\n list-style: none;\n font-size: $type-size-6;\n\n @include breakpoint($large) {\n float: right;\n width: calc(100% - #{$right-sidebar-width-narrow});\n }\n\n @include breakpoint($x-large) {\n width: calc(100% - #{$right-sidebar-width});\n }\n }\n\n li {\n display: inline;\n }\n\n .current {\n font-weight: bold;\n }\n}\n\n/*\n Post pagination navigation links\n ========================================================================== */\n\n.pagination {\n @include clearfix();\n float: left;\n margin-top: 1em;\n padding-top: 1em;\n width: 100%;\n\n ul {\n margin: 0;\n padding: 0;\n list-style-type: none;\n font-family: $sans-serif;\n }\n\n li {\n display: block;\n float: left;\n margin-left: -1px;\n\n a {\n display: block;\n margin-bottom: 0.25em;\n padding: 0.5em 1em;\n font-family: $sans-serif;\n font-size: 14px;\n font-weight: bold;\n line-height: 1.5;\n text-align: center;\n text-decoration: none;\n color: $muted-text-color;\n border: 1px solid mix(#000, $border-color, 25%);\n border-radius: 0;\n\n &:hover {\n color: $link-color-hover;\n }\n\n &.current,\n &.current.disabled {\n color: #fff;\n background: $primary-color;\n }\n\n &.disabled {\n color: rgba($muted-text-color, 0.5);\n pointer-events: none;\n cursor: not-allowed;\n }\n }\n\n &:first-child {\n margin-left: 0;\n\n a {\n border-top-left-radius: $border-radius;\n border-bottom-left-radius: $border-radius;\n }\n }\n\n &:last-child {\n a {\n border-top-right-radius: $border-radius;\n border-bottom-right-radius: $border-radius;\n }\n }\n }\n\n /* next/previous buttons */\n &--pager {\n display: block;\n padding: 1em 2em;\n float: left;\n width: 50%;\n font-family: $sans-serif;\n font-size: $type-size-5;\n font-weight: bold;\n text-align: center;\n text-decoration: none;\n color: $muted-text-color;\n border: 1px solid mix(#000, $border-color, 25%);\n border-radius: $border-radius;\n\n &:hover {\n @include yiq-contrasted($muted-text-color);\n }\n\n &:first-child {\n border-top-right-radius: 0;\n border-bottom-right-radius: 0;\n }\n\n &:last-child {\n margin-left: -1px;\n border-top-left-radius: 0;\n border-bottom-left-radius: 0;\n }\n\n &.disabled {\n color: rgba($muted-text-color, 0.5);\n pointer-events: none;\n cursor: not-allowed;\n }\n }\n}\n\n.page__content + .pagination,\n.page__meta + .pagination,\n.page__share + .pagination,\n.page__comments + .pagination {\n margin-top: 2em;\n padding-top: 2em;\n border-top: 1px solid $border-color;\n}\n\n/*\n Priority plus navigation\n ========================================================================== */\n\n.greedy-nav {\n position: relative;\n display: -webkit-box;\n display: -ms-flexbox;\n display: flex;\n -webkit-box-align: center;\n -ms-flex-align: center;\n align-items: center;\n min-height: $nav-height;\n background: $background-color;\n\n a {\n display: block;\n margin: 0 1rem;\n color: $masthead-link-color;\n text-decoration: none;\n -webkit-transition: none;\n transition: none;\n\n &:hover {\n color: $masthead-link-color-hover;\n }\n\n &.site-logo {\n margin-left: 0;\n margin-right: 0.5rem;\n }\n\n &.site-title {\n margin-left: 0;\n }\n }\n\n img {\n -webkit-transition: none;\n transition: none;\n }\n\n &__toggle {\n -ms-flex-item-align: center;\n align-self: center;\n height: $nav-toggle-height;\n border: 0;\n outline: none;\n background-color: transparent;\n cursor: pointer;\n }\n\n .visible-links {\n display: -webkit-box;\n display: -ms-flexbox;\n display: flex;\n -webkit-box-pack: end;\n -ms-flex-pack: end;\n justify-content: flex-end;\n -webkit-box-flex: 1;\n -ms-flex: 1;\n flex: 1;\n overflow: hidden;\n\n li {\n -webkit-box-flex: 0;\n -ms-flex: none;\n flex: none;\n }\n\n a {\n position: relative;\n\n &:before {\n content: \"\";\n position: absolute;\n left: 0;\n bottom: 0;\n height: 4px;\n background: $primary-color;\n width: 100%;\n -webkit-transition: $global-transition;\n transition: $global-transition;\n -webkit-transform: scaleX(0) translate3d(0, 0, 0);\n transform: scaleX(0) translate3d(0, 0, 0); // hide\n }\n\n &:hover:before {\n -webkit-transform: scaleX(1);\n -ms-transform: scaleX(1);\n transform: scaleX(1); // reveal\n }\n }\n }\n\n .hidden-links {\n position: absolute;\n top: 100%;\n right: 0;\n margin-top: 15px;\n padding: 5px;\n border: 1px solid $border-color;\n border-radius: $border-radius;\n background: $background-color;\n -webkit-box-shadow: 0 2px 4px 0 rgba(#000, 0.16),\n 0 2px 10px 0 rgba(#000, 0.12);\n box-shadow: 0 2px 4px 0 rgba(#000, 0.16), 0 2px 10px 0 rgba(#000, 0.12);\n\n &.hidden {\n display: none;\n }\n\n a {\n margin: 0;\n padding: 10px 20px;\n font-size: $type-size-5;\n\n &:hover {\n color: $masthead-link-color-hover;\n background: $navicon-link-color-hover;\n }\n }\n\n &:before {\n content: \"\";\n position: absolute;\n top: -11px;\n right: 10px;\n width: 0;\n border-style: solid;\n border-width: 0 10px 10px;\n border-color: $border-color transparent;\n display: block;\n z-index: 0;\n }\n\n &:after {\n content: \"\";\n position: absolute;\n top: -10px;\n right: 10px;\n width: 0;\n border-style: solid;\n border-width: 0 10px 10px;\n border-color: $background-color transparent;\n display: block;\n z-index: 1;\n }\n\n li {\n display: block;\n border-bottom: 1px solid $border-color;\n\n &:last-child {\n border-bottom: none;\n }\n }\n }\n}\n\n.no-js {\n .greedy-nav {\n .visible-links {\n -ms-flex-wrap: wrap;\n flex-wrap: wrap;\n overflow: visible;\n }\n }\n}\n\n/*\n Navigation list\n ========================================================================== */\n\n.nav__list {\n margin-bottom: 1.5em;\n\n input[type=\"checkbox\"],\n label {\n display: none;\n }\n\n @include breakpoint(max-width $large - 1px) {\n label {\n position: relative;\n display: inline-block;\n padding: 0.5em 2.5em 0.5em 1em;\n color: $gray;\n font-size: $type-size-6;\n font-weight: bold;\n border: 1px solid $light-gray;\n border-radius: $border-radius;\n z-index: 20;\n -webkit-transition: 0.2s ease-out;\n transition: 0.2s ease-out;\n cursor: pointer;\n\n &:before,\n &:after {\n content: \"\";\n position: absolute;\n right: 1em;\n top: 1.25em;\n width: 0.75em;\n height: 0.125em;\n line-height: 1;\n background-color: $gray;\n -webkit-transition: 0.2s ease-out;\n transition: 0.2s ease-out;\n }\n\n &:after {\n -webkit-transform: rotate(90deg);\n -ms-transform: rotate(90deg);\n transform: rotate(90deg);\n }\n\n &:hover {\n color: #fff;\n border-color: $gray;\n background-color: mix(white, #000, 20%);\n\n &:before,\n &:after {\n background-color: #fff;\n }\n }\n }\n\n /* selected*/\n input:checked + label {\n color: white;\n background-color: mix(white, #000, 20%);\n\n &:before,\n &:after {\n background-color: #fff;\n }\n }\n\n /* on hover show expand*/\n label:hover:after {\n -webkit-transform: rotate(90deg);\n -ms-transform: rotate(90deg);\n transform: rotate(90deg);\n }\n\n input:checked + label:hover:after {\n -webkit-transform: rotate(0);\n -ms-transform: rotate(0);\n transform: rotate(0);\n }\n\n ul {\n margin-bottom: 1em;\n }\n\n a {\n display: block;\n padding: 0.25em 0;\n\n @include breakpoint($large) {\n padding-top: 0.125em;\n padding-bottom: 0.125em;\n }\n\n &:hover {\n text-decoration: underline;\n }\n }\n }\n}\n\n.nav__list .nav__items {\n margin: 0;\n font-size: 1.25rem;\n\n a {\n color: inherit;\n }\n\n .active {\n margin-left: -0.5em;\n padding-left: 0.5em;\n padding-right: 0.5em;\n font-weight: bold;\n }\n\n @include breakpoint(max-width $large - 1px) {\n position: relative;\n max-height: 0;\n opacity: 0%;\n overflow: hidden;\n z-index: 10;\n -webkit-transition: 0.3s ease-in-out;\n transition: 0.3s ease-in-out;\n -webkit-transform: translate(0, 10%);\n -ms-transform: translate(0, 10%);\n transform: translate(0, 10%);\n }\n}\n\n@include breakpoint(max-width $large - 1px) {\n .nav__list input:checked ~ .nav__items {\n -webkit-transition: 0.5s ease-in-out;\n transition: 0.5s ease-in-out;\n max-height: 9999px; /* exaggerate max-height to accommodate tall lists*/\n overflow: visible;\n opacity: 1;\n margin-top: 1em;\n -webkit-transform: translate(0, 0);\n -ms-transform: translate(0, 0);\n transform: translate(0, 0);\n }\n}\n\n.nav__title {\n margin: 0;\n padding: 0.5rem 0.75rem;\n font-family: $sans-serif-narrow;\n font-size: $type-size-5;\n font-weight: bold;\n}\n\n.nav__sub-title {\n display: block;\n margin: 0.5rem 0;\n padding: 0.25rem 0;\n font-family: $sans-serif-narrow;\n font-size: $type-size-6;\n font-weight: bold;\n text-transform: uppercase;\n border-bottom: 1px solid $border-color;\n}\n\n/*\n Table of contents navigation\n ========================================================================== */\n\n.toc {\n font-family: $sans-serif-narrow;\n color: $gray;\n background-color: $background-color;\n border: 1px solid $border-color;\n border-radius: $border-radius;\n -webkit-box-shadow: $box-shadow;\n box-shadow: $box-shadow;\n\n .nav__title {\n color: #fff;\n font-size: $type-size-6;\n background: $primary-color;\n border-top-left-radius: $border-radius;\n border-top-right-radius: $border-radius;\n }\n\n // Scrollspy marks toc items as .active when they are in focus\n .active a {\n @include yiq-contrasted($active-color);\n }\n}\n\n.toc__menu {\n margin: 0;\n padding: 0;\n width: 100%;\n list-style: none;\n font-size: $type-size-6;\n\n @include breakpoint($large) {\n font-size: $type-size-7;\n }\n\n a {\n display: block;\n padding: 0.25rem 0.75rem;\n color: $muted-text-color;\n font-weight: bold;\n line-height: 1.5;\n border-bottom: 1px solid $border-color;\n\n &:hover {\n color: $text-color;\n }\n }\n\n li ul > li a {\n padding-left: 1.25rem;\n font-weight: normal;\n }\n\n li ul li ul > li a {\n padding-left: 1.75rem;\n }\n\n li ul li ul li ul > li a {\n padding-left: 2.25rem;\n }\n\n li ul li ul li ul li ul > li a {\n padding-left: 2.75rem;\n }\n\n li ul li ul li ul li ul li ul > li a {\n padding-left: 3.25rem\n }\n}\n","/* ==========================================================================\n FOOTER\n ========================================================================== */\n\n.page__footer {\n @include clearfix;\n float: left;\n margin-left: 0;\n margin-right: 0;\n width: 100%;\n margin-top: 3em;\n color: $muted-text-color;\n -webkit-animation: $intro-transition;\n animation: $intro-transition;\n -webkit-animation-delay: 0.45s;\n animation-delay: 0.45s;\n background-color: $footer-background-color;\n\n footer {\n @include clearfix;\n margin-left: auto;\n margin-right: auto;\n margin-top: 2em;\n max-width: 100%;\n padding: 0 1em 2em;\n\n @include breakpoint($x-large) {\n max-width: $x-large;\n }\n }\n\n a {\n color: inherit;\n text-decoration: none;\n\n &:hover {\n text-decoration: underline;\n }\n }\n\n .fas,\n .fab,\n .far,\n .fal {\n color: $muted-text-color;\n }\n}\n\n.page__footer-copyright {\n font-family: $global-font-family;\n font-size: $type-size-7;\n}\n\n.page__footer-follow {\n ul {\n margin: 0;\n padding: 0;\n list-style-type: none;\n }\n\n li {\n display: inline-block;\n padding-top: 5px;\n padding-bottom: 5px;\n font-family: $sans-serif-narrow;\n font-size: $type-size-6;\n text-transform: uppercase;\n }\n\n li + li:before {\n content: \"\";\n padding-right: 5px;\n }\n\n a {\n padding-right: 10px;\n font-weight: bold;\n }\n\n .social-icons {\n a {\n white-space: nowrap;\n }\n }\n}\n","/* ==========================================================================\n SEARCH\n ========================================================================== */\n\n.layout--search {\n .archive__item-teaser {\n margin-bottom: 0.25em;\n }\n}\n\n.search__toggle {\n margin-left: 1rem;\n margin-right: 1rem;\n height: $nav-toggle-height;\n border: 0;\n outline: none;\n color: $primary-color;\n background-color: transparent;\n cursor: pointer;\n -webkit-transition: 0.2s;\n transition: 0.2s;\n\n &:hover {\n color: mix(#000, $primary-color, 25%);\n }\n}\n\n.search-icon {\n width: 100%;\n height: 100%;\n}\n\n.search-content {\n display: none;\n visibility: hidden;\n padding-top: 1em;\n padding-bottom: 1em;\n\n &__inner-wrap {\n width: 100%;\n margin-left: auto;\n margin-right: auto;\n padding-left: 1em;\n padding-right: 1em;\n -webkit-animation: $intro-transition;\n animation: $intro-transition;\n -webkit-animation-delay: 0.15s;\n animation-delay: 0.15s;\n\n @include breakpoint($x-large) {\n max-width: $max-width;\n }\n\n }\n\n &__form {\n background-color: transparent;\n }\n\n .search-input {\n display: block;\n margin-bottom: 0;\n padding: 0;\n border: none;\n outline: none;\n box-shadow: none;\n background-color: transparent;\n font-size: $type-size-3;\n\n @include breakpoint($large) {\n font-size: $type-size-2;\n }\n\n @include breakpoint($x-large) {\n font-size: $type-size-1;\n }\n }\n\n &.is--visible {\n display: block;\n visibility: visible;\n\n &::after {\n content: \"\";\n display: block;\n }\n }\n\n .results__found {\n margin-top: 0.5em;\n font-size: $type-size-6;\n }\n\n .archive__item {\n margin-bottom: 2em;\n\n @include breakpoint($large) {\n width: 75%;\n }\n\n @include breakpoint($x-large) {\n width: 50%;\n }\n }\n\n .archive__item-title {\n margin-top: 0;\n }\n\n .archive__item-excerpt {\n margin-bottom: 0;\n }\n}\n\n/* Algolia search */\n\n.ais-search-box {\n max-width: 100% !important;\n margin-bottom: 2em;\n}\n\n.archive__item-title .ais-Highlight {\n color: $primary-color;\n font-style: normal;\n text-decoration: underline;\n}\n\n.archive__item-excerpt .ais-Highlight {\n color: $primary-color;\n font-style: normal;\n font-weight: bold;\n}\n","/* ==========================================================================\n Syntax highlighting\n ========================================================================== */\n\ndiv.highlighter-rouge,\nfigure.highlight {\n position: relative;\n margin-bottom: 1em;\n background: $base00;\n color: $base05;\n font-family: $monospace;\n font-size: $type-size-6;\n line-height: 1.8;\n border-radius: $border-radius;\n\n > pre,\n pre.highlight {\n margin: 0;\n padding: 1em;\n }\n}\n\n.highlight table {\n margin-bottom: 0;\n font-size: 1em;\n border: 0;\n\n td {\n padding: 0;\n width: calc(100% - 1em);\n border: 0;\n\n /* line numbers*/\n &.gutter,\n &.rouge-gutter {\n padding-right: 1em;\n width: 1em;\n color: $base04;\n border-right: 1px solid $base04;\n text-align: right;\n }\n\n /* code */\n &.code,\n &.rouge-code {\n padding-left: 1em;\n }\n }\n\n pre {\n margin: 0;\n }\n}\n\n.highlight pre {\n width: 100%;\n}\n\n.highlight .hll {\n background-color: $base06;\n}\n.highlight {\n .c {\n /* Comment */\n color: $base04;\n }\n .err {\n /* Error */\n color: $base08;\n }\n .k {\n /* Keyword */\n color: $base0e;\n }\n .l {\n /* Literal */\n color: $base09;\n }\n .n {\n /* Name */\n color: $base05;\n }\n .o {\n /* Operator */\n color: $base0c;\n }\n .p {\n /* Punctuation */\n color: $base05;\n }\n .cm {\n /* Comment.Multiline */\n color: $base04;\n }\n .cp {\n /* Comment.Preproc */\n color: $base04;\n }\n .c1 {\n /* Comment.Single */\n color: $base04;\n }\n .cs {\n /* Comment.Special */\n color: $base04;\n }\n .gd {\n /* Generic.Deleted */\n color: $base08;\n }\n .ge {\n /* Generic.Emph */\n font-style: italic;\n }\n .gh {\n /* Generic.Heading */\n color: $base05;\n font-weight: bold;\n }\n .gi {\n /* Generic.Inserted */\n color: $base0b;\n }\n .gp {\n /* Generic.Prompt */\n color: $base04;\n font-weight: bold;\n }\n .gs {\n /* Generic.Strong */\n font-weight: bold;\n }\n .gu {\n /* Generic.Subheading */\n color: $base0c;\n font-weight: bold;\n }\n .kc {\n /* Keyword.Constant */\n color: $base0e;\n }\n .kd {\n /* Keyword.Declaration */\n color: $base0e;\n }\n .kn {\n /* Keyword.Namespace */\n color: $base0c;\n }\n .kp {\n /* Keyword.Pseudo */\n color: $base0e;\n }\n .kr {\n /* Keyword.Reserved */\n color: $base0e;\n }\n .kt {\n /* Keyword.Type */\n color: $base0a;\n }\n .ld {\n /* Literal.Date */\n color: $base0b;\n }\n .m {\n /* Literal.Number */\n color: $base09;\n }\n .s {\n /* Literal.String */\n color: $base0b;\n }\n .na {\n /* Name.Attribute */\n color: $base0d;\n }\n .nb {\n /* Name.Builtin */\n color: $base05;\n }\n .nc {\n /* Name.Class */\n color: $base0a;\n }\n .no {\n /* Name.Constant */\n color: $base08;\n }\n .nd {\n /* Name.Decorator */\n color: $base0c;\n }\n .ni {\n /* Name.Entity */\n color: $base05;\n }\n .ne {\n /* Name.Exception */\n color: $base08;\n }\n .nf {\n /* Name.Function */\n color: $base0d;\n }\n .nl {\n /* Name.Label */\n color: $base05;\n }\n .nn {\n /* Name.Namespace */\n color: $base0a;\n }\n .nx {\n /* Name.Other */\n color: $base0d;\n }\n .py {\n /* Name.Property */\n color: $base05;\n }\n .nt {\n /* Name.Tag */\n color: $base0c;\n }\n .nv {\n /* Name.Variable */\n color: $base08;\n }\n .ow {\n /* Operator.Word */\n color: $base0c;\n }\n .w {\n /* Text.Whitespace */\n color: $base05;\n }\n .mf {\n /* Literal.Number.Float */\n color: $base09;\n }\n .mh {\n /* Literal.Number.Hex */\n color: $base09;\n }\n .mi {\n /* Literal.Number.Integer */\n color: $base09;\n }\n .mo {\n /* Literal.Number.Oct */\n color: $base09;\n }\n .sb {\n /* Literal.String.Backtick */\n color: $base0b;\n }\n .sc {\n /* Literal.String.Char */\n color: $base05;\n }\n .sd {\n /* Literal.String.Doc */\n color: $base04;\n }\n .s2 {\n /* Literal.String.Double */\n color: $base0b;\n }\n .se {\n /* Literal.String.Escape */\n color: $base09;\n }\n .sh {\n /* Literal.String.Heredoc */\n color: $base0b;\n }\n .si {\n /* Literal.String.Interpol */\n color: $base09;\n }\n .sx {\n /* Literal.String.Other */\n color: $base0b;\n }\n .sr {\n /* Literal.String.Regex */\n color: $base0b;\n }\n .s1 {\n /* Literal.String.Single */\n color: $base0b;\n }\n .ss {\n /* Literal.String.Symbol */\n color: $base0b;\n }\n .bp {\n /* Name.Builtin.Pseudo */\n color: $base05;\n }\n .vc {\n /* Name.Variable.Class */\n color: $base08;\n }\n .vg {\n /* Name.Variable.Global */\n color: $base08;\n }\n .vi {\n /* Name.Variable.Instance */\n color: $base08;\n }\n .il {\n /* Literal.Number.Integer.Long */\n color: $base09;\n }\n}\n\n.gist {\n th, td {\n border-bottom: 0;\n }\n}","/* ==========================================================================\n UTILITY CLASSES\n ========================================================================== */\n\n/*\n Visibility\n ========================================================================== */\n\n/* http://www.456bereastreet.com/archive/200711/screen_readers_sometimes_ignore_displaynone/ */\n\n.hidden,\n.is--hidden {\n display: none;\n visibility: hidden;\n}\n\n/* for preloading images */\n\n.load {\n display: none;\n}\n\n.transparent {\n opacity: 0;\n}\n\n/* https://developer.yahoo.com/blogs/ydn/clip-hidden-content-better-accessibility-53456.html */\n\n.visually-hidden,\n.screen-reader-text,\n.screen-reader-text span,\n.screen-reader-shortcut {\n position: absolute !important;\n clip: rect(1px, 1px, 1px, 1px);\n height: 1px !important;\n width: 1px !important;\n border: 0 !important;\n overflow: hidden;\n}\n\nbody:hover .visually-hidden a,\nbody:hover .visually-hidden input,\nbody:hover .visually-hidden button {\n display: none !important;\n}\n\n/* screen readers */\n\n.screen-reader-text:focus,\n.screen-reader-shortcut:focus {\n clip: auto !important;\n height: auto !important;\n width: auto !important;\n display: block;\n font-size: 1em;\n font-weight: bold;\n padding: 15px 23px 14px;\n background: #fff;\n z-index: 100000;\n text-decoration: none;\n box-shadow: 0 0 2px 2px rgba(0, 0, 0, 0.6);\n}\n\n/*\n Skip links\n ========================================================================== */\n\n.skip-link {\n position: fixed;\n z-index: 20;\n margin: 0;\n font-family: $sans-serif;\n white-space: nowrap;\n}\n\n.skip-link li {\n height: 0;\n width: 0;\n list-style: none;\n}\n\n/*\n Type\n ========================================================================== */\n\n.text-left {\n text-align: left;\n}\n\n.text-center {\n text-align: center;\n}\n\n.text-right {\n text-align: right;\n}\n\n.text-justify {\n text-align: justify;\n}\n\n.text-nowrap {\n white-space: nowrap;\n}\n\n/*\n Task lists\n ========================================================================== */\n\n.task-list {\n padding:0;\n\n li {\n list-style-type: none;\n }\n\n .task-list-item-checkbox {\n margin-right: 0.5em;\n opacity: 1;\n }\n}\n\n.task-list .task-list {\n margin-left: 1em;\n}\n\n/*\n Alignment\n ========================================================================== */\n\n/* clearfix */\n\n.cf {\n clear: both;\n}\n\n.wrapper {\n margin-left: auto;\n margin-right: auto;\n width: 100%;\n}\n\n/*\n Images\n ========================================================================== */\n\n/* image align left */\n\n.align-left {\n display: block;\n margin-left: auto;\n margin-right: auto;\n\n @include breakpoint($small) {\n float: left;\n margin-right: 1em;\n }\n}\n\n/* image align right */\n\n.align-right {\n display: block;\n margin-left: auto;\n margin-right: auto;\n\n @include breakpoint($small) {\n float: right;\n margin-left: 1em;\n }\n}\n\n/* image align center */\n\n.align-center {\n display: block;\n margin-left: auto;\n margin-right: auto;\n}\n\n/* file page content container */\n\n.full {\n @include breakpoint($large) {\n margin-right: -1 * span(2.5 of 12) !important;\n }\n}\n\n/*\n Icons\n ========================================================================== */\n\n.icon {\n display: inline-block;\n fill: currentColor;\n width: 1em;\n height: 1.1em;\n line-height: 1;\n position: relative;\n top: -0.1em;\n vertical-align: middle;\n}\n\n/* social icons*/\n\n.social-icons {\n .fas,\n .fab,\n .far,\n .fal {\n color: $text-color;\n }\n\n .fa-behance,\n .fa-behance-square {\n color: $behance-color;\n }\n\n .fa-bitbucket {\n color: $bitbucket-color;\n }\n\n .fa-dribbble,\n .fa-dribble-square {\n color: $dribbble-color;\n }\n\n .fa-facebook,\n .fa-facebook-square,\n .fa-facebook-f {\n color: $facebook-color;\n }\n\n .fa-flickr {\n color: $flickr-color;\n }\n\n .fa-foursquare {\n color: $foursquare-color;\n }\n\n .fa-github,\n .fa-github-alt,\n .fa-github-square {\n color: $github-color;\n }\n\n .fa-gitlab {\n color: $gitlab-color;\n }\n\n .fa-instagram {\n color: $instagram-color;\n }\n\n .fa-keybase {\n color: $keybase-color;\n }\n\n .fa-lastfm,\n .fa-lastfm-square {\n color: $lastfm-color;\n }\n\n .fa-linkedin,\n .fa-linkedin-in {\n color: $linkedin-color;\n }\n\n .fa-mastodon,\n .fa-mastodon-square {\n color: $mastodon-color;\n }\n\n .fa-pinterest,\n .fa-pinterest-p,\n .fa-pinterest-square {\n color: $pinterest-color;\n }\n\n .fa-reddit {\n color: $reddit-color;\n }\n\n .fa-rss,\n .fa-rss-square {\n color: $rss-color;\n }\n\n .fa-soundcloud {\n color: $soundcloud-color;\n }\n\n .fa-stack-exchange,\n .fa-stack-overflow {\n color: $stackoverflow-color;\n }\n\n .fa-tumblr,\n .fa-tumblr-square {\n color: $tumblr-color;\n }\n\n .fa-twitter,\n .fa-twitter-square {\n color: $twitter-color;\n }\n\n .fa-vimeo,\n .fa-vimeo-square,\n .fa-vimeo-v {\n color: $vimeo-color;\n }\n\n .fa-vine {\n color: $vine-color;\n }\n\n .fa-youtube {\n color: $youtube-color;\n }\n\n .fa-xing,\n .fa-xing-square {\n color: $xing-color;\n }\n}\n\n/*\n Navicons\n ========================================================================== */\n\n.navicon {\n position: relative;\n width: $navicon-width;\n height: $navicon-height;\n background: $primary-color;\n margin: auto;\n -webkit-transition: 0.3s;\n transition: 0.3s;\n\n &:before,\n &:after {\n content: \"\";\n position: absolute;\n left: 0;\n width: $navicon-width;\n height: $navicon-height;\n background: $primary-color;\n -webkit-transition: 0.3s;\n transition: 0.3s;\n }\n\n &:before {\n top: (-2 * $navicon-height);\n }\n\n &:after {\n bottom: (-2 * $navicon-height);\n }\n}\n\n.close .navicon {\n /* hide the middle line*/\n background: transparent;\n\n /* overlay the lines by setting both their top values to 0*/\n &:before,\n &:after {\n -webkit-transform-origin: 50% 50%;\n -ms-transform-origin: 50% 50%;\n transform-origin: 50% 50%;\n top: 0;\n width: $navicon-width;\n }\n\n /* rotate the lines to form the x shape*/\n &:before {\n -webkit-transform: rotate3d(0, 0, 1, 45deg);\n transform: rotate3d(0, 0, 1, 45deg);\n }\n &:after {\n -webkit-transform: rotate3d(0, 0, 1, -45deg);\n transform: rotate3d(0, 0, 1, -45deg);\n }\n}\n\n.greedy-nav__toggle {\n &:before {\n @supports (pointer-events: none) {\n content: '';\n position: fixed;\n top: 0;\n left: 0;\n width: 100%;\n height: 100%;\n opacity: 0;\n background-color: $background-color;\n -webkit-transition: $global-transition;\n transition: $global-transition;\n pointer-events: none;\n }\n }\n\n &.close {\n &:before {\n opacity: 0.9;\n -webkit-transition: $global-transition;\n transition: $global-transition;\n pointer-events: auto;\n }\n }\n}\n\n.greedy-nav__toggle:hover {\n .navicon,\n .navicon:before,\n .navicon:after {\n background: mix(#000, $primary-color, 25%);\n }\n\n &.close {\n .navicon {\n background: transparent;\n }\n }\n}\n\n/*\n Sticky, fixed to top content\n ========================================================================== */\n\n.sticky {\n @include breakpoint($large) {\n @include clearfix();\n position: -webkit-sticky;\n position: sticky;\n top: 2em;\n\n > * {\n display: block;\n }\n }\n}\n\n/*\n Wells\n ========================================================================== */\n\n.well {\n min-height: 20px;\n padding: 19px;\n margin-bottom: 20px;\n background-color: #f5f5f5;\n border: 1px solid #e3e3e3;\n border-radius: $border-radius;\n box-shadow: inset 0 1px 1px rgba(0, 0, 0, 0.05);\n}\n\n/*\n Modals\n ========================================================================== */\n\n.show-modal {\n overflow: hidden;\n position: relative;\n\n &:before {\n position: absolute;\n content: \"\";\n top: 0;\n left: 0;\n width: 100%;\n height: 100%;\n z-index: 999;\n background-color: rgba(255, 255, 255, 0.85);\n }\n\n .modal {\n display: block;\n }\n}\n\n.modal {\n display: none;\n position: fixed;\n width: 300px;\n top: 50%;\n left: 50%;\n margin-left: -150px;\n margin-top: -150px;\n min-height: 0;\n z-index: 9999;\n background: #fff;\n border: 1px solid $border-color;\n border-radius: $border-radius;\n box-shadow: $box-shadow;\n\n &__title {\n margin: 0;\n padding: 0.5em 1em;\n }\n\n &__supporting-text {\n padding: 0 1em 0.5em 1em;\n }\n\n &__actions {\n padding: 0.5em 1em;\n border-top: 1px solid $border-color;\n }\n}\n\n/*\n Footnotes\n ========================================================================== */\n\n.footnote {\n color: mix(#fff, $gray, 25%);\n text-decoration: none;\n}\n\n.footnotes {\n color: mix(#fff, $gray, 25%);\n\n ol,\n li,\n p {\n margin-bottom: 0;\n font-size: $type-size-6;\n }\n}\n\na.reversefootnote {\n color: $gray;\n text-decoration: none;\n\n &:hover {\n text-decoration: underline;\n }\n}\n\n/*\n Required\n ========================================================================== */\n\n.required {\n color: $danger-color;\n font-weight: bold;\n}\n\n/*\n Google Custom Search Engine\n ========================================================================== */\n\n.gsc-control-cse {\n table,\n tr,\n td {\n border: 0; /* remove table borders widget */\n }\n}\n\n/*\n Responsive Video Embed\n ========================================================================== */\n\n.responsive-video-container {\n position: relative;\n margin-bottom: 1em;\n padding-bottom: 56.25%;\n height: 0;\n overflow: hidden;\n max-width: 100%;\n\n iframe,\n object,\n embed {\n position: absolute;\n top: 0;\n left: 0;\n width: 100%;\n height: 100%;\n }\n}\n\n// full screen video fixes\n:-webkit-full-screen-ancestor {\n .masthead,\n .page__footer {\n position: static;\n }\n}\n","/* ==========================================================================\n SINGLE PAGE/POST\n ========================================================================== */\n\n#main {\n @include clearfix;\n margin-left: auto;\n margin-right: auto;\n padding-left: 1em;\n padding-right: 1em;\n -webkit-animation: $intro-transition;\n animation: $intro-transition;\n max-width: 100%;\n -webkit-animation-delay: 0.15s;\n animation-delay: 0.15s;\n\n @include breakpoint($x-large) {\n max-width: $max-width;\n }\n}\n\nbody {\n display: -webkit-box;\n display: -ms-flexbox;\n display: flex;\n min-height: 100vh;\n -webkit-box-orient: vertical;\n -webkit-box-direction: normal;\n -ms-flex-direction: column;\n flex-direction: column;\n}\n\n.initial-content,\n.search-content {\n flex: 1 0 auto;\n}\n\n.page {\n @include breakpoint($large) {\n float: right;\n width: calc(100% - #{$right-sidebar-width-narrow});\n padding-right: $right-sidebar-width-narrow;\n }\n\n @include breakpoint($x-large) {\n width: calc(100% - #{$right-sidebar-width});\n padding-right: $right-sidebar-width;\n }\n\n .page__inner-wrap {\n float: left;\n margin-top: 1em;\n margin-left: 0;\n margin-right: 0;\n width: 100%;\n clear: both;\n\n .page__content,\n .page__meta,\n .page__share {\n position: relative;\n float: left;\n margin-left: 0;\n margin-right: 0;\n width: 100%;\n clear: both;\n }\n }\n}\n\n.page__title {\n margin-top: 0;\n line-height: 1;\n\n a {\n color: $text-color;\n text-decoration: none;\n }\n\n & + .page__meta {\n margin-top: -0.5em;\n }\n}\n\n.page__lead {\n font-family: $global-font-family;\n font-size: $type-size-4;\n}\n\n.page__content {\n h2 {\n padding-bottom: 0.5em;\n border-bottom: 1px solid $border-color;\n }\n\n\th1, h2, h3, h4, h5, h6 {\n\t\t.header-link {\n\t\t\tposition: relative;\n\t\t\tleft: 0.5em;\n\t\t\topacity: 0;\n\t\t\tfont-size: 0.8em;\n\t\t\t-webkit-transition: opacity 0.2s ease-in-out 0.1s;\n\t\t\t-moz-transition: opacity 0.2s ease-in-out 0.1s;\n\t\t\t-o-transition: opacity 0.2s ease-in-out 0.1s;\n\t\t\ttransition: opacity 0.2s ease-in-out 0.1s;\n\t\t}\n\n\t\t&:hover .header-link {\n\t\t\topacity: 1;\n\t\t}\n\t}\n\n p,\n li,\n dl {\n font-size: 1em;\n }\n\n /* paragraph indents */\n p {\n margin: 0 0 $indent-var;\n\n /* sibling indentation*/\n @if $paragraph-indent == true {\n & + p {\n text-indent: $indent-var;\n margin-top: -($indent-var);\n }\n }\n }\n\n a:not(.btn) {\n &:hover {\n text-decoration: underline;\n\n img {\n box-shadow: 0 0 10px rgba(#000, 0.25);\n }\n }\n }\n\n :not(pre) > code {\n padding-top: 0.1rem;\n padding-bottom: 0.1rem;\n font-size: 0.8em;\n background: $code-background-color;\n border-radius: $border-radius;\n\n &::before,\n &::after {\n letter-spacing: -0.2em;\n content: \"\\00a0\"; /* non-breaking space*/\n }\n }\n\n dt {\n margin-top: 1em;\n font-family: $sans-serif;\n font-weight: bold;\n }\n\n dd {\n margin-left: 1em;\n font-family: $sans-serif;\n font-size: $type-size-6;\n }\n\n .small {\n font-size: $type-size-6;\n }\n\n /* blockquote citations */\n blockquote + .small {\n margin-top: -1.5em;\n padding-left: 1.25rem;\n }\n}\n\n.page__hero {\n position: relative;\n margin-bottom: 2em;\n @include clearfix;\n -webkit-animation: $intro-transition;\n animation: $intro-transition;\n -webkit-animation-delay: 0.25s;\n animation-delay: 0.25s;\n\n &--overlay {\n position: relative;\n margin-bottom: 2em;\n padding: 3em 0;\n @include clearfix;\n background-size: cover;\n background-repeat: no-repeat;\n background-position: center;\n -webkit-animation: $intro-transition;\n animation: $intro-transition;\n -webkit-animation-delay: 0.25s;\n animation-delay: 0.25s;\n\n a {\n color: #fff;\n }\n\n .wrapper {\n padding-left: 1em;\n padding-right: 1em;\n\n @include breakpoint($x-large) {\n max-width: $x-large;\n }\n }\n\n .page__title,\n .page__meta,\n .page__lead,\n .btn {\n color: #fff;\n text-shadow: 1px 1px 4px rgba(#000, 0.5);\n }\n\n .page__lead {\n max-width: $medium;\n }\n\n .page__title {\n font-size: $type-size-2;\n\n @include breakpoint($small) {\n font-size: $type-size-1;\n }\n }\n }\n}\n\n.page__hero-image {\n width: 100%;\n height: auto;\n -ms-interpolation-mode: bicubic;\n}\n\n.page__hero-caption {\n position: absolute;\n bottom: 0;\n right: 0;\n margin: 0 auto;\n padding: 2px 5px;\n color: #fff;\n font-family: $caption-font-family;\n font-size: $type-size-7;\n background: #000;\n text-align: right;\n z-index: 5;\n opacity: 0.5;\n border-radius: $border-radius 0 0 0;\n\n @include breakpoint($large) {\n padding: 5px 10px;\n }\n\n a {\n color: #fff;\n text-decoration: none;\n }\n}\n\n/*\n Social sharing\n ========================================================================== */\n\n.page__share {\n margin-top: 2em;\n padding-top: 1em;\n border-top: 1px solid $border-color;\n\n @include breakpoint(max-width $small) {\n .btn span {\n border: 0;\n clip: rect(0 0 0 0);\n height: 1px;\n margin: -1px;\n overflow: hidden;\n padding: 0;\n position: absolute;\n width: 1px;\n }\n }\n}\n\n.page__share-title {\n margin-bottom: 10px;\n font-size: $type-size-6;\n text-transform: uppercase;\n}\n\n/*\n Page meta\n ========================================================================== */\n\n.page__meta {\n margin-top: 2em;\n color: $muted-text-color;\n font-family: $sans-serif;\n font-size: $type-size-6;\n\n p {\n margin: 0;\n }\n\n a {\n color: inherit;\n }\n}\n\n.page__meta-title {\n margin-bottom: 10px;\n font-size: $type-size-6;\n text-transform: uppercase;\n}\n\n.page__meta-sep::before {\n content: \"\\2022\";\n padding-left: 0.5em;\n padding-right: 0.5em;\n}\n\n/*\n Page taxonomy\n ========================================================================== */\n\n.page__taxonomy {\n .sep {\n display: none;\n }\n\n strong {\n margin-right: 10px;\n }\n}\n\n.page__taxonomy-item {\n display: inline-block;\n margin-right: 5px;\n margin-bottom: 8px;\n padding: 5px 10px;\n text-decoration: none;\n border: 1px solid mix(#000, $border-color, 25%);\n border-radius: $border-radius;\n\n &:hover {\n text-decoration: none;\n color: $link-color-hover;\n }\n}\n\n.taxonomy__section {\n margin-bottom: 2em;\n padding-bottom: 1em;\n\n &:not(:last-child) {\n border-bottom: solid 1px $border-color;\n }\n\n .archive__item-title {\n margin-top: 0;\n }\n\n .archive__subtitle {\n clear: both;\n border: 0;\n }\n\n + .taxonomy__section {\n margin-top: 2em;\n }\n}\n\n.taxonomy__title {\n margin-bottom: 0.5em;\n color: $muted-text-color;\n}\n\n.taxonomy__count {\n color: $muted-text-color;\n}\n\n.taxonomy__index {\n display: grid;\n grid-column-gap: 2em;\n grid-template-columns: repeat(2, 1fr);\n margin: 1.414em 0;\n padding: 0;\n font-size: 0.75em;\n list-style: none;\n\n @include breakpoint($large) {\n grid-template-columns: repeat(3, 1fr);\n }\n\n a {\n display: -webkit-box;\n display: -ms-flexbox;\n display: flex;\n padding: 0.25em 0;\n -webkit-box-pack: justify;\n -ms-flex-pack: justify;\n justify-content: space-between;\n color: inherit;\n text-decoration: none;\n border-bottom: 1px solid $border-color;\n }\n}\n\n.back-to-top {\n display: block;\n clear: both;\n color: $muted-text-color;\n font-size: 0.6em;\n text-transform: uppercase;\n text-align: right;\n text-decoration: none;\n}\n\n/*\n Comments\n ========================================================================== */\n\n.page__comments {\n float: left;\n margin-left: 0;\n margin-right: 0;\n width: 100%;\n clear: both;\n}\n\n.page__comments-title {\n margin-top: 2rem;\n margin-bottom: 10px;\n padding-top: 2rem;\n font-size: $type-size-6;\n border-top: 1px solid $border-color;\n text-transform: uppercase;\n}\n\n.page__comments-form {\n -webkit-transition: $global-transition;\n transition: $global-transition;\n\n &.disabled {\n input,\n button,\n textarea,\n label {\n pointer-events: none;\n cursor: not-allowed;\n filter: alpha(opacity=65);\n box-shadow: none;\n opacity: 0.65;\n }\n }\n}\n\n.comment {\n @include clearfix();\n margin: 1em 0;\n\n &:not(:last-child) {\n border-bottom: 1px solid $border-color;\n }\n}\n\n.comment__avatar-wrapper {\n float: left;\n width: 60px;\n height: 60px;\n\n @include breakpoint($large) {\n width: 100px;\n height: 100px;\n }\n}\n\n.comment__avatar {\n width: 40px;\n height: 40px;\n border-radius: 50%;\n\n @include breakpoint($large) {\n width: 80px;\n height: 80px;\n padding: 5px;\n border: 1px solid $border-color;\n }\n}\n\n.comment__content-wrapper {\n float: right;\n width: calc(100% - 60px);\n\n @include breakpoint($large) {\n width: calc(100% - 100px);\n }\n}\n\n.comment__author {\n margin: 0;\n\n a {\n text-decoration: none;\n }\n}\n\n.comment__date {\n @extend .page__meta;\n margin: 0;\n\n a {\n text-decoration: none;\n }\n}\n\n/*\n Related\n ========================================================================== */\n\n.page__related {\n @include clearfix();\n float: left;\n margin-top: 2em;\n padding-top: 1em;\n border-top: 1px solid $border-color;\n\n @include breakpoint($large) {\n float: right;\n width: calc(100% - #{$right-sidebar-width-narrow});\n }\n\n @include breakpoint($x-large) {\n width: calc(100% - #{$right-sidebar-width});\n }\n\n a {\n color: inherit;\n text-decoration: none;\n }\n}\n\n.page__related-title {\n margin-bottom: 10px;\n font-size: $type-size-6;\n text-transform: uppercase;\n}\n\n/*\n Wide Pages\n ========================================================================== */\n\n.wide {\n .page {\n @include breakpoint($large) {\n padding-right: 0;\n }\n\n @include breakpoint($x-large) {\n padding-right: 0;\n }\n }\n\n .page__related {\n @include breakpoint($large) {\n padding-right: 0;\n }\n\n @include breakpoint($x-large) {\n padding-right: 0;\n }\n }\n}\n","/* ==========================================================================\n ARCHIVE\n ========================================================================== */\n\n.archive {\n margin-top: 1em;\n margin-bottom: 2em;\n\n @include breakpoint($large) {\n float: right;\n width: calc(100% - #{$right-sidebar-width-narrow});\n padding-right: $right-sidebar-width-narrow;\n }\n\n @include breakpoint($x-large) {\n width: calc(100% - #{$right-sidebar-width});\n padding-right: $right-sidebar-width;\n }\n}\n\n.archive__item {\n position: relative;\n\n a {\n position: relative;\n z-index: 10;\n }\n\n a[rel=\"permalink\"] {\n position: static;\n }\n}\n\n.archive__subtitle {\n margin: 1.414em 0 0.5em;\n padding-bottom: 0.5em;\n font-size: $type-size-5;\n color: $muted-text-color;\n border-bottom: 1px solid $border-color;\n\n + .list__item .archive__item-title {\n margin-top: 0.5em;\n }\n}\n\n.archive__item-title {\n margin-bottom: 0.25em;\n font-family: $sans-serif-narrow;\n line-height: initial;\n overflow: hidden;\n text-overflow: ellipsis;\n\n a[rel=\"permalink\"]::before {\n content: '';\n position: absolute;\n left: 0;\n top: 0;\n right: 0;\n bottom: 0;\n }\n\n a + a {\n opacity: 0.5;\n }\n}\n\n/* remove border*/\n.page__content {\n .archive__item-title {\n margin-top: 1em;\n border-bottom: none;\n }\n}\n\n.archive__item-excerpt {\n margin-top: 0;\n font-size: $type-size-6;\n\n & + p {\n text-indent: 0;\n }\n\n a {\n position: relative;\n }\n}\n\n.archive__item-teaser {\n position: relative;\n border-radius: $border-radius;\n overflow: hidden;\n\n img {\n width: 100%;\n }\n}\n\n.archive__item-caption {\n position: absolute;\n bottom: 0;\n right: 0;\n margin: 0 auto;\n padding: 2px 5px;\n color: #fff;\n font-family: $caption-font-family;\n font-size: $type-size-8;\n background: #000;\n text-align: right;\n z-index: 5;\n opacity: 0.5;\n border-radius: $border-radius 0 0 0;\n\n @include breakpoint($large) {\n padding: 5px 10px;\n }\n\n a {\n color: #fff;\n text-decoration: none;\n }\n}\n\n/*\n List view\n ========================================================================== */\n\n.list__item {\n .page__meta {\n margin: 0 0 4px;\n font-size: 0.6em;\n }\n}\n\n/*\n Grid view\n ========================================================================== */\n\n.archive {\n .grid__wrapper {\n /* extend grid elements to the right */\n\n @include breakpoint($large) {\n margin-right: -1 * $right-sidebar-width-narrow;\n }\n\n @include breakpoint($x-large) {\n margin-right: -1 * $right-sidebar-width;\n }\n }\n}\n\n.grid__item {\n margin-bottom: 2em;\n\n @include breakpoint($small) {\n float: left;\n width: span(5 of 10);\n\n &:nth-child(2n + 1) {\n clear: both;\n margin-left: 0;\n }\n\n &:nth-child(2n + 2) {\n clear: none;\n margin-left: gutter(of 10);\n }\n }\n\n @include breakpoint($medium) {\n margin-left: 0; /* override margin*/\n margin-right: 0; /* override margin*/\n width: span(3 of 12);\n\n &:nth-child(2n + 1) {\n clear: none;\n }\n\n &:nth-child(4n + 1) {\n clear: both;\n }\n\n &:nth-child(4n + 2) {\n clear: none;\n margin-left: gutter(1 of 12);\n }\n\n &:nth-child(4n + 3) {\n clear: none;\n margin-left: gutter(1 of 12);\n }\n\n &:nth-child(4n + 4) {\n clear: none;\n margin-left: gutter(1 of 12);\n }\n }\n\n .page__meta {\n margin: 0 0 4px;\n font-size: 0.6em;\n }\n\n .page__meta-sep {\n display: block;\n\n &::before {\n display: none;\n }\n }\n\n .archive__item-title {\n margin-top: 0.5em;\n font-size: $type-size-5;\n }\n\n .archive__item-excerpt {\n display: none;\n\n @include breakpoint($medium) {\n display: block;\n font-size: $type-size-6;\n }\n }\n\n .archive__item-teaser {\n @include breakpoint($small) {\n max-height: 200px;\n }\n\n @include breakpoint($medium) {\n max-height: 120px;\n }\n }\n}\n\n/*\n Features\n ========================================================================== */\n\n.feature__wrapper {\n @include clearfix();\n margin-bottom: 2em;\n border-bottom: 1px solid $border-color;\n\n .archive__item-title {\n margin-bottom: 0;\n }\n}\n\n.feature__item {\n position: relative;\n margin-bottom: 2em;\n font-size: 1.125em;\n\n @include breakpoint($small) {\n float: left;\n margin-bottom: 0;\n width: span(4 of 12);\n\n &:nth-child(3n + 1) {\n clear: both;\n margin-left: 0;\n }\n\n &:nth-child(3n + 2) {\n clear: none;\n margin-left: gutter(of 12);\n }\n\n &:nth-child(3n + 3) {\n clear: none;\n margin-left: gutter(of 12);\n }\n\n .feature__item-teaser {\n max-height: 200px;\n overflow: hidden;\n }\n }\n\n .archive__item-body {\n padding-left: gutter(1 of 12);\n padding-right: gutter(1 of 12);\n }\n\n a.btn::before {\n content: '';\n position: absolute;\n left: 0;\n top: 0;\n right: 0;\n bottom: 0;\n }\n\n &--left {\n position: relative;\n float: left;\n margin-left: 0;\n margin-right: 0;\n width: 100%;\n clear: both;\n font-size: 1.125em;\n\n .archive__item {\n float: left;\n }\n\n .archive__item-teaser {\n margin-bottom: 2em;\n }\n\n a.btn::before {\n content: '';\n position: absolute;\n left: 0;\n top: 0;\n right: 0;\n bottom: 0;\n }\n\n @include breakpoint($small) {\n .archive__item-teaser {\n float: left;\n width: span(5 of 12);\n }\n\n .archive__item-body {\n float: right;\n padding-left: gutter(0.5 of 12);\n padding-right: gutter(1 of 12);\n width: span(7 of 12);\n }\n }\n }\n\n &--right {\n position: relative;\n float: left;\n margin-left: 0;\n margin-right: 0;\n width: 100%;\n clear: both;\n font-size: 1.125em;\n\n .archive__item {\n float: left;\n }\n\n .archive__item-teaser {\n margin-bottom: 2em;\n }\n\n a.btn::before {\n content: '';\n position: absolute;\n left: 0;\n top: 0;\n right: 0;\n bottom: 0;\n }\n\n @include breakpoint($small) {\n text-align: right;\n\n .archive__item-teaser {\n float: right;\n width: span(5 of 12);\n }\n\n .archive__item-body {\n float: left;\n width: span(7 of 12);\n padding-left: gutter(0.5 of 12);\n padding-right: gutter(1 of 12);\n }\n }\n }\n\n &--center {\n position: relative;\n float: left;\n margin-left: 0;\n margin-right: 0;\n width: 100%;\n clear: both;\n font-size: 1.125em;\n\n .archive__item {\n float: left;\n width: 100%;\n }\n\n .archive__item-teaser {\n margin-bottom: 2em;\n }\n\n a.btn::before {\n content: '';\n position: absolute;\n left: 0;\n top: 0;\n right: 0;\n bottom: 0;\n }\n\n @include breakpoint($small) {\n text-align: center;\n\n .archive__item-teaser {\n margin: 0 auto;\n width: span(5 of 12);\n }\n\n .archive__item-body {\n margin: 0 auto;\n width: span(7 of 12);\n }\n }\n }\n}\n\n/* Place inside an archive layout */\n\n.archive {\n .feature__wrapper {\n .archive__item-title {\n margin-top: 0.25em;\n font-size: 1em;\n }\n }\n\n .feature__item,\n .feature__item--left,\n .feature__item--center,\n .feature__item--right {\n font-size: 1em;\n }\n}\n\n/*\n Wide Pages\n ========================================================================== */\n\n .wide {\n .archive {\n @include breakpoint($large) {\n padding-right: 0;\n }\n\n @include breakpoint($x-large) {\n padding-right: 0;\n }\n }\n}\n\n/* Place inside a single layout */\n\n.layout--single {\n\t.feature__wrapper {\n\t\tdisplay: inline-block;\n\t}\n}\n","/* ==========================================================================\n SIDEBAR\n ========================================================================== */\n\n/*\n Default\n ========================================================================== */\n\n.sidebar {\n @include clearfix();\n // @include breakpoint(max-width $large) {\n // /* fix z-index order of follow links */\n // position: relative;\n // z-index: 10;\n // -webkit-transform: translate3d(0, 0, 0);\n // transform: translate3d(0, 0, 0);\n // }\n\n @include breakpoint($large) {\n float: left;\n width: calc(#{$right-sidebar-width-narrow} - 1em);\n opacity: 0.75;\n -webkit-transition: opacity 0.2s ease-in-out;\n transition: opacity 0.2s ease-in-out;\n\n &:hover {\n opacity: 1;\n }\n\n &.sticky {\n overflow-y: auto;\n /* calculate height of nav list\n viewport height - nav height - masthead x-padding\n */\n max-height: calc(100vh - #{$nav-height} - 2em);\n }\n }\n\n @include breakpoint($x-large) {\n width: calc(#{$right-sidebar-width} - 1em);\n }\n\n > * {\n margin-top: 1em;\n margin-bottom: 1em;\n }\n\n h2,\n h3,\n h4,\n h5,\n h6 {\n margin-bottom: 0;\n font-family: $sans-serif-narrow;\n }\n\n p,\n li {\n font-family: $sans-serif;\n font-size: $type-size-6;\n line-height: 1.5;\n }\n\n img {\n width: 100%;\n\n &.emoji {\n width: 20px;\n height: 20px;\n }\n }\n}\n\n.sidebar__right {\n margin-bottom: 1em;\n\n @include breakpoint($large) {\n position: absolute;\n top: 0;\n right: 0;\n width: $right-sidebar-width-narrow;\n margin-right: -1 * $right-sidebar-width-narrow;\n padding-left: 1em;\n z-index: 10;\n\n &.sticky {\n @include clearfix();\n position: -webkit-sticky;\n position: sticky;\n top: 2em;\n float: right;\n\n .toc {\n .toc__menu {\n overflow-y: auto;\n max-height: calc(100vh - 7em);\n }\n }\n }\n }\n\n @include breakpoint($x-large) {\n width: $right-sidebar-width;\n margin-right: -1 * $right-sidebar-width;\n }\n}\n\n.splash .sidebar__right {\n @include breakpoint($large) {\n position: relative;\n float: right;\n margin-right: 0;\n }\n\n @include breakpoint($x-large) {\n margin-right: 0;\n }\n}\n\n/*\n Author profile and links\n ========================================================================== */\n\n.author__avatar {\n display: table-cell;\n vertical-align: top;\n width: 36px;\n height: 36px;\n\n @include breakpoint($large) {\n display: block;\n width: auto;\n height: auto;\n }\n\n img {\n max-width: 110px;\n border-radius: 50%;\n\n @include breakpoint($large) {\n padding: 5px;\n border: 1px solid $border-color;\n }\n }\n}\n\n.author__content {\n display: table-cell;\n vertical-align: top;\n padding-left: 15px;\n padding-right: 25px;\n line-height: 1;\n\n @include breakpoint($large) {\n display: block;\n width: 100%;\n padding-left: 0;\n padding-right: 0;\n }\n\n a {\n color: inherit;\n text-decoration: none;\n }\n}\n\n.author__name {\n margin: 0;\n\n @include breakpoint($large) {\n margin-top: 10px;\n margin-bottom: 10px;\n }\n}\n.sidebar .author__name {\n font-family: $sans-serif;\n font-size: $type-size-5;\n}\n\n.author__bio {\n margin: 0;\n\n @include breakpoint($large) {\n margin-top: 10px;\n margin-bottom: 20px;\n }\n}\n\n.author__urls-wrapper {\n position: relative;\n display: table-cell;\n vertical-align: middle;\n font-family: $sans-serif;\n z-index: 20;\n cursor: pointer;\n\n li:last-child {\n a {\n margin-bottom: 0;\n }\n }\n\n .author__urls {\n span.label {\n padding-left: 5px;\n }\n }\n\n @include breakpoint($large) {\n display: block;\n }\n\n button {\n position: relative;\n margin-bottom: 0;\n\n &:before {\n @supports (pointer-events: none) {\n content: '';\n position: fixed;\n top: 0;\n left: 0;\n width: 100%;\n height: 100%;\n pointer-events: none;\n }\n }\n\n &.open {\n &:before {\n pointer-events: auto;\n }\n }\n\n @include breakpoint($large) {\n display: none;\n }\n }\n}\n\n.author__urls {\n display: none;\n position: absolute;\n right: 0;\n margin-top: 15px;\n padding: 10px;\n list-style-type: none;\n border: 1px solid $border-color;\n border-radius: $border-radius;\n background: $background-color;\n box-shadow: 0 2px 4px 0 rgba(#000, 0.16), 0 2px 10px 0 rgba(#000, 0.12);\n cursor: default;\n\n &.is--visible {\n display: block;\n }\n\n @include breakpoint($large) {\n display: block;\n position: relative;\n margin: 0;\n padding: 0;\n border: 0;\n background: transparent;\n box-shadow: none;\n }\n\n &:before {\n display: block;\n content: \"\";\n position: absolute;\n top: -11px;\n left: calc(50% - 10px);\n width: 0;\n border-style: solid;\n border-width: 0 10px 10px;\n border-color: $border-color transparent;\n z-index: 0;\n\n @include breakpoint($large) {\n display: none;\n }\n }\n\n &:after {\n display: block;\n content: \"\";\n position: absolute;\n top: -10px;\n left: calc(50% - 10px);\n width: 0;\n border-style: solid;\n border-width: 0 10px 10px;\n border-color: $background-color transparent;\n z-index: 1;\n\n @include breakpoint($large) {\n display: none;\n }\n }\n\n ul {\n padding: 10px;\n list-style-type: none;\n }\n\n li {\n white-space: nowrap;\n }\n\n a {\n display: block;\n margin-bottom: 5px;\n padding-right: 5px;\n padding-top: 2px;\n padding-bottom: 2px;\n color: inherit;\n font-size: $type-size-5;\n text-decoration: none;\n\n &:hover {\n text-decoration: underline;\n }\n }\n}\n\n/*\n Wide Pages\n ========================================================================== */\n\n.wide .sidebar__right {\n margin-bottom: 1em;\n\n @include breakpoint($large) {\n position: initial;\n top: initial;\n right: initial;\n width: initial;\n margin-right: initial;\n padding-left: initial;\n z-index: initial;\n\n &.sticky {\n float: none;\n }\n }\n\n @include breakpoint($x-large) {\n width: initial;\n margin-right: initial;\n }\n}\n\n","/* ==========================================================================\n PRINT STYLES\n ========================================================================== */\n\n@media print {\n\n [hidden] {\n display: none;\n }\n\n * {\n -moz-box-sizing: border-box;\n -webkit-box-sizing: border-box;\n box-sizing: border-box;\n }\n\n html {\n margin: 0;\n padding: 0;\n min-height: auto !important;\n font-size: 16px;\n }\n\n body {\n margin: 0 auto;\n background: #fff !important;\n color: #000 !important;\n font-size: 1rem;\n line-height: 1.5;\n -moz-osx-font-smoothing: grayscale;\n -webkit-font-smoothing: antialiased;\n text-rendering: optimizeLegibility;\n }\n\n h1,\n h2,\n h3,\n h4,\n h5,\n h6 {\n color: #000;\n line-height: 1.2;\n margin-bottom: 0.75rem;\n margin-top: 0;\n }\n\n h1 {\n font-size: 2.5rem;\n }\n\n h2 {\n font-size: 2rem;\n }\n\n h3 {\n font-size: 1.75rem;\n }\n\n h4 {\n font-size: 1.5rem;\n }\n\n h5 {\n font-size: 1.25rem;\n }\n\n h6 {\n font-size: 1rem;\n }\n\n a,\n a:visited {\n color: #000;\n text-decoration: underline;\n word-wrap: break-word;\n }\n\n table {\n border-collapse: collapse;\n }\n\n thead {\n display: table-header-group;\n }\n\n table,\n th,\n td {\n border-bottom: 1px solid #000;\n }\n\n td,\n th {\n padding: 8px 16px;\n }\n\n img {\n border: 0;\n display: block;\n max-width: 100% !important;\n vertical-align: middle;\n }\n\n hr {\n border: 0;\n border-bottom: 2px solid #bbb;\n height: 0;\n margin: 2.25rem 0;\n padding: 0;\n }\n\n dt {\n font-weight: bold;\n }\n\n dd {\n margin: 0;\n margin-bottom: 0.75rem;\n }\n\n abbr[title],\n acronym[title] {\n border: 0;\n text-decoration: none;\n }\n\n table,\n blockquote,\n pre,\n code,\n figure,\n li,\n hr,\n ul,\n ol,\n a,\n tr {\n page-break-inside: avoid;\n }\n\n h2,\n h3,\n h4,\n p,\n a {\n orphans: 3;\n widows: 3;\n }\n\n h1,\n h2,\n h3,\n h4,\n h5,\n h6 {\n page-break-after: avoid;\n page-break-inside: avoid;\n }\n\n h1 + p,\n h2 + p,\n h3 + p {\n page-break-before: avoid;\n }\n\n img {\n page-break-after: auto;\n page-break-before: auto;\n page-break-inside: avoid;\n }\n\n pre {\n white-space: pre-wrap !important;\n word-wrap: break-word;\n }\n\n a[href^='http://']:after,\n a[href^='https://']:after,\n a[href^='ftp://']:after {\n content: \" (\" attr(href) \")\";\n font-size: 80%;\n }\n\n abbr[title]:after,\n acronym[title]:after {\n content: \" (\" attr(title) \")\";\n }\n\n #main {\n max-width: 100%;\n }\n\n .page {\n margin: 0;\n padding: 0;\n width: 100%;\n }\n\n .page-break,\n .page-break-before {\n page-break-before: always;\n }\n\n .page-break-after {\n page-break-after: always;\n }\n\n .no-print {\n display: none;\n }\n\n a.no-reformat:after {\n content: '';\n }\n\n abbr[title].no-reformat:after,\n acronym[title].no-reformat:after {\n content: '';\n }\n\n .page__hero-caption {\n color: #000 !important;\n background: #fff !important;\n opacity: 1;\n\n a {\n color: #000 !important;\n }\n }\n\n/*\n Hide the following elements on print\n ========================================================================== */\n\n .masthead,\n .toc,\n .page__share,\n .page__related,\n .pagination,\n .ads,\n .page__footer,\n .page__comments-form,\n .author__avatar,\n .author__content,\n .author__urls-wrapper,\n .nav__list,\n .sidebar,\n .adsbygoogle {\n display: none !important;\n height: 1px !important;\n }\n}"],"file":"main.css"} \ No newline at end of file diff --git a/assets/js/lunr/lunr-en.js b/assets/js/lunr/lunr-en.js new file mode 100644 index 00000000..d1400a76 --- /dev/null +++ b/assets/js/lunr/lunr-en.js @@ -0,0 +1,69 @@ +var idx = lunr(function () { + this.field('title') + this.field('excerpt') + this.field('categories') + this.field('tags') + this.ref('id') + + this.pipeline.remove(lunr.trimmer) + + for (var item in store) { + this.add({ + title: store[item].title, + excerpt: store[item].excerpt, + categories: store[item].categories, + tags: store[item].tags, + id: item + }) + } +}); + +$(document).ready(function() { + $('input#search').on('keyup', function () { + var resultdiv = $('#results'); + var query = $(this).val().toLowerCase(); + var result = + idx.query(function (q) { + query.split(lunr.tokenizer.separator).forEach(function (term) { + q.term(term, { boost: 100 }) + if(query.lastIndexOf(" ") != query.length-1){ + q.term(term, { usePipeline: false, wildcard: lunr.Query.wildcard.TRAILING, boost: 10 }) + } + if (term != ""){ + q.term(term, { usePipeline: false, editDistance: 1, boost: 1 }) + } + }) + }); + resultdiv.empty(); + resultdiv.prepend('

'+result.length+' Result(s) found

'); + for (var item in result) { + var ref = result[item].ref; + if(store[ref].teaser){ + var searchitem = + '
'+ + '
'+ + '

'+ + ''+store[ref].title+''+ + '

'+ + '
'+ + ''+ + '
'+ + '

'+store[ref].excerpt.split(" ").splice(0,20).join(" ")+'...

'+ + '
'+ + '
'; + } + else{ + var searchitem = + '
'+ + '
'+ + '

'+ + ''+store[ref].title+''+ + '

'+ + '

'+store[ref].excerpt.split(" ").splice(0,20).join(" ")+'...

'+ + '
'+ + '
'; + } + resultdiv.append(searchitem); + } + }); +}); diff --git a/assets/js/lunr/lunr-gr.js b/assets/js/lunr/lunr-gr.js new file mode 100644 index 00000000..e829362b --- /dev/null +++ b/assets/js/lunr/lunr-gr.js @@ -0,0 +1,522 @@ +step1list = new Array(); +step1list["ΦΑΓΙΑ"] = "ΦΑ"; +step1list["ΦΑΓΙΟΥ"] = "ΦΑ"; +step1list["ΦΑΓΙΩΝ"] = "ΦΑ"; +step1list["ΣΚΑΓΙΑ"] = "ΣΚΑ"; +step1list["ΣΚΑΓΙΟΥ"] = "ΣΚΑ"; +step1list["ΣΚΑΓΙΩΝ"] = "ΣΚΑ"; +step1list["ΟΛΟΓΙΟΥ"] = "ΟΛΟ"; +step1list["ΟΛΟΓΙΑ"] = "ΟΛΟ"; +step1list["ΟΛΟΓΙΩΝ"] = "ΟΛΟ"; +step1list["ΣΟΓΙΟΥ"] = "ΣΟ"; +step1list["ΣΟΓΙΑ"] = "ΣΟ"; +step1list["ΣΟΓΙΩΝ"] = "ΣΟ"; +step1list["ΤΑΤΟΓΙΑ"] = "ΤΑΤΟ"; +step1list["ΤΑΤΟΓΙΟΥ"] = "ΤΑΤΟ"; +step1list["ΤΑΤΟΓΙΩΝ"] = "ΤΑΤΟ"; +step1list["ΚΡΕΑΣ"] = "ΚΡΕ"; +step1list["ΚΡΕΑΤΟΣ"] = "ΚΡΕ"; +step1list["ΚΡΕΑΤΑ"] = "ΚΡΕ"; +step1list["ΚΡΕΑΤΩΝ"] = "ΚΡΕ"; +step1list["ΠΕΡΑΣ"] = "ΠΕΡ"; +step1list["ΠΕΡΑΤΟΣ"] = "ΠΕΡ"; +step1list["ΠΕΡΑΤΑ"] = "ΠΕΡ"; +step1list["ΠΕΡΑΤΩΝ"] = "ΠΕΡ"; +step1list["ΤΕΡΑΣ"] = "ΤΕΡ"; +step1list["ΤΕΡΑΤΟΣ"] = "ΤΕΡ"; +step1list["ΤΕΡΑΤΑ"] = "ΤΕΡ"; +step1list["ΤΕΡΑΤΩΝ"] = "ΤΕΡ"; +step1list["ΦΩΣ"] = "ΦΩ"; +step1list["ΦΩΤΟΣ"] = "ΦΩ"; +step1list["ΦΩΤΑ"] = "ΦΩ"; +step1list["ΦΩΤΩΝ"] = "ΦΩ"; +step1list["ΚΑΘΕΣΤΩΣ"] = "ΚΑΘΕΣΤ"; +step1list["ΚΑΘΕΣΤΩΤΟΣ"] = "ΚΑΘΕΣΤ"; +step1list["ΚΑΘΕΣΤΩΤΑ"] = "ΚΑΘΕΣΤ"; +step1list["ΚΑΘΕΣΤΩΤΩΝ"] = "ΚΑΘΕΣΤ"; +step1list["ΓΕΓΟΝΟΣ"] = "ΓΕΓΟΝ"; +step1list["ΓΕΓΟΝΟΤΟΣ"] = "ΓΕΓΟΝ"; +step1list["ΓΕΓΟΝΟΤΑ"] = "ΓΕΓΟΝ"; +step1list["ΓΕΓΟΝΟΤΩΝ"] = "ΓΕΓΟΝ"; + +v = "[ΑΕΗΙΟΥΩ]"; +v2 = "[ΑΕΗΙΟΩ]" + +function stemWord(w) { + var stem; + var suffix; + var firstch; + var origword = w; + test1 = new Boolean(true); + + if(w.length < 4) { + return w; + } + + var re; + var re2; + var re3; + var re4; + + re = /(.*)(ΦΑΓΙΑ|ΦΑΓΙΟΥ|ΦΑΓΙΩΝ|ΣΚΑΓΙΑ|ΣΚΑΓΙΟΥ|ΣΚΑΓΙΩΝ|ΟΛΟΓΙΟΥ|ΟΛΟΓΙΑ|ΟΛΟΓΙΩΝ|ΣΟΓΙΟΥ|ΣΟΓΙΑ|ΣΟΓΙΩΝ|ΤΑΤΟΓΙΑ|ΤΑΤΟΓΙΟΥ|ΤΑΤΟΓΙΩΝ|ΚΡΕΑΣ|ΚΡΕΑΤΟΣ|ΚΡΕΑΤΑ|ΚΡΕΑΤΩΝ|ΠΕΡΑΣ|ΠΕΡΑΤΟΣ|ΠΕΡΑΤΑ|ΠΕΡΑΤΩΝ|ΤΕΡΑΣ|ΤΕΡΑΤΟΣ|ΤΕΡΑΤΑ|ΤΕΡΑΤΩΝ|ΦΩΣ|ΦΩΤΟΣ|ΦΩΤΑ|ΦΩΤΩΝ|ΚΑΘΕΣΤΩΣ|ΚΑΘΕΣΤΩΤΟΣ|ΚΑΘΕΣΤΩΤΑ|ΚΑΘΕΣΤΩΤΩΝ|ΓΕΓΟΝΟΣ|ΓΕΓΟΝΟΤΟΣ|ΓΕΓΟΝΟΤΑ|ΓΕΓΟΝΟΤΩΝ)$/; + + if(re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + suffix = fp[2]; + w = stem + step1list[suffix]; + test1 = false; + } + + re = /^(.+?)(ΑΔΕΣ|ΑΔΩΝ)$/; + + if(re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + w = stem; + + reg1 = /(ΟΚ|ΜΑΜ|ΜΑΝ|ΜΠΑΜΠ|ΠΑΤΕΡ|ΓΙΑΓΙ|ΝΤΑΝΤ|ΚΥΡ|ΘΕΙ|ΠΕΘΕΡ)$/; + + if(!(reg1.test(w))) { + w = w + "ΑΔ"; + } + } + + re2 = /^(.+?)(ΕΔΕΣ|ΕΔΩΝ)$/; + + if(re2.test(w)) { + var fp = re2.exec(w); + stem = fp[1]; + w = stem; + + exept2 = /(ΟΠ|ΙΠ|ΕΜΠ|ΥΠ|ΓΗΠ|ΔΑΠ|ΚΡΑΣΠ|ΜΙΛ)$/; + + if(exept2.test(w)) { + w = w + "ΕΔ"; + } + } + + re3 = /^(.+?)(ΟΥΔΕΣ|ΟΥΔΩΝ)$/; + + if(re3.test(w)) { + var fp = re3.exec(w); + stem = fp[1]; + w = stem; + + exept3 = /(ΑΡΚ|ΚΑΛΙΑΚ|ΠΕΤΑΛ|ΛΙΧ|ΠΛΕΞ|ΣΚ|Σ|ΦΛ|ΦΡ|ΒΕΛ|ΛΟΥΛ|ΧΝ|ΣΠ|ΤΡΑΓ|ΦΕ)$/; + + if(exept3.test(w)) { + w = w + "ΟΥΔ"; + } + } + + re4 = /^(.+?)(ΕΩΣ|ΕΩΝ)$/; + + if(re4.test(w)) { + var fp = re4.exec(w); + stem = fp[1]; + w = stem; + test1 = false; + + exept4 = /^(Θ|Δ|ΕΛ|ΓΑΛ|Ν|Π|ΙΔ|ΠΑΡ)$/; + + if(exept4.test(w)) { + w = w + "Ε"; + } + } + + re = /^(.+?)(ΙΑ|ΙΟΥ|ΙΩΝ)$/; + + if(re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + w = stem; + re2 = new RegExp(v + "$"); + test1 = false; + + if(re2.test(w)) { + w = stem + "Ι"; + } + } + + re = /^(.+?)(ΙΚΑ|ΙΚΟ|ΙΚΟΥ|ΙΚΩΝ)$/; + + if(re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + w = stem; + test1 = false; + + re2 = new RegExp(v + "$"); + exept5 = /^(ΑΛ|ΑΔ|ΕΝΔ|ΑΜΑΝ|ΑΜΜΟΧΑΛ|ΗΘ|ΑΝΗΘ|ΑΝΤΙΔ|ΦΥΣ|ΒΡΩΜ|ΓΕΡ|ΕΞΩΔ|ΚΑΛΠ|ΚΑΛΛΙΝ|ΚΑΤΑΔ|ΜΟΥΛ|ΜΠΑΝ|ΜΠΑΓΙΑΤ|ΜΠΟΛ|ΜΠΟΣ|ΝΙΤ|ΞΙΚ|ΣΥΝΟΜΗΛ|ΠΕΤΣ|ΠΙΤΣ|ΠΙΚΑΝΤ|ΠΛΙΑΤΣ|ΠΟΣΤΕΛΝ|ΠΡΩΤΟΔ|ΣΕΡΤ|ΣΥΝΑΔ|ΤΣΑΜ|ΥΠΟΔ|ΦΙΛΟΝ|ΦΥΛΟΔ|ΧΑΣ)$/; + + if((exept5.test(w)) || (re2.test(w))) { + w = w + "ΙΚ"; + } + } + + re = /^(.+?)(ΑΜΕ)$/; + re2 = /^(.+?)(ΑΓΑΜΕ|ΗΣΑΜΕ|ΟΥΣΑΜΕ|ΗΚΑΜΕ|ΗΘΗΚΑΜΕ)$/; + if(w == "ΑΓΑΜΕ") { + w = "ΑΓΑΜ"; + } + + if(re2.test(w)) { + var fp = re2.exec(w); + stem = fp[1]; + w = stem; + test1 = false; + } + + if(re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + w = stem; + test1 = false; + + exept6 = /^(ΑΝΑΠ|ΑΠΟΘ|ΑΠΟΚ|ΑΠΟΣΤ|ΒΟΥΒ|ΞΕΘ|ΟΥΛ|ΠΕΘ|ΠΙΚΡ|ΠΟΤ|ΣΙΧ|Χ)$/; + + if(exept6.test(w)) { + w = w + "ΑΜ"; + } + } + + re2 = /^(.+?)(ΑΝΕ)$/; + re3 = /^(.+?)(ΑΓΑΝΕ|ΗΣΑΝΕ|ΟΥΣΑΝΕ|ΙΟΝΤΑΝΕ|ΙΟΤΑΝΕ|ΙΟΥΝΤΑΝΕ|ΟΝΤΑΝΕ|ΟΤΑΝΕ|ΟΥΝΤΑΝΕ|ΗΚΑΝΕ|ΗΘΗΚΑΝΕ)$/; + + if(re3.test(w)) { + var fp = re3.exec(w); + stem = fp[1]; + w = stem; + test1 = false; + + re3 = /^(ΤΡ|ΤΣ)$/; + + if(re3.test(w)) { + w = w + "ΑΓΑΝ"; + } + } + + if(re2.test(w)) { + var fp = re2.exec(w); + stem = fp[1]; + w = stem; + test1 = false; + + re2 = new RegExp(v2 + "$"); + exept7 = /^(ΒΕΤΕΡ|ΒΟΥΛΚ|ΒΡΑΧΜ|Γ|ΔΡΑΔΟΥΜ|Θ|ΚΑΛΠΟΥΖ|ΚΑΣΤΕΛ|ΚΟΡΜΟΡ|ΛΑΟΠΛ|ΜΩΑΜΕΘ|Μ|ΜΟΥΣΟΥΛΜ|Ν|ΟΥΛ|Π|ΠΕΛΕΚ|ΠΛ|ΠΟΛΙΣ|ΠΟΡΤΟΛ|ΣΑΡΑΚΑΤΣ|ΣΟΥΛΤ|ΤΣΑΡΛΑΤ|ΟΡΦ|ΤΣΙΓΓ|ΤΣΟΠ|ΦΩΤΟΣΤΕΦ|Χ|ΨΥΧΟΠΛ|ΑΓ|ΟΡΦ|ΓΑΛ|ΓΕΡ|ΔΕΚ|ΔΙΠΛ|ΑΜΕΡΙΚΑΝ|ΟΥΡ|ΠΙΘ|ΠΟΥΡΙΤ|Σ|ΖΩΝΤ|ΙΚ|ΚΑΣΤ|ΚΟΠ|ΛΙΧ|ΛΟΥΘΗΡ|ΜΑΙΝΤ|ΜΕΛ|ΣΙΓ|ΣΠ|ΣΤΕΓ|ΤΡΑΓ|ΤΣΑΓ|Φ|ΕΡ|ΑΔΑΠ|ΑΘΙΓΓ|ΑΜΗΧ|ΑΝΙΚ|ΑΝΟΡΓ|ΑΠΗΓ|ΑΠΙΘ|ΑΤΣΙΓΓ|ΒΑΣ|ΒΑΣΚ|ΒΑΘΥΓΑΛ|ΒΙΟΜΗΧ|ΒΡΑΧΥΚ|ΔΙΑΤ|ΔΙΑΦ|ΕΝΟΡΓ|ΘΥΣ|ΚΑΠΝΟΒΙΟΜΗΧ|ΚΑΤΑΓΑΛ|ΚΛΙΒ|ΚΟΙΛΑΡΦ|ΛΙΒ|ΜΕΓΛΟΒΙΟΜΗΧ|ΜΙΚΡΟΒΙΟΜΗΧ|ΝΤΑΒ|ΞΗΡΟΚΛΙΒ|ΟΛΙΓΟΔΑΜ|ΟΛΟΓΑΛ|ΠΕΝΤΑΡΦ|ΠΕΡΗΦ|ΠΕΡΙΤΡ|ΠΛΑΤ|ΠΟΛΥΔΑΠ|ΠΟΛΥΜΗΧ|ΣΤΕΦ|ΤΑΒ|ΤΕΤ|ΥΠΕΡΗΦ|ΥΠΟΚΟΠ|ΧΑΜΗΛΟΔΑΠ|ΨΗΛΟΤΑΒ)$/; + + if((re2.test(w)) || (exept7.test(w))) { + w = w + "ΑΝ"; + } + } + + re3 = /^(.+?)(ΕΤΕ)$/; + re4 = /^(.+?)(ΗΣΕΤΕ)$/; + + if(re4.test(w)) { + var fp = re4.exec(w); + stem = fp[1]; + w = stem; + test1 = false; + } + + if(re3.test(w)) { + var fp = re3.exec(w); + stem = fp[1]; + w = stem; + test1 = false; + + re3 = new RegExp(v2 + "$"); + exept8 = /(ΟΔ|ΑΙΡ|ΦΟΡ|ΤΑΘ|ΔΙΑΘ|ΣΧ|ΕΝΔ|ΕΥΡ|ΤΙΘ|ΥΠΕΡΘ|ΡΑΘ|ΕΝΘ|ΡΟΘ|ΣΘ|ΠΥΡ|ΑΙΝ|ΣΥΝΔ|ΣΥΝ|ΣΥΝΘ|ΧΩΡ|ΠΟΝ|ΒΡ|ΚΑΘ|ΕΥΘ|ΕΚΘ|ΝΕΤ|ΡΟΝ|ΑΡΚ|ΒΑΡ|ΒΟΛ|ΩΦΕΛ)$/; + exept9 = /^(ΑΒΑΡ|ΒΕΝ|ΕΝΑΡ|ΑΒΡ|ΑΔ|ΑΘ|ΑΝ|ΑΠΛ|ΒΑΡΟΝ|ΝΤΡ|ΣΚ|ΚΟΠ|ΜΠΟΡ|ΝΙΦ|ΠΑΓ|ΠΑΡΑΚΑΛ|ΣΕΡΠ|ΣΚΕΛ|ΣΥΡΦ|ΤΟΚ|Υ|Δ|ΕΜ|ΘΑΡΡ|Θ)$/; + + if((re3.test(w)) || (exept8.test(w)) || (exept9.test(w))) { + w = w + "ΕΤ"; + } + } + + re = /^(.+?)(ΟΝΤΑΣ|ΩΝΤΑΣ)$/; + + if(re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + w = stem; + test1 = false; + + exept10 = /^(ΑΡΧ)$/; + exept11 = /(ΚΡΕ)$/; + if(exept10.test(w)) { + w = w + "ΟΝΤ"; + } + if(exept11.test(w)) { + w = w + "ΩΝΤ"; + } + } + + re = /^(.+?)(ΟΜΑΣΤΕ|ΙΟΜΑΣΤΕ)$/; + + if(re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + w = stem; + test1 = false; + + exept11 = /^(ΟΝ)$/; + + if(exept11.test(w)) { + w = w + "ΟΜΑΣΤ"; + } + } + + re = /^(.+?)(ΕΣΤΕ)$/; + re2 = /^(.+?)(ΙΕΣΤΕ)$/; + + if(re2.test(w)) { + var fp = re2.exec(w); + stem = fp[1]; + w = stem; + test1 = false; + + re2 = /^(Π|ΑΠ|ΣΥΜΠ|ΑΣΥΜΠ|ΑΚΑΤΑΠ|ΑΜΕΤΑΜΦ)$/; + + if(re2.test(w)) { + w = w + "ΙΕΣΤ"; + } + } + + if(re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + w = stem; + test1 = false; + + exept12 = /^(ΑΛ|ΑΡ|ΕΚΤΕΛ|Ζ|Μ|Ξ|ΠΑΡΑΚΑΛ|ΑΡ|ΠΡΟ|ΝΙΣ)$/; + + if(exept12.test(w)) { + w = w + "ΕΣΤ"; + } + } + + re = /^(.+?)(ΗΚΑ|ΗΚΕΣ|ΗΚΕ)$/; + re2 = /^(.+?)(ΗΘΗΚΑ|ΗΘΗΚΕΣ|ΗΘΗΚΕ)$/; + + if(re2.test(w)) { + var fp = re2.exec(w); + stem = fp[1]; + w = stem; + test1 = false; + } + + if(re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + w = stem; + test1 = false; + + exept13 = /(ΣΚΩΛ|ΣΚΟΥΛ|ΝΑΡΘ|ΣΦ|ΟΘ|ΠΙΘ)$/; + exept14 = /^(ΔΙΑΘ|Θ|ΠΑΡΑΚΑΤΑΘ|ΠΡΟΣΘ|ΣΥΝΘ|)$/; + + if((exept13.test(w)) || (exept14.test(w))) { + w = w + "ΗΚ"; + } + } + + re = /^(.+?)(ΟΥΣΑ|ΟΥΣΕΣ|ΟΥΣΕ)$/; + + if(re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + w = stem; + test1 = false; + + exept15 = /^(ΦΑΡΜΑΚ|ΧΑΔ|ΑΓΚ|ΑΝΑΡΡ|ΒΡΟΜ|ΕΚΛΙΠ|ΛΑΜΠΙΔ|ΛΕΧ|Μ|ΠΑΤ|Ρ|Λ|ΜΕΔ|ΜΕΣΑΖ|ΥΠΟΤΕΙΝ|ΑΜ|ΑΙΘ|ΑΝΗΚ|ΔΕΣΠΟΖ|ΕΝΔΙΑΦΕΡ|ΔΕ|ΔΕΥΤΕΡΕΥ|ΚΑΘΑΡΕΥ|ΠΛΕ|ΤΣΑ)$/; + exept16 = /(ΠΟΔΑΡ|ΒΛΕΠ|ΠΑΝΤΑΧ|ΦΡΥΔ|ΜΑΝΤΙΛ|ΜΑΛΛ|ΚΥΜΑΤ|ΛΑΧ|ΛΗΓ|ΦΑΓ|ΟΜ|ΠΡΩΤ)$/; + + if((exept15.test(w)) || (exept16.test(w))) { + w = w + "ΟΥΣ"; + } + } + + re = /^(.+?)(ΑΓΑ|ΑΓΕΣ|ΑΓΕ)$/; + + if(re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + w = stem; + test1 = false; + + exept17 = /^(ΨΟΦ|ΝΑΥΛΟΧ)$/; + exept20 = /(ΚΟΛΛ)$/; + exept18 = /^(ΑΒΑΣΤ|ΠΟΛΥΦ|ΑΔΗΦ|ΠΑΜΦ|Ρ|ΑΣΠ|ΑΦ|ΑΜΑΛ|ΑΜΑΛΛΙ|ΑΝΥΣΤ|ΑΠΕΡ|ΑΣΠΑΡ|ΑΧΑΡ|ΔΕΡΒΕΝ|ΔΡΟΣΟΠ|ΞΕΦ|ΝΕΟΠ|ΝΟΜΟΤ|ΟΛΟΠ|ΟΜΟΤ|ΠΡΟΣΤ|ΠΡΟΣΩΠΟΠ|ΣΥΜΠ|ΣΥΝΤ|Τ|ΥΠΟΤ|ΧΑΡ|ΑΕΙΠ|ΑΙΜΟΣΤ|ΑΝΥΠ|ΑΠΟΤ|ΑΡΤΙΠ|ΔΙΑΤ|ΕΝ|ΕΠΙΤ|ΚΡΟΚΑΛΟΠ|ΣΙΔΗΡΟΠ|Λ|ΝΑΥ|ΟΥΛΑΜ|ΟΥΡ|Π|ΤΡ|Μ)$/; + exept19 = /(ΟΦ|ΠΕΛ|ΧΟΡΤ|ΛΛ|ΣΦ|ΡΠ|ΦΡ|ΠΡ|ΛΟΧ|ΣΜΗΝ)$/; + + if(((exept18.test(w)) || (exept19.test(w))) && !((exept17.test(w)) || (exept20.test(w)))) { + w = w + "ΑΓ"; + } + } + + re = /^(.+?)(ΗΣΕ|ΗΣΟΥ|ΗΣΑ)$/; + + if(re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + w = stem; + test1 = false; + + exept21 = /^(Ν|ΧΕΡΣΟΝ|ΔΩΔΕΚΑΝ|ΕΡΗΜΟΝ|ΜΕΓΑΛΟΝ|ΕΠΤΑΝ)$/; + + if(exept21.test(w)) { + w = w + "ΗΣ"; + } + } + + re = /^(.+?)(ΗΣΤΕ)$/; + + if(re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + w = stem; + test1 = false; + + exept22 = /^(ΑΣΒ|ΣΒ|ΑΧΡ|ΧΡ|ΑΠΛ|ΑΕΙΜΝ|ΔΥΣΧΡ|ΕΥΧΡ|ΚΟΙΝΟΧΡ|ΠΑΛΙΜΨ)$/; + + if(exept22.test(w)) { + w = w + "ΗΣΤ"; + } + } + + re = /^(.+?)(ΟΥΝΕ|ΗΣΟΥΝΕ|ΗΘΟΥΝΕ)$/; + + if(re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + w = stem; + test1 = false; + + exept23 = /^(Ν|Ρ|ΣΠΙ|ΣΤΡΑΒΟΜΟΥΤΣ|ΚΑΚΟΜΟΥΤΣ|ΕΞΩΝ)$/; + + if(exept23.test(w)) { + w = w + "ΟΥΝ"; + } + } + + re = /^(.+?)(ΟΥΜΕ|ΗΣΟΥΜΕ|ΗΘΟΥΜΕ)$/; + + if(re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + w = stem; + test1 = false; + + exept24 = /^(ΠΑΡΑΣΟΥΣ|Φ|Χ|ΩΡΙΟΠΛ|ΑΖ|ΑΛΛΟΣΟΥΣ|ΑΣΟΥΣ)$/; + + if(exept24.test(w)) { + w = w + "ΟΥΜ"; + } + } + + re = /^(.+?)(ΜΑΤΑ|ΜΑΤΩΝ|ΜΑΤΟΣ)$/; + re2 = /^(.+?)(Α|ΑΓΑΤΕ|ΑΓΑΝ|ΑΕΙ|ΑΜΑΙ|ΑΝ|ΑΣ|ΑΣΑΙ|ΑΤΑΙ|ΑΩ|Ε|ΕΙ|ΕΙΣ|ΕΙΤΕ|ΕΣΑΙ|ΕΣ|ΕΤΑΙ|Ι|ΙΕΜΑΙ|ΙΕΜΑΣΤΕ|ΙΕΤΑΙ|ΙΕΣΑΙ|ΙΕΣΑΣΤΕ|ΙΟΜΑΣΤΑΝ|ΙΟΜΟΥΝ|ΙΟΜΟΥΝΑ|ΙΟΝΤΑΝ|ΙΟΝΤΟΥΣΑΝ|ΙΟΣΑΣΤΑΝ|ΙΟΣΑΣΤΕ|ΙΟΣΟΥΝ|ΙΟΣΟΥΝΑ|ΙΟΤΑΝ|ΙΟΥΜΑ|ΙΟΥΜΑΣΤΕ|ΙΟΥΝΤΑΙ|ΙΟΥΝΤΑΝ|Η|ΗΔΕΣ|ΗΔΩΝ|ΗΘΕΙ|ΗΘΕΙΣ|ΗΘΕΙΤΕ|ΗΘΗΚΑΤΕ|ΗΘΗΚΑΝ|ΗΘΟΥΝ|ΗΘΩ|ΗΚΑΤΕ|ΗΚΑΝ|ΗΣ|ΗΣΑΝ|ΗΣΑΤΕ|ΗΣΕΙ|ΗΣΕΣ|ΗΣΟΥΝ|ΗΣΩ|Ο|ΟΙ|ΟΜΑΙ|ΟΜΑΣΤΑΝ|ΟΜΟΥΝ|ΟΜΟΥΝΑ|ΟΝΤΑΙ|ΟΝΤΑΝ|ΟΝΤΟΥΣΑΝ|ΟΣ|ΟΣΑΣΤΑΝ|ΟΣΑΣΤΕ|ΟΣΟΥΝ|ΟΣΟΥΝΑ|ΟΤΑΝ|ΟΥ|ΟΥΜΑΙ|ΟΥΜΑΣΤΕ|ΟΥΝ|ΟΥΝΤΑΙ|ΟΥΝΤΑΝ|ΟΥΣ|ΟΥΣΑΝ|ΟΥΣΑΤΕ|Υ|ΥΣ|Ω|ΩΝ)$/; + + if(re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + w = stem + "ΜΑ"; + } + + if((re2.test(w)) && (test1)) { + var fp = re2.exec(w); + stem = fp[1]; + w = stem; + + } + + re = /^(.+?)(ΕΣΤΕΡ|ΕΣΤΑΤ|ΟΤΕΡ|ΟΤΑΤ|ΥΤΕΡ|ΥΤΑΤ|ΩΤΕΡ|ΩΤΑΤ)$/; + + if(re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + w = stem; + } + + return w; +}; + +var greekStemmer = function (token) { + return token.update(function (word) { + return stemWord(word); + }) +} + +var idx = lunr(function () { + this.field('title') + this.field('excerpt') + this.field('categories') + this.field('tags') + this.ref('id') + + this.pipeline.remove(lunr.trimmer) + this.pipeline.add(greekStemmer) + this.pipeline.remove(lunr.stemmer) + + for (var item in store) { + this.add({ + title: store[item].title, + excerpt: store[item].excerpt, + categories: store[item].categories, + tags: store[item].tags, + id: item + }) + } +}); + +$(document).ready(function() { + $('input#search').on('keyup', function () { + var resultdiv = $('#results'); + var query = $(this).val().toLowerCase(); + var result = + idx.query(function (q) { + query.split(lunr.tokenizer.separator).forEach(function (term) { + q.term(term, { boost: 100 }) + if(query.lastIndexOf(" ") != query.length-1){ + q.term(term, { usePipeline: false, wildcard: lunr.Query.wildcard.TRAILING, boost: 10 }) + } + if (term != ""){ + q.term(term, { usePipeline: false, editDistance: 1, boost: 1 }) + } + }) + }); + resultdiv.empty(); + resultdiv.prepend('

'+result.length+' Result(s) found

'); + for (var item in result) { + var ref = result[item].ref; + if(store[ref].teaser){ + var searchitem = + '
'+ + '
'+ + '

'+ + ''+store[ref].title+''+ + '

'+ + '
'+ + ''+ + '
'+ + '

'+store[ref].excerpt.split(" ").splice(0,20).join(" ")+'...

'+ + '
'+ + '
'; + } + else{ + var searchitem = + '
'+ + '
'+ + '

'+ + ''+store[ref].title+''+ + '

'+ + '

'+store[ref].excerpt.split(" ").splice(0,20).join(" ")+'...

'+ + '
'+ + '
'; + } + resultdiv.append(searchitem); + } + }); +}); diff --git a/assets/js/lunr/lunr-store.js b/assets/js/lunr/lunr-store.js new file mode 100644 index 00000000..a14fb455 --- /dev/null +++ b/assets/js/lunr/lunr-store.js @@ -0,0 +1,499 @@ +var store = [{ + "title": "Setting up Github Pages With custom domain over HTTPS", + "excerpt":"With Github pages, we can create our blogs in our own domain over HTTPS completely free. Of course you should pay for your domain name at the Registrar. Create Github pages on Github.com On Github create a repo with name : githubUserName.github.io Push a file index.html to branch master or...","categories": [], + "tags": ["github","github-pages","web"], + "url": "/2018/05/setting-up-github-pages-with-custom-domain-over-https.html", + "teaser": null + },{ + "title": "Setting Up Powershell gallery And Nuget gallery", + "excerpt":"As like pypi for Python, npm for Node.js, we also have Powershell Gallery for Powershell to add some extra Powershell modules, and Nuget Gallery for Powershell to add some extra executables. Powershell version All commands provided here are tested on Windows 10 with Windows Powershell v5.1. Configure proxy in Powershell...","categories": [], + "tags": ["nuget","powershell","powershell-gallery","packaging","proxy"], + "url": "/2018/05/setting-up-powershell-gallery-and-nuget-gallery-for-powershell.html", + "teaser": null + },{ + "title": "Powershell stop-parsing (`--%`)", + "excerpt":"A friend of mine told me about the Powershell stop-parsing (--%) last year, he said the stop-parsing tells powershell to treat the remaining characters in the line as a literal, but I’d never known where to use it. Recently working on git ssh made it happened. The use case is...","categories": [], + "tags": ["powershell","parsing","ssh"], + "url": "/2018/05/powershell-stop-parsing.html", + "teaser": null + },{ + "title": "Setting up Jekyll with Minimal Mistakes theme on Windows", + "excerpt":"Do you want to preview Jekyll blog locally on Windows before publishing it to Internet? Many online tutorials about setting up Jekyll on Windows are out of date, I will show you in this post the 2018 version and with the Minimal Mistakes theme. Some online tutorials https://jekyllrb.com/docs/home/ https://help.github.com/articles/using-jekyll-as-a-static-site-generator-with-github-pages/ https://mmistakes.github.io/minimal-mistakes/docs/quick-start-guide/...","categories": [], + "tags": ["jekyll","web","windows","ruby"], + "url": "/2018/05/setting-up-jekyll-with-minimal-mistakes-theme-on-windows.html", + "teaser": null + },{ + "title": "Using Readline In Python REPL On Windows With PyReadline and PtPython", + "excerpt":"As an ex-sysadmin, I’m in love with the Readline. In Powershell, we have its variation PSReadline. In Python REPL on Windows OS, I’ll show you the PyReadline and the PtPython. PyReadline When you search on Internet, you will find many tutorials telling you to install a Python module called readline,...","categories": [], + "tags": ["python","repl","readline"], + "url": "/2018/05/using-readline-in-python-repl-on-windows.html", + "teaser": null + },{ + "title": "Select-ColorString : A Unix's grep-like Powershell Cmdlet Based On Select-String With Color", + "excerpt":"Update 2019-12-28 Powershell 7 Select-String default highlighting Update 2019-12-28: It’s very exciting to see that since Powershell 7, the Select-String has highlighting (internal name: emphasis) by default. It uses similar way (index, length) to find and highlight the matches. The emphasis uses negative colors based on your PowerShell background and...","categories": [], + "tags": ["powershell","string","regex"], + "url": "/2018/05/grep-like-powershell-colorful-select-string.html", + "teaser": null + },{ + "title": "Converting Python json dict list to csv file in 2 lines of code by pandas", + "excerpt":"Converting a Powershell object list to a csv file is quiet easy, for example : 6.0.2> gps | select name,id,path | ConvertTo-Csv | Out-File .\\gps.csv ; ii .\\gps.csv I’ll show you in this post the Python way to convert a dict list to a csv file. During my work, I...","categories": [], + "tags": ["python","json","csv"], + "url": "/2018/06/converting-python-json-list-to-csv-in-2-lines-of-code-by-pandas.html", + "teaser": null + },{ + "title": "Import Python module with sys.path variable when without `__init__` file", + "excerpt":"We’re familiar to put a python file inside a folder, and create a __init__.py file under the same folder, then we can easily import the file by import the folder, as the folder is transformed to a python module. But if we don’t have the __init__.py, how can we import...","categories": [], + "tags": ["python","module"], + "url": "/2018/06/import-python-module-with-sys-path-when-without-init-file.html", + "teaser": null + },{ + "title": "Git untrack submodule from git status", + "excerpt":"When we have submodules in a git repo, even if we add the submodules’ folders into the .gitignore file, these submodules folders are still tracked from the git status output. Method 1: .gitmodules file There’re several methods to ignore it, one of them is in .gitmodules file, add following line...","categories": [], + "tags": ["git","submodule"], + "url": "/2018/06/git-untrack-submodule-from-git-status.html", + "teaser": null + },{ + "title": "Install Python on Windows with Powershell without administrator privileges", + "excerpt":"As a Windows DevOps, I often use Powershell and Python, Powershell is installed by Windows out of box, but this is not for Python. And for my working environment, I don’t have the administrator privileges on some servers. I will show you in this post how to rapidly deploy Python...","categories": [], + "tags": ["python","powershell","nuget","package"], + "url": "/2018/06/install-python-on-windows-with-powershell-without-administrator-privileges.html", + "teaser": null + },{ + "title": "Use pyVmomi EventHistoryCollector to get all the vCenter events", + "excerpt":"pyVmomi eventManager’s QueryEvents() method returns by default only the last 1000 events occurred on the vCenter. I will show you how to use another method CreateCollectorForEvents() to create an EventHistoryCollector object and then we use this object to collect all the events in a given time range by using its...","categories": [], + "tags": ["python","pyvmomi","vmware"], + "url": "/2018/07/use-pyvmomi-EventHistoryCollector-to-get-all-the-vcenter-events.html", + "teaser": null + },{ + "title": "Use python tabulate module to create tables", + "excerpt":"If you want to create some tables from a python list, you can use the tabulate module, it can generate the table easily in text mode and in many formats, than you can past it into markdown, wiki files or add the print version to your python CLI in order...","categories": [], + "tags": ["python","markdown","format"], + "url": "/2018/07/use-python-tabulate-module-to-create-tables.html", + "teaser": null + },{ + "title": "Convert markdown or rst to Atlassian Confluance documentation format", + "excerpt":"A recent working experience needed me to write doc on Atlassian Confluance documentation product. I will show you how to convert your markdown doc to Confluance version. Convert markdown or rst to Confluance Confluance’s web doc editor is very powerfull, but I a markdown guy, I write everything in markdown...","categories": [], + "tags": ["markdown","format"], + "url": "/2018/07/convert-markdown-or-rst-to-atlassian-confluance-documentation-format.html", + "teaser": null + },{ + "title": "Use Powershell to manage Windows Scheduled Task", + "excerpt":"A recent project made me to use the Windows scheduled task to execute periodically some python scripts. After the project, I find using Powershell to manage the Windows scheduled task is not so straightforward, that’s why I opened this post to share my experience on some common usage, and hope...","categories": [], + "tags": ["scheduled-task","powershell"], + "url": "/2018/09/windows-scheduled-task-by-powershell.html", + "teaser": null + },{ + "title": "Install Gitlab-CE in Docker on Ubuntu", + "excerpt":"Gitlab-CE (Community Edition) is a completely free and powerful web-based Git-repository manager with wiki, issue-tracking and CI/CD pipeline features, using an open-source license, developed by GitLab Inc. There’re already many much better docs on the net, I’ve never worked with Docker and Linux before, so I wrote this post to...","categories": [], + "tags": ["gitlab","cicd","docker","ubuntu"], + "url": "/2018/09/install-gitlab-ce-in-docker-on-ubuntu.html", + "teaser": null + },{ + "title": "Setup HTTPS for Gitlab", + "excerpt":"Gitlab-CE default installation goes with HTTPS disable. We need to generate a SSL certificate, and bind it to the HTTPS of Gitlab-CE. Some docs on the Internet Gitlab omnibus SSL settings Gitlab omnibus enable HTTPS Generate a self-signed certificate with openssl How to install and configure Gitlab on Ubuntu 16.04...","categories": [], + "tags": ["gitlab","cicd","certificate","openssl","ubuntu"], + "url": "/2018/09/setup-https-for-gitlab.html", + "teaser": null + },{ + "title": "Install Gitlab Runner on Windows by Powershell PsRemoting", + "excerpt":"Gitlab runner can be installed on Windows OS. For people like me who is more familiar with Windows, we would like to use Windows as a Gitlab runner. This post will give you a simplified procedure (winrm PsRemoting full command line) about its installation with some tips and tricks that...","categories": [], + "tags": ["gitlab","cicd","powershell"], + "url": "/2018/09/install-gitlab-runner-on-windows-by-powershell-psremoting.html", + "teaser": null + },{ + "title": "Backup and restore Gitlab in docker", + "excerpt":"Gitlab hosts everything about the code including the docs and the pipeline data, etc. It’s crucial to back it up. You can also use restore to migrate the Gitlab to another server. This post will show you how to backup and restore the Gitlab-CE docker version. Some docs on the...","categories": [], + "tags": ["gitlab","cicd","docker","backup","ubuntu"], + "url": "/2018/09/backup-and-restore-gitlab-in-docker.html", + "teaser": null + },{ + "title": "Terminate Powershell script or session", + "excerpt":"I always asked myself how to terminate a Powershell script or session, each time I needed to do some tests by myself and also searched on Google. But I could never remember it. So I would like to take this post to note it down, the next time I need...","categories": [], + "tags": ["powershell"], + "url": "/2018/09/terminate-powershell-script-or-session.html", + "teaser": null + },{ + "title": "Update Gitlab in docker", + "excerpt":"Gitlab has several methods to update to newer version depending on the type of the original installation and the Gitlab version. This post will show you the way for docker version of Gitlab, which is the simplest among others. Some docs on the Internet This post will follow the official...","categories": [], + "tags": ["gitlab","cicd","docker","update","ubuntu"], + "url": "/2018/10/update-gitlab-in-docker.html", + "teaser": null + },{ + "title": "Migrate Gitlab in docker", + "excerpt":"This post will walk you through the steps to migrate Gitlab from one docker container to another. The steps need you to know how to install a new Gitlab container and how to backup and restore Gitlab container, because the migration is just a restoration of a backup to another...","categories": [], + "tags": ["gitlab","cicd","docker","migration","ubuntu"], + "url": "/2018/10/migrate-gitlab-in-docker.html", + "teaser": null + },{ + "title": "Using Gitlab integrated CICD for Python project on Windows", + "excerpt":"Gitlab ships with its own free CICD which works pretty well. This post will give you an example of the CICD file .gitlab-ci.yml for a Python project running on Gitlab Windows runner. Some docs on the Internet Official GitLab Continuous Integration (GitLab CI/CD) Official Configuration of your jobs with .gitlab-ci.yml...","categories": [], + "tags": ["gitlab","cicd","python","powershell"], + "url": "/2018/10/using-gitlab-integrated-cicd-for-python-project-on-windows.html", + "teaser": null + },{ + "title": "Setting Pwsh Invoke-WebRequest Proxy", + "excerpt":"Different than Windows Powershell, Powershell Core doesn’t use the system proxy setting on Windows. This post will show you an one-line command to set Powershell Core web cmdlets proxy. My office working environment is behind an Internet proxy, and I use Scoop to install many dev tools on my Windows...","categories": [], + "tags": ["powershell","proxy"], + "url": "/2018/11/setting-pwsh-invoke-webrequest-proxy.html", + "teaser": null + },{ + "title": "Creating Multiple Redis Instance Services On Windows", + "excerpt":"Even Salvatore Sanfilippo (creator of Redis) thinks it’s a bad idea to use multiple DBs in Redis. So we can install as many Redis instances as the number of DBs we need. This post will show you how to create multiple Redis instance as Windows service on the same Windows...","categories": [], + "tags": ["powershell","service","redis"], + "url": "/2018/11/creating-multiple-redis-instance-services-on-windows.html", + "teaser": null + },{ + "title": "Creating Custom Python Request Auth Class", + "excerpt":"When you need to use a complicated, or a non-standard API authentication method, or your dev and prd environments don’t use the same API authentication method, it might be better to create a Python requests auth method to reduce your work. Create the class MyAuth Suppose you have an API...","categories": [], + "tags": ["python","requests"], + "url": "/2019/04/creating-custom-python-request-auth-class.html", + "teaser": null + },{ + "title": "Using Python SQLAlchemy session in multithreading", + "excerpt":"SQLAlchemy DB session is not thread safe. In this post, I will show you 2 ways to use it in a multithreading context. Way 1 - Using contextmanager to create a session per thread Below is an example given by the official doc to show how to use the contextmanager...","categories": [], + "tags": ["python","sqlalchemy","multithreading"], + "url": "/2019/05/using-python-sqlalchemy-session-in-multithreading.html", + "teaser": null + },{ + "title": "Git Cheat Sheet", + "excerpt":"This is not a complete Git cheat sheet for everyone, this is just a personal cheat sheet for some often forgotten git commands. Alias User level alias Edit ~/.gitconfig git config --global alias.st status git config --global alias.lga log --graph --decorate --oneline --all git config --global alias.co checkout git config...","categories": [], + "tags": ["git"], + "url": "/2019/06/git-cheat-sheet.html", + "teaser": null + },{ + "title": "Filtering In Pandas Dataframe", + "excerpt":"Pandas dataframe is like a small database, we can use it to inject some data and do some in-memory filtering without any external SQL. This post is much like a summary of this StackOverflow thread. Building dataframe In [1]: import pandas as pd ...: import numpy as np ...: df...","categories": [], + "tags": ["python","pandas","filtering"], + "url": "/2019/07/filtering-pandas-dataframe.html", + "teaser": null + },{ + "title": "Troubleshooting Python Twine Cannot Upload Package On Windows", + "excerpt":"Python has several tools to upload packages to PyPi or some private Artifactory locations. The mostly used one should be twine. Although twine is not a Python originate tool, but it’s officially recommended by Python.org. Building the package Just a quick callback on how to build the pacakge. We need...","categories": [], + "tags": ["python","packaging"], + "url": "/2019/07/troubleshooting-python-twine-cannot-upload-package-on-windows.html", + "teaser": null + },{ + "title": "A fast way to check TCP port in Powershell", + "excerpt":"The Test-NetConnection cmdlet is great and verbose but too slow if the remote port to check is not opened. This is due to its timeout setting and cannot be modified. In this port, I will show you a custom function that leverages the power of System.Net.Sockets.TcpClient to accelerate the port...","categories": [], + "tags": ["powershell","network"], + "url": "/2019/09/fast-tcp-port-check-in-powershell.html", + "teaser": null + },{ + "title": "SQLAlchemy mixin in method", + "excerpt":"If I’m not wrong, the SQLAlchemy official doc provides some examples to explain how to share a set of common columns, some common table options, or other mapped properties, across many classes. But I cannot find how to share common methods (e.g. your customized to_dict() method). This post will just...","categories": [], + "tags": ["python","sqlalchemy"], + "url": "/2019/09/sqlalchemy-mixin-in-method.html", + "teaser": null + },{ + "title": "Install Python3 on Ubuntu", + "excerpt":"Most of tutorials on the Internet about installing Python3.6 on Ubuntu are by using 3rd party PPA repositories. If for any reason, you cannot use them, hereunder a quick tutorial for installing it from the Python official source, you should in advance download the source to the Ubuntu. Installing Python3.6...","categories": [], + "tags": ["python","ubuntu"], + "url": "/2019/10/installing-python3-on-ubuntu.html", + "teaser": null + },{ + "title": "Elastic Painless Scripted Field On Null/Missing Value", + "excerpt":"This post shows how to use elastic painless language in scripted field to work on documents’ keys which might not exist in some documents. Parsing analyzed field in Painless Suppose we have following 2 documents in elastic: [{ \"kye1\": \"value1\", \"key2\": { \"key22\": \"value22\" } }, { \"key1\": \"valuex\" }]...","categories": [], + "tags": ["elastic"], + "url": "/2019/12/elastic-painless-scripted-field-on-null-or-mssing-value.html", + "teaser": null + },{ + "title": "Using Powershell To Retrieve Latest Package Url From Github Releases", + "excerpt":"Github can host package releases, I will show you how to use Powershell to retrieve the latest release download url. Download latest Powershell release for Windows x64 zip version The goal of this demo is to convert the static url: https://github.com/PowerShell/PowerShell/releases/latest to the real download url (latest version on 2019/12/29):...","categories": [], + "tags": ["powershell"], + "url": "/2019/12/Using-Powershell-to-retrieve-latest-package-url-from-github-releases.html", + "teaser": null + },{ + "title": "Using Scoop On Windows", + "excerpt":"I’ve been using Scoop for setting up my personal and professional Windows development desktops since nearly 2 years. For me, it’s much more useful than another famous Windows package management tool Chocolatey, because with Scoop, everything is run & installed without any administrator privileges. This is very important in an...","categories": [], + "tags": ["scoop","powershell"], + "url": "/2019/12/Using-Scoop-On-Windows.html", + "teaser": null + },{ + "title": "Setting up WSL", + "excerpt":"Cleaning up manually the WSL instance For any reason you failed to install WSL from Microsoft store, you might need to clean up manually the downloaded WSL instance, the default location is at: $env:LOCALAPPDATA\\Packages For example, Ubuntu v1804 is at: C:\\Users\\xiang\\AppData\\Local\\Packages\\CanonicalGroupLimited.UbuntuonWindows_79rhkp1fndgsc\\ Just delete the folder then reinstall it from Microsoft...","categories": [], + "tags": ["wsl","linux"], + "url": "/2020/02/setting-up-wsl.html", + "teaser": null + },{ + "title": "Flattening nested dict in Python", + "excerpt":"Problem Given a nested dict with list as some keys’ value, we want to flatten the dict to a list. For example, given a dict as like: nested_data = { \"env\": [\"prd\", \"dev\"], \"os\": [\"win\", \"unx\"], \"msg\": \"ok\" } we want to convert it to a list as like: {'msg':...","categories": [], + "tags": ["python","itertools"], + "url": "/2020/03/flattening-nested-dict-in-python.html", + "teaser": null + },{ + "title": "Fixing an ipython Windows ConEmu only bug on 'MouseEventType.MOUSE_DOWN'", + "excerpt":"Problem Previously I updated the python version, the ipython version and maybe ConEmu on my Windows 10 (I don’t remember which one exactly), I got an error when I wanted to copy some text from ipython repl in ConEmu console by the right mouse click: ps.7.0.0 | py.3.8.2❯ ipython Python...","categories": [], + "tags": ["python","ipython"], + "url": "/2020/04/fixing-ipython-on-Windows10-ConEmu-mouse-event-bug.html", + "teaser": null + },{ + "title": "Making isort compatible with black", + "excerpt":"Update 2020-12-06, thanks to Christian Jauvin’s comment, since isort v5, it has introduced --profile=black option, so the life is much easier now:) Both isort and black are a must have in my python life, but with their default settings, I will get different imports formats. multi_line_output, include_trailing_comma and line_length The...","categories": [], + "tags": ["python","format","vscode"], + "url": "/2020/04/making-isort-compatible-with-black.html", + "teaser": null + },{ + "title": "Using Python Contextmanager To Create A Timer Decorator", + "excerpt":"This stackoverflow post has already given an example on how to use contextmanager to create a timer decorator: from contextlib import contextmanager from timeit import default_timer @contextmanager def elapsed_timer(): start = default_timer() elapser = lambda: default_timer() - start yield lambda: elapser() end = default_timer() elapser = lambda: end-start It works...","categories": [], + "tags": ["python","contextlib"], + "url": "/2020/05/using-python-contextmanager-to-create-a-timer-decorator.html", + "teaser": null + },{ + "title": "Compiling SQLAlchemy query to nearly real raw sql query", + "excerpt":"Some useful links https://stackoverflow.com/questions/5631078/sqlalchemy-print-the-actual-query https://docs.sqlalchemy.org/en/13/faq/sqlexpressions.html?highlight=literal_bind#rendering-bound-parameters-inline https://docs.sqlalchemy.org/en/13/core/engines.html#configuring-logging Query to compile Suppose we have a table called Movie, and a column release_date in the table Movie. > from datetime import date > from sqlalchemy import create_engine, sessionmaker > engine = create_engine('sqlite:///moive_example.db') > Session = sessionmaker(bind=engine) > session = Session() > filter1 =...","categories": [], + "tags": ["python","sqlalchemy"], + "url": "/2020/06/compiling-sqlalchemy-query-to-nearly-real-raw-sql-query.html", + "teaser": null + },{ + "title": "Rolling back from flask-restplus reqparse to native flask request to parse inputs", + "excerpt":"flask-restplus’ (or flask-restx) reqparse module is deprecated, so I decided to use the native flask request object to parse the incoming inputs. After the try, I noticed some points to take care of. Before listing these points, I will show you how to use native flask request to parse the...","categories": [], + "tags": ["python","flask"], + "url": "/2020/07/rolling-back-from-flask-restplus-reqparse-to-native-flask-request-to-parse-inputs.html", + "teaser": null + },{ + "title": "My Powerline setup and configuration", + "excerpt":"If you’re working in an enterprise environment, and you don’t have the admin rights on your Windows desktop to install additional fonts, or your enterprise admin cannot do that, then I suggest you to ignore this post, powerline will be installed, but very ugly. If you have a Linux desktop,...","categories": [], + "tags": ["linux","wsl","shell"], + "url": "/2020/11/my-powerline.html", + "teaser": null + },{ + "title": "Python Lint And Format", + "excerpt":"Azure SDK Python Guidelines https://azure.github.io/azure-sdk/python_implementation.html Lint Update 2023-05-21: Replaced flake8, pylint, and isort by ruff. When replacing pylint, should add check by mypy. ruff ruff . ruff check . # check is the default command so can be ignore # show ignored ruff alerts ruff . --ignore-noqa --exit-zero pylint Could...","categories": [], + "tags": ["python","format"], + "url": "/2021/01/python-lint-and-format.html", + "teaser": null + },{ + "title": "Python Requests With Retry", + "excerpt":"There’re several solutions to retry a HTTP request with Requests module, some of them are: Native Requests’ retry based on urllib3’s HTTPAdapter. Third party module: backoff. Third party module: tenacity. The native HTTPAdapter is not easy to use. The tenacity module is very powerful, but is also more or less...","categories": [], + "tags": ["python","requests"], + "url": "/2021/01/python-requests-with-retry.html", + "teaser": null + },{ + "title": "Trying Python pipreqs and pip-tools", + "excerpt":"Relative to pipenv, and poetry, if you’re searching for some lightweight python package managers for a small project, I will introduce 2 handy tools for you: pipreqs and pip-tools. pipreqs pipreqs github Suppose you are onboarded to an existing project where only pip is used. The requirements.txt file is generated...","categories": [], + "tags": ["python","pip"], + "url": "/2021/03/trying-python-pipreqs-and-pip-tools.html", + "teaser": null + },{ + "title": "Python Unittest Cheet Sheet", + "excerpt":"Python unittest and Pytest is a big deal, this post just gives some small & quick examples on how to use Python unittest framwork, especially with Pytest framework. This post is not finished yet. pytest in Makefile # Makefile # https://github.com/databrickslabs/dbx/blob/main/Makefile SHELL=/bin/bash VENV_NAME := $(shell [ -d venv ] &&...","categories": [], + "tags": ["python","unittest","pytest"], + "url": "/2021/06/python-unittest-cheet-sheet.html", + "teaser": null + },{ + "title": "Python datetime utcnow", + "excerpt":"Previously, when I needed a real UTC now with ISO 8601 format, I used to use the strftime function or the pytz module. But recently I just found that Python at least since v3.5 has already provide it with built-in module: datetime.now(timezone.utc), and this is also the preferred method over...","categories": [], + "tags": ["python","datetime"], + "url": "/2021/06/python-datetime-utc-now.html", + "teaser": null + },{ + "title": "Python Asyncio Study notes", + "excerpt":"concurrent.futures The concurrent.futures is a high-level abstraction for the threading and multiprocessing modules. ","categories": [], + "tags": ["python","async"], + "url": "/2021/09/python-asyncio.html", + "teaser": null + },{ + "title": "Azure pipeline predefined variables", + "excerpt":"The official doc gives an explanation of all the predefined variables, but it lacks of some concret examples. Hereunder some examples for my preferred variables. Access the predefined variables To access the variables value in YAML pipeline, we can use 2 methods: $(System.PullRequest.SourceBranch) : the standard way to access pipeline...","categories": [], + "tags": ["azure","cicd"], + "url": "/2022/01/azure-pipeline-predefined-variables.html", + "teaser": null + },{ + "title": "Azure pipeline reuse variables in template from another repository", + "excerpt":"Context In my project, I have several Azure pipelines that share some same variables, instead of declaring them in each pipeline, I would like to refactor it by using some central places to store the shared variables. I can split the variables into 3 groups: organization level variables: organization name,...","categories": [], + "tags": ["azure","cicd"], + "url": "/2022/02/azure-pipeline-reuse-variables-in-template-from-another-repository.html", + "teaser": null + },{ + "title": "Azure pipeline checkout repository from another project", + "excerpt":"Context This post can be an extend to my previous post on variables and templates reuse In fact, in addition to the variables and templates, I also need to reuse some non native Azure pipeline yaml files, for example some Python scripts defined in the shared template. If we use...","categories": [], + "tags": ["azure","cicd"], + "url": "/2022/02/azure-pipeline-checkout-repository-from-another-project.html", + "teaser": null + },{ + "title": "Azure pipeline variables and parameters", + "excerpt":"Variable Variable scope When we set variables from a script, the new variable is only available from the next step, not the step where the variable is defined. variables: sauce: orange steps: # Create a variable - bash: | echo \"##vso[task.setvariable variable=sauce]crushed tomatoes\" # remember to use double quotes echo...","categories": [], + "tags": ["azure","cicd"], + "url": "/2022/03/azure-pipeline-variables-and-parameters.html", + "teaser": null + },{ + "title": "Manage Azure Databricks Service Principal", + "excerpt":"Most of Databricks management can be done from the GUI or CLI, but for Azure Service Principal, we can only manage it by the SCIM API. There’s an open PR for adding support of SCIM API in Databricks CLI, but the lastest update is back to the beginning of 2021....","categories": [], + "tags": ["azure","databricks"], + "url": "/2022/03/manage-azure-databricks-service-principal.html", + "teaser": null + },{ + "title": "Azure Pipeline Checkout Multiple Repositories", + "excerpt":"This post will talk about some Azure pipeline predefined variables’ values in a multiple repositories checkout situation. The official doc is here. The examples given in this post is using Azure DevOps repositories and Azure pipeline Ubuntu agent. Default Pipeline workspace structure When a pipeline starts, something is created inside...","categories": [], + "tags": ["azure","cicd"], + "url": "/2022/04/azure-pipeline-checkout-multiple-repositories.html", + "teaser": null + },{ + "title": "Using Databricks Connect inside a container", + "excerpt":"Why use Databricks Connect From the very beginning of the Databricks Connect official doc, it says already that Databricks Connect has some limitations and is more or less deprecated in favor of dbx. But for some usages like local IDE live debug, Databricks Connect is still a very good tool...","categories": [], + "tags": ["databricks","vscode","container","docker","spark"], + "url": "/2022/06/using-databricks-connect-inside-a-container.html", + "teaser": null + },{ + "title": "Azure pipeline conditions", + "excerpt":"Azure pipeline has two kinds of conditions: With keyword condition With jinja like format ${{if elseif else}} In both syntax, we have use parameters and variables, but there’s a big difference between them which makes DevOps frustrated. Conditions with keyword $ With ${{if elseif else}} condition, the using parameters and...","categories": [], + "tags": ["azure","cicd"], + "url": "/2022/07/azure-pipeline-conditions.html", + "teaser": null + },{ + "title": "Databricks job/task context", + "excerpt":"Suppose we’re running following job/task in a Azure Databricks workspace: jobId: \"1111\" jobRunId: \"2222\" taskRunId: \"3333\" jobName: \"ths job name\" taskName: \"first-task\" databricksWorkspaceUrl: https://adb-4444444444.123.azuredatabricks.net/ Run below command in a Databricks job (task precisely): dbutils.notebook.entry_point.getDbutils().notebook().getContext().toJson() We will get following json: { \"rootRunId\": null, \"currentRunId\": null, \"jobGroup\": \"7777777777777777777_8888888888888888888_job-1111-run-3333-action-9999999999999999\", \"tags\": { \"jobId\": \"1111\",...","categories": [], + "tags": ["databricks","azure"], + "url": "/2022/07/databricks-job-context.html", + "teaser": null + },{ + "title": "Azure pipeline jobs", + "excerpt":"Traditional jobs vs deployment jobs traditional jobs run in parallel, deployment jobs run in sequence, save the deployment history to a environment and a resource, and can also be applied with deployment strategy (runOnce, rolling, and the canary) Deployment jobs Tracking deployment history As per example given here: we can...","categories": [], + "tags": ["azure","cicd"], + "url": "/2022/08/azure-pipeline-jobs.html", + "teaser": null + },{ + "title": "Azure pipeline System.AccessToken in shared pipeline", + "excerpt":"Var $(System.AccessToken) System.AccessToken is a special variable that carries the security token used by the running build. If you check the doc of job authorization scope, you might think the var $(System.AccessToken) has by default the access to all the repositories in the same project where hosts the calling Azure...","categories": [], + "tags": ["azure","cicd"], + "url": "/2022/09/azure-pipeline-system-access-token-in-shared-pipeline.html", + "teaser": null + },{ + "title": "Adding data files to Python package with setup.py", + "excerpt":"setup.py vs pyproject.toml pyproject.toml is the new Python project metadata specification standard since PEP 621. As per PEP 517, and as per one of the comments of this StackOverflow thread, in some rare cases, we might have a chicken and egg problem when using setup.py if it needs to import...","categories": [], + "tags": ["python","packaging"], + "url": "/2022/09/adding-data-files-to-python-package-with-setup-py.html", + "teaser": null + },{ + "title": "Databricks cluster access mode", + "excerpt":"What is cluster access mode Just a copy from Azure Databricks official doc: Amazon Databricks official doc has less info on access mode. Access Mode Visible to user UC Support Supported Languages Notes Single User Always Yes Python, SQL, Scala, R Can be assigned to and used by a single...","categories": [], + "tags": ["azure","databricks","spark"], + "url": "/2022/09/databricks-cluster-access-mode.html", + "teaser": null + },{ + "title": "Azure pipeline delete blobs from blob storage", + "excerpt":"The example given by this post is for Azure Pipeline with the latest Ubuntu agent, for AzCli from local machine, removing the --auth-mode login part should work. As it’s a Linux pipeline agent, the pipeline task AzureFileCopy can not be used, it’s written in Powershell, we should use the AzureCLI...","categories": [], + "tags": ["azure","cicd","storage"], + "url": "/2022/11/azure-pipeline-delete-blobs-from-blob-storage.html", + "teaser": null + },{ + "title": "Azure pipeline Windows agent UnicodeEncodeError", + "excerpt":"For people who encounter UnicodeEncodeError when using Windows Azure Pipeline agent, the issue might be here. As per above link, or this email, the solutions could be: You can override just sys.std* to UTF-8 by setting the environment variable PYTHONIOENCODING=UTF-8. You can override all I/O to use UTF-8 by setting...","categories": [], + "tags": ["azure","cicd","codec"], + "url": "/2022/11/azure-pipeline-windows-agent-UnicodeEncodeError.html", + "teaser": null + },{ + "title": "Using ast and cst to change Python code", + "excerpt":"Difference between AST and CST A brief comparison could be found in the libcst doc. Generally speaking, CST could keep the original source code format including the comments. Using AST to change Python code Since Python 3.9, the helper ast.unparse has been introduced, so we have both ast.parse and ast.unparse...","categories": [], + "tags": ["python","ast"], + "url": "/2022/11/using-ast-and-cst-to-change-python-code.html", + "teaser": null + },{ + "title": "Python difference on subprocess run(), call(), check_call(), check_output()", + "excerpt":"Difference on subprocess run(), call(), check_call(), check_output() Since Python 3.5, the official doc explains that: Prior to Python 3.5, these three functions (subprocess.call(), subprocess.check_call(), subprocess.check_output()) comprised the high level API to subprocess. You can now use subprocess.run() in many cases, but lots of existing code calls these functions. subprocess.run common...","categories": [], + "tags": ["python"], + "url": "/2022/12/python-difference-on-subprocess-run-call-check-call-check-output.html", + "teaser": null + },{ + "title": "Syncing repository from github to gitee", + "excerpt":"I need to sync github repository (files and commits only) https://github.com/copdips/copdips.github.io to gitee repository https://gitee.com/copdips/copdips.github.io. In gitee: create an empty repository, normal the same name as the one you want to sync from github. For example for this blog repository: https://gitee.com/copdips/copdips.github.io In gitee: create a PAT in gitee with necessary...","categories": [], + "tags": ["git"], + "url": "/2022/12/syncing-repository-from-github-to-gitee.html", + "teaser": null + },{ + "title": "Python aiohttp rate limit", + "excerpt":"HTTP rate limit is often the max requests in a limited time period, and sometimes could also be the max concurrent requests. Max requests in a limited time period from aiolimiter import AsyncLimiter RATE_LIMIT_IN_SECOND = 20 # 1.0 for time period during 1 second rate_limit = AsyncLimiter(RATE_LIMIT_IN_SECOND, 1.0) async with...","categories": [], + "tags": ["python","async"], + "url": "/2023/01/python-aiohttp-rate-limit.html", + "teaser": null + },{ + "title": "Calling Azure REST API", + "excerpt":"This blog Calling Azure REST API via curl is pretty good. Just two more things. Auth token in curl We can use curl -X GET -u :$token instead of curl -X GET -H \"Authorization: Bearer $token\" Azure DevOps API resource id for OAuth when using az rest to call Azure...","categories": [], + "tags": ["azure","api","rest"], + "url": "/2023/01/calling-azure-rest-api.html", + "teaser": null + },{ + "title": "Sonarcloud Github Action", + "excerpt":"Sonarcloud Github Action doesn’t work by default with Python pytest coverage.xml file, hereunder a working example. file .github/workflows/ci.yml # file: .github/workflows/ci.yml # irrelevant part is removed env: repo_name: repo app_folder_name: app coverage_percent: 90 build_number: ${{ github.run_number }} pytest_coverage_commentator_filename: pytest_coverage_commentator.txt pytest_coverage_xml_file_name: coverage.xml - name: Test with pytest run: | pytest -v...","categories": [], + "tags": ["githubaction","sonar","cicd"], + "url": "/2023/01/sonarcloud-github-action.html", + "teaser": null + },{ + "title": "Python Asyncio Unittest", + "excerpt":"Unittest based on Pytest framework not embedded unittest. Mocking async http client aiohttp.ClientSession Source code # file path: root/module_name/foo.py # pip install aiohttp import aiohttp class ClassFoo: def __init__(self, access_token: str): self.access_token = access_token self.auth_header = {\"Authorization\": f\"Bearer {self.access_token}\"} self.base_url = \"https://foo.bar.com/api/v1\" async def get_foo(self, foo_id: str) -> dict: url...","categories": [], + "tags": ["python","async","pytest","unittest"], + "url": "/2023/07/python-asyncio-unittest.html", + "teaser": null + },{ + "title": "Different ssh keys for different github.com accounts", + "excerpt":"It might be a common case that you have multiple github.com accounts (personal and professional), and you want to use different ssh keys for different github accounts, as github.com does not allow same ssh key for different accounts with “Key is already in use” error. To achieve this, you could...","categories": [], + "tags": ["git","ssh"], + "url": "/2023/09/different-ssh-keys-for-different-github.com-accounts.html", + "teaser": null + },{ + "title": "Python Asyncio", + "excerpt":"This is not a Python asyncio tutorial. Just some personal quick tips here, and could be updated from time to time. greenlet vs gevent greenlet needs manual event switch. gevent is based on greenlet. gevent has gevent.monkey.patch_all(). @asyncio.coroutine From Python 3.8, async def deprecates @asyncio.coroutine yield from From Python 3.5,...","categories": [], + "tags": ["python","async"], + "url": "/2023/09/python-asyncio.html", + "teaser": null + },{ + "title": "Github Actions - Cache", + "excerpt":"Life span Github Actions cache has a life span of 7 days, and the total size of all caches in a repository is limited to 10 GB. Standard Cache Cache key should be as specific as possible, so that the post cache restore installation can be reduced or skipped. For...","categories": [], + "tags": ["cicd","githubaction","cache","azure"], + "url": "/2023/09/github-actions-cache.html", + "teaser": null + },{ + "title": "Github Actions - Custom Actions", + "excerpt":"Actions checkout location in workflow Actions are automatically checked out by Github Action from the beginning of a workflow run, the checkout path could be found by: env var $GITHUB_ACTION_PATH, github context ${{ github.action_path }}. This is very useful when you need to reference some files or scripts saved in...","categories": [], + "tags": ["cicd","githubaction","azure"], + "url": "/2023/09/github-actions-custom-actions.html", + "teaser": null + },{ + "title": "Github Actions - Environment", + "excerpt":"Dynamic environment environment is set at job level (not at step level), so we should use the $GITHUB_OUTPUT context to set the environment name dynamically, see here to learn how to pass data between jobs. Standard usage for static value is like this: jobs: deployment: runs-on: ubuntu-latest environment: production steps:...","categories": [], + "tags": ["cicd","githubaction"], + "url": "/2023/09/github-actions-environment.html", + "teaser": null + },{ + "title": "Github Actions - Variables", + "excerpt":"Variables upon Git events Suppose we create a new branch named new_branch, and create a pull request (with id 123) from the new branch new_branch to the main branch. During the pipeline, we can see following predefined variables in different GIT events. Check here for variables upon git events in...","categories": [], + "tags": ["cicd","githubaction"], + "url": "/2023/09/github-actions-variables.html", + "teaser": null + },{ + "title": "Github Actions - Error handling", + "excerpt":"continue-on-error vs fail-fast The doc explains that continue-on-error applies to a single job or single step which defines whether a job or step can continue on its error, while fail-fast applies to the entire matrix which means if the failure of a job in the matrix can stop other running...","categories": [], + "tags": ["cicd","githubaction"], + "url": "/2023/09/github-actions-error-handling.html", + "teaser": null + },{ + "title": "Github Actions - Workflows", + "excerpt":"Reusable workflows Re-run a reusable workflow If reusable workflow is not referenced by SHA, for example a branch name, when re-run a workflow, it will not use the latest version of the workflow in that branch, but the same commit SHA of the first attempt. Which means, if you use...","categories": [], + "tags": ["cicd","githubaction"], + "url": "/2023/09/github-actions-workflows.html", + "teaser": null + },{ + "title": "Databricks Python pip authentication", + "excerpt":"Before the Databricks Unit Catalog’s release, we used init scripts to generate the pip.conf file during cluster startup, allowing each cluster its unique auth token. But with init scripts no longer available in the Unit Catalog’s shared mode, an alternative approach is required. A workaround involves placing a prepared pip.conf...","categories": [], + "tags": ["databricks","python","pip","auth"], + "url": "/2023/09/databricks-python-pip-authentication.html", + "teaser": null + },{ + "title": "Github Actions - Python", + "excerpt":"Setting up pip authentication PIP_INDEX_URL vs PIP_EXTRA_INDEX_URL In most cases, when setting up private Python package artifacts (like Azure DevOps Artifacts, JFrog Artifactory, etc.) are configured to mirror the public PyPi. In such scenarios, we only need to use PIP_INDEX_URL to point to these private artifacts. However, some people might...","categories": [], + "tags": ["cicd","githubaction","python","pip","auth","azure"], + "url": "/2023/09/github-actions-python.html", + "teaser": null + },{ + "title": "Github Actions - copdips/get-azure-keyvault-secrets-action", + "excerpt":"Recently, I began a new project that requires migrating some process from Azure Pipelines to Github Actions. One of the tasks involves retrieving secrets from Azure Key Vault. In Azure Pipelines, we have an official task called AzureKeyVault@2 designed for this purpose. However, its official counterpart in Github Actions, Azure/get-keyvault-secrets@v1,...","categories": [], + "tags": ["cicd","githubaction","python","async","azure","vault"], + "url": "/2023/10/github-actions-get-azure-keyvault-secrets-action.html", + "teaser": null + },{ + "title": "Hashing files", + "excerpt":"During CI/CD processes, and particularly during CI, we frequently hash dependency files to create cache keys (referred to as key input in Github Action actions/cache and key parameter in Azure pipelines Cache@2 task). However, the default hash functions come with certain limitations like this comment. To address this, we can...","categories": [], + "tags": ["cicd","githubaction","azure","shell","cache"], + "url": "/2023/10/hashing-files.html", + "teaser": null + }] diff --git a/assets/js/lunr/lunr.js b/assets/js/lunr/lunr.js new file mode 100644 index 00000000..6aa370fb --- /dev/null +++ b/assets/js/lunr/lunr.js @@ -0,0 +1,3475 @@ +/** + * lunr - http://lunrjs.com - A bit like Solr, but much smaller and not as bright - 2.3.9 + * Copyright (C) 2020 Oliver Nightingale + * @license MIT + */ + +;(function(){ + +/** + * A convenience function for configuring and constructing + * a new lunr Index. + * + * A lunr.Builder instance is created and the pipeline setup + * with a trimmer, stop word filter and stemmer. + * + * This builder object is yielded to the configuration function + * that is passed as a parameter, allowing the list of fields + * and other builder parameters to be customised. + * + * All documents _must_ be added within the passed config function. + * + * @example + * var idx = lunr(function () { + * this.field('title') + * this.field('body') + * this.ref('id') + * + * documents.forEach(function (doc) { + * this.add(doc) + * }, this) + * }) + * + * @see {@link lunr.Builder} + * @see {@link lunr.Pipeline} + * @see {@link lunr.trimmer} + * @see {@link lunr.stopWordFilter} + * @see {@link lunr.stemmer} + * @namespace {function} lunr + */ +var lunr = function (config) { + var builder = new lunr.Builder + + builder.pipeline.add( + lunr.trimmer, + lunr.stopWordFilter, + lunr.stemmer + ) + + builder.searchPipeline.add( + lunr.stemmer + ) + + config.call(builder, builder) + return builder.build() +} + +lunr.version = "2.3.9" +/*! + * lunr.utils + * Copyright (C) 2020 Oliver Nightingale + */ + +/** + * A namespace containing utils for the rest of the lunr library + * @namespace lunr.utils + */ +lunr.utils = {} + +/** + * Print a warning message to the console. + * + * @param {String} message The message to be printed. + * @memberOf lunr.utils + * @function + */ +lunr.utils.warn = (function (global) { + /* eslint-disable no-console */ + return function (message) { + if (global.console && console.warn) { + console.warn(message) + } + } + /* eslint-enable no-console */ +})(this) + +/** + * Convert an object to a string. + * + * In the case of `null` and `undefined` the function returns + * the empty string, in all other cases the result of calling + * `toString` on the passed object is returned. + * + * @param {Any} obj The object to convert to a string. + * @return {String} string representation of the passed object. + * @memberOf lunr.utils + */ +lunr.utils.asString = function (obj) { + if (obj === void 0 || obj === null) { + return "" + } else { + return obj.toString() + } +} + +/** + * Clones an object. + * + * Will create a copy of an existing object such that any mutations + * on the copy cannot affect the original. + * + * Only shallow objects are supported, passing a nested object to this + * function will cause a TypeError. + * + * Objects with primitives, and arrays of primitives are supported. + * + * @param {Object} obj The object to clone. + * @return {Object} a clone of the passed object. + * @throws {TypeError} when a nested object is passed. + * @memberOf Utils + */ +lunr.utils.clone = function (obj) { + if (obj === null || obj === undefined) { + return obj + } + + var clone = Object.create(null), + keys = Object.keys(obj) + + for (var i = 0; i < keys.length; i++) { + var key = keys[i], + val = obj[key] + + if (Array.isArray(val)) { + clone[key] = val.slice() + continue + } + + if (typeof val === 'string' || + typeof val === 'number' || + typeof val === 'boolean') { + clone[key] = val + continue + } + + throw new TypeError("clone is not deep and does not support nested objects") + } + + return clone +} +lunr.FieldRef = function (docRef, fieldName, stringValue) { + this.docRef = docRef + this.fieldName = fieldName + this._stringValue = stringValue +} + +lunr.FieldRef.joiner = "/" + +lunr.FieldRef.fromString = function (s) { + var n = s.indexOf(lunr.FieldRef.joiner) + + if (n === -1) { + throw "malformed field ref string" + } + + var fieldRef = s.slice(0, n), + docRef = s.slice(n + 1) + + return new lunr.FieldRef (docRef, fieldRef, s) +} + +lunr.FieldRef.prototype.toString = function () { + if (this._stringValue == undefined) { + this._stringValue = this.fieldName + lunr.FieldRef.joiner + this.docRef + } + + return this._stringValue +} +/*! + * lunr.Set + * Copyright (C) 2020 Oliver Nightingale + */ + +/** + * A lunr set. + * + * @constructor + */ +lunr.Set = function (elements) { + this.elements = Object.create(null) + + if (elements) { + this.length = elements.length + + for (var i = 0; i < this.length; i++) { + this.elements[elements[i]] = true + } + } else { + this.length = 0 + } +} + +/** + * A complete set that contains all elements. + * + * @static + * @readonly + * @type {lunr.Set} + */ +lunr.Set.complete = { + intersect: function (other) { + return other + }, + + union: function () { + return this + }, + + contains: function () { + return true + } +} + +/** + * An empty set that contains no elements. + * + * @static + * @readonly + * @type {lunr.Set} + */ +lunr.Set.empty = { + intersect: function () { + return this + }, + + union: function (other) { + return other + }, + + contains: function () { + return false + } +} + +/** + * Returns true if this set contains the specified object. + * + * @param {object} object - Object whose presence in this set is to be tested. + * @returns {boolean} - True if this set contains the specified object. + */ +lunr.Set.prototype.contains = function (object) { + return !!this.elements[object] +} + +/** + * Returns a new set containing only the elements that are present in both + * this set and the specified set. + * + * @param {lunr.Set} other - set to intersect with this set. + * @returns {lunr.Set} a new set that is the intersection of this and the specified set. + */ + +lunr.Set.prototype.intersect = function (other) { + var a, b, elements, intersection = [] + + if (other === lunr.Set.complete) { + return this + } + + if (other === lunr.Set.empty) { + return other + } + + if (this.length < other.length) { + a = this + b = other + } else { + a = other + b = this + } + + elements = Object.keys(a.elements) + + for (var i = 0; i < elements.length; i++) { + var element = elements[i] + if (element in b.elements) { + intersection.push(element) + } + } + + return new lunr.Set (intersection) +} + +/** + * Returns a new set combining the elements of this and the specified set. + * + * @param {lunr.Set} other - set to union with this set. + * @return {lunr.Set} a new set that is the union of this and the specified set. + */ + +lunr.Set.prototype.union = function (other) { + if (other === lunr.Set.complete) { + return lunr.Set.complete + } + + if (other === lunr.Set.empty) { + return this + } + + return new lunr.Set(Object.keys(this.elements).concat(Object.keys(other.elements))) +} +/** + * A function to calculate the inverse document frequency for + * a posting. This is shared between the builder and the index + * + * @private + * @param {object} posting - The posting for a given term + * @param {number} documentCount - The total number of documents. + */ +lunr.idf = function (posting, documentCount) { + var documentsWithTerm = 0 + + for (var fieldName in posting) { + if (fieldName == '_index') continue // Ignore the term index, its not a field + documentsWithTerm += Object.keys(posting[fieldName]).length + } + + var x = (documentCount - documentsWithTerm + 0.5) / (documentsWithTerm + 0.5) + + return Math.log(1 + Math.abs(x)) +} + +/** + * A token wraps a string representation of a token + * as it is passed through the text processing pipeline. + * + * @constructor + * @param {string} [str=''] - The string token being wrapped. + * @param {object} [metadata={}] - Metadata associated with this token. + */ +lunr.Token = function (str, metadata) { + this.str = str || "" + this.metadata = metadata || {} +} + +/** + * Returns the token string that is being wrapped by this object. + * + * @returns {string} + */ +lunr.Token.prototype.toString = function () { + return this.str +} + +/** + * A token update function is used when updating or optionally + * when cloning a token. + * + * @callback lunr.Token~updateFunction + * @param {string} str - The string representation of the token. + * @param {Object} metadata - All metadata associated with this token. + */ + +/** + * Applies the given function to the wrapped string token. + * + * @example + * token.update(function (str, metadata) { + * return str.toUpperCase() + * }) + * + * @param {lunr.Token~updateFunction} fn - A function to apply to the token string. + * @returns {lunr.Token} + */ +lunr.Token.prototype.update = function (fn) { + this.str = fn(this.str, this.metadata) + return this +} + +/** + * Creates a clone of this token. Optionally a function can be + * applied to the cloned token. + * + * @param {lunr.Token~updateFunction} [fn] - An optional function to apply to the cloned token. + * @returns {lunr.Token} + */ +lunr.Token.prototype.clone = function (fn) { + fn = fn || function (s) { return s } + return new lunr.Token (fn(this.str, this.metadata), this.metadata) +} +/*! + * lunr.tokenizer + * Copyright (C) 2020 Oliver Nightingale + */ + +/** + * A function for splitting a string into tokens ready to be inserted into + * the search index. Uses `lunr.tokenizer.separator` to split strings, change + * the value of this property to change how strings are split into tokens. + * + * This tokenizer will convert its parameter to a string by calling `toString` and + * then will split this string on the character in `lunr.tokenizer.separator`. + * Arrays will have their elements converted to strings and wrapped in a lunr.Token. + * + * Optional metadata can be passed to the tokenizer, this metadata will be cloned and + * added as metadata to every token that is created from the object to be tokenized. + * + * @static + * @param {?(string|object|object[])} obj - The object to convert into tokens + * @param {?object} metadata - Optional metadata to associate with every token + * @returns {lunr.Token[]} + * @see {@link lunr.Pipeline} + */ +lunr.tokenizer = function (obj, metadata) { + if (obj == null || obj == undefined) { + return [] + } + + if (Array.isArray(obj)) { + return obj.map(function (t) { + return new lunr.Token( + lunr.utils.asString(t).toLowerCase(), + lunr.utils.clone(metadata) + ) + }) + } + + var str = obj.toString().toLowerCase(), + len = str.length, + tokens = [] + + for (var sliceEnd = 0, sliceStart = 0; sliceEnd <= len; sliceEnd++) { + var char = str.charAt(sliceEnd), + sliceLength = sliceEnd - sliceStart + + if ((char.match(lunr.tokenizer.separator) || sliceEnd == len)) { + + if (sliceLength > 0) { + var tokenMetadata = lunr.utils.clone(metadata) || {} + tokenMetadata["position"] = [sliceStart, sliceLength] + tokenMetadata["index"] = tokens.length + + tokens.push( + new lunr.Token ( + str.slice(sliceStart, sliceEnd), + tokenMetadata + ) + ) + } + + sliceStart = sliceEnd + 1 + } + + } + + return tokens +} + +/** + * The separator used to split a string into tokens. Override this property to change the behaviour of + * `lunr.tokenizer` behaviour when tokenizing strings. By default this splits on whitespace and hyphens. + * + * @static + * @see lunr.tokenizer + */ +lunr.tokenizer.separator = /[\s\-]+/ +/*! + * lunr.Pipeline + * Copyright (C) 2020 Oliver Nightingale + */ + +/** + * lunr.Pipelines maintain an ordered list of functions to be applied to all + * tokens in documents entering the search index and queries being ran against + * the index. + * + * An instance of lunr.Index created with the lunr shortcut will contain a + * pipeline with a stop word filter and an English language stemmer. Extra + * functions can be added before or after either of these functions or these + * default functions can be removed. + * + * When run the pipeline will call each function in turn, passing a token, the + * index of that token in the original list of all tokens and finally a list of + * all the original tokens. + * + * The output of functions in the pipeline will be passed to the next function + * in the pipeline. To exclude a token from entering the index the function + * should return undefined, the rest of the pipeline will not be called with + * this token. + * + * For serialisation of pipelines to work, all functions used in an instance of + * a pipeline should be registered with lunr.Pipeline. Registered functions can + * then be loaded. If trying to load a serialised pipeline that uses functions + * that are not registered an error will be thrown. + * + * If not planning on serialising the pipeline then registering pipeline functions + * is not necessary. + * + * @constructor + */ +lunr.Pipeline = function () { + this._stack = [] +} + +lunr.Pipeline.registeredFunctions = Object.create(null) + +/** + * A pipeline function maps lunr.Token to lunr.Token. A lunr.Token contains the token + * string as well as all known metadata. A pipeline function can mutate the token string + * or mutate (or add) metadata for a given token. + * + * A pipeline function can indicate that the passed token should be discarded by returning + * null, undefined or an empty string. This token will not be passed to any downstream pipeline + * functions and will not be added to the index. + * + * Multiple tokens can be returned by returning an array of tokens. Each token will be passed + * to any downstream pipeline functions and all will returned tokens will be added to the index. + * + * Any number of pipeline functions may be chained together using a lunr.Pipeline. + * + * @interface lunr.PipelineFunction + * @param {lunr.Token} token - A token from the document being processed. + * @param {number} i - The index of this token in the complete list of tokens for this document/field. + * @param {lunr.Token[]} tokens - All tokens for this document/field. + * @returns {(?lunr.Token|lunr.Token[])} + */ + +/** + * Register a function with the pipeline. + * + * Functions that are used in the pipeline should be registered if the pipeline + * needs to be serialised, or a serialised pipeline needs to be loaded. + * + * Registering a function does not add it to a pipeline, functions must still be + * added to instances of the pipeline for them to be used when running a pipeline. + * + * @param {lunr.PipelineFunction} fn - The function to check for. + * @param {String} label - The label to register this function with + */ +lunr.Pipeline.registerFunction = function (fn, label) { + if (label in this.registeredFunctions) { + lunr.utils.warn('Overwriting existing registered function: ' + label) + } + + fn.label = label + lunr.Pipeline.registeredFunctions[fn.label] = fn +} + +/** + * Warns if the function is not registered as a Pipeline function. + * + * @param {lunr.PipelineFunction} fn - The function to check for. + * @private + */ +lunr.Pipeline.warnIfFunctionNotRegistered = function (fn) { + var isRegistered = fn.label && (fn.label in this.registeredFunctions) + + if (!isRegistered) { + lunr.utils.warn('Function is not registered with pipeline. This may cause problems when serialising the index.\n', fn) + } +} + +/** + * Loads a previously serialised pipeline. + * + * All functions to be loaded must already be registered with lunr.Pipeline. + * If any function from the serialised data has not been registered then an + * error will be thrown. + * + * @param {Object} serialised - The serialised pipeline to load. + * @returns {lunr.Pipeline} + */ +lunr.Pipeline.load = function (serialised) { + var pipeline = new lunr.Pipeline + + serialised.forEach(function (fnName) { + var fn = lunr.Pipeline.registeredFunctions[fnName] + + if (fn) { + pipeline.add(fn) + } else { + throw new Error('Cannot load unregistered function: ' + fnName) + } + }) + + return pipeline +} + +/** + * Adds new functions to the end of the pipeline. + * + * Logs a warning if the function has not been registered. + * + * @param {lunr.PipelineFunction[]} functions - Any number of functions to add to the pipeline. + */ +lunr.Pipeline.prototype.add = function () { + var fns = Array.prototype.slice.call(arguments) + + fns.forEach(function (fn) { + lunr.Pipeline.warnIfFunctionNotRegistered(fn) + this._stack.push(fn) + }, this) +} + +/** + * Adds a single function after a function that already exists in the + * pipeline. + * + * Logs a warning if the function has not been registered. + * + * @param {lunr.PipelineFunction} existingFn - A function that already exists in the pipeline. + * @param {lunr.PipelineFunction} newFn - The new function to add to the pipeline. + */ +lunr.Pipeline.prototype.after = function (existingFn, newFn) { + lunr.Pipeline.warnIfFunctionNotRegistered(newFn) + + var pos = this._stack.indexOf(existingFn) + if (pos == -1) { + throw new Error('Cannot find existingFn') + } + + pos = pos + 1 + this._stack.splice(pos, 0, newFn) +} + +/** + * Adds a single function before a function that already exists in the + * pipeline. + * + * Logs a warning if the function has not been registered. + * + * @param {lunr.PipelineFunction} existingFn - A function that already exists in the pipeline. + * @param {lunr.PipelineFunction} newFn - The new function to add to the pipeline. + */ +lunr.Pipeline.prototype.before = function (existingFn, newFn) { + lunr.Pipeline.warnIfFunctionNotRegistered(newFn) + + var pos = this._stack.indexOf(existingFn) + if (pos == -1) { + throw new Error('Cannot find existingFn') + } + + this._stack.splice(pos, 0, newFn) +} + +/** + * Removes a function from the pipeline. + * + * @param {lunr.PipelineFunction} fn The function to remove from the pipeline. + */ +lunr.Pipeline.prototype.remove = function (fn) { + var pos = this._stack.indexOf(fn) + if (pos == -1) { + return + } + + this._stack.splice(pos, 1) +} + +/** + * Runs the current list of functions that make up the pipeline against the + * passed tokens. + * + * @param {Array} tokens The tokens to run through the pipeline. + * @returns {Array} + */ +lunr.Pipeline.prototype.run = function (tokens) { + var stackLength = this._stack.length + + for (var i = 0; i < stackLength; i++) { + var fn = this._stack[i] + var memo = [] + + for (var j = 0; j < tokens.length; j++) { + var result = fn(tokens[j], j, tokens) + + if (result === null || result === void 0 || result === '') continue + + if (Array.isArray(result)) { + for (var k = 0; k < result.length; k++) { + memo.push(result[k]) + } + } else { + memo.push(result) + } + } + + tokens = memo + } + + return tokens +} + +/** + * Convenience method for passing a string through a pipeline and getting + * strings out. This method takes care of wrapping the passed string in a + * token and mapping the resulting tokens back to strings. + * + * @param {string} str - The string to pass through the pipeline. + * @param {?object} metadata - Optional metadata to associate with the token + * passed to the pipeline. + * @returns {string[]} + */ +lunr.Pipeline.prototype.runString = function (str, metadata) { + var token = new lunr.Token (str, metadata) + + return this.run([token]).map(function (t) { + return t.toString() + }) +} + +/** + * Resets the pipeline by removing any existing processors. + * + */ +lunr.Pipeline.prototype.reset = function () { + this._stack = [] +} + +/** + * Returns a representation of the pipeline ready for serialisation. + * + * Logs a warning if the function has not been registered. + * + * @returns {Array} + */ +lunr.Pipeline.prototype.toJSON = function () { + return this._stack.map(function (fn) { + lunr.Pipeline.warnIfFunctionNotRegistered(fn) + + return fn.label + }) +} +/*! + * lunr.Vector + * Copyright (C) 2020 Oliver Nightingale + */ + +/** + * A vector is used to construct the vector space of documents and queries. These + * vectors support operations to determine the similarity between two documents or + * a document and a query. + * + * Normally no parameters are required for initializing a vector, but in the case of + * loading a previously dumped vector the raw elements can be provided to the constructor. + * + * For performance reasons vectors are implemented with a flat array, where an elements + * index is immediately followed by its value. E.g. [index, value, index, value]. This + * allows the underlying array to be as sparse as possible and still offer decent + * performance when being used for vector calculations. + * + * @constructor + * @param {Number[]} [elements] - The flat list of element index and element value pairs. + */ +lunr.Vector = function (elements) { + this._magnitude = 0 + this.elements = elements || [] +} + + +/** + * Calculates the position within the vector to insert a given index. + * + * This is used internally by insert and upsert. If there are duplicate indexes then + * the position is returned as if the value for that index were to be updated, but it + * is the callers responsibility to check whether there is a duplicate at that index + * + * @param {Number} insertIdx - The index at which the element should be inserted. + * @returns {Number} + */ +lunr.Vector.prototype.positionForIndex = function (index) { + // For an empty vector the tuple can be inserted at the beginning + if (this.elements.length == 0) { + return 0 + } + + var start = 0, + end = this.elements.length / 2, + sliceLength = end - start, + pivotPoint = Math.floor(sliceLength / 2), + pivotIndex = this.elements[pivotPoint * 2] + + while (sliceLength > 1) { + if (pivotIndex < index) { + start = pivotPoint + } + + if (pivotIndex > index) { + end = pivotPoint + } + + if (pivotIndex == index) { + break + } + + sliceLength = end - start + pivotPoint = start + Math.floor(sliceLength / 2) + pivotIndex = this.elements[pivotPoint * 2] + } + + if (pivotIndex == index) { + return pivotPoint * 2 + } + + if (pivotIndex > index) { + return pivotPoint * 2 + } + + if (pivotIndex < index) { + return (pivotPoint + 1) * 2 + } +} + +/** + * Inserts an element at an index within the vector. + * + * Does not allow duplicates, will throw an error if there is already an entry + * for this index. + * + * @param {Number} insertIdx - The index at which the element should be inserted. + * @param {Number} val - The value to be inserted into the vector. + */ +lunr.Vector.prototype.insert = function (insertIdx, val) { + this.upsert(insertIdx, val, function () { + throw "duplicate index" + }) +} + +/** + * Inserts or updates an existing index within the vector. + * + * @param {Number} insertIdx - The index at which the element should be inserted. + * @param {Number} val - The value to be inserted into the vector. + * @param {function} fn - A function that is called for updates, the existing value and the + * requested value are passed as arguments + */ +lunr.Vector.prototype.upsert = function (insertIdx, val, fn) { + this._magnitude = 0 + var position = this.positionForIndex(insertIdx) + + if (this.elements[position] == insertIdx) { + this.elements[position + 1] = fn(this.elements[position + 1], val) + } else { + this.elements.splice(position, 0, insertIdx, val) + } +} + +/** + * Calculates the magnitude of this vector. + * + * @returns {Number} + */ +lunr.Vector.prototype.magnitude = function () { + if (this._magnitude) return this._magnitude + + var sumOfSquares = 0, + elementsLength = this.elements.length + + for (var i = 1; i < elementsLength; i += 2) { + var val = this.elements[i] + sumOfSquares += val * val + } + + return this._magnitude = Math.sqrt(sumOfSquares) +} + +/** + * Calculates the dot product of this vector and another vector. + * + * @param {lunr.Vector} otherVector - The vector to compute the dot product with. + * @returns {Number} + */ +lunr.Vector.prototype.dot = function (otherVector) { + var dotProduct = 0, + a = this.elements, b = otherVector.elements, + aLen = a.length, bLen = b.length, + aVal = 0, bVal = 0, + i = 0, j = 0 + + while (i < aLen && j < bLen) { + aVal = a[i], bVal = b[j] + if (aVal < bVal) { + i += 2 + } else if (aVal > bVal) { + j += 2 + } else if (aVal == bVal) { + dotProduct += a[i + 1] * b[j + 1] + i += 2 + j += 2 + } + } + + return dotProduct +} + +/** + * Calculates the similarity between this vector and another vector. + * + * @param {lunr.Vector} otherVector - The other vector to calculate the + * similarity with. + * @returns {Number} + */ +lunr.Vector.prototype.similarity = function (otherVector) { + return this.dot(otherVector) / this.magnitude() || 0 +} + +/** + * Converts the vector to an array of the elements within the vector. + * + * @returns {Number[]} + */ +lunr.Vector.prototype.toArray = function () { + var output = new Array (this.elements.length / 2) + + for (var i = 1, j = 0; i < this.elements.length; i += 2, j++) { + output[j] = this.elements[i] + } + + return output +} + +/** + * A JSON serializable representation of the vector. + * + * @returns {Number[]} + */ +lunr.Vector.prototype.toJSON = function () { + return this.elements +} +/* eslint-disable */ +/*! + * lunr.stemmer + * Copyright (C) 2020 Oliver Nightingale + * Includes code from - http://tartarus.org/~martin/PorterStemmer/js.txt + */ + +/** + * lunr.stemmer is an english language stemmer, this is a JavaScript + * implementation of the PorterStemmer taken from http://tartarus.org/~martin + * + * @static + * @implements {lunr.PipelineFunction} + * @param {lunr.Token} token - The string to stem + * @returns {lunr.Token} + * @see {@link lunr.Pipeline} + * @function + */ +lunr.stemmer = (function(){ + var step2list = { + "ational" : "ate", + "tional" : "tion", + "enci" : "ence", + "anci" : "ance", + "izer" : "ize", + "bli" : "ble", + "alli" : "al", + "entli" : "ent", + "eli" : "e", + "ousli" : "ous", + "ization" : "ize", + "ation" : "ate", + "ator" : "ate", + "alism" : "al", + "iveness" : "ive", + "fulness" : "ful", + "ousness" : "ous", + "aliti" : "al", + "iviti" : "ive", + "biliti" : "ble", + "logi" : "log" + }, + + step3list = { + "icate" : "ic", + "ative" : "", + "alize" : "al", + "iciti" : "ic", + "ical" : "ic", + "ful" : "", + "ness" : "" + }, + + c = "[^aeiou]", // consonant + v = "[aeiouy]", // vowel + C = c + "[^aeiouy]*", // consonant sequence + V = v + "[aeiou]*", // vowel sequence + + mgr0 = "^(" + C + ")?" + V + C, // [C]VC... is m>0 + meq1 = "^(" + C + ")?" + V + C + "(" + V + ")?$", // [C]VC[V] is m=1 + mgr1 = "^(" + C + ")?" + V + C + V + C, // [C]VCVC... is m>1 + s_v = "^(" + C + ")?" + v; // vowel in stem + + var re_mgr0 = new RegExp(mgr0); + var re_mgr1 = new RegExp(mgr1); + var re_meq1 = new RegExp(meq1); + var re_s_v = new RegExp(s_v); + + var re_1a = /^(.+?)(ss|i)es$/; + var re2_1a = /^(.+?)([^s])s$/; + var re_1b = /^(.+?)eed$/; + var re2_1b = /^(.+?)(ed|ing)$/; + var re_1b_2 = /.$/; + var re2_1b_2 = /(at|bl|iz)$/; + var re3_1b_2 = new RegExp("([^aeiouylsz])\\1$"); + var re4_1b_2 = new RegExp("^" + C + v + "[^aeiouwxy]$"); + + var re_1c = /^(.+?[^aeiou])y$/; + var re_2 = /^(.+?)(ational|tional|enci|anci|izer|bli|alli|entli|eli|ousli|ization|ation|ator|alism|iveness|fulness|ousness|aliti|iviti|biliti|logi)$/; + + var re_3 = /^(.+?)(icate|ative|alize|iciti|ical|ful|ness)$/; + + var re_4 = /^(.+?)(al|ance|ence|er|ic|able|ible|ant|ement|ment|ent|ou|ism|ate|iti|ous|ive|ize)$/; + var re2_4 = /^(.+?)(s|t)(ion)$/; + + var re_5 = /^(.+?)e$/; + var re_5_1 = /ll$/; + var re3_5 = new RegExp("^" + C + v + "[^aeiouwxy]$"); + + var porterStemmer = function porterStemmer(w) { + var stem, + suffix, + firstch, + re, + re2, + re3, + re4; + + if (w.length < 3) { return w; } + + firstch = w.substr(0,1); + if (firstch == "y") { + w = firstch.toUpperCase() + w.substr(1); + } + + // Step 1a + re = re_1a + re2 = re2_1a; + + if (re.test(w)) { w = w.replace(re,"$1$2"); } + else if (re2.test(w)) { w = w.replace(re2,"$1$2"); } + + // Step 1b + re = re_1b; + re2 = re2_1b; + if (re.test(w)) { + var fp = re.exec(w); + re = re_mgr0; + if (re.test(fp[1])) { + re = re_1b_2; + w = w.replace(re,""); + } + } else if (re2.test(w)) { + var fp = re2.exec(w); + stem = fp[1]; + re2 = re_s_v; + if (re2.test(stem)) { + w = stem; + re2 = re2_1b_2; + re3 = re3_1b_2; + re4 = re4_1b_2; + if (re2.test(w)) { w = w + "e"; } + else if (re3.test(w)) { re = re_1b_2; w = w.replace(re,""); } + else if (re4.test(w)) { w = w + "e"; } + } + } + + // Step 1c - replace suffix y or Y by i if preceded by a non-vowel which is not the first letter of the word (so cry -> cri, by -> by, say -> say) + re = re_1c; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + w = stem + "i"; + } + + // Step 2 + re = re_2; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + suffix = fp[2]; + re = re_mgr0; + if (re.test(stem)) { + w = stem + step2list[suffix]; + } + } + + // Step 3 + re = re_3; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + suffix = fp[2]; + re = re_mgr0; + if (re.test(stem)) { + w = stem + step3list[suffix]; + } + } + + // Step 4 + re = re_4; + re2 = re2_4; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + re = re_mgr1; + if (re.test(stem)) { + w = stem; + } + } else if (re2.test(w)) { + var fp = re2.exec(w); + stem = fp[1] + fp[2]; + re2 = re_mgr1; + if (re2.test(stem)) { + w = stem; + } + } + + // Step 5 + re = re_5; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + re = re_mgr1; + re2 = re_meq1; + re3 = re3_5; + if (re.test(stem) || (re2.test(stem) && !(re3.test(stem)))) { + w = stem; + } + } + + re = re_5_1; + re2 = re_mgr1; + if (re.test(w) && re2.test(w)) { + re = re_1b_2; + w = w.replace(re,""); + } + + // and turn initial Y back to y + + if (firstch == "y") { + w = firstch.toLowerCase() + w.substr(1); + } + + return w; + }; + + return function (token) { + return token.update(porterStemmer); + } +})(); + +lunr.Pipeline.registerFunction(lunr.stemmer, 'stemmer') +/*! + * lunr.stopWordFilter + * Copyright (C) 2020 Oliver Nightingale + */ + +/** + * lunr.generateStopWordFilter builds a stopWordFilter function from the provided + * list of stop words. + * + * The built in lunr.stopWordFilter is built using this generator and can be used + * to generate custom stopWordFilters for applications or non English languages. + * + * @function + * @param {Array} token The token to pass through the filter + * @returns {lunr.PipelineFunction} + * @see lunr.Pipeline + * @see lunr.stopWordFilter + */ +lunr.generateStopWordFilter = function (stopWords) { + var words = stopWords.reduce(function (memo, stopWord) { + memo[stopWord] = stopWord + return memo + }, {}) + + return function (token) { + if (token && words[token.toString()] !== token.toString()) return token + } +} + +/** + * lunr.stopWordFilter is an English language stop word list filter, any words + * contained in the list will not be passed through the filter. + * + * This is intended to be used in the Pipeline. If the token does not pass the + * filter then undefined will be returned. + * + * @function + * @implements {lunr.PipelineFunction} + * @params {lunr.Token} token - A token to check for being a stop word. + * @returns {lunr.Token} + * @see {@link lunr.Pipeline} + */ +lunr.stopWordFilter = lunr.generateStopWordFilter([ + 'a', + 'able', + 'about', + 'across', + 'after', + 'all', + 'almost', + 'also', + 'am', + 'among', + 'an', + 'and', + 'any', + 'are', + 'as', + 'at', + 'be', + 'because', + 'been', + 'but', + 'by', + 'can', + 'cannot', + 'could', + 'dear', + 'did', + 'do', + 'does', + 'either', + 'else', + 'ever', + 'every', + 'for', + 'from', + 'get', + 'got', + 'had', + 'has', + 'have', + 'he', + 'her', + 'hers', + 'him', + 'his', + 'how', + 'however', + 'i', + 'if', + 'in', + 'into', + 'is', + 'it', + 'its', + 'just', + 'least', + 'let', + 'like', + 'likely', + 'may', + 'me', + 'might', + 'most', + 'must', + 'my', + 'neither', + 'no', + 'nor', + 'not', + 'of', + 'off', + 'often', + 'on', + 'only', + 'or', + 'other', + 'our', + 'own', + 'rather', + 'said', + 'say', + 'says', + 'she', + 'should', + 'since', + 'so', + 'some', + 'than', + 'that', + 'the', + 'their', + 'them', + 'then', + 'there', + 'these', + 'they', + 'this', + 'tis', + 'to', + 'too', + 'twas', + 'us', + 'wants', + 'was', + 'we', + 'were', + 'what', + 'when', + 'where', + 'which', + 'while', + 'who', + 'whom', + 'why', + 'will', + 'with', + 'would', + 'yet', + 'you', + 'your' +]) + +lunr.Pipeline.registerFunction(lunr.stopWordFilter, 'stopWordFilter') +/*! + * lunr.trimmer + * Copyright (C) 2020 Oliver Nightingale + */ + +/** + * lunr.trimmer is a pipeline function for trimming non word + * characters from the beginning and end of tokens before they + * enter the index. + * + * This implementation may not work correctly for non latin + * characters and should either be removed or adapted for use + * with languages with non-latin characters. + * + * @static + * @implements {lunr.PipelineFunction} + * @param {lunr.Token} token The token to pass through the filter + * @returns {lunr.Token} + * @see lunr.Pipeline + */ +lunr.trimmer = function (token) { + return token.update(function (s) { + return s.replace(/^\W+/, '').replace(/\W+$/, '') + }) +} + +lunr.Pipeline.registerFunction(lunr.trimmer, 'trimmer') +/*! + * lunr.TokenSet + * Copyright (C) 2020 Oliver Nightingale + */ + +/** + * A token set is used to store the unique list of all tokens + * within an index. Token sets are also used to represent an + * incoming query to the index, this query token set and index + * token set are then intersected to find which tokens to look + * up in the inverted index. + * + * A token set can hold multiple tokens, as in the case of the + * index token set, or it can hold a single token as in the + * case of a simple query token set. + * + * Additionally token sets are used to perform wildcard matching. + * Leading, contained and trailing wildcards are supported, and + * from this edit distance matching can also be provided. + * + * Token sets are implemented as a minimal finite state automata, + * where both common prefixes and suffixes are shared between tokens. + * This helps to reduce the space used for storing the token set. + * + * @constructor + */ +lunr.TokenSet = function () { + this.final = false + this.edges = {} + this.id = lunr.TokenSet._nextId + lunr.TokenSet._nextId += 1 +} + +/** + * Keeps track of the next, auto increment, identifier to assign + * to a new tokenSet. + * + * TokenSets require a unique identifier to be correctly minimised. + * + * @private + */ +lunr.TokenSet._nextId = 1 + +/** + * Creates a TokenSet instance from the given sorted array of words. + * + * @param {String[]} arr - A sorted array of strings to create the set from. + * @returns {lunr.TokenSet} + * @throws Will throw an error if the input array is not sorted. + */ +lunr.TokenSet.fromArray = function (arr) { + var builder = new lunr.TokenSet.Builder + + for (var i = 0, len = arr.length; i < len; i++) { + builder.insert(arr[i]) + } + + builder.finish() + return builder.root +} + +/** + * Creates a token set from a query clause. + * + * @private + * @param {Object} clause - A single clause from lunr.Query. + * @param {string} clause.term - The query clause term. + * @param {number} [clause.editDistance] - The optional edit distance for the term. + * @returns {lunr.TokenSet} + */ +lunr.TokenSet.fromClause = function (clause) { + if ('editDistance' in clause) { + return lunr.TokenSet.fromFuzzyString(clause.term, clause.editDistance) + } else { + return lunr.TokenSet.fromString(clause.term) + } +} + +/** + * Creates a token set representing a single string with a specified + * edit distance. + * + * Insertions, deletions, substitutions and transpositions are each + * treated as an edit distance of 1. + * + * Increasing the allowed edit distance will have a dramatic impact + * on the performance of both creating and intersecting these TokenSets. + * It is advised to keep the edit distance less than 3. + * + * @param {string} str - The string to create the token set from. + * @param {number} editDistance - The allowed edit distance to match. + * @returns {lunr.Vector} + */ +lunr.TokenSet.fromFuzzyString = function (str, editDistance) { + var root = new lunr.TokenSet + + var stack = [{ + node: root, + editsRemaining: editDistance, + str: str + }] + + while (stack.length) { + var frame = stack.pop() + + // no edit + if (frame.str.length > 0) { + var char = frame.str.charAt(0), + noEditNode + + if (char in frame.node.edges) { + noEditNode = frame.node.edges[char] + } else { + noEditNode = new lunr.TokenSet + frame.node.edges[char] = noEditNode + } + + if (frame.str.length == 1) { + noEditNode.final = true + } + + stack.push({ + node: noEditNode, + editsRemaining: frame.editsRemaining, + str: frame.str.slice(1) + }) + } + + if (frame.editsRemaining == 0) { + continue + } + + // insertion + if ("*" in frame.node.edges) { + var insertionNode = frame.node.edges["*"] + } else { + var insertionNode = new lunr.TokenSet + frame.node.edges["*"] = insertionNode + } + + if (frame.str.length == 0) { + insertionNode.final = true + } + + stack.push({ + node: insertionNode, + editsRemaining: frame.editsRemaining - 1, + str: frame.str + }) + + // deletion + // can only do a deletion if we have enough edits remaining + // and if there are characters left to delete in the string + if (frame.str.length > 1) { + stack.push({ + node: frame.node, + editsRemaining: frame.editsRemaining - 1, + str: frame.str.slice(1) + }) + } + + // deletion + // just removing the last character from the str + if (frame.str.length == 1) { + frame.node.final = true + } + + // substitution + // can only do a substitution if we have enough edits remaining + // and if there are characters left to substitute + if (frame.str.length >= 1) { + if ("*" in frame.node.edges) { + var substitutionNode = frame.node.edges["*"] + } else { + var substitutionNode = new lunr.TokenSet + frame.node.edges["*"] = substitutionNode + } + + if (frame.str.length == 1) { + substitutionNode.final = true + } + + stack.push({ + node: substitutionNode, + editsRemaining: frame.editsRemaining - 1, + str: frame.str.slice(1) + }) + } + + // transposition + // can only do a transposition if there are edits remaining + // and there are enough characters to transpose + if (frame.str.length > 1) { + var charA = frame.str.charAt(0), + charB = frame.str.charAt(1), + transposeNode + + if (charB in frame.node.edges) { + transposeNode = frame.node.edges[charB] + } else { + transposeNode = new lunr.TokenSet + frame.node.edges[charB] = transposeNode + } + + if (frame.str.length == 1) { + transposeNode.final = true + } + + stack.push({ + node: transposeNode, + editsRemaining: frame.editsRemaining - 1, + str: charA + frame.str.slice(2) + }) + } + } + + return root +} + +/** + * Creates a TokenSet from a string. + * + * The string may contain one or more wildcard characters (*) + * that will allow wildcard matching when intersecting with + * another TokenSet. + * + * @param {string} str - The string to create a TokenSet from. + * @returns {lunr.TokenSet} + */ +lunr.TokenSet.fromString = function (str) { + var node = new lunr.TokenSet, + root = node + + /* + * Iterates through all characters within the passed string + * appending a node for each character. + * + * When a wildcard character is found then a self + * referencing edge is introduced to continually match + * any number of any characters. + */ + for (var i = 0, len = str.length; i < len; i++) { + var char = str[i], + final = (i == len - 1) + + if (char == "*") { + node.edges[char] = node + node.final = final + + } else { + var next = new lunr.TokenSet + next.final = final + + node.edges[char] = next + node = next + } + } + + return root +} + +/** + * Converts this TokenSet into an array of strings + * contained within the TokenSet. + * + * This is not intended to be used on a TokenSet that + * contains wildcards, in these cases the results are + * undefined and are likely to cause an infinite loop. + * + * @returns {string[]} + */ +lunr.TokenSet.prototype.toArray = function () { + var words = [] + + var stack = [{ + prefix: "", + node: this + }] + + while (stack.length) { + var frame = stack.pop(), + edges = Object.keys(frame.node.edges), + len = edges.length + + if (frame.node.final) { + /* In Safari, at this point the prefix is sometimes corrupted, see: + * https://github.com/olivernn/lunr.js/issues/279 Calling any + * String.prototype method forces Safari to "cast" this string to what + * it's supposed to be, fixing the bug. */ + frame.prefix.charAt(0) + words.push(frame.prefix) + } + + for (var i = 0; i < len; i++) { + var edge = edges[i] + + stack.push({ + prefix: frame.prefix.concat(edge), + node: frame.node.edges[edge] + }) + } + } + + return words +} + +/** + * Generates a string representation of a TokenSet. + * + * This is intended to allow TokenSets to be used as keys + * in objects, largely to aid the construction and minimisation + * of a TokenSet. As such it is not designed to be a human + * friendly representation of the TokenSet. + * + * @returns {string} + */ +lunr.TokenSet.prototype.toString = function () { + // NOTE: Using Object.keys here as this.edges is very likely + // to enter 'hash-mode' with many keys being added + // + // avoiding a for-in loop here as it leads to the function + // being de-optimised (at least in V8). From some simple + // benchmarks the performance is comparable, but allowing + // V8 to optimize may mean easy performance wins in the future. + + if (this._str) { + return this._str + } + + var str = this.final ? '1' : '0', + labels = Object.keys(this.edges).sort(), + len = labels.length + + for (var i = 0; i < len; i++) { + var label = labels[i], + node = this.edges[label] + + str = str + label + node.id + } + + return str +} + +/** + * Returns a new TokenSet that is the intersection of + * this TokenSet and the passed TokenSet. + * + * This intersection will take into account any wildcards + * contained within the TokenSet. + * + * @param {lunr.TokenSet} b - An other TokenSet to intersect with. + * @returns {lunr.TokenSet} + */ +lunr.TokenSet.prototype.intersect = function (b) { + var output = new lunr.TokenSet, + frame = undefined + + var stack = [{ + qNode: b, + output: output, + node: this + }] + + while (stack.length) { + frame = stack.pop() + + // NOTE: As with the #toString method, we are using + // Object.keys and a for loop instead of a for-in loop + // as both of these objects enter 'hash' mode, causing + // the function to be de-optimised in V8 + var qEdges = Object.keys(frame.qNode.edges), + qLen = qEdges.length, + nEdges = Object.keys(frame.node.edges), + nLen = nEdges.length + + for (var q = 0; q < qLen; q++) { + var qEdge = qEdges[q] + + for (var n = 0; n < nLen; n++) { + var nEdge = nEdges[n] + + if (nEdge == qEdge || qEdge == '*') { + var node = frame.node.edges[nEdge], + qNode = frame.qNode.edges[qEdge], + final = node.final && qNode.final, + next = undefined + + if (nEdge in frame.output.edges) { + // an edge already exists for this character + // no need to create a new node, just set the finality + // bit unless this node is already final + next = frame.output.edges[nEdge] + next.final = next.final || final + + } else { + // no edge exists yet, must create one + // set the finality bit and insert it + // into the output + next = new lunr.TokenSet + next.final = final + frame.output.edges[nEdge] = next + } + + stack.push({ + qNode: qNode, + output: next, + node: node + }) + } + } + } + } + + return output +} +lunr.TokenSet.Builder = function () { + this.previousWord = "" + this.root = new lunr.TokenSet + this.uncheckedNodes = [] + this.minimizedNodes = {} +} + +lunr.TokenSet.Builder.prototype.insert = function (word) { + var node, + commonPrefix = 0 + + if (word < this.previousWord) { + throw new Error ("Out of order word insertion") + } + + for (var i = 0; i < word.length && i < this.previousWord.length; i++) { + if (word[i] != this.previousWord[i]) break + commonPrefix++ + } + + this.minimize(commonPrefix) + + if (this.uncheckedNodes.length == 0) { + node = this.root + } else { + node = this.uncheckedNodes[this.uncheckedNodes.length - 1].child + } + + for (var i = commonPrefix; i < word.length; i++) { + var nextNode = new lunr.TokenSet, + char = word[i] + + node.edges[char] = nextNode + + this.uncheckedNodes.push({ + parent: node, + char: char, + child: nextNode + }) + + node = nextNode + } + + node.final = true + this.previousWord = word +} + +lunr.TokenSet.Builder.prototype.finish = function () { + this.minimize(0) +} + +lunr.TokenSet.Builder.prototype.minimize = function (downTo) { + for (var i = this.uncheckedNodes.length - 1; i >= downTo; i--) { + var node = this.uncheckedNodes[i], + childKey = node.child.toString() + + if (childKey in this.minimizedNodes) { + node.parent.edges[node.char] = this.minimizedNodes[childKey] + } else { + // Cache the key for this node since + // we know it can't change anymore + node.child._str = childKey + + this.minimizedNodes[childKey] = node.child + } + + this.uncheckedNodes.pop() + } +} +/*! + * lunr.Index + * Copyright (C) 2020 Oliver Nightingale + */ + +/** + * An index contains the built index of all documents and provides a query interface + * to the index. + * + * Usually instances of lunr.Index will not be created using this constructor, instead + * lunr.Builder should be used to construct new indexes, or lunr.Index.load should be + * used to load previously built and serialized indexes. + * + * @constructor + * @param {Object} attrs - The attributes of the built search index. + * @param {Object} attrs.invertedIndex - An index of term/field to document reference. + * @param {Object} attrs.fieldVectors - Field vectors + * @param {lunr.TokenSet} attrs.tokenSet - An set of all corpus tokens. + * @param {string[]} attrs.fields - The names of indexed document fields. + * @param {lunr.Pipeline} attrs.pipeline - The pipeline to use for search terms. + */ +lunr.Index = function (attrs) { + this.invertedIndex = attrs.invertedIndex + this.fieldVectors = attrs.fieldVectors + this.tokenSet = attrs.tokenSet + this.fields = attrs.fields + this.pipeline = attrs.pipeline +} + +/** + * A result contains details of a document matching a search query. + * @typedef {Object} lunr.Index~Result + * @property {string} ref - The reference of the document this result represents. + * @property {number} score - A number between 0 and 1 representing how similar this document is to the query. + * @property {lunr.MatchData} matchData - Contains metadata about this match including which term(s) caused the match. + */ + +/** + * Although lunr provides the ability to create queries using lunr.Query, it also provides a simple + * query language which itself is parsed into an instance of lunr.Query. + * + * For programmatically building queries it is advised to directly use lunr.Query, the query language + * is best used for human entered text rather than program generated text. + * + * At its simplest queries can just be a single term, e.g. `hello`, multiple terms are also supported + * and will be combined with OR, e.g `hello world` will match documents that contain either 'hello' + * or 'world', though those that contain both will rank higher in the results. + * + * Wildcards can be included in terms to match one or more unspecified characters, these wildcards can + * be inserted anywhere within the term, and more than one wildcard can exist in a single term. Adding + * wildcards will increase the number of documents that will be found but can also have a negative + * impact on query performance, especially with wildcards at the beginning of a term. + * + * Terms can be restricted to specific fields, e.g. `title:hello`, only documents with the term + * hello in the title field will match this query. Using a field not present in the index will lead + * to an error being thrown. + * + * Modifiers can also be added to terms, lunr supports edit distance and boost modifiers on terms. A term + * boost will make documents matching that term score higher, e.g. `foo^5`. Edit distance is also supported + * to provide fuzzy matching, e.g. 'hello~2' will match documents with hello with an edit distance of 2. + * Avoid large values for edit distance to improve query performance. + * + * Each term also supports a presence modifier. By default a term's presence in document is optional, however + * this can be changed to either required or prohibited. For a term's presence to be required in a document the + * term should be prefixed with a '+', e.g. `+foo bar` is a search for documents that must contain 'foo' and + * optionally contain 'bar'. Conversely a leading '-' sets the terms presence to prohibited, i.e. it must not + * appear in a document, e.g. `-foo bar` is a search for documents that do not contain 'foo' but may contain 'bar'. + * + * To escape special characters the backslash character '\' can be used, this allows searches to include + * characters that would normally be considered modifiers, e.g. `foo\~2` will search for a term "foo~2" instead + * of attempting to apply a boost of 2 to the search term "foo". + * + * @typedef {string} lunr.Index~QueryString + * @example Simple single term query + * hello + * @example Multiple term query + * hello world + * @example term scoped to a field + * title:hello + * @example term with a boost of 10 + * hello^10 + * @example term with an edit distance of 2 + * hello~2 + * @example terms with presence modifiers + * -foo +bar baz + */ + +/** + * Performs a search against the index using lunr query syntax. + * + * Results will be returned sorted by their score, the most relevant results + * will be returned first. For details on how the score is calculated, please see + * the {@link https://lunrjs.com/guides/searching.html#scoring|guide}. + * + * For more programmatic querying use lunr.Index#query. + * + * @param {lunr.Index~QueryString} queryString - A string containing a lunr query. + * @throws {lunr.QueryParseError} If the passed query string cannot be parsed. + * @returns {lunr.Index~Result[]} + */ +lunr.Index.prototype.search = function (queryString) { + return this.query(function (query) { + var parser = new lunr.QueryParser(queryString, query) + parser.parse() + }) +} + +/** + * A query builder callback provides a query object to be used to express + * the query to perform on the index. + * + * @callback lunr.Index~queryBuilder + * @param {lunr.Query} query - The query object to build up. + * @this lunr.Query + */ + +/** + * Performs a query against the index using the yielded lunr.Query object. + * + * If performing programmatic queries against the index, this method is preferred + * over lunr.Index#search so as to avoid the additional query parsing overhead. + * + * A query object is yielded to the supplied function which should be used to + * express the query to be run against the index. + * + * Note that although this function takes a callback parameter it is _not_ an + * asynchronous operation, the callback is just yielded a query object to be + * customized. + * + * @param {lunr.Index~queryBuilder} fn - A function that is used to build the query. + * @returns {lunr.Index~Result[]} + */ +lunr.Index.prototype.query = function (fn) { + // for each query clause + // * process terms + // * expand terms from token set + // * find matching documents and metadata + // * get document vectors + // * score documents + + var query = new lunr.Query(this.fields), + matchingFields = Object.create(null), + queryVectors = Object.create(null), + termFieldCache = Object.create(null), + requiredMatches = Object.create(null), + prohibitedMatches = Object.create(null) + + /* + * To support field level boosts a query vector is created per + * field. An empty vector is eagerly created to support negated + * queries. + */ + for (var i = 0; i < this.fields.length; i++) { + queryVectors[this.fields[i]] = new lunr.Vector + } + + fn.call(query, query) + + for (var i = 0; i < query.clauses.length; i++) { + /* + * Unless the pipeline has been disabled for this term, which is + * the case for terms with wildcards, we need to pass the clause + * term through the search pipeline. A pipeline returns an array + * of processed terms. Pipeline functions may expand the passed + * term, which means we may end up performing multiple index lookups + * for a single query term. + */ + var clause = query.clauses[i], + terms = null, + clauseMatches = lunr.Set.empty + + if (clause.usePipeline) { + terms = this.pipeline.runString(clause.term, { + fields: clause.fields + }) + } else { + terms = [clause.term] + } + + for (var m = 0; m < terms.length; m++) { + var term = terms[m] + + /* + * Each term returned from the pipeline needs to use the same query + * clause object, e.g. the same boost and or edit distance. The + * simplest way to do this is to re-use the clause object but mutate + * its term property. + */ + clause.term = term + + /* + * From the term in the clause we create a token set which will then + * be used to intersect the indexes token set to get a list of terms + * to lookup in the inverted index + */ + var termTokenSet = lunr.TokenSet.fromClause(clause), + expandedTerms = this.tokenSet.intersect(termTokenSet).toArray() + + /* + * If a term marked as required does not exist in the tokenSet it is + * impossible for the search to return any matches. We set all the field + * scoped required matches set to empty and stop examining any further + * clauses. + */ + if (expandedTerms.length === 0 && clause.presence === lunr.Query.presence.REQUIRED) { + for (var k = 0; k < clause.fields.length; k++) { + var field = clause.fields[k] + requiredMatches[field] = lunr.Set.empty + } + + break + } + + for (var j = 0; j < expandedTerms.length; j++) { + /* + * For each term get the posting and termIndex, this is required for + * building the query vector. + */ + var expandedTerm = expandedTerms[j], + posting = this.invertedIndex[expandedTerm], + termIndex = posting._index + + for (var k = 0; k < clause.fields.length; k++) { + /* + * For each field that this query term is scoped by (by default + * all fields are in scope) we need to get all the document refs + * that have this term in that field. + * + * The posting is the entry in the invertedIndex for the matching + * term from above. + */ + var field = clause.fields[k], + fieldPosting = posting[field], + matchingDocumentRefs = Object.keys(fieldPosting), + termField = expandedTerm + "/" + field, + matchingDocumentsSet = new lunr.Set(matchingDocumentRefs) + + /* + * if the presence of this term is required ensure that the matching + * documents are added to the set of required matches for this clause. + * + */ + if (clause.presence == lunr.Query.presence.REQUIRED) { + clauseMatches = clauseMatches.union(matchingDocumentsSet) + + if (requiredMatches[field] === undefined) { + requiredMatches[field] = lunr.Set.complete + } + } + + /* + * if the presence of this term is prohibited ensure that the matching + * documents are added to the set of prohibited matches for this field, + * creating that set if it does not yet exist. + */ + if (clause.presence == lunr.Query.presence.PROHIBITED) { + if (prohibitedMatches[field] === undefined) { + prohibitedMatches[field] = lunr.Set.empty + } + + prohibitedMatches[field] = prohibitedMatches[field].union(matchingDocumentsSet) + + /* + * Prohibited matches should not be part of the query vector used for + * similarity scoring and no metadata should be extracted so we continue + * to the next field + */ + continue + } + + /* + * The query field vector is populated using the termIndex found for + * the term and a unit value with the appropriate boost applied. + * Using upsert because there could already be an entry in the vector + * for the term we are working with. In that case we just add the scores + * together. + */ + queryVectors[field].upsert(termIndex, clause.boost, function (a, b) { return a + b }) + + /** + * If we've already seen this term, field combo then we've already collected + * the matching documents and metadata, no need to go through all that again + */ + if (termFieldCache[termField]) { + continue + } + + for (var l = 0; l < matchingDocumentRefs.length; l++) { + /* + * All metadata for this term/field/document triple + * are then extracted and collected into an instance + * of lunr.MatchData ready to be returned in the query + * results + */ + var matchingDocumentRef = matchingDocumentRefs[l], + matchingFieldRef = new lunr.FieldRef (matchingDocumentRef, field), + metadata = fieldPosting[matchingDocumentRef], + fieldMatch + + if ((fieldMatch = matchingFields[matchingFieldRef]) === undefined) { + matchingFields[matchingFieldRef] = new lunr.MatchData (expandedTerm, field, metadata) + } else { + fieldMatch.add(expandedTerm, field, metadata) + } + + } + + termFieldCache[termField] = true + } + } + } + + /** + * If the presence was required we need to update the requiredMatches field sets. + * We do this after all fields for the term have collected their matches because + * the clause terms presence is required in _any_ of the fields not _all_ of the + * fields. + */ + if (clause.presence === lunr.Query.presence.REQUIRED) { + for (var k = 0; k < clause.fields.length; k++) { + var field = clause.fields[k] + requiredMatches[field] = requiredMatches[field].intersect(clauseMatches) + } + } + } + + /** + * Need to combine the field scoped required and prohibited + * matching documents into a global set of required and prohibited + * matches + */ + var allRequiredMatches = lunr.Set.complete, + allProhibitedMatches = lunr.Set.empty + + for (var i = 0; i < this.fields.length; i++) { + var field = this.fields[i] + + if (requiredMatches[field]) { + allRequiredMatches = allRequiredMatches.intersect(requiredMatches[field]) + } + + if (prohibitedMatches[field]) { + allProhibitedMatches = allProhibitedMatches.union(prohibitedMatches[field]) + } + } + + var matchingFieldRefs = Object.keys(matchingFields), + results = [], + matches = Object.create(null) + + /* + * If the query is negated (contains only prohibited terms) + * we need to get _all_ fieldRefs currently existing in the + * index. This is only done when we know that the query is + * entirely prohibited terms to avoid any cost of getting all + * fieldRefs unnecessarily. + * + * Additionally, blank MatchData must be created to correctly + * populate the results. + */ + if (query.isNegated()) { + matchingFieldRefs = Object.keys(this.fieldVectors) + + for (var i = 0; i < matchingFieldRefs.length; i++) { + var matchingFieldRef = matchingFieldRefs[i] + var fieldRef = lunr.FieldRef.fromString(matchingFieldRef) + matchingFields[matchingFieldRef] = new lunr.MatchData + } + } + + for (var i = 0; i < matchingFieldRefs.length; i++) { + /* + * Currently we have document fields that match the query, but we + * need to return documents. The matchData and scores are combined + * from multiple fields belonging to the same document. + * + * Scores are calculated by field, using the query vectors created + * above, and combined into a final document score using addition. + */ + var fieldRef = lunr.FieldRef.fromString(matchingFieldRefs[i]), + docRef = fieldRef.docRef + + if (!allRequiredMatches.contains(docRef)) { + continue + } + + if (allProhibitedMatches.contains(docRef)) { + continue + } + + var fieldVector = this.fieldVectors[fieldRef], + score = queryVectors[fieldRef.fieldName].similarity(fieldVector), + docMatch + + if ((docMatch = matches[docRef]) !== undefined) { + docMatch.score += score + docMatch.matchData.combine(matchingFields[fieldRef]) + } else { + var match = { + ref: docRef, + score: score, + matchData: matchingFields[fieldRef] + } + matches[docRef] = match + results.push(match) + } + } + + /* + * Sort the results objects by score, highest first. + */ + return results.sort(function (a, b) { + return b.score - a.score + }) +} + +/** + * Prepares the index for JSON serialization. + * + * The schema for this JSON blob will be described in a + * separate JSON schema file. + * + * @returns {Object} + */ +lunr.Index.prototype.toJSON = function () { + var invertedIndex = Object.keys(this.invertedIndex) + .sort() + .map(function (term) { + return [term, this.invertedIndex[term]] + }, this) + + var fieldVectors = Object.keys(this.fieldVectors) + .map(function (ref) { + return [ref, this.fieldVectors[ref].toJSON()] + }, this) + + return { + version: lunr.version, + fields: this.fields, + fieldVectors: fieldVectors, + invertedIndex: invertedIndex, + pipeline: this.pipeline.toJSON() + } +} + +/** + * Loads a previously serialized lunr.Index + * + * @param {Object} serializedIndex - A previously serialized lunr.Index + * @returns {lunr.Index} + */ +lunr.Index.load = function (serializedIndex) { + var attrs = {}, + fieldVectors = {}, + serializedVectors = serializedIndex.fieldVectors, + invertedIndex = Object.create(null), + serializedInvertedIndex = serializedIndex.invertedIndex, + tokenSetBuilder = new lunr.TokenSet.Builder, + pipeline = lunr.Pipeline.load(serializedIndex.pipeline) + + if (serializedIndex.version != lunr.version) { + lunr.utils.warn("Version mismatch when loading serialised index. Current version of lunr '" + lunr.version + "' does not match serialized index '" + serializedIndex.version + "'") + } + + for (var i = 0; i < serializedVectors.length; i++) { + var tuple = serializedVectors[i], + ref = tuple[0], + elements = tuple[1] + + fieldVectors[ref] = new lunr.Vector(elements) + } + + for (var i = 0; i < serializedInvertedIndex.length; i++) { + var tuple = serializedInvertedIndex[i], + term = tuple[0], + posting = tuple[1] + + tokenSetBuilder.insert(term) + invertedIndex[term] = posting + } + + tokenSetBuilder.finish() + + attrs.fields = serializedIndex.fields + + attrs.fieldVectors = fieldVectors + attrs.invertedIndex = invertedIndex + attrs.tokenSet = tokenSetBuilder.root + attrs.pipeline = pipeline + + return new lunr.Index(attrs) +} +/*! + * lunr.Builder + * Copyright (C) 2020 Oliver Nightingale + */ + +/** + * lunr.Builder performs indexing on a set of documents and + * returns instances of lunr.Index ready for querying. + * + * All configuration of the index is done via the builder, the + * fields to index, the document reference, the text processing + * pipeline and document scoring parameters are all set on the + * builder before indexing. + * + * @constructor + * @property {string} _ref - Internal reference to the document reference field. + * @property {string[]} _fields - Internal reference to the document fields to index. + * @property {object} invertedIndex - The inverted index maps terms to document fields. + * @property {object} documentTermFrequencies - Keeps track of document term frequencies. + * @property {object} documentLengths - Keeps track of the length of documents added to the index. + * @property {lunr.tokenizer} tokenizer - Function for splitting strings into tokens for indexing. + * @property {lunr.Pipeline} pipeline - The pipeline performs text processing on tokens before indexing. + * @property {lunr.Pipeline} searchPipeline - A pipeline for processing search terms before querying the index. + * @property {number} documentCount - Keeps track of the total number of documents indexed. + * @property {number} _b - A parameter to control field length normalization, setting this to 0 disabled normalization, 1 fully normalizes field lengths, the default value is 0.75. + * @property {number} _k1 - A parameter to control how quickly an increase in term frequency results in term frequency saturation, the default value is 1.2. + * @property {number} termIndex - A counter incremented for each unique term, used to identify a terms position in the vector space. + * @property {array} metadataWhitelist - A list of metadata keys that have been whitelisted for entry in the index. + */ +lunr.Builder = function () { + this._ref = "id" + this._fields = Object.create(null) + this._documents = Object.create(null) + this.invertedIndex = Object.create(null) + this.fieldTermFrequencies = {} + this.fieldLengths = {} + this.tokenizer = lunr.tokenizer + this.pipeline = new lunr.Pipeline + this.searchPipeline = new lunr.Pipeline + this.documentCount = 0 + this._b = 0.75 + this._k1 = 1.2 + this.termIndex = 0 + this.metadataWhitelist = [] +} + +/** + * Sets the document field used as the document reference. Every document must have this field. + * The type of this field in the document should be a string, if it is not a string it will be + * coerced into a string by calling toString. + * + * The default ref is 'id'. + * + * The ref should _not_ be changed during indexing, it should be set before any documents are + * added to the index. Changing it during indexing can lead to inconsistent results. + * + * @param {string} ref - The name of the reference field in the document. + */ +lunr.Builder.prototype.ref = function (ref) { + this._ref = ref +} + +/** + * A function that is used to extract a field from a document. + * + * Lunr expects a field to be at the top level of a document, if however the field + * is deeply nested within a document an extractor function can be used to extract + * the right field for indexing. + * + * @callback fieldExtractor + * @param {object} doc - The document being added to the index. + * @returns {?(string|object|object[])} obj - The object that will be indexed for this field. + * @example Extracting a nested field + * function (doc) { return doc.nested.field } + */ + +/** + * Adds a field to the list of document fields that will be indexed. Every document being + * indexed should have this field. Null values for this field in indexed documents will + * not cause errors but will limit the chance of that document being retrieved by searches. + * + * All fields should be added before adding documents to the index. Adding fields after + * a document has been indexed will have no effect on already indexed documents. + * + * Fields can be boosted at build time. This allows terms within that field to have more + * importance when ranking search results. Use a field boost to specify that matches within + * one field are more important than other fields. + * + * @param {string} fieldName - The name of a field to index in all documents. + * @param {object} attributes - Optional attributes associated with this field. + * @param {number} [attributes.boost=1] - Boost applied to all terms within this field. + * @param {fieldExtractor} [attributes.extractor] - Function to extract a field from a document. + * @throws {RangeError} fieldName cannot contain unsupported characters '/' + */ +lunr.Builder.prototype.field = function (fieldName, attributes) { + if (/\//.test(fieldName)) { + throw new RangeError ("Field '" + fieldName + "' contains illegal character '/'") + } + + this._fields[fieldName] = attributes || {} +} + +/** + * A parameter to tune the amount of field length normalisation that is applied when + * calculating relevance scores. A value of 0 will completely disable any normalisation + * and a value of 1 will fully normalise field lengths. The default is 0.75. Values of b + * will be clamped to the range 0 - 1. + * + * @param {number} number - The value to set for this tuning parameter. + */ +lunr.Builder.prototype.b = function (number) { + if (number < 0) { + this._b = 0 + } else if (number > 1) { + this._b = 1 + } else { + this._b = number + } +} + +/** + * A parameter that controls the speed at which a rise in term frequency results in term + * frequency saturation. The default value is 1.2. Setting this to a higher value will give + * slower saturation levels, a lower value will result in quicker saturation. + * + * @param {number} number - The value to set for this tuning parameter. + */ +lunr.Builder.prototype.k1 = function (number) { + this._k1 = number +} + +/** + * Adds a document to the index. + * + * Before adding fields to the index the index should have been fully setup, with the document + * ref and all fields to index already having been specified. + * + * The document must have a field name as specified by the ref (by default this is 'id') and + * it should have all fields defined for indexing, though null or undefined values will not + * cause errors. + * + * Entire documents can be boosted at build time. Applying a boost to a document indicates that + * this document should rank higher in search results than other documents. + * + * @param {object} doc - The document to add to the index. + * @param {object} attributes - Optional attributes associated with this document. + * @param {number} [attributes.boost=1] - Boost applied to all terms within this document. + */ +lunr.Builder.prototype.add = function (doc, attributes) { + var docRef = doc[this._ref], + fields = Object.keys(this._fields) + + this._documents[docRef] = attributes || {} + this.documentCount += 1 + + for (var i = 0; i < fields.length; i++) { + var fieldName = fields[i], + extractor = this._fields[fieldName].extractor, + field = extractor ? extractor(doc) : doc[fieldName], + tokens = this.tokenizer(field, { + fields: [fieldName] + }), + terms = this.pipeline.run(tokens), + fieldRef = new lunr.FieldRef (docRef, fieldName), + fieldTerms = Object.create(null) + + this.fieldTermFrequencies[fieldRef] = fieldTerms + this.fieldLengths[fieldRef] = 0 + + // store the length of this field for this document + this.fieldLengths[fieldRef] += terms.length + + // calculate term frequencies for this field + for (var j = 0; j < terms.length; j++) { + var term = terms[j] + + if (fieldTerms[term] == undefined) { + fieldTerms[term] = 0 + } + + fieldTerms[term] += 1 + + // add to inverted index + // create an initial posting if one doesn't exist + if (this.invertedIndex[term] == undefined) { + var posting = Object.create(null) + posting["_index"] = this.termIndex + this.termIndex += 1 + + for (var k = 0; k < fields.length; k++) { + posting[fields[k]] = Object.create(null) + } + + this.invertedIndex[term] = posting + } + + // add an entry for this term/fieldName/docRef to the invertedIndex + if (this.invertedIndex[term][fieldName][docRef] == undefined) { + this.invertedIndex[term][fieldName][docRef] = Object.create(null) + } + + // store all whitelisted metadata about this token in the + // inverted index + for (var l = 0; l < this.metadataWhitelist.length; l++) { + var metadataKey = this.metadataWhitelist[l], + metadata = term.metadata[metadataKey] + + if (this.invertedIndex[term][fieldName][docRef][metadataKey] == undefined) { + this.invertedIndex[term][fieldName][docRef][metadataKey] = [] + } + + this.invertedIndex[term][fieldName][docRef][metadataKey].push(metadata) + } + } + + } +} + +/** + * Calculates the average document length for this index + * + * @private + */ +lunr.Builder.prototype.calculateAverageFieldLengths = function () { + + var fieldRefs = Object.keys(this.fieldLengths), + numberOfFields = fieldRefs.length, + accumulator = {}, + documentsWithField = {} + + for (var i = 0; i < numberOfFields; i++) { + var fieldRef = lunr.FieldRef.fromString(fieldRefs[i]), + field = fieldRef.fieldName + + documentsWithField[field] || (documentsWithField[field] = 0) + documentsWithField[field] += 1 + + accumulator[field] || (accumulator[field] = 0) + accumulator[field] += this.fieldLengths[fieldRef] + } + + var fields = Object.keys(this._fields) + + for (var i = 0; i < fields.length; i++) { + var fieldName = fields[i] + accumulator[fieldName] = accumulator[fieldName] / documentsWithField[fieldName] + } + + this.averageFieldLength = accumulator +} + +/** + * Builds a vector space model of every document using lunr.Vector + * + * @private + */ +lunr.Builder.prototype.createFieldVectors = function () { + var fieldVectors = {}, + fieldRefs = Object.keys(this.fieldTermFrequencies), + fieldRefsLength = fieldRefs.length, + termIdfCache = Object.create(null) + + for (var i = 0; i < fieldRefsLength; i++) { + var fieldRef = lunr.FieldRef.fromString(fieldRefs[i]), + fieldName = fieldRef.fieldName, + fieldLength = this.fieldLengths[fieldRef], + fieldVector = new lunr.Vector, + termFrequencies = this.fieldTermFrequencies[fieldRef], + terms = Object.keys(termFrequencies), + termsLength = terms.length + + + var fieldBoost = this._fields[fieldName].boost || 1, + docBoost = this._documents[fieldRef.docRef].boost || 1 + + for (var j = 0; j < termsLength; j++) { + var term = terms[j], + tf = termFrequencies[term], + termIndex = this.invertedIndex[term]._index, + idf, score, scoreWithPrecision + + if (termIdfCache[term] === undefined) { + idf = lunr.idf(this.invertedIndex[term], this.documentCount) + termIdfCache[term] = idf + } else { + idf = termIdfCache[term] + } + + score = idf * ((this._k1 + 1) * tf) / (this._k1 * (1 - this._b + this._b * (fieldLength / this.averageFieldLength[fieldName])) + tf) + score *= fieldBoost + score *= docBoost + scoreWithPrecision = Math.round(score * 1000) / 1000 + // Converts 1.23456789 to 1.234. + // Reducing the precision so that the vectors take up less + // space when serialised. Doing it now so that they behave + // the same before and after serialisation. Also, this is + // the fastest approach to reducing a number's precision in + // JavaScript. + + fieldVector.insert(termIndex, scoreWithPrecision) + } + + fieldVectors[fieldRef] = fieldVector + } + + this.fieldVectors = fieldVectors +} + +/** + * Creates a token set of all tokens in the index using lunr.TokenSet + * + * @private + */ +lunr.Builder.prototype.createTokenSet = function () { + this.tokenSet = lunr.TokenSet.fromArray( + Object.keys(this.invertedIndex).sort() + ) +} + +/** + * Builds the index, creating an instance of lunr.Index. + * + * This completes the indexing process and should only be called + * once all documents have been added to the index. + * + * @returns {lunr.Index} + */ +lunr.Builder.prototype.build = function () { + this.calculateAverageFieldLengths() + this.createFieldVectors() + this.createTokenSet() + + return new lunr.Index({ + invertedIndex: this.invertedIndex, + fieldVectors: this.fieldVectors, + tokenSet: this.tokenSet, + fields: Object.keys(this._fields), + pipeline: this.searchPipeline + }) +} + +/** + * Applies a plugin to the index builder. + * + * A plugin is a function that is called with the index builder as its context. + * Plugins can be used to customise or extend the behaviour of the index + * in some way. A plugin is just a function, that encapsulated the custom + * behaviour that should be applied when building the index. + * + * The plugin function will be called with the index builder as its argument, additional + * arguments can also be passed when calling use. The function will be called + * with the index builder as its context. + * + * @param {Function} plugin The plugin to apply. + */ +lunr.Builder.prototype.use = function (fn) { + var args = Array.prototype.slice.call(arguments, 1) + args.unshift(this) + fn.apply(this, args) +} +/** + * Contains and collects metadata about a matching document. + * A single instance of lunr.MatchData is returned as part of every + * lunr.Index~Result. + * + * @constructor + * @param {string} term - The term this match data is associated with + * @param {string} field - The field in which the term was found + * @param {object} metadata - The metadata recorded about this term in this field + * @property {object} metadata - A cloned collection of metadata associated with this document. + * @see {@link lunr.Index~Result} + */ +lunr.MatchData = function (term, field, metadata) { + var clonedMetadata = Object.create(null), + metadataKeys = Object.keys(metadata || {}) + + // Cloning the metadata to prevent the original + // being mutated during match data combination. + // Metadata is kept in an array within the inverted + // index so cloning the data can be done with + // Array#slice + for (var i = 0; i < metadataKeys.length; i++) { + var key = metadataKeys[i] + clonedMetadata[key] = metadata[key].slice() + } + + this.metadata = Object.create(null) + + if (term !== undefined) { + this.metadata[term] = Object.create(null) + this.metadata[term][field] = clonedMetadata + } +} + +/** + * An instance of lunr.MatchData will be created for every term that matches a + * document. However only one instance is required in a lunr.Index~Result. This + * method combines metadata from another instance of lunr.MatchData with this + * objects metadata. + * + * @param {lunr.MatchData} otherMatchData - Another instance of match data to merge with this one. + * @see {@link lunr.Index~Result} + */ +lunr.MatchData.prototype.combine = function (otherMatchData) { + var terms = Object.keys(otherMatchData.metadata) + + for (var i = 0; i < terms.length; i++) { + var term = terms[i], + fields = Object.keys(otherMatchData.metadata[term]) + + if (this.metadata[term] == undefined) { + this.metadata[term] = Object.create(null) + } + + for (var j = 0; j < fields.length; j++) { + var field = fields[j], + keys = Object.keys(otherMatchData.metadata[term][field]) + + if (this.metadata[term][field] == undefined) { + this.metadata[term][field] = Object.create(null) + } + + for (var k = 0; k < keys.length; k++) { + var key = keys[k] + + if (this.metadata[term][field][key] == undefined) { + this.metadata[term][field][key] = otherMatchData.metadata[term][field][key] + } else { + this.metadata[term][field][key] = this.metadata[term][field][key].concat(otherMatchData.metadata[term][field][key]) + } + + } + } + } +} + +/** + * Add metadata for a term/field pair to this instance of match data. + * + * @param {string} term - The term this match data is associated with + * @param {string} field - The field in which the term was found + * @param {object} metadata - The metadata recorded about this term in this field + */ +lunr.MatchData.prototype.add = function (term, field, metadata) { + if (!(term in this.metadata)) { + this.metadata[term] = Object.create(null) + this.metadata[term][field] = metadata + return + } + + if (!(field in this.metadata[term])) { + this.metadata[term][field] = metadata + return + } + + var metadataKeys = Object.keys(metadata) + + for (var i = 0; i < metadataKeys.length; i++) { + var key = metadataKeys[i] + + if (key in this.metadata[term][field]) { + this.metadata[term][field][key] = this.metadata[term][field][key].concat(metadata[key]) + } else { + this.metadata[term][field][key] = metadata[key] + } + } +} +/** + * A lunr.Query provides a programmatic way of defining queries to be performed + * against a {@link lunr.Index}. + * + * Prefer constructing a lunr.Query using the {@link lunr.Index#query} method + * so the query object is pre-initialized with the right index fields. + * + * @constructor + * @property {lunr.Query~Clause[]} clauses - An array of query clauses. + * @property {string[]} allFields - An array of all available fields in a lunr.Index. + */ +lunr.Query = function (allFields) { + this.clauses = [] + this.allFields = allFields +} + +/** + * Constants for indicating what kind of automatic wildcard insertion will be used when constructing a query clause. + * + * This allows wildcards to be added to the beginning and end of a term without having to manually do any string + * concatenation. + * + * The wildcard constants can be bitwise combined to select both leading and trailing wildcards. + * + * @constant + * @default + * @property {number} wildcard.NONE - The term will have no wildcards inserted, this is the default behaviour + * @property {number} wildcard.LEADING - Prepend the term with a wildcard, unless a leading wildcard already exists + * @property {number} wildcard.TRAILING - Append a wildcard to the term, unless a trailing wildcard already exists + * @see lunr.Query~Clause + * @see lunr.Query#clause + * @see lunr.Query#term + * @example query term with trailing wildcard + * query.term('foo', { wildcard: lunr.Query.wildcard.TRAILING }) + * @example query term with leading and trailing wildcard + * query.term('foo', { + * wildcard: lunr.Query.wildcard.LEADING | lunr.Query.wildcard.TRAILING + * }) + */ + +lunr.Query.wildcard = new String ("*") +lunr.Query.wildcard.NONE = 0 +lunr.Query.wildcard.LEADING = 1 +lunr.Query.wildcard.TRAILING = 2 + +/** + * Constants for indicating what kind of presence a term must have in matching documents. + * + * @constant + * @enum {number} + * @see lunr.Query~Clause + * @see lunr.Query#clause + * @see lunr.Query#term + * @example query term with required presence + * query.term('foo', { presence: lunr.Query.presence.REQUIRED }) + */ +lunr.Query.presence = { + /** + * Term's presence in a document is optional, this is the default value. + */ + OPTIONAL: 1, + + /** + * Term's presence in a document is required, documents that do not contain + * this term will not be returned. + */ + REQUIRED: 2, + + /** + * Term's presence in a document is prohibited, documents that do contain + * this term will not be returned. + */ + PROHIBITED: 3 +} + +/** + * A single clause in a {@link lunr.Query} contains a term and details on how to + * match that term against a {@link lunr.Index}. + * + * @typedef {Object} lunr.Query~Clause + * @property {string[]} fields - The fields in an index this clause should be matched against. + * @property {number} [boost=1] - Any boost that should be applied when matching this clause. + * @property {number} [editDistance] - Whether the term should have fuzzy matching applied, and how fuzzy the match should be. + * @property {boolean} [usePipeline] - Whether the term should be passed through the search pipeline. + * @property {number} [wildcard=lunr.Query.wildcard.NONE] - Whether the term should have wildcards appended or prepended. + * @property {number} [presence=lunr.Query.presence.OPTIONAL] - The terms presence in any matching documents. + */ + +/** + * Adds a {@link lunr.Query~Clause} to this query. + * + * Unless the clause contains the fields to be matched all fields will be matched. In addition + * a default boost of 1 is applied to the clause. + * + * @param {lunr.Query~Clause} clause - The clause to add to this query. + * @see lunr.Query~Clause + * @returns {lunr.Query} + */ +lunr.Query.prototype.clause = function (clause) { + if (!('fields' in clause)) { + clause.fields = this.allFields + } + + if (!('boost' in clause)) { + clause.boost = 1 + } + + if (!('usePipeline' in clause)) { + clause.usePipeline = true + } + + if (!('wildcard' in clause)) { + clause.wildcard = lunr.Query.wildcard.NONE + } + + if ((clause.wildcard & lunr.Query.wildcard.LEADING) && (clause.term.charAt(0) != lunr.Query.wildcard)) { + clause.term = "*" + clause.term + } + + if ((clause.wildcard & lunr.Query.wildcard.TRAILING) && (clause.term.slice(-1) != lunr.Query.wildcard)) { + clause.term = "" + clause.term + "*" + } + + if (!('presence' in clause)) { + clause.presence = lunr.Query.presence.OPTIONAL + } + + this.clauses.push(clause) + + return this +} + +/** + * A negated query is one in which every clause has a presence of + * prohibited. These queries require some special processing to return + * the expected results. + * + * @returns boolean + */ +lunr.Query.prototype.isNegated = function () { + for (var i = 0; i < this.clauses.length; i++) { + if (this.clauses[i].presence != lunr.Query.presence.PROHIBITED) { + return false + } + } + + return true +} + +/** + * Adds a term to the current query, under the covers this will create a {@link lunr.Query~Clause} + * to the list of clauses that make up this query. + * + * The term is used as is, i.e. no tokenization will be performed by this method. Instead conversion + * to a token or token-like string should be done before calling this method. + * + * The term will be converted to a string by calling `toString`. Multiple terms can be passed as an + * array, each term in the array will share the same options. + * + * @param {object|object[]} term - The term(s) to add to the query. + * @param {object} [options] - Any additional properties to add to the query clause. + * @returns {lunr.Query} + * @see lunr.Query#clause + * @see lunr.Query~Clause + * @example adding a single term to a query + * query.term("foo") + * @example adding a single term to a query and specifying search fields, term boost and automatic trailing wildcard + * query.term("foo", { + * fields: ["title"], + * boost: 10, + * wildcard: lunr.Query.wildcard.TRAILING + * }) + * @example using lunr.tokenizer to convert a string to tokens before using them as terms + * query.term(lunr.tokenizer("foo bar")) + */ +lunr.Query.prototype.term = function (term, options) { + if (Array.isArray(term)) { + term.forEach(function (t) { this.term(t, lunr.utils.clone(options)) }, this) + return this + } + + var clause = options || {} + clause.term = term.toString() + + this.clause(clause) + + return this +} +lunr.QueryParseError = function (message, start, end) { + this.name = "QueryParseError" + this.message = message + this.start = start + this.end = end +} + +lunr.QueryParseError.prototype = new Error +lunr.QueryLexer = function (str) { + this.lexemes = [] + this.str = str + this.length = str.length + this.pos = 0 + this.start = 0 + this.escapeCharPositions = [] +} + +lunr.QueryLexer.prototype.run = function () { + var state = lunr.QueryLexer.lexText + + while (state) { + state = state(this) + } +} + +lunr.QueryLexer.prototype.sliceString = function () { + var subSlices = [], + sliceStart = this.start, + sliceEnd = this.pos + + for (var i = 0; i < this.escapeCharPositions.length; i++) { + sliceEnd = this.escapeCharPositions[i] + subSlices.push(this.str.slice(sliceStart, sliceEnd)) + sliceStart = sliceEnd + 1 + } + + subSlices.push(this.str.slice(sliceStart, this.pos)) + this.escapeCharPositions.length = 0 + + return subSlices.join('') +} + +lunr.QueryLexer.prototype.emit = function (type) { + this.lexemes.push({ + type: type, + str: this.sliceString(), + start: this.start, + end: this.pos + }) + + this.start = this.pos +} + +lunr.QueryLexer.prototype.escapeCharacter = function () { + this.escapeCharPositions.push(this.pos - 1) + this.pos += 1 +} + +lunr.QueryLexer.prototype.next = function () { + if (this.pos >= this.length) { + return lunr.QueryLexer.EOS + } + + var char = this.str.charAt(this.pos) + this.pos += 1 + return char +} + +lunr.QueryLexer.prototype.width = function () { + return this.pos - this.start +} + +lunr.QueryLexer.prototype.ignore = function () { + if (this.start == this.pos) { + this.pos += 1 + } + + this.start = this.pos +} + +lunr.QueryLexer.prototype.backup = function () { + this.pos -= 1 +} + +lunr.QueryLexer.prototype.acceptDigitRun = function () { + var char, charCode + + do { + char = this.next() + charCode = char.charCodeAt(0) + } while (charCode > 47 && charCode < 58) + + if (char != lunr.QueryLexer.EOS) { + this.backup() + } +} + +lunr.QueryLexer.prototype.more = function () { + return this.pos < this.length +} + +lunr.QueryLexer.EOS = 'EOS' +lunr.QueryLexer.FIELD = 'FIELD' +lunr.QueryLexer.TERM = 'TERM' +lunr.QueryLexer.EDIT_DISTANCE = 'EDIT_DISTANCE' +lunr.QueryLexer.BOOST = 'BOOST' +lunr.QueryLexer.PRESENCE = 'PRESENCE' + +lunr.QueryLexer.lexField = function (lexer) { + lexer.backup() + lexer.emit(lunr.QueryLexer.FIELD) + lexer.ignore() + return lunr.QueryLexer.lexText +} + +lunr.QueryLexer.lexTerm = function (lexer) { + if (lexer.width() > 1) { + lexer.backup() + lexer.emit(lunr.QueryLexer.TERM) + } + + lexer.ignore() + + if (lexer.more()) { + return lunr.QueryLexer.lexText + } +} + +lunr.QueryLexer.lexEditDistance = function (lexer) { + lexer.ignore() + lexer.acceptDigitRun() + lexer.emit(lunr.QueryLexer.EDIT_DISTANCE) + return lunr.QueryLexer.lexText +} + +lunr.QueryLexer.lexBoost = function (lexer) { + lexer.ignore() + lexer.acceptDigitRun() + lexer.emit(lunr.QueryLexer.BOOST) + return lunr.QueryLexer.lexText +} + +lunr.QueryLexer.lexEOS = function (lexer) { + if (lexer.width() > 0) { + lexer.emit(lunr.QueryLexer.TERM) + } +} + +// This matches the separator used when tokenising fields +// within a document. These should match otherwise it is +// not possible to search for some tokens within a document. +// +// It is possible for the user to change the separator on the +// tokenizer so it _might_ clash with any other of the special +// characters already used within the search string, e.g. :. +// +// This means that it is possible to change the separator in +// such a way that makes some words unsearchable using a search +// string. +lunr.QueryLexer.termSeparator = lunr.tokenizer.separator + +lunr.QueryLexer.lexText = function (lexer) { + while (true) { + var char = lexer.next() + + if (char == lunr.QueryLexer.EOS) { + return lunr.QueryLexer.lexEOS + } + + // Escape character is '\' + if (char.charCodeAt(0) == 92) { + lexer.escapeCharacter() + continue + } + + if (char == ":") { + return lunr.QueryLexer.lexField + } + + if (char == "~") { + lexer.backup() + if (lexer.width() > 0) { + lexer.emit(lunr.QueryLexer.TERM) + } + return lunr.QueryLexer.lexEditDistance + } + + if (char == "^") { + lexer.backup() + if (lexer.width() > 0) { + lexer.emit(lunr.QueryLexer.TERM) + } + return lunr.QueryLexer.lexBoost + } + + // "+" indicates term presence is required + // checking for length to ensure that only + // leading "+" are considered + if (char == "+" && lexer.width() === 1) { + lexer.emit(lunr.QueryLexer.PRESENCE) + return lunr.QueryLexer.lexText + } + + // "-" indicates term presence is prohibited + // checking for length to ensure that only + // leading "-" are considered + if (char == "-" && lexer.width() === 1) { + lexer.emit(lunr.QueryLexer.PRESENCE) + return lunr.QueryLexer.lexText + } + + if (char.match(lunr.QueryLexer.termSeparator)) { + return lunr.QueryLexer.lexTerm + } + } +} + +lunr.QueryParser = function (str, query) { + this.lexer = new lunr.QueryLexer (str) + this.query = query + this.currentClause = {} + this.lexemeIdx = 0 +} + +lunr.QueryParser.prototype.parse = function () { + this.lexer.run() + this.lexemes = this.lexer.lexemes + + var state = lunr.QueryParser.parseClause + + while (state) { + state = state(this) + } + + return this.query +} + +lunr.QueryParser.prototype.peekLexeme = function () { + return this.lexemes[this.lexemeIdx] +} + +lunr.QueryParser.prototype.consumeLexeme = function () { + var lexeme = this.peekLexeme() + this.lexemeIdx += 1 + return lexeme +} + +lunr.QueryParser.prototype.nextClause = function () { + var completedClause = this.currentClause + this.query.clause(completedClause) + this.currentClause = {} +} + +lunr.QueryParser.parseClause = function (parser) { + var lexeme = parser.peekLexeme() + + if (lexeme == undefined) { + return + } + + switch (lexeme.type) { + case lunr.QueryLexer.PRESENCE: + return lunr.QueryParser.parsePresence + case lunr.QueryLexer.FIELD: + return lunr.QueryParser.parseField + case lunr.QueryLexer.TERM: + return lunr.QueryParser.parseTerm + default: + var errorMessage = "expected either a field or a term, found " + lexeme.type + + if (lexeme.str.length >= 1) { + errorMessage += " with value '" + lexeme.str + "'" + } + + throw new lunr.QueryParseError (errorMessage, lexeme.start, lexeme.end) + } +} + +lunr.QueryParser.parsePresence = function (parser) { + var lexeme = parser.consumeLexeme() + + if (lexeme == undefined) { + return + } + + switch (lexeme.str) { + case "-": + parser.currentClause.presence = lunr.Query.presence.PROHIBITED + break + case "+": + parser.currentClause.presence = lunr.Query.presence.REQUIRED + break + default: + var errorMessage = "unrecognised presence operator'" + lexeme.str + "'" + throw new lunr.QueryParseError (errorMessage, lexeme.start, lexeme.end) + } + + var nextLexeme = parser.peekLexeme() + + if (nextLexeme == undefined) { + var errorMessage = "expecting term or field, found nothing" + throw new lunr.QueryParseError (errorMessage, lexeme.start, lexeme.end) + } + + switch (nextLexeme.type) { + case lunr.QueryLexer.FIELD: + return lunr.QueryParser.parseField + case lunr.QueryLexer.TERM: + return lunr.QueryParser.parseTerm + default: + var errorMessage = "expecting term or field, found '" + nextLexeme.type + "'" + throw new lunr.QueryParseError (errorMessage, nextLexeme.start, nextLexeme.end) + } +} + +lunr.QueryParser.parseField = function (parser) { + var lexeme = parser.consumeLexeme() + + if (lexeme == undefined) { + return + } + + if (parser.query.allFields.indexOf(lexeme.str) == -1) { + var possibleFields = parser.query.allFields.map(function (f) { return "'" + f + "'" }).join(', '), + errorMessage = "unrecognised field '" + lexeme.str + "', possible fields: " + possibleFields + + throw new lunr.QueryParseError (errorMessage, lexeme.start, lexeme.end) + } + + parser.currentClause.fields = [lexeme.str] + + var nextLexeme = parser.peekLexeme() + + if (nextLexeme == undefined) { + var errorMessage = "expecting term, found nothing" + throw new lunr.QueryParseError (errorMessage, lexeme.start, lexeme.end) + } + + switch (nextLexeme.type) { + case lunr.QueryLexer.TERM: + return lunr.QueryParser.parseTerm + default: + var errorMessage = "expecting term, found '" + nextLexeme.type + "'" + throw new lunr.QueryParseError (errorMessage, nextLexeme.start, nextLexeme.end) + } +} + +lunr.QueryParser.parseTerm = function (parser) { + var lexeme = parser.consumeLexeme() + + if (lexeme == undefined) { + return + } + + parser.currentClause.term = lexeme.str.toLowerCase() + + if (lexeme.str.indexOf("*") != -1) { + parser.currentClause.usePipeline = false + } + + var nextLexeme = parser.peekLexeme() + + if (nextLexeme == undefined) { + parser.nextClause() + return + } + + switch (nextLexeme.type) { + case lunr.QueryLexer.TERM: + parser.nextClause() + return lunr.QueryParser.parseTerm + case lunr.QueryLexer.FIELD: + parser.nextClause() + return lunr.QueryParser.parseField + case lunr.QueryLexer.EDIT_DISTANCE: + return lunr.QueryParser.parseEditDistance + case lunr.QueryLexer.BOOST: + return lunr.QueryParser.parseBoost + case lunr.QueryLexer.PRESENCE: + parser.nextClause() + return lunr.QueryParser.parsePresence + default: + var errorMessage = "Unexpected lexeme type '" + nextLexeme.type + "'" + throw new lunr.QueryParseError (errorMessage, nextLexeme.start, nextLexeme.end) + } +} + +lunr.QueryParser.parseEditDistance = function (parser) { + var lexeme = parser.consumeLexeme() + + if (lexeme == undefined) { + return + } + + var editDistance = parseInt(lexeme.str, 10) + + if (isNaN(editDistance)) { + var errorMessage = "edit distance must be numeric" + throw new lunr.QueryParseError (errorMessage, lexeme.start, lexeme.end) + } + + parser.currentClause.editDistance = editDistance + + var nextLexeme = parser.peekLexeme() + + if (nextLexeme == undefined) { + parser.nextClause() + return + } + + switch (nextLexeme.type) { + case lunr.QueryLexer.TERM: + parser.nextClause() + return lunr.QueryParser.parseTerm + case lunr.QueryLexer.FIELD: + parser.nextClause() + return lunr.QueryParser.parseField + case lunr.QueryLexer.EDIT_DISTANCE: + return lunr.QueryParser.parseEditDistance + case lunr.QueryLexer.BOOST: + return lunr.QueryParser.parseBoost + case lunr.QueryLexer.PRESENCE: + parser.nextClause() + return lunr.QueryParser.parsePresence + default: + var errorMessage = "Unexpected lexeme type '" + nextLexeme.type + "'" + throw new lunr.QueryParseError (errorMessage, nextLexeme.start, nextLexeme.end) + } +} + +lunr.QueryParser.parseBoost = function (parser) { + var lexeme = parser.consumeLexeme() + + if (lexeme == undefined) { + return + } + + var boost = parseInt(lexeme.str, 10) + + if (isNaN(boost)) { + var errorMessage = "boost must be numeric" + throw new lunr.QueryParseError (errorMessage, lexeme.start, lexeme.end) + } + + parser.currentClause.boost = boost + + var nextLexeme = parser.peekLexeme() + + if (nextLexeme == undefined) { + parser.nextClause() + return + } + + switch (nextLexeme.type) { + case lunr.QueryLexer.TERM: + parser.nextClause() + return lunr.QueryParser.parseTerm + case lunr.QueryLexer.FIELD: + parser.nextClause() + return lunr.QueryParser.parseField + case lunr.QueryLexer.EDIT_DISTANCE: + return lunr.QueryParser.parseEditDistance + case lunr.QueryLexer.BOOST: + return lunr.QueryParser.parseBoost + case lunr.QueryLexer.PRESENCE: + parser.nextClause() + return lunr.QueryParser.parsePresence + default: + var errorMessage = "Unexpected lexeme type '" + nextLexeme.type + "'" + throw new lunr.QueryParseError (errorMessage, nextLexeme.start, nextLexeme.end) + } +} + + /** + * export the module via AMD, CommonJS or as a browser global + * Export code from https://github.com/umdjs/umd/blob/master/returnExports.js + */ + ;(function (root, factory) { + if (typeof define === 'function' && define.amd) { + // AMD. Register as an anonymous module. + define(factory) + } else if (typeof exports === 'object') { + /** + * Node. Does not work with strict CommonJS, but + * only CommonJS-like enviroments that support module.exports, + * like Node. + */ + module.exports = factory() + } else { + // Browser globals (root is window) + root.lunr = factory() + } + }(this, function () { + /** + * Just return a value to define the module export. + * This example returns an object, but the module + * can return a function as the exported value. + */ + return lunr + })) +})(); diff --git a/assets/js/lunr/lunr.min.js b/assets/js/lunr/lunr.min.js new file mode 100644 index 00000000..cdc94cd3 --- /dev/null +++ b/assets/js/lunr/lunr.min.js @@ -0,0 +1,6 @@ +/** + * lunr - http://lunrjs.com - A bit like Solr, but much smaller and not as bright - 2.3.9 + * Copyright (C) 2020 Oliver Nightingale + * @license MIT + */ +!function(){var e=function(t){var r=new e.Builder;return r.pipeline.add(e.trimmer,e.stopWordFilter,e.stemmer),r.searchPipeline.add(e.stemmer),t.call(r,r),r.build()};e.version="2.3.9",e.utils={},e.utils.warn=function(e){return function(t){e.console&&console.warn&&console.warn(t)}}(this),e.utils.asString=function(e){return void 0===e||null===e?"":e.toString()},e.utils.clone=function(e){if(null===e||void 0===e)return e;for(var t=Object.create(null),r=Object.keys(e),i=0;i0){var c=e.utils.clone(r)||{};c.position=[a,l],c.index=s.length,s.push(new e.Token(i.slice(a,o),c))}a=o+1}}return s},e.tokenizer.separator=/[\s\-]+/,e.Pipeline=function(){this._stack=[]},e.Pipeline.registeredFunctions=Object.create(null),e.Pipeline.registerFunction=function(t,r){r in this.registeredFunctions&&e.utils.warn("Overwriting existing registered function: "+r),t.label=r,e.Pipeline.registeredFunctions[t.label]=t},e.Pipeline.warnIfFunctionNotRegistered=function(t){var r=t.label&&t.label in this.registeredFunctions;r||e.utils.warn("Function is not registered with pipeline. This may cause problems when serialising the index.\n",t)},e.Pipeline.load=function(t){var r=new e.Pipeline;return t.forEach(function(t){var i=e.Pipeline.registeredFunctions[t];if(!i)throw new Error("Cannot load unregistered function: "+t);r.add(i)}),r},e.Pipeline.prototype.add=function(){var t=Array.prototype.slice.call(arguments);t.forEach(function(t){e.Pipeline.warnIfFunctionNotRegistered(t),this._stack.push(t)},this)},e.Pipeline.prototype.after=function(t,r){e.Pipeline.warnIfFunctionNotRegistered(r);var i=this._stack.indexOf(t);if(i==-1)throw new Error("Cannot find existingFn");i+=1,this._stack.splice(i,0,r)},e.Pipeline.prototype.before=function(t,r){e.Pipeline.warnIfFunctionNotRegistered(r);var i=this._stack.indexOf(t);if(i==-1)throw new Error("Cannot find existingFn");this._stack.splice(i,0,r)},e.Pipeline.prototype.remove=function(e){var t=this._stack.indexOf(e);t!=-1&&this._stack.splice(t,1)},e.Pipeline.prototype.run=function(e){for(var t=this._stack.length,r=0;r1&&(se&&(r=n),s!=e);)i=r-t,n=t+Math.floor(i/2),s=this.elements[2*n];return s==e?2*n:s>e?2*n:sa?l+=2:o==a&&(t+=r[u+1]*i[l+1],u+=2,l+=2);return t},e.Vector.prototype.similarity=function(e){return this.dot(e)/this.magnitude()||0},e.Vector.prototype.toArray=function(){for(var e=new Array(this.elements.length/2),t=1,r=0;t0){var o,a=s.str.charAt(0);a in s.node.edges?o=s.node.edges[a]:(o=new e.TokenSet,s.node.edges[a]=o),1==s.str.length&&(o["final"]=!0),n.push({node:o,editsRemaining:s.editsRemaining,str:s.str.slice(1)})}if(0!=s.editsRemaining){if("*"in s.node.edges)var u=s.node.edges["*"];else{var u=new e.TokenSet;s.node.edges["*"]=u}if(0==s.str.length&&(u["final"]=!0),n.push({node:u,editsRemaining:s.editsRemaining-1,str:s.str}),s.str.length>1&&n.push({node:s.node,editsRemaining:s.editsRemaining-1,str:s.str.slice(1)}),1==s.str.length&&(s.node["final"]=!0),s.str.length>=1){if("*"in s.node.edges)var l=s.node.edges["*"];else{var l=new e.TokenSet;s.node.edges["*"]=l}1==s.str.length&&(l["final"]=!0),n.push({node:l,editsRemaining:s.editsRemaining-1,str:s.str.slice(1)})}if(s.str.length>1){var c,h=s.str.charAt(0),d=s.str.charAt(1);d in s.node.edges?c=s.node.edges[d]:(c=new e.TokenSet,s.node.edges[d]=c),1==s.str.length&&(c["final"]=!0),n.push({node:c,editsRemaining:s.editsRemaining-1,str:h+s.str.slice(2)})}}}return i},e.TokenSet.fromString=function(t){for(var r=new e.TokenSet,i=r,n=0,s=t.length;n=e;t--){var r=this.uncheckedNodes[t],i=r.child.toString();i in this.minimizedNodes?r.parent.edges[r["char"]]=this.minimizedNodes[i]:(r.child._str=i,this.minimizedNodes[i]=r.child),this.uncheckedNodes.pop()}},e.Index=function(e){this.invertedIndex=e.invertedIndex,this.fieldVectors=e.fieldVectors,this.tokenSet=e.tokenSet,this.fields=e.fields,this.pipeline=e.pipeline},e.Index.prototype.search=function(t){return this.query(function(r){var i=new e.QueryParser(t,r);i.parse()})},e.Index.prototype.query=function(t){for(var r=new e.Query(this.fields),i=Object.create(null),n=Object.create(null),s=Object.create(null),o=Object.create(null),a=Object.create(null),u=0;u1?this._b=1:this._b=e},e.Builder.prototype.k1=function(e){this._k1=e},e.Builder.prototype.add=function(t,r){var i=t[this._ref],n=Object.keys(this._fields);this._documents[i]=r||{},this.documentCount+=1;for(var s=0;s=this.length)return e.QueryLexer.EOS;var t=this.str.charAt(this.pos);return this.pos+=1,t},e.QueryLexer.prototype.width=function(){return this.pos-this.start},e.QueryLexer.prototype.ignore=function(){this.start==this.pos&&(this.pos+=1),this.start=this.pos},e.QueryLexer.prototype.backup=function(){this.pos-=1},e.QueryLexer.prototype.acceptDigitRun=function(){var t,r;do t=this.next(),r=t.charCodeAt(0);while(r>47&&r<58);t!=e.QueryLexer.EOS&&this.backup()},e.QueryLexer.prototype.more=function(){return this.pos1&&(t.backup(),t.emit(e.QueryLexer.TERM)),t.ignore(),t.more())return e.QueryLexer.lexText},e.QueryLexer.lexEditDistance=function(t){return t.ignore(),t.acceptDigitRun(),t.emit(e.QueryLexer.EDIT_DISTANCE),e.QueryLexer.lexText},e.QueryLexer.lexBoost=function(t){return t.ignore(),t.acceptDigitRun(),t.emit(e.QueryLexer.BOOST),e.QueryLexer.lexText},e.QueryLexer.lexEOS=function(t){t.width()>0&&t.emit(e.QueryLexer.TERM)},e.QueryLexer.termSeparator=e.tokenizer.separator,e.QueryLexer.lexText=function(t){for(;;){var r=t.next();if(r==e.QueryLexer.EOS)return e.QueryLexer.lexEOS;if(92!=r.charCodeAt(0)){if(":"==r)return e.QueryLexer.lexField;if("~"==r)return t.backup(),t.width()>0&&t.emit(e.QueryLexer.TERM),e.QueryLexer.lexEditDistance;if("^"==r)return t.backup(),t.width()>0&&t.emit(e.QueryLexer.TERM),e.QueryLexer.lexBoost;if("+"==r&&1===t.width())return t.emit(e.QueryLexer.PRESENCE),e.QueryLexer.lexText;if("-"==r&&1===t.width())return t.emit(e.QueryLexer.PRESENCE),e.QueryLexer.lexText;if(r.match(e.QueryLexer.termSeparator))return e.QueryLexer.lexTerm}else t.escapeCharacter()}},e.QueryParser=function(t,r){this.lexer=new e.QueryLexer(t),this.query=r,this.currentClause={},this.lexemeIdx=0},e.QueryParser.prototype.parse=function(){this.lexer.run(),this.lexemes=this.lexer.lexemes;for(var t=e.QueryParser.parseClause;t;)t=t(this);return this.query},e.QueryParser.prototype.peekLexeme=function(){return this.lexemes[this.lexemeIdx]},e.QueryParser.prototype.consumeLexeme=function(){var e=this.peekLexeme();return this.lexemeIdx+=1,e},e.QueryParser.prototype.nextClause=function(){var e=this.currentClause;this.query.clause(e),this.currentClause={}},e.QueryParser.parseClause=function(t){var r=t.peekLexeme();if(void 0!=r)switch(r.type){case e.QueryLexer.PRESENCE:return e.QueryParser.parsePresence;case e.QueryLexer.FIELD:return e.QueryParser.parseField;case e.QueryLexer.TERM:return e.QueryParser.parseTerm;default:var i="expected either a field or a term, found "+r.type;throw r.str.length>=1&&(i+=" with value '"+r.str+"'"),new e.QueryParseError(i,r.start,r.end)}},e.QueryParser.parsePresence=function(t){var r=t.consumeLexeme();if(void 0!=r){switch(r.str){case"-":t.currentClause.presence=e.Query.presence.PROHIBITED;break;case"+":t.currentClause.presence=e.Query.presence.REQUIRED;break;default:var i="unrecognised presence operator'"+r.str+"'";throw new e.QueryParseError(i,r.start,r.end)}var n=t.peekLexeme();if(void 0==n){var i="expecting term or field, found nothing";throw new e.QueryParseError(i,r.start,r.end)}switch(n.type){case e.QueryLexer.FIELD:return e.QueryParser.parseField;case e.QueryLexer.TERM:return e.QueryParser.parseTerm;default:var i="expecting term or field, found '"+n.type+"'";throw new e.QueryParseError(i,n.start,n.end)}}},e.QueryParser.parseField=function(t){var r=t.consumeLexeme();if(void 0!=r){if(t.query.allFields.indexOf(r.str)==-1){var i=t.query.allFields.map(function(e){return"'"+e+"'"}).join(", "),n="unrecognised field '"+r.str+"', possible fields: "+i;throw new e.QueryParseError(n,r.start,r.end)}t.currentClause.fields=[r.str];var s=t.peekLexeme();if(void 0==s){var n="expecting term, found nothing";throw new e.QueryParseError(n,r.start,r.end)}switch(s.type){case e.QueryLexer.TERM:return e.QueryParser.parseTerm;default:var n="expecting term, found '"+s.type+"'";throw new e.QueryParseError(n,s.start,s.end)}}},e.QueryParser.parseTerm=function(t){var r=t.consumeLexeme();if(void 0!=r){t.currentClause.term=r.str.toLowerCase(),r.str.indexOf("*")!=-1&&(t.currentClause.usePipeline=!1);var i=t.peekLexeme();if(void 0==i)return void t.nextClause();switch(i.type){case e.QueryLexer.TERM:return t.nextClause(),e.QueryParser.parseTerm;case e.QueryLexer.FIELD:return t.nextClause(),e.QueryParser.parseField;case e.QueryLexer.EDIT_DISTANCE:return e.QueryParser.parseEditDistance;case e.QueryLexer.BOOST:return e.QueryParser.parseBoost;case e.QueryLexer.PRESENCE:return t.nextClause(),e.QueryParser.parsePresence;default:var n="Unexpected lexeme type '"+i.type+"'";throw new e.QueryParseError(n,i.start,i.end)}}},e.QueryParser.parseEditDistance=function(t){var r=t.consumeLexeme();if(void 0!=r){var i=parseInt(r.str,10);if(isNaN(i)){var n="edit distance must be numeric";throw new e.QueryParseError(n,r.start,r.end)}t.currentClause.editDistance=i;var s=t.peekLexeme();if(void 0==s)return void t.nextClause();switch(s.type){case e.QueryLexer.TERM:return t.nextClause(),e.QueryParser.parseTerm;case e.QueryLexer.FIELD:return t.nextClause(),e.QueryParser.parseField;case e.QueryLexer.EDIT_DISTANCE:return e.QueryParser.parseEditDistance;case e.QueryLexer.BOOST:return e.QueryParser.parseBoost;case e.QueryLexer.PRESENCE:return t.nextClause(),e.QueryParser.parsePresence;default:var n="Unexpected lexeme type '"+s.type+"'";throw new e.QueryParseError(n,s.start,s.end)}}},e.QueryParser.parseBoost=function(t){var r=t.consumeLexeme();if(void 0!=r){var i=parseInt(r.str,10);if(isNaN(i)){var n="boost must be numeric";throw new e.QueryParseError(n,r.start,r.end)}t.currentClause.boost=i;var s=t.peekLexeme();if(void 0==s)return void t.nextClause();switch(s.type){case e.QueryLexer.TERM:return t.nextClause(),e.QueryParser.parseTerm;case e.QueryLexer.FIELD:return t.nextClause(),e.QueryParser.parseField;case e.QueryLexer.EDIT_DISTANCE:return e.QueryParser.parseEditDistance;case e.QueryLexer.BOOST:return e.QueryParser.parseBoost;case e.QueryLexer.PRESENCE:return t.nextClause(),e.QueryParser.parsePresence;default:var n="Unexpected lexeme type '"+s.type+"'";throw new e.QueryParseError(n,s.start,s.end)}}},function(e,t){"function"==typeof define&&define.amd?define(t):"object"==typeof exports?module.exports=t():e.lunr=t()}(this,function(){return e})}(); diff --git a/assets/js/main.min.js b/assets/js/main.min.js new file mode 100644 index 00000000..6b5bb602 --- /dev/null +++ b/assets/js/main.min.js @@ -0,0 +1,6 @@ +/*! + * Minimal Mistakes Jekyll Theme 4.24.0 by Michael Rose + * Copyright 2013-2021 Michael Rose - mademistakes.com | @mmistakes + * Licensed under MIT + */ +!function(e,t){"use strict";"object"==typeof module&&"object"==typeof module.exports?module.exports=e.document?t(e,!0):function(e){if(!e.document)throw new Error("jQuery requires a window with a document");return t(e)}:t(e)}("undefined"!=typeof window?window:this,function(C,e){"use strict";function m(e){return null!=e&&e===e.window}var t=[],n=Object.getPrototypeOf,s=t.slice,g=t.flat?function(e){return t.flat.call(e)}:function(e){return t.concat.apply([],e)},l=t.push,o=t.indexOf,r={},i=r.toString,v=r.hasOwnProperty,a=v.toString,u=a.call(Object),y={},b=function(e){return"function"==typeof e&&"number"!=typeof e.nodeType&&"function"!=typeof e.item},T=C.document,c={type:!0,src:!0,nonce:!0,noModule:!0};function x(e,t,n){var r,o,i=(n=n||T).createElement("script");if(i.text=e,t)for(r in c)(o=t[r]||t.getAttribute&&t.getAttribute(r))&&i.setAttribute(r,o);n.head.appendChild(i).parentNode.removeChild(i)}function h(e){return null==e?e+"":"object"==typeof e||"function"==typeof e?r[i.call(e)]||"object":typeof e}var f="3.6.0",E=function(e,t){return new E.fn.init(e,t)};function d(e){var t=!!e&&"length"in e&&e.length,n=h(e);return!b(e)&&!m(e)&&("array"===n||0===t||"number"==typeof t&&0>10|55296,1023&e|56320))}function r(){C()}var e,d,x,i,o,p,h,m,w,l,u,C,T,a,E,g,s,c,v,S="sizzle"+ +new Date,y=n.document,k=0,b=0,A=le(),N=le(),j=le(),I=le(),L=function(e,t){return e===t&&(u=!0),0},D={}.hasOwnProperty,t=[],O=t.pop,H=t.push,P=t.push,q=t.slice,M=function(e,t){for(var n=0,r=e.length;n+~]|"+$+")"+$+"*"),Q=new RegExp($+"|>"),Y=new RegExp(F),V=new RegExp("^"+R+"$"),G={ID:new RegExp("^#("+R+")"),CLASS:new RegExp("^\\.("+R+")"),TAG:new RegExp("^("+R+"|[*])"),ATTR:new RegExp("^"+B),PSEUDO:new RegExp("^"+F),CHILD:new RegExp("^:(only|first|last|nth|nth-last)-(child|of-type)(?:\\("+$+"*(even|odd|(([+-]|)(\\d*)n|)"+$+"*(?:([+-]|)"+$+"*(\\d+)|))"+$+"*\\)|)","i"),bool:new RegExp("^(?:"+_+")$","i"),needsContext:new RegExp("^"+$+"*[>+~]|:(even|odd|eq|gt|lt|nth|first|last)(?:\\("+$+"*((?:-\\d)?\\d*)"+$+"*\\)|)(?=[^-]|$)","i")},K=/HTML$/i,Z=/^(?:input|select|textarea|button)$/i,J=/^h\d$/i,ee=/^[^{]+\{\s*\[native \w/,te=/^(?:#([\w-]+)|(\w+)|\.([\w-]+))$/,ne=/[+~]/,re=new RegExp("\\\\[\\da-fA-F]{1,6}"+$+"?|\\\\([^\\r\\n\\f])","g"),oe=/([\0-\x1f\x7f]|^-?\d)|^-$|[^\0-\x1f\x7f-\uFFFF\w-]/g,ie=function(e,t){return t?"\0"===e?"�":e.slice(0,-1)+"\\"+e.charCodeAt(e.length-1).toString(16)+" ":"\\"+e},ae=ye(function(e){return!0===e.disabled&&"fieldset"===e.nodeName.toLowerCase()},{dir:"parentNode",next:"legend"});try{P.apply(t=q.call(y.childNodes),y.childNodes),t[y.childNodes.length].nodeType}catch(e){P={apply:t.length?function(e,t){H.apply(e,q.call(t))}:function(e,t){for(var n=e.length,r=0;e[n++]=t[r++];);e.length=n-1}}}function se(t,e,n,r){var o,i,a,s,l,u,c=e&&e.ownerDocument,f=e?e.nodeType:9;if(n=n||[],"string"!=typeof t||!t||1!==f&&9!==f&&11!==f)return n;if(!r&&(C(e),e=e||T,E)){if(11!==f&&(s=te.exec(t)))if(u=s[1]){if(9===f){if(!(i=e.getElementById(u)))return n;if(i.id===u)return n.push(i),n}else if(c&&(i=c.getElementById(u))&&v(e,i)&&i.id===u)return n.push(i),n}else{if(s[2])return P.apply(n,e.getElementsByTagName(t)),n;if((u=s[3])&&d.getElementsByClassName&&e.getElementsByClassName)return P.apply(n,e.getElementsByClassName(u)),n}if(d.qsa&&!I[t+" "]&&(!g||!g.test(t))&&(1!==f||"object"!==e.nodeName.toLowerCase())){if(u=t,c=e,1===f&&(Q.test(t)||X.test(t))){for((c=ne.test(t)&&me(e.parentNode)||e)===e&&d.scope||((a=e.getAttribute("id"))?a=a.replace(oe,ie):e.setAttribute("id",a=S)),o=(l=p(t)).length;o--;)l[o]=(a?"#"+a:":scope")+" "+ve(l[o]);u=l.join(",")}try{return P.apply(n,c.querySelectorAll(u)),n}catch(e){I(t,!0)}finally{a===S&&e.removeAttribute("id")}}}return m(t.replace(z,"$1"),e,n,r)}function le(){var n=[];function r(e,t){return n.push(e+" ")>x.cacheLength&&delete r[n.shift()],r[e+" "]=t}return r}function ue(e){return e[S]=!0,e}function ce(e){var t=T.createElement("fieldset");try{return!!e(t)}catch(e){return!1}finally{t.parentNode&&t.parentNode.removeChild(t),t=null}}function fe(e,t){for(var n=e.split("|"),r=n.length;r--;)x.attrHandle[n[r]]=t}function de(e,t){var n=t&&e,r=n&&1===e.nodeType&&1===t.nodeType&&e.sourceIndex-t.sourceIndex;if(r)return r;if(n)for(;n=n.nextSibling;)if(n===t)return-1;return e?1:-1}function pe(t){return function(e){return"form"in e?e.parentNode&&!1===e.disabled?"label"in e?"label"in e.parentNode?e.parentNode.disabled===t:e.disabled===t:e.isDisabled===t||e.isDisabled!==!t&&ae(e)===t:e.disabled===t:"label"in e&&e.disabled===t}}function he(a){return ue(function(i){return i=+i,ue(function(e,t){for(var n,r=a([],e.length,i),o=r.length;o--;)e[n=r[o]]&&(e[n]=!(t[n]=e[n]))})})}function me(e){return e&&void 0!==e.getElementsByTagName&&e}for(e in d=se.support={},o=se.isXML=function(e){var t=e&&e.namespaceURI,e=e&&(e.ownerDocument||e).documentElement;return!K.test(t||e&&e.nodeName||"HTML")},C=se.setDocument=function(e){var t,e=e?e.ownerDocument||e:y;return e!=T&&9===e.nodeType&&e.documentElement&&(a=(T=e).documentElement,E=!o(T),y!=T&&(t=T.defaultView)&&t.top!==t&&(t.addEventListener?t.addEventListener("unload",r,!1):t.attachEvent&&t.attachEvent("onunload",r)),d.scope=ce(function(e){return a.appendChild(e).appendChild(T.createElement("div")),void 0!==e.querySelectorAll&&!e.querySelectorAll(":scope fieldset div").length}),d.attributes=ce(function(e){return e.className="i",!e.getAttribute("className")}),d.getElementsByTagName=ce(function(e){return e.appendChild(T.createComment("")),!e.getElementsByTagName("*").length}),d.getElementsByClassName=ee.test(T.getElementsByClassName),d.getById=ce(function(e){return a.appendChild(e).id=S,!T.getElementsByName||!T.getElementsByName(S).length}),d.getById?(x.filter.ID=function(e){var t=e.replace(re,f);return function(e){return e.getAttribute("id")===t}},x.find.ID=function(e,t){if(void 0!==t.getElementById&&E){e=t.getElementById(e);return e?[e]:[]}}):(x.filter.ID=function(e){var t=e.replace(re,f);return function(e){e=void 0!==e.getAttributeNode&&e.getAttributeNode("id");return e&&e.value===t}},x.find.ID=function(e,t){if(void 0!==t.getElementById&&E){var n,r,o,i=t.getElementById(e);if(i){if((n=i.getAttributeNode("id"))&&n.value===e)return[i];for(o=t.getElementsByName(e),r=0;i=o[r++];)if((n=i.getAttributeNode("id"))&&n.value===e)return[i]}return[]}}),x.find.TAG=d.getElementsByTagName?function(e,t){return void 0!==t.getElementsByTagName?t.getElementsByTagName(e):d.qsa?t.querySelectorAll(e):void 0}:function(e,t){var n,r=[],o=0,i=t.getElementsByTagName(e);if("*"!==e)return i;for(;n=i[o++];)1===n.nodeType&&r.push(n);return r},x.find.CLASS=d.getElementsByClassName&&function(e,t){if(void 0!==t.getElementsByClassName&&E)return t.getElementsByClassName(e)},s=[],g=[],(d.qsa=ee.test(T.querySelectorAll))&&(ce(function(e){var t;a.appendChild(e).innerHTML="",e.querySelectorAll("[msallowcapture^='']").length&&g.push("[*^$]="+$+"*(?:''|\"\")"),e.querySelectorAll("[selected]").length||g.push("\\["+$+"*(?:value|"+_+")"),e.querySelectorAll("[id~="+S+"-]").length||g.push("~="),(t=T.createElement("input")).setAttribute("name",""),e.appendChild(t),e.querySelectorAll("[name='']").length||g.push("\\["+$+"*name"+$+"*="+$+"*(?:''|\"\")"),e.querySelectorAll(":checked").length||g.push(":checked"),e.querySelectorAll("a#"+S+"+*").length||g.push(".#.+[+~]"),e.querySelectorAll("\\\f"),g.push("[\\r\\n\\f]")}),ce(function(e){e.innerHTML="";var t=T.createElement("input");t.setAttribute("type","hidden"),e.appendChild(t).setAttribute("name","D"),e.querySelectorAll("[name=d]").length&&g.push("name"+$+"*[*^$|!~]?="),2!==e.querySelectorAll(":enabled").length&&g.push(":enabled",":disabled"),a.appendChild(e).disabled=!0,2!==e.querySelectorAll(":disabled").length&&g.push(":enabled",":disabled"),e.querySelectorAll("*,:x"),g.push(",.*:")})),(d.matchesSelector=ee.test(c=a.matches||a.webkitMatchesSelector||a.mozMatchesSelector||a.oMatchesSelector||a.msMatchesSelector))&&ce(function(e){d.disconnectedMatch=c.call(e,"*"),c.call(e,"[s!='']:x"),s.push("!=",F)}),g=g.length&&new RegExp(g.join("|")),s=s.length&&new RegExp(s.join("|")),t=ee.test(a.compareDocumentPosition),v=t||ee.test(a.contains)?function(e,t){var n=9===e.nodeType?e.documentElement:e,t=t&&t.parentNode;return e===t||!(!t||1!==t.nodeType||!(n.contains?n.contains(t):e.compareDocumentPosition&&16&e.compareDocumentPosition(t)))}:function(e,t){if(t)for(;t=t.parentNode;)if(t===e)return!0;return!1},L=t?function(e,t){if(e===t)return u=!0,0;var n=!e.compareDocumentPosition-!t.compareDocumentPosition;return n||(1&(n=(e.ownerDocument||e)==(t.ownerDocument||t)?e.compareDocumentPosition(t):1)||!d.sortDetached&&t.compareDocumentPosition(e)===n?e==T||e.ownerDocument==y&&v(y,e)?-1:t==T||t.ownerDocument==y&&v(y,t)?1:l?M(l,e)-M(l,t):0:4&n?-1:1)}:function(e,t){if(e===t)return u=!0,0;var n,r=0,o=e.parentNode,i=t.parentNode,a=[e],s=[t];if(!o||!i)return e==T?-1:t==T?1:o?-1:i?1:l?M(l,e)-M(l,t):0;if(o===i)return de(e,t);for(n=e;n=n.parentNode;)a.unshift(n);for(n=t;n=n.parentNode;)s.unshift(n);for(;a[r]===s[r];)r++;return r?de(a[r],s[r]):a[r]==y?-1:s[r]==y?1:0}),T},se.matches=function(e,t){return se(e,null,null,t)},se.matchesSelector=function(e,t){if(C(e),d.matchesSelector&&E&&!I[t+" "]&&(!s||!s.test(t))&&(!g||!g.test(t)))try{var n=c.call(e,t);if(n||d.disconnectedMatch||e.document&&11!==e.document.nodeType)return n}catch(e){I(t,!0)}return 0":{dir:"parentNode",first:!0}," ":{dir:"parentNode"},"+":{dir:"previousSibling",first:!0},"~":{dir:"previousSibling"}},preFilter:{ATTR:function(e){return e[1]=e[1].replace(re,f),e[3]=(e[3]||e[4]||e[5]||"").replace(re,f),"~="===e[2]&&(e[3]=" "+e[3]+" "),e.slice(0,4)},CHILD:function(e){return e[1]=e[1].toLowerCase(),"nth"===e[1].slice(0,3)?(e[3]||se.error(e[0]),e[4]=+(e[4]?e[5]+(e[6]||1):2*("even"===e[3]||"odd"===e[3])),e[5]=+(e[7]+e[8]||"odd"===e[3])):e[3]&&se.error(e[0]),e},PSEUDO:function(e){var t,n=!e[6]&&e[2];return G.CHILD.test(e[0])?null:(e[3]?e[2]=e[4]||e[5]||"":n&&Y.test(n)&&(t=p(n,!0))&&(t=n.indexOf(")",n.length-t)-n.length)&&(e[0]=e[0].slice(0,t),e[2]=n.slice(0,t)),e.slice(0,3))}},filter:{TAG:function(e){var t=e.replace(re,f).toLowerCase();return"*"===e?function(){return!0}:function(e){return e.nodeName&&e.nodeName.toLowerCase()===t}},CLASS:function(e){var t=A[e+" "];return t||(t=new RegExp("(^|"+$+")"+e+"("+$+"|$)"))&&A(e,function(e){return t.test("string"==typeof e.className&&e.className||void 0!==e.getAttribute&&e.getAttribute("class")||"")})},ATTR:function(t,n,r){return function(e){e=se.attr(e,t);return null==e?"!="===n:!n||(e+="","="===n?e===r:"!="===n?e!==r:"^="===n?r&&0===e.indexOf(r):"*="===n?r&&-1:\x20\t\r\n\f]*)[\x20\t\r\n\f]*\/?>(?:<\/\1>|)$/i;function j(e,n,r){return b(n)?E.grep(e,function(e,t){return!!n.call(e,t,e)!==r}):n.nodeType?E.grep(e,function(e){return e===n!==r}):"string"!=typeof n?E.grep(e,function(e){return-1)[^>]*|#([\w-]+))$/;(E.fn.init=function(e,t,n){if(!e)return this;if(n=n||L,"string"!=typeof e)return e.nodeType?(this[0]=e,this.length=1,this):b(e)?void 0!==n.ready?n.ready(e):e(E):E.makeArray(e,this);if(!(r="<"===e[0]&&">"===e[e.length-1]&&3<=e.length?[null,e,null]:I.exec(e))||!r[1]&&t)return(!t||t.jquery?t||n:this.constructor(t)).find(e);if(r[1]){if(t=t instanceof E?t[0]:t,E.merge(this,E.parseHTML(r[1],t&&t.nodeType?t.ownerDocument||t:T,!0)),N.test(r[1])&&E.isPlainObject(t))for(var r in t)b(this[r])?this[r](t[r]):this.attr(r,t[r]);return this}return(e=T.getElementById(r[2]))&&(this[0]=e,this.length=1),this}).prototype=E.fn;var L=E(T),D=/^(?:parents|prev(?:Until|All))/,O={children:!0,contents:!0,next:!0,prev:!0};function H(e,t){for(;(e=e[t])&&1!==e.nodeType;);return e}E.fn.extend({has:function(e){var t=E(e,this),n=t.length;return this.filter(function(){for(var e=0;e\x20\t\r\n\f]*)/i,de=/^$|^module$|\/(?:java|ecma)script/i;f=T.createDocumentFragment().appendChild(T.createElement("div")),(p=T.createElement("input")).setAttribute("type","radio"),p.setAttribute("checked","checked"),p.setAttribute("name","t"),f.appendChild(p),y.checkClone=f.cloneNode(!0).cloneNode(!0).lastChild.checked,f.innerHTML="",y.noCloneChecked=!!f.cloneNode(!0).lastChild.defaultValue,f.innerHTML="",y.option=!!f.lastChild;var pe={thead:[1,"","
"],col:[2,"","
"],tr:[2,"","
"],td:[3,"","
"],_default:[0,"",""]};function he(e,t){var n=void 0!==e.getElementsByTagName?e.getElementsByTagName(t||"*"):void 0!==e.querySelectorAll?e.querySelectorAll(t||"*"):[];return void 0===t||t&&A(e,t)?E.merge([e],n):n}function me(e,t){for(var n=0,r=e.length;n",""]);var ge=/<|&#?\w+;/;function ve(e,t,n,r,o){for(var i,a,s,l,u,c=t.createDocumentFragment(),f=[],d=0,p=e.length;d\s*$/g;function Ae(e,t){return A(e,"table")&&A(11!==t.nodeType?t:t.firstChild,"tr")&&E(e).children("tbody")[0]||e}function Ne(e){return e.type=(null!==e.getAttribute("type"))+"/"+e.type,e}function je(e){return"true/"===(e.type||"").slice(0,5)?e.type=e.type.slice(5):e.removeAttribute("type"),e}function Ie(e,t){var n,r,o,i;if(1===t.nodeType){if(V.hasData(e)&&(i=V.get(e).events))for(o in V.remove(t,"handle events"),i)for(n=0,r=i[o].length;n").attr(n.scriptAttrs||{}).prop({charset:n.scriptCharset,src:n.url}).on("load error",o=function(e){r.remove(),o=null,e&&t("error"===e.type?404:200,e.type)}),T.head.appendChild(r[0])},abort:function(){o&&o()}}});var Yt=[],Vt=/(=)\?(?=&|$)|\?\?/;E.ajaxSetup({jsonp:"callback",jsonpCallback:function(){var e=Yt.pop()||E.expando+"_"+At.guid++;return this[e]=!0,e}}),E.ajaxPrefilter("json jsonp",function(e,t,n){var r,o,i,a=!1!==e.jsonp&&(Vt.test(e.url)?"url":"string"==typeof e.data&&0===(e.contentType||"").indexOf("application/x-www-form-urlencoded")&&Vt.test(e.data)&&"data");if(a||"jsonp"===e.dataTypes[0])return r=e.jsonpCallback=b(e.jsonpCallback)?e.jsonpCallback():e.jsonpCallback,a?e[a]=e[a].replace(Vt,"$1"+r):!1!==e.jsonp&&(e.url+=(Nt.test(e.url)?"&":"?")+e.jsonp+"="+r),e.converters["script json"]=function(){return i||E.error(r+" was not called"),i[0]},e.dataTypes[0]="json",o=C[r],C[r]=function(){i=arguments},n.always(function(){void 0===o?E(C).removeProp(r):C[r]=o,e[r]&&(e.jsonpCallback=t.jsonpCallback,Yt.push(r)),i&&b(o)&&o(i[0]),i=o=void 0}),"script"}),y.createHTMLDocument=((f=T.implementation.createHTMLDocument("").body).innerHTML="
",2===f.childNodes.length),E.parseHTML=function(e,t,n){return"string"!=typeof e?[]:("boolean"==typeof t&&(n=t,t=!1),t||(y.createHTMLDocument?((r=(t=T.implementation.createHTMLDocument("")).createElement("base")).href=T.location.href,t.head.appendChild(r)):t=T),r=!n&&[],(n=N.exec(e))?[t.createElement(n[1])]:(n=ve([e],t,r),r&&r.length&&E(r).remove(),E.merge([],n.childNodes)));var r},E.fn.load=function(e,t,n){var r,o,i,a=this,s=e.indexOf(" ");return-1").append(E.parseHTML(e)).find(r):e)}).always(n&&function(e,t){a.each(function(){n.apply(this,i||[e.responseText,t,e])})}),this},E.expr.pseudos.animated=function(t){return E.grep(E.timers,function(e){return t===e.elem}).length},E.offset={setOffset:function(e,t,n){var r,o,i,a,s=E.css(e,"position"),l=E(e),u={};"static"===s&&(e.style.position="relative"),i=l.offset(),r=E.css(e,"top"),a=E.css(e,"left"),a=("absolute"===s||"fixed"===s)&&-1<(r+a).indexOf("auto")?(o=(s=l.position()).top,s.left):(o=parseFloat(r)||0,parseFloat(a)||0),null!=(t=b(t)?t.call(e,n,E.extend({},i)):t).top&&(u.top=t.top-i.top+o),null!=t.left&&(u.left=t.left-i.left+a),"using"in t?t.using.call(e,u):l.css(u)}},E.fn.extend({offset:function(t){if(arguments.length)return void 0===t?this:this.each(function(e){E.offset.setOffset(this,t,e)});var e,n=this[0];return n?n.getClientRects().length?(e=n.getBoundingClientRect(),n=n.ownerDocument.defaultView,{top:e.top+n.pageYOffset,left:e.left+n.pageXOffset}):{top:0,left:0}:void 0},position:function(){if(this[0]){var e,t,n,r=this[0],o={top:0,left:0};if("fixed"===E.css(r,"position"))t=r.getBoundingClientRect();else{for(t=this.offset(),n=r.ownerDocument,e=r.offsetParent||n.documentElement;e&&(e===n.body||e===n.documentElement)&&"static"===E.css(e,"position");)e=e.parentNode;e&&e!==r&&1===e.nodeType&&((o=E(e).offset()).top+=E.css(e,"borderTopWidth",!0),o.left+=E.css(e,"borderLeftWidth",!0))}return{top:t.top-o.top-E.css(r,"marginTop",!0),left:t.left-o.left-E.css(r,"marginLeft",!0)}}},offsetParent:function(){return this.map(function(){for(var e=this.offsetParent;e&&"static"===E.css(e,"position");)e=e.offsetParent;return e||re})}}),E.each({scrollLeft:"pageXOffset",scrollTop:"pageYOffset"},function(t,o){var i="pageYOffset"===o;E.fn[t]=function(e){return F(this,function(e,t,n){var r;return m(e)?r=e:9===e.nodeType&&(r=e.defaultView),void 0===n?r?r[o]:e[t]:void(r?r.scrollTo(i?r.pageXOffset:n,i?n:r.pageYOffset):e[t]=n)},t,e,arguments.length)}}),E.each(["top","left"],function(e,n){E.cssHooks[n]=Ye(y.pixelPosition,function(e,t){if(t)return t=Qe(e,n),Fe.test(t)?E(e).position()[n]+"px":t})}),E.each({Height:"height",Width:"width"},function(a,s){E.each({padding:"inner"+a,content:s,"":"outer"+a},function(r,i){E.fn[i]=function(e,t){var n=arguments.length&&(r||"boolean"!=typeof e),o=r||(!0===e||!0===t?"margin":"border");return F(this,function(e,t,n){var r;return m(e)?0===i.indexOf("outer")?e["inner"+a]:e.document.documentElement["client"+a]:9===e.nodeType?(r=e.documentElement,Math.max(e.body["scroll"+a],r["scroll"+a],e.body["offset"+a],r["offset"+a],r["client"+a])):void 0===n?E.css(e,t,o):E.style(e,t,n,o)},s,n?e:void 0,n)}})}),E.each(["ajaxStart","ajaxStop","ajaxComplete","ajaxError","ajaxSuccess","ajaxSend"],function(e,t){E.fn[t]=function(e){return this.on(t,e)}}),E.fn.extend({bind:function(e,t,n){return this.on(e,null,t,n)},unbind:function(e,t){return this.off(e,null,t)},delegate:function(e,t,n,r){return this.on(t,e,n,r)},undelegate:function(e,t,n){return 1===arguments.length?this.off(e,"**"):this.off(t,e||"**",n)},hover:function(e,t){return this.mouseenter(e).mouseleave(t||e)}}),E.each("blur focus focusin focusout resize scroll click dblclick mousedown mouseup mousemove mouseover mouseout mouseenter mouseleave change select submit keydown keypress keyup contextmenu".split(" "),function(e,n){E.fn[n]=function(e,t){return 0x

',t.appendChild(n.childNodes[1])),e&&i.extend(o,e),this.each(function(){var e=['iframe[src*="player.vimeo.com"]','iframe[src*="youtube.com"]','iframe[src*="youtube-nocookie.com"]','iframe[src*="kickstarter.com"][src*="video.html"]',"object","embed"];o.customSelector&&e.push(o.customSelector);var r=".fitvidsignore";o.ignore&&(r=r+", "+o.ignore);e=i(this).find(e.join(","));(e=(e=e.not("object object")).not(r)).each(function(e){var t,n=i(this);0').parent(".fluid-width-video-wrapper").css("padding-top",100*t+"%"),n.removeAttr("height").removeAttr("width"))})})}}(window.jQuery||window.Zepto),$(function(){var n,r,e,o,t=$("nav.greedy-nav .greedy-nav__toggle"),i=$("nav.greedy-nav .visible-links"),a=$("nav.greedy-nav .hidden-links"),s=$("nav.greedy-nav"),l=$("nav.greedy-nav .site-logo"),u=$("nav.greedy-nav .site-logo img"),c=$("nav.greedy-nav .site-title"),f=$("nav.greedy-nav button.search__toggle");function d(){function t(e,t){r+=t,n+=1,o.push(r)}r=n=0,e=1e3,o=[],i.children().outerWidth(t),a.children().each(function(){var e;(e=(e=$(this)).clone()).css("visibility","hidden"),i.append(e),t(0,e.outerWidth()),e.remove()})}d();var p,h,m,g,v=$(window).width(),y=v<768?0:v<1024?1:v<1280?2:3;function b(){var e=(v=$(window).width())<768?0:v<1024?1:v<1280?2:3;e!==y&&d(),y=e,h=i.children().length,p=s.innerWidth()-(0!==l.length?l.outerWidth(!0):0)-c.outerWidth(!0)-(0!==f.length?f.outerWidth(!0):0)-(h!==o.length?t.outerWidth(!0):0),m=o[h-1],po[h]&&(a.children().first().appendTo(i),h+=1,b()),t.attr("count",n-h),h===n?t.addClass("hidden"):t.removeClass("hidden")}$(window).resize(function(){b()}),t.on("click",function(){a.toggleClass("hidden"),$(this).toggleClass("close"),clearTimeout(g)}),a.on("mouseleave",function(){g=setTimeout(function(){a.addClass("hidden")},e)}).on("mouseenter",function(){clearTimeout(g)}),0===u.length||u[0].complete||0!==u[0].naturalWidth?b():u.one("load error",b)}),function(e){"function"==typeof define&&define.amd?define(["jquery"],e):"object"==typeof exports?e(require("jquery")):e(window.jQuery||window.Zepto)}(function(u){function e(){}function c(e,t){h.ev.on("mfp"+e+x,t)}function f(e,t,n,r){var o=document.createElement("div");return o.className="mfp-"+e,n&&(o.innerHTML=n),r?t&&t.appendChild(o):(o=u(o),t&&o.appendTo(t)),o}function d(e,t){h.ev.triggerHandler("mfp"+e,t),h.st.callbacks&&(e=e.charAt(0).toLowerCase()+e.slice(1),h.st.callbacks[e]&&h.st.callbacks[e].apply(h,u.isArray(t)?t:[t]))}function p(e){return e===t&&h.currTemplate.closeBtn||(h.currTemplate.closeBtn=u(h.st.closeMarkup.replace("%title%",h.st.tClose)),t=e),h.currTemplate.closeBtn}function i(){u.magnificPopup.instance||((h=new e).init(),u.magnificPopup.instance=h)}var h,r,m,o,g,t,l="Close",v="BeforeClose",y="MarkupParse",b="Open",x=".mfp",w="mfp-ready",n="mfp-removing",a="mfp-prevent-close",s=!!window.jQuery,C=u(window);e.prototype={constructor:e,init:function(){var e=navigator.appVersion;h.isLowIE=h.isIE8=document.all&&!document.addEventListener,h.isAndroid=/android/gi.test(e),h.isIOS=/iphone|ipad|ipod/gi.test(e),h.supportsTransition=function(){var e=document.createElement("p").style,t=["ms","O","Moz","Webkit"];if(void 0!==e.transition)return!0;for(;t.length;)if(t.pop()+"Transition"in e)return!0;return!1}(),h.probablyMobile=h.isAndroid||h.isIOS||/(Opera Mini)|Kindle|webOS|BlackBerry|(Opera Mobi)|(Windows Phone)|IEMobile/i.test(navigator.userAgent),m=u(document),h.popupsCache={}},open:function(e){if(!1===e.isObj){h.items=e.items.toArray(),h.index=0;for(var t,n=e.items,r=0;r(e||C.height())},_setFocus:function(){(h.st.focus?h.content.find(h.st.focus).eq(0):h.wrap).focus()},_onFocusIn:function(e){if(e.target!==h.wrap[0]&&!u.contains(h.wrap[0],e.target))return h._setFocus(),!1},_parseMarkup:function(o,e,t){var i;t.data&&(e=u.extend(t.data,e)),d(y,[o,e,t]),u.each(e,function(e,t){return void 0===t||!1===t||void(1<(i=e.split("_")).length?0<(n=o.find(x+"-"+i[0])).length&&("replaceWith"===(r=i[1])?n[0]!==t[0]&&n.replaceWith(t):"img"===r?n.is("img")?n.attr("src",t):n.replaceWith(u("").attr("src",t).attr("class",n.attr("class"))):n.attr(i[1],t)):o.find(x+"-"+e).html(t));var n,r})},_getScrollbarSize:function(){var e;return void 0===h.scrollbarSize&&((e=document.createElement("div")).style.cssText="width: 99px; height: 99px; overflow: scroll; position: absolute; top: -9999px;",document.body.appendChild(e),h.scrollbarSize=e.offsetWidth-e.clientWidth,document.body.removeChild(e)),h.scrollbarSize}},u.magnificPopup={instance:null,proto:e.prototype,modules:[],open:function(e,t){return i(),(e=e?u.extend(!0,{},e):{}).isObj=!0,e.index=t||0,this.instance.open(e)},close:function(){return u.magnificPopup.instance&&u.magnificPopup.instance.close()},registerModule:function(e,t){t.options&&(u.magnificPopup.defaults[e]=t.options),u.extend(this.proto,t.proto),this.modules.push(e)},defaults:{disableOn:0,key:null,midClick:!1,mainClass:"",preloader:!0,focus:"",closeOnContentClick:!1,closeOnBgClick:!0,closeBtnInside:!0,showCloseBtn:!0,enableEscapeKey:!0,modal:!1,alignTop:!1,removalDelay:0,prependTo:null,fixedContentPos:"auto",fixedBgPos:"auto",overflowY:"auto",closeMarkup:'',tClose:"Close (Esc)",tLoading:"Loading...",autoFocusLast:!0}},u.fn.magnificPopup=function(e){i();var t,n,r,o=u(this);return"string"==typeof e?"open"===e?(t=s?o.data("magnificPopup"):o[0].magnificPopup,n=parseInt(arguments[1],10)||0,r=t.items?t.items[n]:(r=o,(r=t.delegate?o.find(t.delegate):r).eq(n)),h._openClick({mfpEl:r},o,t)):h.isOpen&&h[e].apply(h,Array.prototype.slice.call(arguments,1)):(e=u.extend(!0,{},e),s?o.data("magnificPopup",e):o[0].magnificPopup=e,h.addGroup(o,e)),o};function T(){k&&(S.after(k.addClass(E)).detach(),k=null)}var E,S,k,A="inline";u.magnificPopup.registerModule(A,{options:{hiddenClass:"hide",markup:"",tNotFound:"Content not found"},proto:{initInline:function(){h.types.push(A),c(l+"."+A,function(){T()})},getInline:function(e,t){if(T(),e.src){var n,r=h.st.inline,o=u(e.src);return o.length?((n=o[0].parentNode)&&n.tagName&&(S||(E=r.hiddenClass,S=f(E),E="mfp-"+E),k=o.after(S).detach().removeClass(E)),h.updateStatus("ready")):(h.updateStatus("error",r.tNotFound),o=u("
")),e.inlineElement=o}return h.updateStatus("ready"),h._parseMarkup(t,{},e),t}}});function N(){I&&u(document.body).removeClass(I)}function j(){N(),h.req&&h.req.abort()}var I,L="ajax";u.magnificPopup.registerModule(L,{options:{settings:null,cursor:"mfp-ajax-cur",tError:'The content could not be loaded.'},proto:{initAjax:function(){h.types.push(L),I=h.st.ajax.cursor,c(l+"."+L,j),c("BeforeChange."+L,j)},getAjax:function(r){I&&u(document.body).addClass(I),h.updateStatus("loading");var e=u.extend({url:r.src,success:function(e,t,n){n={data:e,xhr:n};d("ParseAjax",n),h.appendContent(u(n.data),L),r.finished=!0,N(),h._setFocus(),setTimeout(function(){h.wrap.addClass(w)},16),h.updateStatus("ready"),d("AjaxContentAdded")},error:function(){N(),r.finished=r.loadError=!0,h.updateStatus("error",h.st.ajax.tError.replace("%url%",r.src))}},h.st.ajax.settings);return h.req=u.ajax(e),""}}});var D;u.magnificPopup.registerModule("image",{options:{markup:'
',cursor:"mfp-zoom-out-cur",titleSrc:"title",verticalFit:!0,tError:'The image could not be loaded.'},proto:{initImage:function(){var e=h.st.image,t=".image";h.types.push("image"),c(b+t,function(){"image"===h.currItem.type&&e.cursor&&u(document.body).addClass(e.cursor)}),c(l+t,function(){e.cursor&&u(document.body).removeClass(e.cursor),C.off("resize"+x)}),c("Resize"+t,h.resizeImage),h.isLowIE&&c("AfterChange",h.resizeImage)},resizeImage:function(){var e,t=h.currItem;t&&t.img&&h.st.image.verticalFit&&(e=0,h.isLowIE&&(e=parseInt(t.img.css("padding-top"),10)+parseInt(t.img.css("padding-bottom"),10)),t.img.css("max-height",h.wH-e))},_onImageHasSize:function(e){e.img&&(e.hasSize=!0,D&&clearInterval(D),e.isCheckingImgSize=!1,d("ImageHasSize",e),e.imgHidden&&(h.content&&h.content.removeClass("mfp-loading"),e.imgHidden=!1))},findImageSize:function(t){var n=0,r=t.img[0],o=function(e){D&&clearInterval(D),D=setInterval(function(){0
',srcAction:"iframe_src",patterns:{youtube:{index:"youtube.com",id:"v=",src:"//www.youtube.com/embed/%id%?autoplay=1"},vimeo:{index:"vimeo.com/",id:"/",src:"//player.vimeo.com/video/%id%?autoplay=1"},gmaps:{index:"//maps.google.",src:"%id%&output=embed"}}},proto:{initIframe:function(){h.types.push(P),c("BeforeChange",function(e,t,n){t!==n&&(t===P?H():n===P&&H(!0))}),c(l+"."+P,function(){H()})},getIframe:function(e,t){var n=e.src,r=h.st.iframe;u.each(r.patterns,function(){if(-1',preload:[0,2],navigateByImgClick:!0,arrows:!0,tPrev:"Previous (Left arrow key)",tNext:"Next (Right arrow key)",tCounter:"%curr% of %total%"},proto:{initGallery:function(){var i=h.st.gallery,e=".mfp-gallery";if(h.direction=!0,!i||!i.enabled)return!1;g+=" mfp-gallery",c(b+e,function(){i.navigateByImgClick&&h.wrap.on("click"+e,".mfp-img",function(){if(1=h.index,h.index=e,h.updateItemHTML()},preloadNearbyImages:function(){for(var e=h.st.gallery.preload,t=Math.min(e[0],h.items.length),n=Math.min(e[1],h.items.length),r=1;r<=(h.direction?n:t);r++)h._preloadItem(h.index+r);for(r=1;r<=(h.direction?t:n);r++)h._preloadItem(h.index-r)},_preloadItem:function(e){var t;e=q(e),h.items[e].preloaded||((t=h.items[e]).parsed||(t=h.parseEl(e)),d("LazyLoad",t),"image"===t.type&&(t.img=u('').on("load.mfploader",function(){t.hasSize=!0}).on("error.mfploader",function(){t.hasSize=!0,t.loadError=!0,d("LazyLoadError",t)}).attr("src",t.src)),t.preloaded=!0)}}});var _="retina";u.magnificPopup.registerModule(_,{options:{replaceSrc:function(e){return e.src.replace(/\.\w+$/,function(e){return"@2x"+e})},ratio:1},proto:{initRetina:function(){var n,r;1t.durationMax?t.durationMax:t.durationMin&&e=l)return b.cancelScroll(!0),e=t,n=g,0===(t=r)&&document.body.focus(),n||(t.focus(),document.activeElement!==t&&(t.setAttribute("tabindex","-1"),t.focus(),t.style.outline="none"),x.scrollTo(0,e)),E("scrollStop",m,r,o),!(y=f=null)},h=function(e){var t,n,r;u+=e-(f=f||e),d=i+s*(n=d=1<(d=0===c?0:u/c)?1:d,"easeInQuad"===(t=m).easing&&(r=n*n),"easeOutQuad"===t.easing&&(r=n*(2-n)),"easeInOutQuad"===t.easing&&(r=n<.5?2*n*n:(4-2*n)*n-1),"easeInCubic"===t.easing&&(r=n*n*n),"easeOutCubic"===t.easing&&(r=--n*n*n+1),"easeInOutCubic"===t.easing&&(r=n<.5?4*n*n*n:(n-1)*(2*n-2)*(2*n-2)+1),"easeInQuart"===t.easing&&(r=n*n*n*n),"easeOutQuart"===t.easing&&(r=1- --n*n*n*n),"easeInOutQuart"===t.easing&&(r=n<.5?8*n*n*n*n:1-8*--n*n*n*n),"easeInQuint"===t.easing&&(r=n*n*n*n*n),"easeOutQuint"===t.easing&&(r=1+--n*n*n*n*n),"easeInOutQuint"===t.easing&&(r=n<.5?16*n*n*n*n*n:1+16*--n*n*n*n*n),(r=t.customEasing?t.customEasing(n):r)||n),x.scrollTo(0,Math.floor(d)),p(d,a)||(y=x.requestAnimationFrame(h),f=e)},0===x.pageYOffset&&x.scrollTo(0,0),t=r,e=m,g||history.pushState&&e.updateURL&&history.pushState({smoothScroll:JSON.stringify(e),anchor:t.id},document.title,t===document.documentElement?"#top":"#"+t.id),"matchMedia"in x&&x.matchMedia("(prefers-reduced-motion)").matches?x.scrollTo(0,Math.floor(a)):(E("scrollStart",m,r,o),b.cancelScroll(!0),x.requestAnimationFrame(h)))};function t(e){if(!e.defaultPrevented&&!(0!==e.button||e.metaKey||e.ctrlKey||e.shiftKey)&&"closest"in e.target&&(o=e.target.closest(r))&&"a"===o.tagName.toLowerCase()&&!e.target.closest(v.ignore)&&o.hostname===x.location.hostname&&o.pathname===x.location.pathname&&/#/.test(o.href)){var t,n;try{n=a(decodeURIComponent(o.hash))}catch(e){n=a(o.hash)}if("#"===n){if(!v.topOnEmptyHash)return;t=document.documentElement}else t=document.querySelector(n);(t=t||"#top"!==n?t:document.documentElement)&&(e.preventDefault(),n=v,history.replaceState&&n.updateURL&&!history.state&&(e=(e=x.location.hash)||"",history.replaceState({smoothScroll:JSON.stringify(n),anchor:e||x.pageYOffset},document.title,e||x.location.href)),b.animateScroll(t,o))}}function i(e){var t;null!==history.state&&history.state.smoothScroll&&history.state.smoothScroll===JSON.stringify(v)&&("string"==typeof(t=history.state.anchor)&&t&&!(t=document.querySelector(a(history.state.anchor)))||b.animateScroll(t,null,{updateURL:!1}))}b.destroy=function(){v&&(document.removeEventListener("click",t,!1),x.removeEventListener("popstate",i,!1),b.cancelScroll(),y=n=o=v=null)};return function(){if(!("querySelector"in document&&"addEventListener"in x&&"requestAnimationFrame"in x&&"closest"in x.Element.prototype))throw"Smooth Scroll: This browser does not support the required JavaScript methods and browser APIs.";b.destroy(),v=w(S,e||{}),n=v.header?document.querySelector(v.header):null,document.addEventListener("click",t,!1),v.updateURL&&v.popstate&&x.addEventListener("popstate",i,!1)}(),b}}),function(e,t){"function"==typeof define&&define.amd?define([],function(){return t(e)}):"object"==typeof exports?module.exports=t(e):e.Gumshoe=t(e)}("undefined"!=typeof global?global:"undefined"!=typeof window?window:this,function(c){"use strict";function f(e,t,n){n.settings.events&&(n=new CustomEvent(e,{bubbles:!0,cancelable:!0,detail:n}),t.dispatchEvent(n))}function n(e){var t=0;if(e.offsetParent)for(;e;)t+=e.offsetTop,e=e.offsetParent;return 0<=t?t:0}function d(e){e&&e.sort(function(e,t){return n(e.content)=Math.max(document.body.scrollHeight,document.documentElement.scrollHeight,document.body.offsetHeight,document.documentElement.offsetHeight,document.body.clientHeight,document.documentElement.clientHeight)}function p(e,t){var n,r,o=e[e.length-1];if(n=o,r=t,!(!s()||!a(n.content,r,!0)))return o;for(var i=e.length-1;0<=i;i--)if(a(e[i].content,t))return e[i]}function h(e,t){var n;!e||(n=e.nav.closest("li"))&&(n.classList.remove(t.navClass),e.content.classList.remove(t.contentClass),r(n,t),f("gumshoeDeactivate",n,{link:e.nav,content:e.content,settings:t}))}var m={navClass:"active",contentClass:"active",nested:!1,nestedClass:"active",offset:0,reflow:!1,events:!0},r=function(e,t){!t.nested||(e=e.parentNode.closest("li"))&&(e.classList.remove(t.nestedClass),r(e,t))},g=function(e,t){!t.nested||(e=e.parentNode.closest("li"))&&(e.classList.add(t.nestedClass),g(e,t))};return function(e,t){var n,o,i,r,a,s={setup:function(){n=document.querySelectorAll(e),o=[],Array.prototype.forEach.call(n,function(e){var t=document.getElementById(decodeURIComponent(e.hash.substr(1)));t&&o.push({nav:e,content:t})}),d(o)}};s.detect=function(){var e,t,n,r=p(o,a);r?i&&r.content===i.content||(h(i,a),t=a,!(e=r)||(n=e.nav.closest("li"))&&(n.classList.add(t.navClass),e.content.classList.add(t.contentClass),g(n,t),f("gumshoeActivate",n,{link:e.nav,content:e.content,settings:t})),i=r):i&&(h(i,a),i=null)};function l(e){r&&c.cancelAnimationFrame(r),r=c.requestAnimationFrame(s.detect)}function u(e){r&&c.cancelAnimationFrame(r),r=c.requestAnimationFrame(function(){d(o),s.detect()})}s.destroy=function(){i&&h(i,a),c.removeEventListener("scroll",l,!1),a.reflow&&c.removeEventListener("resize",u,!1),a=r=i=n=o=null};return a=function(){var n={};return Array.prototype.forEach.call(arguments,function(e){for(var t in e){if(!e.hasOwnProperty(t))return;n[t]=e[t]}}),n}(m,t||{}),s.setup(),s.detect(),c.addEventListener("scroll",l,!1),a.reflow&&c.addEventListener("resize",u,!1),s}}),$(function(){$("#main").fitVids();function e(){(0===$(".author__urls-wrapper").find("button").length?1024<$(window).width():!$(".author__urls-wrapper").find("button").is(":visible"))?$(".sidebar").addClass("sticky"):$(".sidebar").removeClass("sticky")}e(),$(window).resize(function(){e()}),$(".author__urls-wrapper").find("button").on("click",function(){$(".author__urls").toggleClass("is--visible"),$(".author__urls-wrapper").find("button").toggleClass("open")}),$(document).keyup(function(e){27===e.keyCode&&$(".initial-content").hasClass("is--hidden")&&($(".search-content").toggleClass("is--visible"),$(".initial-content").toggleClass("is--hidden"))}),$(".search__toggle").on("click",function(){$(".search-content").toggleClass("is--visible"),$(".initial-content").toggleClass("is--hidden"),setTimeout(function(){$(".search-content").find("input").focus()},400)});new SmoothScroll('a[href*="#"]',{offset:20,speed:400,speedAsDuration:!0,durationMax:500});0<$("nav.toc").length&&new Gumshoe("nav.toc a",{navClass:"active",contentClass:"active",nested:!1,nestedClass:"active",offset:20,reflow:!0,events:!0}),$("a[href$='.jpg'],a[href$='.jpeg'],a[href$='.JPG'],a[href$='.png'],a[href$='.gif'],a[href$='.webp']").has("> img").addClass("image-popup"),$(".image-popup").magnificPopup({type:"image",tLoading:"Loading image #%curr%...",gallery:{enabled:!0,navigateByImgClick:!0,preload:[0,1]},image:{tError:'Image #%curr% could not be loaded.'},removalDelay:500,mainClass:"mfp-zoom-in",callbacks:{beforeOpen:function(){this.st.image.markup=this.st.image.markup.replace("mfp-figure","mfp-figure mfp-with-anim")}},closeOnContentClick:!0,midClick:!0}),$(".page__content").find("h1, h2, h3, h4, h5, h6").each(function(){var e,t=$(this).attr("id");t&&((e=document.createElement("a")).className="header-link",e.href="#"+t,e.innerHTML='Permalink',e.title="Permalink",$(this).append(e))})}); \ No newline at end of file diff --git a/assets/js/qrcode/qrcode.js b/assets/js/qrcode/qrcode.js new file mode 100644 index 00000000..5507c154 --- /dev/null +++ b/assets/js/qrcode/qrcode.js @@ -0,0 +1,614 @@ +/** + * @fileoverview + * - Using the 'QRCode for Javascript library' + * - Fixed dataset of 'QRCode for Javascript library' for support full-spec. + * - this library has no dependencies. + * + * @author davidshimjs + * @see http://www.d-project.com/ + * @see http://jeromeetienne.github.com/jquery-qrcode/ + */ +var QRCode; + +(function () { + //--------------------------------------------------------------------- + // QRCode for JavaScript + // + // Copyright (c) 2009 Kazuhiko Arase + // + // URL: http://www.d-project.com/ + // + // Licensed under the MIT license: + // http://www.opensource.org/licenses/mit-license.php + // + // The word "QR Code" is registered trademark of + // DENSO WAVE INCORPORATED + // http://www.denso-wave.com/qrcode/faqpatent-e.html + // + //--------------------------------------------------------------------- + function QR8bitByte(data) { + this.mode = QRMode.MODE_8BIT_BYTE; + this.data = data; + this.parsedData = []; + + // Added to support UTF-8 Characters + for (var i = 0, l = this.data.length; i < l; i++) { + var byteArray = []; + var code = this.data.charCodeAt(i); + + if (code > 0x10000) { + byteArray[0] = 0xF0 | ((code & 0x1C0000) >>> 18); + byteArray[1] = 0x80 | ((code & 0x3F000) >>> 12); + byteArray[2] = 0x80 | ((code & 0xFC0) >>> 6); + byteArray[3] = 0x80 | (code & 0x3F); + } else if (code > 0x800) { + byteArray[0] = 0xE0 | ((code & 0xF000) >>> 12); + byteArray[1] = 0x80 | ((code & 0xFC0) >>> 6); + byteArray[2] = 0x80 | (code & 0x3F); + } else if (code > 0x80) { + byteArray[0] = 0xC0 | ((code & 0x7C0) >>> 6); + byteArray[1] = 0x80 | (code & 0x3F); + } else { + byteArray[0] = code; + } + + this.parsedData.push(byteArray); + } + + this.parsedData = Array.prototype.concat.apply([], this.parsedData); + + if (this.parsedData.length != this.data.length) { + this.parsedData.unshift(191); + this.parsedData.unshift(187); + this.parsedData.unshift(239); + } + } + + QR8bitByte.prototype = { + getLength: function (buffer) { + return this.parsedData.length; + }, + write: function (buffer) { + for (var i = 0, l = this.parsedData.length; i < l; i++) { + buffer.put(this.parsedData[i], 8); + } + } + }; + + function QRCodeModel(typeNumber, errorCorrectLevel) { + this.typeNumber = typeNumber; + this.errorCorrectLevel = errorCorrectLevel; + this.modules = null; + this.moduleCount = 0; + this.dataCache = null; + this.dataList = []; + } + + QRCodeModel.prototype={addData:function(data){var newData=new QR8bitByte(data);this.dataList.push(newData);this.dataCache=null;},isDark:function(row,col){if(row<0||this.moduleCount<=row||col<0||this.moduleCount<=col){throw new Error(row+","+col);} + return this.modules[row][col];},getModuleCount:function(){return this.moduleCount;},make:function(){this.makeImpl(false,this.getBestMaskPattern());},makeImpl:function(test,maskPattern){this.moduleCount=this.typeNumber*4+17;this.modules=new Array(this.moduleCount);for(var row=0;row=7){this.setupTypeNumber(test);} + if(this.dataCache==null){this.dataCache=QRCodeModel.createData(this.typeNumber,this.errorCorrectLevel,this.dataList);} + this.mapData(this.dataCache,maskPattern);},setupPositionProbePattern:function(row,col){for(var r=-1;r<=7;r++){if(row+r<=-1||this.moduleCount<=row+r)continue;for(var c=-1;c<=7;c++){if(col+c<=-1||this.moduleCount<=col+c)continue;if((0<=r&&r<=6&&(c==0||c==6))||(0<=c&&c<=6&&(r==0||r==6))||(2<=r&&r<=4&&2<=c&&c<=4)){this.modules[row+r][col+c]=true;}else{this.modules[row+r][col+c]=false;}}}},getBestMaskPattern:function(){var minLostPoint=0;var pattern=0;for(var i=0;i<8;i++){this.makeImpl(true,i);var lostPoint=QRUtil.getLostPoint(this);if(i==0||minLostPoint>lostPoint){minLostPoint=lostPoint;pattern=i;}} + return pattern;},createMovieClip:function(target_mc,instance_name,depth){var qr_mc=target_mc.createEmptyMovieClip(instance_name,depth);var cs=1;this.make();for(var row=0;row>i)&1)==1);this.modules[Math.floor(i/3)][i%3+this.moduleCount-8-3]=mod;} + for(var i=0;i<18;i++){var mod=(!test&&((bits>>i)&1)==1);this.modules[i%3+this.moduleCount-8-3][Math.floor(i/3)]=mod;}},setupTypeInfo:function(test,maskPattern){var data=(this.errorCorrectLevel<<3)|maskPattern;var bits=QRUtil.getBCHTypeInfo(data);for(var i=0;i<15;i++){var mod=(!test&&((bits>>i)&1)==1);if(i<6){this.modules[i][8]=mod;}else if(i<8){this.modules[i+1][8]=mod;}else{this.modules[this.moduleCount-15+i][8]=mod;}} + for(var i=0;i<15;i++){var mod=(!test&&((bits>>i)&1)==1);if(i<8){this.modules[8][this.moduleCount-i-1]=mod;}else if(i<9){this.modules[8][15-i-1+1]=mod;}else{this.modules[8][15-i-1]=mod;}} + this.modules[this.moduleCount-8][8]=(!test);},mapData:function(data,maskPattern){var inc=-1;var row=this.moduleCount-1;var bitIndex=7;var byteIndex=0;for(var col=this.moduleCount-1;col>0;col-=2){if(col==6)col--;while(true){for(var c=0;c<2;c++){if(this.modules[row][col-c]==null){var dark=false;if(byteIndex>>bitIndex)&1)==1);} + var mask=QRUtil.getMask(maskPattern,row,col-c);if(mask){dark=!dark;} + this.modules[row][col-c]=dark;bitIndex--;if(bitIndex==-1){byteIndex++;bitIndex=7;}}} + row+=inc;if(row<0||this.moduleCount<=row){row-=inc;inc=-inc;break;}}}}};QRCodeModel.PAD0=0xEC;QRCodeModel.PAD1=0x11;QRCodeModel.createData=function(typeNumber,errorCorrectLevel,dataList){var rsBlocks=QRRSBlock.getRSBlocks(typeNumber,errorCorrectLevel);var buffer=new QRBitBuffer();for(var i=0;itotalDataCount*8){throw new Error("code length overflow. (" + +buffer.getLengthInBits() + +">" + +totalDataCount*8 + +")");} + if(buffer.getLengthInBits()+4<=totalDataCount*8){buffer.put(0,4);} + while(buffer.getLengthInBits()%8!=0){buffer.putBit(false);} + while(true){if(buffer.getLengthInBits()>=totalDataCount*8){break;} + buffer.put(QRCodeModel.PAD0,8);if(buffer.getLengthInBits()>=totalDataCount*8){break;} + buffer.put(QRCodeModel.PAD1,8);} + return QRCodeModel.createBytes(buffer,rsBlocks);};QRCodeModel.createBytes=function(buffer,rsBlocks){var offset=0;var maxDcCount=0;var maxEcCount=0;var dcdata=new Array(rsBlocks.length);var ecdata=new Array(rsBlocks.length);for(var r=0;r=0)?modPoly.get(modIndex):0;}} + var totalCodeCount=0;for(var i=0;i=0){d^=(QRUtil.G15<<(QRUtil.getBCHDigit(d)-QRUtil.getBCHDigit(QRUtil.G15)));} + return((data<<10)|d)^QRUtil.G15_MASK;},getBCHTypeNumber:function(data){var d=data<<12;while(QRUtil.getBCHDigit(d)-QRUtil.getBCHDigit(QRUtil.G18)>=0){d^=(QRUtil.G18<<(QRUtil.getBCHDigit(d)-QRUtil.getBCHDigit(QRUtil.G18)));} + return(data<<12)|d;},getBCHDigit:function(data){var digit=0;while(data!=0){digit++;data>>>=1;} + return digit;},getPatternPosition:function(typeNumber){return QRUtil.PATTERN_POSITION_TABLE[typeNumber-1];},getMask:function(maskPattern,i,j){switch(maskPattern){case QRMaskPattern.PATTERN000:return(i+j)%2==0;case QRMaskPattern.PATTERN001:return i%2==0;case QRMaskPattern.PATTERN010:return j%3==0;case QRMaskPattern.PATTERN011:return(i+j)%3==0;case QRMaskPattern.PATTERN100:return(Math.floor(i/2)+Math.floor(j/3))%2==0;case QRMaskPattern.PATTERN101:return(i*j)%2+(i*j)%3==0;case QRMaskPattern.PATTERN110:return((i*j)%2+(i*j)%3)%2==0;case QRMaskPattern.PATTERN111:return((i*j)%3+(i+j)%2)%2==0;default:throw new Error("bad maskPattern:"+maskPattern);}},getErrorCorrectPolynomial:function(errorCorrectLength){var a=new QRPolynomial([1],0);for(var i=0;i5){lostPoint+=(3+sameCount-5);}}} + for(var row=0;row=256){n-=255;} + return QRMath.EXP_TABLE[n];},EXP_TABLE:new Array(256),LOG_TABLE:new Array(256)};for(var i=0;i<8;i++){QRMath.EXP_TABLE[i]=1<>>(7-index%8))&1)==1;},put:function(num,length){for(var i=0;i>>(length-i-1))&1)==1);}},getLengthInBits:function(){return this.length;},putBit:function(bit){var bufIndex=Math.floor(this.length/8);if(this.buffer.length<=bufIndex){this.buffer.push(0);} + if(bit){this.buffer[bufIndex]|=(0x80>>>(this.length%8));} + this.length++;}};var QRCodeLimitLength=[[17,14,11,7],[32,26,20,14],[53,42,32,24],[78,62,46,34],[106,84,60,44],[134,106,74,58],[154,122,86,64],[192,152,108,84],[230,180,130,98],[271,213,151,119],[321,251,177,137],[367,287,203,155],[425,331,241,177],[458,362,258,194],[520,412,292,220],[586,450,322,250],[644,504,364,280],[718,560,394,310],[792,624,442,338],[858,666,482,382],[929,711,509,403],[1003,779,565,439],[1091,857,611,461],[1171,911,661,511],[1273,997,715,535],[1367,1059,751,593],[1465,1125,805,625],[1528,1190,868,658],[1628,1264,908,698],[1732,1370,982,742],[1840,1452,1030,790],[1952,1538,1112,842],[2068,1628,1168,898],[2188,1722,1228,958],[2303,1809,1283,983],[2431,1911,1351,1051],[2563,1989,1423,1093],[2699,2099,1499,1139],[2809,2213,1579,1219],[2953,2331,1663,1273]]; + + function _isSupportCanvas() { + return typeof CanvasRenderingContext2D != "undefined"; + } + + // android 2.x doesn't support Data-URI spec + function _getAndroid() { + var android = false; + var sAgent = navigator.userAgent; + + if (/android/i.test(sAgent)) { // android + android = true; + var aMat = sAgent.toString().match(/android ([0-9]\.[0-9])/i); + + if (aMat && aMat[1]) { + android = parseFloat(aMat[1]); + } + } + + return android; + } + + var svgDrawer = (function() { + + var Drawing = function (el, htOption) { + this._el = el; + this._htOption = htOption; + }; + + Drawing.prototype.draw = function (oQRCode) { + var _htOption = this._htOption; + var _el = this._el; + var nCount = oQRCode.getModuleCount(); + var nWidth = Math.floor(_htOption.width / nCount); + var nHeight = Math.floor(_htOption.height / nCount); + + this.clear(); + + function makeSVG(tag, attrs) { + var el = document.createElementNS('http://www.w3.org/2000/svg', tag); + for (var k in attrs) + if (attrs.hasOwnProperty(k)) el.setAttribute(k, attrs[k]); + return el; + } + + var svg = makeSVG("svg" , {'viewBox': '0 0 ' + String(nCount) + " " + String(nCount), 'width': '100%', 'height': '100%', 'fill': _htOption.colorLight}); + svg.setAttributeNS("http://www.w3.org/2000/xmlns/", "xmlns:xlink", "http://www.w3.org/1999/xlink"); + _el.appendChild(svg); + + svg.appendChild(makeSVG("rect", {"fill": _htOption.colorLight, "width": "100%", "height": "100%"})); + svg.appendChild(makeSVG("rect", {"fill": _htOption.colorDark, "width": "1", "height": "1", "id": "template"})); + + for (var row = 0; row < nCount; row++) { + for (var col = 0; col < nCount; col++) { + if (oQRCode.isDark(row, col)) { + var child = makeSVG("use", {"x": String(col), "y": String(row)}); + child.setAttributeNS("http://www.w3.org/1999/xlink", "href", "#template") + svg.appendChild(child); + } + } + } + }; + Drawing.prototype.clear = function () { + while (this._el.hasChildNodes()) + this._el.removeChild(this._el.lastChild); + }; + return Drawing; + })(); + + var useSVG = document.documentElement.tagName.toLowerCase() === "svg"; + + // Drawing in DOM by using Table tag + var Drawing = useSVG ? svgDrawer : !_isSupportCanvas() ? (function () { + var Drawing = function (el, htOption) { + this._el = el; + this._htOption = htOption; + }; + + /** + * Draw the QRCode + * + * @param {QRCode} oQRCode + */ + Drawing.prototype.draw = function (oQRCode) { + var _htOption = this._htOption; + var _el = this._el; + var nCount = oQRCode.getModuleCount(); + var nWidth = Math.floor(_htOption.width / nCount); + var nHeight = Math.floor(_htOption.height / nCount); + var aHTML = ['']; + + for (var row = 0; row < nCount; row++) { + aHTML.push(''); + + for (var col = 0; col < nCount; col++) { + aHTML.push(''); + } + + aHTML.push(''); + } + + aHTML.push('
'); + _el.innerHTML = aHTML.join(''); + + // Fix the margin values as real size. + var elTable = _el.childNodes[0]; + var nLeftMarginTable = (_htOption.width - elTable.offsetWidth) / 2; + var nTopMarginTable = (_htOption.height - elTable.offsetHeight) / 2; + + if (nLeftMarginTable > 0 && nTopMarginTable > 0) { + elTable.style.margin = nTopMarginTable + "px " + nLeftMarginTable + "px"; + } + }; + + /** + * Clear the QRCode + */ + Drawing.prototype.clear = function () { + this._el.innerHTML = ''; + }; + + return Drawing; + })() : (function () { // Drawing in Canvas + function _onMakeImage() { + this._elImage.src = this._elCanvas.toDataURL("image/png"); + this._elImage.style.display = "block"; + this._elCanvas.style.display = "none"; + } + + // Android 2.1 bug workaround + // http://code.google.com/p/android/issues/detail?id=5141 + if (this._android && this._android <= 2.1) { + var factor = 1 / window.devicePixelRatio; + var drawImage = CanvasRenderingContext2D.prototype.drawImage; + CanvasRenderingContext2D.prototype.drawImage = function (image, sx, sy, sw, sh, dx, dy, dw, dh) { + if (("nodeName" in image) && /img/i.test(image.nodeName)) { + for (var i = arguments.length - 1; i >= 1; i--) { + arguments[i] = arguments[i] * factor; + } + } else if (typeof dw == "undefined") { + arguments[1] *= factor; + arguments[2] *= factor; + arguments[3] *= factor; + arguments[4] *= factor; + } + + drawImage.apply(this, arguments); + }; + } + + /** + * Check whether the user's browser supports Data URI or not + * + * @private + * @param {Function} fSuccess Occurs if it supports Data URI + * @param {Function} fFail Occurs if it doesn't support Data URI + */ + function _safeSetDataURI(fSuccess, fFail) { + var self = this; + self._fFail = fFail; + self._fSuccess = fSuccess; + + // Check it just once + if (self._bSupportDataURI === null) { + var el = document.createElement("img"); + var fOnError = function() { + self._bSupportDataURI = false; + + if (self._fFail) { + self._fFail.call(self); + } + }; + var fOnSuccess = function() { + self._bSupportDataURI = true; + + if (self._fSuccess) { + self._fSuccess.call(self); + } + }; + + el.onabort = fOnError; + el.onerror = fOnError; + el.onload = fOnSuccess; + el.src = "data:image/gif;base64,iVBORw0KGgoAAAANSUhEUgAAAAUAAAAFCAYAAACNbyblAAAAHElEQVQI12P4//8/w38GIAXDIBKE0DHxgljNBAAO9TXL0Y4OHwAAAABJRU5ErkJggg=="; // the Image contains 1px data. + return; + } else if (self._bSupportDataURI === true && self._fSuccess) { + self._fSuccess.call(self); + } else if (self._bSupportDataURI === false && self._fFail) { + self._fFail.call(self); + } + }; + + /** + * Drawing QRCode by using canvas + * + * @constructor + * @param {HTMLElement} el + * @param {Object} htOption QRCode Options + */ + var Drawing = function (el, htOption) { + this._bIsPainted = false; + this._android = _getAndroid(); + + this._htOption = htOption; + this._elCanvas = document.createElement("canvas"); + this._elCanvas.width = htOption.width; + this._elCanvas.height = htOption.height; + el.appendChild(this._elCanvas); + this._el = el; + this._oContext = this._elCanvas.getContext("2d"); + this._bIsPainted = false; + this._elImage = document.createElement("img"); + this._elImage.alt = "Scan me!"; + this._elImage.style.display = "none"; + this._el.appendChild(this._elImage); + this._bSupportDataURI = null; + }; + + /** + * Draw the QRCode + * + * @param {QRCode} oQRCode + */ + Drawing.prototype.draw = function (oQRCode) { + var _elImage = this._elImage; + var _oContext = this._oContext; + var _htOption = this._htOption; + + var nCount = oQRCode.getModuleCount(); + var nWidth = _htOption.width / nCount; + var nHeight = _htOption.height / nCount; + var nRoundedWidth = Math.round(nWidth); + var nRoundedHeight = Math.round(nHeight); + + _elImage.style.display = "none"; + this.clear(); + + for (var row = 0; row < nCount; row++) { + for (var col = 0; col < nCount; col++) { + var bIsDark = oQRCode.isDark(row, col); + var nLeft = col * nWidth; + var nTop = row * nHeight; + _oContext.strokeStyle = bIsDark ? _htOption.colorDark : _htOption.colorLight; + _oContext.lineWidth = 1; + _oContext.fillStyle = bIsDark ? _htOption.colorDark : _htOption.colorLight; + _oContext.fillRect(nLeft, nTop, nWidth, nHeight); + + // 안티 앨리어싱 방지 처리 + _oContext.strokeRect( + Math.floor(nLeft) + 0.5, + Math.floor(nTop) + 0.5, + nRoundedWidth, + nRoundedHeight + ); + + _oContext.strokeRect( + Math.ceil(nLeft) - 0.5, + Math.ceil(nTop) - 0.5, + nRoundedWidth, + nRoundedHeight + ); + } + } + + this._bIsPainted = true; + }; + + /** + * Make the image from Canvas if the browser supports Data URI. + */ + Drawing.prototype.makeImage = function () { + if (this._bIsPainted) { + _safeSetDataURI.call(this, _onMakeImage); + } + }; + + /** + * Return whether the QRCode is painted or not + * + * @return {Boolean} + */ + Drawing.prototype.isPainted = function () { + return this._bIsPainted; + }; + + /** + * Clear the QRCode + */ + Drawing.prototype.clear = function () { + this._oContext.clearRect(0, 0, this._elCanvas.width, this._elCanvas.height); + this._bIsPainted = false; + }; + + /** + * @private + * @param {Number} nNumber + */ + Drawing.prototype.round = function (nNumber) { + if (!nNumber) { + return nNumber; + } + + return Math.floor(nNumber * 1000) / 1000; + }; + + return Drawing; + })(); + + /** + * Get the type by string length + * + * @private + * @param {String} sText + * @param {Number} nCorrectLevel + * @return {Number} type + */ + function _getTypeNumber(sText, nCorrectLevel) { + var nType = 1; + var length = _getUTF8Length(sText); + + for (var i = 0, len = QRCodeLimitLength.length; i <= len; i++) { + var nLimit = 0; + + switch (nCorrectLevel) { + case QRErrorCorrectLevel.L : + nLimit = QRCodeLimitLength[i][0]; + break; + case QRErrorCorrectLevel.M : + nLimit = QRCodeLimitLength[i][1]; + break; + case QRErrorCorrectLevel.Q : + nLimit = QRCodeLimitLength[i][2]; + break; + case QRErrorCorrectLevel.H : + nLimit = QRCodeLimitLength[i][3]; + break; + } + + if (length <= nLimit) { + break; + } else { + nType++; + } + } + + if (nType > QRCodeLimitLength.length) { + throw new Error("Too long data"); + } + + return nType; + } + + function _getUTF8Length(sText) { + var replacedText = encodeURI(sText).toString().replace(/\%[0-9a-fA-F]{2}/g, 'a'); + return replacedText.length + (replacedText.length != sText ? 3 : 0); + } + + /** + * @class QRCode + * @constructor + * @example + * new QRCode(document.getElementById("test"), "http://jindo.dev.naver.com/collie"); + * + * @example + * var oQRCode = new QRCode("test", { + * text : "http://naver.com", + * width : 128, + * height : 128 + * }); + * + * oQRCode.clear(); // Clear the QRCode. + * oQRCode.makeCode("http://map.naver.com"); // Re-create the QRCode. + * + * @param {HTMLElement|String} el target element or 'id' attribute of element. + * @param {Object|String} vOption + * @param {String} vOption.text QRCode link data + * @param {Number} [vOption.width=256] + * @param {Number} [vOption.height=256] + * @param {String} [vOption.colorDark="#000000"] + * @param {String} [vOption.colorLight="#ffffff"] + * @param {QRCode.CorrectLevel} [vOption.correctLevel=QRCode.CorrectLevel.H] [L|M|Q|H] + */ + QRCode = function (el, vOption) { + this._htOption = { + width : 256, + height : 256, + typeNumber : 4, + colorDark : "#000000", + colorLight : "#ffffff", + correctLevel : QRErrorCorrectLevel.H + }; + + if (typeof vOption === 'string') { + vOption = { + text : vOption + }; + } + + // Overwrites options + if (vOption) { + for (var i in vOption) { + this._htOption[i] = vOption[i]; + } + } + + if (typeof el == "string") { + el = document.getElementById(el); + } + + if (this._htOption.useSVG) { + Drawing = svgDrawer; + } + + this._android = _getAndroid(); + this._el = el; + this._oQRCode = null; + this._oDrawing = new Drawing(this._el, this._htOption); + + if (this._htOption.text) { + this.makeCode(this._htOption.text); + } + }; + + /** + * Make the QRCode + * + * @param {String} sText link data + */ + QRCode.prototype.makeCode = function (sText) { + this._oQRCode = new QRCodeModel(_getTypeNumber(sText, this._htOption.correctLevel), this._htOption.correctLevel); + this._oQRCode.addData(sText); + this._oQRCode.make(); + this._el.title = sText; + this._oDrawing.draw(this._oQRCode); + this.makeImage(); + }; + + /** + * Make the Image from Canvas element + * - It occurs automatically + * - Android below 3 doesn't support Data-URI spec. + * + * @private + */ + QRCode.prototype.makeImage = function () { + if (typeof this._oDrawing.makeImage == "function" && (!this._android || this._android >= 3)) { + this._oDrawing.makeImage(); + } + }; + + /** + * Clear the QRCode + */ + QRCode.prototype.clear = function () { + this._oDrawing.clear(); + }; + + /** + * @name QRCode.CorrectLevel + */ + QRCode.CorrectLevel = QRErrorCorrectLevel; +})(); diff --git a/assets/js/qrcode/qrcode.min.js b/assets/js/qrcode/qrcode.min.js new file mode 100644 index 00000000..993e88f3 --- /dev/null +++ b/assets/js/qrcode/qrcode.min.js @@ -0,0 +1 @@ +var QRCode;!function(){function a(a){this.mode=c.MODE_8BIT_BYTE,this.data=a,this.parsedData=[];for(var b=[],d=0,e=this.data.length;e>d;d++){var f=this.data.charCodeAt(d);f>65536?(b[0]=240|(1835008&f)>>>18,b[1]=128|(258048&f)>>>12,b[2]=128|(4032&f)>>>6,b[3]=128|63&f):f>2048?(b[0]=224|(61440&f)>>>12,b[1]=128|(4032&f)>>>6,b[2]=128|63&f):f>128?(b[0]=192|(1984&f)>>>6,b[1]=128|63&f):b[0]=f,this.parsedData=this.parsedData.concat(b)}this.parsedData.length!=this.data.length&&(this.parsedData.unshift(191),this.parsedData.unshift(187),this.parsedData.unshift(239))}function b(a,b){this.typeNumber=a,this.errorCorrectLevel=b,this.modules=null,this.moduleCount=0,this.dataCache=null,this.dataList=[]}function i(a,b){if(void 0==a.length)throw new Error(a.length+"/"+b);for(var c=0;c=f;f++){var h=0;switch(b){case d.L:h=l[f][0];break;case d.M:h=l[f][1];break;case d.Q:h=l[f][2];break;case d.H:h=l[f][3]}if(h>=e)break;c++}if(c>l.length)throw new Error("Too long data");return c}function s(a){var b=encodeURI(a).toString().replace(/\%[0-9a-fA-F]{2}/g,"a");return b.length+(b.length!=a?3:0)}a.prototype={getLength:function(){return this.parsedData.length},write:function(a){for(var b=0,c=this.parsedData.length;c>b;b++)a.put(this.parsedData[b],8)}},b.prototype={addData:function(b){var c=new a(b);this.dataList.push(c),this.dataCache=null},isDark:function(a,b){if(0>a||this.moduleCount<=a||0>b||this.moduleCount<=b)throw new Error(a+","+b);return this.modules[a][b]},getModuleCount:function(){return this.moduleCount},make:function(){this.makeImpl(!1,this.getBestMaskPattern())},makeImpl:function(a,c){this.moduleCount=4*this.typeNumber+17,this.modules=new Array(this.moduleCount);for(var d=0;d=7&&this.setupTypeNumber(a),null==this.dataCache&&(this.dataCache=b.createData(this.typeNumber,this.errorCorrectLevel,this.dataList)),this.mapData(this.dataCache,c)},setupPositionProbePattern:function(a,b){for(var c=-1;7>=c;c++)if(!(-1>=a+c||this.moduleCount<=a+c))for(var d=-1;7>=d;d++)-1>=b+d||this.moduleCount<=b+d||(this.modules[a+c][b+d]=c>=0&&6>=c&&(0==d||6==d)||d>=0&&6>=d&&(0==c||6==c)||c>=2&&4>=c&&d>=2&&4>=d?!0:!1)},getBestMaskPattern:function(){for(var a=0,b=0,c=0;8>c;c++){this.makeImpl(!0,c);var d=f.getLostPoint(this);(0==c||a>d)&&(a=d,b=c)}return b},createMovieClip:function(a,b,c){var d=a.createEmptyMovieClip(b,c),e=1;this.make();for(var f=0;f=g;g++)for(var h=-2;2>=h;h++)this.modules[d+g][e+h]=-2==g||2==g||-2==h||2==h||0==g&&0==h?!0:!1}},setupTypeNumber:function(a){for(var b=f.getBCHTypeNumber(this.typeNumber),c=0;18>c;c++){var d=!a&&1==(1&b>>c);this.modules[Math.floor(c/3)][c%3+this.moduleCount-8-3]=d}for(var c=0;18>c;c++){var d=!a&&1==(1&b>>c);this.modules[c%3+this.moduleCount-8-3][Math.floor(c/3)]=d}},setupTypeInfo:function(a,b){for(var c=this.errorCorrectLevel<<3|b,d=f.getBCHTypeInfo(c),e=0;15>e;e++){var g=!a&&1==(1&d>>e);6>e?this.modules[e][8]=g:8>e?this.modules[e+1][8]=g:this.modules[this.moduleCount-15+e][8]=g}for(var e=0;15>e;e++){var g=!a&&1==(1&d>>e);8>e?this.modules[8][this.moduleCount-e-1]=g:9>e?this.modules[8][15-e-1+1]=g:this.modules[8][15-e-1]=g}this.modules[this.moduleCount-8][8]=!a},mapData:function(a,b){for(var c=-1,d=this.moduleCount-1,e=7,g=0,h=this.moduleCount-1;h>0;h-=2)for(6==h&&h--;;){for(var i=0;2>i;i++)if(null==this.modules[d][h-i]){var j=!1;g>>e));var k=f.getMask(b,d,h-i);k&&(j=!j),this.modules[d][h-i]=j,e--,-1==e&&(g++,e=7)}if(d+=c,0>d||this.moduleCount<=d){d-=c,c=-c;break}}}},b.PAD0=236,b.PAD1=17,b.createData=function(a,c,d){for(var e=j.getRSBlocks(a,c),g=new k,h=0;h8*l)throw new Error("code length overflow. ("+g.getLengthInBits()+">"+8*l+")");for(g.getLengthInBits()+4<=8*l&&g.put(0,4);0!=g.getLengthInBits()%8;)g.putBit(!1);for(;;){if(g.getLengthInBits()>=8*l)break;if(g.put(b.PAD0,8),g.getLengthInBits()>=8*l)break;g.put(b.PAD1,8)}return b.createBytes(g,e)},b.createBytes=function(a,b){for(var c=0,d=0,e=0,g=new Array(b.length),h=new Array(b.length),j=0;j=0?p.get(q):0}}for(var r=0,m=0;mm;m++)for(var j=0;jm;m++)for(var j=0;j=0;)b^=f.G15<=0;)b^=f.G18<>>=1;return b},getPatternPosition:function(a){return f.PATTERN_POSITION_TABLE[a-1]},getMask:function(a,b,c){switch(a){case e.PATTERN000:return 0==(b+c)%2;case e.PATTERN001:return 0==b%2;case e.PATTERN010:return 0==c%3;case e.PATTERN011:return 0==(b+c)%3;case e.PATTERN100:return 0==(Math.floor(b/2)+Math.floor(c/3))%2;case e.PATTERN101:return 0==b*c%2+b*c%3;case e.PATTERN110:return 0==(b*c%2+b*c%3)%2;case e.PATTERN111:return 0==(b*c%3+(b+c)%2)%2;default:throw new Error("bad maskPattern:"+a)}},getErrorCorrectPolynomial:function(a){for(var b=new i([1],0),c=0;a>c;c++)b=b.multiply(new i([1,g.gexp(c)],0));return b},getLengthInBits:function(a,b){if(b>=1&&10>b)switch(a){case c.MODE_NUMBER:return 10;case c.MODE_ALPHA_NUM:return 9;case c.MODE_8BIT_BYTE:return 8;case c.MODE_KANJI:return 8;default:throw new Error("mode:"+a)}else if(27>b)switch(a){case c.MODE_NUMBER:return 12;case c.MODE_ALPHA_NUM:return 11;case c.MODE_8BIT_BYTE:return 16;case c.MODE_KANJI:return 10;default:throw new Error("mode:"+a)}else{if(!(41>b))throw new Error("type:"+b);switch(a){case c.MODE_NUMBER:return 14;case c.MODE_ALPHA_NUM:return 13;case c.MODE_8BIT_BYTE:return 16;case c.MODE_KANJI:return 12;default:throw new Error("mode:"+a)}}},getLostPoint:function(a){for(var b=a.getModuleCount(),c=0,d=0;b>d;d++)for(var e=0;b>e;e++){for(var f=0,g=a.isDark(d,e),h=-1;1>=h;h++)if(!(0>d+h||d+h>=b))for(var i=-1;1>=i;i++)0>e+i||e+i>=b||(0!=h||0!=i)&&g==a.isDark(d+h,e+i)&&f++;f>5&&(c+=3+f-5)}for(var d=0;b-1>d;d++)for(var e=0;b-1>e;e++){var j=0;a.isDark(d,e)&&j++,a.isDark(d+1,e)&&j++,a.isDark(d,e+1)&&j++,a.isDark(d+1,e+1)&&j++,(0==j||4==j)&&(c+=3)}for(var d=0;b>d;d++)for(var e=0;b-6>e;e++)a.isDark(d,e)&&!a.isDark(d,e+1)&&a.isDark(d,e+2)&&a.isDark(d,e+3)&&a.isDark(d,e+4)&&!a.isDark(d,e+5)&&a.isDark(d,e+6)&&(c+=40);for(var e=0;b>e;e++)for(var d=0;b-6>d;d++)a.isDark(d,e)&&!a.isDark(d+1,e)&&a.isDark(d+2,e)&&a.isDark(d+3,e)&&a.isDark(d+4,e)&&!a.isDark(d+5,e)&&a.isDark(d+6,e)&&(c+=40);for(var k=0,e=0;b>e;e++)for(var d=0;b>d;d++)a.isDark(d,e)&&k++;var l=Math.abs(100*k/b/b-50)/5;return c+=10*l}},g={glog:function(a){if(1>a)throw new Error("glog("+a+")");return g.LOG_TABLE[a]},gexp:function(a){for(;0>a;)a+=255;for(;a>=256;)a-=255;return g.EXP_TABLE[a]},EXP_TABLE:new Array(256),LOG_TABLE:new Array(256)},h=0;8>h;h++)g.EXP_TABLE[h]=1<h;h++)g.EXP_TABLE[h]=g.EXP_TABLE[h-4]^g.EXP_TABLE[h-5]^g.EXP_TABLE[h-6]^g.EXP_TABLE[h-8];for(var h=0;255>h;h++)g.LOG_TABLE[g.EXP_TABLE[h]]=h;i.prototype={get:function(a){return this.num[a]},getLength:function(){return this.num.length},multiply:function(a){for(var b=new Array(this.getLength()+a.getLength()-1),c=0;cf;f++)for(var g=c[3*f+0],h=c[3*f+1],i=c[3*f+2],k=0;g>k;k++)e.push(new j(h,i));return e},j.getRsBlockTable=function(a,b){switch(b){case d.L:return j.RS_BLOCK_TABLE[4*(a-1)+0];case d.M:return j.RS_BLOCK_TABLE[4*(a-1)+1];case d.Q:return j.RS_BLOCK_TABLE[4*(a-1)+2];case d.H:return j.RS_BLOCK_TABLE[4*(a-1)+3];default:return void 0}},k.prototype={get:function(a){var b=Math.floor(a/8);return 1==(1&this.buffer[b]>>>7-a%8)},put:function(a,b){for(var c=0;b>c;c++)this.putBit(1==(1&a>>>b-c-1))},getLengthInBits:function(){return this.length},putBit:function(a){var b=Math.floor(this.length/8);this.buffer.length<=b&&this.buffer.push(0),a&&(this.buffer[b]|=128>>>this.length%8),this.length++}};var l=[[17,14,11,7],[32,26,20,14],[53,42,32,24],[78,62,46,34],[106,84,60,44],[134,106,74,58],[154,122,86,64],[192,152,108,84],[230,180,130,98],[271,213,151,119],[321,251,177,137],[367,287,203,155],[425,331,241,177],[458,362,258,194],[520,412,292,220],[586,450,322,250],[644,504,364,280],[718,560,394,310],[792,624,442,338],[858,666,482,382],[929,711,509,403],[1003,779,565,439],[1091,857,611,461],[1171,911,661,511],[1273,997,715,535],[1367,1059,751,593],[1465,1125,805,625],[1528,1190,868,658],[1628,1264,908,698],[1732,1370,982,742],[1840,1452,1030,790],[1952,1538,1112,842],[2068,1628,1168,898],[2188,1722,1228,958],[2303,1809,1283,983],[2431,1911,1351,1051],[2563,1989,1423,1093],[2699,2099,1499,1139],[2809,2213,1579,1219],[2953,2331,1663,1273]],o=function(){var a=function(a,b){this._el=a,this._htOption=b};return a.prototype.draw=function(a){function g(a,b){var c=document.createElementNS("http://www.w3.org/2000/svg",a);for(var d in b)b.hasOwnProperty(d)&&c.setAttribute(d,b[d]);return c}var b=this._htOption,c=this._el,d=a.getModuleCount();Math.floor(b.width/d),Math.floor(b.height/d),this.clear();var h=g("svg",{viewBox:"0 0 "+String(d)+" "+String(d),width:"100%",height:"100%",fill:b.colorLight});h.setAttributeNS("http://www.w3.org/2000/xmlns/","xmlns:xlink","http://www.w3.org/1999/xlink"),c.appendChild(h),h.appendChild(g("rect",{fill:b.colorDark,width:"1",height:"1",id:"template"}));for(var i=0;d>i;i++)for(var j=0;d>j;j++)if(a.isDark(i,j)){var k=g("use",{x:String(i),y:String(j)});k.setAttributeNS("http://www.w3.org/1999/xlink","href","#template"),h.appendChild(k)}},a.prototype.clear=function(){for(;this._el.hasChildNodes();)this._el.removeChild(this._el.lastChild)},a}(),p="svg"===document.documentElement.tagName.toLowerCase(),q=p?o:m()?function(){function a(){this._elImage.src=this._elCanvas.toDataURL("image/png"),this._elImage.style.display="block",this._elCanvas.style.display="none"}function d(a,b){var c=this;if(c._fFail=b,c._fSuccess=a,null===c._bSupportDataURI){var d=document.createElement("img"),e=function(){c._bSupportDataURI=!1,c._fFail&&_fFail.call(c)},f=function(){c._bSupportDataURI=!0,c._fSuccess&&c._fSuccess.call(c)};return d.onabort=e,d.onerror=e,d.onload=f,d.src="data:image/gif;base64,iVBORw0KGgoAAAANSUhEUgAAAAUAAAAFCAYAAACNbyblAAAAHElEQVQI12P4//8/w38GIAXDIBKE0DHxgljNBAAO9TXL0Y4OHwAAAABJRU5ErkJggg==",void 0}c._bSupportDataURI===!0&&c._fSuccess?c._fSuccess.call(c):c._bSupportDataURI===!1&&c._fFail&&c._fFail.call(c)}if(this._android&&this._android<=2.1){var b=1/window.devicePixelRatio,c=CanvasRenderingContext2D.prototype.drawImage;CanvasRenderingContext2D.prototype.drawImage=function(a,d,e,f,g,h,i,j){if("nodeName"in a&&/img/i.test(a.nodeName))for(var l=arguments.length-1;l>=1;l--)arguments[l]=arguments[l]*b;else"undefined"==typeof j&&(arguments[1]*=b,arguments[2]*=b,arguments[3]*=b,arguments[4]*=b);c.apply(this,arguments)}}var e=function(a,b){this._bIsPainted=!1,this._android=n(),this._htOption=b,this._elCanvas=document.createElement("canvas"),this._elCanvas.width=b.width,this._elCanvas.height=b.height,a.appendChild(this._elCanvas),this._el=a,this._oContext=this._elCanvas.getContext("2d"),this._bIsPainted=!1,this._elImage=document.createElement("img"),this._elImage.style.display="none",this._el.appendChild(this._elImage),this._bSupportDataURI=null};return e.prototype.draw=function(a){var b=this._elImage,c=this._oContext,d=this._htOption,e=a.getModuleCount(),f=d.width/e,g=d.height/e,h=Math.round(f),i=Math.round(g);b.style.display="none",this.clear();for(var j=0;e>j;j++)for(var k=0;e>k;k++){var l=a.isDark(j,k),m=k*f,n=j*g;c.strokeStyle=l?d.colorDark:d.colorLight,c.lineWidth=1,c.fillStyle=l?d.colorDark:d.colorLight,c.fillRect(m,n,f,g),c.strokeRect(Math.floor(m)+.5,Math.floor(n)+.5,h,i),c.strokeRect(Math.ceil(m)-.5,Math.ceil(n)-.5,h,i)}this._bIsPainted=!0},e.prototype.makeImage=function(){this._bIsPainted&&d.call(this,a)},e.prototype.isPainted=function(){return this._bIsPainted},e.prototype.clear=function(){this._oContext.clearRect(0,0,this._elCanvas.width,this._elCanvas.height),this._bIsPainted=!1},e.prototype.round=function(a){return a?Math.floor(1e3*a)/1e3:a},e}():function(){var a=function(a,b){this._el=a,this._htOption=b};return a.prototype.draw=function(a){for(var b=this._htOption,c=this._el,d=a.getModuleCount(),e=Math.floor(b.width/d),f=Math.floor(b.height/d),g=[''],h=0;d>h;h++){g.push("");for(var i=0;d>i;i++)g.push('');g.push("")}g.push("
"),c.innerHTML=g.join("");var j=c.childNodes[0],k=(b.width-j.offsetWidth)/2,l=(b.height-j.offsetHeight)/2;k>0&&l>0&&(j.style.margin=l+"px "+k+"px")},a.prototype.clear=function(){this._el.innerHTML=""},a}();QRCode=function(a,b){if(this._htOption={width:256,height:256,typeNumber:4,colorDark:"#000000",colorLight:"#ffffff",correctLevel:d.H},"string"==typeof b&&(b={text:b}),b)for(var c in b)this._htOption[c]=b[c];"string"==typeof a&&(a=document.getElementById(a)),this._android=n(),this._el=a,this._oQRCode=null,this._oDrawing=new q(this._el,this._htOption),this._htOption.text&&this.makeCode(this._htOption.text)},QRCode.prototype.makeCode=function(a){this._oQRCode=new b(r(a,this._htOption.correctLevel),this._htOption.correctLevel),this._oQRCode.addData(a),this._oQRCode.make(),this._el.title=a,this._oDrawing.draw(this._oQRCode),this.makeImage()},QRCode.prototype.makeImage=function(){"function"==typeof this._oDrawing.makeImage&&(!this._android||this._android>=3)&&this._oDrawing.makeImage()},QRCode.prototype.clear=function(){this._oDrawing.clear()},QRCode.CorrectLevel=d}(); \ No newline at end of file diff --git a/banner.js b/banner.js new file mode 100644 index 00000000..db3974c0 --- /dev/null +++ b/banner.js @@ -0,0 +1,19 @@ +const fs = require("fs"); +const pkg = require("./package.json"); +const filename = "assets/js/main.min.js"; +const script = fs.readFileSync(filename); +const padStart = str => ("0" + str).slice(-2); +const dateObj = new Date(); +const date = `${dateObj.getFullYear()}-${padStart( + dateObj.getMonth() + 1 +)}-${padStart(dateObj.getDate())}`; +const banner = `/*! + * Minimal Mistakes Jekyll Theme ${pkg.version} by ${pkg.author} + * Copyright 2013-${dateObj.getFullYear()} Michael Rose - mademistakes.com | @mmistakes + * Licensed under ${pkg.license} + */ +`; + +if (script.slice(0, 3) != "/**") { + fs.writeFileSync(filename, banner + script); +} diff --git a/favicon.ico b/favicon.ico new file mode 100644 index 00000000..7b01fe10 Binary files /dev/null and b/favicon.ico differ diff --git a/feed.xml b/feed.xml new file mode 100644 index 00000000..b224a18c --- /dev/null +++ b/feed.xml @@ -0,0 +1,851 @@ +Jekyll2023-10-21T11:20:52+00:00https://www.copdips.com/feed.xmlA code to rememberPython developerXiang ZHUxiang.zhu@outlook.comHashing files2023-10-21T00:00:00+00:002023-10-21T00:00:00+00:00https://www.copdips.com/2023/10/hashing-filesDuring CI/CD processes, and particularly during CI, we frequently hash dependency files to create cache keys (referred to as key input in Github Action actions/cache and key parameter in Azure pipelines Cache@2 task). However, the default hash functions come with certain limitations like this comment. To address this, we can use the following pure Bash shell command to manually generate the hash value.

+ +

For Github Actions, we can use following snippet:

+ +
# github actions example
+inputs:
+  req-files:
+    description: >
+      requirements files separated by comma or space, glob pattern is allowed.
+      e.g. "requirements/*.txt, requirements.txt"
+    required: true
+runs:
+  using: "composite"
+  steps:
+    - name: Compute hash key
+      shell: bash
+      env:
+        REQ_FILES: ${{ inputs.req-files }}
+      run: |
+        files=$(echo "$REQ_FILES" | tr "," " ")
+        files_sep_by_space=""
+        for file in $files; do
+            files_sep_by_space="$files_sep_by_space $(ls $file | tr '\n' ' ')"
+        done
+        files_sep_by_space=$(echo $files_sep_by_space | tr ' ' '\n' | sort | uniq | tr '\n' ' ')
+        files_hash=$(cat $files_sep_by_space | md5sum | awk '{print $1}')
+        echo "files_hash: $files_hash"
+
+ +

For Azure pipelines, the process is nearly identical to the above Github Action example. The only difference is that we first need to convert the reqFiles parameter from an object to a string. But if you set the parameter type to string (as in the Github Action), the process becomes identical.

+ +
# azure pipelines example
+parameters:
+  - name: reqFiles
+    displayName: >
+      requirements files, glob pattern is allowed.
+      e.g.:
+      - requirements/*.txt
+      - requirements.txt
+    type: object
+  steps:
+    - script: |
+        files=$(echo "$REQ_FILES_JSON" | jq  '. | join(" ")' -r)
+        files_sep_by_space=""
+        for file in $files; do
+            files_sep_by_space="$files_sep_by_space $(ls $file | tr '\n' ' ')"
+        done
+        files_sep_by_space=$(echo $files_sep_by_space | tr ' ' '\n' | sort | uniq | tr '\n' ' ')
+        files_hash=$(cat $files_sep_by_space | md5sum | awk '{print $1}')
+        echo "files_hash: $files_hash"
+      displayName: Compute hash key
+      env:
+        REQ_FILES_JSON: "${{ convertToJson(parameters.reqFiles) }}"
+
+ +

When creating the cache key, we also need to include os version, the one provided by Github action and Azure pipelines environment vars are not precise enough, they do not give patch version number. We can generate the full os version by the following command cat /etc/os-release | grep -i "version=" | cut -c9- | tr -d '"' | tr ' ' '_'

]]>
Xiang ZHUxiang.zhu@outlook.com
Github Actions - copdips/get-azure-keyvault-secrets-action2023-10-16T00:00:00+00:002023-10-16T00:00:00+00:00https://www.copdips.com/2023/10/github-actions-get-azure-keyvault-secrets-actionRecently, I began a new project that requires migrating some process from Azure Pipelines to Github Actions. One of the tasks involves retrieving secrets from Azure Key Vault.

+ +

In Azure Pipelines, we have an official task called AzureKeyVault@2 designed for this purpose. However, its official counterpart in Github Actions, Azure/get-keyvault-secrets@v1, has been deprecated. The recommended alternative is Azure CLI. While Azure CLI is a suitable option, it operates in a bash shell without multithreading. If numerous secrets need to be fetched, this can be time-consuming.

+ +

Over the past weekend, I decided to write my own action using Python, leveraging asyncio. I avoided any additional third party Python modules like requests, aiohttp, or httpx, so no pip install needed.

+ +

As anticipated, the pure Python solution is notably faster than using the Azure CLI, and even surpasses the speed of the Azure Pipelines task AzureKeyVault@2. In my tests, it was able to retrieve the all the secrets from an Azure Key Vault within seconds.

+ +

The source code is at: copdips/get-azure-keyvault-secrets-action

+ +

And hereunder is the usage:

+ +
# in the calling workflow, user should first login to Azure
+- uses: Azure/login@v1
+  with:
+    # creds: ${{secrets.AZURE_CREDENTIALS}} is not recommended due to json secrets security concerns.
+    creds: '{"clientId":"${{ secrets.CLIENT_ID }}","clientSecret":"${{ secrets.CLIENT_SECRET }}","subscriptionId":"${{ secrets.SUBSCRIPTION_ID }}","tenantId":"${{ secrets.TENANT_ID }}"}'
+
+- name: Get Azure KeyVault secrets
+  id: get-azure-keyvault-secrets
+  uses: copdips/get-azure-keyvault-secrets-action@v1
+  with:
+    keyvault: {your_azure_keyvault_name}
+
+# Suppose there's a secret named client-secret in the Azure Key Vault,
+# so an env var named CLIENT_SECRET should be created by the action.
+# You won't see the secret value in the workflow log as it's masked by Github automatically.
+- name: Use secrets from env var
+  run: |
+    echo $CLIENT_SECRET
+    echo ${{ env.CLIENT_SECRET }}
+
+- name: Use secrets from output
+  run: |
+    echo $JSON_SECRETS | jq .CLIENT_SECRET -r
+  env:
+    JSON_SECRETS: ${{ steps.get-azure-keyvault-secrets.outputs.json }}
+
]]>
Xiang ZHUxiang.zhu@outlook.com
Databricks Python pip authentication2023-09-22T00:00:00+00:002023-09-22T00:00:00+00:00https://www.copdips.com/2023/09/databricks-python-pip-authenticationBefore the Databricks Unit Catalog’s release, we used init scripts to generate the pip.conf file during cluster startup, allowing each cluster its unique auth token. But with init scripts no longer available in the Unit Catalog’s shared mode, an alternative approach is required.

+ +

A workaround involves placing a prepared pip.conf in the Databricks workspace and setting the PIP_CONFIG_FILE environment variable to point to this file. This method, however, presents security concerns: the pip.conf file, containing the auth token, becomes accessible to the entire workspace, potentially exposing it to all users and clusters. See here to check this workaround.

+ +

In contrast, the Unit Catalog’s single mode retains init script availability. Here, the pip auth token is stored securely in a vault and accessed via the Databricks secret scope. Upon cluster startup, the init script fetches the token from the vault, generating the pip.conf file. This approach is considerably more secure than the shared mode alternative.

]]>
Xiang ZHUxiang.zhu@outlook.com
Github Actions - Python2023-09-22T00:00:00+00:002023-09-22T00:00:00+00:00https://www.copdips.com/2023/09/github-actions-pythonSetting up pip authentication + +

PIP_INDEX_URL vs PIP_EXTRA_INDEX_URL

+ +

In most cases, when setting up private Python package artifacts (like Azure DevOps Artifacts, JFrog Artifactory, etc.) are configured to mirror the public PyPi. In such scenarios, we only need to use PIP_INDEX_URL to point to these private artifacts.

+ +

However, some people might use PIP_INDEX_URL point to the public PyPi, and PIP_EXTRA_INDEX_URL to point to the private artifacts. This approach is not recommended, as it results in the public PyPi searched first, followed by the private artifacts. This poses a security risk where a malicious actor can publish a package with the same name as your private one on the public PyPi.

+ +

Auth for Azure DevOps Artifacts

+ +

Auth by Azure SPN crendentials

+ +

In March 2023, there was a great news that Azure Service Principal was been introduced in Azure DevOps, eliminating the use of service account.

+ +
    +
  1. Create a service principal in Azure Active Directory.
  2. +
  3. Add the service principal to the Azure DevOps Artifacts feed with Contributor role. Package publishing (twine upload) needs Contributor role, but package installation (pip install) only needs Reader role.
  4. +
  5. +

    Add SPN credentials to Github Secrets with name AZURE_CREDENTIALS, and value in JSON format:

    + +
     {
    +   "clientId": "xxxxx",
    +   "clientSecret": "xxxxx",
    +   "subscriptionId": "xxxxx",
    +   "tenantId": "xxxxx"
    + }
    +
    +
  6. +
  7. +

    Create env var PIP_INDEX_URL in the workflow, and set it to the Azure DevOps Artifacts feed URL.

    + +
     - uses: actions/checkout@v4
    +
    +
    + - name: Setup Python
    +   uses: actions/setup-python@v4
    +   with:
    +     python-version: ${{ matrix.python-version }}
    +     # see below post of a faster Python cache:
    +     # https://copdips.com/2023/09/github-actions-cache.html#pip-cache-dir-vs-pip-install-dir
    +     cache: pip
    +     cache-dependency-path: requirements/*.txt
    +
    + - name: Azure Login
    +   uses: azure/login@v1
    +   with:
    +     creds: ${{ secrets.AZURE_CREDENTIALS }}
    +
    + - name: Setup Python package feed
    +   run: |
    +     access_token=$(az account get-access-token | jq .accessToken -r)
    +
    +     # setup pip auth
    +     echo "PIP_INDEX_URL=https://:$access_token@pkgs.dev.azure.com/{azdo_org_name}/_packaging/{azdo_artifacts_feed_name}/pypi/simple/" >> $GITHUB_ENV
    +
    +     # setup twine auth
    +     cat > ~/.pypirc <<EOF
    +     [distutils]
    +     index-servers={azdo_artifacts_feed_name}
    +     [{azdo_artifacts_feed_name}]
    +     repository=https://pkgs.dev.azure.com/{azdo_org_name}/_packaging/{azdo_artifacts_feed_name}/pypi/upload
    +     username=build
    +     password=$access_token
    +     EOF
    +
    +     # setup access token for action pypa/gh-action-pypi-publish
    +     echo "ACCESS_TOKEN=$access_token" >> $GITHUB_ENV
    +
    + - name: Install dependencies
    +   run: |
    +     pip install -U pip
    +     pip install -r requirements/requirements.txt
    +
    + - name: Build Python package
    +   run: |
    +     # need to install wheel in advance
    +     python setup.py sdist bdist_wheel
    +     # modern Python uses `python -m build` instead
    +
    + # alternative Python package build and check
    + - name: Build and Check Package
    +   uses: hynek/build-and-inspect-python-package@v1.5
    +
    + - name: Publish Python package by twine
    +   run: |
    +     # need to install twine in advance
    +     twine upload -r {azdo_artifacts_feed_name} dist/*.whl
    +
    + # alternative Python package publish
    + - name: Publish Python package by action
    +   # does not need to install twine in advance
    +   uses: pypa/gh-action-pypi-publish@release/v1
    +   with:
    +     repository-url: "https://pkgs.dev.azure.com/{azdo_org_name}/_packaging/{azdo_artifacts_feed_name}/pypi/upload"
    +     password: ${{ env.ACCESS_TOKEN }}
    +
    + - name: Cleanup secret envs
    +   run: |
    +     echo "PIP_INDEX_URL=" >> $GITHUB_ENV
    +     echo "ACCESS_TOKEN=" >> $GITHUB_ENV
    +
    +
  8. +
+ +

Auth by Azure OpenID Connect (OIDC)

+ +

We can also setup OpenID Connect (OIDC) between Github Action and Azure. It’s practical because we do not need to worry about Azure SPN secret rotation. However, a drawback is that when setting up OIDC, we must add a filter (subject field in the credential.json). This could be a branch name, tag name, pull request, or environment name, we can not use wildcards in the filter, so we have to set up OIDC for each branch, tag, pull request or environment as needed. This is not very practical. For AWS, there’s no such limitation.

+ +

To use Azure OIDC with Github Action, we need to add the following to the workflow:

+ +
...
+permissions:
+  id-token: write
+  contents: read
+
+jobs:
+  a_job:
+    ...
+    steps:
+      - name: Azure login by OIDC
+        uses: azure/login@v1
+        with:
+          # Official doc puts these 3 fields in secrets, but it's not necessary,
+          # as `subject` field in the credential.json prevents other repos from
+          # using the same credential. And these are not sensitive info neither.
+          tenant-id: ${{ vars.AZURE_TENANT_ID }}
+          subscription-id: ${{ vars.AZURE_SUBSCRIPTION_ID }}
+          client-id: ${{ vars.AZURE_CLIENT_ID }}
+
]]>
Xiang ZHUxiang.zhu@outlook.com
Github Actions - Workflows2023-09-21T00:00:00+00:002023-09-21T00:00:00+00:00https://www.copdips.com/2023/09/github-actions-workflowsReusable workflows + +

Re-run a reusable workflow

+ +

If reusable workflow is not referenced by SHA, for example a branch name, when re-run a workflow, it will not use the latest version of the workflow in that branch, but the same commit SHA of the first attempt. Which means, if you use the git amend push to overwrite the old commit history, the workflow re-run will fail as it cannot find the specific SHA version of the workflow.

+ +

In contrary, if an action is referenced by branch name, it will always use the latest version of the action in that branch upon re-run.

+ +

Cancelling a workflow

+ +

To cancel the current workflow run inside the run itself:

+ +
- name: cancelling
+  uses: andymckay/cancel-action@0.3
+
+ +

We can use if: cancelled() or if: always() to bypass the workflow cancel signal.

]]>
Xiang ZHUxiang.zhu@outlook.com
Github Actions - Error handling2023-09-20T00:00:00+00:002023-09-20T00:00:00+00:00https://www.copdips.com/2023/09/github-actions-error-handlingcontinue-on-error vs fail-fast + +

The doc explains that continue-on-error applies to a single job or single step which defines whether a job or step can continue on its error, while fail-fast applies to the entire matrix which means if the failure of a job in the matrix can stop other running jobs in the matrix. For example:

+ +
    +
  • if fail-fast is set to true, the entire matrix will stop running when one job fails. But if the failed job has continue-on-error set to true, the matrix will continue running, as the failed job is not considered as a failure.
  • +
  • if fail-fast is set to false, all the jobs triggered by the matrix are considered independent, so the failed job will not affect other jobs.
  • +
+ +

When setting continue-on-error at job level only, and no set at step level, if one of the steps fails, the remaining steps wont be executed, the job will get a red failure badge in the Github Actions UI, but the job status will be considered as success.

+ +

Status check functions

+ +

We can also use status check functions if ${{ success() }}, if: ${{ always() }}, if: ${{ cancelled() }}, if: ${{ failure() }} to check the previous step (or job) status.

+ +

In if expression, we can skip the double curly brackets ${{}}, for example: if: success() instead of if: ${{ success() }}

]]>
Xiang ZHUxiang.zhu@outlook.com
Github Actions - Cache2023-09-19T00:00:00+00:002023-09-19T00:00:00+00:00https://www.copdips.com/2023/09/github-actions-cacheLife span + +

Github Actions cache has a life span of 7 days, and the total size of all caches in a repository is limited to 10 GB.

+ +

Standard Cache

+ +

Cache key should be as specific as possible, so that the post cache restore installation can be reduced or skipped.

+ +

For Python pip install, we could use the following cache key:

+ +
- name: Get pip cache dir
+  run: |
+    os_version=$(cat /etc/os-release | grep -i "version=" | cut -c9- | tr -d '"' | tr ' ' '_')
+    github_workflow_full_path="${GITHUB_WORKFLOW_REF%@*}"
+    python_full_version=$(python -c 'import platform; print(platform.python_version())')
+    node_major_version=$(node --version | cut -d'.' -f1 | tr -d 'v')
+    echo "os_version=$os_version" >> $GITHUB_ENV
+    echo "github_workflow_full_path=$github_workflow_full_path" >> $GITHUB_ENV
+    echo "python_full_version=$python_full_version" >> $GITHUB_ENV
+    echo "PIP_CACHE_DIR=$(pip cache dir)" >> $GITHUB_ENV
+
+- name: cache pip
+  uses: actions/cache@v3
+  with:
+    # path: ${{ env.PIP_CACHE_DIR }}
+    path: ${{ env.pythonLocation }}
+    key: ${{ env.github_workflow_full_path}}-${{ env.os_version }}-${{ env.python_full_version }}-${{ env.node_major_version}}-${{ hashFiles('requirements/*.txt') }}
+
+ +

The cache action repository provides also some Python caching examples.

+ +

pip cache dir vs pip install dir

+ +

The path parameter in actions/cache@v3 could be:

+ +
    +
  • ${{ env.PIP_CACHE_DIR }} if you only want to cache the pip cache dir, so you can skip the Python package download step, but you still need to install the packages.
  • +
  • ${{ env.pythonLocation }} if you want to cache the whole python installation dir, this is useful when you want to cache the site-packages dir, so that the pip install step can be reduced or skipped, this is also why we must use the ${{ env.os_version }}, ${{ env.python_full_version }} in the cache key. In most of cases, this is the best choice.
  • +
+ +

hashFiles

+ +

In Azure Pipelines, there’s similar thing as hashFiles() function, it should be in the form of glob pattern, like requirements/*.txt, but without double quotes, otherwise treated as a static string.

+ +
# Azure Pipelines
+- task: Cache@2
+  inputs:
+    key: 'python | "$(pythonFullVersion)" | "$(osVersion)" | "$(System.TeamProject)" | "$(Build.DefinitionName)" | "$(Agent.JobName)" | requirements/*.txt'
+    path: ...
+  displayName: ...
+
+ +

Otherwise, we can also achieve the same result by some pure bash commands:

+ +
# suppose parameters.requirementsFilePathList is a list of file paths
+- script: |
+    echo REQUIREMENTS_FILE_PATH_LIST_STRING: $REQUIREMENTS_FILE_PATH_LIST_STRING
+    all_files_in_one_line=$(echo $REQUIREMENTS_FILE_PATH_LIST_STRING | jq  '. | join(" ")' -r)
+    echo all_files_in_one_line: $all_files_in_one_line
+    all_files_md5sum=$(cat $all_files_in_one_line | md5sum | awk '{print $1}')
+    echo all_files_md5sum: $all_files_md5sum
+    echo "##vso[task.setvariable variable=pythonRequirementsFilesHash;]$all_files_md5sum"
+  displayName: Set pythonRequirementsFilesHash
+  env:
+    REQUIREMENTS_FILE_PATH_LIST_STRING: "${{ convertToJson(parameters.requirementsFilePathList) }}"
+
+ +

Cache with actions/setup-python

+ +

The action actions/setup-python has built-in functionality for caching and restoring dependencies with cache key. This cache method can only cache the pip cache dir to reduce the Python packages download time like path: ${{ env.PIP_CACHE_DIR }} in above example, but still need to install the packages, which is much slower than caching the package installation location. As the time of writing, the cache source dir (which is the pip cache dir) is generated by the action itself, and cannot be customized.

+ +

The cache key is something like: setup-python-Linux-22.04-Ubuntu-python-3.10.13-pip-308f89683977de8773e433ddf87c874b6bd931347b779ef0ab18f37ecc4fa914 (copied from workflow run log), which is generated as per this answer.

+ +
steps:
+- uses: actions/checkout@v4
+- uses: actions/setup-python@v4
+  with:
+    python-version: '3.10'
+    cache: 'pip' # caching pip dependencies, could be pip, pipenv, or poetry
+    cache-dependency-path: requirements/*.txt
+- run: pip install -r requirements.txt
+
+ +

If cache-dependency-path is not specified, and if the cache type is pip, it will try to find all the requirements.txt files in the repo and hash them to generate the cache key. For cache type with pipenv or poetry, I didn’t test them.

]]>
Xiang ZHUxiang.zhu@outlook.com
Github Actions - Custom Actions2023-09-19T00:00:00+00:002023-09-19T00:00:00+00:00https://www.copdips.com/2023/09/github-actions-custom-actionsActions checkout location in workflow + +

Actions are automatically checked out by Github Action from the beginning of a workflow run, the checkout path could be found by: env var $GITHUB_ACTION_PATH, github context ${{ github.action_path }}. This is very useful when you need to reference some files or scripts saved in the same repository as the actions.

+ +

+
+```bash
+
+Actions in workflow:
+
+```yaml
+- name: Check out repository code
+  uses: actions/checkout@v4
+
+- name: Use action in the version of the main branch
+  uses:{org_name}/{repo_name}/actions/{action_path}@main
+
+- name: Use action in the version of v1
+  uses:{org_name}/{repo_name}/actions/{action_path}@v1
+
+ +

Actions checkout location:

+ +
../../_actions/actions/checkout
+├── v4
+│   ├── CHANGELOG.md
+│   ├── CODEOWNERS
+│   ├── ...
+
+../../_actions/{org_name}/{repo_name}
+├── main
+│   ├── README.md
+│   └── actions
+│   └── ...
+├── main.completed
+├── v1
+│   ├── README.md
+│   └── actions
+│   └── ...
+└── v1.completed
+
+ +

Multiple actions in single repository

+ +

You can save multiple actions inside a single repository, and use them in the form of uses: org/repo/folder_path@git_ref in a workflow.

+ +

azure/CLI

+ +

Benefits of using azure/CLI over run task:

+ +
    +
  1. azure/CLI runs az commands in an isolated docker container.
  2. +
  3. azure/CLI can choose the CLI version.
  4. +
  5. For some self-hosted runner, may not have “az cli” pre-installed, the Azure/CLI action eliminates the need for complex installation steps.
  6. +
+ +

Can also set shared variables inside a job to be used outside the azure/CLI step, even it’s run inside a docker container.

+ +

Drawbacks:

+ +
    +
  1. slowness: azure/CLI is much slower (around 20s to bootstrap on a ubuntu-latest-4core runner) than standard run step, because it needs to pull the docker image and run the container.
  2. +
]]>
Xiang ZHUxiang.zhu@outlook.com
Github Actions - Environment2023-09-19T00:00:00+00:002023-09-19T00:00:00+00:00https://www.copdips.com/2023/09/github-actions-environmentDynamic environment + +

environment is set at job level (not at step level), so we should use the $GITHUB_OUTPUT context to set the environment name dynamically, see here to learn how to pass data between jobs.

+ +

Standard usage for static value is like this:

+ +
jobs:
+  deployment:
+    runs-on: ubuntu-latest
+    environment: production
+    steps:
+      - name: deploy
+        # ...deployment-specific steps
+
+ +

For advanced usage with dynamic value should be like this:

+ +
# call reusable workflow set_target_env.yml to set the target_env
+jobs:
+  set_target_env:
+    uses: ./.github/workflows/set_target_env.yml
+  deployment:
+    runs-on: ubuntu-latest
+    needs: [set_target_env]
+    environment:
+      name: ${{ needs.set_target_env.outputs.workflow_output_target_env }}
+    env:
+      TARGET_ENV: ${{ needs.set_target_env.outputs.workflow_output_target_env }}
+    steps:
+      - run: |
+          echo "TARGET_ENV: $TARGET_ENV"
+      # ...other deployment-specific steps based on $TARGET_ENV
+
]]>
Xiang ZHUxiang.zhu@outlook.com
Github Actions - Variables2023-09-19T00:00:00+00:002023-09-19T00:00:00+00:00https://www.copdips.com/2023/09/github-actions-variablesVariables upon Git events + +

Suppose we create a new branch named new_branch, and create a pull request (with id 123) from the new branch new_branch to the main branch. +During the pipeline, we can see following predefined variables in different GIT events.

+ +

Check here for variables upon git events in Azure Pipelines.

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
variable name \ git actionon pushon pull requeston merge (after merge, a push event will be triggered)on manual trigger
$GITHUB_REFrefs/heads/new_branchrefs/pull/123/mergerefs/heads/mainrefs/heads/new_branch
$GITHUB_REF_NAMEnew_branch132/mergemainnew_branch
$GITHUB_EVENT_NAMEpushpull_requestpull_request_targetworkflow_dispatch
$GITHUB_REF_TYPEbranchbranchbranchbranch
$GITHUB_SHAlast commit in branchworkflow commit (not merge commit)merge commitlast commit in branch
${{ github.event.head_commit.message }}last commit messageVAR_NOT_EXISTSVAR_NOT_EXISTSVAR_NOT_EXISTS
${{ github.event.pull_request.merge_commit_sha }}VAR_NOT_EXISTSmerge commitmerge commitVAR_NOT_EXISTS
${{ github.event.pull_request.head.sha }}VAR_NOT_EXISTSlast commit in PR (not merge commit)last commit in PR (not merge commit)VAR_NOT_EXISTS
${{ github.event.pull_request.number }}VAR_NOT_EXISTS123123VAR_NOT_EXISTS
${{ github.event.number }}VAR_NOT_EXISTS123123VAR_NOT_EXISTS
${{ github.event.pull_request.merged }}VAR_NOT_EXISTSfalsetrueVAR_NOT_EXISTS
${{ github.event.pull_request.merged_by.login }}VAR_NOT_EXISTSnulluser loginVAR_NOT_EXISTS
${{ github.event.pull_request.merged_by.type }}VAR_NOT_EXISTSnullUser, etcVAR_NOT_EXISTS
${{ github.event.pull_request.title }}VAR_NOT_EXISTSnull or pr titlenull or pr titleVAR_NOT_EXISTS
${{ github.event.pull_request.body}}VAR_NOT_EXISTSnull or pr bodynull or pr bodVAR_NOT_EXISTS
${{ github.event.after }}last SHA in commitlast commit in PR (not merge commit)VAR_NOT_EXISTSVAR_NOT_EXISTS
${{ github.event.action}}VAR_NOT_EXISTSopened, synchronize, edited, reopned, [etc](https://docs.github.com/en/actions/using-workflows/events-that-trigger-workflows#pull_request)..closedVAR_NOT_EXISTS
${{ github.head_ref }}VAR_NOT_EXISTSnew_branchnew_branchVAR_NOT_EXISTS
${{ github.base_ref }}nullmainmainVAR_NOT_EXISTS
+ +

Setting environment variables by Python

+ +

Same approach applies to other languages:

+ +
- name: Create new env vars by Python
+  shell: python
+  run: |
+    import os
+    with open(os.environ["GITHUB_ENV"], "a") as f:
+      f.write("ENV_VAR_1=value_1\nENV_VAR_2=value_2\n")
+
+ +

JSON Variables

+ +

JSON variables with GITHUB_OUTPUT

+ +

When setting a JSON variable in string as $GITHUB_OUTPUT, and using it in a subsequent step, we should use the Github actions expressions syntax. However, the method of using this syntax can vary based on its context. Consider the following example on a Github Ubuntu runner with a bash shell:

+ +
- name: Write json outputs
+  id: write-json-outputs
+  run: |
+    json_raw='{"name":"foo"}'
+    json_quotes_escaped="{\"name\":\"foo\"}"
+    json_quotes_backslash_escaped="{\\\"name\\\":\\\"foo\\\"}"
+    json_ascii="{\x22name\x22: \x22foo\x22}"
+
+    echo "json_raw=$json_raw" >> $GITHUB_OUTPUT
+    echo "json_quotes_escaped=$json_quotes_escaped" >> $GITHUB_OUTPUT
+    echo "json_quotes_backslash_escaped=$json_quotes_backslash_escaped" >> $GITHUB_OUTPUT
+    echo -e "json_ascii=$json_ascii" >> $GITHUB_OUTPUT
+
+    echo "GITHUB_OUTPUT content:"
+    cat $GITHUB_OUTPUT
+
+- name: Show json outputs
+  run: |
+    json_raw_wo_quotes=${{ steps.write-json-outputs.outputs.json_raw }}
+    json_raw="${{ steps.write-json-outputs.outputs.json_raw }}"
+    json_quotes_escaped="${{ steps.write-json-outputs.outputs.json_quotes_escaped }}"
+    json_quotes_backslash_escaped="${{ steps.write-json-outputs.outputs.json_quotes_backslash_escaped }}"
+    json_ascii="${{ steps.write-json-outputs.outputs.json_ascii }}"
+
+    # echo vars from templating inside bash
+    echo "json_raw_wo_quotes: $json_raw_wo_quotes"
+    echo "json_raw: $json_raw"
+    echo "json_quotes_escaped: $json_quotes_escaped"
+    echo "json_quotes_backslash_escaped: $json_quotes_backslash_escaped"
+    echo "json_ascii: $json_ascii"
+
+    # echo vars from env variables
+    echo "JSON_RAW: $JSON_RAW"
+    echo "JSON_QUOTES_ESCAPED: $JSON_QUOTES_ESCAPED"
+    echo "JSON_QUOTES_BACKSLASH_ESCAPED: $JSON_QUOTES_BACKSLASH_ESCAPED"
+    echo "JSON_QUOTES_BACKSLASH_ESCAPED_TO_JSON: $JSON_QUOTES_BACKSLASH_ESCAPED_TO_JSON"
+    echo "JSON_QUOTES_BACKSLASH_ESCAPED_WITH_QUOTES: $JSON_QUOTES_BACKSLASH_ESCAPED_WITH_QUOTES"
+    echo "JSON_ASCII: $JSON_ASCII"
+  env:
+    JSON_RAW: ${{ steps.write-json-outputs.outputs.json_raw }}
+    JSON_QUOTES_ESCAPED: ${{ steps.write-json-outputs.outputs.json_quotes_escaped }}
+    JSON_QUOTES_BACKSLASH_ESCAPED: ${{ steps.write-json-outputs.outputs.json_quotes_backslash_escaped }}
+    JSON_QUOTES_BACKSLASH_ESCAPED_TO_JSON: ${{ toJson(steps.write-json-outputs.outputs.json_quotes_backslash_escaped) }}
+    JSON_QUOTES_BACKSLASH_ESCAPED_WITH_QUOTES: "${{ steps.write-json-outputs.outputs.json_quotes_backslash_escaped }}"
+    JSON_ASCII: ${{ steps.write-json-outputs.outputs.json_ascii }}
+
+ +

When creating the json string, it would be better to not use blank spaces between keys and values, json_raw='{"name":"foo"}' instead of json_raw='{"name": "foo"}, in order to prevent from bash variable mangling issue.

+ +

We have the following output:

+ +
Write json outputs
+  GITHUB_OUTPUT content:
+  json_raw={"name":"foo"}
+  json_quotes_escaped={"name":"foo"}
+  json_quotes_backslash_escaped={\"name\":\"foo\"}
+  json_ascii={"name":"foo"}
+
+Show json outputs
+  json_raw_wo_quotes: {name:foo}
+  json_raw: {name:foo}
+  json_quotes_escaped: {name:foo}
+  json_quotes_backslash_escaped: {"name":"foo"}
+  json_ascii: {name:foo}
+  JSON_RAW: {"name":"foo"}
+  JSON_QUOTES_ESCAPED: {"name":"foo"}
+  JSON_QUOTES_BACKSLASH_ESCAPED: {\"name\":\"foo\"}
+  JSON_QUOTES_BACKSLASH_ESCAPED_TO_JSON: "{\\\"name\\\":\\\"foo\\\"}"
+  JSON_QUOTES_BACKSLASH_ESCAPED_WITH_QUOTES: {\"name\":\"foo\"}
+  JSON_ASCII: {"name":"foo"}
+
+ +

From the output we can see that there’re two ways to have a valid json string in the show step:

+ +
- name: Show json outputs
+  run: |
+    json_quotes_backslash_escaped="${{ steps.write-json-outputs.outputs.json_quotes_backslash_escaped }}"
+    echo "json_quotes_backslash_escaped: $json_quotes_backslash_escaped"
+
+    # echo vars from env
+    echo "JSON_RAW: $JSON_RAW"
+    echo "JSON_QUOTES_ESCAPED: $JSON_QUOTES_ESCAPED"
+    echo "JSON_ASCII: $JSON_ASCII"
+  env:
+    JSON_RAW: ${{ steps.write-json-outputs.outputs.json_raw }}
+    JSON_QUOTES_ESCAPED: ${{ steps.write-json-outputs.outputs.json_quotes_escaped }}
+    JSON_ASCII: ${{ steps.write-json-outputs.outputs.json_ascii }}
+
+ +

Creating a JSON string in GITHUB_OUTPUT without escaping backslashes, like json_quotes_escaped="{\"name\":\"foo\"}", is more concise than "{\\\"name\\\":\\\"foo\\\"}". However, when using ${{ <expression> }} in a bash shell within GitHub Actions, it’s not a valid JSON string. This is because the expressions are processed before the bash shell runs the script, replacing the expression with its value and discarding double quotes. This results in an output like json_raw: {name:foo}. To address this, the toJson function can be used to convert the string into valid JSON.

+ +
- name: Show json outputs
+  run: |
+    # use toJson() to parse the string to a valid json string
+    json_raw="${{ toJson(steps.write-json-outputs.outputs.json_raw) }}"
+    json_quotes_escaped="${{ toJson(steps.write-json-outputs.outputs.json_quotes_escaped) }}"
+    json_ascii="${{ toJson(steps.write-json-outputs.outputs.json_ascii) }}"
+
+    # echo vars from templating inside bash
+    echo "json_raw: $json_raw"
+    echo "json_quotes_escaped: $json_quotes_escaped"
+    echo "json_ascii: $json_ascii"
+
+    # echo vars from env variables
+    echo "JSON_RAW: $JSON_RAW"
+    echo "JSON_QUOTES_ESCAPED: $JSON_QUOTES_ESCAPED"
+    echo "JSON_ASCII: $JSON_ASCII"
+  env:
+    JSON_RAW: ${{ steps.write-json-outputs.outputs.json_raw }}
+    JSON_QUOTES_ESCAPED: ${{ steps.write-json-outputs.outputs.json_quotes_escaped }}
+    JSON_ASCII: ${{ steps.write-json-outputs.outputs.json_ascii }}
+
+ +

Check also the fromJson function to see how to parse json string to object.

+ +

Do not create JSON secrets

+ +

When creating a secret, we should not create a JSON secret. For e.g. the Github action Azure/Login provides an example how to pass creds inputs with a JSON secret:

+ +
- uses: azure/login@v1
+  with:
+    creds: ${{ secrets.AZURE_CREDENTIALS }}
+
+ +

This works but the drawback is that as the curly brackets are stored in the JSON secret, so whenever we want to show { or } in the Github action logs, they will be replaced by ***, as Github actions considers the curly brackets are secret chars. This doesn’t block the successful run of github workflows, but it’s not convenient for debugging.

+ +

A better usage of Azure/Login is also provided in its documentation here:

+ +
- uses: Azure/login@v1
+    with:
+      creds: '{"clientId":"${{ secrets.CLIENT_ID }}","clientSecret":"${{ secrets.CLIENT_SECRET }}","subscriptionId":"${{ secrets.SUBSCRIPTION_ID }}","tenantId":"${{ secrets.TENANT_ID }}"}'
+
+ +

Parsing variables

+ +

Parsing variables with object type

+ +
- run: |
+    echo "github.event: ${{ github.event }}"
+    echo "github.event toJson: $GITHUB_EVENT"
+  env:
+    GITHUB_EVENT: ${{ toJson(github.event) }}
+
+# output:
+github.event: Object
+github.event toJson: {
+  after: 9da8166fcc52c437871a2e903b3e200a35c09a1e,
+  base_ref: null,
+  before: 1448cfbf10fc149b7d200d0a0e15493f41cc8896,
+  ...
+}
+
+ +

echo "github.event toJson: ${{ toJSON(github.event) }}" will raise error, must parse the variable to environment variable $GITHUB_EVENT at first. So when using toJson method to parse object type variable, it is recommended to send the value to an environment variable first.

+ +

Parsing variables with boolean type

+ +

Check with if:

+ +
on:
+  workflow_dispatch:
+    inputs:
+      print_tags:
+        description: 'True to print to STDOUT'
+        required: true
+        type: boolean
+
+jobs:
+  print-tag:
+    runs-on: ubuntu-latest
+    # all the 4 syntaxes below are valid
+    if: inputs.print_tags
+    if: ${{ inputs.print_tags }}
+    if: inputs.print_tags == true
+    if: ${{ inputs.print_tags == true}}
+    steps:
+      - name: Print the input tag to STDOUT
+        run: echo The tags are ${{ inputs.tags }}
+      - name: Print the input tag to STDOUT
+        # in bash, compare boolean with string value
+        run: |
+          if [[ "${{ inputs.print_tags }}" == "true" ]]; then
+            echo The tags are ${{ inputs.tags }}
+          else
+            echo "print_tags is false"
+          fi
+          if [[ "$PRINT_TAGS" == "true" ]]; then
+            echo The tags are ${{ inputs.tags }}
+          else
+            echo "print_tags is false"
+          fi
+        env:
+          PRINT_TAGS: ${{ inputs.print_tags }}
+
+ +

Never use if: ${{ inputs.print_tags }} == false with == outside of {{}}, it will always be true.

+ +

Passing variables

+ +

Passing data between steps inside a job

+ +

Passing by $GITHUB_ENV between steps

+ +

You can make an environment variable available to any subsequent steps in a workflow job by defining or updating the environment variable and writing this to the GITHUB_ENV environment file.

+ +
- run: echo "var_1=value1" >> $GITHUB_ENV
+- run: echo "var_1: $var1"
+
+ +

Passing by $GITHUB_OUTPUT between steps

+ +

Sets a step’s output parameter. Note that the step will need an id to be defined to later retrieve the output value

+ +

Passing data between jobs inside a workflow

+ +

Passing by artifacts between jobs

+ +

You can use the upload-artifact and download-artifact actions to share data (in the forms of a file) between jobs in a workflow.

+ +

To share variables, you can save the variables in a file with format:

+ +
VAR_1=value1
+VAR_2=value2
+
+ +

Then download the file from another job and source it to load the variables:

+ +
- run: |
+    sed "" {downloaded_file_path} >> $GITHUB_ENV
+  shell: bash
+
+ +

Passing by $GITHUB_OUTPUT between jobs

+ +

https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idoutputs

+ +

Passing data between caller workflow and called (reusable) workflow

+ +

Use on.workflow_call.outputs, called workflow outputs are available to all downstream jobs in the caller workflow.

+ +

Passing data between irrelevant workflows

+ +]]>
Xiang ZHUxiang.zhu@outlook.com
\ No newline at end of file diff --git a/image/avatar/zhuxiang-smile.jpg b/image/avatar/zhuxiang-smile.jpg new file mode 100644 index 00000000..cd41dd0a Binary files /dev/null and b/image/avatar/zhuxiang-smile.jpg differ diff --git a/index.html b/index.html new file mode 100644 index 00000000..184496eb --- /dev/null +++ b/index.html @@ -0,0 +1,935 @@ + + + + + + +A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + +
+ + + + + +
+ +

+ + + +

Recent posts

+ + + + +
+ + + + + +
+
+ +

+ + Hashing files + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Github Actions - Python + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Github Actions - Workflows + + +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Github Actions - Variables + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 6 minute read + + + +

+ + +

Variables upon Git events +

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Github Actions - Custom Actions + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

Actions checkout location in workflow +

+
+
+ + + + + + +
+
+ +

+ + Github Actions - Cache + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

+

+
+
+ + +
+ + + + + + +
+
+
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/minimal-mistakes-jekyll.gemspec b/minimal-mistakes-jekyll.gemspec new file mode 100644 index 00000000..e985e498 --- /dev/null +++ b/minimal-mistakes-jekyll.gemspec @@ -0,0 +1,27 @@ +# coding: utf-8 + +Gem::Specification.new do |spec| + spec.name = "minimal-mistakes-jekyll" + spec.version = "4.24.0" + spec.authors = ["Michael Rose"] + + spec.summary = %q{A flexible two-column Jekyll theme.} + spec.homepage = "https://github.com/mmistakes/minimal-mistakes" + spec.license = "MIT" + + spec.metadata["plugin_type"] = "theme" + + spec.files = `git ls-files -z`.split("\x0").select do |f| + f.match(%r{^(assets|_(data|includes|layouts|sass)/|(LICENSE|README|CHANGELOG)((\.(txt|md|markdown)|$)))}i) + end + + spec.add_runtime_dependency "jekyll", ">= 3.7", "< 5.0" + spec.add_runtime_dependency "jekyll-paginate", "~> 1.1" + spec.add_runtime_dependency "jekyll-sitemap", "~> 1.3" + spec.add_runtime_dependency "jekyll-gist", "~> 1.5" + spec.add_runtime_dependency "jekyll-feed", "~> 0.1" + spec.add_runtime_dependency "jekyll-include-cache", "~> 0.1" + + spec.add_development_dependency "bundler" + spec.add_development_dependency "rake", ">= 12.3.3" +end diff --git a/page2/index.html b/page2/index.html new file mode 100644 index 00000000..a618ef57 --- /dev/null +++ b/page2/index.html @@ -0,0 +1,943 @@ + + + + + + +A code to remember - Page 2 + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + +
+ + + + + +
+ +

+ + + +

Recent posts

+ + + + +
+ + + + + +
+
+ +

+ + Python Asyncio + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 6 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Python Asyncio Unittest + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Unittest based on Pytest framework not on embedded unittest. +

+
+
+ + + + + + +
+
+ +

+ + Sonarcloud Github Action + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+
+ +

+ + Calling Azure REST API + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + +
+ + + + + + +
+
+
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/page3/index.html b/page3/index.html new file mode 100644 index 00000000..94a6a4f2 --- /dev/null +++ b/page3/index.html @@ -0,0 +1,949 @@ + + + + + + +A code to remember - Page 3 + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + +
+ + + + + +
+ +

+ + + +

Recent posts

+ + + + +
+ + + + + +
+
+ +

+ + Azure pipeline delete blobs from blob storage + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

The example given by this post is for Azure Pipeline with the latest Ubuntu agent, for AzCli from local machine, removing the --auth-mode login part should w...

+
+
+ + + + + + +
+
+ +

+ + Databricks cluster access mode + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 4 minute read + + + +

+ + +

What is cluster access mode +

+
+
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Azure pipeline jobs + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+
+ +

+ + Databricks job/task context + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

Giving an example of Databricks job/task json context values +

+
+
+ + + + + + +
+
+ +

+ + Azure pipeline conditions + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Azure pipeline has two kinds of conditions: +

+
+
+ + + + + + +
+
+ +

+ + Using Databricks Connect inside a container + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 5 minute read + + + +

+ + +

Using Databricks Connect inside a container with VSCode remote containers with spark, jre, python, databricks-connect pre-installed. +

+
+
+ + + + + + +
+
+ +

+ + Azure Pipeline Checkout Multiple Repositories + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 13 minute read + + + +

+ + +

This post will talk about some Azure pipeline predefined variables’ values in a multiple repositories checkout situation. The official doc is here. +

+
+
+ + + + + + +
+ +
+ + +
+ + + + + + +
+
+
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/page4/index.html b/page4/index.html new file mode 100644 index 00000000..43591975 --- /dev/null +++ b/page4/index.html @@ -0,0 +1,956 @@ + + + + + + +A code to remember - Page 4 + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + +
+ + + + + +
+ +

+ + + +

Recent posts

+ + + + +
+ + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Azure pipeline predefined variables + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

The official doc gives an explanation of all the predefined variables, but it lacks of some concret examples. Hereunder some examples for my preferred variab...

+
+
+ + + + + + +
+
+ +

+ + Python Asyncio Study notes + + +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +

concurrent.futures +

+
+
+ + + + + + +
+
+ +

+ + Python datetime utcnow + + +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Trying Python pipreqs and pip-tools + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

Some simple demos to test pipreqs and pip-tools +

+
+
+ + + + + + +
+
+ +

+ + Python Requests With Retry + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

Make python requests retry easily to use +

+
+
+ + + + + + +
+
+ +

+ + Python Lint And Format + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 11 minute read + + + +

+ + +

Azure SDK Python Guidelines +

+
+
+ + +
+ + + + + + +
+
+
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/page5/index.html b/page5/index.html new file mode 100644 index 00000000..72b2449c --- /dev/null +++ b/page5/index.html @@ -0,0 +1,968 @@ + + + + + + +A code to remember - Page 5 + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + +
+ + + + + +
+ +

+ + + +

Recent posts

+ + + + +
+ + + + + +
+
+ +

+ + My Powerline setup and configuration + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Just my way to setup and configure powerline in WSL +

+
+
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Flattening nested dict in Python + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

Flattening a nested dict/json with list as some keys’ value. +

+
+
+ + + + + + +
+
+ +

+ + Setting up WSL + + +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +

Setting up WSL (Windows Subsystem for Linux) +

+
+
+ + + + + + +
+
+ +

+ + Using Scoop On Windows + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Some tips to use Scoop. +

+
+
+ + + + + + +
+ +
+ + +
+ + + + + + +
+
+
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/page6/index.html b/page6/index.html new file mode 100644 index 00000000..cc239ea4 --- /dev/null +++ b/page6/index.html @@ -0,0 +1,961 @@ + + + + + + +A code to remember - Page 6 + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + +
+ + + + + +
+ +

+ + + +

Recent posts

+ + + + +
+ + + + + +
+
+ +

+ + Elastic Painless Scripted Field On Null/Missing Value + + +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +

How to use painless scripted field to working on objects which might be null or missing in some documents. +

+
+
+ + + + + + +
+
+ +

+ + Install Python3 on Ubuntu + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Install Python3 on Ubuntu by using official source. +

+
+
+ + + + + + +
+
+ +

+ + SQLAlchemy mixin in method + + +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +

Share common methods across SQLAlchemy db model classes by using mixin. +

+
+
+ + + + + + +
+
+ +

+ + A fast way to check TCP port in Powershell + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

Test-NetConnection is too slow if the remote port is not opened due to its timeout setting. Use System.Net.Sockets.TcpClient instead. +

+
+
+ + + + + + +
+
+ +

+ + Troubleshooting Python Twine Cannot Upload Package On Windows + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

Python has several tools to upload packages to PyPi or some private Artifactory locations. The mostly used one should be twine. Although twine is not a Pytho...

+
+
+ + + + + + +
+
+ +

+ + Filtering In Pandas Dataframe + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 5 minute read + + + +

+ + +

Filtering a pandas dataframe with series, query, or numpy methods. +

+
+
+ + + + + + +
+
+ +

+ + Git Cheat Sheet + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 5 minute read + + + +

+ + +

+ This is not a complete Git cheat sheet for everyone, this is just a personal cheat sheet for some often forgotten git commands. + + +

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Creating Custom Python Request Auth Class + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

Creating custom python request auth class with requests.auth.AuthBase. +

+
+
+ + + + + + +
+ +
+ + +
+ + + + + + +
+
+
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/page7/index.html b/page7/index.html new file mode 100644 index 00000000..59665a3e --- /dev/null +++ b/page7/index.html @@ -0,0 +1,952 @@ + + + + + + +A code to remember - Page 7 + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + +
+ + + + + +
+ +

+ + + +

Recent posts

+ + + + +
+ + + + + +
+
+ +

+ + Setting Pwsh Invoke-WebRequest Proxy + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

One-line command to set Powershell Core web cmdlets proxy. +

+
+
+ + + + + + +
+
+ +

+ + Using Gitlab integrated CICD for Python project on Windows + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 4 minute read + + + +

+ + +

Gitlab ships with its own free CICD which works pretty well. This post will give a .gitlab-ci.yml demo for a Python project running on Gitlab Windows runner. +

+
+
+ + + + + + +
+
+ +

+ + Migrate Gitlab in docker + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 4 minute read + + + +

+ + +

+ This post will walk you through the steps to migrate Gitlab from one docker container to another. The steps need you to know how to install a new Gitlab c...

+
+
+ + + + + + +
+
+ +

+ + Update Gitlab in docker + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Step by step procedure to update Gitlab in docker. +

+
+
+ + + + + + +
+
+ +

+ + Terminate Powershell script or session + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 4 minute read + + + +

+ + +

+ I always asked myself how to terminate a Powershell script or session, each time I needed to do some tests by myself and also searched on Google. But I co...

+
+
+ + + + + + +
+
+ +

+ + Backup and restore Gitlab in docker + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 8 minute read + + + +

+ + +

Step by step procedure to backup and restore Gitlab in docker. +

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Setup HTTPS for Gitlab + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 6 minute read + + + +

+ + +

Setup a SAN SSL certificate to use the HTTPS on Gitlab-CE in docker on Ubuntu server. +

+
+
+ + + + + + +
+
+ +

+ + Install Gitlab-CE in Docker on Ubuntu + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

Step by step installation of Gitlab-CE in docker on Ubuntu server. +

+
+
+ + + + + + +
+ +
+ + +
+ + + + + + +
+
+
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/page8/index.html b/page8/index.html new file mode 100644 index 00000000..82ec8576 --- /dev/null +++ b/page8/index.html @@ -0,0 +1,945 @@ + + + + + + +A code to remember - Page 8 + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + +
+ + + + + +
+ +

+ + + +

Recent posts

+ + + + +
+ + + + + +
+ +
+ + + + + + +
+
+ +

+ + Use python tabulate module to create tables + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

+ If you want to create some tables from a python list, you can use the tabulate module, it can generate the table easily in text mode and in many formats, ...

+
+
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Git untrack submodule from git status + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

submodule folders cannot be added into .gitignore file to untrack them from git status, we will use ignore=dirty to ignore it +

+
+
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + +
+ + + + + + +
+
+
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/page9/index.html b/page9/index.html new file mode 100644 index 00000000..96ea6876 --- /dev/null +++ b/page9/index.html @@ -0,0 +1,610 @@ + + + + + + +A code to remember - Page 9 + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + +
+ + + + + +
+ +

+ + + +

Recent posts

+ + + + +
+ + + + + +
+
+ +

+ + Powershell stop-parsing (--%) + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

Use Powershell stop-parsing (--%) to treat the remaining characters in the line as a literal. +

+
+
+ + + + + + +
+
+ +

+ + Setting Up Powershell gallery And Nuget gallery + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

As like pypi for Python, npm for Node.js, we also have Powershell Gallery and Nuget Gallery for Powershell. +

+
+
+ + + + + + +
+
+ +

+ + Setting up Github Pages With custom domain over HTTPS + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

With Github pages, we can create our blogs in our own domain over HTTPS completely free. Of course you should pay for your domain name at the Registrar. +

+
+
+ + +
+ + + + + + +
+
+
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/robots.txt b/robots.txt new file mode 100644 index 00000000..d33eb70e --- /dev/null +++ b/robots.txt @@ -0,0 +1 @@ +Sitemap: https://www.copdips.com/sitemap.xml diff --git a/screenshot-layouts.png b/screenshot-layouts.png new file mode 100644 index 00000000..88ef60fd Binary files /dev/null and b/screenshot-layouts.png differ diff --git a/screenshot.png b/screenshot.png new file mode 100644 index 00000000..56a0a708 Binary files /dev/null and b/screenshot.png differ diff --git a/sitemap.xml b/sitemap.xml new file mode 100644 index 00000000..751c33e2 --- /dev/null +++ b/sitemap.xml @@ -0,0 +1,368 @@ + + + +https://www.copdips.com/2018/05/setting-up-github-pages-with-custom-domain-over-https.html +2018-05-03T00:00:00+00:00 + + +https://www.copdips.com/2018/05/setting-up-powershell-gallery-and-nuget-gallery-for-powershell.html +2018-05-07T00:00:00+00:00 + + +https://www.copdips.com/2018/05/powershell-stop-parsing.html +2018-05-16T00:00:00+00:00 + + +https://www.copdips.com/2018/05/setting-up-jekyll-with-minimal-mistakes-theme-on-windows.html +2022-08-14T18:04:18+00:00 + + +https://www.copdips.com/2018/05/using-readline-in-python-repl-on-windows.html +2018-05-22T00:00:00+00:00 + + +https://www.copdips.com/2018/05/grep-like-powershell-colorful-select-string.html +2019-12-31T22:24:25+00:00 + + +https://www.copdips.com/2018/06/converting-python-json-list-to-csv-in-2-lines-of-code-by-pandas.html +2018-06-03T00:00:00+00:00 + + +https://www.copdips.com/2018/06/import-python-module-with-sys-path-when-without-init-file.html +2018-06-21T00:00:00+00:00 + + +https://www.copdips.com/2018/06/git-untrack-submodule-from-git-status.html +2018-06-22T00:00:00+00:00 + + +https://www.copdips.com/2018/06/install-python-on-windows-with-powershell-without-administrator-privileges.html +2019-12-30T17:31:08+00:00 + + +https://www.copdips.com/2018/07/use-pyvmomi-EventHistoryCollector-to-get-all-the-vcenter-events.html +2018-07-25T00:00:00+00:00 + + +https://www.copdips.com/2018/07/use-python-tabulate-module-to-create-tables.html +2018-07-28T00:00:00+00:00 + + +https://www.copdips.com/2018/07/convert-markdown-or-rst-to-atlassian-confluance-documentation-format.html +2018-07-29T00:00:00+00:00 + + +https://www.copdips.com/2018/09/windows-scheduled-task-by-powershell.html +2018-09-05T00:00:00+00:00 + + +https://www.copdips.com/2018/09/install-gitlab-ce-in-docker-on-ubuntu.html +2018-09-06T00:00:00+00:00 + + +https://www.copdips.com/2018/09/setup-https-for-gitlab.html +2018-09-16T00:00:00+00:00 + + +https://www.copdips.com/2018/09/install-gitlab-runner-on-windows-by-powershell-psremoting.html +2018-09-20T00:00:00+00:00 + + +https://www.copdips.com/2018/09/backup-and-restore-gitlab-in-docker.html +2018-09-24T00:00:00+00:00 + + +https://www.copdips.com/2018/09/terminate-powershell-script-or-session.html +2018-09-28T00:00:00+00:00 + + +https://www.copdips.com/2018/10/update-gitlab-in-docker.html +2018-10-03T00:00:00+00:00 + + +https://www.copdips.com/2018/10/migrate-gitlab-in-docker.html +2018-10-10T00:00:00+00:00 + + +https://www.copdips.com/2018/10/using-gitlab-integrated-cicd-for-python-project-on-windows.html +2018-10-18T00:00:00+00:00 + + +https://www.copdips.com/2018/11/setting-pwsh-invoke-webrequest-proxy.html +2018-11-01T00:00:00+00:00 + + +https://www.copdips.com/2018/11/creating-multiple-redis-instance-services-on-windows.html +2018-11-05T00:00:00+00:00 + + +https://www.copdips.com/2019/04/creating-custom-python-request-auth-class.html +2019-04-01T00:00:00+00:00 + + +https://www.copdips.com/2019/05/using-python-sqlalchemy-session-in-multithreading.html +2021-03-21T22:28:44+00:00 + + +https://www.copdips.com/2019/06/git-cheat-sheet.html +2023-06-02T23:00:24+00:00 + + +https://www.copdips.com/2019/07/filtering-pandas-dataframe.html +2019-07-13T00:00:00+00:00 + + +https://www.copdips.com/2019/07/troubleshooting-python-twine-cannot-upload-package-on-windows.html +2019-07-30T00:00:00+00:00 + + +https://www.copdips.com/2019/09/fast-tcp-port-check-in-powershell.html +2019-12-31T22:49:18+00:00 + + +https://www.copdips.com/2019/09/sqlalchemy-mixin-in-method.html +2020-07-26T13:47:04+00:00 + + +https://www.copdips.com/2019/10/installing-python3-on-ubuntu.html +2021-03-17T00:12:13+00:00 + + +https://www.copdips.com/2019/12/elastic-painless-scripted-field-on-null-or-mssing-value.html +2019-12-21T00:00:00+00:00 + + +https://www.copdips.com/2019/12/Using-Powershell-to-retrieve-latest-package-url-from-github-releases.html +2019-12-29T00:00:00+00:00 + + +https://www.copdips.com/2019/12/Using-Scoop-On-Windows.html +2020-01-11T21:45:02+00:00 + + +https://www.copdips.com/2020/02/setting-up-wsl.html +2020-02-01T00:00:00+00:00 + + +https://www.copdips.com/2020/03/flattening-nested-dict-in-python.html +2020-03-09T00:00:00+00:00 + + +https://www.copdips.com/2020/04/fixing-ipython-on-Windows10-ConEmu-mouse-event-bug.html +2020-04-13T00:00:00+00:00 + + +https://www.copdips.com/2020/04/making-isort-compatible-with-black.html +2021-03-28T11:58:35+00:00 + + +https://www.copdips.com/2020/05/using-python-contextmanager-to-create-a-timer-decorator.html +2020-05-05T00:00:00+00:00 + + +https://www.copdips.com/2020/06/compiling-sqlalchemy-query-to-nearly-real-raw-sql-query.html +2020-06-08T12:35:21+00:00 + + +https://www.copdips.com/2020/07/rolling-back-from-flask-restplus-reqparse-to-native-flask-request-to-parse-inputs.html +2020-07-16T00:00:00+00:00 + + +https://www.copdips.com/2020/11/my-powerline.html +2020-11-24T00:00:00+00:00 + + +https://www.copdips.com/2021/01/python-lint-and-format.html +2023-06-10T00:55:20+00:00 + + +https://www.copdips.com/2021/01/python-requests-with-retry.html +2021-03-20T08:31:41+00:00 + + +https://www.copdips.com/2021/03/trying-python-pipreqs-and-pip-tools.html +2021-03-06T00:00:00+00:00 + + +https://www.copdips.com/2021/06/python-unittest-cheet-sheet.html +2021-06-12T00:00:00+00:00 + + +https://www.copdips.com/2021/06/python-datetime-utc-now.html +2022-09-05T10:39:43+00:00 + + +https://www.copdips.com/2021/09/python-asyncio.html +2021-09-04T00:00:00+00:00 + + +https://www.copdips.com/2022/01/azure-pipeline-predefined-variables.html +2022-01-24T10:17:34+00:00 + + +https://www.copdips.com/2022/02/azure-pipeline-reuse-variables-in-template-from-another-repository.html +2022-02-09T00:00:00+00:00 + + +https://www.copdips.com/2022/02/azure-pipeline-checkout-repository-from-another-project.html +2022-09-16T21:34:39+00:00 + + +https://www.copdips.com/2022/03/azure-pipeline-variables-and-parameters.html +2022-06-16T21:57:13+00:00 + + +https://www.copdips.com/2022/03/manage-azure-databricks-service-principal.html +2022-03-27T00:00:00+00:00 + + +https://www.copdips.com/2022/04/azure-pipeline-checkout-multiple-repositories.html +2022-04-03T00:00:00+00:00 + + +https://www.copdips.com/2022/06/using-databricks-connect-inside-a-container.html +2022-10-15T22:38:07+00:00 + + +https://www.copdips.com/2022/07/azure-pipeline-conditions.html +2022-07-03T00:00:00+00:00 + + +https://www.copdips.com/2022/07/databricks-job-context.html +2022-07-28T00:00:00+00:00 + + +https://www.copdips.com/2022/08/azure-pipeline-jobs.html +2022-08-14T00:00:00+00:00 + + +https://www.copdips.com/2022/09/azure-pipeline-system-access-token-in-shared-pipeline.html +2022-09-12T00:00:00+00:00 + + +https://www.copdips.com/2022/09/adding-data-files-to-python-package-with-setup-py.html +2022-09-15T00:00:00+00:00 + + +https://www.copdips.com/2022/09/databricks-cluster-access-mode.html +2022-09-20T00:00:00+00:00 + + +https://www.copdips.com/2022/11/azure-pipeline-delete-blobs-from-blob-storage.html +2022-11-09T00:00:00+00:00 + + +https://www.copdips.com/2022/11/azure-pipeline-windows-agent-UnicodeEncodeError.html +2022-11-13T00:00:00+00:00 + + +https://www.copdips.com/2022/11/using-ast-and-cst-to-change-python-code.html +2022-11-15T00:00:00+00:00 + + +https://www.copdips.com/2022/12/python-difference-on-subprocess-run-call-check-call-check-output.html +2022-12-02T21:32:49+00:00 + + +https://www.copdips.com/2022/12/syncing-repository-from-github-to-gitee.html +2022-12-03T00:00:00+00:00 + + +https://www.copdips.com/2023/01/python-aiohttp-rate-limit.html +2023-01-06T21:26:17+00:00 + + +https://www.copdips.com/2023/01/calling-azure-rest-api.html +2023-05-23T15:58:39+00:00 + + +https://www.copdips.com/2023/01/sonarcloud-github-action.html +2023-01-28T00:00:00+00:00 + + +https://www.copdips.com/2023/07/python-asyncio-unittest.html +2023-07-04T00:00:00+00:00 + + +https://www.copdips.com/2023/09/different-ssh-keys-for-different-github.com-accounts.html +2023-09-04T00:00:00+00:00 + + +https://www.copdips.com/2023/09/python-asyncio.html +2023-09-14T00:00:00+00:00 + + +https://www.copdips.com/2023/09/github-actions-cache.html +2023-09-19T00:00:00+00:00 + + +https://www.copdips.com/2023/09/github-actions-custom-actions.html +2023-09-19T00:00:00+00:00 + + +https://www.copdips.com/2023/09/github-actions-environment.html +2023-09-19T00:00:00+00:00 + + +https://www.copdips.com/2023/09/github-actions-variables.html +2023-09-19T00:00:00+00:00 + + +https://www.copdips.com/2023/09/github-actions-error-handling.html +2023-09-20T00:00:00+00:00 + + +https://www.copdips.com/2023/09/github-actions-workflows.html +2023-09-21T00:00:00+00:00 + + +https://www.copdips.com/2023/09/databricks-python-pip-authentication.html +2023-09-22T00:00:00+00:00 + + +https://www.copdips.com/2023/09/github-actions-python.html +2023-09-22T00:00:00+00:00 + + +https://www.copdips.com/2023/10/github-actions-get-azure-keyvault-secrets-action.html +2023-10-16T00:00:00+00:00 + + +https://www.copdips.com/2023/10/hashing-files.html +2023-10-21T00:00:00+00:00 + + +https://www.copdips.com/ + + +https://www.copdips.com/tags/ + + +https://www.copdips.com/year-archive/ + + +https://www.copdips.com/page2/ + + +https://www.copdips.com/page3/ + + +https://www.copdips.com/page4/ + + +https://www.copdips.com/page5/ + + +https://www.copdips.com/page6/ + + +https://www.copdips.com/page7/ + + +https://www.copdips.com/page8/ + + +https://www.copdips.com/page9/ + + diff --git a/staticman.yml b/staticman.yml new file mode 100644 index 00000000..61b95925 --- /dev/null +++ b/staticman.yml @@ -0,0 +1,104 @@ +# Name of the property. You can have multiple properties with completely +# different config blocks for different sections of your site. +# For example, you can have one property to handle comment submission and +# another one to handle posts. +# To encrypt strings use the following endpoint: +# https://{your Staticman API URL}/v[2|3]/encrypt/{TEXT TO BE ENCRYPTED} + +comments: + # (*) REQUIRED + # + # Names of the fields the form is allowed to submit. If a field that is + # not here is part of the request, an error will be thrown. + allowedFields: ["name", "email", "url", "message"] + + # (*) REQUIRED WHEN USING NOTIFICATIONS + # + # When allowedOrigins is defined, only requests sent from one of the domains + # listed will be accepted. The origin is sent as part as the `options` object + # (e.g. + + + + +Posts by Tag - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + +
+ + + + + +
+ +

Posts by Tag

+ + + + + + + +
    + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
  • + + python 34 + +
  • + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
  • + + cicd 27 + +
  • + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
  • + + azure 19 + +
  • + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
  • + + powershell 13 + +
  • + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
  • + + githubaction 10 + +
  • + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
  • + + gitlab 7 + +
  • + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
  • + + ubuntu 6 + +
  • + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
  • + + docker 5 + +
  • + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
  • + + async 5 + +
  • + + + + + +
  • + + databricks 5 + +
  • + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
  • + + git 4 + +
  • + + + + + + + + + + + + + +
  • + + format 4 + +
  • + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
  • + + packaging 3 + +
  • + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
  • + + sqlalchemy 3 + +
  • + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
  • + + pip 3 + +
  • + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
  • + + web 2 + +
  • + + + +
  • + + nuget 2 + +
  • + + + + + + + + + +
  • + + proxy 2 + +
  • + + + + + +
  • + + ssh 2 + +
  • + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
  • + + markdown 2 + +
  • + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
  • + + requests 2 + +
  • + + + + + + + + + + + + + + + + + +
  • + + wsl 2 + +
  • + + + +
  • + + linux 2 + +
  • + + + + + + + +
  • + + vscode 2 + +
  • + + + + + + + +
  • + + shell 2 + +
  • + + + + + +
  • + + unittest 2 + +
  • + + + +
  • + + pytest 2 + +
  • + + + + + + + + + + + + + +
  • + + spark 2 + +
  • + + + + + + + + + + + + + + + + + +
  • + + cache 2 + +
  • + + + +
  • + + auth 2 + +
  • + + + + + + + +
  • + + github 1 + +
  • + + + +
  • + + github-pages 1 + +
  • + + + + + + + + + +
  • + + powershell-gallery 1 + +
  • + + + + + + + +
  • + + parsing 1 + +
  • + + + + + +
  • + + jekyll 1 + +
  • + + + +
  • + + windows 1 + +
  • + + + +
  • + + ruby 1 + +
  • + + + + + +
  • + + repl 1 + +
  • + + + +
  • + + readline 1 + +
  • + + + +
  • + + string 1 + +
  • + + + +
  • + + regex 1 + +
  • + + + +
  • + + json 1 + +
  • + + + +
  • + + csv 1 + +
  • + + + +
  • + + module 1 + +
  • + + + + + +
  • + + submodule 1 + +
  • + + + +
  • + + package 1 + +
  • + + + +
  • + + pyvmomi 1 + +
  • + + + +
  • + + vmware 1 + +
  • + + + + + + + +
  • + + scheduled-task 1 + +
  • + + + + + + + + + + + +
  • + + certificate 1 + +
  • + + + +
  • + + openssl 1 + +
  • + + + +
  • + + backup 1 + +
  • + + + +
  • + + update 1 + +
  • + + + +
  • + + migration 1 + +
  • + + + +
  • + + service 1 + +
  • + + + +
  • + + redis 1 + +
  • + + + + + + + +
  • + + multithreading 1 + +
  • + + + +
  • + + pandas 1 + +
  • + + + +
  • + + filtering 1 + +
  • + + + +
  • + + network 1 + +
  • + + + +
  • + + elastic 1 + +
  • + + + +
  • + + scoop 1 + +
  • + + + + + + + +
  • + + itertools 1 + +
  • + + + +
  • + + ipython 1 + +
  • + + + + + +
  • + + contextlib 1 + +
  • + + + +
  • + + flask 1 + +
  • + + + + + + + + + + + +
  • + + datetime 1 + +
  • + + + + + + + + + +
  • + + container 1 + +
  • + + + + + +
  • + + storage 1 + +
  • + + + +
  • + + codec 1 + +
  • + + + +
  • + + ast 1 + +
  • + + + +
  • + + api 1 + +
  • + + + +
  • + + rest 1 + +
  • + + + + + +
  • + + sonar 1 + +
  • + + + + + + + +
  • + + vault 1 + +
  • + + + +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+

python

+
+ + + + + +
+ +
+ + + + + + +
+
+ +

+ + Github Actions - Python + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Python Asyncio + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 6 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+
+ +

+ + Python Asyncio Unittest + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Unittest based on Pytest framework not on embedded unittest. +

+
+
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Python Asyncio Study notes + + +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +

concurrent.futures +

+
+
+ + + + + + +
+
+ +

+ + Python datetime utcnow + + +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Trying Python pipreqs and pip-tools + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

Some simple demos to test pipreqs and pip-tools +

+
+
+ + + + + + +
+
+ +

+ + Python Requests With Retry + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

Make python requests retry easily to use +

+
+
+ + + + + + +
+
+ +

+ + Python Lint And Format + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 11 minute read + + + +

+ + +

Azure SDK Python Guidelines +

+
+
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Flattening nested dict in Python + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

Flattening a nested dict/json with list as some keys’ value. +

+
+
+ + + + + + +
+
+ +

+ + Install Python3 on Ubuntu + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Install Python3 on Ubuntu by using official source. +

+
+
+ + + + + + +
+
+ +

+ + SQLAlchemy mixin in method + + +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +

Share common methods across SQLAlchemy db model classes by using mixin. +

+
+
+ + + + + + +
+
+ +

+ + Troubleshooting Python Twine Cannot Upload Package On Windows + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

Python has several tools to upload packages to PyPi or some private Artifactory locations. The mostly used one should be twine. Although twine is not a Pytho...

+
+
+ + + + + + +
+
+ +

+ + Filtering In Pandas Dataframe + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 5 minute read + + + +

+ + +

Filtering a pandas dataframe with series, query, or numpy methods. +

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Creating Custom Python Request Auth Class + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

Creating custom python request auth class with requests.auth.AuthBase. +

+
+
+ + + + + + +
+
+ +

+ + Using Gitlab integrated CICD for Python project on Windows + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 4 minute read + + + +

+ + +

Gitlab ships with its own free CICD which works pretty well. This post will give a .gitlab-ci.yml demo for a Python project running on Gitlab Windows runner. +

+
+
+ + + + + + +
+
+ +

+ + Use python tabulate module to create tables + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

+ If you want to create some tables from a python list, you can use the tabulate module, it can generate the table easily in text mode and in many formats, ...

+
+
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+

cicd

+
+ + + + + +
+
+ +

+ + Hashing files + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Github Actions - Python + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+
+ +

+ + Github Actions - Workflows + + +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Github Actions - Variables + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 6 minute read + + + +

+ + +

Variables upon Git events +

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Github Actions - Custom Actions + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

Actions checkout location in workflow +

+
+
+ + + + + + +
+
+ +

+ + Github Actions - Cache + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+
+ +

+ + Sonarcloud Github Action + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Azure pipeline delete blobs from blob storage + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

The example given by this post is for Azure Pipeline with the latest Ubuntu agent, for AzCli from local machine, removing the --auth-mode login part should w...

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Azure pipeline jobs + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+
+ +

+ + Azure pipeline conditions + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Azure pipeline has two kinds of conditions: +

+
+
+ + + + + + +
+
+ +

+ + Azure Pipeline Checkout Multiple Repositories + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 13 minute read + + + +

+ + +

This post will talk about some Azure pipeline predefined variables’ values in a multiple repositories checkout situation. The official doc is here. +

+
+
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Azure pipeline predefined variables + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

The official doc gives an explanation of all the predefined variables, but it lacks of some concret examples. Hereunder some examples for my preferred variab...

+
+
+ + + + + + +
+
+ +

+ + Using Gitlab integrated CICD for Python project on Windows + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 4 minute read + + + +

+ + +

Gitlab ships with its own free CICD which works pretty well. This post will give a .gitlab-ci.yml demo for a Python project running on Gitlab Windows runner. +

+
+
+ + + + + + +
+
+ +

+ + Migrate Gitlab in docker + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 4 minute read + + + +

+ + +

+ This post will walk you through the steps to migrate Gitlab from one docker container to another. The steps need you to know how to install a new Gitlab c...

+
+
+ + + + + + +
+
+ +

+ + Update Gitlab in docker + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Step by step procedure to update Gitlab in docker. +

+
+
+ + + + + + +
+
+ +

+ + Backup and restore Gitlab in docker + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 8 minute read + + + +

+ + +

Step by step procedure to backup and restore Gitlab in docker. +

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Setup HTTPS for Gitlab + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 6 minute read + + + +

+ + +

Setup a SAN SSL certificate to use the HTTPS on Gitlab-CE in docker on Ubuntu server. +

+
+
+ + + + + + +
+
+ +

+ + Install Gitlab-CE in Docker on Ubuntu + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

Step by step installation of Gitlab-CE in docker on Ubuntu server. +

+
+
+ + +
+ Back to top ↑ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+

azure

+
+ + + + + +
+
+ +

+ + Hashing files + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Github Actions - Python + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+
+ +

+ + Github Actions - Custom Actions + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

Actions checkout location in workflow +

+
+
+ + + + + + +
+
+ +

+ + Github Actions - Cache + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+
+ +

+ + Calling Azure REST API + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Azure pipeline delete blobs from blob storage + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

The example given by this post is for Azure Pipeline with the latest Ubuntu agent, for AzCli from local machine, removing the --auth-mode login part should w...

+
+
+ + + + + + +
+
+ +

+ + Databricks cluster access mode + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 4 minute read + + + +

+ + +

What is cluster access mode +

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Azure pipeline jobs + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+
+ +

+ + Databricks job/task context + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

Giving an example of Databricks job/task json context values +

+
+
+ + + + + + +
+
+ +

+ + Azure pipeline conditions + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Azure pipeline has two kinds of conditions: +

+
+
+ + + + + + +
+
+ +

+ + Azure Pipeline Checkout Multiple Repositories + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 13 minute read + + + +

+ + +

This post will talk about some Azure pipeline predefined variables’ values in a multiple repositories checkout situation. The official doc is here. +

+
+
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Azure pipeline predefined variables + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

The official doc gives an explanation of all the predefined variables, but it lacks of some concret examples. Hereunder some examples for my preferred variab...

+
+
+ + +
+ Back to top ↑ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+

powershell

+
+ + + + + +
+
+ +

+ + Using Scoop On Windows + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Some tips to use Scoop. +

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + A fast way to check TCP port in Powershell + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

Test-NetConnection is too slow if the remote port is not opened due to its timeout setting. Use System.Net.Sockets.TcpClient instead. +

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Setting Pwsh Invoke-WebRequest Proxy + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

One-line command to set Powershell Core web cmdlets proxy. +

+
+
+ + + + + + +
+
+ +

+ + Using Gitlab integrated CICD for Python project on Windows + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 4 minute read + + + +

+ + +

Gitlab ships with its own free CICD which works pretty well. This post will give a .gitlab-ci.yml demo for a Python project running on Gitlab Windows runner. +

+
+
+ + + + + + +
+
+ +

+ + Terminate Powershell script or session + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 4 minute read + + + +

+ + +

+ I always asked myself how to terminate a Powershell script or session, each time I needed to do some tests by myself and also searched on Google. But I co...

+
+
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Powershell stop-parsing (--%) + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

Use Powershell stop-parsing (--%) to treat the remaining characters in the line as a literal. +

+
+
+ + + + + + +
+
+ +

+ + Setting Up Powershell gallery And Nuget gallery + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

As like pypi for Python, npm for Node.js, we also have Powershell Gallery and Nuget Gallery for Powershell. +

+
+
+ + +
+ Back to top ↑ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+

githubaction

+
+ + + + + +
+
+ +

+ + Hashing files + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Github Actions - Python + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+
+ +

+ + Github Actions - Workflows + + +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Github Actions - Variables + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 6 minute read + + + +

+ + +

Variables upon Git events +

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Github Actions - Custom Actions + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

Actions checkout location in workflow +

+
+
+ + + + + + +
+
+ +

+ + Github Actions - Cache + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+
+ +

+ + Sonarcloud Github Action + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

+

+
+
+ + +
+ Back to top ↑ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+

gitlab

+
+ + + + + +
+
+ +

+ + Using Gitlab integrated CICD for Python project on Windows + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 4 minute read + + + +

+ + +

Gitlab ships with its own free CICD which works pretty well. This post will give a .gitlab-ci.yml demo for a Python project running on Gitlab Windows runner. +

+
+
+ + + + + + +
+
+ +

+ + Migrate Gitlab in docker + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 4 minute read + + + +

+ + +

+ This post will walk you through the steps to migrate Gitlab from one docker container to another. The steps need you to know how to install a new Gitlab c...

+
+
+ + + + + + +
+
+ +

+ + Update Gitlab in docker + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Step by step procedure to update Gitlab in docker. +

+
+
+ + + + + + +
+
+ +

+ + Backup and restore Gitlab in docker + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 8 minute read + + + +

+ + +

Step by step procedure to backup and restore Gitlab in docker. +

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Setup HTTPS for Gitlab + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 6 minute read + + + +

+ + +

Setup a SAN SSL certificate to use the HTTPS on Gitlab-CE in docker on Ubuntu server. +

+
+
+ + + + + + +
+
+ +

+ + Install Gitlab-CE in Docker on Ubuntu + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

Step by step installation of Gitlab-CE in docker on Ubuntu server. +

+
+
+ + +
+ Back to top ↑ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+

ubuntu

+
+ + + + + +
+
+ +

+ + Install Python3 on Ubuntu + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Install Python3 on Ubuntu by using official source. +

+
+
+ + + + + + +
+
+ +

+ + Migrate Gitlab in docker + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 4 minute read + + + +

+ + +

+ This post will walk you through the steps to migrate Gitlab from one docker container to another. The steps need you to know how to install a new Gitlab c...

+
+
+ + + + + + +
+
+ +

+ + Update Gitlab in docker + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Step by step procedure to update Gitlab in docker. +

+
+
+ + + + + + +
+
+ +

+ + Backup and restore Gitlab in docker + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 8 minute read + + + +

+ + +

Step by step procedure to backup and restore Gitlab in docker. +

+
+
+ + + + + + +
+
+ +

+ + Setup HTTPS for Gitlab + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 6 minute read + + + +

+ + +

Setup a SAN SSL certificate to use the HTTPS on Gitlab-CE in docker on Ubuntu server. +

+
+
+ + + + + + +
+
+ +

+ + Install Gitlab-CE in Docker on Ubuntu + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

Step by step installation of Gitlab-CE in docker on Ubuntu server. +

+
+
+ + +
+ Back to top ↑ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+

docker

+
+ + + + + +
+
+ +

+ + Using Databricks Connect inside a container + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 5 minute read + + + +

+ + +

Using Databricks Connect inside a container with VSCode remote containers with spark, jre, python, databricks-connect pre-installed. +

+
+
+ + + + + + +
+
+ +

+ + Migrate Gitlab in docker + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 4 minute read + + + +

+ + +

+ This post will walk you through the steps to migrate Gitlab from one docker container to another. The steps need you to know how to install a new Gitlab c...

+
+
+ + + + + + +
+
+ +

+ + Update Gitlab in docker + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Step by step procedure to update Gitlab in docker. +

+
+
+ + + + + + +
+
+ +

+ + Backup and restore Gitlab in docker + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 8 minute read + + + +

+ + +

Step by step procedure to backup and restore Gitlab in docker. +

+
+
+ + + + + + +
+
+ +

+ + Install Gitlab-CE in Docker on Ubuntu + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

Step by step installation of Gitlab-CE in docker on Ubuntu server. +

+
+
+ + +
+ Back to top ↑ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+

async

+
+ + + + + +
+ +
+ + + + + + +
+
+ +

+ + Python Asyncio + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 6 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+
+ +

+ + Python Asyncio Unittest + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Unittest based on Pytest framework not on embedded unittest. +

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Python Asyncio Study notes + + +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +

concurrent.futures +

+
+
+ + +
+ Back to top ↑ +
+ + + + + +
+

databricks

+
+ + + + + +
+ +
+ + + + + + +
+
+ +

+ + Databricks cluster access mode + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 4 minute read + + + +

+ + +

What is cluster access mode +

+
+
+ + + + + + +
+
+ +

+ + Databricks job/task context + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

Giving an example of Databricks job/task json context values +

+
+
+ + + + + + +
+
+ +

+ + Using Databricks Connect inside a container + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 5 minute read + + + +

+ + +

Using Databricks Connect inside a container with VSCode remote containers with spark, jre, python, databricks-connect pre-installed. +

+
+
+ + + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+

git

+
+ + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Git Cheat Sheet + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 5 minute read + + + +

+ + +

+ This is not a complete Git cheat sheet for everyone, this is just a personal cheat sheet for some often forgotten git commands. + + +

+
+
+ + + + + + +
+
+ +

+ + Git untrack submodule from git status + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

submodule folders cannot be added into .gitignore file to untrack them from git status, we will use ignore=dirty to ignore it +

+
+
+ + +
+ Back to top ↑ +
+ + + + + + + + + + + + + +
+

format

+
+ + + + + +
+
+ +

+ + Python Lint And Format + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 11 minute read + + + +

+ + +

Azure SDK Python Guidelines +

+
+
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Use python tabulate module to create tables + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

+ If you want to create some tables from a python list, you can use the tabulate module, it can generate the table easily in text mode and in many formats, ...

+
+
+ + +
+ Back to top ↑ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+

packaging

+
+ + + + + +
+ +
+ + + + + + +
+
+ +

+ + Troubleshooting Python Twine Cannot Upload Package On Windows + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

Python has several tools to upload packages to PyPi or some private Artifactory locations. The mostly used one should be twine. Although twine is not a Pytho...

+
+
+ + + + + + +
+
+ +

+ + Setting Up Powershell gallery And Nuget gallery + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

As like pypi for Python, npm for Node.js, we also have Powershell Gallery and Nuget Gallery for Powershell. +

+
+
+ + +
+ Back to top ↑ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+

sqlalchemy

+
+ + + + + +
+ +
+ + + + + + +
+
+ +

+ + SQLAlchemy mixin in method + + +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +

Share common methods across SQLAlchemy db model classes by using mixin. +

+
+
+ + + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+

pip

+
+ + + + + +
+
+ +

+ + Github Actions - Python + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Trying Python pipreqs and pip-tools + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

Some simple demos to test pipreqs and pip-tools +

+
+
+ + +
+ Back to top ↑ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+

web

+
+ + + + + +
+ +
+ + + + + + +
+
+ +

+ + Setting up Github Pages With custom domain over HTTPS + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

With Github pages, we can create our blogs in our own domain over HTTPS completely free. Of course you should pay for your domain name at the Registrar. +

+
+
+ + +
+ Back to top ↑ +
+ + + +
+

nuget

+
+ + + + + +
+ +
+ + + + + + +
+
+ +

+ + Setting Up Powershell gallery And Nuget gallery + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

As like pypi for Python, npm for Node.js, we also have Powershell Gallery and Nuget Gallery for Powershell. +

+
+
+ + +
+ Back to top ↑ +
+ + + + + + + + + +
+

proxy

+
+ + + + + +
+
+ +

+ + Setting Pwsh Invoke-WebRequest Proxy + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

One-line command to set Powershell Core web cmdlets proxy. +

+
+
+ + + + + + +
+
+ +

+ + Setting Up Powershell gallery And Nuget gallery + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

As like pypi for Python, npm for Node.js, we also have Powershell Gallery and Nuget Gallery for Powershell. +

+
+
+ + +
+ Back to top ↑ +
+ + + + + +
+

ssh

+
+ + + + + +
+ +
+ + + + + + +
+
+ +

+ + Powershell stop-parsing (--%) + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

Use Powershell stop-parsing (--%) to treat the remaining characters in the line as a literal. +

+
+
+ + +
+ Back to top ↑ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+

markdown

+
+ + + + + +
+ +
+ + + + + + +
+
+ +

+ + Use python tabulate module to create tables + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

+ If you want to create some tables from a python list, you can use the tabulate module, it can generate the table easily in text mode and in many formats, ...

+
+
+ + +
+ Back to top ↑ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+

requests

+
+ + + + + +
+
+ +

+ + Python Requests With Retry + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

Make python requests retry easily to use +

+
+
+ + + + + + +
+
+ +

+ + Creating Custom Python Request Auth Class + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

Creating custom python request auth class with requests.auth.AuthBase. +

+
+
+ + +
+ Back to top ↑ +
+ + + + + + + + + + + + + + + + + +
+

wsl

+
+ + + + + +
+
+ +

+ + My Powerline setup and configuration + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Just my way to setup and configure powerline in WSL +

+
+
+ + + + + + +
+
+ +

+ + Setting up WSL + + +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +

Setting up WSL (Windows Subsystem for Linux) +

+
+
+ + +
+ Back to top ↑ +
+ + + +
+

linux

+
+ + + + + +
+
+ +

+ + My Powerline setup and configuration + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Just my way to setup and configure powerline in WSL +

+
+
+ + + + + + +
+
+ +

+ + Setting up WSL + + +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +

Setting up WSL (Windows Subsystem for Linux) +

+
+
+ + +
+ Back to top ↑ +
+ + + + + + + +
+

vscode

+
+ + + + + +
+
+ +

+ + Using Databricks Connect inside a container + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 5 minute read + + + +

+ + +

Using Databricks Connect inside a container with VSCode remote containers with spark, jre, python, databricks-connect pre-installed. +

+
+
+ + + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + + + + + +
+

shell

+
+ + + + + +
+
+ +

+ + Hashing files + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+
+ +

+ + My Powerline setup and configuration + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Just my way to setup and configure powerline in WSL +

+
+
+ + +
+ Back to top ↑ +
+ + + + + +
+

unittest

+
+ + + + + +
+
+ +

+ + Python Asyncio Unittest + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Unittest based on Pytest framework not on embedded unittest. +

+
+
+ + + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + +
+

pytest

+
+ + + + + +
+
+ +

+ + Python Asyncio Unittest + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Unittest based on Pytest framework not on embedded unittest. +

+
+
+ + + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + + + + + + + + + + + +
+

spark

+
+ + + + + +
+
+ +

+ + Databricks cluster access mode + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 4 minute read + + + +

+ + +

What is cluster access mode +

+
+
+ + + + + + +
+
+ +

+ + Using Databricks Connect inside a container + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 5 minute read + + + +

+ + +

Using Databricks Connect inside a container with VSCode remote containers with spark, jre, python, databricks-connect pre-installed. +

+
+
+ + +
+ Back to top ↑ +
+ + + + + + + + + + + + + + + + + +
+

cache

+
+ + + + + +
+
+ +

+ + Hashing files + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+
+ +

+ + Github Actions - Cache + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

+

+
+
+ + +
+ Back to top ↑ +
+ + + +
+

auth

+
+ + + + + +
+
+ +

+ + Github Actions - Python + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + + + + + +
+

github

+
+ + + + + +
+
+ +

+ + Setting up Github Pages With custom domain over HTTPS + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

With Github pages, we can create our blogs in our own domain over HTTPS completely free. Of course you should pay for your domain name at the Registrar. +

+
+
+ + +
+ Back to top ↑ +
+ + + +
+

github-pages

+
+ + + + + +
+
+ +

+ + Setting up Github Pages With custom domain over HTTPS + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

With Github pages, we can create our blogs in our own domain over HTTPS completely free. Of course you should pay for your domain name at the Registrar. +

+
+
+ + +
+ Back to top ↑ +
+ + + + + + + + + + + + + + + + + +
+

parsing

+
+ + + + + +
+
+ +

+ + Powershell stop-parsing (--%) + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

Use Powershell stop-parsing (--%) to treat the remaining characters in the line as a literal. +

+
+
+ + +
+ Back to top ↑ +
+ + + + + +
+

jekyll

+
+ + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + +
+

windows

+
+ + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + +
+

ruby

+
+ + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + + + +
+

repl

+
+ + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + +
+

readline

+
+ + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + +
+

string

+
+ + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + +
+

regex

+
+ + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + +
+

json

+
+ + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + +
+

csv

+
+ + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + +
+

module

+
+ + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + + + +
+

submodule

+
+ + + + + +
+
+ +

+ + Git untrack submodule from git status + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

submodule folders cannot be added into .gitignore file to untrack them from git status, we will use ignore=dirty to ignore it +

+
+
+ + +
+ Back to top ↑ +
+ + + +
+

package

+
+ + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + +
+

pyvmomi

+
+ + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + +
+

vmware

+
+ + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + + + + + +
+

scheduled-task

+
+ + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + + + + + + + + + +
+

certificate

+
+ + + + + +
+
+ +

+ + Setup HTTPS for Gitlab + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 6 minute read + + + +

+ + +

Setup a SAN SSL certificate to use the HTTPS on Gitlab-CE in docker on Ubuntu server. +

+
+
+ + +
+ Back to top ↑ +
+ + + +
+

openssl

+
+ + + + + +
+
+ +

+ + Setup HTTPS for Gitlab + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 6 minute read + + + +

+ + +

Setup a SAN SSL certificate to use the HTTPS on Gitlab-CE in docker on Ubuntu server. +

+
+
+ + +
+ Back to top ↑ +
+ + + +
+

backup

+
+ + + + + +
+
+ +

+ + Backup and restore Gitlab in docker + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 8 minute read + + + +

+ + +

Step by step procedure to backup and restore Gitlab in docker. +

+
+
+ + +
+ Back to top ↑ +
+ + + +
+

update

+
+ + + + + +
+
+ +

+ + Update Gitlab in docker + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Step by step procedure to update Gitlab in docker. +

+
+
+ + +
+ Back to top ↑ +
+ + + +
+

migration

+
+ + + + + +
+
+ +

+ + Migrate Gitlab in docker + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 4 minute read + + + +

+ + +

+ This post will walk you through the steps to migrate Gitlab from one docker container to another. The steps need you to know how to install a new Gitlab c...

+
+
+ + +
+ Back to top ↑ +
+ + + +
+

service

+
+ + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + +
+

redis

+
+ + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + + + + + +
+

multithreading

+
+ + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + +
+

pandas

+
+ + + + + +
+
+ +

+ + Filtering In Pandas Dataframe + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 5 minute read + + + +

+ + +

Filtering a pandas dataframe with series, query, or numpy methods. +

+
+
+ + +
+ Back to top ↑ +
+ + + +
+

filtering

+
+ + + + + +
+
+ +

+ + Filtering In Pandas Dataframe + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 5 minute read + + + +

+ + +

Filtering a pandas dataframe with series, query, or numpy methods. +

+
+
+ + +
+ Back to top ↑ +
+ + + +
+

network

+
+ + + + + +
+
+ +

+ + A fast way to check TCP port in Powershell + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

Test-NetConnection is too slow if the remote port is not opened due to its timeout setting. Use System.Net.Sockets.TcpClient instead. +

+
+
+ + +
+ Back to top ↑ +
+ + + +
+

elastic

+
+ + + + + +
+
+ +

+ + Elastic Painless Scripted Field On Null/Missing Value + + +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +

How to use painless scripted field to working on objects which might be null or missing in some documents. +

+
+
+ + +
+ Back to top ↑ +
+ + + +
+

scoop

+
+ + + + + +
+
+ +

+ + Using Scoop On Windows + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Some tips to use Scoop. +

+
+
+ + +
+ Back to top ↑ +
+ + + + + + + +
+

itertools

+
+ + + + + +
+
+ +

+ + Flattening nested dict in Python + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

Flattening a nested dict/json with list as some keys’ value. +

+
+
+ + +
+ Back to top ↑ +
+ + + +
+

ipython

+
+ + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + + + +
+

contextlib

+
+ + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + +
+

flask

+
+ + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + + + + + + + + + +
+

datetime

+
+ + + + + +
+
+ +

+ + Python datetime utcnow + + +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +

+

+
+
+ + +
+ Back to top ↑ +
+ + + + + + + + + +
+

container

+
+ + + + + +
+
+ +

+ + Using Databricks Connect inside a container + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 5 minute read + + + +

+ + +

Using Databricks Connect inside a container with VSCode remote containers with spark, jre, python, databricks-connect pre-installed. +

+
+
+ + +
+ Back to top ↑ +
+ + + + + +
+

storage

+
+ + + + + +
+
+ +

+ + Azure pipeline delete blobs from blob storage + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

The example given by this post is for Azure Pipeline with the latest Ubuntu agent, for AzCli from local machine, removing the --auth-mode login part should w...

+
+
+ + +
+ Back to top ↑ +
+ + + +
+

codec

+
+ + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + +
+

ast

+
+ + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + +
+

api

+
+ + + + + +
+
+ +

+ + Calling Azure REST API + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

+

+
+
+ + +
+ Back to top ↑ +
+ + + +
+

rest

+
+ + + + + +
+
+ +

+ + Calling Azure REST API + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

+

+
+
+ + +
+ Back to top ↑ +
+ + + + + +
+

sonar

+
+ + + + + +
+
+ +

+ + Sonarcloud Github Action + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

+

+
+
+ + +
+ Back to top ↑ +
+ + + + + + + +
+

vault

+
+ + + + + +
+ +
+ + +
+ Back to top ↑ +
+ + + + +
+
+
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/year-archive/index.html b/year-archive/index.html new file mode 100644 index 00000000..4c6c26ad --- /dev/null +++ b/year-archive/index.html @@ -0,0 +1,4394 @@ + + + + + + +Posts by Year - A code to remember + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+
+ +
+
+
+ + +
+ + + + +
+ + + + + +
+ +

Posts by Year

+ + + + + + + + + +
+

2023

+
+ + + + + +
+
+ +

+ + Hashing files + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Github Actions - Python + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Github Actions - Workflows + + +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Github Actions - Variables + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 6 minute read + + + +

+ + +

Variables upon Git events +

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Github Actions - Custom Actions + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

Actions checkout location in workflow +

+
+
+ + + + + + +
+
+ +

+ + Github Actions - Cache + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+
+ +

+ + Python Asyncio + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 6 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Python Asyncio Unittest + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Unittest based on Pytest framework not on embedded unittest. +

+
+
+ + + + + + +
+
+ +

+ + Sonarcloud Github Action + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+
+ +

+ + Calling Azure REST API + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+ +
+ + +
+ Back to top ↑ +
+ +
+

2022

+
+ + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Azure pipeline delete blobs from blob storage + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

The example given by this post is for Azure Pipeline with the latest Ubuntu agent, for AzCli from local machine, removing the --auth-mode login part should w...

+
+
+ + + + + + +
+
+ +

+ + Databricks cluster access mode + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 4 minute read + + + +

+ + +

What is cluster access mode +

+
+
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Azure pipeline jobs + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+
+ +

+ + Databricks job/task context + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

Giving an example of Databricks job/task json context values +

+
+
+ + + + + + +
+
+ +

+ + Azure pipeline conditions + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Azure pipeline has two kinds of conditions: +

+
+
+ + + + + + +
+
+ +

+ + Using Databricks Connect inside a container + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 5 minute read + + + +

+ + +

Using Databricks Connect inside a container with VSCode remote containers with spark, jre, python, databricks-connect pre-installed. +

+
+
+ + + + + + +
+
+ +

+ + Azure Pipeline Checkout Multiple Repositories + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 13 minute read + + + +

+ + +

This post will talk about some Azure pipeline predefined variables’ values in a multiple repositories checkout situation. The official doc is here. +

+
+
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Azure pipeline predefined variables + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

The official doc gives an explanation of all the predefined variables, but it lacks of some concret examples. Hereunder some examples for my preferred variab...

+
+
+ + +
+ Back to top ↑ +
+ +
+

2021

+
+ + + + + +
+
+ +

+ + Python Asyncio Study notes + + +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +

concurrent.futures +

+
+
+ + + + + + +
+
+ +

+ + Python datetime utcnow + + +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +

+

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Trying Python pipreqs and pip-tools + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

Some simple demos to test pipreqs and pip-tools +

+
+
+ + + + + + +
+
+ +

+ + Python Requests With Retry + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

Make python requests retry easily to use +

+
+
+ + + + + + +
+
+ +

+ + Python Lint And Format + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 11 minute read + + + +

+ + +

Azure SDK Python Guidelines +

+
+
+ + +
+ Back to top ↑ +
+ +
+

2020

+
+ + + + + +
+
+ +

+ + My Powerline setup and configuration + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Just my way to setup and configure powerline in WSL +

+
+
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Flattening nested dict in Python + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

Flattening a nested dict/json with list as some keys’ value. +

+
+
+ + + + + + +
+
+ +

+ + Setting up WSL + + +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +

Setting up WSL (Windows Subsystem for Linux) +

+
+
+ + +
+ Back to top ↑ +
+ +
+

2019

+
+ + + + + +
+
+ +

+ + Using Scoop On Windows + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Some tips to use Scoop. +

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Elastic Painless Scripted Field On Null/Missing Value + + +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +

How to use painless scripted field to working on objects which might be null or missing in some documents. +

+
+
+ + + + + + +
+
+ +

+ + Install Python3 on Ubuntu + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Install Python3 on Ubuntu by using official source. +

+
+
+ + + + + + +
+
+ +

+ + SQLAlchemy mixin in method + + +

+ + +

+ + + + + + + + + + + + + + + + + + + less than 1 minute read + + + +

+ + +

Share common methods across SQLAlchemy db model classes by using mixin. +

+
+
+ + + + + + +
+
+ +

+ + A fast way to check TCP port in Powershell + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

Test-NetConnection is too slow if the remote port is not opened due to its timeout setting. Use System.Net.Sockets.TcpClient instead. +

+
+
+ + + + + + +
+
+ +

+ + Troubleshooting Python Twine Cannot Upload Package On Windows + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

Python has several tools to upload packages to PyPi or some private Artifactory locations. The mostly used one should be twine. Although twine is not a Pytho...

+
+
+ + + + + + +
+
+ +

+ + Filtering In Pandas Dataframe + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 5 minute read + + + +

+ + +

Filtering a pandas dataframe with series, query, or numpy methods. +

+
+
+ + + + + + +
+
+ +

+ + Git Cheat Sheet + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 5 minute read + + + +

+ + +

+ This is not a complete Git cheat sheet for everyone, this is just a personal cheat sheet for some often forgotten git commands. + + +

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Creating Custom Python Request Auth Class + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

Creating custom python request auth class with requests.auth.AuthBase. +

+
+
+ + +
+ Back to top ↑ +
+ +
+

2018

+
+ + + + + +
+ +
+ + + + + + +
+
+ +

+ + Setting Pwsh Invoke-WebRequest Proxy + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

One-line command to set Powershell Core web cmdlets proxy. +

+
+
+ + + + + + +
+
+ +

+ + Using Gitlab integrated CICD for Python project on Windows + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 4 minute read + + + +

+ + +

Gitlab ships with its own free CICD which works pretty well. This post will give a .gitlab-ci.yml demo for a Python project running on Gitlab Windows runner. +

+
+
+ + + + + + +
+
+ +

+ + Migrate Gitlab in docker + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 4 minute read + + + +

+ + +

+ This post will walk you through the steps to migrate Gitlab from one docker container to another. The steps need you to know how to install a new Gitlab c...

+
+
+ + + + + + +
+
+ +

+ + Update Gitlab in docker + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 2 minute read + + + +

+ + +

Step by step procedure to update Gitlab in docker. +

+
+
+ + + + + + +
+
+ +

+ + Terminate Powershell script or session + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 4 minute read + + + +

+ + +

+ I always asked myself how to terminate a Powershell script or session, each time I needed to do some tests by myself and also searched on Google. But I co...

+
+
+ + + + + + +
+
+ +

+ + Backup and restore Gitlab in docker + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 8 minute read + + + +

+ + +

Step by step procedure to backup and restore Gitlab in docker. +

+
+
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Setup HTTPS for Gitlab + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 6 minute read + + + +

+ + +

Setup a SAN SSL certificate to use the HTTPS on Gitlab-CE in docker on Ubuntu server. +

+
+
+ + + + + + +
+
+ +

+ + Install Gitlab-CE in Docker on Ubuntu + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

Step by step installation of Gitlab-CE in docker on Ubuntu server. +

+
+
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Use python tabulate module to create tables + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

+ If you want to create some tables from a python list, you can use the tabulate module, it can generate the table easily in text mode and in many formats, ...

+
+
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Git untrack submodule from git status + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 1 minute read + + + +

+ + +

submodule folders cannot be added into .gitignore file to untrack them from git status, we will use ignore=dirty to ignore it +

+
+
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+ +
+ + + + + + +
+
+ +

+ + Powershell stop-parsing (--%) + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

Use Powershell stop-parsing (--%) to treat the remaining characters in the line as a literal. +

+
+
+ + + + + + +
+
+ +

+ + Setting Up Powershell gallery And Nuget gallery + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

As like pypi for Python, npm for Node.js, we also have Powershell Gallery and Nuget Gallery for Powershell. +

+
+
+ + + + + + +
+
+ +

+ + Setting up Github Pages With custom domain over HTTPS + + +

+ + +

+ + + + + + + + + + + + + + + + + + + 3 minute read + + + +

+ + +

With Github pages, we can create our blogs in our own domain over HTTPS completely free. Of course you should pay for your domain name at the Registrar. +

+
+
+ + +
+ Back to top ↑ +
+ + +
+
+
+ + +
+
+ +
+
+ +
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + +