Category: Desktop Development

  • chocolatey v1.3.1 installed. Version 2.2.2 is available

    PS C:\Users\user> choco upgrade chocolatey
    Chocolatey v1.3.1
    2 validations performed. 1 success(es), 1 warning(s), and 0 error(s).

    Validation Warnings:

    • A pending system reboot request has been detected, however, this is
      being ignored due to the current Chocolatey configuration. If you
      want to halt when this occurs, then either set the global feature
      using:
      choco feature enable -name=exitOnRebootDetected
      or pass the option --exit-when-reboot-detected.

    Upgrading the following packages:
    chocolatey
    By upgrading, you accept licenses for the packages.

    You have chocolatey v1.3.1 installed. Version 2.2.2 is available based on your source(s).
    Progress: Downloading chocolatey 2.2.2… 100%

    chocolatey v2.2.2
    chocolatey package files upgrade completed. Performing other installation steps.
    The package chocolatey wants to run 'chocolateyInstall.ps1'.
    Note: If you don't run this script, the installation will fail.
    Note: To confirm automatically next time, use '-y' or consider:
    choco feature enable -n allowGlobalConfirmation
    Do you want to run the script?([Y]es/[A]ll - yes to all/[N]o/[P]rint): y

    Creating ChocolateyInstall as an environment variable (targeting 'Machine')
    Setting ChocolateyInstall to 'C:\ProgramData\chocolatey'
    WARNING: It's very likely you will need to close and reopen your shell
    before you can use choco.
    Restricting write permissions to Administrators
    We are setting up the Chocolatey package repository.
    The packages themselves go to 'C:\ProgramData\chocolatey\lib'
    (i.e. C:\ProgramData\chocolatey\lib\yourPackageName).
    A shim file for the command line goes to 'C:\ProgramData\chocolatey\bin'
    and points to an executable in 'C:\ProgramData\chocolatey\lib\yourPackageName'.

    Creating Chocolatey folders if they do not already exist.

    Removing shim C:\ProgramData\chocolatey\redirects\chocolatey.exe
    Removing shim C:\ProgramData\chocolatey\redirects\cinst.exe
    Removing shim C:\ProgramData\chocolatey\redirects\clist.exe
    Removing shim C:\ProgramData\chocolatey\redirects\cpush.exe
    Removing shim C:\ProgramData\chocolatey\redirects\cuninst.exe
    Removing shim C:\ProgramData\chocolatey\redirects\cup.exe
    Adding Chocolatey to the profile. This will provide tab completion, refreshenv, etc.
    WARNING: Chocolatey profile installed. Reload your profile - type . $profile
    Chocolatey (choco.exe) is now ready.
    You can call choco from anywhere, command line or powershell by typing choco.
    Run choco /? for a list of functions.
    You may need to shut down and restart powershell and/or consoles
    first prior to using choco.
    Removing shim C:\ProgramData\chocolatey\bin\chocolatey.exe
    Removing shim C:\ProgramData\chocolatey\bin\cinst.exe
    Removing shim C:\ProgramData\chocolatey\bin\clist.exe
    Removing shim C:\ProgramData\chocolatey\bin\cpush.exe
    Removing shim C:\ProgramData\chocolatey\bin\cuninst.exe
    Removing shim C:\ProgramData\chocolatey\bin\cup.exe
    Environment Vars (like PATH) have changed. Close/reopen your shell to
    see the changes (or in powershell/cmd.exe just type refreshenv).
    The upgrade of chocolatey was successful.
    Software install location not explicitly set, it could be in package or
    default install location of installer.

    Chocolatey upgraded 1/1 packages.
    See the log for details (C:\ProgramData\chocolatey\logs\chocolatey.log).
    PS C:\Users\user>

    PS C:\Users\user> choco
    Chocolatey v2.2.2
    Please run 'choco -?' or 'choco <command> -?' for help menu.

  • CUDA Toolkit

    C:\Users\user>nvidia-smi
    Sat May 13 14:05:06 2023
    +---------------------------------------------------------------------------------------+
    | NVIDIA-SMI 531.79                 Driver Version: 531.79     CUDA Version: 12.1       |
    |-----------------------------------------+----------------------+----------------------+
    | GPU  Name                      TCC/WDDM | Bus-Id        Disp.A | Volatile Uncorr. ECC |
    | Fan  Temp  Perf           Pwr:Usage/Cap |          Memory-Usage | GPU-Util  Compute M. |
    |                                         |                      |               MIG M. |
    |=========================================+======================+======================|
    |   0  NVIDIA GeForce RTX 4090       WDDM | 00000000:01:00.0  On |                  Off |
    | 30%   42C    P8              23W / 450W |   5658MiB / 24564MiB |      0%      Default |
    |                                         |                      |                  N/A |
    +-----------------------------------------+----------------------+----------------------+

    +---------------------------------------------------------------------------------------+
    | Processes:                                                                            |
    |  GPU   GI   CI        PID   Type   Process name                            GPU Memory |
    |        ID   ID                                                             Usage      |
    |=======================================================================================|
    | 0 N/A N/A 6712 C+G C:\Windows\explorer.exe N/A |
    | 0 N/A N/A 7084 C+G ….0_x64__8wekyb3d8bbwe\HxOutlook.exe N/A |
    | 0 N/A N/A 7664 C+G …rPicker\PowerToys.ColorPickerUI.exe N/A |
    | 0 N/A N/A 8736 C+G …nt.CBS_cw5n1h2txyewy\SearchHost.exe N/A |
    | 0 N/A N/A 8816 C+G …2txyewy\StartMenuExperienceHost.exe N/A |
    | 0 N/A N/A 10300 C+G …CBS_cw5n1h2txyewy\TextInputHost.exe N/A |
    | 0 N/A N/A 10596 C+G …t.LockApp_cw5n1h2txyewy\LockApp.exe N/A |
    | 0 N/A N/A 11724 C+G …oft\Edge Dev\Application\msedge.exe N/A |
    | 0 N/A N/A 12216 C+G …ekyb3d8bbwe\PhoneExperienceHost.exe N/A |
    | 0 N/A N/A 12280 C+G …FancyZones\PowerToys.FancyZones.exe N/A |
    | 0 N/A N/A 12364 C+G …auncher\PowerToys.PowerLauncher.exe N/A |
    | 0 N/A N/A 13556 C+G …GeForce Experience\NVIDIA Share.exe N/A |
    | 0 N/A N/A 14748 C+G …crosoft\Edge\Application\msedge.exe N/A |
    | 0 N/A N/A 14844 C+G …64__8wekyb3d8bbwe\CalculatorApp.exe N/A |
    | 0 N/A N/A 15956 C+G …les\Microsoft OneDrive\OneDrive.exe N/A |
    | 0 N/A N/A 16156 C+G …les\Microsoft OneDrive\OneDrive.exe N/A |
    | 0 N/A N/A 16776 C+G …B\system_tray\lghub_system_tray.exe N/A |
    | 0 N/A N/A 16940 C+G C:\Program Files\LGHUB\lghub.exe N/A |
    | 0 N/A N/A 17992 C+G …x64__q4d96b2w5wcc2\app\Evernote.exe N/A |
    | 0 N/A N/A 18928 C+G …61.0_x64__8wekyb3d8bbwe\GameBar.exe N/A |
    | 0 N/A N/A 19088 C+G …__8wekyb3d8bbwe\WindowsTerminal.exe N/A |
    | 0 N/A N/A 20000 C+G …on\113.0.1774.35\msedgewebview2.exe N/A |
    | 0 N/A N/A 20032 C+G …5n1h2txyewy\ShellExperienceHost.exe N/A |
    | 0 N/A N/A 22452 C+G …siveControlPanel\SystemSettings.exe N/A |
    | 0 N/A N/A 23736 C …rograms\Python\Python310\python.exe N/A |
    +---------------------------------------------------------------------------------------+

    C:\Users\user>
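The `nvidia-smi` header row above packs the driver version and the CUDA runtime version into a single line. As a minimal sketch (the `parse_smi_header` function name, the regex, and the sample line are my own, approximated from the output above), the two versions can be pulled out programmatically:

```python
import re

# Header line as printed by nvidia-smi above (column spacing approximated).
header = "| NVIDIA-SMI 531.79    Driver Version: 531.79    CUDA Version: 12.1 |"

def parse_smi_header(line: str) -> dict:
    """Extract driver and CUDA versions from an nvidia-smi header row."""
    m = re.search(r"Driver Version:\s*([\d.]+)\s+CUDA Version:\s*([\d.]+)", line)
    if m is None:
        raise ValueError("not an nvidia-smi header line")
    return {"driver": m.group(1), "cuda": m.group(2)}

print(parse_smi_header(header))  # {'driver': '531.79', 'cuda': '12.1'}
```

Note that the header shows the CUDA version the driver supports, not necessarily the toolkit version installed.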

  • Launching the web UI

    Creating venv in directory C:\Users\user\Downloads\stable-diffusion-webui\venv using python "C:\Users\user\AppData\Local\Programs\Python\Python310\python.exe"
    venv "C:\Users\user\Downloads\stable-diffusion-webui\venv\Scripts\Python.exe"
    Python 3.10.11 (tags/v3.10.11:7d4cc5a, Apr 5 2023, 00:38:17) [MSC v.1929 64 bit (AMD64)]
    Commit hash: 5ab7f213bec2f816f9c5644becb32eb72c8ffb89
    Installing torch and torchvision
    Looking in indexes: https://pypi.org/simple, https://download.pytorch.org/whl/cu118
    Collecting torch==2.0.0
    Downloading https://download.pytorch.org/whl/cu118/torch-2.0.0%2Bcu118-cp310-cp310-win_amd64.whl (2611.3 MB)
    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.6/2.6 GB 2.8 MB/s eta 0:00:00
    Collecting torchvision==0.15.1
    Downloading https://download.pytorch.org/whl/cu118/torchvision-0.15.1%2Bcu118-cp310-cp310-win_amd64.whl (4.9 MB)
    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 4.9/4.9 MB 105.0 MB/s eta 0:00:00
    Collecting sympy
    Downloading https://download.pytorch.org/whl/sympy-1.11.1-py3-none-any.whl (6.5 MB)
    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.5/6.5 MB 102.6 MB/s eta 0:00:00
    Collecting jinja2
    Downloading https://download.pytorch.org/whl/Jinja2-3.1.2-py3-none-any.whl (133 kB)
    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 133.1/133.1 kB ? eta 0:00:00
    Collecting filelock
    Downloading filelock-3.12.0-py3-none-any.whl (10 kB)
    Collecting networkx
    Downloading networkx-3.1-py3-none-any.whl (2.1 MB)
    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.1/2.1 MB 64.4 MB/s eta 0:00:00
    Collecting typing-extensions
    Using cached typing_extensions-4.5.0-py3-none-any.whl (27 kB)
    Collecting requests
    Using cached requests-2.29.0-py3-none-any.whl (62 kB)
    Collecting numpy
    Using cached numpy-1.24.3-cp310-cp310-win_amd64.whl (14.8 MB)
    Collecting pillow!=8.3.*,>=5.3.0
    Using cached Pillow-9.5.0-cp310-cp310-win_amd64.whl (2.5 MB)
    Collecting MarkupSafe>=2.0
    Downloading https://download.pytorch.org/whl/MarkupSafe-2.1.2-cp310-cp310-win_amd64.whl (16 kB)
    Collecting certifi>=2017.4.17
    Using cached https://download.pytorch.org/whl/certifi-2022.12.7-py3-none-any.whl (155 kB)
    Collecting urllib3<1.27,>=1.21.1
    Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
    Collecting charset-normalizer<4,>=2
    Using cached charset_normalizer-3.1.0-cp310-cp310-win_amd64.whl (97 kB)
    Collecting idna<4,>=2.5
    Using cached https://download.pytorch.org/whl/idna-3.4-py3-none-any.whl (61 kB)
    Collecting mpmath>=0.19
    Downloading mpmath-1.3.0-py3-none-any.whl (536 kB)
    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 536.2/536.2 kB ? eta 0:00:00
    Installing collected packages: mpmath, urllib3, typing-extensions, sympy, pillow, numpy, networkx, MarkupSafe, idna, filelock, charset-normalizer, certifi, requests, jinja2, torch, torchvision
    Successfully installed MarkupSafe-2.1.2 certifi-2022.12.7 charset-normalizer-3.1.0 filelock-3.12.0 idna-3.4 jinja2-3.1.2 mpmath-1.3.0 networkx-3.1 numpy-1.24.3 pillow-9.5.0 requests-2.29.0 sympy-1.11.1 torch-2.0.0+cu118 torchvision-0.15.1+cu118 typing-extensions-4.5.0 urllib3-1.26.15

    [notice] A new release of pip is available: 23.0.1 -> 23.1.2
    [notice] To update, run: C:\Users\user\Downloads\stable-diffusion-webui\venv\Scripts\python.exe -m pip install --upgrade pip
    Installing gfpgan
    Installing clip
    Installing open_clip
    Cloning Stable Diffusion into C:\Users\user\Downloads\stable-diffusion-webui\repositories\stable-diffusion-stability-ai…
    Cloning Taming Transformers into C:\Users\user\Downloads\stable-diffusion-webui\repositories\taming-transformers…
    Cloning K-diffusion into C:\Users\user\Downloads\stable-diffusion-webui\repositories\k-diffusion…
    Cloning CodeFormer into C:\Users\user\Downloads\stable-diffusion-webui\repositories\CodeFormer…
    Cloning BLIP into C:\Users\user\Downloads\stable-diffusion-webui\repositories\BLIP…
    Installing requirements for CodeFormer
    Installing requirements
    Launching Web UI with arguments:
    No module 'xformers'. Proceeding without it.
    Calculating sha256 for C:\Users\user\Downloads\stable-diffusion-webui\models\Stable-diffusion\trinart2_step115000.ckpt: 776af18775dfccf29725a994df855e7d8f7b8ea525013e3a466f210ec15c8fd4
    Loading weights [776af18775] from C:\Users\user\Downloads\stable-diffusion-webui\models\Stable-diffusion\trinart2_step115000.ckpt
    Creating model from config: C:\Users\user\Downloads\stable-diffusion-webui\configs\v1-inference.yaml
    LatentDiffusion: Running in eps-prediction mode
    DiffusionWrapper has 859.52 M params.
    Downloading (…)olve/main/vocab.json: 100%|██████████████████████████████████████████| 961k/961k [00:00<00:00, 16.3MB/s]
    Downloading (…)olve/main/merges.txt: 100%|██████████████████████████████████████████| 525k/525k [00:00<00:00, 2.03MB/s]
    Downloading (…)cial_tokens_map.json: 100%|████████████████████████████████████████████████████| 389/389 [00:00<?, ?B/s]
    Downloading (…)okenizer_config.json: 100%|████████████████████████████████████████████████████| 905/905 [00:00<?, ?B/s]
    Downloading (…)lve/main/config.json: 100%|████████████████████████████████████████████████| 4.52k/4.52k [00:00<?, ?B/s]
    Applying cross attention optimization (Doggettx).
    Textual inversion embeddings loaded(0):
    Model loaded in 7.0s (calculate hash: 1.4s, load weights from disk: 0.6s, create model: 2.6s, apply weights to model: 0.6s, apply half(): 0.5s, move model to device: 0.4s, load textual inversion embeddings: 0.7s).
    Running on local URL: http://127.0.0.1:7860

    To create a public link, set share=True in launch().
    Startup time: 14.3s (import torch: 1.7s, import gradio: 1.5s, import ldm: 0.6s, other imports: 2.2s, setup codeformer: 0.2s, load scripts: 0.6s, load SD checkpoint: 7.1s, create ui: 0.4s).
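The launcher log above first computes the checkpoint's full SHA-256 and then refers to it by a short tag ("Loading weights [776af18775]"), which is the first 10 hex digits of that digest. A minimal sketch of that tagging (the `short_hash` helper is my own; a real checkpoint file would be hashed in chunks rather than as one bytes object):

```python
import hashlib

def short_hash(data: bytes, n: int = 10) -> str:
    """Return the first n hex digits of the SHA-256 digest of data,
    the same style of tag seen in 'Loading weights [776af18775]'."""
    return hashlib.sha256(data).hexdigest()[:n]

tag = short_hash(b"abc")
print(tag)  # ba7816bf8f  (first 10 hex digits of SHA-256("abc"))
```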

  • Installing a previous torch version (1.13.1+cu117)

    PS C:\Users\user> pip install torch==1.13.1+cu117 torchvision==0.14.1+cu117 torchaudio==0.13.1 --extra-index-url https://download.pytorch.org/whl/cu117
    Looking in indexes: https://pypi.org/simple, https://download.pytorch.org/whl/cu117
    Collecting torch==1.13.1+cu117
    Downloading https://download.pytorch.org/whl/cu117/torch-1.13.1%2Bcu117-cp310-cp310-win_amd64.whl (2255.4 MB)
    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.3/2.3 GB 3.6 MB/s eta 0:00:00
    Collecting torchvision==0.14.1+cu117
    Downloading https://download.pytorch.org/whl/cu117/torchvision-0.14.1%2Bcu117-cp310-cp310-win_amd64.whl (4.8 MB)
    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 4.8/4.8 MB 155.8 MB/s eta 0:00:00
    Collecting torchaudio==0.13.1
    Downloading https://download.pytorch.org/whl/cu117/torchaudio-0.13.1%2Bcu117-cp310-cp310-win_amd64.whl (2.3 MB)
    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.3/2.3 MB 73.3 MB/s eta 0:00:00
    Collecting typing-extensions (from torch==1.13.1+cu117)
    Downloading typing_extensions-4.5.0-py3-none-any.whl (27 kB)
    Collecting numpy (from torchvision==0.14.1+cu117)
    Downloading numpy-1.24.3-cp310-cp310-win_amd64.whl (14.8 MB)
    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 14.8/14.8 MB 81.8 MB/s eta 0:00:00
    Collecting requests (from torchvision==0.14.1+cu117)
    Downloading requests-2.29.0-py3-none-any.whl (62 kB)
    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 62.5/62.5 kB ? eta 0:00:00
    Collecting pillow!=8.3.*,>=5.3.0 (from torchvision==0.14.1+cu117)
    Downloading Pillow-9.5.0-cp310-cp310-win_amd64.whl (2.5 MB)
    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.5/2.5 MB 166.7 MB/s eta 0:00:00
    Collecting charset-normalizer<4,>=2 (from requests->torchvision==0.14.1+cu117)
    Downloading charset_normalizer-3.1.0-cp310-cp310-win_amd64.whl (97 kB)
    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 97.1/97.1 kB ? eta 0:00:00
    Collecting idna<4,>=2.5 (from requests->torchvision==0.14.1+cu117)
    Downloading https://download.pytorch.org/whl/idna-3.4-py3-none-any.whl (61 kB)
    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 61.5/61.5 kB ? eta 0:00:00
    Collecting urllib3<1.27,>=1.21.1 (from requests->torchvision==0.14.1+cu117)
    Downloading urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 140.9/140.9 kB ? eta 0:00:00
    Collecting certifi>=2017.4.17 (from requests->torchvision==0.14.1+cu117)
    Downloading https://download.pytorch.org/whl/certifi-2022.12.7-py3-none-any.whl (155 kB)
    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 155.3/155.3 kB ? eta 0:00:00
    Installing collected packages: urllib3, typing-extensions, pillow, numpy, idna, charset-normalizer, certifi, torch, requests, torchvision, torchaudio
    Successfully installed certifi-2022.12.7 charset-normalizer-3.1.0 idna-3.4 numpy-1.24.3 pillow-9.5.0 requests-2.29.0 torch-1.13.1+cu117 torchaudio-0.13.1+cu117 torchvision-0.14.1+cu117 typing-extensions-4.5.0 urllib3-1.26.15
    PS C:\Users\user>

    PS C:\Users\user> pip list
    Package            Version
    ------------------ ------------
    certifi 2022.12.7
    charset-normalizer 3.1.0
    idna 3.4
    numpy 1.24.3
    Pillow 9.5.0
    pip 23.1.2
    requests 2.29.0
    setuptools 65.5.0
    torch 1.13.1+cu117
    torchaudio 0.13.1+cu117
    torchvision 0.14.1+cu117
    typing_extensions 4.5.0
    urllib3 1.26.15
    PS C:\Users\user>

    PS C:\Users\user> pip show torch
    Name: torch
    Version: 1.13.1+cu117
    Summary: Tensors and Dynamic neural networks in Python with strong GPU acceleration
    Home-page: https://pytorch.org/
    Author: PyTorch Team
    Author-email: packages@pytorch.org
    License: BSD-3
    Location: c:\users\user\appdata\local\programs\python\python310\lib\site-packages
    Requires: typing-extensions
    Required-by: torchaudio, torchvision
    PS C:\Users\user>

    PS C:\Users\user> py C:\Users\user/python_exercise/torch_gpu_info.py
    torch.__version__, 1.13.1+cu117
    torch.cuda.is_available(), True
    compute_89
    find gpu devices, 1
    cuda:0, NVIDIA GeForce RTX 4090
    end
    PS C:\Users\user>

    PS C:\Users\user\python_exercise> cat torch_gpu_info.py
    import torch

    print(f"torch.__version__, {torch.__version__}")
    print(f"torch.cuda.is_available(), {torch.cuda.is_available()}")
    print(f"compute_{''.join(map(str, torch.cuda.get_device_capability()))}")
    device_num: int = torch.cuda.device_count()
    print(f"find gpu devices, {device_num}")
    for idx in range(device_num):
        print(f"cuda:{idx}, {torch.cuda.get_device_name(idx)}")

    print("end")

    PS C:\Users\user\python_exercise>