Sep 12 18:55:06.940940 kernel: Linux version 6.12.47-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri Sep 12 15:34:39 -00 2025 Sep 12 18:55:06.940956 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=271a44cc8ea1639cfb6fdf777202a5f025fda0b3ce9b293cc4e0e7047aecb858 Sep 12 18:55:06.940963 kernel: BIOS-provided physical RAM map: Sep 12 18:55:06.940967 kernel: BIOS-e820: [mem 0x0000000000000000-0x00000000000997ff] usable Sep 12 18:55:06.940971 kernel: BIOS-e820: [mem 0x0000000000099800-0x000000000009ffff] reserved Sep 12 18:55:06.940975 kernel: BIOS-e820: [mem 0x00000000000e0000-0x00000000000fffff] reserved Sep 12 18:55:06.940981 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000003fffffff] usable Sep 12 18:55:06.940986 kernel: BIOS-e820: [mem 0x0000000040000000-0x00000000403fffff] reserved Sep 12 18:55:06.940991 kernel: BIOS-e820: [mem 0x0000000040400000-0x000000006dfbdfff] usable Sep 12 18:55:06.940995 kernel: BIOS-e820: [mem 0x000000006dfbe000-0x000000006dfbefff] ACPI NVS Sep 12 18:55:06.941000 kernel: BIOS-e820: [mem 0x000000006dfbf000-0x000000006dfbffff] reserved Sep 12 18:55:06.941004 kernel: BIOS-e820: [mem 0x000000006dfc0000-0x0000000077fc6fff] usable Sep 12 18:55:06.941009 kernel: BIOS-e820: [mem 0x0000000077fc7000-0x00000000790a9fff] reserved Sep 12 18:55:06.941013 kernel: BIOS-e820: [mem 0x00000000790aa000-0x0000000079232fff] usable Sep 12 18:55:06.941020 kernel: BIOS-e820: [mem 0x0000000079233000-0x0000000079664fff] ACPI NVS Sep 12 18:55:06.941025 kernel: BIOS-e820: [mem 0x0000000079665000-0x000000007befefff] reserved Sep 12 18:55:06.941030 kernel: BIOS-e820: [mem 0x000000007beff000-0x000000007befffff] usable Sep 12 18:55:06.941035 kernel: BIOS-e820: [mem 0x000000007bf00000-0x000000007f7fffff] reserved Sep 12 18:55:06.941040 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved Sep 12 18:55:06.941045 kernel: BIOS-e820: [mem 0x00000000fe000000-0x00000000fe010fff] reserved Sep 12 18:55:06.941049 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec00fff] reserved Sep 12 18:55:06.941054 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved Sep 12 18:55:06.941059 kernel: BIOS-e820: [mem 0x00000000ff000000-0x00000000ffffffff] reserved Sep 12 18:55:06.941065 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000087f7fffff] usable Sep 12 18:55:06.941070 kernel: NX (Execute Disable) protection: active Sep 12 18:55:06.941075 kernel: APIC: Static calls initialized Sep 12 18:55:06.941080 kernel: SMBIOS 3.2.1 present. 
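The BIOS-e820 entries above are the firmware's physical memory map: "usable" ranges are RAM handed to the kernel, while "reserved" and "ACPI NVS" ranges stay out of the allocator. As a rough illustration (the boot.log file name and the script are hypothetical, not part of this boot), the usable ranges in a saved copy of this console output can be totalled like so:

```python
import re

# Illustrative only: "boot.log" is a hypothetical file holding the console capture above.
# The regex matches the kernel's "BIOS-e820: [mem 0xSTART-0xEND] TYPE" lines.
E820 = re.compile(r"BIOS-e820: \[mem 0x([0-9a-f]+)-0x([0-9a-f]+)\] (\w+)")

usable = 0
with open("boot.log") as log:
    for line in log:
        m = E820.search(line)
        if m and m.group(3) == "usable":
            start, end = int(m.group(1), 16), int(m.group(2), 16)
            usable += end - start + 1  # e820 ranges are inclusive

print(f"usable RAM reported by firmware: {usable / 2**30:.1f} GiB")
```

For the map above this comes to roughly 31.9 GiB, which lines up with the "Memory: 32652132K/33411996K available" line printed later in the boot.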
Sep 12 18:55:06.941085 kernel: DMI: Supermicro PIO-519C-MR-PH004/X11SCH-F, BIOS 1.5 11/17/2020 Sep 12 18:55:06.941090 kernel: DMI: Memory slots populated: 2/4 Sep 12 18:55:06.941095 kernel: tsc: Detected 3400.000 MHz processor Sep 12 18:55:06.941100 kernel: tsc: Detected 3399.906 MHz TSC Sep 12 18:55:06.941104 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Sep 12 18:55:06.941110 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Sep 12 18:55:06.941115 kernel: last_pfn = 0x87f800 max_arch_pfn = 0x400000000 Sep 12 18:55:06.941122 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 23), built from 10 variable MTRRs Sep 12 18:55:06.941127 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Sep 12 18:55:06.941132 kernel: last_pfn = 0x7bf00 max_arch_pfn = 0x400000000 Sep 12 18:55:06.941137 kernel: Using GB pages for direct mapping Sep 12 18:55:06.941142 kernel: ACPI: Early table checksum verification disabled Sep 12 18:55:06.941147 kernel: ACPI: RSDP 0x00000000000F05B0 000024 (v02 SUPERM) Sep 12 18:55:06.941155 kernel: ACPI: XSDT 0x00000000795460C8 00010C (v01 SUPERM SUPERM 01072009 AMI 00010013) Sep 12 18:55:06.941161 kernel: ACPI: FACP 0x0000000079582620 000114 (v06 01072009 AMI 00010013) Sep 12 18:55:06.941166 kernel: ACPI: DSDT 0x0000000079546268 03C3B7 (v02 SUPERM SMCI--MB 01072009 INTL 20160527) Sep 12 18:55:06.941172 kernel: ACPI: FACS 0x0000000079664F80 000040 Sep 12 18:55:06.941177 kernel: ACPI: APIC 0x0000000079582738 00012C (v04 01072009 AMI 00010013) Sep 12 18:55:06.941182 kernel: ACPI: FPDT 0x0000000079582868 000044 (v01 01072009 AMI 00010013) Sep 12 18:55:06.941188 kernel: ACPI: FIDT 0x00000000795828B0 00009C (v01 SUPERM SMCI--MB 01072009 AMI 00010013) Sep 12 18:55:06.941193 kernel: ACPI: MCFG 0x0000000079582950 00003C (v01 SUPERM SMCI--MB 01072009 MSFT 00000097) Sep 12 18:55:06.941199 kernel: ACPI: SPMI 0x0000000079582990 000041 (v05 SUPERM SMCI--MB 00000000 AMI. 
00000000) Sep 12 18:55:06.941205 kernel: ACPI: SSDT 0x00000000795829D8 001B1C (v02 CpuRef CpuSsdt 00003000 INTL 20160527) Sep 12 18:55:06.941210 kernel: ACPI: SSDT 0x00000000795844F8 0031C6 (v02 SaSsdt SaSsdt 00003000 INTL 20160527) Sep 12 18:55:06.941215 kernel: ACPI: SSDT 0x00000000795876C0 00232B (v02 PegSsd PegSsdt 00001000 INTL 20160527) Sep 12 18:55:06.941221 kernel: ACPI: HPET 0x00000000795899F0 000038 (v01 SUPERM SMCI--MB 00000002 01000013) Sep 12 18:55:06.941226 kernel: ACPI: SSDT 0x0000000079589A28 000FAE (v02 SUPERM Ther_Rvp 00001000 INTL 20160527) Sep 12 18:55:06.941231 kernel: ACPI: SSDT 0x000000007958A9D8 0008F7 (v02 INTEL xh_mossb 00000000 INTL 20160527) Sep 12 18:55:06.941237 kernel: ACPI: UEFI 0x000000007958B2D0 000042 (v01 SUPERM SMCI--MB 00000002 01000013) Sep 12 18:55:06.941243 kernel: ACPI: LPIT 0x000000007958B318 000094 (v01 SUPERM SMCI--MB 00000002 01000013) Sep 12 18:55:06.941248 kernel: ACPI: SSDT 0x000000007958B3B0 0027DE (v02 SUPERM PtidDevc 00001000 INTL 20160527) Sep 12 18:55:06.941254 kernel: ACPI: SSDT 0x000000007958DB90 0014E2 (v02 SUPERM TbtTypeC 00000000 INTL 20160527) Sep 12 18:55:06.941259 kernel: ACPI: DBGP 0x000000007958F078 000034 (v01 SUPERM SMCI--MB 00000002 01000013) Sep 12 18:55:06.941264 kernel: ACPI: DBG2 0x000000007958F0B0 000054 (v00 SUPERM SMCI--MB 00000002 01000013) Sep 12 18:55:06.941270 kernel: ACPI: SSDT 0x000000007958F108 001B67 (v02 SUPERM UsbCTabl 00001000 INTL 20160527) Sep 12 18:55:06.941275 kernel: ACPI: DMAR 0x0000000079590C70 0000A8 (v01 INTEL EDK2 00000002 01000013) Sep 12 18:55:06.941280 kernel: ACPI: SSDT 0x0000000079590D18 000144 (v02 Intel ADebTabl 00001000 INTL 20160527) Sep 12 18:55:06.941286 kernel: ACPI: TPM2 0x0000000079590E60 000034 (v04 SUPERM SMCI--MB 00000001 AMI 00000000) Sep 12 18:55:06.941292 kernel: ACPI: SSDT 0x0000000079590E98 000D8F (v02 INTEL SpsNm 00000002 INTL 20160527) Sep 12 18:55:06.941298 kernel: ACPI: WSMT 0x0000000079591C28 000028 (v01 \xf5m 01072009 AMI 00010013) Sep 12 18:55:06.941303 kernel: ACPI: EINJ 0x0000000079591C50 000130 (v01 AMI AMI.EINJ 00000000 AMI. 00000000) Sep 12 18:55:06.941308 kernel: ACPI: ERST 0x0000000079591D80 000230 (v01 AMIER AMI.ERST 00000000 AMI. 00000000) Sep 12 18:55:06.941314 kernel: ACPI: BERT 0x0000000079591FB0 000030 (v01 AMI AMI.BERT 00000000 AMI. 00000000) Sep 12 18:55:06.941319 kernel: ACPI: HEST 0x0000000079591FE0 00027C (v01 AMI AMI.HEST 00000000 AMI. 
00000000) Sep 12 18:55:06.941324 kernel: ACPI: SSDT 0x0000000079592260 000162 (v01 SUPERM SMCCDN 00000000 INTL 20181221) Sep 12 18:55:06.941330 kernel: ACPI: Reserving FACP table memory at [mem 0x79582620-0x79582733] Sep 12 18:55:06.941336 kernel: ACPI: Reserving DSDT table memory at [mem 0x79546268-0x7958261e] Sep 12 18:55:06.941342 kernel: ACPI: Reserving FACS table memory at [mem 0x79664f80-0x79664fbf] Sep 12 18:55:06.941347 kernel: ACPI: Reserving APIC table memory at [mem 0x79582738-0x79582863] Sep 12 18:55:06.941352 kernel: ACPI: Reserving FPDT table memory at [mem 0x79582868-0x795828ab] Sep 12 18:55:06.941357 kernel: ACPI: Reserving FIDT table memory at [mem 0x795828b0-0x7958294b] Sep 12 18:55:06.941363 kernel: ACPI: Reserving MCFG table memory at [mem 0x79582950-0x7958298b] Sep 12 18:55:06.941368 kernel: ACPI: Reserving SPMI table memory at [mem 0x79582990-0x795829d0] Sep 12 18:55:06.941373 kernel: ACPI: Reserving SSDT table memory at [mem 0x795829d8-0x795844f3] Sep 12 18:55:06.941379 kernel: ACPI: Reserving SSDT table memory at [mem 0x795844f8-0x795876bd] Sep 12 18:55:06.941385 kernel: ACPI: Reserving SSDT table memory at [mem 0x795876c0-0x795899ea] Sep 12 18:55:06.941390 kernel: ACPI: Reserving HPET table memory at [mem 0x795899f0-0x79589a27] Sep 12 18:55:06.941395 kernel: ACPI: Reserving SSDT table memory at [mem 0x79589a28-0x7958a9d5] Sep 12 18:55:06.941401 kernel: ACPI: Reserving SSDT table memory at [mem 0x7958a9d8-0x7958b2ce] Sep 12 18:55:06.941406 kernel: ACPI: Reserving UEFI table memory at [mem 0x7958b2d0-0x7958b311] Sep 12 18:55:06.941411 kernel: ACPI: Reserving LPIT table memory at [mem 0x7958b318-0x7958b3ab] Sep 12 18:55:06.941417 kernel: ACPI: Reserving SSDT table memory at [mem 0x7958b3b0-0x7958db8d] Sep 12 18:55:06.941422 kernel: ACPI: Reserving SSDT table memory at [mem 0x7958db90-0x7958f071] Sep 12 18:55:06.941427 kernel: ACPI: Reserving DBGP table memory at [mem 0x7958f078-0x7958f0ab] Sep 12 18:55:06.941432 kernel: ACPI: Reserving DBG2 table memory at [mem 0x7958f0b0-0x7958f103] Sep 12 18:55:06.941439 kernel: ACPI: Reserving SSDT table memory at [mem 0x7958f108-0x79590c6e] Sep 12 18:55:06.941444 kernel: ACPI: Reserving DMAR table memory at [mem 0x79590c70-0x79590d17] Sep 12 18:55:06.941449 kernel: ACPI: Reserving SSDT table memory at [mem 0x79590d18-0x79590e5b] Sep 12 18:55:06.941454 kernel: ACPI: Reserving TPM2 table memory at [mem 0x79590e60-0x79590e93] Sep 12 18:55:06.941460 kernel: ACPI: Reserving SSDT table memory at [mem 0x79590e98-0x79591c26] Sep 12 18:55:06.941465 kernel: ACPI: Reserving WSMT table memory at [mem 0x79591c28-0x79591c4f] Sep 12 18:55:06.941470 kernel: ACPI: Reserving EINJ table memory at [mem 0x79591c50-0x79591d7f] Sep 12 18:55:06.941475 kernel: ACPI: Reserving ERST table memory at [mem 0x79591d80-0x79591faf] Sep 12 18:55:06.941481 kernel: ACPI: Reserving BERT table memory at [mem 0x79591fb0-0x79591fdf] Sep 12 18:55:06.941487 kernel: ACPI: Reserving HEST table memory at [mem 0x79591fe0-0x7959225b] Sep 12 18:55:06.941492 kernel: ACPI: Reserving SSDT table memory at [mem 0x79592260-0x795923c1] Sep 12 18:55:06.941498 kernel: No NUMA configuration found Sep 12 18:55:06.941503 kernel: Faking a node at [mem 0x0000000000000000-0x000000087f7fffff] Sep 12 18:55:06.941508 kernel: NODE_DATA(0) allocated [mem 0x87f7f8dc0-0x87f7fffff] Sep 12 18:55:06.941514 kernel: Zone ranges: Sep 12 18:55:06.941519 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Sep 12 18:55:06.941524 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] Sep 12 
18:55:06.941530 kernel: Normal [mem 0x0000000100000000-0x000000087f7fffff] Sep 12 18:55:06.941536 kernel: Device empty Sep 12 18:55:06.941541 kernel: Movable zone start for each node Sep 12 18:55:06.941546 kernel: Early memory node ranges Sep 12 18:55:06.941552 kernel: node 0: [mem 0x0000000000001000-0x0000000000098fff] Sep 12 18:55:06.941557 kernel: node 0: [mem 0x0000000000100000-0x000000003fffffff] Sep 12 18:55:06.941562 kernel: node 0: [mem 0x0000000040400000-0x000000006dfbdfff] Sep 12 18:55:06.941568 kernel: node 0: [mem 0x000000006dfc0000-0x0000000077fc6fff] Sep 12 18:55:06.941573 kernel: node 0: [mem 0x00000000790aa000-0x0000000079232fff] Sep 12 18:55:06.941583 kernel: node 0: [mem 0x000000007beff000-0x000000007befffff] Sep 12 18:55:06.941591 kernel: node 0: [mem 0x0000000100000000-0x000000087f7fffff] Sep 12 18:55:06.941597 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000087f7fffff] Sep 12 18:55:06.941602 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Sep 12 18:55:06.941609 kernel: On node 0, zone DMA: 103 pages in unavailable ranges Sep 12 18:55:06.941630 kernel: On node 0, zone DMA32: 1024 pages in unavailable ranges Sep 12 18:55:06.941635 kernel: On node 0, zone DMA32: 2 pages in unavailable ranges Sep 12 18:55:06.941641 kernel: On node 0, zone DMA32: 4323 pages in unavailable ranges Sep 12 18:55:06.941646 kernel: On node 0, zone DMA32: 11468 pages in unavailable ranges Sep 12 18:55:06.941653 kernel: On node 0, zone Normal: 16640 pages in unavailable ranges Sep 12 18:55:06.941658 kernel: On node 0, zone Normal: 2048 pages in unavailable ranges Sep 12 18:55:06.941664 kernel: ACPI: PM-Timer IO Port: 0x1808 Sep 12 18:55:06.941669 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1]) Sep 12 18:55:06.941675 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1]) Sep 12 18:55:06.941680 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1]) Sep 12 18:55:06.941686 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1]) Sep 12 18:55:06.941691 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1]) Sep 12 18:55:06.941696 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1]) Sep 12 18:55:06.941702 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1]) Sep 12 18:55:06.941708 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1]) Sep 12 18:55:06.941714 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1]) Sep 12 18:55:06.941719 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1]) Sep 12 18:55:06.941725 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1]) Sep 12 18:55:06.941730 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1]) Sep 12 18:55:06.941736 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1]) Sep 12 18:55:06.941741 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1]) Sep 12 18:55:06.941747 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1]) Sep 12 18:55:06.941752 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1]) Sep 12 18:55:06.941758 kernel: IOAPIC[0]: apic_id 2, version 32, address 0xfec00000, GSI 0-119 Sep 12 18:55:06.941764 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Sep 12 18:55:06.941769 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Sep 12 18:55:06.941775 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Sep 12 18:55:06.941780 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000 Sep 12 18:55:06.941786 kernel: TSC deadline timer available Sep 12 18:55:06.941791 kernel: CPU topo: Max. 
logical packages: 1 Sep 12 18:55:06.941797 kernel: CPU topo: Max. logical dies: 1 Sep 12 18:55:06.941802 kernel: CPU topo: Max. dies per package: 1 Sep 12 18:55:06.941809 kernel: CPU topo: Max. threads per core: 2 Sep 12 18:55:06.941814 kernel: CPU topo: Num. cores per package: 8 Sep 12 18:55:06.941820 kernel: CPU topo: Num. threads per package: 16 Sep 12 18:55:06.941825 kernel: CPU topo: Allowing 16 present CPUs plus 0 hotplug CPUs Sep 12 18:55:06.941830 kernel: [mem 0x7f800000-0xdfffffff] available for PCI devices Sep 12 18:55:06.941836 kernel: Booting paravirtualized kernel on bare hardware Sep 12 18:55:06.941841 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Sep 12 18:55:06.941847 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1 Sep 12 18:55:06.941853 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144 Sep 12 18:55:06.941859 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152 Sep 12 18:55:06.941865 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15 Sep 12 18:55:06.941871 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=271a44cc8ea1639cfb6fdf777202a5f025fda0b3ce9b293cc4e0e7047aecb858 Sep 12 18:55:06.941876 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Sep 12 18:55:06.941882 kernel: random: crng init done Sep 12 18:55:06.941887 kernel: Dentry cache hash table entries: 4194304 (order: 13, 33554432 bytes, linear) Sep 12 18:55:06.941893 kernel: Inode-cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear) Sep 12 18:55:06.941898 kernel: Fallback order for Node 0: 0 Sep 12 18:55:06.941904 kernel: Built 1 zonelists, mobility grouping on. Total pages: 8352999 Sep 12 18:55:06.941910 kernel: Policy zone: Normal Sep 12 18:55:06.941916 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Sep 12 18:55:06.941921 kernel: software IO TLB: area num 16. Sep 12 18:55:06.941926 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1 Sep 12 18:55:06.941932 kernel: ftrace: allocating 40125 entries in 157 pages Sep 12 18:55:06.941937 kernel: ftrace: allocated 157 pages with 5 groups Sep 12 18:55:06.941943 kernel: Dynamic Preempt: voluntary Sep 12 18:55:06.941949 kernel: rcu: Preemptible hierarchical RCU implementation. Sep 12 18:55:06.941955 kernel: rcu: RCU event tracing is enabled. Sep 12 18:55:06.941961 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16. Sep 12 18:55:06.941966 kernel: Trampoline variant of Tasks RCU enabled. Sep 12 18:55:06.941972 kernel: Rude variant of Tasks RCU enabled. Sep 12 18:55:06.941977 kernel: Tracing variant of Tasks RCU enabled. Sep 12 18:55:06.941983 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Sep 12 18:55:06.941988 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16 Sep 12 18:55:06.941994 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Sep 12 18:55:06.941999 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. 
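The dentry and inode cache lines above follow a simple sizing rule: one 8-byte bucket pointer per entry on x86-64, with the printed "order" being log2 of the number of 4 KiB pages the table occupies. A small sketch (assuming the 8-byte bucket size, which the printed figures confirm) reproduces both lines:

```python
import math

PAGE = 4096
POINTER = 8  # bytes per hash bucket on x86-64

def table_geometry(entries: int) -> tuple[int, int]:
    """Return the (order, size-in-bytes) pair the kernel prints for a hash table."""
    size = entries * POINTER
    order = int(math.log2(size // PAGE))
    return order, size

# Matches "Dentry cache hash table entries: 4194304 (order: 13, 33554432 bytes, linear)"
print(table_geometry(4194304))   # -> (13, 33554432)
# Matches "Inode-cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)"
print(table_geometry(2097152))   # -> (12, 16777216)
```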
Sep 12 18:55:06.942005 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Sep 12 18:55:06.942011 kernel: NR_IRQS: 33024, nr_irqs: 2184, preallocated irqs: 16 Sep 12 18:55:06.942017 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Sep 12 18:55:06.942022 kernel: Console: colour VGA+ 80x25 Sep 12 18:55:06.942028 kernel: printk: legacy console [tty0] enabled Sep 12 18:55:06.942033 kernel: printk: legacy console [ttyS1] enabled Sep 12 18:55:06.942039 kernel: ACPI: Core revision 20240827 Sep 12 18:55:06.942044 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 79635855245 ns Sep 12 18:55:06.942050 kernel: APIC: Switch to symmetric I/O mode setup Sep 12 18:55:06.942055 kernel: DMAR: Host address width 39 Sep 12 18:55:06.942062 kernel: DMAR: DRHD base: 0x000000fed90000 flags: 0x0 Sep 12 18:55:06.942067 kernel: DMAR: dmar0: reg_base_addr fed90000 ver 1:0 cap 1c0000c40660462 ecap 19e2ff0505e Sep 12 18:55:06.942073 kernel: DMAR: DRHD base: 0x000000fed91000 flags: 0x1 Sep 12 18:55:06.942078 kernel: DMAR: dmar1: reg_base_addr fed91000 ver 1:0 cap d2008c40660462 ecap f050da Sep 12 18:55:06.942084 kernel: DMAR: RMRR base: 0x00000079f11000 end: 0x0000007a15afff Sep 12 18:55:06.942089 kernel: DMAR: RMRR base: 0x0000007d000000 end: 0x0000007f7fffff Sep 12 18:55:06.942095 kernel: DMAR-IR: IOAPIC id 2 under DRHD base 0xfed91000 IOMMU 1 Sep 12 18:55:06.942100 kernel: DMAR-IR: HPET id 0 under DRHD base 0xfed91000 Sep 12 18:55:06.942106 kernel: DMAR-IR: Queued invalidation will be enabled to support x2apic and Intr-remapping. Sep 12 18:55:06.942112 kernel: DMAR-IR: Enabled IRQ remapping in x2apic mode Sep 12 18:55:06.942118 kernel: x2apic enabled Sep 12 18:55:06.942123 kernel: APIC: Switched APIC routing to: cluster x2apic Sep 12 18:55:06.942129 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Sep 12 18:55:06.942134 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x3101f59f5e6, max_idle_ns: 440795259996 ns Sep 12 18:55:06.942140 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 
6799.81 BogoMIPS (lpj=3399906) Sep 12 18:55:06.942146 kernel: CPU0: Thermal monitoring enabled (TM1) Sep 12 18:55:06.942151 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Sep 12 18:55:06.942157 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4 Sep 12 18:55:06.942163 kernel: process: using mwait in idle threads Sep 12 18:55:06.942168 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Sep 12 18:55:06.942174 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit Sep 12 18:55:06.942180 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Sep 12 18:55:06.942185 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Sep 12 18:55:06.942190 kernel: RETBleed: Mitigation: Enhanced IBRS Sep 12 18:55:06.942196 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Sep 12 18:55:06.942202 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Sep 12 18:55:06.942207 kernel: TAA: Mitigation: Clear CPU buffers Sep 12 18:55:06.942214 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode Sep 12 18:55:06.942219 kernel: SRBDS: Mitigation: Microcode Sep 12 18:55:06.942224 kernel: GDS: Vulnerable: No microcode Sep 12 18:55:06.942230 kernel: active return thunk: its_return_thunk Sep 12 18:55:06.942235 kernel: ITS: Mitigation: Aligned branch/return thunks Sep 12 18:55:06.942241 kernel: VMSCAPE: Mitigation: IBPB before exit to userspace Sep 12 18:55:06.942246 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Sep 12 18:55:06.942252 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Sep 12 18:55:06.942257 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Sep 12 18:55:06.942264 kernel: x86/fpu: Supporting XSAVE feature 0x008: 'MPX bounds registers' Sep 12 18:55:06.942269 kernel: x86/fpu: Supporting XSAVE feature 0x010: 'MPX CSR' Sep 12 18:55:06.942275 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Sep 12 18:55:06.942281 kernel: x86/fpu: xstate_offset[3]: 832, xstate_sizes[3]: 64 Sep 12 18:55:06.942286 kernel: x86/fpu: xstate_offset[4]: 896, xstate_sizes[4]: 64 Sep 12 18:55:06.942292 kernel: x86/fpu: Enabled xstate features 0x1f, context size is 960 bytes, using 'compacted' format. Sep 12 18:55:06.942297 kernel: Freeing SMP alternatives memory: 32K Sep 12 18:55:06.942303 kernel: pid_max: default: 32768 minimum: 301 Sep 12 18:55:06.942308 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Sep 12 18:55:06.942315 kernel: landlock: Up and running. Sep 12 18:55:06.942320 kernel: SELinux: Initializing. Sep 12 18:55:06.942325 kernel: Mount-cache hash table entries: 65536 (order: 7, 524288 bytes, linear) Sep 12 18:55:06.942331 kernel: Mountpoint-cache hash table entries: 65536 (order: 7, 524288 bytes, linear) Sep 12 18:55:06.942337 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) Sep 12 18:55:06.942342 kernel: Performance Events: PEBS fmt3+, Skylake events, 32-deep LBR, full-width counters, Intel PMU driver. Sep 12 18:55:06.942348 kernel: ... version: 4 Sep 12 18:55:06.942353 kernel: ... bit width: 48 Sep 12 18:55:06.942359 kernel: ... generic registers: 4 Sep 12 18:55:06.942365 kernel: ... value mask: 0000ffffffffffff Sep 12 18:55:06.942371 kernel: ... max period: 00007fffffffffff Sep 12 18:55:06.942376 kernel: ... fixed-purpose events: 3 Sep 12 18:55:06.942382 kernel: ... 
event mask: 000000070000000f Sep 12 18:55:06.942387 kernel: signal: max sigframe size: 2032 Sep 12 18:55:06.942393 kernel: Estimated ratio of average max frequency by base frequency (times 1024): 1445 Sep 12 18:55:06.942398 kernel: rcu: Hierarchical SRCU implementation. Sep 12 18:55:06.942404 kernel: rcu: Max phase no-delay instances is 400. Sep 12 18:55:06.942409 kernel: Timer migration: 2 hierarchy levels; 8 children per group; 2 crossnode level Sep 12 18:55:06.942416 kernel: NMI watchdog: Enabled. Permanently consumes one hw-PMU counter. Sep 12 18:55:06.942421 kernel: smp: Bringing up secondary CPUs ... Sep 12 18:55:06.942427 kernel: smpboot: x86: Booting SMP configuration: Sep 12 18:55:06.942432 kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7 #8 #9 #10 #11 #12 #13 #14 #15 Sep 12 18:55:06.942438 kernel: TAA CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/tsx_async_abort.html for more details. Sep 12 18:55:06.942444 kernel: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details. Sep 12 18:55:06.942450 kernel: smp: Brought up 1 node, 16 CPUs Sep 12 18:55:06.942455 kernel: smpboot: Total of 16 processors activated (108796.99 BogoMIPS) Sep 12 18:55:06.942462 kernel: Memory: 32652132K/33411996K available (14336K kernel code, 2432K rwdata, 9960K rodata, 54040K init, 2924K bss, 732516K reserved, 0K cma-reserved) Sep 12 18:55:06.942467 kernel: devtmpfs: initialized Sep 12 18:55:06.942473 kernel: x86/mm: Memory block size: 128MB Sep 12 18:55:06.942478 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x6dfbe000-0x6dfbefff] (4096 bytes) Sep 12 18:55:06.942484 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x79233000-0x79664fff] (4399104 bytes) Sep 12 18:55:06.942489 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Sep 12 18:55:06.942495 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear) Sep 12 18:55:06.942500 kernel: pinctrl core: initialized pinctrl subsystem Sep 12 18:55:06.942506 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Sep 12 18:55:06.942512 kernel: audit: initializing netlink subsys (disabled) Sep 12 18:55:06.942518 kernel: audit: type=2000 audit(1757703298.174:1): state=initialized audit_enabled=0 res=1 Sep 12 18:55:06.942523 kernel: thermal_sys: Registered thermal governor 'step_wise' Sep 12 18:55:06.942529 kernel: thermal_sys: Registered thermal governor 'user_space' Sep 12 18:55:06.942534 kernel: cpuidle: using governor menu Sep 12 18:55:06.942540 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Sep 12 18:55:06.942545 kernel: dca service started, version 1.12.1 Sep 12 18:55:06.942551 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff] Sep 12 18:55:06.942557 kernel: PCI: Using configuration type 1 for base access Sep 12 18:55:06.942563 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
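The per-CPU figure of 6799.81 BogoMIPS (lpj=3399906) and the 16-CPU total of 108796.99 printed above are consistent with the usual BogoMIPS = lpj * HZ / 500000 relation; the HZ=1000 value below is inferred from those numbers rather than stated anywhere in the log:

```python
LPJ = 3_399_906   # loops_per_jiffy from the log
HZ = 1000         # inferred: this tick rate reproduces the printed BogoMIPS
CPUS = 16

per_cpu = LPJ * HZ / 500_000
print(f"per-CPU BogoMIPS ~ {per_cpu:.2f}")            # ~ 6799.81, as printed above
print(f"total for {CPUS} CPUs ~ {per_cpu * CPUS:.2f}")  # ~ 108796.99, as printed above
```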
Sep 12 18:55:06.942568 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Sep 12 18:55:06.942574 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Sep 12 18:55:06.942579 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Sep 12 18:55:06.942587 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Sep 12 18:55:06.942612 kernel: ACPI: Added _OSI(Module Device) Sep 12 18:55:06.942617 kernel: ACPI: Added _OSI(Processor Device) Sep 12 18:55:06.942623 kernel: ACPI: Added _OSI(Processor Aggregator Device) Sep 12 18:55:06.942629 kernel: ACPI: 12 ACPI AML tables successfully acquired and loaded Sep 12 18:55:06.942652 kernel: ACPI: Dynamic OEM Table Load: Sep 12 18:55:06.942658 kernel: ACPI: SSDT 0xFFFF9633822CC000 000400 (v02 PmRef Cpu0Cst 00003001 INTL 20160527) Sep 12 18:55:06.942663 kernel: ACPI: Dynamic OEM Table Load: Sep 12 18:55:06.942669 kernel: ACPI: SSDT 0xFFFF9633823A6800 000683 (v02 PmRef Cpu0Ist 00003000 INTL 20160527) Sep 12 18:55:06.942674 kernel: ACPI: Dynamic OEM Table Load: Sep 12 18:55:06.942680 kernel: ACPI: SSDT 0xFFFF963380246300 0000F4 (v02 PmRef Cpu0Psd 00003000 INTL 20160527) Sep 12 18:55:06.942685 kernel: ACPI: Dynamic OEM Table Load: Sep 12 18:55:06.942690 kernel: ACPI: SSDT 0xFFFF9633823A6000 0005FC (v02 PmRef ApIst 00003000 INTL 20160527) Sep 12 18:55:06.942696 kernel: ACPI: Dynamic OEM Table Load: Sep 12 18:55:06.942702 kernel: ACPI: SSDT 0xFFFF9633801A1000 000AB0 (v02 PmRef ApPsd 00003000 INTL 20160527) Sep 12 18:55:06.942708 kernel: ACPI: Dynamic OEM Table Load: Sep 12 18:55:06.942713 kernel: ACPI: SSDT 0xFFFF9633822CE800 00030A (v02 PmRef ApCst 00003000 INTL 20160527) Sep 12 18:55:06.942719 kernel: ACPI: Interpreter enabled Sep 12 18:55:06.942724 kernel: ACPI: PM: (supports S0 S5) Sep 12 18:55:06.942730 kernel: ACPI: Using IOAPIC for interrupt routing Sep 12 18:55:06.942735 kernel: HEST: Enabling Firmware First mode for corrected errors. Sep 12 18:55:06.942741 kernel: mce: [Firmware Bug]: Ignoring request to disable invalid MCA bank 14. Sep 12 18:55:06.942746 kernel: HEST: Table parsing has been initialized. Sep 12 18:55:06.942753 kernel: GHES: APEI firmware first mode is enabled by APEI bit and WHEA _OSC. 
Sep 12 18:55:06.942758 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Sep 12 18:55:06.942764 kernel: PCI: Using E820 reservations for host bridge windows Sep 12 18:55:06.942769 kernel: ACPI: Enabled 9 GPEs in block 00 to 7F Sep 12 18:55:06.942775 kernel: ACPI: \_SB_.PCI0.XDCI.USBC: New power resource Sep 12 18:55:06.942781 kernel: ACPI: \_SB_.PCI0.SAT0.VOL0.V0PR: New power resource Sep 12 18:55:06.942786 kernel: ACPI: \_SB_.PCI0.SAT0.VOL1.V1PR: New power resource Sep 12 18:55:06.942792 kernel: ACPI: \_SB_.PCI0.SAT0.VOL2.V2PR: New power resource Sep 12 18:55:06.942797 kernel: ACPI: \_SB_.PCI0.CNVW.WRST: New power resource Sep 12 18:55:06.942804 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Sep 12 18:55:06.942809 kernel: ACPI: \_TZ_.FN00: New power resource Sep 12 18:55:06.942815 kernel: ACPI: \_TZ_.FN01: New power resource Sep 12 18:55:06.942820 kernel: ACPI: \_TZ_.FN02: New power resource Sep 12 18:55:06.942826 kernel: ACPI: \_TZ_.FN03: New power resource Sep 12 18:55:06.942831 kernel: ACPI: \_TZ_.FN04: New power resource Sep 12 18:55:06.942837 kernel: ACPI: \PIN_: New power resource Sep 12 18:55:06.942842 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-fe]) Sep 12 18:55:06.942929 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Sep 12 18:55:06.942991 kernel: acpi PNP0A08:00: _OSC: platform does not support [AER] Sep 12 18:55:06.943048 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability LTR] Sep 12 18:55:06.943056 kernel: PCI host bridge to bus 0000:00 Sep 12 18:55:06.943116 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Sep 12 18:55:06.943168 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Sep 12 18:55:06.943219 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Sep 12 18:55:06.943285 kernel: pci_bus 0000:00: root bus resource [mem 0x7f800000-0xdfffffff window] Sep 12 18:55:06.943334 kernel: pci_bus 0000:00: root bus resource [mem 0xfc800000-0xfe7fffff window] Sep 12 18:55:06.943382 kernel: pci_bus 0000:00: root bus resource [bus 00-fe] Sep 12 18:55:06.943448 kernel: pci 0000:00:00.0: [8086:3e31] type 00 class 0x060000 conventional PCI endpoint Sep 12 18:55:06.943512 kernel: pci 0000:00:01.0: [8086:1901] type 01 class 0x060400 PCIe Root Port Sep 12 18:55:06.943570 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 12 18:55:06.943631 kernel: pci 0000:00:01.0: PME# supported from D0 D3hot D3cold Sep 12 18:55:06.943695 kernel: pci 0000:00:01.1: [8086:1905] type 01 class 0x060400 PCIe Root Port Sep 12 18:55:06.943751 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Sep 12 18:55:06.943807 kernel: pci 0000:00:01.1: bridge window [mem 0x96100000-0x962fffff] Sep 12 18:55:06.943864 kernel: pci 0000:00:01.1: bridge window [mem 0x90000000-0x93ffffff 64bit pref] Sep 12 18:55:06.943921 kernel: pci 0000:00:01.1: PME# supported from D0 D3hot D3cold Sep 12 18:55:06.943991 kernel: pci 0000:00:02.0: [8086:3e9a] type 00 class 0x038000 PCIe Root Complex Integrated Endpoint Sep 12 18:55:06.944050 kernel: pci 0000:00:02.0: BAR 0 [mem 0x94000000-0x94ffffff 64bit] Sep 12 18:55:06.944106 kernel: pci 0000:00:02.0: BAR 2 [mem 0x80000000-0x8fffffff 64bit pref] Sep 12 18:55:06.944162 kernel: pci 0000:00:02.0: BAR 4 [io 0x6000-0x603f] Sep 12 18:55:06.944222 kernel: pci 0000:00:08.0: [8086:1911] type 00 class 0x088000 conventional PCI endpoint Sep 12 18:55:06.944278 kernel: pci 0000:00:08.0: BAR 0 [mem 
0x9651f000-0x9651ffff 64bit] Sep 12 18:55:06.944340 kernel: pci 0000:00:12.0: [8086:a379] type 00 class 0x118000 conventional PCI endpoint Sep 12 18:55:06.944397 kernel: pci 0000:00:12.0: BAR 0 [mem 0x9651e000-0x9651efff 64bit] Sep 12 18:55:06.944460 kernel: pci 0000:00:14.0: [8086:a36d] type 00 class 0x0c0330 conventional PCI endpoint Sep 12 18:55:06.944517 kernel: pci 0000:00:14.0: BAR 0 [mem 0x96500000-0x9650ffff 64bit] Sep 12 18:55:06.944575 kernel: pci 0000:00:14.0: PME# supported from D3hot D3cold Sep 12 18:55:06.944638 kernel: pci 0000:00:14.2: [8086:a36f] type 00 class 0x050000 conventional PCI endpoint Sep 12 18:55:06.944694 kernel: pci 0000:00:14.2: BAR 0 [mem 0x96512000-0x96513fff 64bit] Sep 12 18:55:06.944752 kernel: pci 0000:00:14.2: BAR 2 [mem 0x9651d000-0x9651dfff 64bit] Sep 12 18:55:06.944812 kernel: pci 0000:00:15.0: [8086:a368] type 00 class 0x0c8000 conventional PCI endpoint Sep 12 18:55:06.944869 kernel: pci 0000:00:15.0: BAR 0 [mem 0x00000000-0x00000fff 64bit] Sep 12 18:55:06.944928 kernel: pci 0000:00:15.1: [8086:a369] type 00 class 0x0c8000 conventional PCI endpoint Sep 12 18:55:06.944984 kernel: pci 0000:00:15.1: BAR 0 [mem 0x00000000-0x00000fff 64bit] Sep 12 18:55:06.945047 kernel: pci 0000:00:16.0: [8086:a360] type 00 class 0x078000 conventional PCI endpoint Sep 12 18:55:06.945104 kernel: pci 0000:00:16.0: BAR 0 [mem 0x9651a000-0x9651afff 64bit] Sep 12 18:55:06.945160 kernel: pci 0000:00:16.0: PME# supported from D3hot Sep 12 18:55:06.945222 kernel: pci 0000:00:16.1: [8086:a361] type 00 class 0x078000 conventional PCI endpoint Sep 12 18:55:06.945279 kernel: pci 0000:00:16.1: BAR 0 [mem 0x96519000-0x96519fff 64bit] Sep 12 18:55:06.945334 kernel: pci 0000:00:16.1: PME# supported from D3hot Sep 12 18:55:06.945393 kernel: pci 0000:00:16.4: [8086:a364] type 00 class 0x078000 conventional PCI endpoint Sep 12 18:55:06.945450 kernel: pci 0000:00:16.4: BAR 0 [mem 0x96518000-0x96518fff 64bit] Sep 12 18:55:06.945507 kernel: pci 0000:00:16.4: PME# supported from D3hot Sep 12 18:55:06.945567 kernel: pci 0000:00:17.0: [8086:a352] type 00 class 0x010601 conventional PCI endpoint Sep 12 18:55:06.945627 kernel: pci 0000:00:17.0: BAR 0 [mem 0x96510000-0x96511fff] Sep 12 18:55:06.945683 kernel: pci 0000:00:17.0: BAR 1 [mem 0x96517000-0x965170ff] Sep 12 18:55:06.945739 kernel: pci 0000:00:17.0: BAR 2 [io 0x6090-0x6097] Sep 12 18:55:06.945797 kernel: pci 0000:00:17.0: BAR 3 [io 0x6080-0x6083] Sep 12 18:55:06.945853 kernel: pci 0000:00:17.0: BAR 4 [io 0x6060-0x607f] Sep 12 18:55:06.945909 kernel: pci 0000:00:17.0: BAR 5 [mem 0x96516000-0x965167ff] Sep 12 18:55:06.945965 kernel: pci 0000:00:17.0: PME# supported from D3hot Sep 12 18:55:06.946026 kernel: pci 0000:00:1b.0: [8086:a340] type 01 class 0x060400 PCIe Root Port Sep 12 18:55:06.946083 kernel: pci 0000:00:1b.0: PCI bridge to [bus 03] Sep 12 18:55:06.946141 kernel: pci 0000:00:1b.0: PME# supported from D0 D3hot D3cold Sep 12 18:55:06.946203 kernel: pci 0000:00:1b.4: [8086:a32c] type 01 class 0x060400 PCIe Root Port Sep 12 18:55:06.946260 kernel: pci 0000:00:1b.4: PCI bridge to [bus 04] Sep 12 18:55:06.946316 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff] Sep 12 18:55:06.946374 kernel: pci 0000:00:1b.4: bridge window [mem 0x96400000-0x964fffff] Sep 12 18:55:06.946430 kernel: pci 0000:00:1b.4: PME# supported from D0 D3hot D3cold Sep 12 18:55:06.946490 kernel: pci 0000:00:1b.5: [8086:a32d] type 01 class 0x060400 PCIe Root Port Sep 12 18:55:06.946549 kernel: pci 0000:00:1b.5: PCI bridge to [bus 05] Sep 12 18:55:06.946609 
kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff] Sep 12 18:55:06.946666 kernel: pci 0000:00:1b.5: bridge window [mem 0x96300000-0x963fffff] Sep 12 18:55:06.946722 kernel: pci 0000:00:1b.5: PME# supported from D0 D3hot D3cold Sep 12 18:55:06.946784 kernel: pci 0000:00:1c.0: [8086:a338] type 01 class 0x060400 PCIe Root Port Sep 12 18:55:06.946840 kernel: pci 0000:00:1c.0: PCI bridge to [bus 06] Sep 12 18:55:06.946896 kernel: pci 0000:00:1c.0: PME# supported from D0 D3hot D3cold Sep 12 18:55:06.946959 kernel: pci 0000:00:1c.1: [8086:a339] type 01 class 0x060400 PCIe Root Port Sep 12 18:55:06.947018 kernel: pci 0000:00:1c.1: PCI bridge to [bus 07-08] Sep 12 18:55:06.947074 kernel: pci 0000:00:1c.1: bridge window [io 0x3000-0x3fff] Sep 12 18:55:06.947130 kernel: pci 0000:00:1c.1: bridge window [mem 0x95000000-0x960fffff] Sep 12 18:55:06.947186 kernel: pci 0000:00:1c.1: PME# supported from D0 D3hot D3cold Sep 12 18:55:06.947246 kernel: pci 0000:00:1e.0: [8086:a328] type 00 class 0x078000 conventional PCI endpoint Sep 12 18:55:06.947302 kernel: pci 0000:00:1e.0: BAR 0 [mem 0x00000000-0x00000fff 64bit] Sep 12 18:55:06.947361 kernel: pci 0000:00:1f.0: [8086:a309] type 00 class 0x060100 conventional PCI endpoint Sep 12 18:55:06.947423 kernel: pci 0000:00:1f.4: [8086:a323] type 00 class 0x0c0500 conventional PCI endpoint Sep 12 18:55:06.947482 kernel: pci 0000:00:1f.4: BAR 0 [mem 0x96514000-0x965140ff 64bit] Sep 12 18:55:06.947538 kernel: pci 0000:00:1f.4: BAR 4 [io 0xefa0-0xefbf] Sep 12 18:55:06.947601 kernel: pci 0000:00:1f.5: [8086:a324] type 00 class 0x0c8000 conventional PCI endpoint Sep 12 18:55:06.947657 kernel: pci 0000:00:1f.5: BAR 0 [mem 0xfe010000-0xfe010fff] Sep 12 18:55:06.947713 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 12 18:55:06.947778 kernel: pci 0000:02:00.0: [15b3:1015] type 00 class 0x020000 PCIe Endpoint Sep 12 18:55:06.947837 kernel: pci 0000:02:00.0: BAR 0 [mem 0x92000000-0x93ffffff 64bit pref] Sep 12 18:55:06.947896 kernel: pci 0000:02:00.0: ROM [mem 0x96200000-0x962fffff pref] Sep 12 18:55:06.947953 kernel: pci 0000:02:00.0: PME# supported from D3cold Sep 12 18:55:06.948011 kernel: pci 0000:02:00.0: VF BAR 0 [mem 0x00000000-0x000fffff 64bit pref] Sep 12 18:55:06.948067 kernel: pci 0000:02:00.0: VF BAR 0 [mem 0x00000000-0x007fffff 64bit pref]: contains BAR 0 for 8 VFs Sep 12 18:55:06.948133 kernel: pci 0000:02:00.1: [15b3:1015] type 00 class 0x020000 PCIe Endpoint Sep 12 18:55:06.948193 kernel: pci 0000:02:00.1: BAR 0 [mem 0x90000000-0x91ffffff 64bit pref] Sep 12 18:55:06.948251 kernel: pci 0000:02:00.1: ROM [mem 0x96100000-0x961fffff pref] Sep 12 18:55:06.948309 kernel: pci 0000:02:00.1: PME# supported from D3cold Sep 12 18:55:06.948367 kernel: pci 0000:02:00.1: VF BAR 0 [mem 0x00000000-0x000fffff 64bit pref] Sep 12 18:55:06.948424 kernel: pci 0000:02:00.1: VF BAR 0 [mem 0x00000000-0x007fffff 64bit pref]: contains BAR 0 for 8 VFs Sep 12 18:55:06.948481 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Sep 12 18:55:06.948538 kernel: pci 0000:00:1b.0: PCI bridge to [bus 03] Sep 12 18:55:06.948605 kernel: pci 0000:04:00.0: working around ROM BAR overlap defect Sep 12 18:55:06.948665 kernel: pci 0000:04:00.0: [8086:1533] type 00 class 0x020000 PCIe Endpoint Sep 12 18:55:06.948722 kernel: pci 0000:04:00.0: BAR 0 [mem 0x96400000-0x9647ffff] Sep 12 18:55:06.948835 kernel: pci 0000:04:00.0: BAR 2 [io 0x5000-0x501f] Sep 12 18:55:06.948894 kernel: pci 0000:04:00.0: BAR 3 [mem 0x96480000-0x96483fff] Sep 12 18:55:06.948951 kernel: pci 0000:04:00.0: PME# supported 
from D0 D3hot D3cold Sep 12 18:55:06.949009 kernel: pci 0000:00:1b.4: PCI bridge to [bus 04] Sep 12 18:55:06.949073 kernel: pci 0000:05:00.0: working around ROM BAR overlap defect Sep 12 18:55:06.949134 kernel: pci 0000:05:00.0: [8086:1533] type 00 class 0x020000 PCIe Endpoint Sep 12 18:55:06.949256 kernel: pci 0000:05:00.0: BAR 0 [mem 0x96300000-0x9637ffff] Sep 12 18:55:06.949402 kernel: pci 0000:05:00.0: BAR 2 [io 0x4000-0x401f] Sep 12 18:55:06.949487 kernel: pci 0000:05:00.0: BAR 3 [mem 0x96380000-0x96383fff] Sep 12 18:55:06.949674 kernel: pci 0000:05:00.0: PME# supported from D0 D3hot D3cold Sep 12 18:55:06.949751 kernel: pci 0000:00:1b.5: PCI bridge to [bus 05] Sep 12 18:55:06.949831 kernel: pci 0000:00:1c.0: PCI bridge to [bus 06] Sep 12 18:55:06.949913 kernel: pci 0000:07:00.0: [1a03:1150] type 01 class 0x060400 PCIe to PCI/PCI-X bridge Sep 12 18:55:06.949974 kernel: pci 0000:07:00.0: PCI bridge to [bus 08] Sep 12 18:55:06.950035 kernel: pci 0000:07:00.0: bridge window [io 0x3000-0x3fff] Sep 12 18:55:06.950095 kernel: pci 0000:07:00.0: bridge window [mem 0x95000000-0x960fffff] Sep 12 18:55:06.950173 kernel: pci 0000:07:00.0: enabling Extended Tags Sep 12 18:55:06.950233 kernel: pci 0000:07:00.0: supports D1 D2 Sep 12 18:55:06.950292 kernel: pci 0000:07:00.0: PME# supported from D0 D1 D2 D3hot D3cold Sep 12 18:55:06.950354 kernel: pci 0000:00:1c.1: PCI bridge to [bus 07-08] Sep 12 18:55:06.950427 kernel: pci_bus 0000:08: extended config space not accessible Sep 12 18:55:06.950569 kernel: pci 0000:08:00.0: [1a03:2000] type 00 class 0x030000 conventional PCI endpoint Sep 12 18:55:06.950645 kernel: pci 0000:08:00.0: BAR 0 [mem 0x95000000-0x95ffffff] Sep 12 18:55:06.950717 kernel: pci 0000:08:00.0: BAR 1 [mem 0x96000000-0x9601ffff] Sep 12 18:55:06.950781 kernel: pci 0000:08:00.0: BAR 2 [io 0x3000-0x307f] Sep 12 18:55:06.950846 kernel: pci 0000:08:00.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Sep 12 18:55:06.950908 kernel: pci 0000:08:00.0: supports D1 D2 Sep 12 18:55:06.950971 kernel: pci 0000:08:00.0: PME# supported from D0 D1 D2 D3hot D3cold Sep 12 18:55:06.951031 kernel: pci 0000:07:00.0: PCI bridge to [bus 08] Sep 12 18:55:06.951039 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 0 Sep 12 18:55:06.951047 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 1 Sep 12 18:55:06.951053 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 0 Sep 12 18:55:06.951059 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 0 Sep 12 18:55:06.951065 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 0 Sep 12 18:55:06.951071 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 0 Sep 12 18:55:06.951077 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 0 Sep 12 18:55:06.951083 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 0 Sep 12 18:55:06.951089 kernel: iommu: Default domain type: Translated Sep 12 18:55:06.951095 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Sep 12 18:55:06.951102 kernel: PCI: Using ACPI for IRQ routing Sep 12 18:55:06.951108 kernel: PCI: pci_cache_line_size set to 64 bytes Sep 12 18:55:06.951113 kernel: e820: reserve RAM buffer [mem 0x00099800-0x0009ffff] Sep 12 18:55:06.951119 kernel: e820: reserve RAM buffer [mem 0x6dfbe000-0x6fffffff] Sep 12 18:55:06.951125 kernel: e820: reserve RAM buffer [mem 0x77fc7000-0x77ffffff] Sep 12 18:55:06.951131 kernel: e820: reserve RAM buffer [mem 0x79233000-0x7bffffff] Sep 12 18:55:06.951136 kernel: e820: reserve RAM buffer [mem 0x7bf00000-0x7bffffff] Sep 
12 18:55:06.951142 kernel: e820: reserve RAM buffer [mem 0x87f800000-0x87fffffff] Sep 12 18:55:06.951204 kernel: pci 0000:08:00.0: vgaarb: setting as boot VGA device Sep 12 18:55:06.951273 kernel: pci 0000:08:00.0: vgaarb: bridge control possible Sep 12 18:55:06.951348 kernel: pci 0000:08:00.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Sep 12 18:55:06.951358 kernel: vgaarb: loaded Sep 12 18:55:06.951364 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0 Sep 12 18:55:06.951370 kernel: hpet0: 8 comparators, 64-bit 24.000000 MHz counter Sep 12 18:55:06.951376 kernel: clocksource: Switched to clocksource tsc-early Sep 12 18:55:06.951384 kernel: VFS: Disk quotas dquot_6.6.0 Sep 12 18:55:06.951390 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 12 18:55:06.951397 kernel: pnp: PnP ACPI init Sep 12 18:55:06.951461 kernel: system 00:00: [mem 0x40000000-0x403fffff] has been reserved Sep 12 18:55:06.951520 kernel: pnp 00:02: [dma 0 disabled] Sep 12 18:55:06.951578 kernel: pnp 00:03: [dma 0 disabled] Sep 12 18:55:06.951641 kernel: system 00:04: [io 0x0680-0x069f] has been reserved Sep 12 18:55:06.951696 kernel: system 00:04: [io 0x164e-0x164f] has been reserved Sep 12 18:55:06.951753 kernel: system 00:05: [mem 0xfed10000-0xfed17fff] has been reserved Sep 12 18:55:06.951826 kernel: system 00:05: [mem 0xfed18000-0xfed18fff] has been reserved Sep 12 18:55:06.951960 kernel: system 00:05: [mem 0xfed19000-0xfed19fff] has been reserved Sep 12 18:55:06.952012 kernel: system 00:05: [mem 0xe0000000-0xefffffff] has been reserved Sep 12 18:55:06.952063 kernel: system 00:05: [mem 0xfed20000-0xfed3ffff] has been reserved Sep 12 18:55:06.952114 kernel: system 00:05: [mem 0xfed90000-0xfed93fff] could not be reserved Sep 12 18:55:06.952166 kernel: system 00:05: [mem 0xfed45000-0xfed8ffff] has been reserved Sep 12 18:55:06.952218 kernel: system 00:05: [mem 0xfee00000-0xfeefffff] could not be reserved Sep 12 18:55:06.952276 kernel: system 00:06: [io 0x1800-0x18fe] could not be reserved Sep 12 18:55:06.952328 kernel: system 00:06: [mem 0xfd000000-0xfd69ffff] has been reserved Sep 12 18:55:06.952380 kernel: system 00:06: [mem 0xfd6c0000-0xfd6cffff] has been reserved Sep 12 18:55:06.952431 kernel: system 00:06: [mem 0xfd6f0000-0xfdffffff] has been reserved Sep 12 18:55:06.952482 kernel: system 00:06: [mem 0xfe000000-0xfe01ffff] could not be reserved Sep 12 18:55:06.952533 kernel: system 00:06: [mem 0xfe200000-0xfe7fffff] has been reserved Sep 12 18:55:06.952589 kernel: system 00:06: [mem 0xff000000-0xffffffff] has been reserved Sep 12 18:55:06.952679 kernel: system 00:07: [io 0x2000-0x20fe] has been reserved Sep 12 18:55:06.952688 kernel: pnp: PnP ACPI: found 9 devices Sep 12 18:55:06.952694 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Sep 12 18:55:06.952699 kernel: NET: Registered PF_INET protocol family Sep 12 18:55:06.952705 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 12 18:55:06.952711 kernel: tcp_listen_portaddr_hash hash table entries: 16384 (order: 6, 262144 bytes, linear) Sep 12 18:55:06.952717 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 12 18:55:06.952724 kernel: TCP established hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 12 18:55:06.952730 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Sep 12 18:55:06.952736 kernel: TCP: Hash tables configured (established 262144 bind 65536) 
Sep 12 18:55:06.952742 kernel: UDP hash table entries: 16384 (order: 7, 524288 bytes, linear) Sep 12 18:55:06.952747 kernel: UDP-Lite hash table entries: 16384 (order: 7, 524288 bytes, linear) Sep 12 18:55:06.952753 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 12 18:55:06.952759 kernel: NET: Registered PF_XDP protocol family Sep 12 18:55:06.952815 kernel: pci 0000:00:15.0: BAR 0 [mem 0x7f800000-0x7f800fff 64bit]: assigned Sep 12 18:55:06.952872 kernel: pci 0000:00:15.1: BAR 0 [mem 0x7f801000-0x7f801fff 64bit]: assigned Sep 12 18:55:06.952930 kernel: pci 0000:00:1e.0: BAR 0 [mem 0x7f802000-0x7f802fff 64bit]: assigned Sep 12 18:55:06.952987 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 12 18:55:06.953045 kernel: pci 0000:02:00.0: VF BAR 0 [mem size 0x00800000 64bit pref]: can't assign; no space Sep 12 18:55:06.953103 kernel: pci 0000:02:00.0: VF BAR 0 [mem size 0x00800000 64bit pref]: failed to assign Sep 12 18:55:06.953162 kernel: pci 0000:02:00.1: VF BAR 0 [mem size 0x00800000 64bit pref]: can't assign; no space Sep 12 18:55:06.953220 kernel: pci 0000:02:00.1: VF BAR 0 [mem size 0x00800000 64bit pref]: failed to assign Sep 12 18:55:06.953339 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Sep 12 18:55:06.953396 kernel: pci 0000:00:01.1: bridge window [mem 0x96100000-0x962fffff] Sep 12 18:55:06.953452 kernel: pci 0000:00:01.1: bridge window [mem 0x90000000-0x93ffffff 64bit pref] Sep 12 18:55:06.953510 kernel: pci 0000:00:1b.0: PCI bridge to [bus 03] Sep 12 18:55:06.953566 kernel: pci 0000:00:1b.4: PCI bridge to [bus 04] Sep 12 18:55:06.953651 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff] Sep 12 18:55:06.953736 kernel: pci 0000:00:1b.4: bridge window [mem 0x96400000-0x964fffff] Sep 12 18:55:06.953792 kernel: pci 0000:00:1b.5: PCI bridge to [bus 05] Sep 12 18:55:06.953847 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff] Sep 12 18:55:06.953903 kernel: pci 0000:00:1b.5: bridge window [mem 0x96300000-0x963fffff] Sep 12 18:55:06.953959 kernel: pci 0000:00:1c.0: PCI bridge to [bus 06] Sep 12 18:55:06.954017 kernel: pci 0000:07:00.0: PCI bridge to [bus 08] Sep 12 18:55:06.954074 kernel: pci 0000:07:00.0: bridge window [io 0x3000-0x3fff] Sep 12 18:55:06.954134 kernel: pci 0000:07:00.0: bridge window [mem 0x95000000-0x960fffff] Sep 12 18:55:06.954191 kernel: pci 0000:00:1c.1: PCI bridge to [bus 07-08] Sep 12 18:55:06.954246 kernel: pci 0000:00:1c.1: bridge window [io 0x3000-0x3fff] Sep 12 18:55:06.954302 kernel: pci 0000:00:1c.1: bridge window [mem 0x95000000-0x960fffff] Sep 12 18:55:06.954354 kernel: pci_bus 0000:00: Some PCI device resources are unassigned, try booting with pci=realloc Sep 12 18:55:06.954405 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Sep 12 18:55:06.954455 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Sep 12 18:55:06.954505 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Sep 12 18:55:06.954555 kernel: pci_bus 0000:00: resource 7 [mem 0x7f800000-0xdfffffff window] Sep 12 18:55:06.954708 kernel: pci_bus 0000:00: resource 8 [mem 0xfc800000-0xfe7fffff window] Sep 12 18:55:06.954796 kernel: pci_bus 0000:02: resource 1 [mem 0x96100000-0x962fffff] Sep 12 18:55:06.954850 kernel: pci_bus 0000:02: resource 2 [mem 0x90000000-0x93ffffff 64bit pref] Sep 12 18:55:06.954907 kernel: pci_bus 0000:04: resource 0 [io 0x5000-0x5fff] Sep 12 18:55:06.954959 kernel: pci_bus 0000:04: resource 1 [mem 0x96400000-0x964fffff] Sep 12 18:55:06.955016 kernel: pci_bus 0000:05: resource 0 [io 0x4000-0x4fff] Sep 
12 18:55:06.955071 kernel: pci_bus 0000:05: resource 1 [mem 0x96300000-0x963fffff] Sep 12 18:55:06.955127 kernel: pci_bus 0000:07: resource 0 [io 0x3000-0x3fff] Sep 12 18:55:06.955179 kernel: pci_bus 0000:07: resource 1 [mem 0x95000000-0x960fffff] Sep 12 18:55:06.955234 kernel: pci_bus 0000:08: resource 0 [io 0x3000-0x3fff] Sep 12 18:55:06.955289 kernel: pci_bus 0000:08: resource 1 [mem 0x95000000-0x960fffff] Sep 12 18:55:06.955297 kernel: PCI: CLS 64 bytes, default 64 Sep 12 18:55:06.955303 kernel: DMAR: No ATSR found Sep 12 18:55:06.955310 kernel: DMAR: No SATC found Sep 12 18:55:06.955316 kernel: DMAR: IOMMU feature fl1gp_support inconsistent Sep 12 18:55:06.955322 kernel: DMAR: IOMMU feature pgsel_inv inconsistent Sep 12 18:55:06.955328 kernel: DMAR: IOMMU feature nwfs inconsistent Sep 12 18:55:06.955333 kernel: DMAR: IOMMU feature pasid inconsistent Sep 12 18:55:06.955339 kernel: DMAR: IOMMU feature eafs inconsistent Sep 12 18:55:06.955345 kernel: DMAR: IOMMU feature prs inconsistent Sep 12 18:55:06.955351 kernel: DMAR: IOMMU feature nest inconsistent Sep 12 18:55:06.955357 kernel: DMAR: IOMMU feature mts inconsistent Sep 12 18:55:06.955363 kernel: DMAR: IOMMU feature sc_support inconsistent Sep 12 18:55:06.955369 kernel: DMAR: IOMMU feature dev_iotlb_support inconsistent Sep 12 18:55:06.955375 kernel: DMAR: dmar0: Using Queued invalidation Sep 12 18:55:06.955380 kernel: DMAR: dmar1: Using Queued invalidation Sep 12 18:55:06.955436 kernel: pci 0000:00:02.0: Adding to iommu group 0 Sep 12 18:55:06.955494 kernel: pci 0000:00:00.0: Adding to iommu group 1 Sep 12 18:55:06.955551 kernel: pci 0000:00:01.0: Adding to iommu group 2 Sep 12 18:55:06.955637 kernel: pci 0000:00:01.1: Adding to iommu group 2 Sep 12 18:55:06.955710 kernel: pci 0000:00:08.0: Adding to iommu group 3 Sep 12 18:55:06.955767 kernel: pci 0000:00:12.0: Adding to iommu group 4 Sep 12 18:55:06.955823 kernel: pci 0000:00:14.0: Adding to iommu group 5 Sep 12 18:55:06.955878 kernel: pci 0000:00:14.2: Adding to iommu group 5 Sep 12 18:55:06.955934 kernel: pci 0000:00:15.0: Adding to iommu group 6 Sep 12 18:55:06.955989 kernel: pci 0000:00:15.1: Adding to iommu group 6 Sep 12 18:55:06.956098 kernel: pci 0000:00:16.0: Adding to iommu group 7 Sep 12 18:55:06.956155 kernel: pci 0000:00:16.1: Adding to iommu group 7 Sep 12 18:55:06.956210 kernel: pci 0000:00:16.4: Adding to iommu group 7 Sep 12 18:55:06.956269 kernel: pci 0000:00:17.0: Adding to iommu group 8 Sep 12 18:55:06.956325 kernel: pci 0000:00:1b.0: Adding to iommu group 9 Sep 12 18:55:06.956381 kernel: pci 0000:00:1b.4: Adding to iommu group 10 Sep 12 18:55:06.956437 kernel: pci 0000:00:1b.5: Adding to iommu group 11 Sep 12 18:55:06.956493 kernel: pci 0000:00:1c.0: Adding to iommu group 12 Sep 12 18:55:06.956548 kernel: pci 0000:00:1c.1: Adding to iommu group 13 Sep 12 18:55:06.956629 kernel: pci 0000:00:1e.0: Adding to iommu group 14 Sep 12 18:55:06.956701 kernel: pci 0000:00:1f.0: Adding to iommu group 15 Sep 12 18:55:06.956758 kernel: pci 0000:00:1f.4: Adding to iommu group 15 Sep 12 18:55:06.956813 kernel: pci 0000:00:1f.5: Adding to iommu group 15 Sep 12 18:55:06.956880 kernel: pci 0000:02:00.0: Adding to iommu group 2 Sep 12 18:55:06.956939 kernel: pci 0000:02:00.1: Adding to iommu group 2 Sep 12 18:55:06.956997 kernel: pci 0000:04:00.0: Adding to iommu group 16 Sep 12 18:55:06.957056 kernel: pci 0000:05:00.0: Adding to iommu group 17 Sep 12 18:55:06.957114 kernel: pci 0000:07:00.0: Adding to iommu group 18 Sep 12 18:55:06.957176 kernel: pci 0000:08:00.0: Adding to 
iommu group 18 Sep 12 18:55:06.957184 kernel: DMAR: Intel(R) Virtualization Technology for Directed I/O Sep 12 18:55:06.957190 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Sep 12 18:55:06.957196 kernel: software IO TLB: mapped [mem 0x0000000073fc7000-0x0000000077fc7000] (64MB) Sep 12 18:55:06.957202 kernel: RAPL PMU: API unit is 2^-32 Joules, 4 fixed counters, 655360 ms ovfl timer Sep 12 18:55:06.957208 kernel: RAPL PMU: hw unit of domain pp0-core 2^-14 Joules Sep 12 18:55:06.957214 kernel: RAPL PMU: hw unit of domain package 2^-14 Joules Sep 12 18:55:06.957220 kernel: RAPL PMU: hw unit of domain dram 2^-14 Joules Sep 12 18:55:06.957225 kernel: RAPL PMU: hw unit of domain pp1-gpu 2^-14 Joules Sep 12 18:55:06.957288 kernel: platform rtc_cmos: registered platform RTC device (no PNP device found) Sep 12 18:55:06.957297 kernel: Initialise system trusted keyrings Sep 12 18:55:06.957303 kernel: workingset: timestamp_bits=39 max_order=23 bucket_order=0 Sep 12 18:55:06.957309 kernel: Key type asymmetric registered Sep 12 18:55:06.957315 kernel: Asymmetric key parser 'x509' registered Sep 12 18:55:06.957320 kernel: tsc: Refined TSC clocksource calibration: 3408.021 MHz Sep 12 18:55:06.957326 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fe7d55de, max_idle_ns: 440795350771 ns Sep 12 18:55:06.957332 kernel: clocksource: Switched to clocksource tsc Sep 12 18:55:06.957339 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Sep 12 18:55:06.957345 kernel: io scheduler mq-deadline registered Sep 12 18:55:06.957351 kernel: io scheduler kyber registered Sep 12 18:55:06.957357 kernel: io scheduler bfq registered Sep 12 18:55:06.957413 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 122 Sep 12 18:55:06.957469 kernel: pcieport 0000:00:01.1: PME: Signaling with IRQ 123 Sep 12 18:55:06.957526 kernel: pcieport 0000:00:1b.0: PME: Signaling with IRQ 124 Sep 12 18:55:06.957584 kernel: pcieport 0000:00:1b.4: PME: Signaling with IRQ 125 Sep 12 18:55:06.957681 kernel: pcieport 0000:00:1b.5: PME: Signaling with IRQ 126 Sep 12 18:55:06.957740 kernel: pcieport 0000:00:1c.0: PME: Signaling with IRQ 127 Sep 12 18:55:06.957797 kernel: pcieport 0000:00:1c.1: PME: Signaling with IRQ 128 Sep 12 18:55:06.957861 kernel: thermal LNXTHERM:00: registered as thermal_zone0 Sep 12 18:55:06.957870 kernel: ACPI: thermal: Thermal Zone [TZ00] (28 C) Sep 12 18:55:06.957876 kernel: ERST: Error Record Serialization Table (ERST) support is initialized. Sep 12 18:55:06.957881 kernel: pstore: Using crash dump compression: deflate Sep 12 18:55:06.957887 kernel: pstore: Registered erst as persistent store backend Sep 12 18:55:06.957893 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 12 18:55:06.957900 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 12 18:55:06.957906 kernel: 00:02: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 12 18:55:06.957912 kernel: 00:03: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A Sep 12 18:55:06.957968 kernel: tpm_tis MSFT0101:00: 2.0 TPM (device-id 0x1B, rev-id 16) Sep 12 18:55:06.957978 kernel: i8042: PNP: No PS/2 controller found. 
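The "Adding to iommu group N" lines above show how DMAR carved the PCI devices into isolation groups; for example, both NIC ports at 0000:02:00.0 and 0000:02:00.1 land in group 2 together with their root ports, and the VGA device at 0000:08:00.0 shares group 18 with its upstream bridge 0000:07:00.0. On the booted system the same assignments are visible in sysfs; a minimal sketch (standard /sys/kernel/iommu_groups layout assumed):

```python
from pathlib import Path

# Illustrative: list the IOMMU group membership that the kernel logged above.
root = Path("/sys/kernel/iommu_groups")
if root.exists():
    for group in sorted(root.iterdir(), key=lambda p: int(p.name)):
        devices = sorted(d.name for d in (group / "devices").iterdir())
        print(f"group {group.name}: {', '.join(devices)}")
```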
Sep 12 18:55:06.958029 kernel: rtc_cmos rtc_cmos: RTC can wake from S4 Sep 12 18:55:06.958082 kernel: rtc_cmos rtc_cmos: registered as rtc0 Sep 12 18:55:06.958134 kernel: rtc_cmos rtc_cmos: setting system clock to 2025-09-12T18:55:05 UTC (1757703305) Sep 12 18:55:06.958189 kernel: rtc_cmos rtc_cmos: alarms up to one month, y3k, 114 bytes nvram Sep 12 18:55:06.958198 kernel: intel_pstate: Intel P-state driver initializing Sep 12 18:55:06.958204 kernel: intel_pstate: Disabling energy efficiency optimization Sep 12 18:55:06.958209 kernel: intel_pstate: HWP enabled Sep 12 18:55:06.958215 kernel: NET: Registered PF_INET6 protocol family Sep 12 18:55:06.958221 kernel: Segment Routing with IPv6 Sep 12 18:55:06.958227 kernel: In-situ OAM (IOAM) with IPv6 Sep 12 18:55:06.958232 kernel: NET: Registered PF_PACKET protocol family Sep 12 18:55:06.958238 kernel: Key type dns_resolver registered Sep 12 18:55:06.958245 kernel: ENERGY_PERF_BIAS: Set to 'normal', was 'performance' Sep 12 18:55:06.958251 kernel: microcode: Current revision: 0x000000de Sep 12 18:55:06.958256 kernel: IPI shorthand broadcast: enabled Sep 12 18:55:06.958263 kernel: sched_clock: Marking stable (3826213024, 1502332386)->(6907042400, -1578496990) Sep 12 18:55:06.958268 kernel: registered taskstats version 1 Sep 12 18:55:06.958274 kernel: Loading compiled-in X.509 certificates Sep 12 18:55:06.958280 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.47-flatcar: f1ae8d6e9bfae84d90f4136cf098b0465b2a5bd7' Sep 12 18:55:06.958285 kernel: Demotion targets for Node 0: null Sep 12 18:55:06.958291 kernel: Key type .fscrypt registered Sep 12 18:55:06.958298 kernel: Key type fscrypt-provisioning registered Sep 12 18:55:06.958303 kernel: ima: Allocated hash algorithm: sha1 Sep 12 18:55:06.958309 kernel: ima: No architecture policies found Sep 12 18:55:06.958315 kernel: clk: Disabling unused clocks Sep 12 18:55:06.958320 kernel: Warning: unable to open an initial console. Sep 12 18:55:06.958326 kernel: Freeing unused kernel image (initmem) memory: 54040K Sep 12 18:55:06.958332 kernel: Write protecting the kernel read-only data: 24576k Sep 12 18:55:06.958338 kernel: Freeing unused kernel image (rodata/data gap) memory: 280K Sep 12 18:55:06.958344 kernel: Run /init as init process Sep 12 18:55:06.958350 kernel: with arguments: Sep 12 18:55:06.958356 kernel: /init Sep 12 18:55:06.958362 kernel: with environment: Sep 12 18:55:06.958368 kernel: HOME=/ Sep 12 18:55:06.958373 kernel: TERM=linux Sep 12 18:55:06.958379 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 12 18:55:06.958385 systemd[1]: Successfully made /usr/ read-only. Sep 12 18:55:06.958393 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 12 18:55:06.958400 systemd[1]: Detected architecture x86-64. Sep 12 18:55:06.958406 systemd[1]: Running in initrd. Sep 12 18:55:06.958412 systemd[1]: No hostname configured, using default hostname. Sep 12 18:55:06.958418 systemd[1]: Hostname set to . Sep 12 18:55:06.958424 systemd[1]: Initializing machine ID from random generator. Sep 12 18:55:06.958430 systemd[1]: Queued start job for default target initrd.target. Sep 12 18:55:06.958436 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
Sep 12 18:55:06.958443 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 18:55:06.958449 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 12 18:55:06.958456 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 18:55:06.958462 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 12 18:55:06.958468 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 12 18:55:06.958475 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 12 18:55:06.958481 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 12 18:55:06.958488 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 18:55:06.958495 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 18:55:06.958501 systemd[1]: Reached target paths.target - Path Units. Sep 12 18:55:06.958507 systemd[1]: Reached target slices.target - Slice Units. Sep 12 18:55:06.958513 systemd[1]: Reached target swap.target - Swaps. Sep 12 18:55:06.958519 systemd[1]: Reached target timers.target - Timer Units. Sep 12 18:55:06.958525 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 18:55:06.958531 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 18:55:06.958538 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 12 18:55:06.958544 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 12 18:55:06.958550 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 12 18:55:06.958556 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 18:55:06.958562 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 18:55:06.958568 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 18:55:06.958574 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 12 18:55:06.958580 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 12 18:55:06.958617 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 12 18:55:06.958626 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 12 18:55:06.958632 systemd[1]: Starting systemd-fsck-usr.service... Sep 12 18:55:06.958653 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 18:55:06.958659 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 12 18:55:06.958678 systemd-journald[299]: Collecting audit messages is disabled. Sep 12 18:55:06.958695 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 18:55:06.958701 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 12 18:55:06.958708 systemd-journald[299]: Journal started Sep 12 18:55:06.958721 systemd-journald[299]: Runtime Journal (/run/log/journal/77852ee482f346c6aa3d53195755f160) is 8M, max 639.3M, 631.3M free. 
Sep 12 18:55:06.949639 systemd-modules-load[301]: Inserted module 'overlay' Sep 12 18:55:06.974840 systemd[1]: Started systemd-journald.service - Journal Service. Sep 12 18:55:06.980590 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 12 18:55:06.981430 systemd-modules-load[301]: Inserted module 'br_netfilter' Sep 12 18:55:07.028803 kernel: Bridge firewalling registered Sep 12 18:55:07.013026 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 18:55:07.038889 systemd[1]: Finished systemd-fsck-usr.service. Sep 12 18:55:07.054876 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 12 18:55:07.077928 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 18:55:07.080541 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 12 18:55:07.097776 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 18:55:07.116501 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 12 18:55:07.139355 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 12 18:55:07.153710 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 12 18:55:07.160767 systemd-tmpfiles[320]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 12 18:55:07.161352 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 18:55:07.162030 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 18:55:07.163109 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 18:55:07.164128 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 12 18:55:07.167567 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 18:55:07.169978 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 18:55:07.185439 systemd-resolved[334]: Positive Trust Anchors: Sep 12 18:55:07.185444 systemd-resolved[334]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 18:55:07.185470 systemd-resolved[334]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 18:55:07.187160 systemd-resolved[334]: Defaulting to hostname 'linux'. Sep 12 18:55:07.209950 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 18:55:07.216181 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 18:55:07.236955 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... 
Sep 12 18:55:07.372471 dracut-cmdline[342]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=271a44cc8ea1639cfb6fdf777202a5f025fda0b3ce9b293cc4e0e7047aecb858 Sep 12 18:55:07.588642 kernel: SCSI subsystem initialized Sep 12 18:55:07.601644 kernel: Loading iSCSI transport class v2.0-870. Sep 12 18:55:07.613663 kernel: iscsi: registered transport (tcp) Sep 12 18:55:07.636289 kernel: iscsi: registered transport (qla4xxx) Sep 12 18:55:07.636307 kernel: QLogic iSCSI HBA Driver Sep 12 18:55:07.647271 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 12 18:55:07.683572 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 18:55:07.684828 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 12 18:55:07.815150 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 12 18:55:07.819241 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 12 18:55:07.941636 kernel: raid6: avx2x4 gen() 19901 MB/s Sep 12 18:55:07.962633 kernel: raid6: avx2x2 gen() 41891 MB/s Sep 12 18:55:07.988710 kernel: raid6: avx2x1 gen() 46835 MB/s Sep 12 18:55:07.988726 kernel: raid6: using algorithm avx2x1 gen() 46835 MB/s Sep 12 18:55:08.016808 kernel: raid6: .... xor() 24642 MB/s, rmw enabled Sep 12 18:55:08.016828 kernel: raid6: using avx2x2 recovery algorithm Sep 12 18:55:08.037655 kernel: xor: automatically using best checksumming function avx Sep 12 18:55:08.141636 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 12 18:55:08.144834 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 12 18:55:08.154560 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 18:55:08.205130 systemd-udevd[554]: Using default interface naming scheme 'v255'. Sep 12 18:55:08.208788 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 18:55:08.215493 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 12 18:55:08.275791 dracut-pre-trigger[565]: rd.md=0: removing MD RAID activation Sep 12 18:55:08.293654 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 18:55:08.308419 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 18:55:08.428813 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 18:55:08.451693 kernel: cryptd: max_cpu_qlen set to 1000 Sep 12 18:55:08.430983 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 12 18:55:08.510700 kernel: pps_core: LinuxPPS API ver. 1 registered Sep 12 18:55:08.510724 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Sep 12 18:55:08.510742 kernel: AES CTR mode by8 optimization enabled Sep 12 18:55:08.510755 kernel: libata version 3.00 loaded. 
Sep 12 18:55:08.510768 kernel: PTP clock support registered Sep 12 18:55:08.510783 kernel: ACPI: bus type USB registered Sep 12 18:55:08.510797 kernel: usbcore: registered new interface driver usbfs Sep 12 18:55:08.510810 kernel: usbcore: registered new interface driver hub Sep 12 18:55:08.510822 kernel: usbcore: registered new device driver usb Sep 12 18:55:08.465513 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 18:55:08.465590 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 18:55:08.510895 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 18:55:08.511721 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 18:55:08.734764 kernel: ahci 0000:00:17.0: version 3.0 Sep 12 18:55:08.734871 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Sep 12 18:55:08.734948 kernel: ahci 0000:00:17.0: AHCI vers 0001.0301, 32 command slots, 6 Gbps, SATA mode Sep 12 18:55:08.735023 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 1 Sep 12 18:55:08.735094 kernel: ahci 0000:00:17.0: 8/8 ports implemented (port mask 0xff) Sep 12 18:55:08.735165 kernel: xhci_hcd 0000:00:14.0: hcc params 0x200077c1 hci version 0x110 quirks 0x0000000000009810 Sep 12 18:55:08.735238 kernel: ahci 0000:00:17.0: flags: 64bit ncq sntf clo only pio slum part ems deso sadm sds apst Sep 12 18:55:08.735309 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Sep 12 18:55:08.735378 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 2 Sep 12 18:55:08.735447 kernel: scsi host0: ahci Sep 12 18:55:08.735523 kernel: xhci_hcd 0000:00:14.0: Host supports USB 3.1 Enhanced SuperSpeed Sep 12 18:55:08.735600 kernel: scsi host1: ahci Sep 12 18:55:08.735672 kernel: hub 1-0:1.0: USB hub found Sep 12 18:55:08.735756 kernel: scsi host2: ahci Sep 12 18:55:08.735826 kernel: hub 1-0:1.0: 16 ports detected Sep 12 18:55:08.735901 kernel: scsi host3: ahci Sep 12 18:55:08.735971 kernel: hub 2-0:1.0: USB hub found Sep 12 18:55:08.736051 kernel: scsi host4: ahci Sep 12 18:55:08.736121 kernel: hub 2-0:1.0: 10 ports detected Sep 12 18:55:08.736199 kernel: scsi host5: ahci Sep 12 18:55:08.736267 kernel: scsi host6: ahci Sep 12 18:55:08.736334 kernel: scsi host7: ahci Sep 12 18:55:08.736400 kernel: ata1: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516100 irq 129 lpm-pol 0 Sep 12 18:55:08.736409 kernel: ata2: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516180 irq 129 lpm-pol 0 Sep 12 18:55:08.736416 kernel: ata3: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516200 irq 129 lpm-pol 0 Sep 12 18:55:08.736423 kernel: ata4: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516280 irq 129 lpm-pol 0 Sep 12 18:55:08.736432 kernel: ata5: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516300 irq 129 lpm-pol 0 Sep 12 18:55:08.736440 kernel: ata6: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516380 irq 129 lpm-pol 0 Sep 12 18:55:08.736447 kernel: ata7: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516400 irq 129 lpm-pol 0 Sep 12 18:55:08.736454 kernel: ata8: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516480 irq 129 lpm-pol 0 Sep 12 18:55:08.736461 kernel: igb: Intel(R) Gigabit Ethernet Network Driver Sep 12 18:55:08.736468 kernel: igb: Copyright (c) 2007-2014 Intel Corporation. Sep 12 18:55:08.751894 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. 
Sep 12 18:55:08.793398 kernel: igb 0000:04:00.0: added PHC on eth0 Sep 12 18:55:08.793498 kernel: igb 0000:04:00.0: Intel(R) Gigabit Ethernet Network Connection Sep 12 18:55:08.793576 kernel: igb 0000:04:00.0: eth0: (PCIe:2.5Gb/s:Width x1) 3c:ec:ef:73:1c:30 Sep 12 18:55:08.793663 kernel: igb 0000:04:00.0: eth0: PBA No: 010000-000 Sep 12 18:55:08.793746 kernel: igb 0000:04:00.0: Using MSI-X interrupts. 4 rx queue(s), 4 tx queue(s) Sep 12 18:55:08.833927 kernel: igb 0000:05:00.0: added PHC on eth1 Sep 12 18:55:08.834029 kernel: igb 0000:05:00.0: Intel(R) Gigabit Ethernet Network Connection Sep 12 18:55:08.841361 kernel: igb 0000:05:00.0: eth1: (PCIe:2.5Gb/s:Width x1) 3c:ec:ef:73:1c:31 Sep 12 18:55:08.846855 kernel: igb 0000:05:00.0: eth1: PBA No: 010000-000 Sep 12 18:55:08.846951 kernel: usb 1-14: new high-speed USB device number 2 using xhci_hcd Sep 12 18:55:08.846969 kernel: igb 0000:05:00.0: Using MSI-X interrupts. 4 rx queue(s), 4 tx queue(s) Sep 12 18:55:08.878073 kernel: mlx5_core 0000:02:00.0: PTM is not supported by PCIe Sep 12 18:55:08.878186 kernel: mlx5_core 0000:02:00.0: firmware version: 14.28.2006 Sep 12 18:55:08.887167 kernel: mlx5_core 0000:02:00.0: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Sep 12 18:55:08.913069 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 18:55:08.987023 kernel: hub 1-14:1.0: USB hub found Sep 12 18:55:08.987175 kernel: hub 1-14:1.0: 4 ports detected Sep 12 18:55:09.033620 kernel: ata3: SATA link down (SStatus 0 SControl 300) Sep 12 18:55:09.033638 kernel: ata5: SATA link down (SStatus 0 SControl 300) Sep 12 18:55:09.039590 kernel: ata1: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Sep 12 18:55:09.045590 kernel: ata7: SATA link down (SStatus 0 SControl 300) Sep 12 18:55:09.051616 kernel: ata2: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Sep 12 18:55:09.057590 kernel: ata4: SATA link down (SStatus 0 SControl 300) Sep 12 18:55:09.063589 kernel: ata8: SATA link down (SStatus 0 SControl 300) Sep 12 18:55:09.069616 kernel: ata6: SATA link down (SStatus 0 SControl 300) Sep 12 18:55:09.074602 kernel: ata1.00: Model 'Micron_5300_MTFDDAK480TDT', rev ' D3MU001', applying quirks: zeroaftertrim Sep 12 18:55:09.091298 kernel: ata1.00: ATA-11: Micron_5300_MTFDDAK480TDT, D3MU001, max UDMA/133 Sep 12 18:55:09.092610 kernel: ata2.00: Model 'Micron_5300_MTFDDAK480TDT', rev ' D3MU001', applying quirks: zeroaftertrim Sep 12 18:55:09.108483 kernel: ata2.00: ATA-11: Micron_5300_MTFDDAK480TDT, D3MU001, max UDMA/133 Sep 12 18:55:09.119613 kernel: ata1.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Sep 12 18:55:09.119629 kernel: ata2.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Sep 12 18:55:09.137592 kernel: ata1.00: Features: NCQ-prio Sep 12 18:55:09.137617 kernel: ata2.00: Features: NCQ-prio Sep 12 18:55:09.152635 kernel: mlx5_core 0000:02:00.0: E-Switch: Total vports 10, per vport: max uc(1024) max mc(16384) Sep 12 18:55:09.157667 kernel: ata1.00: configured for UDMA/133 Sep 12 18:55:09.157697 kernel: mlx5_core 0000:02:00.0: Port module event: module 0, Cable plugged Sep 12 18:55:09.157787 kernel: ata2.00: configured for UDMA/133 Sep 12 18:55:09.157796 kernel: scsi 0:0:0:0: Direct-Access ATA Micron_5300_MTFD U001 PQ: 0 ANSI: 5 Sep 12 18:55:09.178656 kernel: scsi 1:0:0:0: Direct-Access ATA Micron_5300_MTFD U001 PQ: 0 ANSI: 5 Sep 12 18:55:09.194591 kernel: igb 0000:05:00.0 eno2: renamed from eth1 Sep 12 18:55:09.194690 kernel: igb 0000:04:00.0 eno1: renamed from eth0 Sep 12 18:55:09.194769 
kernel: ata2.00: Enabling discard_zeroes_data Sep 12 18:55:09.204310 kernel: ata1.00: Enabling discard_zeroes_data Sep 12 18:55:09.209018 kernel: sd 1:0:0:0: [sdb] 937703088 512-byte logical blocks: (480 GB/447 GiB) Sep 12 18:55:09.209122 kernel: sd 0:0:0:0: [sda] 937703088 512-byte logical blocks: (480 GB/447 GiB) Sep 12 18:55:09.223978 kernel: sd 1:0:0:0: [sdb] 4096-byte physical blocks Sep 12 18:55:09.224115 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks Sep 12 18:55:09.224191 kernel: sd 0:0:0:0: [sda] Write Protect is off Sep 12 18:55:09.234428 kernel: sd 1:0:0:0: [sdb] Write Protect is off Sep 12 18:55:09.234546 kernel: sd 0:0:0:0: [sda] Mode Sense: 00 3a 00 00 Sep 12 18:55:09.239262 kernel: sd 1:0:0:0: [sdb] Mode Sense: 00 3a 00 00 Sep 12 18:55:09.244656 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Sep 12 18:55:09.244748 kernel: sd 1:0:0:0: [sdb] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Sep 12 18:55:09.244822 kernel: sd 0:0:0:0: [sda] Preferred minimum I/O size 4096 bytes Sep 12 18:55:09.253753 kernel: sd 1:0:0:0: [sdb] Preferred minimum I/O size 4096 bytes Sep 12 18:55:09.283022 kernel: ata1.00: Enabling discard_zeroes_data Sep 12 18:55:09.288216 kernel: ata2.00: Enabling discard_zeroes_data Sep 12 18:55:09.313169 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 12 18:55:09.313188 kernel: sd 1:0:0:0: [sdb] Attached SCSI disk Sep 12 18:55:09.313276 kernel: GPT:9289727 != 937703087 Sep 12 18:55:09.313285 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 12 18:55:09.313292 kernel: GPT:9289727 != 937703087 Sep 12 18:55:09.313299 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 12 18:55:09.313308 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 18:55:09.337259 kernel: usb 1-14.1: new low-speed USB device number 3 using xhci_hcd Sep 12 18:55:09.337290 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Sep 12 18:55:09.384106 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Micron_5300_MTFDDAK480TDT ROOT. Sep 12 18:55:09.405786 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Micron_5300_MTFDDAK480TDT EFI-SYSTEM. Sep 12 18:55:09.466845 kernel: mlx5_core 0000:02:00.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic) Sep 12 18:55:09.466958 kernel: mlx5_core 0000:02:00.1: PTM is not supported by PCIe Sep 12 18:55:09.467042 kernel: mlx5_core 0000:02:00.1: firmware version: 14.28.2006 Sep 12 18:55:09.467124 kernel: mlx5_core 0000:02:00.1: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Sep 12 18:55:09.467204 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 12 18:55:09.467212 kernel: usbcore: registered new interface driver usbhid Sep 12 18:55:09.467220 kernel: usbhid: USB HID core driver Sep 12 18:55:09.418007 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Micron_5300_MTFDDAK480TDT OEM. Sep 12 18:55:09.486662 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.0/0003:0557:2419.0001/input/input0 Sep 12 18:55:09.509387 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Micron_5300_MTFDDAK480TDT USR-A. Sep 12 18:55:09.509493 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Micron_5300_MTFDDAK480TDT USR-A. 
Sep 12 18:55:09.564047 kernel: hid-generic 0003:0557:2419.0001: input,hidraw0: USB HID v1.00 Keyboard [HID 0557:2419] on usb-0000:00:14.0-14.1/input0 Sep 12 18:55:09.564151 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.1/0003:0557:2419.0002/input/input1 Sep 12 18:55:09.576962 kernel: hid-generic 0003:0557:2419.0002: input,hidraw1: USB HID v1.00 Mouse [HID 0557:2419] on usb-0000:00:14.0-14.1/input1 Sep 12 18:55:09.591103 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 12 18:55:09.623898 disk-uuid[779]: Primary Header is updated. Sep 12 18:55:09.623898 disk-uuid[779]: Secondary Entries is updated. Sep 12 18:55:09.623898 disk-uuid[779]: Secondary Header is updated. Sep 12 18:55:09.656704 kernel: ata1.00: Enabling discard_zeroes_data Sep 12 18:55:09.656718 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 18:55:09.656726 kernel: ata1.00: Enabling discard_zeroes_data Sep 12 18:55:09.669619 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 18:55:09.730598 kernel: mlx5_core 0000:02:00.1: E-Switch: Total vports 10, per vport: max uc(1024) max mc(16384) Sep 12 18:55:09.743036 kernel: mlx5_core 0000:02:00.1: Port module event: module 1, Cable plugged Sep 12 18:55:09.989664 kernel: mlx5_core 0000:02:00.1: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic) Sep 12 18:55:10.002618 kernel: mlx5_core 0000:02:00.0 enp2s0f0np0: renamed from eth0 Sep 12 18:55:10.003240 kernel: mlx5_core 0000:02:00.1 enp2s0f1np1: renamed from eth1 Sep 12 18:55:10.016969 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 12 18:55:10.017468 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 18:55:10.033985 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 18:55:10.071785 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 18:55:10.072956 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 12 18:55:10.140629 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 12 18:55:10.651109 kernel: ata1.00: Enabling discard_zeroes_data Sep 12 18:55:10.666476 disk-uuid[780]: The operation has completed successfully. Sep 12 18:55:10.673715 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 18:55:10.702246 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 12 18:55:10.702298 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 12 18:55:10.733477 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 12 18:55:10.772157 sh[817]: Success Sep 12 18:55:10.805022 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 12 18:55:10.805043 kernel: device-mapper: uevent: version 1.0.3 Sep 12 18:55:10.814274 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 12 18:55:10.826653 kernel: device-mapper: verity: sha256 using shash "sha256-avx2" Sep 12 18:55:10.875997 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 12 18:55:10.877226 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 12 18:55:10.912961 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Sep 12 18:55:10.962699 kernel: BTRFS: device fsid 74707491-1b86-4926-8bdb-c533ce2a0c32 devid 1 transid 38 /dev/mapper/usr (254:0) scanned by mount (830) Sep 12 18:55:10.962717 kernel: BTRFS info (device dm-0): first mount of filesystem 74707491-1b86-4926-8bdb-c533ce2a0c32 Sep 12 18:55:10.962725 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 12 18:55:10.980825 kernel: BTRFS info (device dm-0): enabling ssd optimizations Sep 12 18:55:10.980843 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 12 18:55:10.986941 kernel: BTRFS info (device dm-0): enabling free space tree Sep 12 18:55:10.989178 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 12 18:55:10.989446 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 12 18:55:11.022027 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 12 18:55:11.024343 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 12 18:55:11.039925 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 12 18:55:11.087643 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (853) Sep 12 18:55:11.104962 kernel: BTRFS info (device sda6): first mount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761 Sep 12 18:55:11.104980 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 18:55:11.120593 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 12 18:55:11.120665 kernel: BTRFS info (device sda6): turning on async discard Sep 12 18:55:11.126799 kernel: BTRFS info (device sda6): enabling free space tree Sep 12 18:55:11.140670 kernel: BTRFS info (device sda6): last unmount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761 Sep 12 18:55:11.141070 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 12 18:55:11.141927 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 12 18:55:11.157004 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 18:55:11.184956 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 18:55:11.231772 systemd-networkd[1000]: lo: Link UP Sep 12 18:55:11.231777 systemd-networkd[1000]: lo: Gained carrier Sep 12 18:55:11.234977 systemd-networkd[1000]: Enumeration completed Sep 12 18:55:11.235054 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 18:55:11.235536 systemd-networkd[1000]: eno1: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 18:55:11.266368 ignition[967]: Ignition 2.21.0 Sep 12 18:55:11.237905 systemd[1]: Reached target network.target - Network. Sep 12 18:55:11.266373 ignition[967]: Stage: fetch-offline Sep 12 18:55:11.263461 systemd-networkd[1000]: eno2: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 18:55:11.266396 ignition[967]: no configs at "/usr/lib/ignition/base.d" Sep 12 18:55:11.268439 unknown[967]: fetched base config from "system" Sep 12 18:55:11.266401 ignition[967]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 12 18:55:11.268443 unknown[967]: fetched user config from "system" Sep 12 18:55:11.266453 ignition[967]: parsed url from cmdline: "" Sep 12 18:55:11.269560 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). 
Sep 12 18:55:11.266456 ignition[967]: no config URL provided Sep 12 18:55:11.275214 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Sep 12 18:55:11.266459 ignition[967]: reading system config file "/usr/lib/ignition/user.ign" Sep 12 18:55:11.275791 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 12 18:55:11.266489 ignition[967]: parsing config with SHA512: a0b0a94e8ac2acc4d85b2227df1d32f22b48e18edbf46556699defff1a5462e987ed3b82788ab4d0ce26cbd94b189600610b742716a1b1ac0dfa38be677550e9 Sep 12 18:55:11.291640 systemd-networkd[1000]: enp2s0f0np0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 18:55:11.268646 ignition[967]: fetch-offline: fetch-offline passed Sep 12 18:55:11.268648 ignition[967]: POST message to Packet Timeline Sep 12 18:55:11.268651 ignition[967]: POST Status error: resource requires networking Sep 12 18:55:11.268686 ignition[967]: Ignition finished successfully Sep 12 18:55:11.456803 kernel: mlx5_core 0000:02:00.0 enp2s0f0np0: Link up Sep 12 18:55:11.330453 ignition[1016]: Ignition 2.21.0 Sep 12 18:55:11.330459 ignition[1016]: Stage: kargs Sep 12 18:55:11.459405 systemd-networkd[1000]: enp2s0f1np1: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 18:55:11.330548 ignition[1016]: no configs at "/usr/lib/ignition/base.d" Sep 12 18:55:11.330555 ignition[1016]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 12 18:55:11.331517 ignition[1016]: kargs: kargs passed Sep 12 18:55:11.331528 ignition[1016]: POST message to Packet Timeline Sep 12 18:55:11.331570 ignition[1016]: GET https://metadata.packet.net/metadata: attempt #1 Sep 12 18:55:11.332169 ignition[1016]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:36330->[::1]:53: read: connection refused Sep 12 18:55:11.533202 ignition[1016]: GET https://metadata.packet.net/metadata: attempt #2 Sep 12 18:55:11.534199 ignition[1016]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:47667->[::1]:53: read: connection refused Sep 12 18:55:11.678628 kernel: mlx5_core 0000:02:00.1 enp2s0f1np1: Link up Sep 12 18:55:11.682704 systemd-networkd[1000]: eno1: Link UP Sep 12 18:55:11.682873 systemd-networkd[1000]: eno2: Link UP Sep 12 18:55:11.683027 systemd-networkd[1000]: enp2s0f0np0: Link UP Sep 12 18:55:11.683206 systemd-networkd[1000]: enp2s0f0np0: Gained carrier Sep 12 18:55:11.702056 systemd-networkd[1000]: enp2s0f1np1: Link UP Sep 12 18:55:11.703207 systemd-networkd[1000]: enp2s0f1np1: Gained carrier Sep 12 18:55:11.739824 systemd-networkd[1000]: enp2s0f0np0: DHCPv4 address 139.178.94.145/31, gateway 139.178.94.144 acquired from 145.40.83.140 Sep 12 18:55:11.934673 ignition[1016]: GET https://metadata.packet.net/metadata: attempt #3 Sep 12 18:55:11.935870 ignition[1016]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:54318->[::1]:53: read: connection refused Sep 12 18:55:12.736166 ignition[1016]: GET https://metadata.packet.net/metadata: attempt #4 Sep 12 18:55:12.737286 ignition[1016]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:41494->[::1]:53: read: connection refused Sep 12 18:55:13.221114 systemd-networkd[1000]: enp2s0f1np1: Gained IPv6LL Sep 12 18:55:13.285121 
systemd-networkd[1000]: enp2s0f0np0: Gained IPv6LL Sep 12 18:55:14.338939 ignition[1016]: GET https://metadata.packet.net/metadata: attempt #5 Sep 12 18:55:14.340160 ignition[1016]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:57968->[::1]:53: read: connection refused Sep 12 18:55:17.543468 ignition[1016]: GET https://metadata.packet.net/metadata: attempt #6 Sep 12 18:55:18.654526 ignition[1016]: GET result: OK Sep 12 18:55:19.148884 ignition[1016]: Ignition finished successfully Sep 12 18:55:19.154493 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 12 18:55:19.166575 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 12 18:55:19.219970 ignition[1036]: Ignition 2.21.0 Sep 12 18:55:19.219976 ignition[1036]: Stage: disks Sep 12 18:55:19.220071 ignition[1036]: no configs at "/usr/lib/ignition/base.d" Sep 12 18:55:19.220078 ignition[1036]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 12 18:55:19.220625 ignition[1036]: disks: disks passed Sep 12 18:55:19.220629 ignition[1036]: POST message to Packet Timeline Sep 12 18:55:19.220641 ignition[1036]: GET https://metadata.packet.net/metadata: attempt #1 Sep 12 18:55:20.234149 ignition[1036]: GET result: OK Sep 12 18:55:20.645275 ignition[1036]: Ignition finished successfully Sep 12 18:55:20.650411 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 12 18:55:20.661933 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 12 18:55:20.669130 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 12 18:55:20.686090 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 18:55:20.715046 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 18:55:20.731033 systemd[1]: Reached target basic.target - Basic System. Sep 12 18:55:20.750665 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 12 18:55:20.798695 systemd-fsck[1056]: ROOT: clean, 15/553520 files, 52789/553472 blocks Sep 12 18:55:20.809120 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 12 18:55:20.809894 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 12 18:55:20.924492 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 12 18:55:20.937817 kernel: EXT4-fs (sda9): mounted filesystem 26739aba-b0be-4ce3-bfbd-ca4dbcbe2426 r/w with ordered data mode. Quota mode: none. Sep 12 18:55:20.924806 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 12 18:55:20.939217 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 18:55:20.957900 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 12 18:55:20.984261 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Sep 12 18:55:21.033890 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/sda6 (8:6) scanned by mount (1065) Sep 12 18:55:21.033904 kernel: BTRFS info (device sda6): first mount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761 Sep 12 18:55:21.033912 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 18:55:21.033921 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 12 18:55:20.985181 systemd[1]: Starting flatcar-static-network.service - Flatcar Static Network Agent... 
Sep 12 18:55:21.046455 kernel: BTRFS info (device sda6): turning on async discard Sep 12 18:55:21.046467 kernel: BTRFS info (device sda6): enabling free space tree Sep 12 18:55:21.046457 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 12 18:55:21.046477 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 18:55:21.066939 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 12 18:55:21.123865 coreos-metadata[1067]: Sep 12 18:55:21.119 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Sep 12 18:55:21.092881 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 12 18:55:21.152680 coreos-metadata[1068]: Sep 12 18:55:21.118 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Sep 12 18:55:21.117492 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 12 18:55:21.181472 initrd-setup-root[1097]: cut: /sysroot/etc/passwd: No such file or directory Sep 12 18:55:21.189696 initrd-setup-root[1104]: cut: /sysroot/etc/group: No such file or directory Sep 12 18:55:21.198705 initrd-setup-root[1111]: cut: /sysroot/etc/shadow: No such file or directory Sep 12 18:55:21.207710 initrd-setup-root[1118]: cut: /sysroot/etc/gshadow: No such file or directory Sep 12 18:55:21.253263 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 12 18:55:21.254105 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 12 18:55:21.271405 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 12 18:55:21.300318 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 12 18:55:21.317664 kernel: BTRFS info (device sda6): last unmount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761 Sep 12 18:55:21.325382 ignition[1186]: INFO : Ignition 2.21.0 Sep 12 18:55:21.325382 ignition[1186]: INFO : Stage: mount Sep 12 18:55:21.331839 ignition[1186]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 18:55:21.331839 ignition[1186]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 12 18:55:21.331839 ignition[1186]: INFO : mount: mount passed Sep 12 18:55:21.331839 ignition[1186]: INFO : POST message to Packet Timeline Sep 12 18:55:21.331839 ignition[1186]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Sep 12 18:55:21.328891 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 12 18:55:22.176278 coreos-metadata[1068]: Sep 12 18:55:22.176 INFO Fetch successful Sep 12 18:55:22.208322 coreos-metadata[1067]: Sep 12 18:55:22.208 INFO Fetch successful Sep 12 18:55:22.213730 systemd[1]: flatcar-static-network.service: Deactivated successfully. Sep 12 18:55:22.213803 systemd[1]: Finished flatcar-static-network.service - Flatcar Static Network Agent. Sep 12 18:55:22.239693 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 12 18:55:22.246957 coreos-metadata[1067]: Sep 12 18:55:22.238 INFO wrote hostname ci-4426.1.0-a-3db2d8461d to /sysroot/etc/hostname Sep 12 18:55:22.316839 ignition[1186]: INFO : GET result: OK Sep 12 18:55:22.767817 ignition[1186]: INFO : Ignition finished successfully Sep 12 18:55:22.772145 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 12 18:55:22.789804 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 12 18:55:22.826761 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... 
Sep 12 18:55:22.879554 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/sda6 (8:6) scanned by mount (1213) Sep 12 18:55:22.879595 kernel: BTRFS info (device sda6): first mount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761 Sep 12 18:55:22.887675 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 18:55:22.903832 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 12 18:55:22.903849 kernel: BTRFS info (device sda6): turning on async discard Sep 12 18:55:22.909944 kernel: BTRFS info (device sda6): enabling free space tree Sep 12 18:55:22.911874 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 12 18:55:22.950924 ignition[1230]: INFO : Ignition 2.21.0 Sep 12 18:55:22.950924 ignition[1230]: INFO : Stage: files Sep 12 18:55:22.962841 ignition[1230]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 18:55:22.962841 ignition[1230]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 12 18:55:22.962841 ignition[1230]: DEBUG : files: compiled without relabeling support, skipping Sep 12 18:55:22.962841 ignition[1230]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 12 18:55:22.962841 ignition[1230]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 12 18:55:22.962841 ignition[1230]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 12 18:55:22.962841 ignition[1230]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 12 18:55:22.962841 ignition[1230]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 12 18:55:22.962841 ignition[1230]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Sep 12 18:55:22.962841 ignition[1230]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Sep 12 18:55:22.954856 unknown[1230]: wrote ssh authorized keys file for user: core Sep 12 18:55:23.230817 ignition[1230]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 12 18:55:23.432675 ignition[1230]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Sep 12 18:55:23.432675 ignition[1230]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 12 18:55:23.463823 ignition[1230]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 12 18:55:23.463823 ignition[1230]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 12 18:55:23.463823 ignition[1230]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 12 18:55:23.463823 ignition[1230]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 18:55:23.463823 ignition[1230]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 18:55:23.463823 ignition[1230]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 18:55:23.463823 ignition[1230]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file 
"/sysroot/home/core/nfs-pvc.yaml" Sep 12 18:55:23.463823 ignition[1230]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 18:55:23.463823 ignition[1230]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 18:55:23.463823 ignition[1230]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 12 18:55:23.463823 ignition[1230]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 12 18:55:23.463823 ignition[1230]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 12 18:55:23.463823 ignition[1230]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1 Sep 12 18:55:24.035872 ignition[1230]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 12 18:55:24.365426 ignition[1230]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 12 18:55:24.365426 ignition[1230]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 12 18:55:24.394831 ignition[1230]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 18:55:24.394831 ignition[1230]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 18:55:24.394831 ignition[1230]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 12 18:55:24.394831 ignition[1230]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Sep 12 18:55:24.394831 ignition[1230]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Sep 12 18:55:24.394831 ignition[1230]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 12 18:55:24.394831 ignition[1230]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 12 18:55:24.394831 ignition[1230]: INFO : files: files passed Sep 12 18:55:24.394831 ignition[1230]: INFO : POST message to Packet Timeline Sep 12 18:55:24.394831 ignition[1230]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Sep 12 18:55:25.326439 ignition[1230]: INFO : GET result: OK Sep 12 18:55:25.758873 ignition[1230]: INFO : Ignition finished successfully Sep 12 18:55:25.762966 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 12 18:55:25.780464 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 12 18:55:25.803236 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 12 18:55:25.829987 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 12 18:55:25.830084 systemd[1]: Finished ignition-quench.service - Ignition (record completion). 
Sep 12 18:55:25.847248 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 18:55:25.876865 initrd-setup-root-after-ignition[1269]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 18:55:25.876865 initrd-setup-root-after-ignition[1269]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 12 18:55:25.859078 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 12 18:55:25.928783 initrd-setup-root-after-ignition[1273]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 18:55:25.888244 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 12 18:55:25.952982 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 12 18:55:25.953040 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 12 18:55:25.970966 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 12 18:55:25.996754 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 12 18:55:25.996973 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 12 18:55:25.998214 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 12 18:55:26.087808 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 18:55:26.092272 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 12 18:55:26.166526 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 12 18:55:26.167078 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 18:55:26.197342 systemd[1]: Stopped target timers.target - Timer Units. Sep 12 18:55:26.214329 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 12 18:55:26.214821 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 18:55:26.240415 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 12 18:55:26.259279 systemd[1]: Stopped target basic.target - Basic System. Sep 12 18:55:26.276222 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 12 18:55:26.293283 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 18:55:26.313290 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 12 18:55:26.322621 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 12 18:55:26.351231 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 12 18:55:26.360619 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 18:55:26.387326 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 12 18:55:26.397651 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 12 18:55:26.422218 systemd[1]: Stopped target swap.target - Swaps. Sep 12 18:55:26.439127 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 12 18:55:26.439535 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 12 18:55:26.463399 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 12 18:55:26.481251 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 18:55:26.501163 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. 
Sep 12 18:55:26.501641 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 18:55:26.522115 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 12 18:55:26.522517 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 12 18:55:26.552306 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 12 18:55:26.552799 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 18:55:26.560529 systemd[1]: Stopped target paths.target - Path Units. Sep 12 18:55:26.585090 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 12 18:55:26.585581 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 18:55:26.604219 systemd[1]: Stopped target slices.target - Slice Units. Sep 12 18:55:26.621285 systemd[1]: Stopped target sockets.target - Socket Units. Sep 12 18:55:26.638199 systemd[1]: iscsid.socket: Deactivated successfully. Sep 12 18:55:26.638508 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 18:55:26.656321 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 12 18:55:26.656661 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 18:55:26.677413 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 12 18:55:26.677882 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 18:55:26.790867 ignition[1294]: INFO : Ignition 2.21.0 Sep 12 18:55:26.790867 ignition[1294]: INFO : Stage: umount Sep 12 18:55:26.790867 ignition[1294]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 18:55:26.790867 ignition[1294]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 12 18:55:26.790867 ignition[1294]: INFO : umount: umount passed Sep 12 18:55:26.790867 ignition[1294]: INFO : POST message to Packet Timeline Sep 12 18:55:26.790867 ignition[1294]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Sep 12 18:55:26.694214 systemd[1]: ignition-files.service: Deactivated successfully. Sep 12 18:55:26.694580 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 12 18:55:26.701445 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Sep 12 18:55:26.701885 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 12 18:55:26.719025 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 12 18:55:26.741955 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 12 18:55:26.753869 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 12 18:55:26.753997 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 18:55:26.780008 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 12 18:55:26.780076 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 18:55:26.809609 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 12 18:55:26.810169 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 12 18:55:26.810223 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 12 18:55:26.841270 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 12 18:55:26.841370 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. 
Sep 12 18:55:28.083691 ignition[1294]: INFO : GET result: OK Sep 12 18:55:28.543167 ignition[1294]: INFO : Ignition finished successfully Sep 12 18:55:28.546874 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 12 18:55:28.547190 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 12 18:55:28.562874 systemd[1]: Stopped target network.target - Network. Sep 12 18:55:28.569105 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 12 18:55:28.569296 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 12 18:55:28.583186 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 12 18:55:28.583344 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 12 18:55:28.597182 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 12 18:55:28.597347 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 12 18:55:28.613383 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 12 18:55:28.613540 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 12 18:55:28.638104 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 12 18:55:28.638292 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 12 18:55:28.654430 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 12 18:55:28.672240 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 12 18:55:28.688778 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 12 18:55:28.689085 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 12 18:55:28.711835 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 12 18:55:28.712708 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 12 18:55:28.713012 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 12 18:55:28.719145 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 12 18:55:28.721519 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 12 18:55:28.741303 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 12 18:55:28.741412 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 12 18:55:28.752285 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 12 18:55:28.766888 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 12 18:55:28.766921 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 18:55:28.791990 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 12 18:55:28.792060 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 12 18:55:28.809198 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 12 18:55:28.809282 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 12 18:55:28.835130 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 12 18:55:28.835315 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 18:55:28.856364 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 18:55:28.878566 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 12 18:55:28.878785 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. 
Sep 12 18:55:28.880005 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 12 18:55:28.880428 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 18:55:28.887898 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 12 18:55:28.888099 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 12 18:55:28.904184 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 12 18:55:28.904296 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 18:55:28.936841 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 12 18:55:28.936984 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 12 18:55:28.962199 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 12 18:55:28.962378 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 12 18:55:28.990187 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 12 18:55:28.990373 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 18:55:29.286797 systemd-journald[299]: Received SIGTERM from PID 1 (systemd). Sep 12 18:55:29.018425 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 12 18:55:29.024862 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 12 18:55:29.024890 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 18:55:29.059934 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 12 18:55:29.059979 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 18:55:29.070055 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Sep 12 18:55:29.070114 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 18:55:29.100155 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 12 18:55:29.100294 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 18:55:29.119875 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 18:55:29.120015 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 18:55:29.144232 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Sep 12 18:55:29.144393 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. Sep 12 18:55:29.144504 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Sep 12 18:55:29.144629 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 12 18:55:29.145881 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 12 18:55:29.146111 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 12 18:55:29.168344 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 12 18:55:29.168658 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 12 18:55:29.170967 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 12 18:55:29.196844 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 12 18:55:29.237479 systemd[1]: Switching root. 
Sep 12 18:55:29.420904 systemd-journald[299]: Journal stopped Sep 12 18:55:31.195883 kernel: SELinux: policy capability network_peer_controls=1 Sep 12 18:55:31.195900 kernel: SELinux: policy capability open_perms=1 Sep 12 18:55:31.195908 kernel: SELinux: policy capability extended_socket_class=1 Sep 12 18:55:31.195914 kernel: SELinux: policy capability always_check_network=0 Sep 12 18:55:31.195919 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 12 18:55:31.195925 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 12 18:55:31.195932 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 12 18:55:31.195939 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 12 18:55:31.195945 kernel: SELinux: policy capability userspace_initial_context=0 Sep 12 18:55:31.195951 kernel: audit: type=1403 audit(1757703329.546:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 12 18:55:31.195958 systemd[1]: Successfully loaded SELinux policy in 96.928ms. Sep 12 18:55:31.195966 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 3.978ms. Sep 12 18:55:31.195973 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 12 18:55:31.195980 systemd[1]: Detected architecture x86-64. Sep 12 18:55:31.195989 systemd[1]: Detected first boot. Sep 12 18:55:31.195995 systemd[1]: Hostname set to . Sep 12 18:55:31.196002 systemd[1]: Initializing machine ID from random generator. Sep 12 18:55:31.196009 zram_generator::config[1348]: No configuration found. Sep 12 18:55:31.196018 systemd[1]: Populated /etc with preset unit settings. Sep 12 18:55:31.196025 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 12 18:55:31.196032 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 12 18:55:31.196038 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 12 18:55:31.196045 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 12 18:55:31.196052 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 12 18:55:31.196058 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 12 18:55:31.196067 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 12 18:55:31.196074 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 12 18:55:31.196081 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 12 18:55:31.196088 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 12 18:55:31.196095 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 12 18:55:31.196102 systemd[1]: Created slice user.slice - User and Session Slice. Sep 12 18:55:31.196108 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 18:55:31.196115 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 18:55:31.196124 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. 
Sep 12 18:55:31.196131 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 12 18:55:31.196138 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 12 18:55:31.196145 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 18:55:31.196151 systemd[1]: Expecting device dev-ttyS1.device - /dev/ttyS1... Sep 12 18:55:31.196159 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 18:55:31.196165 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 18:55:31.196176 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 12 18:55:31.196183 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 12 18:55:31.196191 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 12 18:55:31.196199 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 12 18:55:31.196206 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 18:55:31.196213 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 18:55:31.196220 systemd[1]: Reached target slices.target - Slice Units. Sep 12 18:55:31.196227 systemd[1]: Reached target swap.target - Swaps. Sep 12 18:55:31.196234 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 12 18:55:31.196242 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 12 18:55:31.196249 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 12 18:55:31.196257 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 12 18:55:31.196264 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 18:55:31.196271 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 18:55:31.196279 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 12 18:55:31.196286 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 12 18:55:31.196294 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 12 18:55:31.196301 systemd[1]: Mounting media.mount - External Media Directory... Sep 12 18:55:31.196308 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 18:55:31.196315 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 12 18:55:31.196322 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 12 18:55:31.196329 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 12 18:55:31.196338 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 12 18:55:31.196345 systemd[1]: Reached target machines.target - Containers. Sep 12 18:55:31.196352 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 12 18:55:31.196359 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 18:55:31.196367 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 12 18:55:31.196374 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... 
Sep 12 18:55:31.196381 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 18:55:31.196388 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 18:55:31.196397 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 18:55:31.196404 kernel: ACPI: bus type drm_connector registered Sep 12 18:55:31.196411 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 12 18:55:31.196418 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 18:55:31.196425 kernel: fuse: init (API version 7.41) Sep 12 18:55:31.196431 kernel: loop: module loaded Sep 12 18:55:31.196438 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 12 18:55:31.196446 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 12 18:55:31.196453 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 12 18:55:31.196461 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 12 18:55:31.196468 systemd[1]: Stopped systemd-fsck-usr.service. Sep 12 18:55:31.196476 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 18:55:31.196484 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 18:55:31.196491 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 12 18:55:31.196509 systemd-journald[1451]: Collecting audit messages is disabled. Sep 12 18:55:31.196527 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 12 18:55:31.196535 systemd-journald[1451]: Journal started Sep 12 18:55:31.196552 systemd-journald[1451]: Runtime Journal (/run/log/journal/f2d511b9e9b14e23839956978f4dcfcf) is 8M, max 639.3M, 631.3M free. Sep 12 18:55:30.042450 systemd[1]: Queued start job for default target multi-user.target. Sep 12 18:55:30.056686 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Sep 12 18:55:30.057488 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 12 18:55:31.226639 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 12 18:55:31.246627 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 12 18:55:31.269650 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 18:55:31.289822 systemd[1]: verity-setup.service: Deactivated successfully. Sep 12 18:55:31.289847 systemd[1]: Stopped verity-setup.service. Sep 12 18:55:31.313657 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 18:55:31.321626 systemd[1]: Started systemd-journald.service - Journal Service. Sep 12 18:55:31.330103 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 12 18:55:31.338723 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 12 18:55:31.347727 systemd[1]: Mounted media.mount - External Media Directory. Sep 12 18:55:31.355737 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. 
Sep 12 18:55:31.364889 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 12 18:55:31.373871 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 12 18:55:31.382958 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 12 18:55:31.393941 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 18:55:31.404950 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 12 18:55:31.405075 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 12 18:55:31.415047 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 18:55:31.415190 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 18:55:31.425042 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 18:55:31.425224 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 18:55:31.434067 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 18:55:31.434272 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 18:55:31.445328 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 12 18:55:31.445703 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 12 18:55:31.456526 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 18:55:31.457027 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 18:55:31.466613 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 12 18:55:31.476550 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 18:55:31.487707 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 12 18:55:31.498567 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 12 18:55:31.509548 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 18:55:31.528233 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 12 18:55:31.537530 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 12 18:55:31.552963 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 12 18:55:31.561777 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 12 18:55:31.561807 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 18:55:31.571948 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Sep 12 18:55:31.583379 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 12 18:55:31.592855 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 18:55:31.608646 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 12 18:55:31.631979 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 12 18:55:31.642857 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 18:55:31.653907 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... 
Sep 12 18:55:31.656736 systemd-journald[1451]: Time spent on flushing to /var/log/journal/f2d511b9e9b14e23839956978f4dcfcf is 12.306ms for 1430 entries. Sep 12 18:55:31.656736 systemd-journald[1451]: System Journal (/var/log/journal/f2d511b9e9b14e23839956978f4dcfcf) is 8M, max 195.6M, 187.6M free. Sep 12 18:55:31.680299 systemd-journald[1451]: Received client request to flush runtime journal. Sep 12 18:55:31.670706 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 18:55:31.683008 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 18:55:31.692463 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 12 18:55:31.704259 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 12 18:55:31.714592 kernel: loop0: detected capacity change from 0 to 8 Sep 12 18:55:31.725642 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 12 18:55:31.725937 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 12 18:55:31.735761 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 12 18:55:31.746406 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 12 18:55:31.749080 systemd-tmpfiles[1491]: ACLs are not supported, ignoring. Sep 12 18:55:31.749090 systemd-tmpfiles[1491]: ACLs are not supported, ignoring. Sep 12 18:55:31.758663 kernel: loop1: detected capacity change from 0 to 128016 Sep 12 18:55:31.761904 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 12 18:55:31.771823 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 12 18:55:31.780862 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 18:55:31.792359 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 12 18:55:31.803405 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 12 18:55:31.818674 kernel: loop2: detected capacity change from 0 to 111000 Sep 12 18:55:31.825859 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 12 18:55:31.845325 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 12 18:55:31.845731 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 12 18:55:31.865213 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 12 18:55:31.874656 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 18:55:31.885656 kernel: loop3: detected capacity change from 0 to 224512 Sep 12 18:55:31.906887 systemd-tmpfiles[1508]: ACLs are not supported, ignoring. Sep 12 18:55:31.906898 systemd-tmpfiles[1508]: ACLs are not supported, ignoring. Sep 12 18:55:31.908580 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 18:55:31.932638 kernel: loop4: detected capacity change from 0 to 8 Sep 12 18:55:31.939591 kernel: loop5: detected capacity change from 0 to 128016 Sep 12 18:55:31.957625 kernel: loop6: detected capacity change from 0 to 111000 Sep 12 18:55:31.977641 kernel: loop7: detected capacity change from 0 to 224512 Sep 12 18:55:31.991681 (sd-merge)[1512]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-packet'. 
Sep 12 18:55:31.991982 (sd-merge)[1512]: Merged extensions into '/usr'. Sep 12 18:55:31.995670 systemd[1]: Reload requested from client PID 1488 ('systemd-sysext') (unit systemd-sysext.service)... Sep 12 18:55:31.995680 systemd[1]: Reloading... Sep 12 18:55:31.996415 ldconfig[1482]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 12 18:55:32.023599 zram_generator::config[1538]: No configuration found. Sep 12 18:55:32.152950 systemd[1]: Reloading finished in 156 ms. Sep 12 18:55:32.170503 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 12 18:55:32.179983 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 12 18:55:32.190957 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 12 18:55:32.219580 systemd[1]: Starting ensure-sysext.service... Sep 12 18:55:32.227788 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 12 18:55:32.255854 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 18:55:32.273733 systemd-tmpfiles[1597]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 12 18:55:32.273771 systemd-tmpfiles[1597]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 12 18:55:32.273999 systemd-tmpfiles[1597]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 12 18:55:32.274229 systemd-tmpfiles[1597]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 12 18:55:32.274944 systemd-tmpfiles[1597]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 12 18:55:32.275178 systemd-tmpfiles[1597]: ACLs are not supported, ignoring. Sep 12 18:55:32.275229 systemd-tmpfiles[1597]: ACLs are not supported, ignoring. Sep 12 18:55:32.278289 systemd-tmpfiles[1597]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 18:55:32.278294 systemd-tmpfiles[1597]: Skipping /boot Sep 12 18:55:32.282872 systemd-tmpfiles[1597]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 18:55:32.282877 systemd-tmpfiles[1597]: Skipping /boot Sep 12 18:55:32.285280 systemd[1]: Reload requested from client PID 1596 ('systemctl') (unit ensure-sysext.service)... Sep 12 18:55:32.285288 systemd[1]: Reloading... Sep 12 18:55:32.297148 systemd-udevd[1598]: Using default interface naming scheme 'v255'. Sep 12 18:55:32.317642 zram_generator::config[1625]: No configuration found. 
Sep 12 18:55:32.369647 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input2 Sep 12 18:55:32.369717 kernel: ACPI: button: Sleep Button [SLPB] Sep 12 18:55:32.377705 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Sep 12 18:55:32.384698 kernel: ACPI: video: Video Device [GFX0] (multi-head: yes rom: no post: no) Sep 12 18:55:32.384760 kernel: mousedev: PS/2 mouse device common for all mice Sep 12 18:55:32.384788 kernel: ACPI: button: Power Button [PWRF] Sep 12 18:55:32.400683 kernel: input: Video Bus as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0A08:00/LNXVIDEO:00/input/input4 Sep 12 18:55:32.433191 kernel: mei_me 0000:00:16.0: Device doesn't have valid ME Interface Sep 12 18:55:32.433450 kernel: mei_me 0000:00:16.4: Device doesn't have valid ME Interface Sep 12 18:55:32.443571 kernel: IPMI message handler: version 39.2 Sep 12 18:55:32.443637 kernel: i801_smbus 0000:00:1f.4: SPD Write Disable is set Sep 12 18:55:32.452594 kernel: i801_smbus 0000:00:1f.4: SMBus using PCI interrupt Sep 12 18:55:32.463597 kernel: ipmi device interface Sep 12 18:55:32.463664 kernel: iTCO_vendor_support: vendor-support=0 Sep 12 18:55:32.463690 kernel: MACsec IEEE 802.1AE Sep 12 18:55:32.501777 kernel: ipmi_si: IPMI System Interface driver Sep 12 18:55:32.501831 kernel: ipmi_si dmi-ipmi-si.0: ipmi_platform: probing via SMBIOS Sep 12 18:55:32.509381 kernel: ipmi_platform: ipmi_si: SMBIOS: io 0xca2 regsize 1 spacing 1 irq 0 Sep 12 18:55:32.515745 kernel: ipmi_si: Adding SMBIOS-specified kcs state machine Sep 12 18:55:32.522081 kernel: ipmi_si IPI0001:00: ipmi_platform: probing via ACPI Sep 12 18:55:32.522599 kernel: ipmi_si IPI0001:00: ipmi_platform: [io 0x0ca2] regsize 1 spacing 1 irq 0 Sep 12 18:55:32.531597 kernel: ipmi_si dmi-ipmi-si.0: Removing SMBIOS-specified kcs state machine in favor of ACPI Sep 12 18:55:32.539104 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Micron_5300_MTFDDAK480TDT OEM. Sep 12 18:55:32.540590 kernel: ipmi_si: Adding ACPI-specified kcs state machine Sep 12 18:55:32.540617 kernel: ipmi_si: Trying ACPI-specified kcs state machine at i/o address 0xca2, slave address 0x20, irq 0 Sep 12 18:55:32.540633 kernel: iTCO_wdt iTCO_wdt: unable to reset NO_REBOOT flag, device disabled by hardware/BIOS Sep 12 18:55:32.573727 systemd[1]: Condition check resulted in dev-ttyS1.device - /dev/ttyS1 being skipped. Sep 12 18:55:32.574375 systemd[1]: Reloading finished in 288 ms. Sep 12 18:55:32.595851 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 18:55:32.609123 kernel: intel_rapl_common: Found RAPL domain package Sep 12 18:55:32.620802 kernel: intel_rapl_common: Found RAPL domain core Sep 12 18:55:32.620826 kernel: ipmi_si IPI0001:00: The BMC does not support clearing the recv irq bit, compensating, but the BMC needs to be fixed. Sep 12 18:55:32.620940 kernel: intel_rapl_common: Found RAPL domain uncore Sep 12 18:55:32.639309 kernel: ipmi_si IPI0001:00: IPMI message handler: Found new BMC (man_id: 0x002a7c, prod_id: 0x1b11, dev_id: 0x20) Sep 12 18:55:32.644709 kernel: intel_rapl_common: Found RAPL domain dram Sep 12 18:55:32.680600 kernel: ipmi_si IPI0001:00: IPMI kcs interface initialized Sep 12 18:55:32.682295 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 18:55:32.709162 systemd[1]: Finished ensure-sysext.service. 
Sep 12 18:55:32.714592 kernel: ipmi_ssif: IPMI SSIF Interface driver Sep 12 18:55:32.739247 systemd[1]: Reached target tpm2.target - Trusted Platform Module. Sep 12 18:55:32.747699 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 18:55:32.748375 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 12 18:55:32.977124 kernel: i915 0000:00:02.0: can't derive routing for PCI INT A Sep 12 18:55:32.977267 kernel: i915 0000:00:02.0: PCI INT A: not connected Sep 12 18:55:32.986907 kernel: i915 0000:00:02.0: [drm] Found COFFEELAKE (device ID 3e9a) display version 9.00 stepping N/A Sep 12 18:55:32.990132 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 12 18:55:33.001616 kernel: i915 0000:00:02.0: [drm] VT-d active for gfx access Sep 12 18:55:33.002028 kernel: i915 0000:00:02.0: [drm] Using Transparent Hugepages Sep 12 18:55:33.010159 augenrules[1824]: No rules Sep 12 18:55:33.021348 kernel: i915 0000:00:02.0: ROM [??? 0x00000000 flags 0x20000000]: can't assign; bogus alignment Sep 12 18:55:33.021757 kernel: i915 0000:00:02.0: [drm] Failed to find VBIOS tables (VBT) Sep 12 18:55:33.033369 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 18:55:33.033591 kernel: i915 0000:00:02.0: [drm] Finished loading DMC firmware i915/kbl_dmc_ver1_04.bin (v1.4) Sep 12 18:55:33.034101 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 18:55:33.043176 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 18:55:33.052175 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 18:55:33.063157 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 18:55:33.071715 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 18:55:33.072251 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 12 18:55:33.081626 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 18:55:33.082272 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 12 18:55:33.093648 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 18:55:33.094775 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 12 18:55:33.095806 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 12 18:55:33.120257 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 12 18:55:33.146711 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 18:55:33.155628 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 18:55:33.156422 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 18:55:33.156570 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 12 18:55:33.166990 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. 
Sep 12 18:55:33.167247 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 18:55:33.167361 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 18:55:33.167538 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 18:55:33.167721 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 18:55:33.167897 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 18:55:33.168000 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 18:55:33.168170 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 18:55:33.168275 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 18:55:33.168506 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 12 18:55:33.168702 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 12 18:55:33.173195 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 18:55:33.173276 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 18:55:33.174029 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 12 18:55:33.174969 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 12 18:55:33.174997 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 12 18:55:33.175226 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 12 18:55:33.192109 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 12 18:55:33.211249 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 12 18:55:33.264173 systemd-resolved[1837]: Positive Trust Anchors: Sep 12 18:55:33.264180 systemd-resolved[1837]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 18:55:33.264208 systemd-resolved[1837]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 18:55:33.267145 systemd-resolved[1837]: Using system hostname 'ci-4426.1.0-a-3db2d8461d'. Sep 12 18:55:33.270699 systemd-networkd[1836]: lo: Link UP Sep 12 18:55:33.270703 systemd-networkd[1836]: lo: Gained carrier Sep 12 18:55:33.273180 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 12 18:55:33.274103 systemd-networkd[1836]: bond0: netdev ready Sep 12 18:55:33.275261 systemd-networkd[1836]: Enumeration completed Sep 12 18:55:33.276327 systemd-networkd[1836]: enp2s0f0np0: Configuring with /etc/systemd/network/10-0c:42:a1:7e:a0:c0.network. Sep 12 18:55:33.283931 systemd[1]: Started systemd-resolved.service - Network Name Resolution. 
Sep 12 18:55:33.293659 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 18:55:33.302825 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 18:55:33.313875 systemd[1]: Reached target network.target - Network. Sep 12 18:55:33.321632 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 18:55:33.331639 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 18:55:33.339686 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 12 18:55:33.349642 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 12 18:55:33.359624 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Sep 12 18:55:33.370641 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 12 18:55:33.381633 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 12 18:55:33.381654 systemd[1]: Reached target paths.target - Path Units. Sep 12 18:55:33.388631 systemd[1]: Reached target time-set.target - System Time Set. Sep 12 18:55:33.401591 kernel: mlx5_core 0000:02:00.0 enp2s0f0np0: Link up Sep 12 18:55:33.401618 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 12 18:55:33.414590 kernel: bond0: (slave enp2s0f0np0): Enslaving as a backup interface with an up link Sep 12 18:55:33.415227 systemd-networkd[1836]: enp2s0f1np1: Configuring with /etc/systemd/network/10-0c:42:a1:7e:a0:c1.network. Sep 12 18:55:33.417685 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 12 18:55:33.427627 systemd[1]: Reached target timers.target - Timer Units. Sep 12 18:55:33.435185 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 12 18:55:33.444485 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 12 18:55:33.453854 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 12 18:55:33.464731 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 12 18:55:33.474803 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 12 18:55:33.486374 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 12 18:55:33.498219 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 12 18:55:33.509073 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 12 18:55:33.519202 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 18:55:33.527644 systemd[1]: Reached target basic.target - Basic System. Sep 12 18:55:33.534664 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 12 18:55:33.534683 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 12 18:55:33.535263 systemd[1]: Starting containerd.service - containerd container runtime... Sep 12 18:55:33.541589 kernel: mlx5_core 0000:02:00.1 enp2s0f1np1: Link up Sep 12 18:55:33.553589 kernel: bond0: (slave enp2s0f1np1): Enslaving as a backup interface with an up link Sep 12 18:55:33.553654 systemd-networkd[1836]: bond0: Configuring with /etc/systemd/network/05-bond0.network. 
Sep 12 18:55:33.554608 systemd-networkd[1836]: enp2s0f0np0: Link UP Sep 12 18:55:33.554808 systemd-networkd[1836]: enp2s0f0np0: Gained carrier Sep 12 18:55:33.564592 kernel: bond0: Warning: No 802.3ad response from the link partner for any adapters in the bond Sep 12 18:55:33.566408 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Sep 12 18:55:33.580914 systemd-networkd[1836]: enp2s0f1np1: Reconfiguring with /etc/systemd/network/10-0c:42:a1:7e:a0:c0.network. Sep 12 18:55:33.581075 systemd-networkd[1836]: enp2s0f1np1: Link UP Sep 12 18:55:33.581220 systemd-networkd[1836]: enp2s0f1np1: Gained carrier Sep 12 18:55:33.588715 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 12 18:55:33.591694 systemd-networkd[1836]: bond0: Link UP Sep 12 18:55:33.591861 systemd-networkd[1836]: bond0: Gained carrier Sep 12 18:55:33.591971 systemd-timesyncd[1838]: Network configuration changed, trying to establish connection. Sep 12 18:55:33.592290 systemd-timesyncd[1838]: Network configuration changed, trying to establish connection. Sep 12 18:55:33.592465 systemd-timesyncd[1838]: Network configuration changed, trying to establish connection. Sep 12 18:55:33.592548 systemd-timesyncd[1838]: Network configuration changed, trying to establish connection. Sep 12 18:55:33.597457 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 12 18:55:33.604408 coreos-metadata[1876]: Sep 12 18:55:33.604 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Sep 12 18:55:33.628684 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 12 18:55:33.647702 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 12 18:55:33.651269 jq[1882]: false Sep 12 18:55:33.656625 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 12 18:55:33.673841 kernel: bond0: (slave enp2s0f0np0): link status definitely up, 10000 Mbps full duplex Sep 12 18:55:33.673953 kernel: bond0: active interface up! Sep 12 18:55:33.675703 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Sep 12 18:55:33.680427 extend-filesystems[1883]: Found /dev/sda6 Sep 12 18:55:33.697125 extend-filesystems[1883]: Found /dev/sda9 Sep 12 18:55:33.697125 extend-filesystems[1883]: Checking size of /dev/sda9 Sep 12 18:55:33.697125 extend-filesystems[1883]: Resized partition /dev/sda9 Sep 12 18:55:33.718846 kernel: EXT4-fs (sda9): resizing filesystem from 553472 to 116605649 blocks Sep 12 18:55:33.706666 oslogin_cache_refresh[1884]: Refreshing passwd entry cache Sep 12 18:55:33.698050 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 12 18:55:33.719064 extend-filesystems[1894]: resize2fs 1.47.2 (1-Jan-2025) Sep 12 18:55:33.757633 kernel: i915 0000:00:02.0: [drm] [ENCODER:98:DDI A/PHY A] failed to retrieve link info, disabling eDP Sep 12 18:55:33.757797 kernel: [drm] Initialized i915 1.6.0 for 0000:00:02.0 on minor 0 Sep 12 18:55:33.711375 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 12 18:55:33.757904 google_oslogin_nss_cache[1884]: oslogin_cache_refresh[1884]: Refreshing passwd entry cache Sep 12 18:55:33.750389 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 12 18:55:33.758488 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... 
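
The kernel and systemd-networkd messages above show enp2s0f0np0 and enp2s0f1np1 being enslaved into bond0 and the bond gaining carrier. The Linux bonding driver exposes that state through standard sysfs attributes under /sys/class/net/<bond>/bonding/, so a minimal read-only check of what the log describes might look like the following sketch (the sysfs paths are standard; the helper itself is hypothetical and not part of this boot flow):

from pathlib import Path

def bond_summary(bond: str = "bond0") -> dict:
    # Read the bonding driver's sysfs attributes for this bond device.
    base = Path("/sys/class/net") / bond / "bonding"
    active = base / "active_slave"
    return {
        "mode": (base / "mode").read_text().strip(),
        "slaves": (base / "slaves").read_text().split(),
        "active_slave": active.read_text().strip() if active.exists() else None,
    }

if __name__ == "__main__":
    print(bond_summary())  # e.g. slaves -> ['enp2s0f0np0', 'enp2s0f1np1']
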
Sep 12 18:55:33.779064 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 12 18:55:33.792654 kernel: bond0: (slave enp2s0f1np1): link status definitely up, 10000 Mbps full duplex Sep 12 18:55:33.796821 systemd[1]: Starting tcsd.service - TCG Core Services Daemon... Sep 12 18:55:33.803943 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 12 18:55:33.804274 systemd[1]: Starting update-engine.service - Update Engine... Sep 12 18:55:33.813395 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 12 18:55:33.822103 update_engine[1914]: I20250912 18:55:33.822062 1914 main.cc:92] Flatcar Update Engine starting Sep 12 18:55:33.824569 systemd-logind[1909]: Watching system buttons on /dev/input/event3 (Power Button) Sep 12 18:55:33.824956 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 12 18:55:33.825119 systemd-logind[1909]: Watching system buttons on /dev/input/event2 (Sleep Button) Sep 12 18:55:33.825136 systemd-logind[1909]: Watching system buttons on /dev/input/event0 (HID 0557:2419) Sep 12 18:55:33.825399 systemd-logind[1909]: New seat seat0. Sep 12 18:55:33.826371 jq[1915]: true Sep 12 18:55:33.834784 systemd[1]: Started systemd-logind.service - User Login Management. Sep 12 18:55:33.844217 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 12 18:55:33.853758 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 12 18:55:33.853867 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 12 18:55:33.854024 systemd[1]: motdgen.service: Deactivated successfully. Sep 12 18:55:33.863776 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 12 18:55:33.873235 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 12 18:55:33.873453 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 12 18:55:33.895803 jq[1920]: true Sep 12 18:55:33.896310 (ntainerd)[1921]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 12 18:55:33.911009 tar[1919]: linux-amd64/LICENSE Sep 12 18:55:33.911159 tar[1919]: linux-amd64/helm Sep 12 18:55:33.913073 systemd[1]: tcsd.service: Skipped due to 'exec-condition'. Sep 12 18:55:33.913197 systemd[1]: Condition check resulted in tcsd.service - TCG Core Services Daemon being skipped. Sep 12 18:55:33.927109 dbus-daemon[1877]: [system] SELinux support is enabled Sep 12 18:55:33.927200 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 12 18:55:33.928998 update_engine[1914]: I20250912 18:55:33.928946 1914 update_check_scheduler.cc:74] Next update check in 9m58s Sep 12 18:55:33.938129 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 12 18:55:33.938148 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
Sep 12 18:55:33.938790 dbus-daemon[1877]: [system] Successfully activated service 'org.freedesktop.systemd1' Sep 12 18:55:33.944131 sshd_keygen[1912]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 12 18:55:33.947685 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 12 18:55:33.947699 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 12 18:55:33.957886 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 12 18:55:33.967077 systemd[1]: Started update-engine.service - Update Engine. Sep 12 18:55:33.969187 bash[1946]: Updated "/home/core/.ssh/authorized_keys" Sep 12 18:55:33.975909 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 12 18:55:33.987335 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 12 18:55:34.006264 systemd[1]: Starting sshkeys.service... Sep 12 18:55:34.012477 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 12 18:55:34.021230 systemd[1]: issuegen.service: Deactivated successfully. Sep 12 18:55:34.021368 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 12 18:55:34.032749 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 12 18:55:34.044564 locksmithd[1973]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 12 18:55:34.056228 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Sep 12 18:55:34.062677 containerd[1921]: time="2025-09-12T18:55:34Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 12 18:55:34.062977 containerd[1921]: time="2025-09-12T18:55:34.062964147Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 12 18:55:34.067509 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
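
containerd has just begun starting above (revision fb4c30d..., v2.0.5) and, once the plugin loading that follows completes, it reports serving on /run/containerd/containerd.sock. As a hypothetical illustration only (the socket path comes from the log; nothing like this runs during boot), a simple liveness probe of that Unix socket could be:

import socket

def containerd_socket_up(path: str = "/run/containerd/containerd.sock") -> bool:
    # Attempt a plain connect to the daemon's Unix socket and report reachability.
    s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    try:
        s.settimeout(1.0)
        s.connect(path)
        return True
    except OSError:
        return False
    finally:
        s.close()

if __name__ == "__main__":
    print("containerd socket reachable:", containerd_socket_up())
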
Sep 12 18:55:34.068698 containerd[1921]: time="2025-09-12T18:55:34.068679295Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="5.069µs" Sep 12 18:55:34.068698 containerd[1921]: time="2025-09-12T18:55:34.068696777Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 12 18:55:34.068762 containerd[1921]: time="2025-09-12T18:55:34.068708210Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 12 18:55:34.068805 containerd[1921]: time="2025-09-12T18:55:34.068796125Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 12 18:55:34.068837 containerd[1921]: time="2025-09-12T18:55:34.068806558Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 12 18:55:34.068837 containerd[1921]: time="2025-09-12T18:55:34.068821218Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 18:55:34.068886 containerd[1921]: time="2025-09-12T18:55:34.068853321Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 18:55:34.068886 containerd[1921]: time="2025-09-12T18:55:34.068860889Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 12 18:55:34.068999 containerd[1921]: time="2025-09-12T18:55:34.068988519Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 12 18:55:34.068999 containerd[1921]: time="2025-09-12T18:55:34.068997123Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 18:55:34.069051 containerd[1921]: time="2025-09-12T18:55:34.069003205Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 18:55:34.069051 containerd[1921]: time="2025-09-12T18:55:34.069008161Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 12 18:55:34.069051 containerd[1921]: time="2025-09-12T18:55:34.069047851Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 12 18:55:34.069179 containerd[1921]: time="2025-09-12T18:55:34.069170117Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 18:55:34.069207 containerd[1921]: time="2025-09-12T18:55:34.069187530Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 18:55:34.069207 containerd[1921]: time="2025-09-12T18:55:34.069193653Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 12 18:55:34.069253 containerd[1921]: time="2025-09-12T18:55:34.069208420Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 12 18:55:34.069325 containerd[1921]: 
time="2025-09-12T18:55:34.069315710Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 12 18:55:34.069357 containerd[1921]: time="2025-09-12T18:55:34.069350400Z" level=info msg="metadata content store policy set" policy=shared Sep 12 18:55:34.083754 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 12 18:55:34.083837 containerd[1921]: time="2025-09-12T18:55:34.083817575Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 12 18:55:34.083867 containerd[1921]: time="2025-09-12T18:55:34.083844367Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 12 18:55:34.083867 containerd[1921]: time="2025-09-12T18:55:34.083853115Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 12 18:55:34.083922 containerd[1921]: time="2025-09-12T18:55:34.083866893Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 12 18:55:34.083922 containerd[1921]: time="2025-09-12T18:55:34.083874405Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 12 18:55:34.083922 containerd[1921]: time="2025-09-12T18:55:34.083881044Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 12 18:55:34.083922 containerd[1921]: time="2025-09-12T18:55:34.083889148Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 12 18:55:34.083922 containerd[1921]: time="2025-09-12T18:55:34.083895383Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 12 18:55:34.083922 containerd[1921]: time="2025-09-12T18:55:34.083901223Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 12 18:55:34.083922 containerd[1921]: time="2025-09-12T18:55:34.083906646Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 12 18:55:34.083922 containerd[1921]: time="2025-09-12T18:55:34.083914261Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 12 18:55:34.083922 containerd[1921]: time="2025-09-12T18:55:34.083921988Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 12 18:55:34.084123 containerd[1921]: time="2025-09-12T18:55:34.083978658Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 12 18:55:34.084123 containerd[1921]: time="2025-09-12T18:55:34.083990600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 12 18:55:34.084123 containerd[1921]: time="2025-09-12T18:55:34.083998987Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 12 18:55:34.084123 containerd[1921]: time="2025-09-12T18:55:34.084005061Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 12 18:55:34.084123 containerd[1921]: time="2025-09-12T18:55:34.084010643Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 12 18:55:34.084123 containerd[1921]: time="2025-09-12T18:55:34.084016134Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images 
type=io.containerd.grpc.v1 Sep 12 18:55:34.084123 containerd[1921]: time="2025-09-12T18:55:34.084022150Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 12 18:55:34.084123 containerd[1921]: time="2025-09-12T18:55:34.084027898Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 12 18:55:34.084123 containerd[1921]: time="2025-09-12T18:55:34.084034043Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 12 18:55:34.084123 containerd[1921]: time="2025-09-12T18:55:34.084041897Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 12 18:55:34.084123 containerd[1921]: time="2025-09-12T18:55:34.084048042Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 12 18:55:34.084123 containerd[1921]: time="2025-09-12T18:55:34.084087043Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 12 18:55:34.084123 containerd[1921]: time="2025-09-12T18:55:34.084096123Z" level=info msg="Start snapshots syncer" Sep 12 18:55:34.084123 containerd[1921]: time="2025-09-12T18:55:34.084109031Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 12 18:55:34.084444 containerd[1921]: time="2025-09-12T18:55:34.084239909Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 12 18:55:34.084444 containerd[1921]: time="2025-09-12T18:55:34.084267814Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 12 18:55:34.084559 containerd[1921]: time="2025-09-12T18:55:34.084308160Z" level=info msg="loading plugin" 
id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 12 18:55:34.084559 containerd[1921]: time="2025-09-12T18:55:34.084366436Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 12 18:55:34.084559 containerd[1921]: time="2025-09-12T18:55:34.084379228Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 12 18:55:34.084559 containerd[1921]: time="2025-09-12T18:55:34.084385526Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 12 18:55:34.084559 containerd[1921]: time="2025-09-12T18:55:34.084392946Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 12 18:55:34.084559 containerd[1921]: time="2025-09-12T18:55:34.084399936Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 12 18:55:34.084559 containerd[1921]: time="2025-09-12T18:55:34.084406042Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 12 18:55:34.084559 containerd[1921]: time="2025-09-12T18:55:34.084412099Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 12 18:55:34.084559 containerd[1921]: time="2025-09-12T18:55:34.084424907Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 12 18:55:34.084559 containerd[1921]: time="2025-09-12T18:55:34.084437385Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 12 18:55:34.084559 containerd[1921]: time="2025-09-12T18:55:34.084444152Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 12 18:55:34.084559 containerd[1921]: time="2025-09-12T18:55:34.084462820Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 18:55:34.084559 containerd[1921]: time="2025-09-12T18:55:34.084471268Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 18:55:34.084559 containerd[1921]: time="2025-09-12T18:55:34.084476488Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 18:55:34.084890 containerd[1921]: time="2025-09-12T18:55:34.084481865Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 18:55:34.084890 containerd[1921]: time="2025-09-12T18:55:34.084486458Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 12 18:55:34.084890 containerd[1921]: time="2025-09-12T18:55:34.084492076Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 12 18:55:34.084890 containerd[1921]: time="2025-09-12T18:55:34.084498096Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 12 18:55:34.084890 containerd[1921]: time="2025-09-12T18:55:34.084506952Z" level=info msg="runtime interface created" Sep 12 18:55:34.084890 containerd[1921]: time="2025-09-12T18:55:34.084510498Z" level=info msg="created NRI interface" Sep 12 18:55:34.084890 containerd[1921]: time="2025-09-12T18:55:34.084517390Z" 
level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 12 18:55:34.084890 containerd[1921]: time="2025-09-12T18:55:34.084526063Z" level=info msg="Connect containerd service" Sep 12 18:55:34.084890 containerd[1921]: time="2025-09-12T18:55:34.084544764Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 12 18:55:34.085099 containerd[1921]: time="2025-09-12T18:55:34.084966437Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 18:55:34.095928 coreos-metadata[1986]: Sep 12 18:55:34.095 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Sep 12 18:55:34.097922 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 12 18:55:34.107621 systemd[1]: Started serial-getty@ttyS1.service - Serial Getty on ttyS1. Sep 12 18:55:34.114979 tar[1919]: linux-amd64/README.md Sep 12 18:55:34.117917 systemd[1]: Reached target getty.target - Login Prompts. Sep 12 18:55:34.142600 kernel: i915 0000:00:02.0: [drm] Cannot find any crtc or sizes Sep 12 18:55:34.149926 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 12 18:55:34.176522 containerd[1921]: time="2025-09-12T18:55:34.176498054Z" level=info msg="Start subscribing containerd event" Sep 12 18:55:34.176602 containerd[1921]: time="2025-09-12T18:55:34.176535530Z" level=info msg="Start recovering state" Sep 12 18:55:34.176602 containerd[1921]: time="2025-09-12T18:55:34.176560540Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 12 18:55:34.176657 containerd[1921]: time="2025-09-12T18:55:34.176608251Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 12 18:55:34.176657 containerd[1921]: time="2025-09-12T18:55:34.176611940Z" level=info msg="Start event monitor" Sep 12 18:55:34.176657 containerd[1921]: time="2025-09-12T18:55:34.176627522Z" level=info msg="Start cni network conf syncer for default" Sep 12 18:55:34.176657 containerd[1921]: time="2025-09-12T18:55:34.176639668Z" level=info msg="Start streaming server" Sep 12 18:55:34.176657 containerd[1921]: time="2025-09-12T18:55:34.176649030Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 12 18:55:34.176657 containerd[1921]: time="2025-09-12T18:55:34.176656096Z" level=info msg="runtime interface starting up..." Sep 12 18:55:34.176807 containerd[1921]: time="2025-09-12T18:55:34.176662797Z" level=info msg="starting plugins..." Sep 12 18:55:34.176807 containerd[1921]: time="2025-09-12T18:55:34.176673700Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 12 18:55:34.176804 systemd[1]: Started containerd.service - containerd container runtime. Sep 12 18:55:34.176970 containerd[1921]: time="2025-09-12T18:55:34.176953085Z" level=info msg="containerd successfully booted in 0.114468s" Sep 12 18:55:34.343591 kernel: i915 0000:00:02.0: [drm] Cannot find any crtc or sizes Sep 12 18:55:35.049621 kernel: EXT4-fs (sda9): resized filesystem to 116605649 Sep 12 18:55:35.081517 extend-filesystems[1894]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Sep 12 18:55:35.081517 extend-filesystems[1894]: old_desc_blocks = 1, new_desc_blocks = 56 Sep 12 18:55:35.081517 extend-filesystems[1894]: The filesystem on /dev/sda9 is now 116605649 (4k) blocks long. 
Sep 12 18:55:35.119672 extend-filesystems[1883]: Resized filesystem in /dev/sda9 Sep 12 18:55:35.082391 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 12 18:55:35.082532 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 12 18:55:35.364922 systemd-timesyncd[1838]: Network configuration changed, trying to establish connection. Sep 12 18:55:35.428639 systemd-networkd[1836]: bond0: Gained IPv6LL Sep 12 18:55:35.428910 systemd-timesyncd[1838]: Network configuration changed, trying to establish connection. Sep 12 18:55:35.429939 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 12 18:55:35.440233 systemd[1]: Reached target network-online.target - Network is Online. Sep 12 18:55:35.450097 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 18:55:35.468959 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 12 18:55:35.494963 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 12 18:55:36.198689 kernel: mlx5_core 0000:02:00.0: lag map: port 1:1 port 2:2 Sep 12 18:55:36.198843 kernel: mlx5_core 0000:02:00.0: shared_fdb:0 mode:queue_affinity Sep 12 18:55:36.228950 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 18:55:36.239383 (kubelet)[2031]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 18:55:36.274651 kernel: mlx5_core 0000:02:00.0: lag map: port 1:2 port 2:2 Sep 12 18:55:36.284638 kernel: mlx5_core 0000:02:00.0: lag map: port 1:1 port 2:2 Sep 12 18:55:36.668027 kubelet[2031]: E0912 18:55:36.667897 2031 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 18:55:36.669026 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 18:55:36.669112 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 18:55:36.669292 systemd[1]: kubelet.service: Consumed 595ms CPU time, 269.3M memory peak. Sep 12 18:55:36.938052 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 12 18:55:36.947524 systemd[1]: Started sshd@0-139.178.94.145:22-139.178.89.65:49270.service - OpenSSH per-connection server daemon (139.178.89.65:49270). Sep 12 18:55:37.021424 sshd[2051]: Accepted publickey for core from 139.178.89.65 port 49270 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 18:55:37.022409 sshd-session[2051]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 18:55:37.026802 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 12 18:55:37.037493 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 12 18:55:37.053322 systemd-logind[1909]: New session 1 of user core. Sep 12 18:55:37.070438 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 12 18:55:37.087734 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 12 18:55:37.116126 (systemd)[2056]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 12 18:55:37.119232 systemd-logind[1909]: New session c1 of user core. 
Sep 12 18:55:37.248805 systemd[2056]: Queued start job for default target default.target. Sep 12 18:55:37.262280 systemd[2056]: Created slice app.slice - User Application Slice. Sep 12 18:55:37.262313 systemd[2056]: Reached target paths.target - Paths. Sep 12 18:55:37.262336 systemd[2056]: Reached target timers.target - Timers. Sep 12 18:55:37.262979 systemd[2056]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 12 18:55:37.268612 systemd[2056]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 12 18:55:37.268639 systemd[2056]: Reached target sockets.target - Sockets. Sep 12 18:55:37.268662 systemd[2056]: Reached target basic.target - Basic System. Sep 12 18:55:37.268685 systemd[2056]: Reached target default.target - Main User Target. Sep 12 18:55:37.268701 systemd[2056]: Startup finished in 142ms. Sep 12 18:55:37.268775 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 12 18:55:37.278688 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 12 18:55:37.352343 systemd[1]: Started sshd@1-139.178.94.145:22-139.178.89.65:49276.service - OpenSSH per-connection server daemon (139.178.89.65:49276). Sep 12 18:55:37.394478 sshd[2067]: Accepted publickey for core from 139.178.89.65 port 49276 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 18:55:37.395131 sshd-session[2067]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 18:55:37.397926 systemd-logind[1909]: New session 2 of user core. Sep 12 18:55:37.411184 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 12 18:55:37.478552 sshd[2070]: Connection closed by 139.178.89.65 port 49276 Sep 12 18:55:37.478757 sshd-session[2067]: pam_unix(sshd:session): session closed for user core Sep 12 18:55:37.508489 systemd[1]: sshd@1-139.178.94.145:22-139.178.89.65:49276.service: Deactivated successfully. Sep 12 18:55:37.509407 systemd[1]: session-2.scope: Deactivated successfully. Sep 12 18:55:37.509994 systemd-logind[1909]: Session 2 logged out. Waiting for processes to exit. Sep 12 18:55:37.511001 systemd[1]: Started sshd@2-139.178.94.145:22-139.178.89.65:49278.service - OpenSSH per-connection server daemon (139.178.89.65:49278). Sep 12 18:55:37.521657 systemd-logind[1909]: Removed session 2. Sep 12 18:55:37.563124 sshd[2076]: Accepted publickey for core from 139.178.89.65 port 49278 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 18:55:37.564186 sshd-session[2076]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 18:55:37.568618 systemd-logind[1909]: New session 3 of user core. Sep 12 18:55:37.581126 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 12 18:55:37.658688 sshd[2080]: Connection closed by 139.178.89.65 port 49278 Sep 12 18:55:37.659535 sshd-session[2076]: pam_unix(sshd:session): session closed for user core Sep 12 18:55:37.667427 systemd[1]: sshd@2-139.178.94.145:22-139.178.89.65:49278.service: Deactivated successfully. Sep 12 18:55:37.671887 systemd[1]: session-3.scope: Deactivated successfully. Sep 12 18:55:37.676472 systemd-logind[1909]: Session 3 logged out. Waiting for processes to exit. Sep 12 18:55:37.679639 systemd-logind[1909]: Removed session 3. Sep 12 18:55:37.711183 google_oslogin_nss_cache[1884]: oslogin_cache_refresh[1884]: Failure getting users, quitting Sep 12 18:55:37.711183 google_oslogin_nss_cache[1884]: oslogin_cache_refresh[1884]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. 
Sep 12 18:55:37.711121 oslogin_cache_refresh[1884]: Failure getting users, quitting Sep 12 18:55:37.712661 google_oslogin_nss_cache[1884]: oslogin_cache_refresh[1884]: Refreshing group entry cache Sep 12 18:55:37.711172 oslogin_cache_refresh[1884]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 12 18:55:37.711305 oslogin_cache_refresh[1884]: Refreshing group entry cache Sep 12 18:55:37.712969 google_oslogin_nss_cache[1884]: oslogin_cache_refresh[1884]: Failure getting groups, quitting Sep 12 18:55:37.713049 google_oslogin_nss_cache[1884]: oslogin_cache_refresh[1884]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 12 18:55:37.712969 oslogin_cache_refresh[1884]: Failure getting groups, quitting Sep 12 18:55:37.713001 oslogin_cache_refresh[1884]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 12 18:55:37.717180 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 12 18:55:37.717963 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Sep 12 18:55:37.790355 coreos-metadata[1986]: Sep 12 18:55:37.790 INFO Fetch successful Sep 12 18:55:37.874493 unknown[1986]: wrote ssh authorized keys file for user: core Sep 12 18:55:37.904504 update-ssh-keys[2087]: Updated "/home/core/.ssh/authorized_keys" Sep 12 18:55:37.904943 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Sep 12 18:55:37.918077 systemd[1]: Finished sshkeys.service. Sep 12 18:55:38.251288 coreos-metadata[1876]: Sep 12 18:55:38.251 INFO Fetch successful Sep 12 18:55:38.309693 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 12 18:55:38.319890 systemd[1]: Starting packet-phone-home.service - Report Success to Packet... Sep 12 18:55:38.765705 systemd[1]: Finished packet-phone-home.service - Report Success to Packet. Sep 12 18:55:38.778217 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 12 18:55:38.787405 systemd[1]: Startup finished in 4.471s (kernel) + 23.229s (initrd) + 9.336s (userspace) = 37.037s. Sep 12 18:55:38.808966 login[1995]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 12 18:55:38.812540 systemd-logind[1909]: New session 4 of user core. Sep 12 18:55:38.813600 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 12 18:55:38.821065 login[1994]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 12 18:55:38.823935 systemd-logind[1909]: New session 5 of user core. Sep 12 18:55:38.824865 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 12 18:55:40.203285 systemd-timesyncd[1838]: Network configuration changed, trying to establish connection. Sep 12 18:55:46.760771 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 12 18:55:46.762369 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 18:55:47.033160 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
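The "Startup finished" line above reports 4.471s (kernel) + 23.229s (initrd) + 9.336s (userspace) = 37.037s, while the printed components actually sum to 37.036s. The one-millisecond gap is presumably rounding: each stage is truncated to milliseconds for display, whereas the total is computed from the underlying timestamps.

    4.471 s + 23.229 s + 9.336 s = 37.036 s ≈ 37.037 s (total as printed)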
Sep 12 18:55:47.035302 (kubelet)[2133]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 18:55:47.062335 kubelet[2133]: E0912 18:55:47.062250 2133 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 18:55:47.064259 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 18:55:47.064346 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 18:55:47.064522 systemd[1]: kubelet.service: Consumed 148ms CPU time, 112.9M memory peak. Sep 12 18:55:47.687182 systemd[1]: Started sshd@3-139.178.94.145:22-139.178.89.65:46104.service - OpenSSH per-connection server daemon (139.178.89.65:46104). Sep 12 18:55:47.720019 sshd[2149]: Accepted publickey for core from 139.178.89.65 port 46104 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 18:55:47.720658 sshd-session[2149]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 18:55:47.723394 systemd-logind[1909]: New session 6 of user core. Sep 12 18:55:47.735870 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 12 18:55:47.786941 sshd[2152]: Connection closed by 139.178.89.65 port 46104 Sep 12 18:55:47.787120 sshd-session[2149]: pam_unix(sshd:session): session closed for user core Sep 12 18:55:47.808377 systemd[1]: sshd@3-139.178.94.145:22-139.178.89.65:46104.service: Deactivated successfully. Sep 12 18:55:47.812306 systemd[1]: session-6.scope: Deactivated successfully. Sep 12 18:55:47.814720 systemd-logind[1909]: Session 6 logged out. Waiting for processes to exit. Sep 12 18:55:47.820651 systemd[1]: Started sshd@4-139.178.94.145:22-139.178.89.65:46112.service - OpenSSH per-connection server daemon (139.178.89.65:46112). Sep 12 18:55:47.822772 systemd-logind[1909]: Removed session 6. Sep 12 18:55:47.910004 sshd[2158]: Accepted publickey for core from 139.178.89.65 port 46112 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 18:55:47.910771 sshd-session[2158]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 18:55:47.913589 systemd-logind[1909]: New session 7 of user core. Sep 12 18:55:47.924869 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 12 18:55:47.975294 sshd[2161]: Connection closed by 139.178.89.65 port 46112 Sep 12 18:55:47.975516 sshd-session[2158]: pam_unix(sshd:session): session closed for user core Sep 12 18:55:48.008754 systemd[1]: sshd@4-139.178.94.145:22-139.178.89.65:46112.service: Deactivated successfully. Sep 12 18:55:48.012886 systemd[1]: session-7.scope: Deactivated successfully. Sep 12 18:55:48.015239 systemd-logind[1909]: Session 7 logged out. Waiting for processes to exit. Sep 12 18:55:48.020973 systemd[1]: Started sshd@5-139.178.94.145:22-139.178.89.65:46126.service - OpenSSH per-connection server daemon (139.178.89.65:46126). Sep 12 18:55:48.022920 systemd-logind[1909]: Removed session 7. 
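The kubelet exits above (and on the later restarts) because /var/lib/kubelet/config.yaml does not exist yet. On kubeadm-provisioned nodes that file is only written by kubeadm init or kubeadm join, so these early failures are expected and systemd keeps restarting the unit until the node is bootstrapped. The Go sketch below is a standalone illustration of the same check, not kubelet code; the path is taken from the error message.

// Illustrative sketch: reproduce the "no such file or directory" condition the
// kubelet reports for its config file before kubeadm has generated it.
package main

import (
	"errors"
	"fmt"
	"io/fs"
	"os"
)

const kubeletConfig = "/var/lib/kubelet/config.yaml"

func main() {
	_, err := os.Stat(kubeletConfig)
	switch {
	case err == nil:
		fmt.Println(kubeletConfig, "exists; the kubelet can load it")
	case errors.Is(err, fs.ErrNotExist):
		fmt.Println(kubeletConfig, "is missing; expected until kubeadm init/join has run")
	default:
		fmt.Println("stat failed:", err)
	}
}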
Sep 12 18:55:48.123846 sshd[2167]: Accepted publickey for core from 139.178.89.65 port 46126 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 18:55:48.125029 sshd-session[2167]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 18:55:48.129456 systemd-logind[1909]: New session 8 of user core. Sep 12 18:55:48.137834 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 12 18:55:48.202159 sshd[2170]: Connection closed by 139.178.89.65 port 46126 Sep 12 18:55:48.203014 sshd-session[2167]: pam_unix(sshd:session): session closed for user core Sep 12 18:55:48.220474 systemd[1]: sshd@5-139.178.94.145:22-139.178.89.65:46126.service: Deactivated successfully. Sep 12 18:55:48.222058 systemd[1]: session-8.scope: Deactivated successfully. Sep 12 18:55:48.223003 systemd-logind[1909]: Session 8 logged out. Waiting for processes to exit. Sep 12 18:55:48.225378 systemd[1]: Started sshd@6-139.178.94.145:22-139.178.89.65:46136.service - OpenSSH per-connection server daemon (139.178.89.65:46136). Sep 12 18:55:48.226136 systemd-logind[1909]: Removed session 8. Sep 12 18:55:48.291356 sshd[2176]: Accepted publickey for core from 139.178.89.65 port 46136 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 18:55:48.292885 sshd-session[2176]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 18:55:48.298212 systemd-logind[1909]: New session 9 of user core. Sep 12 18:55:48.316005 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 12 18:55:48.386195 sudo[2180]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 12 18:55:48.386355 sudo[2180]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 18:55:48.401031 sudo[2180]: pam_unix(sudo:session): session closed for user root Sep 12 18:55:48.401737 sshd[2179]: Connection closed by 139.178.89.65 port 46136 Sep 12 18:55:48.401941 sshd-session[2176]: pam_unix(sshd:session): session closed for user core Sep 12 18:55:48.423920 systemd[1]: sshd@6-139.178.94.145:22-139.178.89.65:46136.service: Deactivated successfully. Sep 12 18:55:48.425249 systemd[1]: session-9.scope: Deactivated successfully. Sep 12 18:55:48.426035 systemd-logind[1909]: Session 9 logged out. Waiting for processes to exit. Sep 12 18:55:48.427894 systemd[1]: Started sshd@7-139.178.94.145:22-139.178.89.65:46142.service - OpenSSH per-connection server daemon (139.178.89.65:46142). Sep 12 18:55:48.428468 systemd-logind[1909]: Removed session 9. Sep 12 18:55:48.513207 sshd[2186]: Accepted publickey for core from 139.178.89.65 port 46142 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 18:55:48.514334 sshd-session[2186]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 18:55:48.518589 systemd-logind[1909]: New session 10 of user core. Sep 12 18:55:48.527835 systemd[1]: Started session-10.scope - Session 10 of User core. 
Sep 12 18:55:48.590657 sudo[2191]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 12 18:55:48.591539 sudo[2191]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 18:55:48.606759 sudo[2191]: pam_unix(sudo:session): session closed for user root Sep 12 18:55:48.613019 sudo[2190]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 12 18:55:48.613414 sudo[2190]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 18:55:48.619141 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 12 18:55:48.663754 augenrules[2213]: No rules Sep 12 18:55:48.664119 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 18:55:48.664240 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 12 18:55:48.664844 sudo[2190]: pam_unix(sudo:session): session closed for user root Sep 12 18:55:48.665600 sshd[2189]: Connection closed by 139.178.89.65 port 46142 Sep 12 18:55:48.665841 sshd-session[2186]: pam_unix(sshd:session): session closed for user core Sep 12 18:55:48.685141 systemd[1]: sshd@7-139.178.94.145:22-139.178.89.65:46142.service: Deactivated successfully. Sep 12 18:55:48.686168 systemd[1]: session-10.scope: Deactivated successfully. Sep 12 18:55:48.686779 systemd-logind[1909]: Session 10 logged out. Waiting for processes to exit. Sep 12 18:55:48.688003 systemd[1]: Started sshd@8-139.178.94.145:22-139.178.89.65:46154.service - OpenSSH per-connection server daemon (139.178.89.65:46154). Sep 12 18:55:48.688850 systemd-logind[1909]: Removed session 10. Sep 12 18:55:48.731034 sshd[2222]: Accepted publickey for core from 139.178.89.65 port 46154 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 18:55:48.731878 sshd-session[2222]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 18:55:48.735523 systemd-logind[1909]: New session 11 of user core. Sep 12 18:55:48.755921 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 12 18:55:48.809177 sudo[2226]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 12 18:55:48.809332 sudo[2226]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 18:55:49.074607 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 12 18:55:49.092833 (dockerd)[2251]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 12 18:55:49.329035 dockerd[2251]: time="2025-09-12T18:55:49.328937023Z" level=info msg="Starting up" Sep 12 18:55:49.329435 dockerd[2251]: time="2025-09-12T18:55:49.329395892Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 12 18:55:49.336080 dockerd[2251]: time="2025-09-12T18:55:49.336030645Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 12 18:55:49.374744 dockerd[2251]: time="2025-09-12T18:55:49.374713637Z" level=info msg="Loading containers: start." Sep 12 18:55:49.387594 kernel: Initializing XFRM netlink socket Sep 12 18:55:49.530934 systemd-timesyncd[1838]: Network configuration changed, trying to establish connection. 
Sep 12 18:55:49.552533 systemd-networkd[1836]: docker0: Link UP Sep 12 18:55:49.567492 dockerd[2251]: time="2025-09-12T18:55:49.567447098Z" level=info msg="Loading containers: done." Sep 12 18:55:49.574738 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3440546788-merged.mount: Deactivated successfully. Sep 12 18:55:49.574879 dockerd[2251]: time="2025-09-12T18:55:49.574865225Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 12 18:55:49.574925 dockerd[2251]: time="2025-09-12T18:55:49.574904713Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 12 18:55:49.574955 dockerd[2251]: time="2025-09-12T18:55:49.574946582Z" level=info msg="Initializing buildkit" Sep 12 18:55:49.587427 dockerd[2251]: time="2025-09-12T18:55:49.587385298Z" level=info msg="Completed buildkit initialization" Sep 12 18:55:49.589923 dockerd[2251]: time="2025-09-12T18:55:49.589871998Z" level=info msg="Daemon has completed initialization" Sep 12 18:55:49.589923 dockerd[2251]: time="2025-09-12T18:55:49.589911115Z" level=info msg="API listen on /run/docker.sock" Sep 12 18:55:49.589974 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 12 18:55:49.915861 systemd-timesyncd[1838]: Contacted time server [2604:a880:400:d0::4ed:f001]:123 (2.flatcar.pool.ntp.org). Sep 12 18:55:49.915920 systemd-timesyncd[1838]: Initial clock synchronization to Fri 2025-09-12 18:55:50.059469 UTC. Sep 12 18:55:50.537301 containerd[1921]: time="2025-09-12T18:55:50.537201884Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.9\"" Sep 12 18:55:51.074138 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount342599856.mount: Deactivated successfully. 
Sep 12 18:55:52.002944 containerd[1921]: time="2025-09-12T18:55:52.002889721Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:55:52.003168 containerd[1921]: time="2025-09-12T18:55:52.003085927Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.9: active requests=0, bytes read=28837916" Sep 12 18:55:52.003439 containerd[1921]: time="2025-09-12T18:55:52.003404491Z" level=info msg="ImageCreate event name:\"sha256:abd2b525baf428ffb8b8b7d1e09761dc5cdb7ed0c7896a9427e29e84f8eafc59\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:55:52.004891 containerd[1921]: time="2025-09-12T18:55:52.004853371Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6df11cc2ad9679b1117be34d3a0230add88bc0a08fd7a3ebc26b680575e8de97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:55:52.005447 containerd[1921]: time="2025-09-12T18:55:52.005407740Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.9\" with image id \"sha256:abd2b525baf428ffb8b8b7d1e09761dc5cdb7ed0c7896a9427e29e84f8eafc59\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:6df11cc2ad9679b1117be34d3a0230add88bc0a08fd7a3ebc26b680575e8de97\", size \"28834515\" in 1.468153246s" Sep 12 18:55:52.005447 containerd[1921]: time="2025-09-12T18:55:52.005425820Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.9\" returns image reference \"sha256:abd2b525baf428ffb8b8b7d1e09761dc5cdb7ed0c7896a9427e29e84f8eafc59\"" Sep 12 18:55:52.005786 containerd[1921]: time="2025-09-12T18:55:52.005746048Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.9\"" Sep 12 18:55:53.165806 containerd[1921]: time="2025-09-12T18:55:53.165751020Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:55:53.166031 containerd[1921]: time="2025-09-12T18:55:53.165940727Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.9: active requests=0, bytes read=24787027" Sep 12 18:55:53.166319 containerd[1921]: time="2025-09-12T18:55:53.166278767Z" level=info msg="ImageCreate event name:\"sha256:0debe32fbb7223500fcf8c312f2a568a5abd3ed9274d8ec6780cfb30b8861e91\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:55:53.167615 containerd[1921]: time="2025-09-12T18:55:53.167570840Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:243c4b8e3bce271fcb1b78008ab996ab6976b1a20096deac08338fcd17979922\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:55:53.168165 containerd[1921]: time="2025-09-12T18:55:53.168118958Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.9\" with image id \"sha256:0debe32fbb7223500fcf8c312f2a568a5abd3ed9274d8ec6780cfb30b8861e91\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:243c4b8e3bce271fcb1b78008ab996ab6976b1a20096deac08338fcd17979922\", size \"26421706\" in 1.162354871s" Sep 12 18:55:53.168165 containerd[1921]: time="2025-09-12T18:55:53.168137085Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.9\" returns image reference \"sha256:0debe32fbb7223500fcf8c312f2a568a5abd3ed9274d8ec6780cfb30b8861e91\"" Sep 12 18:55:53.168442 
containerd[1921]: time="2025-09-12T18:55:53.168400809Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.9\"" Sep 12 18:55:54.080439 containerd[1921]: time="2025-09-12T18:55:54.080414434Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:55:54.080556 containerd[1921]: time="2025-09-12T18:55:54.080542251Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.9: active requests=0, bytes read=19176289" Sep 12 18:55:54.081045 containerd[1921]: time="2025-09-12T18:55:54.080996137Z" level=info msg="ImageCreate event name:\"sha256:6934c23b154fcb9bf54ed5913782de746735a49f4daa4732285915050cd44ad5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:55:54.082388 containerd[1921]: time="2025-09-12T18:55:54.082347687Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:50c49520dbd0e8b4076b6a5c77d8014df09ea3d59a73e8bafd2678d51ebb92d5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:55:54.083329 containerd[1921]: time="2025-09-12T18:55:54.083309644Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.9\" with image id \"sha256:6934c23b154fcb9bf54ed5913782de746735a49f4daa4732285915050cd44ad5\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:50c49520dbd0e8b4076b6a5c77d8014df09ea3d59a73e8bafd2678d51ebb92d5\", size \"20810986\" in 914.892366ms" Sep 12 18:55:54.083329 containerd[1921]: time="2025-09-12T18:55:54.083326750Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.9\" returns image reference \"sha256:6934c23b154fcb9bf54ed5913782de746735a49f4daa4732285915050cd44ad5\"" Sep 12 18:55:54.083559 containerd[1921]: time="2025-09-12T18:55:54.083549025Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.9\"" Sep 12 18:55:54.889231 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount941828812.mount: Deactivated successfully. 
Sep 12 18:55:55.082915 containerd[1921]: time="2025-09-12T18:55:55.082888870Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:55:55.083148 containerd[1921]: time="2025-09-12T18:55:55.083133706Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.9: active requests=0, bytes read=30924206" Sep 12 18:55:55.083479 containerd[1921]: time="2025-09-12T18:55:55.083438394Z" level=info msg="ImageCreate event name:\"sha256:fa3fdca615a501743d8deb39729a96e731312aac8d96accec061d5265360332f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:55:55.084417 containerd[1921]: time="2025-09-12T18:55:55.084404543Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:886af02535dc34886e4618b902f8c140d89af57233a245621d29642224516064\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:55:55.084649 containerd[1921]: time="2025-09-12T18:55:55.084634565Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.9\" with image id \"sha256:fa3fdca615a501743d8deb39729a96e731312aac8d96accec061d5265360332f\", repo tag \"registry.k8s.io/kube-proxy:v1.32.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:886af02535dc34886e4618b902f8c140d89af57233a245621d29642224516064\", size \"30923225\" in 1.001070737s" Sep 12 18:55:55.084693 containerd[1921]: time="2025-09-12T18:55:55.084650813Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.9\" returns image reference \"sha256:fa3fdca615a501743d8deb39729a96e731312aac8d96accec061d5265360332f\"" Sep 12 18:55:55.084911 containerd[1921]: time="2025-09-12T18:55:55.084900103Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Sep 12 18:55:55.627634 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2896899409.mount: Deactivated successfully. 
Sep 12 18:55:56.184559 containerd[1921]: time="2025-09-12T18:55:56.184532973Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:55:56.184847 containerd[1921]: time="2025-09-12T18:55:56.184757561Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241" Sep 12 18:55:56.185215 containerd[1921]: time="2025-09-12T18:55:56.185200338Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:55:56.186965 containerd[1921]: time="2025-09-12T18:55:56.186949731Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:55:56.187427 containerd[1921]: time="2025-09-12T18:55:56.187410371Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.102492768s" Sep 12 18:55:56.187474 containerd[1921]: time="2025-09-12T18:55:56.187430034Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Sep 12 18:55:56.187768 containerd[1921]: time="2025-09-12T18:55:56.187754423Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 12 18:55:56.646771 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount172001911.mount: Deactivated successfully. 
Sep 12 18:55:56.647562 containerd[1921]: time="2025-09-12T18:55:56.647519048Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 18:55:56.647760 containerd[1921]: time="2025-09-12T18:55:56.647724303Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Sep 12 18:55:56.648112 containerd[1921]: time="2025-09-12T18:55:56.648069789Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 18:55:56.649080 containerd[1921]: time="2025-09-12T18:55:56.649040928Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 18:55:56.649476 containerd[1921]: time="2025-09-12T18:55:56.649436103Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 461.664722ms" Sep 12 18:55:56.649476 containerd[1921]: time="2025-09-12T18:55:56.649450079Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Sep 12 18:55:56.649765 containerd[1921]: time="2025-09-12T18:55:56.649755999Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Sep 12 18:55:57.230559 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 12 18:55:57.231680 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 18:55:57.236233 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount898027888.mount: Deactivated successfully. Sep 12 18:55:57.504920 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 18:55:57.507367 (kubelet)[2659]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 18:55:57.544833 kubelet[2659]: E0912 18:55:57.544807 2659 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 18:55:57.546308 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 18:55:57.546397 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 18:55:57.546624 systemd[1]: kubelet.service: Consumed 116ms CPU time, 113.3M memory peak. 
Sep 12 18:55:58.507069 containerd[1921]: time="2025-09-12T18:55:58.507017575Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:55:58.507290 containerd[1921]: time="2025-09-12T18:55:58.507251686Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57682056" Sep 12 18:55:58.507621 containerd[1921]: time="2025-09-12T18:55:58.507607634Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:55:58.510376 containerd[1921]: time="2025-09-12T18:55:58.510332076Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:55:58.510930 containerd[1921]: time="2025-09-12T18:55:58.510884329Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 1.861094484s" Sep 12 18:55:58.510930 containerd[1921]: time="2025-09-12T18:55:58.510906520Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Sep 12 18:56:00.373670 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 18:56:00.373884 systemd[1]: kubelet.service: Consumed 116ms CPU time, 113.3M memory peak. Sep 12 18:56:00.375376 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 18:56:00.388711 systemd[1]: Reload requested from client PID 2750 ('systemctl') (unit session-11.scope)... Sep 12 18:56:00.388718 systemd[1]: Reloading... Sep 12 18:56:00.435679 zram_generator::config[2796]: No configuration found. Sep 12 18:56:00.593913 systemd[1]: Reloading finished in 204 ms. Sep 12 18:56:00.614297 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 12 18:56:00.614343 systemd[1]: kubelet.service: Failed with result 'signal'. Sep 12 18:56:00.614493 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 18:56:00.614518 systemd[1]: kubelet.service: Consumed 50ms CPU time, 92.2M memory peak. Sep 12 18:56:00.615926 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 18:56:00.895421 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 18:56:00.897690 (kubelet)[2861]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 18:56:00.919292 kubelet[2861]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 18:56:00.919292 kubelet[2861]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 12 18:56:00.919292 kubelet[2861]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 18:56:00.919511 kubelet[2861]: I0912 18:56:00.919328 2861 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 18:56:01.142361 kubelet[2861]: I0912 18:56:01.142314 2861 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 12 18:56:01.142361 kubelet[2861]: I0912 18:56:01.142327 2861 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 18:56:01.142469 kubelet[2861]: I0912 18:56:01.142463 2861 server.go:954] "Client rotation is on, will bootstrap in background" Sep 12 18:56:01.165650 kubelet[2861]: E0912 18:56:01.165549 2861 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.94.145:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.94.145:6443: connect: connection refused" logger="UnhandledError" Sep 12 18:56:01.168900 kubelet[2861]: I0912 18:56:01.168865 2861 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 18:56:01.175292 kubelet[2861]: I0912 18:56:01.175284 2861 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 18:56:01.184416 kubelet[2861]: I0912 18:56:01.184379 2861 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 12 18:56:01.186339 kubelet[2861]: I0912 18:56:01.186293 2861 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 18:56:01.186452 kubelet[2861]: I0912 18:56:01.186308 2861 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4426.1.0-a-3db2d8461d","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 18:56:01.186452 kubelet[2861]: 
I0912 18:56:01.186428 2861 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 18:56:01.186452 kubelet[2861]: I0912 18:56:01.186449 2861 container_manager_linux.go:304] "Creating device plugin manager" Sep 12 18:56:01.186552 kubelet[2861]: I0912 18:56:01.186511 2861 state_mem.go:36] "Initialized new in-memory state store" Sep 12 18:56:01.190347 kubelet[2861]: I0912 18:56:01.190298 2861 kubelet.go:446] "Attempting to sync node with API server" Sep 12 18:56:01.190347 kubelet[2861]: I0912 18:56:01.190331 2861 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 18:56:01.190347 kubelet[2861]: I0912 18:56:01.190343 2861 kubelet.go:352] "Adding apiserver pod source" Sep 12 18:56:01.190423 kubelet[2861]: I0912 18:56:01.190350 2861 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 18:56:01.193265 kubelet[2861]: I0912 18:56:01.193210 2861 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 12 18:56:01.193545 kubelet[2861]: I0912 18:56:01.193502 2861 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 12 18:56:01.194296 kubelet[2861]: W0912 18:56:01.194239 2861 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Sep 12 18:56:01.194357 kubelet[2861]: W0912 18:56:01.194294 2861 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.94.145:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4426.1.0-a-3db2d8461d&limit=500&resourceVersion=0": dial tcp 139.178.94.145:6443: connect: connection refused Sep 12 18:56:01.194357 kubelet[2861]: W0912 18:56:01.194294 2861 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.94.145:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.94.145:6443: connect: connection refused Sep 12 18:56:01.194388 kubelet[2861]: E0912 18:56:01.194364 2861 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.94.145:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4426.1.0-a-3db2d8461d&limit=500&resourceVersion=0\": dial tcp 139.178.94.145:6443: connect: connection refused" logger="UnhandledError" Sep 12 18:56:01.194388 kubelet[2861]: E0912 18:56:01.194364 2861 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.94.145:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.94.145:6443: connect: connection refused" logger="UnhandledError" Sep 12 18:56:01.195872 kubelet[2861]: I0912 18:56:01.195836 2861 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 12 18:56:01.195872 kubelet[2861]: I0912 18:56:01.195854 2861 server.go:1287] "Started kubelet" Sep 12 18:56:01.195958 kubelet[2861]: I0912 18:56:01.195908 2861 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 18:56:01.196957 kubelet[2861]: I0912 18:56:01.196948 2861 server.go:479] "Adding debug handlers to kubelet server" Sep 12 18:56:01.197892 kubelet[2861]: I0912 18:56:01.197884 2861 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 18:56:01.197923 kubelet[2861]: I0912 18:56:01.197898 
2861 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 18:56:01.197957 kubelet[2861]: I0912 18:56:01.197937 2861 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 12 18:56:01.197981 kubelet[2861]: E0912 18:56:01.197953 2861 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-3db2d8461d\" not found" Sep 12 18:56:01.197981 kubelet[2861]: I0912 18:56:01.197972 2861 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 12 18:56:01.198019 kubelet[2861]: I0912 18:56:01.198014 2861 reconciler.go:26] "Reconciler: start to sync state" Sep 12 18:56:01.198304 kubelet[2861]: I0912 18:56:01.198293 2861 factory.go:221] Registration of the systemd container factory successfully Sep 12 18:56:01.198402 kubelet[2861]: I0912 18:56:01.198390 2861 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 18:56:01.216918 kubelet[2861]: E0912 18:56:01.216889 2861 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.94.145:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426.1.0-a-3db2d8461d?timeout=10s\": dial tcp 139.178.94.145:6443: connect: connection refused" interval="200ms" Sep 12 18:56:01.216989 kubelet[2861]: I0912 18:56:01.216692 2861 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 18:56:01.217025 kubelet[2861]: W0912 18:56:01.216986 2861 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.94.145:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.94.145:6443: connect: connection refused Sep 12 18:56:01.217063 kubelet[2861]: E0912 18:56:01.217038 2861 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.94.145:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.94.145:6443: connect: connection refused" logger="UnhandledError" Sep 12 18:56:01.217063 kubelet[2861]: E0912 18:56:01.217049 2861 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 18:56:01.217137 kubelet[2861]: I0912 18:56:01.217127 2861 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 18:56:01.217225 kubelet[2861]: I0912 18:56:01.217212 2861 factory.go:221] Registration of the containerd container factory successfully Sep 12 18:56:01.220188 kubelet[2861]: E0912 18:56:01.217915 2861 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.94.145:6443/api/v1/namespaces/default/events\": dial tcp 139.178.94.145:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4426.1.0-a-3db2d8461d.18649de4f09699e9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4426.1.0-a-3db2d8461d,UID:ci-4426.1.0-a-3db2d8461d,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4426.1.0-a-3db2d8461d,},FirstTimestamp:2025-09-12 18:56:01.195842025 +0000 UTC m=+0.296149566,LastTimestamp:2025-09-12 18:56:01.195842025 +0000 UTC m=+0.296149566,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4426.1.0-a-3db2d8461d,}" Sep 12 18:56:01.224874 kubelet[2861]: I0912 18:56:01.224841 2861 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 12 18:56:01.224874 kubelet[2861]: I0912 18:56:01.224850 2861 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 12 18:56:01.224874 kubelet[2861]: I0912 18:56:01.224860 2861 state_mem.go:36] "Initialized new in-memory state store" Sep 12 18:56:01.225800 kubelet[2861]: I0912 18:56:01.225791 2861 policy_none.go:49] "None policy: Start" Sep 12 18:56:01.225800 kubelet[2861]: I0912 18:56:01.225801 2861 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 12 18:56:01.225849 kubelet[2861]: I0912 18:56:01.225808 2861 state_mem.go:35] "Initializing new in-memory state store" Sep 12 18:56:01.226480 kubelet[2861]: I0912 18:56:01.226461 2861 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 12 18:56:01.227092 kubelet[2861]: I0912 18:56:01.227047 2861 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 12 18:56:01.227092 kubelet[2861]: I0912 18:56:01.227061 2861 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 12 18:56:01.227092 kubelet[2861]: I0912 18:56:01.227072 2861 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Sep 12 18:56:01.227092 kubelet[2861]: I0912 18:56:01.227076 2861 kubelet.go:2382] "Starting kubelet main sync loop" Sep 12 18:56:01.227171 kubelet[2861]: E0912 18:56:01.227103 2861 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 18:56:01.227310 kubelet[2861]: W0912 18:56:01.227296 2861 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.94.145:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.94.145:6443: connect: connection refused Sep 12 18:56:01.227345 kubelet[2861]: E0912 18:56:01.227316 2861 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.94.145:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.94.145:6443: connect: connection refused" logger="UnhandledError" Sep 12 18:56:01.228677 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 12 18:56:01.241452 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 12 18:56:01.243371 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 12 18:56:01.255372 kubelet[2861]: I0912 18:56:01.255324 2861 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 18:56:01.255466 kubelet[2861]: I0912 18:56:01.255455 2861 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 18:56:01.255509 kubelet[2861]: I0912 18:56:01.255465 2861 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 18:56:01.255604 kubelet[2861]: I0912 18:56:01.255592 2861 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 18:56:01.256040 kubelet[2861]: E0912 18:56:01.256027 2861 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 12 18:56:01.256079 kubelet[2861]: E0912 18:56:01.256053 2861 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4426.1.0-a-3db2d8461d\" not found" Sep 12 18:56:01.352955 systemd[1]: Created slice kubepods-burstable-pod4a18f9dd885dbd70e4a59acf6bfe5a11.slice - libcontainer container kubepods-burstable-pod4a18f9dd885dbd70e4a59acf6bfe5a11.slice. Sep 12 18:56:01.359392 kubelet[2861]: I0912 18:56:01.359316 2861 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:01.360209 kubelet[2861]: E0912 18:56:01.360143 2861 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.94.145:6443/api/v1/nodes\": dial tcp 139.178.94.145:6443: connect: connection refused" node="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:01.373725 kubelet[2861]: E0912 18:56:01.373638 2861 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.1.0-a-3db2d8461d\" not found" node="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:01.379435 systemd[1]: Created slice kubepods-burstable-pode9e73be75efcf5b88edf31c35ae94dab.slice - libcontainer container kubepods-burstable-pode9e73be75efcf5b88edf31c35ae94dab.slice. 
Sep 12 18:56:01.395294 kubelet[2861]: E0912 18:56:01.395176 2861 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.1.0-a-3db2d8461d\" not found" node="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:01.399128 kubelet[2861]: I0912 18:56:01.399054 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e9e73be75efcf5b88edf31c35ae94dab-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4426.1.0-a-3db2d8461d\" (UID: \"e9e73be75efcf5b88edf31c35ae94dab\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:01.399329 kubelet[2861]: I0912 18:56:01.399167 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/567e42c4bb24700a6148e6b9e1aa7218-kubeconfig\") pod \"kube-scheduler-ci-4426.1.0-a-3db2d8461d\" (UID: \"567e42c4bb24700a6148e6b9e1aa7218\") " pod="kube-system/kube-scheduler-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:01.399329 kubelet[2861]: I0912 18:56:01.399251 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4a18f9dd885dbd70e4a59acf6bfe5a11-ca-certs\") pod \"kube-apiserver-ci-4426.1.0-a-3db2d8461d\" (UID: \"4a18f9dd885dbd70e4a59acf6bfe5a11\") " pod="kube-system/kube-apiserver-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:01.399632 kubelet[2861]: I0912 18:56:01.399326 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4a18f9dd885dbd70e4a59acf6bfe5a11-k8s-certs\") pod \"kube-apiserver-ci-4426.1.0-a-3db2d8461d\" (UID: \"4a18f9dd885dbd70e4a59acf6bfe5a11\") " pod="kube-system/kube-apiserver-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:01.399632 kubelet[2861]: I0912 18:56:01.399403 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4a18f9dd885dbd70e4a59acf6bfe5a11-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4426.1.0-a-3db2d8461d\" (UID: \"4a18f9dd885dbd70e4a59acf6bfe5a11\") " pod="kube-system/kube-apiserver-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:01.399632 kubelet[2861]: I0912 18:56:01.399481 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e9e73be75efcf5b88edf31c35ae94dab-ca-certs\") pod \"kube-controller-manager-ci-4426.1.0-a-3db2d8461d\" (UID: \"e9e73be75efcf5b88edf31c35ae94dab\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:01.399632 kubelet[2861]: I0912 18:56:01.399553 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e9e73be75efcf5b88edf31c35ae94dab-k8s-certs\") pod \"kube-controller-manager-ci-4426.1.0-a-3db2d8461d\" (UID: \"e9e73be75efcf5b88edf31c35ae94dab\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:01.400028 kubelet[2861]: I0912 18:56:01.399652 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e9e73be75efcf5b88edf31c35ae94dab-kubeconfig\") pod 
\"kube-controller-manager-ci-4426.1.0-a-3db2d8461d\" (UID: \"e9e73be75efcf5b88edf31c35ae94dab\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:01.400028 kubelet[2861]: I0912 18:56:01.399728 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/e9e73be75efcf5b88edf31c35ae94dab-flexvolume-dir\") pod \"kube-controller-manager-ci-4426.1.0-a-3db2d8461d\" (UID: \"e9e73be75efcf5b88edf31c35ae94dab\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:01.404081 systemd[1]: Created slice kubepods-burstable-pod567e42c4bb24700a6148e6b9e1aa7218.slice - libcontainer container kubepods-burstable-pod567e42c4bb24700a6148e6b9e1aa7218.slice. Sep 12 18:56:01.409099 kubelet[2861]: E0912 18:56:01.409002 2861 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.1.0-a-3db2d8461d\" not found" node="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:01.418158 kubelet[2861]: E0912 18:56:01.417983 2861 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.94.145:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426.1.0-a-3db2d8461d?timeout=10s\": dial tcp 139.178.94.145:6443: connect: connection refused" interval="400ms" Sep 12 18:56:01.564807 kubelet[2861]: I0912 18:56:01.564735 2861 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:01.565619 kubelet[2861]: E0912 18:56:01.565539 2861 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.94.145:6443/api/v1/nodes\": dial tcp 139.178.94.145:6443: connect: connection refused" node="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:01.676718 containerd[1921]: time="2025-09-12T18:56:01.676463174Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4426.1.0-a-3db2d8461d,Uid:4a18f9dd885dbd70e4a59acf6bfe5a11,Namespace:kube-system,Attempt:0,}" Sep 12 18:56:01.685700 containerd[1921]: time="2025-09-12T18:56:01.685678045Z" level=info msg="connecting to shim ceae479cf33779232ac6811e69d4eb4d894070cae0f4d4c0192a059115413047" address="unix:///run/containerd/s/7fee0c19469b886e303047edb7dcb7356ac0e6611c42165088eb1274ea9858a3" namespace=k8s.io protocol=ttrpc version=3 Sep 12 18:56:01.696714 containerd[1921]: time="2025-09-12T18:56:01.696690933Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4426.1.0-a-3db2d8461d,Uid:e9e73be75efcf5b88edf31c35ae94dab,Namespace:kube-system,Attempt:0,}" Sep 12 18:56:01.703769 systemd[1]: Started cri-containerd-ceae479cf33779232ac6811e69d4eb4d894070cae0f4d4c0192a059115413047.scope - libcontainer container ceae479cf33779232ac6811e69d4eb4d894070cae0f4d4c0192a059115413047. 
Sep 12 18:56:01.704521 containerd[1921]: time="2025-09-12T18:56:01.704502547Z" level=info msg="connecting to shim 29445818f8aceccbefbaddde72e74376140db46c5e3dd6aade2b462d0f9d898e" address="unix:///run/containerd/s/51b828ade1e52102f531af4f684bcccbed0fd86259b20f0d9f82b098366540eb" namespace=k8s.io protocol=ttrpc version=3 Sep 12 18:56:01.710715 containerd[1921]: time="2025-09-12T18:56:01.710696219Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4426.1.0-a-3db2d8461d,Uid:567e42c4bb24700a6148e6b9e1aa7218,Namespace:kube-system,Attempt:0,}" Sep 12 18:56:01.713271 systemd[1]: Started cri-containerd-29445818f8aceccbefbaddde72e74376140db46c5e3dd6aade2b462d0f9d898e.scope - libcontainer container 29445818f8aceccbefbaddde72e74376140db46c5e3dd6aade2b462d0f9d898e. Sep 12 18:56:01.717813 containerd[1921]: time="2025-09-12T18:56:01.717786166Z" level=info msg="connecting to shim 03f7b83eaf4997eda3b66e31139b238645af354ae1923e39e41fc28803e57fe2" address="unix:///run/containerd/s/3f3c0cf9f2e4e3d3166f02cf90e72865d9fd44d1e2a986e52eefe675f38181f2" namespace=k8s.io protocol=ttrpc version=3 Sep 12 18:56:01.726424 systemd[1]: Started cri-containerd-03f7b83eaf4997eda3b66e31139b238645af354ae1923e39e41fc28803e57fe2.scope - libcontainer container 03f7b83eaf4997eda3b66e31139b238645af354ae1923e39e41fc28803e57fe2. Sep 12 18:56:01.731944 containerd[1921]: time="2025-09-12T18:56:01.731920055Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4426.1.0-a-3db2d8461d,Uid:4a18f9dd885dbd70e4a59acf6bfe5a11,Namespace:kube-system,Attempt:0,} returns sandbox id \"ceae479cf33779232ac6811e69d4eb4d894070cae0f4d4c0192a059115413047\"" Sep 12 18:56:01.733342 containerd[1921]: time="2025-09-12T18:56:01.733327880Z" level=info msg="CreateContainer within sandbox \"ceae479cf33779232ac6811e69d4eb4d894070cae0f4d4c0192a059115413047\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 12 18:56:01.736302 containerd[1921]: time="2025-09-12T18:56:01.736280713Z" level=info msg="Container 0508e8d8339b6f76996db3137578420a7e238172c818a4841839e3eac1315f23: CDI devices from CRI Config.CDIDevices: []" Sep 12 18:56:01.738834 containerd[1921]: time="2025-09-12T18:56:01.738791278Z" level=info msg="CreateContainer within sandbox \"ceae479cf33779232ac6811e69d4eb4d894070cae0f4d4c0192a059115413047\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"0508e8d8339b6f76996db3137578420a7e238172c818a4841839e3eac1315f23\"" Sep 12 18:56:01.739078 containerd[1921]: time="2025-09-12T18:56:01.739065955Z" level=info msg="StartContainer for \"0508e8d8339b6f76996db3137578420a7e238172c818a4841839e3eac1315f23\"" Sep 12 18:56:01.739719 containerd[1921]: time="2025-09-12T18:56:01.739706917Z" level=info msg="connecting to shim 0508e8d8339b6f76996db3137578420a7e238172c818a4841839e3eac1315f23" address="unix:///run/containerd/s/7fee0c19469b886e303047edb7dcb7356ac0e6611c42165088eb1274ea9858a3" protocol=ttrpc version=3 Sep 12 18:56:01.740202 containerd[1921]: time="2025-09-12T18:56:01.740189063Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4426.1.0-a-3db2d8461d,Uid:e9e73be75efcf5b88edf31c35ae94dab,Namespace:kube-system,Attempt:0,} returns sandbox id \"29445818f8aceccbefbaddde72e74376140db46c5e3dd6aade2b462d0f9d898e\"" Sep 12 18:56:01.742146 containerd[1921]: time="2025-09-12T18:56:01.742128020Z" level=info msg="CreateContainer within sandbox \"29445818f8aceccbefbaddde72e74376140db46c5e3dd6aade2b462d0f9d898e\" for container 
&ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 12 18:56:01.745668 containerd[1921]: time="2025-09-12T18:56:01.745627916Z" level=info msg="Container 93c6c2ebb60a91cdd328d82a37cfb794a696b5dbe7ea39d9e8e943ac9c3a4d0c: CDI devices from CRI Config.CDIDevices: []" Sep 12 18:56:01.748373 containerd[1921]: time="2025-09-12T18:56:01.748359410Z" level=info msg="CreateContainer within sandbox \"29445818f8aceccbefbaddde72e74376140db46c5e3dd6aade2b462d0f9d898e\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"93c6c2ebb60a91cdd328d82a37cfb794a696b5dbe7ea39d9e8e943ac9c3a4d0c\"" Sep 12 18:56:01.748601 containerd[1921]: time="2025-09-12T18:56:01.748580323Z" level=info msg="StartContainer for \"93c6c2ebb60a91cdd328d82a37cfb794a696b5dbe7ea39d9e8e943ac9c3a4d0c\"" Sep 12 18:56:01.749194 containerd[1921]: time="2025-09-12T18:56:01.749177947Z" level=info msg="connecting to shim 93c6c2ebb60a91cdd328d82a37cfb794a696b5dbe7ea39d9e8e943ac9c3a4d0c" address="unix:///run/containerd/s/51b828ade1e52102f531af4f684bcccbed0fd86259b20f0d9f82b098366540eb" protocol=ttrpc version=3 Sep 12 18:56:01.755791 systemd[1]: Started cri-containerd-0508e8d8339b6f76996db3137578420a7e238172c818a4841839e3eac1315f23.scope - libcontainer container 0508e8d8339b6f76996db3137578420a7e238172c818a4841839e3eac1315f23. Sep 12 18:56:01.756692 containerd[1921]: time="2025-09-12T18:56:01.756648310Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4426.1.0-a-3db2d8461d,Uid:567e42c4bb24700a6148e6b9e1aa7218,Namespace:kube-system,Attempt:0,} returns sandbox id \"03f7b83eaf4997eda3b66e31139b238645af354ae1923e39e41fc28803e57fe2\"" Sep 12 18:56:01.757643 systemd[1]: Started cri-containerd-93c6c2ebb60a91cdd328d82a37cfb794a696b5dbe7ea39d9e8e943ac9c3a4d0c.scope - libcontainer container 93c6c2ebb60a91cdd328d82a37cfb794a696b5dbe7ea39d9e8e943ac9c3a4d0c. Sep 12 18:56:01.757815 containerd[1921]: time="2025-09-12T18:56:01.757801759Z" level=info msg="CreateContainer within sandbox \"03f7b83eaf4997eda3b66e31139b238645af354ae1923e39e41fc28803e57fe2\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 12 18:56:01.760802 containerd[1921]: time="2025-09-12T18:56:01.760785631Z" level=info msg="Container 90f27b6d4ceeffd4bb46e66c7cf2adea109c5a61aea5071015da3a0fe06811c8: CDI devices from CRI Config.CDIDevices: []" Sep 12 18:56:01.763308 containerd[1921]: time="2025-09-12T18:56:01.763292478Z" level=info msg="CreateContainer within sandbox \"03f7b83eaf4997eda3b66e31139b238645af354ae1923e39e41fc28803e57fe2\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"90f27b6d4ceeffd4bb46e66c7cf2adea109c5a61aea5071015da3a0fe06811c8\"" Sep 12 18:56:01.763559 containerd[1921]: time="2025-09-12T18:56:01.763546613Z" level=info msg="StartContainer for \"90f27b6d4ceeffd4bb46e66c7cf2adea109c5a61aea5071015da3a0fe06811c8\"" Sep 12 18:56:01.764069 containerd[1921]: time="2025-09-12T18:56:01.764057135Z" level=info msg="connecting to shim 90f27b6d4ceeffd4bb46e66c7cf2adea109c5a61aea5071015da3a0fe06811c8" address="unix:///run/containerd/s/3f3c0cf9f2e4e3d3166f02cf90e72865d9fd44d1e2a986e52eefe675f38181f2" protocol=ttrpc version=3 Sep 12 18:56:01.770646 systemd[1]: Started cri-containerd-90f27b6d4ceeffd4bb46e66c7cf2adea109c5a61aea5071015da3a0fe06811c8.scope - libcontainer container 90f27b6d4ceeffd4bb46e66c7cf2adea109c5a61aea5071015da3a0fe06811c8. 
Sep 12 18:56:01.792290 containerd[1921]: time="2025-09-12T18:56:01.792262358Z" level=info msg="StartContainer for \"0508e8d8339b6f76996db3137578420a7e238172c818a4841839e3eac1315f23\" returns successfully" Sep 12 18:56:01.792402 containerd[1921]: time="2025-09-12T18:56:01.792299783Z" level=info msg="StartContainer for \"93c6c2ebb60a91cdd328d82a37cfb794a696b5dbe7ea39d9e8e943ac9c3a4d0c\" returns successfully" Sep 12 18:56:01.800267 containerd[1921]: time="2025-09-12T18:56:01.800242911Z" level=info msg="StartContainer for \"90f27b6d4ceeffd4bb46e66c7cf2adea109c5a61aea5071015da3a0fe06811c8\" returns successfully" Sep 12 18:56:01.967095 kubelet[2861]: I0912 18:56:01.966993 2861 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:02.230025 kubelet[2861]: E0912 18:56:02.229967 2861 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.1.0-a-3db2d8461d\" not found" node="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:02.230380 kubelet[2861]: E0912 18:56:02.230370 2861 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.1.0-a-3db2d8461d\" not found" node="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:02.230968 kubelet[2861]: E0912 18:56:02.230958 2861 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.1.0-a-3db2d8461d\" not found" node="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:02.371092 kubelet[2861]: E0912 18:56:02.371070 2861 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4426.1.0-a-3db2d8461d\" not found" node="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:02.476808 kubelet[2861]: I0912 18:56:02.476709 2861 kubelet_node_status.go:78] "Successfully registered node" node="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:02.476808 kubelet[2861]: E0912 18:56:02.476780 2861 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4426.1.0-a-3db2d8461d\": node \"ci-4426.1.0-a-3db2d8461d\" not found" Sep 12 18:56:02.510301 kubelet[2861]: E0912 18:56:02.510240 2861 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-3db2d8461d\" not found" Sep 12 18:56:02.598247 kubelet[2861]: I0912 18:56:02.598130 2861 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:02.607545 kubelet[2861]: E0912 18:56:02.607442 2861 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4426.1.0-a-3db2d8461d\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:02.607545 kubelet[2861]: I0912 18:56:02.607491 2861 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:02.611345 kubelet[2861]: E0912 18:56:02.611248 2861 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4426.1.0-a-3db2d8461d\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:02.611345 kubelet[2861]: I0912 18:56:02.611303 2861 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:02.615467 kubelet[2861]: E0912 18:56:02.615391 2861 kubelet.go:3196] "Failed creating 
a mirror pod" err="pods \"kube-controller-manager-ci-4426.1.0-a-3db2d8461d\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:03.191741 kubelet[2861]: I0912 18:56:03.191721 2861 apiserver.go:52] "Watching apiserver" Sep 12 18:56:03.199034 kubelet[2861]: I0912 18:56:03.198991 2861 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 12 18:56:03.233095 kubelet[2861]: I0912 18:56:03.233007 2861 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:03.233305 kubelet[2861]: I0912 18:56:03.233198 2861 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:03.237013 kubelet[2861]: E0912 18:56:03.236923 2861 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4426.1.0-a-3db2d8461d\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:03.237013 kubelet[2861]: E0912 18:56:03.236928 2861 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4426.1.0-a-3db2d8461d\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:04.084986 kubelet[2861]: I0912 18:56:04.084895 2861 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:04.092379 kubelet[2861]: W0912 18:56:04.092302 2861 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 12 18:56:04.843793 systemd[1]: Reload requested from client PID 3176 ('systemctl') (unit session-11.scope)... Sep 12 18:56:04.843801 systemd[1]: Reloading... Sep 12 18:56:04.887655 zram_generator::config[3221]: No configuration found. Sep 12 18:56:05.054310 systemd[1]: Reloading finished in 210 ms. Sep 12 18:56:05.073029 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 18:56:05.080975 systemd[1]: kubelet.service: Deactivated successfully. Sep 12 18:56:05.081107 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 18:56:05.081162 systemd[1]: kubelet.service: Consumed 824ms CPU time, 141.2M memory peak. Sep 12 18:56:05.082192 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 18:56:05.412740 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 18:56:05.442529 (kubelet)[3285]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 18:56:05.501012 kubelet[3285]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 18:56:05.501012 kubelet[3285]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 12 18:56:05.501012 kubelet[3285]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 18:56:05.501296 kubelet[3285]: I0912 18:56:05.501091 3285 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 18:56:05.507067 kubelet[3285]: I0912 18:56:05.507028 3285 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 12 18:56:05.507206 kubelet[3285]: I0912 18:56:05.507193 3285 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 18:56:05.507426 kubelet[3285]: I0912 18:56:05.507414 3285 server.go:954] "Client rotation is on, will bootstrap in background" Sep 12 18:56:05.508380 kubelet[3285]: I0912 18:56:05.508364 3285 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 12 18:56:05.510033 kubelet[3285]: I0912 18:56:05.510017 3285 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 18:56:05.512223 kubelet[3285]: I0912 18:56:05.512205 3285 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 18:56:05.522831 kubelet[3285]: I0912 18:56:05.522802 3285 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 12 18:56:05.523095 kubelet[3285]: I0912 18:56:05.523049 3285 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 18:56:05.523568 kubelet[3285]: I0912 18:56:05.523097 3285 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4426.1.0-a-3db2d8461d","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 18:56:05.523667 kubelet[3285]: I0912 18:56:05.523576 3285 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 18:56:05.523667 kubelet[3285]: I0912 18:56:05.523592 3285 container_manager_linux.go:304] "Creating device plugin manager" Sep 12 18:56:05.523667 kubelet[3285]: I0912 
18:56:05.523636 3285 state_mem.go:36] "Initialized new in-memory state store" Sep 12 18:56:05.523836 kubelet[3285]: I0912 18:56:05.523803 3285 kubelet.go:446] "Attempting to sync node with API server" Sep 12 18:56:05.523836 kubelet[3285]: I0912 18:56:05.523821 3285 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 18:56:05.523836 kubelet[3285]: I0912 18:56:05.523839 3285 kubelet.go:352] "Adding apiserver pod source" Sep 12 18:56:05.523934 kubelet[3285]: I0912 18:56:05.523846 3285 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 18:56:05.524439 kubelet[3285]: I0912 18:56:05.524423 3285 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 12 18:56:05.524763 kubelet[3285]: I0912 18:56:05.524753 3285 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 12 18:56:05.525073 kubelet[3285]: I0912 18:56:05.525064 3285 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 12 18:56:05.525109 kubelet[3285]: I0912 18:56:05.525085 3285 server.go:1287] "Started kubelet" Sep 12 18:56:05.525242 kubelet[3285]: I0912 18:56:05.525219 3285 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 18:56:05.525337 kubelet[3285]: I0912 18:56:05.525268 3285 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 18:56:05.525517 kubelet[3285]: I0912 18:56:05.525507 3285 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 18:56:05.526160 kubelet[3285]: I0912 18:56:05.526152 3285 server.go:479] "Adding debug handlers to kubelet server" Sep 12 18:56:05.526276 kubelet[3285]: E0912 18:56:05.526258 3285 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 18:56:05.526665 kubelet[3285]: I0912 18:56:05.526653 3285 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 18:56:05.526713 kubelet[3285]: I0912 18:56:05.526657 3285 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 18:56:05.526764 kubelet[3285]: I0912 18:56:05.526745 3285 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 12 18:56:05.526804 kubelet[3285]: E0912 18:56:05.526722 3285 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-3db2d8461d\" not found" Sep 12 18:56:05.526839 kubelet[3285]: I0912 18:56:05.526810 3285 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 12 18:56:05.526964 kubelet[3285]: I0912 18:56:05.526942 3285 reconciler.go:26] "Reconciler: start to sync state" Sep 12 18:56:05.527116 kubelet[3285]: I0912 18:56:05.527102 3285 factory.go:221] Registration of the systemd container factory successfully Sep 12 18:56:05.528953 kubelet[3285]: I0912 18:56:05.527184 3285 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 18:56:05.532434 kubelet[3285]: I0912 18:56:05.532405 3285 factory.go:221] Registration of the containerd container factory successfully Sep 12 18:56:05.535603 kubelet[3285]: I0912 18:56:05.535575 3285 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Sep 12 18:56:05.536378 kubelet[3285]: I0912 18:56:05.536361 3285 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 12 18:56:05.536434 kubelet[3285]: I0912 18:56:05.536381 3285 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 12 18:56:05.536434 kubelet[3285]: I0912 18:56:05.536397 3285 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 12 18:56:05.536434 kubelet[3285]: I0912 18:56:05.536403 3285 kubelet.go:2382] "Starting kubelet main sync loop" Sep 12 18:56:05.536540 kubelet[3285]: E0912 18:56:05.536433 3285 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 18:56:05.552232 kubelet[3285]: I0912 18:56:05.552210 3285 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 12 18:56:05.552232 kubelet[3285]: I0912 18:56:05.552226 3285 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 12 18:56:05.552232 kubelet[3285]: I0912 18:56:05.552239 3285 state_mem.go:36] "Initialized new in-memory state store" Sep 12 18:56:05.552387 kubelet[3285]: I0912 18:56:05.552376 3285 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 12 18:56:05.552426 kubelet[3285]: I0912 18:56:05.552386 3285 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 12 18:56:05.552426 kubelet[3285]: I0912 18:56:05.552403 3285 policy_none.go:49] "None policy: Start" Sep 12 18:56:05.552426 kubelet[3285]: I0912 18:56:05.552409 3285 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 12 18:56:05.552426 kubelet[3285]: I0912 18:56:05.552417 3285 state_mem.go:35] "Initializing new in-memory state store" Sep 12 18:56:05.552502 kubelet[3285]: I0912 18:56:05.552494 3285 state_mem.go:75] "Updated machine memory state" Sep 12 18:56:05.555178 kubelet[3285]: I0912 18:56:05.555136 3285 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 18:56:05.555299 kubelet[3285]: I0912 18:56:05.555250 3285 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 18:56:05.555299 kubelet[3285]: I0912 18:56:05.555261 3285 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 18:56:05.555393 kubelet[3285]: I0912 18:56:05.555383 3285 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 18:56:05.555863 kubelet[3285]: E0912 18:56:05.555849 3285 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 12 18:56:05.638003 kubelet[3285]: I0912 18:56:05.637874 3285 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:05.638228 kubelet[3285]: I0912 18:56:05.637889 3285 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:05.638228 kubelet[3285]: I0912 18:56:05.637871 3285 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:05.645342 kubelet[3285]: W0912 18:56:05.645279 3285 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 12 18:56:05.645583 kubelet[3285]: W0912 18:56:05.645467 3285 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 12 18:56:05.645914 kubelet[3285]: W0912 18:56:05.645875 3285 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 12 18:56:05.646053 kubelet[3285]: E0912 18:56:05.645978 3285 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4426.1.0-a-3db2d8461d\" already exists" pod="kube-system/kube-controller-manager-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:05.663115 kubelet[3285]: I0912 18:56:05.662921 3285 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:05.673320 kubelet[3285]: I0912 18:56:05.673263 3285 kubelet_node_status.go:124] "Node was previously registered" node="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:05.673538 kubelet[3285]: I0912 18:56:05.673406 3285 kubelet_node_status.go:78] "Successfully registered node" node="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:05.829352 kubelet[3285]: I0912 18:56:05.829246 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e9e73be75efcf5b88edf31c35ae94dab-k8s-certs\") pod \"kube-controller-manager-ci-4426.1.0-a-3db2d8461d\" (UID: \"e9e73be75efcf5b88edf31c35ae94dab\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:05.829352 kubelet[3285]: I0912 18:56:05.829329 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/567e42c4bb24700a6148e6b9e1aa7218-kubeconfig\") pod \"kube-scheduler-ci-4426.1.0-a-3db2d8461d\" (UID: \"567e42c4bb24700a6148e6b9e1aa7218\") " pod="kube-system/kube-scheduler-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:05.829753 kubelet[3285]: I0912 18:56:05.829389 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4a18f9dd885dbd70e4a59acf6bfe5a11-k8s-certs\") pod \"kube-apiserver-ci-4426.1.0-a-3db2d8461d\" (UID: \"4a18f9dd885dbd70e4a59acf6bfe5a11\") " pod="kube-system/kube-apiserver-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:05.829753 kubelet[3285]: I0912 18:56:05.829500 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e9e73be75efcf5b88edf31c35ae94dab-ca-certs\") pod 
\"kube-controller-manager-ci-4426.1.0-a-3db2d8461d\" (UID: \"e9e73be75efcf5b88edf31c35ae94dab\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:05.829753 kubelet[3285]: I0912 18:56:05.829637 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/e9e73be75efcf5b88edf31c35ae94dab-flexvolume-dir\") pod \"kube-controller-manager-ci-4426.1.0-a-3db2d8461d\" (UID: \"e9e73be75efcf5b88edf31c35ae94dab\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:05.829753 kubelet[3285]: I0912 18:56:05.829726 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e9e73be75efcf5b88edf31c35ae94dab-kubeconfig\") pod \"kube-controller-manager-ci-4426.1.0-a-3db2d8461d\" (UID: \"e9e73be75efcf5b88edf31c35ae94dab\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:05.830140 kubelet[3285]: I0912 18:56:05.829784 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e9e73be75efcf5b88edf31c35ae94dab-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4426.1.0-a-3db2d8461d\" (UID: \"e9e73be75efcf5b88edf31c35ae94dab\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:05.830140 kubelet[3285]: I0912 18:56:05.829844 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4a18f9dd885dbd70e4a59acf6bfe5a11-ca-certs\") pod \"kube-apiserver-ci-4426.1.0-a-3db2d8461d\" (UID: \"4a18f9dd885dbd70e4a59acf6bfe5a11\") " pod="kube-system/kube-apiserver-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:05.830140 kubelet[3285]: I0912 18:56:05.829897 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4a18f9dd885dbd70e4a59acf6bfe5a11-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4426.1.0-a-3db2d8461d\" (UID: \"4a18f9dd885dbd70e4a59acf6bfe5a11\") " pod="kube-system/kube-apiserver-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:06.524757 kubelet[3285]: I0912 18:56:06.524710 3285 apiserver.go:52] "Watching apiserver" Sep 12 18:56:06.527181 kubelet[3285]: I0912 18:56:06.527169 3285 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 12 18:56:06.541353 kubelet[3285]: I0912 18:56:06.541339 3285 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:06.541433 kubelet[3285]: I0912 18:56:06.541413 3285 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:06.544168 kubelet[3285]: W0912 18:56:06.544158 3285 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 12 18:56:06.544243 kubelet[3285]: E0912 18:56:06.544200 3285 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4426.1.0-a-3db2d8461d\" already exists" pod="kube-system/kube-apiserver-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:06.544446 kubelet[3285]: W0912 18:56:06.544438 3285 warnings.go:70] metadata.name: this is used in the 
Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 12 18:56:06.544473 kubelet[3285]: E0912 18:56:06.544457 3285 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4426.1.0-a-3db2d8461d\" already exists" pod="kube-system/kube-scheduler-ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:06.558549 kubelet[3285]: I0912 18:56:06.558504 3285 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4426.1.0-a-3db2d8461d" podStartSLOduration=1.558470121 podStartE2EDuration="1.558470121s" podCreationTimestamp="2025-09-12 18:56:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 18:56:06.558464563 +0000 UTC m=+1.107850952" watchObservedRunningTime="2025-09-12 18:56:06.558470121 +0000 UTC m=+1.107856512" Sep 12 18:56:06.565781 kubelet[3285]: I0912 18:56:06.565748 3285 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4426.1.0-a-3db2d8461d" podStartSLOduration=2.565735316 podStartE2EDuration="2.565735316s" podCreationTimestamp="2025-09-12 18:56:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 18:56:06.565729234 +0000 UTC m=+1.115115627" watchObservedRunningTime="2025-09-12 18:56:06.565735316 +0000 UTC m=+1.115121702" Sep 12 18:56:06.565886 kubelet[3285]: I0912 18:56:06.565809 3285 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4426.1.0-a-3db2d8461d" podStartSLOduration=1.565804708 podStartE2EDuration="1.565804708s" podCreationTimestamp="2025-09-12 18:56:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 18:56:06.5622694 +0000 UTC m=+1.111655788" watchObservedRunningTime="2025-09-12 18:56:06.565804708 +0000 UTC m=+1.115191094" Sep 12 18:56:10.204836 kubelet[3285]: I0912 18:56:10.204700 3285 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 12 18:56:10.205771 containerd[1921]: time="2025-09-12T18:56:10.205388233Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 12 18:56:10.206552 kubelet[3285]: I0912 18:56:10.205807 3285 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 12 18:56:11.124344 systemd[1]: Created slice kubepods-besteffort-pod8f4387c9_b0c2_4fbf_8494_e6c139349673.slice - libcontainer container kubepods-besteffort-pod8f4387c9_b0c2_4fbf_8494_e6c139349673.slice. 
Sep 12 18:56:11.167012 kubelet[3285]: I0912 18:56:11.166919 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/8f4387c9-b0c2-4fbf-8494-e6c139349673-xtables-lock\") pod \"kube-proxy-bkrsh\" (UID: \"8f4387c9-b0c2-4fbf-8494-e6c139349673\") " pod="kube-system/kube-proxy-bkrsh" Sep 12 18:56:11.167012 kubelet[3285]: I0912 18:56:11.166987 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l82c4\" (UniqueName: \"kubernetes.io/projected/8f4387c9-b0c2-4fbf-8494-e6c139349673-kube-api-access-l82c4\") pod \"kube-proxy-bkrsh\" (UID: \"8f4387c9-b0c2-4fbf-8494-e6c139349673\") " pod="kube-system/kube-proxy-bkrsh" Sep 12 18:56:11.167262 kubelet[3285]: I0912 18:56:11.167030 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/8f4387c9-b0c2-4fbf-8494-e6c139349673-kube-proxy\") pod \"kube-proxy-bkrsh\" (UID: \"8f4387c9-b0c2-4fbf-8494-e6c139349673\") " pod="kube-system/kube-proxy-bkrsh" Sep 12 18:56:11.167262 kubelet[3285]: I0912 18:56:11.167061 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8f4387c9-b0c2-4fbf-8494-e6c139349673-lib-modules\") pod \"kube-proxy-bkrsh\" (UID: \"8f4387c9-b0c2-4fbf-8494-e6c139349673\") " pod="kube-system/kube-proxy-bkrsh" Sep 12 18:56:11.235922 systemd[1]: Created slice kubepods-besteffort-pod5d9191cb_1eee_4efa_b899_29c47ba63e27.slice - libcontainer container kubepods-besteffort-pod5d9191cb_1eee_4efa_b899_29c47ba63e27.slice. Sep 12 18:56:11.268062 kubelet[3285]: I0912 18:56:11.267993 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/5d9191cb-1eee-4efa-b899-29c47ba63e27-var-lib-calico\") pod \"tigera-operator-755d956888-6kbfk\" (UID: \"5d9191cb-1eee-4efa-b899-29c47ba63e27\") " pod="tigera-operator/tigera-operator-755d956888-6kbfk" Sep 12 18:56:11.269261 kubelet[3285]: I0912 18:56:11.268079 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqftc\" (UniqueName: \"kubernetes.io/projected/5d9191cb-1eee-4efa-b899-29c47ba63e27-kube-api-access-qqftc\") pod \"tigera-operator-755d956888-6kbfk\" (UID: \"5d9191cb-1eee-4efa-b899-29c47ba63e27\") " pod="tigera-operator/tigera-operator-755d956888-6kbfk" Sep 12 18:56:11.445989 containerd[1921]: time="2025-09-12T18:56:11.445772826Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-bkrsh,Uid:8f4387c9-b0c2-4fbf-8494-e6c139349673,Namespace:kube-system,Attempt:0,}" Sep 12 18:56:11.541035 containerd[1921]: time="2025-09-12T18:56:11.540985559Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-6kbfk,Uid:5d9191cb-1eee-4efa-b899-29c47ba63e27,Namespace:tigera-operator,Attempt:0,}" Sep 12 18:56:11.760722 containerd[1921]: time="2025-09-12T18:56:11.760698649Z" level=info msg="connecting to shim 7fc5a3aa2b635b1f04ddc1d57e5932eb550f1740ecead0a7bf13896007214f61" address="unix:///run/containerd/s/a3589b256e517ead16cf265dbbb303616659cf159f820e2f2b7c7c6b551ad351" namespace=k8s.io protocol=ttrpc version=3 Sep 12 18:56:11.761249 containerd[1921]: time="2025-09-12T18:56:11.761232776Z" level=info msg="connecting to shim 
103bd90771156adcce369946831a04728b719581889e59faf7d164bff9c7b7c4" address="unix:///run/containerd/s/9cecf34f2ccd9c31da6263adada51a20f889c4e045d8c4a57955aac650f00609" namespace=k8s.io protocol=ttrpc version=3 Sep 12 18:56:11.783767 systemd[1]: Started cri-containerd-103bd90771156adcce369946831a04728b719581889e59faf7d164bff9c7b7c4.scope - libcontainer container 103bd90771156adcce369946831a04728b719581889e59faf7d164bff9c7b7c4. Sep 12 18:56:11.784621 systemd[1]: Started cri-containerd-7fc5a3aa2b635b1f04ddc1d57e5932eb550f1740ecead0a7bf13896007214f61.scope - libcontainer container 7fc5a3aa2b635b1f04ddc1d57e5932eb550f1740ecead0a7bf13896007214f61. Sep 12 18:56:11.798598 containerd[1921]: time="2025-09-12T18:56:11.798563228Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-bkrsh,Uid:8f4387c9-b0c2-4fbf-8494-e6c139349673,Namespace:kube-system,Attempt:0,} returns sandbox id \"7fc5a3aa2b635b1f04ddc1d57e5932eb550f1740ecead0a7bf13896007214f61\"" Sep 12 18:56:11.799767 containerd[1921]: time="2025-09-12T18:56:11.799751309Z" level=info msg="CreateContainer within sandbox \"7fc5a3aa2b635b1f04ddc1d57e5932eb550f1740ecead0a7bf13896007214f61\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 12 18:56:11.803879 containerd[1921]: time="2025-09-12T18:56:11.803838575Z" level=info msg="Container 01dd60dc3b06ab58c6b3fedd63314ad8e89ac97b961bfab33bfb1340fcfb0b7e: CDI devices from CRI Config.CDIDevices: []" Sep 12 18:56:11.807078 containerd[1921]: time="2025-09-12T18:56:11.807059999Z" level=info msg="CreateContainer within sandbox \"7fc5a3aa2b635b1f04ddc1d57e5932eb550f1740ecead0a7bf13896007214f61\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"01dd60dc3b06ab58c6b3fedd63314ad8e89ac97b961bfab33bfb1340fcfb0b7e\"" Sep 12 18:56:11.807409 containerd[1921]: time="2025-09-12T18:56:11.807396886Z" level=info msg="StartContainer for \"01dd60dc3b06ab58c6b3fedd63314ad8e89ac97b961bfab33bfb1340fcfb0b7e\"" Sep 12 18:56:11.808192 containerd[1921]: time="2025-09-12T18:56:11.808179663Z" level=info msg="connecting to shim 01dd60dc3b06ab58c6b3fedd63314ad8e89ac97b961bfab33bfb1340fcfb0b7e" address="unix:///run/containerd/s/a3589b256e517ead16cf265dbbb303616659cf159f820e2f2b7c7c6b551ad351" protocol=ttrpc version=3 Sep 12 18:56:11.813857 containerd[1921]: time="2025-09-12T18:56:11.813818660Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-6kbfk,Uid:5d9191cb-1eee-4efa-b899-29c47ba63e27,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"103bd90771156adcce369946831a04728b719581889e59faf7d164bff9c7b7c4\"" Sep 12 18:56:11.814649 containerd[1921]: time="2025-09-12T18:56:11.814638122Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 12 18:56:11.826900 systemd[1]: Started cri-containerd-01dd60dc3b06ab58c6b3fedd63314ad8e89ac97b961bfab33bfb1340fcfb0b7e.scope - libcontainer container 01dd60dc3b06ab58c6b3fedd63314ad8e89ac97b961bfab33bfb1340fcfb0b7e. 
Sep 12 18:56:11.847856 containerd[1921]: time="2025-09-12T18:56:11.847833576Z" level=info msg="StartContainer for \"01dd60dc3b06ab58c6b3fedd63314ad8e89ac97b961bfab33bfb1340fcfb0b7e\" returns successfully" Sep 12 18:56:12.573757 kubelet[3285]: I0912 18:56:12.573657 3285 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-bkrsh" podStartSLOduration=1.573647565 podStartE2EDuration="1.573647565s" podCreationTimestamp="2025-09-12 18:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 18:56:12.573594241 +0000 UTC m=+7.122980632" watchObservedRunningTime="2025-09-12 18:56:12.573647565 +0000 UTC m=+7.123033950" Sep 12 18:56:13.281405 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount812415906.mount: Deactivated successfully. Sep 12 18:56:13.531313 containerd[1921]: time="2025-09-12T18:56:13.531252305Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:13.531547 containerd[1921]: time="2025-09-12T18:56:13.531440615Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 12 18:56:13.531878 containerd[1921]: time="2025-09-12T18:56:13.531833619Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:13.532717 containerd[1921]: time="2025-09-12T18:56:13.532674468Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:13.533107 containerd[1921]: time="2025-09-12T18:56:13.533069314Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 1.718416138s" Sep 12 18:56:13.533107 containerd[1921]: time="2025-09-12T18:56:13.533083692Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 12 18:56:13.534057 containerd[1921]: time="2025-09-12T18:56:13.534014171Z" level=info msg="CreateContainer within sandbox \"103bd90771156adcce369946831a04728b719581889e59faf7d164bff9c7b7c4\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 12 18:56:13.536592 containerd[1921]: time="2025-09-12T18:56:13.536547954Z" level=info msg="Container 649ccd2af6188be53b82e09488e2a966209f1daa048ea8bea29c306b26e9b22d: CDI devices from CRI Config.CDIDevices: []" Sep 12 18:56:13.538583 containerd[1921]: time="2025-09-12T18:56:13.538570603Z" level=info msg="CreateContainer within sandbox \"103bd90771156adcce369946831a04728b719581889e59faf7d164bff9c7b7c4\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"649ccd2af6188be53b82e09488e2a966209f1daa048ea8bea29c306b26e9b22d\"" Sep 12 18:56:13.538769 containerd[1921]: time="2025-09-12T18:56:13.538755529Z" level=info msg="StartContainer for \"649ccd2af6188be53b82e09488e2a966209f1daa048ea8bea29c306b26e9b22d\"" Sep 12 18:56:13.539157 containerd[1921]: 
time="2025-09-12T18:56:13.539145186Z" level=info msg="connecting to shim 649ccd2af6188be53b82e09488e2a966209f1daa048ea8bea29c306b26e9b22d" address="unix:///run/containerd/s/9cecf34f2ccd9c31da6263adada51a20f889c4e045d8c4a57955aac650f00609" protocol=ttrpc version=3 Sep 12 18:56:13.557913 systemd[1]: Started cri-containerd-649ccd2af6188be53b82e09488e2a966209f1daa048ea8bea29c306b26e9b22d.scope - libcontainer container 649ccd2af6188be53b82e09488e2a966209f1daa048ea8bea29c306b26e9b22d. Sep 12 18:56:13.571090 containerd[1921]: time="2025-09-12T18:56:13.571064575Z" level=info msg="StartContainer for \"649ccd2af6188be53b82e09488e2a966209f1daa048ea8bea29c306b26e9b22d\" returns successfully" Sep 12 18:56:14.597398 kubelet[3285]: I0912 18:56:14.597282 3285 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-6kbfk" podStartSLOduration=1.878246037 podStartE2EDuration="3.597243451s" podCreationTimestamp="2025-09-12 18:56:11 +0000 UTC" firstStartedPulling="2025-09-12 18:56:11.81443659 +0000 UTC m=+6.363822983" lastFinishedPulling="2025-09-12 18:56:13.533434012 +0000 UTC m=+8.082820397" observedRunningTime="2025-09-12 18:56:14.596831033 +0000 UTC m=+9.146217532" watchObservedRunningTime="2025-09-12 18:56:14.597243451 +0000 UTC m=+9.146629895" Sep 12 18:56:17.872208 sudo[2226]: pam_unix(sudo:session): session closed for user root Sep 12 18:56:17.873191 sshd[2225]: Connection closed by 139.178.89.65 port 46154 Sep 12 18:56:17.873412 sshd-session[2222]: pam_unix(sshd:session): session closed for user core Sep 12 18:56:17.875931 systemd[1]: sshd@8-139.178.94.145:22-139.178.89.65:46154.service: Deactivated successfully. Sep 12 18:56:17.877062 systemd[1]: session-11.scope: Deactivated successfully. Sep 12 18:56:17.877167 systemd[1]: session-11.scope: Consumed 3.192s CPU time, 228.9M memory peak. Sep 12 18:56:17.878273 systemd-logind[1909]: Session 11 logged out. Waiting for processes to exit. Sep 12 18:56:17.878954 systemd-logind[1909]: Removed session 11. Sep 12 18:56:18.749702 update_engine[1914]: I20250912 18:56:18.749665 1914 update_attempter.cc:509] Updating boot flags... Sep 12 18:56:20.239890 systemd[1]: Created slice kubepods-besteffort-poda8909f64_52ec_4093_9517_4aa32667a2a7.slice - libcontainer container kubepods-besteffort-poda8909f64_52ec_4093_9517_4aa32667a2a7.slice. 
Sep 12 18:56:20.327917 kubelet[3285]: I0912 18:56:20.327835 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8909f64-52ec-4093-9517-4aa32667a2a7-tigera-ca-bundle\") pod \"calico-typha-6c594cc55-pnw9h\" (UID: \"a8909f64-52ec-4093-9517-4aa32667a2a7\") " pod="calico-system/calico-typha-6c594cc55-pnw9h" Sep 12 18:56:20.328835 kubelet[3285]: I0912 18:56:20.327929 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l26v8\" (UniqueName: \"kubernetes.io/projected/a8909f64-52ec-4093-9517-4aa32667a2a7-kube-api-access-l26v8\") pod \"calico-typha-6c594cc55-pnw9h\" (UID: \"a8909f64-52ec-4093-9517-4aa32667a2a7\") " pod="calico-system/calico-typha-6c594cc55-pnw9h" Sep 12 18:56:20.328835 kubelet[3285]: I0912 18:56:20.327986 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/a8909f64-52ec-4093-9517-4aa32667a2a7-typha-certs\") pod \"calico-typha-6c594cc55-pnw9h\" (UID: \"a8909f64-52ec-4093-9517-4aa32667a2a7\") " pod="calico-system/calico-typha-6c594cc55-pnw9h" Sep 12 18:56:20.544042 containerd[1921]: time="2025-09-12T18:56:20.543970450Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6c594cc55-pnw9h,Uid:a8909f64-52ec-4093-9517-4aa32667a2a7,Namespace:calico-system,Attempt:0,}" Sep 12 18:56:20.550136 systemd[1]: Created slice kubepods-besteffort-pod9148df86_8ee7_4dad_9d2e_254fbf363a2f.slice - libcontainer container kubepods-besteffort-pod9148df86_8ee7_4dad_9d2e_254fbf363a2f.slice. Sep 12 18:56:20.557905 containerd[1921]: time="2025-09-12T18:56:20.557880386Z" level=info msg="connecting to shim d8ebb2c5869a55487d293aa43ee857f041c5ac173b48fd5136bbc74e6005fc56" address="unix:///run/containerd/s/100afbdc329f3fcf9b8db28e89504f05c5df4accbaa1a53d134a0960339c08f7" namespace=k8s.io protocol=ttrpc version=3 Sep 12 18:56:20.579920 systemd[1]: Started cri-containerd-d8ebb2c5869a55487d293aa43ee857f041c5ac173b48fd5136bbc74e6005fc56.scope - libcontainer container d8ebb2c5869a55487d293aa43ee857f041c5ac173b48fd5136bbc74e6005fc56. 
Sep 12 18:56:20.605100 containerd[1921]: time="2025-09-12T18:56:20.605079179Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6c594cc55-pnw9h,Uid:a8909f64-52ec-4093-9517-4aa32667a2a7,Namespace:calico-system,Attempt:0,} returns sandbox id \"d8ebb2c5869a55487d293aa43ee857f041c5ac173b48fd5136bbc74e6005fc56\"" Sep 12 18:56:20.605748 containerd[1921]: time="2025-09-12T18:56:20.605737326Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 12 18:56:20.630713 kubelet[3285]: I0912 18:56:20.630658 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9148df86-8ee7-4dad-9d2e-254fbf363a2f-tigera-ca-bundle\") pod \"calico-node-574mq\" (UID: \"9148df86-8ee7-4dad-9d2e-254fbf363a2f\") " pod="calico-system/calico-node-574mq" Sep 12 18:56:20.630713 kubelet[3285]: I0912 18:56:20.630694 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/9148df86-8ee7-4dad-9d2e-254fbf363a2f-cni-bin-dir\") pod \"calico-node-574mq\" (UID: \"9148df86-8ee7-4dad-9d2e-254fbf363a2f\") " pod="calico-system/calico-node-574mq" Sep 12 18:56:20.630713 kubelet[3285]: I0912 18:56:20.630714 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/9148df86-8ee7-4dad-9d2e-254fbf363a2f-cni-log-dir\") pod \"calico-node-574mq\" (UID: \"9148df86-8ee7-4dad-9d2e-254fbf363a2f\") " pod="calico-system/calico-node-574mq" Sep 12 18:56:20.630851 kubelet[3285]: I0912 18:56:20.630730 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/9148df86-8ee7-4dad-9d2e-254fbf363a2f-var-lib-calico\") pod \"calico-node-574mq\" (UID: \"9148df86-8ee7-4dad-9d2e-254fbf363a2f\") " pod="calico-system/calico-node-574mq" Sep 12 18:56:20.630851 kubelet[3285]: I0912 18:56:20.630744 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/9148df86-8ee7-4dad-9d2e-254fbf363a2f-flexvol-driver-host\") pod \"calico-node-574mq\" (UID: \"9148df86-8ee7-4dad-9d2e-254fbf363a2f\") " pod="calico-system/calico-node-574mq" Sep 12 18:56:20.630851 kubelet[3285]: I0912 18:56:20.630756 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/9148df86-8ee7-4dad-9d2e-254fbf363a2f-policysync\") pod \"calico-node-574mq\" (UID: \"9148df86-8ee7-4dad-9d2e-254fbf363a2f\") " pod="calico-system/calico-node-574mq" Sep 12 18:56:20.630851 kubelet[3285]: I0912 18:56:20.630768 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/9148df86-8ee7-4dad-9d2e-254fbf363a2f-cni-net-dir\") pod \"calico-node-574mq\" (UID: \"9148df86-8ee7-4dad-9d2e-254fbf363a2f\") " pod="calico-system/calico-node-574mq" Sep 12 18:56:20.630851 kubelet[3285]: I0912 18:56:20.630798 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9148df86-8ee7-4dad-9d2e-254fbf363a2f-lib-modules\") pod \"calico-node-574mq\" (UID: \"9148df86-8ee7-4dad-9d2e-254fbf363a2f\") " 
pod="calico-system/calico-node-574mq" Sep 12 18:56:20.630957 kubelet[3285]: I0912 18:56:20.630825 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/9148df86-8ee7-4dad-9d2e-254fbf363a2f-node-certs\") pod \"calico-node-574mq\" (UID: \"9148df86-8ee7-4dad-9d2e-254fbf363a2f\") " pod="calico-system/calico-node-574mq" Sep 12 18:56:20.630957 kubelet[3285]: I0912 18:56:20.630837 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/9148df86-8ee7-4dad-9d2e-254fbf363a2f-var-run-calico\") pod \"calico-node-574mq\" (UID: \"9148df86-8ee7-4dad-9d2e-254fbf363a2f\") " pod="calico-system/calico-node-574mq" Sep 12 18:56:20.630957 kubelet[3285]: I0912 18:56:20.630857 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbnhq\" (UniqueName: \"kubernetes.io/projected/9148df86-8ee7-4dad-9d2e-254fbf363a2f-kube-api-access-mbnhq\") pod \"calico-node-574mq\" (UID: \"9148df86-8ee7-4dad-9d2e-254fbf363a2f\") " pod="calico-system/calico-node-574mq" Sep 12 18:56:20.630957 kubelet[3285]: I0912 18:56:20.630880 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/9148df86-8ee7-4dad-9d2e-254fbf363a2f-xtables-lock\") pod \"calico-node-574mq\" (UID: \"9148df86-8ee7-4dad-9d2e-254fbf363a2f\") " pod="calico-system/calico-node-574mq" Sep 12 18:56:20.734498 kubelet[3285]: E0912 18:56:20.734435 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.734498 kubelet[3285]: W0912 18:56:20.734483 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.734873 kubelet[3285]: E0912 18:56:20.734551 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.740507 kubelet[3285]: E0912 18:56:20.740451 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.740507 kubelet[3285]: W0912 18:56:20.740499 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.740872 kubelet[3285]: E0912 18:56:20.740562 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 18:56:20.753815 kubelet[3285]: E0912 18:56:20.753725 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.753815 kubelet[3285]: W0912 18:56:20.753769 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.753815 kubelet[3285]: E0912 18:56:20.753814 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.793111 kubelet[3285]: E0912 18:56:20.793042 3285 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mkb45" podUID="19fb1996-8df9-4ece-9557-f77103d3c3c7" Sep 12 18:56:20.820741 kubelet[3285]: E0912 18:56:20.820650 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.820741 kubelet[3285]: W0912 18:56:20.820676 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.820741 kubelet[3285]: E0912 18:56:20.820698 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.820986 kubelet[3285]: E0912 18:56:20.820966 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.820986 kubelet[3285]: W0912 18:56:20.820982 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.821102 kubelet[3285]: E0912 18:56:20.820996 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.821282 kubelet[3285]: E0912 18:56:20.821265 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.821282 kubelet[3285]: W0912 18:56:20.821280 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.821384 kubelet[3285]: E0912 18:56:20.821295 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 18:56:20.821598 kubelet[3285]: E0912 18:56:20.821576 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.821669 kubelet[3285]: W0912 18:56:20.821608 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.821669 kubelet[3285]: E0912 18:56:20.821623 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.821870 kubelet[3285]: E0912 18:56:20.821855 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.821870 kubelet[3285]: W0912 18:56:20.821868 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.821973 kubelet[3285]: E0912 18:56:20.821880 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.822077 kubelet[3285]: E0912 18:56:20.822061 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.822077 kubelet[3285]: W0912 18:56:20.822072 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.822179 kubelet[3285]: E0912 18:56:20.822083 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.822273 kubelet[3285]: E0912 18:56:20.822259 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.822273 kubelet[3285]: W0912 18:56:20.822270 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.822376 kubelet[3285]: E0912 18:56:20.822280 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.822459 kubelet[3285]: E0912 18:56:20.822448 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.822459 kubelet[3285]: W0912 18:56:20.822458 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.822553 kubelet[3285]: E0912 18:56:20.822470 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 18:56:20.822668 kubelet[3285]: E0912 18:56:20.822655 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.822668 kubelet[3285]: W0912 18:56:20.822666 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.822768 kubelet[3285]: E0912 18:56:20.822677 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.822894 kubelet[3285]: E0912 18:56:20.822882 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.822894 kubelet[3285]: W0912 18:56:20.822893 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.823030 kubelet[3285]: E0912 18:56:20.822903 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.823130 kubelet[3285]: E0912 18:56:20.823118 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.823130 kubelet[3285]: W0912 18:56:20.823129 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.823211 kubelet[3285]: E0912 18:56:20.823139 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.823364 kubelet[3285]: E0912 18:56:20.823351 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.823364 kubelet[3285]: W0912 18:56:20.823363 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.823451 kubelet[3285]: E0912 18:56:20.823375 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.823615 kubelet[3285]: E0912 18:56:20.823601 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.823615 kubelet[3285]: W0912 18:56:20.823613 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.823692 kubelet[3285]: E0912 18:56:20.823625 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 18:56:20.823822 kubelet[3285]: E0912 18:56:20.823811 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.823876 kubelet[3285]: W0912 18:56:20.823822 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.823876 kubelet[3285]: E0912 18:56:20.823832 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.824068 kubelet[3285]: E0912 18:56:20.824056 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.824068 kubelet[3285]: W0912 18:56:20.824067 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.824147 kubelet[3285]: E0912 18:56:20.824077 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.824321 kubelet[3285]: E0912 18:56:20.824309 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.824321 kubelet[3285]: W0912 18:56:20.824321 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.824410 kubelet[3285]: E0912 18:56:20.824332 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.824569 kubelet[3285]: E0912 18:56:20.824557 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.824569 kubelet[3285]: W0912 18:56:20.824569 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.824672 kubelet[3285]: E0912 18:56:20.824580 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.824815 kubelet[3285]: E0912 18:56:20.824803 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.824815 kubelet[3285]: W0912 18:56:20.824814 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.824897 kubelet[3285]: E0912 18:56:20.824827 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 18:56:20.825048 kubelet[3285]: E0912 18:56:20.825035 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.825048 kubelet[3285]: W0912 18:56:20.825047 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.825126 kubelet[3285]: E0912 18:56:20.825057 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.825248 kubelet[3285]: E0912 18:56:20.825238 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.825298 kubelet[3285]: W0912 18:56:20.825248 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.825298 kubelet[3285]: E0912 18:56:20.825259 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.832633 kubelet[3285]: E0912 18:56:20.832604 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.832633 kubelet[3285]: W0912 18:56:20.832623 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.832752 kubelet[3285]: E0912 18:56:20.832640 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.832752 kubelet[3285]: I0912 18:56:20.832666 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/19fb1996-8df9-4ece-9557-f77103d3c3c7-socket-dir\") pod \"csi-node-driver-mkb45\" (UID: \"19fb1996-8df9-4ece-9557-f77103d3c3c7\") " pod="calico-system/csi-node-driver-mkb45" Sep 12 18:56:20.832994 kubelet[3285]: E0912 18:56:20.832946 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.832994 kubelet[3285]: W0912 18:56:20.832963 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.832994 kubelet[3285]: E0912 18:56:20.832982 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 18:56:20.833143 kubelet[3285]: I0912 18:56:20.833005 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/19fb1996-8df9-4ece-9557-f77103d3c3c7-varrun\") pod \"csi-node-driver-mkb45\" (UID: \"19fb1996-8df9-4ece-9557-f77103d3c3c7\") " pod="calico-system/csi-node-driver-mkb45" Sep 12 18:56:20.833365 kubelet[3285]: E0912 18:56:20.833316 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.833365 kubelet[3285]: W0912 18:56:20.833334 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.833365 kubelet[3285]: E0912 18:56:20.833353 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.833517 kubelet[3285]: I0912 18:56:20.833377 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmh7n\" (UniqueName: \"kubernetes.io/projected/19fb1996-8df9-4ece-9557-f77103d3c3c7-kube-api-access-tmh7n\") pod \"csi-node-driver-mkb45\" (UID: \"19fb1996-8df9-4ece-9557-f77103d3c3c7\") " pod="calico-system/csi-node-driver-mkb45" Sep 12 18:56:20.833689 kubelet[3285]: E0912 18:56:20.833640 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.833689 kubelet[3285]: W0912 18:56:20.833654 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.833689 kubelet[3285]: E0912 18:56:20.833689 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.833838 kubelet[3285]: I0912 18:56:20.833714 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/19fb1996-8df9-4ece-9557-f77103d3c3c7-kubelet-dir\") pod \"csi-node-driver-mkb45\" (UID: \"19fb1996-8df9-4ece-9557-f77103d3c3c7\") " pod="calico-system/csi-node-driver-mkb45" Sep 12 18:56:20.834007 kubelet[3285]: E0912 18:56:20.833954 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.834007 kubelet[3285]: W0912 18:56:20.833971 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.834007 kubelet[3285]: E0912 18:56:20.833985 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 18:56:20.834007 kubelet[3285]: I0912 18:56:20.834003 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/19fb1996-8df9-4ece-9557-f77103d3c3c7-registration-dir\") pod \"csi-node-driver-mkb45\" (UID: \"19fb1996-8df9-4ece-9557-f77103d3c3c7\") " pod="calico-system/csi-node-driver-mkb45" Sep 12 18:56:20.834352 kubelet[3285]: E0912 18:56:20.834303 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.834352 kubelet[3285]: W0912 18:56:20.834322 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.834352 kubelet[3285]: E0912 18:56:20.834342 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.834545 kubelet[3285]: E0912 18:56:20.834533 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.834545 kubelet[3285]: W0912 18:56:20.834544 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.834645 kubelet[3285]: E0912 18:56:20.834572 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.834819 kubelet[3285]: E0912 18:56:20.834804 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.834873 kubelet[3285]: W0912 18:56:20.834820 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.834873 kubelet[3285]: E0912 18:56:20.834846 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.835123 kubelet[3285]: E0912 18:56:20.835107 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.835169 kubelet[3285]: W0912 18:56:20.835123 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.835169 kubelet[3285]: E0912 18:56:20.835149 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 18:56:20.835402 kubelet[3285]: E0912 18:56:20.835383 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.835402 kubelet[3285]: W0912 18:56:20.835396 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.835545 kubelet[3285]: E0912 18:56:20.835419 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.835640 kubelet[3285]: E0912 18:56:20.835607 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.835640 kubelet[3285]: W0912 18:56:20.835619 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.835727 kubelet[3285]: E0912 18:56:20.835647 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.835835 kubelet[3285]: E0912 18:56:20.835822 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.835835 kubelet[3285]: W0912 18:56:20.835833 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.835924 kubelet[3285]: E0912 18:56:20.835845 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.836076 kubelet[3285]: E0912 18:56:20.836064 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.836076 kubelet[3285]: W0912 18:56:20.836075 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.836162 kubelet[3285]: E0912 18:56:20.836086 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.836272 kubelet[3285]: E0912 18:56:20.836260 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.836314 kubelet[3285]: W0912 18:56:20.836271 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.836314 kubelet[3285]: E0912 18:56:20.836282 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 18:56:20.836463 kubelet[3285]: E0912 18:56:20.836452 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.836504 kubelet[3285]: W0912 18:56:20.836463 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.836504 kubelet[3285]: E0912 18:56:20.836474 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.853345 containerd[1921]: time="2025-09-12T18:56:20.853256353Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-574mq,Uid:9148df86-8ee7-4dad-9d2e-254fbf363a2f,Namespace:calico-system,Attempt:0,}" Sep 12 18:56:20.869921 containerd[1921]: time="2025-09-12T18:56:20.869884436Z" level=info msg="connecting to shim 4e8df7852a0401b9e58f31e3d3b489951ede8ad01aeef3c8f0f752d97d817bb2" address="unix:///run/containerd/s/7ab0899356d612a35bf4c91bdeec8be29bf5e5f080ebe9778f348bc0a23de635" namespace=k8s.io protocol=ttrpc version=3 Sep 12 18:56:20.895910 systemd[1]: Started cri-containerd-4e8df7852a0401b9e58f31e3d3b489951ede8ad01aeef3c8f0f752d97d817bb2.scope - libcontainer container 4e8df7852a0401b9e58f31e3d3b489951ede8ad01aeef3c8f0f752d97d817bb2. Sep 12 18:56:20.907259 containerd[1921]: time="2025-09-12T18:56:20.907236412Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-574mq,Uid:9148df86-8ee7-4dad-9d2e-254fbf363a2f,Namespace:calico-system,Attempt:0,} returns sandbox id \"4e8df7852a0401b9e58f31e3d3b489951ede8ad01aeef3c8f0f752d97d817bb2\"" Sep 12 18:56:20.935581 kubelet[3285]: E0912 18:56:20.935521 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.935581 kubelet[3285]: W0912 18:56:20.935582 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.936093 kubelet[3285]: E0912 18:56:20.935664 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.936437 kubelet[3285]: E0912 18:56:20.936387 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.936437 kubelet[3285]: W0912 18:56:20.936425 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.936812 kubelet[3285]: E0912 18:56:20.936484 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 18:56:20.937164 kubelet[3285]: E0912 18:56:20.937084 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.937164 kubelet[3285]: W0912 18:56:20.937125 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.937164 kubelet[3285]: E0912 18:56:20.937170 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.937754 kubelet[3285]: E0912 18:56:20.937668 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.937754 kubelet[3285]: W0912 18:56:20.937695 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.937754 kubelet[3285]: E0912 18:56:20.937731 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.938289 kubelet[3285]: E0912 18:56:20.938217 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.938289 kubelet[3285]: W0912 18:56:20.938242 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.938695 kubelet[3285]: E0912 18:56:20.938329 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.938695 kubelet[3285]: E0912 18:56:20.938684 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.939020 kubelet[3285]: W0912 18:56:20.938708 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.939020 kubelet[3285]: E0912 18:56:20.938791 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.939271 kubelet[3285]: E0912 18:56:20.939173 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.939271 kubelet[3285]: W0912 18:56:20.939199 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.939519 kubelet[3285]: E0912 18:56:20.939280 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 18:56:20.939782 kubelet[3285]: E0912 18:56:20.939707 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.939782 kubelet[3285]: W0912 18:56:20.939732 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.939782 kubelet[3285]: E0912 18:56:20.939766 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.940264 kubelet[3285]: E0912 18:56:20.940233 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.940378 kubelet[3285]: W0912 18:56:20.940261 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.940378 kubelet[3285]: E0912 18:56:20.940299 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.940909 kubelet[3285]: E0912 18:56:20.940877 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.940909 kubelet[3285]: W0912 18:56:20.940905 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.941232 kubelet[3285]: E0912 18:56:20.940984 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.941450 kubelet[3285]: E0912 18:56:20.941408 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.941450 kubelet[3285]: W0912 18:56:20.941433 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.941767 kubelet[3285]: E0912 18:56:20.941514 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.941952 kubelet[3285]: E0912 18:56:20.941919 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.941952 kubelet[3285]: W0912 18:56:20.941947 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.942196 kubelet[3285]: E0912 18:56:20.942021 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 18:56:20.942470 kubelet[3285]: E0912 18:56:20.942438 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.942470 kubelet[3285]: W0912 18:56:20.942466 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.942766 kubelet[3285]: E0912 18:56:20.942538 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.942979 kubelet[3285]: E0912 18:56:20.942946 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.943127 kubelet[3285]: W0912 18:56:20.942975 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.943127 kubelet[3285]: E0912 18:56:20.943057 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.943492 kubelet[3285]: E0912 18:56:20.943460 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.943492 kubelet[3285]: W0912 18:56:20.943488 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.943789 kubelet[3285]: E0912 18:56:20.943553 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.943982 kubelet[3285]: E0912 18:56:20.943953 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.943982 kubelet[3285]: W0912 18:56:20.943978 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.944219 kubelet[3285]: E0912 18:56:20.944110 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.944436 kubelet[3285]: E0912 18:56:20.944405 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.944436 kubelet[3285]: W0912 18:56:20.944434 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.944710 kubelet[3285]: E0912 18:56:20.944552 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 18:56:20.944921 kubelet[3285]: E0912 18:56:20.944892 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.944921 kubelet[3285]: W0912 18:56:20.944916 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.945141 kubelet[3285]: E0912 18:56:20.945032 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.945384 kubelet[3285]: E0912 18:56:20.945355 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.945384 kubelet[3285]: W0912 18:56:20.945383 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.945652 kubelet[3285]: E0912 18:56:20.945491 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.945927 kubelet[3285]: E0912 18:56:20.945897 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.945927 kubelet[3285]: W0912 18:56:20.945922 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.946197 kubelet[3285]: E0912 18:56:20.945994 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.946439 kubelet[3285]: E0912 18:56:20.946409 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.946439 kubelet[3285]: W0912 18:56:20.946436 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.946694 kubelet[3285]: E0912 18:56:20.946558 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.946955 kubelet[3285]: E0912 18:56:20.946924 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.946955 kubelet[3285]: W0912 18:56:20.946951 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.947188 kubelet[3285]: E0912 18:56:20.947066 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 18:56:20.947455 kubelet[3285]: E0912 18:56:20.947428 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.947455 kubelet[3285]: W0912 18:56:20.947453 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.947704 kubelet[3285]: E0912 18:56:20.947569 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.948017 kubelet[3285]: E0912 18:56:20.947979 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.948017 kubelet[3285]: W0912 18:56:20.948007 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.948256 kubelet[3285]: E0912 18:56:20.948043 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.948643 kubelet[3285]: E0912 18:56:20.948574 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.948762 kubelet[3285]: W0912 18:56:20.948648 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.948762 kubelet[3285]: E0912 18:56:20.948679 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:20.967923 kubelet[3285]: E0912 18:56:20.967867 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:20.967923 kubelet[3285]: W0912 18:56:20.967904 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:20.968228 kubelet[3285]: E0912 18:56:20.967942 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:22.161934 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3568734981.mount: Deactivated successfully. 
Sep 12 18:56:22.536930 kubelet[3285]: E0912 18:56:22.536910 3285 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mkb45" podUID="19fb1996-8df9-4ece-9557-f77103d3c3c7" Sep 12 18:56:22.763693 containerd[1921]: time="2025-09-12T18:56:22.763640745Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:22.763912 containerd[1921]: time="2025-09-12T18:56:22.763796021Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389" Sep 12 18:56:22.764235 containerd[1921]: time="2025-09-12T18:56:22.764192843Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:22.765048 containerd[1921]: time="2025-09-12T18:56:22.765003794Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:22.765405 containerd[1921]: time="2025-09-12T18:56:22.765365070Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 2.159612377s" Sep 12 18:56:22.765405 containerd[1921]: time="2025-09-12T18:56:22.765379735Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\"" Sep 12 18:56:22.765868 containerd[1921]: time="2025-09-12T18:56:22.765856333Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 12 18:56:22.768948 containerd[1921]: time="2025-09-12T18:56:22.768932919Z" level=info msg="CreateContainer within sandbox \"d8ebb2c5869a55487d293aa43ee857f041c5ac173b48fd5136bbc74e6005fc56\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 12 18:56:22.771514 containerd[1921]: time="2025-09-12T18:56:22.771472143Z" level=info msg="Container 56afa352140d3244cebeb0eaac6e0b8f4a7e6216524841e3647c2d138bfffa31: CDI devices from CRI Config.CDIDevices: []" Sep 12 18:56:22.774135 containerd[1921]: time="2025-09-12T18:56:22.774122068Z" level=info msg="CreateContainer within sandbox \"d8ebb2c5869a55487d293aa43ee857f041c5ac173b48fd5136bbc74e6005fc56\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"56afa352140d3244cebeb0eaac6e0b8f4a7e6216524841e3647c2d138bfffa31\"" Sep 12 18:56:22.774327 containerd[1921]: time="2025-09-12T18:56:22.774313421Z" level=info msg="StartContainer for \"56afa352140d3244cebeb0eaac6e0b8f4a7e6216524841e3647c2d138bfffa31\"" Sep 12 18:56:22.774836 containerd[1921]: time="2025-09-12T18:56:22.774815052Z" level=info msg="connecting to shim 56afa352140d3244cebeb0eaac6e0b8f4a7e6216524841e3647c2d138bfffa31" address="unix:///run/containerd/s/100afbdc329f3fcf9b8db28e89504f05c5df4accbaa1a53d134a0960339c08f7" protocol=ttrpc version=3 Sep 12 18:56:22.792871 systemd[1]: Started 
cri-containerd-56afa352140d3244cebeb0eaac6e0b8f4a7e6216524841e3647c2d138bfffa31.scope - libcontainer container 56afa352140d3244cebeb0eaac6e0b8f4a7e6216524841e3647c2d138bfffa31. Sep 12 18:56:22.820246 containerd[1921]: time="2025-09-12T18:56:22.820225938Z" level=info msg="StartContainer for \"56afa352140d3244cebeb0eaac6e0b8f4a7e6216524841e3647c2d138bfffa31\" returns successfully" Sep 12 18:56:23.622944 kubelet[3285]: I0912 18:56:23.622762 3285 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6c594cc55-pnw9h" podStartSLOduration=1.462529166 podStartE2EDuration="3.622709881s" podCreationTimestamp="2025-09-12 18:56:20 +0000 UTC" firstStartedPulling="2025-09-12 18:56:20.605621636 +0000 UTC m=+15.155008023" lastFinishedPulling="2025-09-12 18:56:22.76580235 +0000 UTC m=+17.315188738" observedRunningTime="2025-09-12 18:56:23.622537771 +0000 UTC m=+18.171924260" watchObservedRunningTime="2025-09-12 18:56:23.622709881 +0000 UTC m=+18.172096321" Sep 12 18:56:23.646293 kubelet[3285]: E0912 18:56:23.646179 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.646293 kubelet[3285]: W0912 18:56:23.646230 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.646293 kubelet[3285]: E0912 18:56:23.646278 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:23.647054 kubelet[3285]: E0912 18:56:23.646957 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.647054 kubelet[3285]: W0912 18:56:23.646996 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.647054 kubelet[3285]: E0912 18:56:23.647031 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:23.647777 kubelet[3285]: E0912 18:56:23.647694 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.647777 kubelet[3285]: W0912 18:56:23.647730 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.647777 kubelet[3285]: E0912 18:56:23.647765 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 18:56:23.648508 kubelet[3285]: E0912 18:56:23.648428 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.648508 kubelet[3285]: W0912 18:56:23.648465 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.648508 kubelet[3285]: E0912 18:56:23.648499 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:23.649207 kubelet[3285]: E0912 18:56:23.649126 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.649207 kubelet[3285]: W0912 18:56:23.649164 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.649207 kubelet[3285]: E0912 18:56:23.649199 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:23.649756 kubelet[3285]: E0912 18:56:23.649697 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.649756 kubelet[3285]: W0912 18:56:23.649724 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.649756 kubelet[3285]: E0912 18:56:23.649753 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:23.650309 kubelet[3285]: E0912 18:56:23.650257 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.650309 kubelet[3285]: W0912 18:56:23.650287 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.650541 kubelet[3285]: E0912 18:56:23.650314 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:23.650906 kubelet[3285]: E0912 18:56:23.650853 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.650906 kubelet[3285]: W0912 18:56:23.650881 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.650906 kubelet[3285]: E0912 18:56:23.650909 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 18:56:23.651490 kubelet[3285]: E0912 18:56:23.651431 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.651490 kubelet[3285]: W0912 18:56:23.651460 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.651789 kubelet[3285]: E0912 18:56:23.651492 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:23.652030 kubelet[3285]: E0912 18:56:23.651977 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.652030 kubelet[3285]: W0912 18:56:23.652005 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.652030 kubelet[3285]: E0912 18:56:23.652031 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:23.652580 kubelet[3285]: E0912 18:56:23.652548 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.652580 kubelet[3285]: W0912 18:56:23.652579 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.652890 kubelet[3285]: E0912 18:56:23.652621 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:23.653199 kubelet[3285]: E0912 18:56:23.653143 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.653199 kubelet[3285]: W0912 18:56:23.653170 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.653199 kubelet[3285]: E0912 18:56:23.653197 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:23.653699 kubelet[3285]: E0912 18:56:23.653649 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.653699 kubelet[3285]: W0912 18:56:23.653674 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.653699 kubelet[3285]: E0912 18:56:23.653700 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 18:56:23.654230 kubelet[3285]: E0912 18:56:23.654171 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.654230 kubelet[3285]: W0912 18:56:23.654198 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.654230 kubelet[3285]: E0912 18:56:23.654224 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:23.654730 kubelet[3285]: E0912 18:56:23.654677 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.654730 kubelet[3285]: W0912 18:56:23.654703 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.654730 kubelet[3285]: E0912 18:56:23.654725 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:23.660290 kubelet[3285]: E0912 18:56:23.660243 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.660290 kubelet[3285]: W0912 18:56:23.660279 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.660642 kubelet[3285]: E0912 18:56:23.660312 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:23.661107 kubelet[3285]: E0912 18:56:23.661020 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.661107 kubelet[3285]: W0912 18:56:23.661058 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.661107 kubelet[3285]: E0912 18:56:23.661103 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:23.661821 kubelet[3285]: E0912 18:56:23.661734 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.661821 kubelet[3285]: W0912 18:56:23.661776 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.661821 kubelet[3285]: E0912 18:56:23.661819 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 18:56:23.662551 kubelet[3285]: E0912 18:56:23.662466 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.662551 kubelet[3285]: W0912 18:56:23.662503 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.662551 kubelet[3285]: E0912 18:56:23.662546 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:23.663298 kubelet[3285]: E0912 18:56:23.663216 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.663298 kubelet[3285]: W0912 18:56:23.663252 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.663636 kubelet[3285]: E0912 18:56:23.663364 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:23.663904 kubelet[3285]: E0912 18:56:23.663816 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.663904 kubelet[3285]: W0912 18:56:23.663846 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.664222 kubelet[3285]: E0912 18:56:23.663938 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:23.664629 kubelet[3285]: E0912 18:56:23.664528 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.664629 kubelet[3285]: W0912 18:56:23.664568 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.664928 kubelet[3285]: E0912 18:56:23.664807 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:23.665318 kubelet[3285]: E0912 18:56:23.665252 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.665318 kubelet[3285]: W0912 18:56:23.665289 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.665568 kubelet[3285]: E0912 18:56:23.665372 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 18:56:23.665959 kubelet[3285]: E0912 18:56:23.665892 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.665959 kubelet[3285]: W0912 18:56:23.665929 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.666179 kubelet[3285]: E0912 18:56:23.666031 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:23.666544 kubelet[3285]: E0912 18:56:23.666493 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.666544 kubelet[3285]: W0912 18:56:23.666522 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.666825 kubelet[3285]: E0912 18:56:23.666561 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:23.667237 kubelet[3285]: E0912 18:56:23.667167 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.667237 kubelet[3285]: W0912 18:56:23.667218 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.667438 kubelet[3285]: E0912 18:56:23.667284 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:23.667899 kubelet[3285]: E0912 18:56:23.667836 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.667899 kubelet[3285]: W0912 18:56:23.667876 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.668155 kubelet[3285]: E0912 18:56:23.667986 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:23.668513 kubelet[3285]: E0912 18:56:23.668476 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.668513 kubelet[3285]: W0912 18:56:23.668506 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.668791 kubelet[3285]: E0912 18:56:23.668630 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 18:56:23.669181 kubelet[3285]: E0912 18:56:23.669137 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.669359 kubelet[3285]: W0912 18:56:23.669179 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.669359 kubelet[3285]: E0912 18:56:23.669311 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:23.669842 kubelet[3285]: E0912 18:56:23.669804 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.669842 kubelet[3285]: W0912 18:56:23.669833 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.670082 kubelet[3285]: E0912 18:56:23.669874 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:23.670685 kubelet[3285]: E0912 18:56:23.670648 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.670685 kubelet[3285]: W0912 18:56:23.670680 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.670920 kubelet[3285]: E0912 18:56:23.670728 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:23.671391 kubelet[3285]: E0912 18:56:23.671355 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.671391 kubelet[3285]: W0912 18:56:23.671383 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.671624 kubelet[3285]: E0912 18:56:23.671425 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:23.672050 kubelet[3285]: E0912 18:56:23.671995 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:23.672050 kubelet[3285]: W0912 18:56:23.672022 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:23.672050 kubelet[3285]: E0912 18:56:23.672049 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 18:56:24.537691 kubelet[3285]: E0912 18:56:24.537541 3285 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mkb45" podUID="19fb1996-8df9-4ece-9557-f77103d3c3c7" Sep 12 18:56:24.602476 kubelet[3285]: I0912 18:56:24.602455 3285 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 18:56:24.623230 containerd[1921]: time="2025-09-12T18:56:24.623183293Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:24.623433 containerd[1921]: time="2025-09-12T18:56:24.623409520Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 12 18:56:24.623769 containerd[1921]: time="2025-09-12T18:56:24.623729455Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:24.624580 containerd[1921]: time="2025-09-12T18:56:24.624543407Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:24.624972 containerd[1921]: time="2025-09-12T18:56:24.624929791Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.859058378s" Sep 12 18:56:24.624972 containerd[1921]: time="2025-09-12T18:56:24.624946015Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 12 18:56:24.625814 containerd[1921]: time="2025-09-12T18:56:24.625802121Z" level=info msg="CreateContainer within sandbox \"4e8df7852a0401b9e58f31e3d3b489951ede8ad01aeef3c8f0f752d97d817bb2\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 12 18:56:24.628988 containerd[1921]: time="2025-09-12T18:56:24.628948733Z" level=info msg="Container 9c8a082e897bec2ce345370f6acc50855f4752c1824238a20969e4a572000136: CDI devices from CRI Config.CDIDevices: []" Sep 12 18:56:24.649104 containerd[1921]: time="2025-09-12T18:56:24.649057668Z" level=info msg="CreateContainer within sandbox \"4e8df7852a0401b9e58f31e3d3b489951ede8ad01aeef3c8f0f752d97d817bb2\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"9c8a082e897bec2ce345370f6acc50855f4752c1824238a20969e4a572000136\"" Sep 12 18:56:24.649306 containerd[1921]: time="2025-09-12T18:56:24.649295007Z" level=info msg="StartContainer for \"9c8a082e897bec2ce345370f6acc50855f4752c1824238a20969e4a572000136\"" Sep 12 18:56:24.650092 containerd[1921]: time="2025-09-12T18:56:24.650052079Z" level=info msg="connecting to shim 9c8a082e897bec2ce345370f6acc50855f4752c1824238a20969e4a572000136" 
address="unix:///run/containerd/s/7ab0899356d612a35bf4c91bdeec8be29bf5e5f080ebe9778f348bc0a23de635" protocol=ttrpc version=3 Sep 12 18:56:24.662279 kubelet[3285]: E0912 18:56:24.662233 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.662279 kubelet[3285]: W0912 18:56:24.662245 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.662279 kubelet[3285]: E0912 18:56:24.662257 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:24.662527 kubelet[3285]: E0912 18:56:24.662413 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.662527 kubelet[3285]: W0912 18:56:24.662419 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.662527 kubelet[3285]: E0912 18:56:24.662425 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:24.662594 kubelet[3285]: E0912 18:56:24.662552 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.662594 kubelet[3285]: W0912 18:56:24.662558 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.662594 kubelet[3285]: E0912 18:56:24.662563 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:24.662736 kubelet[3285]: E0912 18:56:24.662699 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.662736 kubelet[3285]: W0912 18:56:24.662707 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.662736 kubelet[3285]: E0912 18:56:24.662714 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:24.662889 kubelet[3285]: E0912 18:56:24.662851 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.662889 kubelet[3285]: W0912 18:56:24.662859 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.662889 kubelet[3285]: E0912 18:56:24.662866 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 18:56:24.663036 kubelet[3285]: E0912 18:56:24.662996 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.663036 kubelet[3285]: W0912 18:56:24.663004 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.663036 kubelet[3285]: E0912 18:56:24.663011 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:24.663134 kubelet[3285]: E0912 18:56:24.663126 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.663134 kubelet[3285]: W0912 18:56:24.663134 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.663177 kubelet[3285]: E0912 18:56:24.663140 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:24.663244 kubelet[3285]: E0912 18:56:24.663237 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.663244 kubelet[3285]: W0912 18:56:24.663244 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.663288 kubelet[3285]: E0912 18:56:24.663250 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:24.663380 kubelet[3285]: E0912 18:56:24.663348 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.663380 kubelet[3285]: W0912 18:56:24.663354 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.663380 kubelet[3285]: E0912 18:56:24.663359 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:24.663453 kubelet[3285]: E0912 18:56:24.663447 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.663453 kubelet[3285]: W0912 18:56:24.663453 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.663496 kubelet[3285]: E0912 18:56:24.663458 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 18:56:24.663553 kubelet[3285]: E0912 18:56:24.663547 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.663575 kubelet[3285]: W0912 18:56:24.663553 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.663575 kubelet[3285]: E0912 18:56:24.663558 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:24.663661 kubelet[3285]: E0912 18:56:24.663655 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.663661 kubelet[3285]: W0912 18:56:24.663661 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.663708 kubelet[3285]: E0912 18:56:24.663666 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:24.663804 kubelet[3285]: E0912 18:56:24.663798 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.663826 kubelet[3285]: W0912 18:56:24.663803 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.663826 kubelet[3285]: E0912 18:56:24.663809 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:24.663904 kubelet[3285]: E0912 18:56:24.663898 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.663930 kubelet[3285]: W0912 18:56:24.663904 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.663930 kubelet[3285]: E0912 18:56:24.663909 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:24.664005 kubelet[3285]: E0912 18:56:24.663999 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.664027 kubelet[3285]: W0912 18:56:24.664006 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.664027 kubelet[3285]: E0912 18:56:24.664011 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 18:56:24.671488 kubelet[3285]: E0912 18:56:24.671477 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.671488 kubelet[3285]: W0912 18:56:24.671486 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.671567 kubelet[3285]: E0912 18:56:24.671496 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:24.671691 kubelet[3285]: E0912 18:56:24.671681 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.671691 kubelet[3285]: W0912 18:56:24.671690 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.671758 kubelet[3285]: E0912 18:56:24.671701 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:24.671914 kubelet[3285]: E0912 18:56:24.671905 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.671914 kubelet[3285]: W0912 18:56:24.671914 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.671971 kubelet[3285]: E0912 18:56:24.671924 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:24.672126 kubelet[3285]: E0912 18:56:24.672116 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.672126 kubelet[3285]: W0912 18:56:24.672126 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.672180 kubelet[3285]: E0912 18:56:24.672136 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:24.672303 kubelet[3285]: E0912 18:56:24.672294 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.672303 kubelet[3285]: W0912 18:56:24.672302 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.672355 kubelet[3285]: E0912 18:56:24.672311 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 18:56:24.672431 kubelet[3285]: E0912 18:56:24.672424 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.672431 kubelet[3285]: W0912 18:56:24.672430 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.672487 kubelet[3285]: E0912 18:56:24.672438 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:24.672567 kubelet[3285]: E0912 18:56:24.672560 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.672567 kubelet[3285]: W0912 18:56:24.672566 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.672626 kubelet[3285]: E0912 18:56:24.672574 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:24.672731 kubelet[3285]: E0912 18:56:24.672721 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.672757 kubelet[3285]: W0912 18:56:24.672731 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.672757 kubelet[3285]: E0912 18:56:24.672742 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:24.672897 kubelet[3285]: E0912 18:56:24.672891 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.672897 kubelet[3285]: W0912 18:56:24.672897 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.672952 kubelet[3285]: E0912 18:56:24.672908 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:24.673068 kubelet[3285]: E0912 18:56:24.673060 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.673068 kubelet[3285]: W0912 18:56:24.673067 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.673129 kubelet[3285]: E0912 18:56:24.673088 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 18:56:24.673199 kubelet[3285]: E0912 18:56:24.673192 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.673199 kubelet[3285]: W0912 18:56:24.673199 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.673263 kubelet[3285]: E0912 18:56:24.673219 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:24.673325 kubelet[3285]: E0912 18:56:24.673317 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.673356 kubelet[3285]: W0912 18:56:24.673324 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.673356 kubelet[3285]: E0912 18:56:24.673334 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:24.673489 kubelet[3285]: E0912 18:56:24.673481 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.673489 kubelet[3285]: W0912 18:56:24.673488 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.673545 kubelet[3285]: E0912 18:56:24.673498 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:24.673735 kubelet[3285]: E0912 18:56:24.673725 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.673769 kubelet[3285]: W0912 18:56:24.673738 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.673769 kubelet[3285]: E0912 18:56:24.673750 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:24.673912 kubelet[3285]: E0912 18:56:24.673904 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.673912 kubelet[3285]: W0912 18:56:24.673911 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.673975 kubelet[3285]: E0912 18:56:24.673923 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 18:56:24.674066 kubelet[3285]: E0912 18:56:24.674058 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.674066 kubelet[3285]: W0912 18:56:24.674065 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.674132 kubelet[3285]: E0912 18:56:24.674075 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:24.674304 kubelet[3285]: E0912 18:56:24.674296 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.674304 kubelet[3285]: W0912 18:56:24.674304 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.674360 kubelet[3285]: E0912 18:56:24.674314 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:24.674441 kubelet[3285]: E0912 18:56:24.674434 3285 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 18:56:24.674441 kubelet[3285]: W0912 18:56:24.674440 3285 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 18:56:24.674491 kubelet[3285]: E0912 18:56:24.674447 3285 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 18:56:24.675703 systemd[1]: Started cri-containerd-9c8a082e897bec2ce345370f6acc50855f4752c1824238a20969e4a572000136.scope - libcontainer container 9c8a082e897bec2ce345370f6acc50855f4752c1824238a20969e4a572000136. Sep 12 18:56:24.707686 containerd[1921]: time="2025-09-12T18:56:24.707656555Z" level=info msg="StartContainer for \"9c8a082e897bec2ce345370f6acc50855f4752c1824238a20969e4a572000136\" returns successfully" Sep 12 18:56:24.715082 systemd[1]: cri-containerd-9c8a082e897bec2ce345370f6acc50855f4752c1824238a20969e4a572000136.scope: Deactivated successfully. Sep 12 18:56:24.717123 containerd[1921]: time="2025-09-12T18:56:24.717091202Z" level=info msg="received exit event container_id:\"9c8a082e897bec2ce345370f6acc50855f4752c1824238a20969e4a572000136\" id:\"9c8a082e897bec2ce345370f6acc50855f4752c1824238a20969e4a572000136\" pid:4145 exited_at:{seconds:1757703384 nanos:716833828}" Sep 12 18:56:24.717210 containerd[1921]: time="2025-09-12T18:56:24.717101573Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c8a082e897bec2ce345370f6acc50855f4752c1824238a20969e4a572000136\" id:\"9c8a082e897bec2ce345370f6acc50855f4752c1824238a20969e4a572000136\" pid:4145 exited_at:{seconds:1757703384 nanos:716833828}" Sep 12 18:56:24.735402 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9c8a082e897bec2ce345370f6acc50855f4752c1824238a20969e4a572000136-rootfs.mount: Deactivated successfully. 
Sep 12 18:56:26.537367 kubelet[3285]: E0912 18:56:26.537228 3285 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mkb45" podUID="19fb1996-8df9-4ece-9557-f77103d3c3c7" Sep 12 18:56:26.622292 containerd[1921]: time="2025-09-12T18:56:26.622221380Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 12 18:56:28.537112 kubelet[3285]: E0912 18:56:28.537089 3285 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mkb45" podUID="19fb1996-8df9-4ece-9557-f77103d3c3c7" Sep 12 18:56:29.755976 containerd[1921]: time="2025-09-12T18:56:29.755922236Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:29.756201 containerd[1921]: time="2025-09-12T18:56:29.756154656Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 12 18:56:29.756515 containerd[1921]: time="2025-09-12T18:56:29.756475279Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:29.757312 containerd[1921]: time="2025-09-12T18:56:29.757297315Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:29.757696 containerd[1921]: time="2025-09-12T18:56:29.757683180Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 3.135396525s" Sep 12 18:56:29.757738 containerd[1921]: time="2025-09-12T18:56:29.757699828Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 12 18:56:29.758615 containerd[1921]: time="2025-09-12T18:56:29.758603662Z" level=info msg="CreateContainer within sandbox \"4e8df7852a0401b9e58f31e3d3b489951ede8ad01aeef3c8f0f752d97d817bb2\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 12 18:56:29.761988 containerd[1921]: time="2025-09-12T18:56:29.761948144Z" level=info msg="Container 282c3d9e0794ccaa3a55bbace37cfd65a24cd8d98b59ae08e42d29cd3d74d930: CDI devices from CRI Config.CDIDevices: []" Sep 12 18:56:29.765930 containerd[1921]: time="2025-09-12T18:56:29.765887450Z" level=info msg="CreateContainer within sandbox \"4e8df7852a0401b9e58f31e3d3b489951ede8ad01aeef3c8f0f752d97d817bb2\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"282c3d9e0794ccaa3a55bbace37cfd65a24cd8d98b59ae08e42d29cd3d74d930\"" Sep 12 18:56:29.766162 containerd[1921]: time="2025-09-12T18:56:29.766116884Z" level=info msg="StartContainer for \"282c3d9e0794ccaa3a55bbace37cfd65a24cd8d98b59ae08e42d29cd3d74d930\"" Sep 12 18:56:29.766912 
containerd[1921]: time="2025-09-12T18:56:29.766897750Z" level=info msg="connecting to shim 282c3d9e0794ccaa3a55bbace37cfd65a24cd8d98b59ae08e42d29cd3d74d930" address="unix:///run/containerd/s/7ab0899356d612a35bf4c91bdeec8be29bf5e5f080ebe9778f348bc0a23de635" protocol=ttrpc version=3 Sep 12 18:56:29.781878 systemd[1]: Started cri-containerd-282c3d9e0794ccaa3a55bbace37cfd65a24cd8d98b59ae08e42d29cd3d74d930.scope - libcontainer container 282c3d9e0794ccaa3a55bbace37cfd65a24cd8d98b59ae08e42d29cd3d74d930. Sep 12 18:56:29.801555 containerd[1921]: time="2025-09-12T18:56:29.801533927Z" level=info msg="StartContainer for \"282c3d9e0794ccaa3a55bbace37cfd65a24cd8d98b59ae08e42d29cd3d74d930\" returns successfully" Sep 12 18:56:30.372159 containerd[1921]: time="2025-09-12T18:56:30.372095198Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 18:56:30.373220 systemd[1]: cri-containerd-282c3d9e0794ccaa3a55bbace37cfd65a24cd8d98b59ae08e42d29cd3d74d930.scope: Deactivated successfully. Sep 12 18:56:30.373406 systemd[1]: cri-containerd-282c3d9e0794ccaa3a55bbace37cfd65a24cd8d98b59ae08e42d29cd3d74d930.scope: Consumed 359ms CPU time, 195.3M memory peak, 171.3M written to disk. Sep 12 18:56:30.373801 containerd[1921]: time="2025-09-12T18:56:30.373758056Z" level=info msg="received exit event container_id:\"282c3d9e0794ccaa3a55bbace37cfd65a24cd8d98b59ae08e42d29cd3d74d930\" id:\"282c3d9e0794ccaa3a55bbace37cfd65a24cd8d98b59ae08e42d29cd3d74d930\" pid:4204 exited_at:{seconds:1757703390 nanos:373628431}" Sep 12 18:56:30.373801 containerd[1921]: time="2025-09-12T18:56:30.373786392Z" level=info msg="TaskExit event in podsandbox handler container_id:\"282c3d9e0794ccaa3a55bbace37cfd65a24cd8d98b59ae08e42d29cd3d74d930\" id:\"282c3d9e0794ccaa3a55bbace37cfd65a24cd8d98b59ae08e42d29cd3d74d930\" pid:4204 exited_at:{seconds:1757703390 nanos:373628431}" Sep 12 18:56:30.386396 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-282c3d9e0794ccaa3a55bbace37cfd65a24cd8d98b59ae08e42d29cd3d74d930-rootfs.mount: Deactivated successfully. Sep 12 18:56:30.390684 kubelet[3285]: I0912 18:56:30.390440 3285 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 12 18:56:30.433754 systemd[1]: Created slice kubepods-burstable-podecd0baa7_57e9_4324_aeff_1245c967addc.slice - libcontainer container kubepods-burstable-podecd0baa7_57e9_4324_aeff_1245c967addc.slice. Sep 12 18:56:30.443175 systemd[1]: Created slice kubepods-besteffort-pod8ad1c2c3_91b8_4133_bd04_a7165ab7b049.slice - libcontainer container kubepods-besteffort-pod8ad1c2c3_91b8_4133_bd04_a7165ab7b049.slice. Sep 12 18:56:30.451328 systemd[1]: Created slice kubepods-besteffort-pod39d4e67c_14c5_4675_8693_1f65bf0c3499.slice - libcontainer container kubepods-besteffort-pod39d4e67c_14c5_4675_8693_1f65bf0c3499.slice. Sep 12 18:56:30.459895 systemd[1]: Created slice kubepods-burstable-podea20e964_361f_4ead_8929_49881fdc393b.slice - libcontainer container kubepods-burstable-podea20e964_361f_4ead_8929_49881fdc393b.slice. Sep 12 18:56:30.468094 systemd[1]: Created slice kubepods-besteffort-podb76b76fa_54b7_4652_8227_b8f6b96853be.slice - libcontainer container kubepods-besteffort-podb76b76fa_54b7_4652_8227_b8f6b96853be.slice. 
Sep 12 18:56:30.476783 systemd[1]: Created slice kubepods-besteffort-pode52696bd_4cda_4312_8869_f48d61792f6c.slice - libcontainer container kubepods-besteffort-pode52696bd_4cda_4312_8869_f48d61792f6c.slice. Sep 12 18:56:30.483176 systemd[1]: Created slice kubepods-besteffort-pod450d1971_2f13_49d5_abdc_f1fd517af156.slice - libcontainer container kubepods-besteffort-pod450d1971_2f13_49d5_abdc_f1fd517af156.slice. Sep 12 18:56:30.513353 kubelet[3285]: I0912 18:56:30.513288 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/450d1971-2f13-49d5-abdc-f1fd517af156-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-9jkf4\" (UID: \"450d1971-2f13-49d5-abdc-f1fd517af156\") " pod="calico-system/goldmane-54d579b49d-9jkf4" Sep 12 18:56:30.513353 kubelet[3285]: I0912 18:56:30.513331 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/450d1971-2f13-49d5-abdc-f1fd517af156-goldmane-key-pair\") pod \"goldmane-54d579b49d-9jkf4\" (UID: \"450d1971-2f13-49d5-abdc-f1fd517af156\") " pod="calico-system/goldmane-54d579b49d-9jkf4" Sep 12 18:56:30.513353 kubelet[3285]: I0912 18:56:30.513354 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ecd0baa7-57e9-4324-aeff-1245c967addc-config-volume\") pod \"coredns-668d6bf9bc-nzcks\" (UID: \"ecd0baa7-57e9-4324-aeff-1245c967addc\") " pod="kube-system/coredns-668d6bf9bc-nzcks" Sep 12 18:56:30.513616 kubelet[3285]: I0912 18:56:30.513377 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e52696bd-4cda-4312-8869-f48d61792f6c-whisker-backend-key-pair\") pod \"whisker-76f44846d5-pnf2g\" (UID: \"e52696bd-4cda-4312-8869-f48d61792f6c\") " pod="calico-system/whisker-76f44846d5-pnf2g" Sep 12 18:56:30.513616 kubelet[3285]: I0912 18:56:30.513397 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/450d1971-2f13-49d5-abdc-f1fd517af156-config\") pod \"goldmane-54d579b49d-9jkf4\" (UID: \"450d1971-2f13-49d5-abdc-f1fd517af156\") " pod="calico-system/goldmane-54d579b49d-9jkf4" Sep 12 18:56:30.513616 kubelet[3285]: I0912 18:56:30.513430 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39d4e67c-14c5-4675-8693-1f65bf0c3499-tigera-ca-bundle\") pod \"calico-kube-controllers-6cdb495f56-2zsfm\" (UID: \"39d4e67c-14c5-4675-8693-1f65bf0c3499\") " pod="calico-system/calico-kube-controllers-6cdb495f56-2zsfm" Sep 12 18:56:30.513616 kubelet[3285]: I0912 18:56:30.513458 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e52696bd-4cda-4312-8869-f48d61792f6c-whisker-ca-bundle\") pod \"whisker-76f44846d5-pnf2g\" (UID: \"e52696bd-4cda-4312-8869-f48d61792f6c\") " pod="calico-system/whisker-76f44846d5-pnf2g" Sep 12 18:56:30.513616 kubelet[3285]: I0912 18:56:30.513478 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nksvc\" (UniqueName: 
\"kubernetes.io/projected/ecd0baa7-57e9-4324-aeff-1245c967addc-kube-api-access-nksvc\") pod \"coredns-668d6bf9bc-nzcks\" (UID: \"ecd0baa7-57e9-4324-aeff-1245c967addc\") " pod="kube-system/coredns-668d6bf9bc-nzcks" Sep 12 18:56:30.513859 kubelet[3285]: I0912 18:56:30.513503 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/b76b76fa-54b7-4652-8227-b8f6b96853be-calico-apiserver-certs\") pod \"calico-apiserver-56857fb794-l2ttg\" (UID: \"b76b76fa-54b7-4652-8227-b8f6b96853be\") " pod="calico-apiserver/calico-apiserver-56857fb794-l2ttg" Sep 12 18:56:30.513859 kubelet[3285]: I0912 18:56:30.513551 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmkmj\" (UniqueName: \"kubernetes.io/projected/39d4e67c-14c5-4675-8693-1f65bf0c3499-kube-api-access-nmkmj\") pod \"calico-kube-controllers-6cdb495f56-2zsfm\" (UID: \"39d4e67c-14c5-4675-8693-1f65bf0c3499\") " pod="calico-system/calico-kube-controllers-6cdb495f56-2zsfm" Sep 12 18:56:30.513859 kubelet[3285]: I0912 18:56:30.513611 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw6s9\" (UniqueName: \"kubernetes.io/projected/b76b76fa-54b7-4652-8227-b8f6b96853be-kube-api-access-pw6s9\") pod \"calico-apiserver-56857fb794-l2ttg\" (UID: \"b76b76fa-54b7-4652-8227-b8f6b96853be\") " pod="calico-apiserver/calico-apiserver-56857fb794-l2ttg" Sep 12 18:56:30.513859 kubelet[3285]: I0912 18:56:30.513633 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn5r6\" (UniqueName: \"kubernetes.io/projected/e52696bd-4cda-4312-8869-f48d61792f6c-kube-api-access-dn5r6\") pod \"whisker-76f44846d5-pnf2g\" (UID: \"e52696bd-4cda-4312-8869-f48d61792f6c\") " pod="calico-system/whisker-76f44846d5-pnf2g" Sep 12 18:56:30.513859 kubelet[3285]: I0912 18:56:30.513654 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2c8c\" (UniqueName: \"kubernetes.io/projected/450d1971-2f13-49d5-abdc-f1fd517af156-kube-api-access-c2c8c\") pod \"goldmane-54d579b49d-9jkf4\" (UID: \"450d1971-2f13-49d5-abdc-f1fd517af156\") " pod="calico-system/goldmane-54d579b49d-9jkf4" Sep 12 18:56:30.514072 kubelet[3285]: I0912 18:56:30.513724 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtjvf\" (UniqueName: \"kubernetes.io/projected/8ad1c2c3-91b8-4133-bd04-a7165ab7b049-kube-api-access-mtjvf\") pod \"calico-apiserver-56857fb794-2bwf9\" (UID: \"8ad1c2c3-91b8-4133-bd04-a7165ab7b049\") " pod="calico-apiserver/calico-apiserver-56857fb794-2bwf9" Sep 12 18:56:30.514072 kubelet[3285]: I0912 18:56:30.513753 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk7xl\" (UniqueName: \"kubernetes.io/projected/ea20e964-361f-4ead-8929-49881fdc393b-kube-api-access-dk7xl\") pod \"coredns-668d6bf9bc-drxkk\" (UID: \"ea20e964-361f-4ead-8929-49881fdc393b\") " pod="kube-system/coredns-668d6bf9bc-drxkk" Sep 12 18:56:30.514072 kubelet[3285]: I0912 18:56:30.513777 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8ad1c2c3-91b8-4133-bd04-a7165ab7b049-calico-apiserver-certs\") pod \"calico-apiserver-56857fb794-2bwf9\" (UID: 
\"8ad1c2c3-91b8-4133-bd04-a7165ab7b049\") " pod="calico-apiserver/calico-apiserver-56857fb794-2bwf9" Sep 12 18:56:30.514072 kubelet[3285]: I0912 18:56:30.513795 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea20e964-361f-4ead-8929-49881fdc393b-config-volume\") pod \"coredns-668d6bf9bc-drxkk\" (UID: \"ea20e964-361f-4ead-8929-49881fdc393b\") " pod="kube-system/coredns-668d6bf9bc-drxkk" Sep 12 18:56:30.551036 systemd[1]: Created slice kubepods-besteffort-pod19fb1996_8df9_4ece_9557_f77103d3c3c7.slice - libcontainer container kubepods-besteffort-pod19fb1996_8df9_4ece_9557_f77103d3c3c7.slice. Sep 12 18:56:30.555228 containerd[1921]: time="2025-09-12T18:56:30.555191305Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mkb45,Uid:19fb1996-8df9-4ece-9557-f77103d3c3c7,Namespace:calico-system,Attempt:0,}" Sep 12 18:56:30.705641 containerd[1921]: time="2025-09-12T18:56:30.705534025Z" level=error msg="Failed to destroy network for sandbox \"50e5bd0fb7967b9c88e246c39120aea468c49c2c350845ce45530c2e96ab5622\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 18:56:30.729474 containerd[1921]: time="2025-09-12T18:56:30.729449821Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mkb45,Uid:19fb1996-8df9-4ece-9557-f77103d3c3c7,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"50e5bd0fb7967b9c88e246c39120aea468c49c2c350845ce45530c2e96ab5622\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 18:56:30.730571 kubelet[3285]: E0912 18:56:30.729573 3285 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"50e5bd0fb7967b9c88e246c39120aea468c49c2c350845ce45530c2e96ab5622\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 18:56:30.730571 kubelet[3285]: E0912 18:56:30.729621 3285 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"50e5bd0fb7967b9c88e246c39120aea468c49c2c350845ce45530c2e96ab5622\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-mkb45" Sep 12 18:56:30.730571 kubelet[3285]: E0912 18:56:30.729634 3285 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"50e5bd0fb7967b9c88e246c39120aea468c49c2c350845ce45530c2e96ab5622\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-mkb45" Sep 12 18:56:30.730670 kubelet[3285]: E0912 18:56:30.729658 3285 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"csi-node-driver-mkb45_calico-system(19fb1996-8df9-4ece-9557-f77103d3c3c7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-mkb45_calico-system(19fb1996-8df9-4ece-9557-f77103d3c3c7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"50e5bd0fb7967b9c88e246c39120aea468c49c2c350845ce45530c2e96ab5622\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mkb45" podUID="19fb1996-8df9-4ece-9557-f77103d3c3c7" Sep 12 18:56:30.740295 containerd[1921]: time="2025-09-12T18:56:30.739272155Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-nzcks,Uid:ecd0baa7-57e9-4324-aeff-1245c967addc,Namespace:kube-system,Attempt:0,}" Sep 12 18:56:30.746545 containerd[1921]: time="2025-09-12T18:56:30.746523208Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56857fb794-2bwf9,Uid:8ad1c2c3-91b8-4133-bd04-a7165ab7b049,Namespace:calico-apiserver,Attempt:0,}" Sep 12 18:56:30.754963 containerd[1921]: time="2025-09-12T18:56:30.754937849Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6cdb495f56-2zsfm,Uid:39d4e67c-14c5-4675-8693-1f65bf0c3499,Namespace:calico-system,Attempt:0,}" Sep 12 18:56:30.764505 containerd[1921]: time="2025-09-12T18:56:30.764480826Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-drxkk,Uid:ea20e964-361f-4ead-8929-49881fdc393b,Namespace:kube-system,Attempt:0,}" Sep 12 18:56:30.767347 containerd[1921]: time="2025-09-12T18:56:30.767315392Z" level=error msg="Failed to destroy network for sandbox \"801ef221b1f0128407981e9a176046bc00bb343bb4f67abfb29f3d8f7f9d3212\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 18:56:30.767774 containerd[1921]: time="2025-09-12T18:56:30.767751721Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-nzcks,Uid:ecd0baa7-57e9-4324-aeff-1245c967addc,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"801ef221b1f0128407981e9a176046bc00bb343bb4f67abfb29f3d8f7f9d3212\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 18:56:30.767945 kubelet[3285]: E0912 18:56:30.767923 3285 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"801ef221b1f0128407981e9a176046bc00bb343bb4f67abfb29f3d8f7f9d3212\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 18:56:30.768001 kubelet[3285]: E0912 18:56:30.767968 3285 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"801ef221b1f0128407981e9a176046bc00bb343bb4f67abfb29f3d8f7f9d3212\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-nzcks" Sep 12 18:56:30.768001 
kubelet[3285]: E0912 18:56:30.767990 3285 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"801ef221b1f0128407981e9a176046bc00bb343bb4f67abfb29f3d8f7f9d3212\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-nzcks" Sep 12 18:56:30.768065 kubelet[3285]: E0912 18:56:30.768027 3285 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-nzcks_kube-system(ecd0baa7-57e9-4324-aeff-1245c967addc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-nzcks_kube-system(ecd0baa7-57e9-4324-aeff-1245c967addc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"801ef221b1f0128407981e9a176046bc00bb343bb4f67abfb29f3d8f7f9d3212\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-nzcks" podUID="ecd0baa7-57e9-4324-aeff-1245c967addc" Sep 12 18:56:30.771705 containerd[1921]: time="2025-09-12T18:56:30.771674578Z" level=error msg="Failed to destroy network for sandbox \"f2893961dfd6dc325b0a111439426e5c91579fa43c9a1286bf2057131bad665a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 18:56:30.772084 containerd[1921]: time="2025-09-12T18:56:30.772065944Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56857fb794-2bwf9,Uid:8ad1c2c3-91b8-4133-bd04-a7165ab7b049,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f2893961dfd6dc325b0a111439426e5c91579fa43c9a1286bf2057131bad665a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 18:56:30.772136 systemd[1]: run-netns-cni\x2d7176ac48\x2d7b0b\x2d07df\x2dbd4e\x2d81a0282f2ce7.mount: Deactivated successfully. 
Sep 12 18:56:30.772222 kubelet[3285]: E0912 18:56:30.772199 3285 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f2893961dfd6dc325b0a111439426e5c91579fa43c9a1286bf2057131bad665a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 18:56:30.772256 kubelet[3285]: E0912 18:56:30.772238 3285 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f2893961dfd6dc325b0a111439426e5c91579fa43c9a1286bf2057131bad665a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-56857fb794-2bwf9" Sep 12 18:56:30.772256 kubelet[3285]: E0912 18:56:30.772252 3285 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f2893961dfd6dc325b0a111439426e5c91579fa43c9a1286bf2057131bad665a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-56857fb794-2bwf9" Sep 12 18:56:30.772303 kubelet[3285]: E0912 18:56:30.772275 3285 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-56857fb794-2bwf9_calico-apiserver(8ad1c2c3-91b8-4133-bd04-a7165ab7b049)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-56857fb794-2bwf9_calico-apiserver(8ad1c2c3-91b8-4133-bd04-a7165ab7b049)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f2893961dfd6dc325b0a111439426e5c91579fa43c9a1286bf2057131bad665a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-56857fb794-2bwf9" podUID="8ad1c2c3-91b8-4133-bd04-a7165ab7b049" Sep 12 18:56:30.772577 containerd[1921]: time="2025-09-12T18:56:30.772559253Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56857fb794-l2ttg,Uid:b76b76fa-54b7-4652-8227-b8f6b96853be,Namespace:calico-apiserver,Attempt:0,}" Sep 12 18:56:30.774844 systemd[1]: run-netns-cni\x2d12e95bb3\x2d303e\x2db4ee\x2d0af5\x2d88be20584afe.mount: Deactivated successfully. Sep 12 18:56:30.774923 systemd[1]: run-netns-cni\x2dc6929431\x2d0820\x2d0c49\x2da77b\x2d7dc5a6cf4dbf.mount: Deactivated successfully. 
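Each RunPodSandbox failure above has the same root cause: the Calico CNI plugin stats /var/lib/calico/nodename, a file that calico-node writes only once it is running, so pods scheduled before that point lose the race and their half-built sandboxes are torn down again (hence the run-netns cleanups). A rough sketch of that precondition, assuming only what the error text states (the real plugin performs this inside its ADD/DEL handling):

package main

import (
	"errors"
	"fmt"
	"os"
)

// calicoNodeReady reports whether calico-node has published the node name
// that the CNI plugin reads during sandbox setup; the error text mirrors
// the log entries above.
func calicoNodeReady(path string) (bool, error) {
	if _, err := os.Stat(path); err != nil {
		if errors.Is(err, os.ErrNotExist) {
			return false, fmt.Errorf("stat %s: no such file or directory: "+
				"check that the calico/node container is running and has mounted /var/lib/calico/", path)
		}
		return false, err
	}
	return true, nil
}

func main() {
	if ok, err := calicoNodeReady("/var/lib/calico/nodename"); !ok {
		fmt.Println("sandbox setup would fail:", err)
	} else {
		fmt.Println("calico/node ready")
	}
}

Once calico-node is up (see the image pull and StartContainer entries further down), the file exists and the same sandboxes can be retried successfully.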
Sep 12 18:56:30.780925 containerd[1921]: time="2025-09-12T18:56:30.780857552Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-76f44846d5-pnf2g,Uid:e52696bd-4cda-4312-8869-f48d61792f6c,Namespace:calico-system,Attempt:0,}" Sep 12 18:56:30.781043 containerd[1921]: time="2025-09-12T18:56:30.780969250Z" level=error msg="Failed to destroy network for sandbox \"a1c8b587c5d8152af7da72ece9c35ee3182083d5eb70f65537d7f32e8ad14748\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 18:56:30.781608 containerd[1921]: time="2025-09-12T18:56:30.781582172Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6cdb495f56-2zsfm,Uid:39d4e67c-14c5-4675-8693-1f65bf0c3499,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1c8b587c5d8152af7da72ece9c35ee3182083d5eb70f65537d7f32e8ad14748\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 18:56:30.781710 kubelet[3285]: E0912 18:56:30.781691 3285 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1c8b587c5d8152af7da72ece9c35ee3182083d5eb70f65537d7f32e8ad14748\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 18:56:30.781746 kubelet[3285]: E0912 18:56:30.781723 3285 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1c8b587c5d8152af7da72ece9c35ee3182083d5eb70f65537d7f32e8ad14748\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6cdb495f56-2zsfm" Sep 12 18:56:30.781746 kubelet[3285]: E0912 18:56:30.781736 3285 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1c8b587c5d8152af7da72ece9c35ee3182083d5eb70f65537d7f32e8ad14748\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6cdb495f56-2zsfm" Sep 12 18:56:30.781789 kubelet[3285]: E0912 18:56:30.781762 3285 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6cdb495f56-2zsfm_calico-system(39d4e67c-14c5-4675-8693-1f65bf0c3499)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6cdb495f56-2zsfm_calico-system(39d4e67c-14c5-4675-8693-1f65bf0c3499)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a1c8b587c5d8152af7da72ece9c35ee3182083d5eb70f65537d7f32e8ad14748\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6cdb495f56-2zsfm" podUID="39d4e67c-14c5-4675-8693-1f65bf0c3499" Sep 12 
18:56:30.782520 systemd[1]: run-netns-cni\x2da6e96648\x2ddf39\x2dafd0\x2d8fcd\x2d990326758ec6.mount: Deactivated successfully. Sep 12 18:56:30.785624 containerd[1921]: time="2025-09-12T18:56:30.785401101Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-9jkf4,Uid:450d1971-2f13-49d5-abdc-f1fd517af156,Namespace:calico-system,Attempt:0,}" Sep 12 18:56:30.790029 containerd[1921]: time="2025-09-12T18:56:30.789996326Z" level=error msg="Failed to destroy network for sandbox \"dbf363a6f09e958ad66b9834d7c9be5c8687e5b965af84bf543c6cc4ab64101b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 18:56:30.798349 containerd[1921]: time="2025-09-12T18:56:30.798322941Z" level=error msg="Failed to destroy network for sandbox \"4451f938ce88f9557b5945dd63379ec28801d5c3d765db04b0993c235de7c180\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 18:56:30.803771 containerd[1921]: time="2025-09-12T18:56:30.803701692Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-drxkk,Uid:ea20e964-361f-4ead-8929-49881fdc393b,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"dbf363a6f09e958ad66b9834d7c9be5c8687e5b965af84bf543c6cc4ab64101b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 18:56:30.803942 kubelet[3285]: E0912 18:56:30.803888 3285 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dbf363a6f09e958ad66b9834d7c9be5c8687e5b965af84bf543c6cc4ab64101b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 18:56:30.803942 kubelet[3285]: E0912 18:56:30.803929 3285 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dbf363a6f09e958ad66b9834d7c9be5c8687e5b965af84bf543c6cc4ab64101b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-drxkk" Sep 12 18:56:30.804011 kubelet[3285]: E0912 18:56:30.803943 3285 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dbf363a6f09e958ad66b9834d7c9be5c8687e5b965af84bf543c6cc4ab64101b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-drxkk" Sep 12 18:56:30.804011 kubelet[3285]: E0912 18:56:30.803973 3285 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-drxkk_kube-system(ea20e964-361f-4ead-8929-49881fdc393b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-drxkk_kube-system(ea20e964-361f-4ead-8929-49881fdc393b)\\\": rpc error: code = 
Unknown desc = failed to setup network for sandbox \\\"dbf363a6f09e958ad66b9834d7c9be5c8687e5b965af84bf543c6cc4ab64101b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-drxkk" podUID="ea20e964-361f-4ead-8929-49881fdc393b" Sep 12 18:56:30.805740 containerd[1921]: time="2025-09-12T18:56:30.805715315Z" level=error msg="Failed to destroy network for sandbox \"e9c7e85fad1c59116670c98eec91811ddfd4f3f51ad560f4dbe6be9fcf46c789\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 18:56:30.806664 containerd[1921]: time="2025-09-12T18:56:30.806643151Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56857fb794-l2ttg,Uid:b76b76fa-54b7-4652-8227-b8f6b96853be,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4451f938ce88f9557b5945dd63379ec28801d5c3d765db04b0993c235de7c180\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 18:56:30.806788 kubelet[3285]: E0912 18:56:30.806763 3285 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4451f938ce88f9557b5945dd63379ec28801d5c3d765db04b0993c235de7c180\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 18:56:30.806842 kubelet[3285]: E0912 18:56:30.806800 3285 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4451f938ce88f9557b5945dd63379ec28801d5c3d765db04b0993c235de7c180\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-56857fb794-l2ttg" Sep 12 18:56:30.806842 kubelet[3285]: E0912 18:56:30.806812 3285 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4451f938ce88f9557b5945dd63379ec28801d5c3d765db04b0993c235de7c180\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-56857fb794-l2ttg" Sep 12 18:56:30.806905 kubelet[3285]: E0912 18:56:30.806835 3285 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-56857fb794-l2ttg_calico-apiserver(b76b76fa-54b7-4652-8227-b8f6b96853be)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-56857fb794-l2ttg_calico-apiserver(b76b76fa-54b7-4652-8227-b8f6b96853be)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4451f938ce88f9557b5945dd63379ec28801d5c3d765db04b0993c235de7c180\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-56857fb794-l2ttg" podUID="b76b76fa-54b7-4652-8227-b8f6b96853be" Sep 12 18:56:30.807256 containerd[1921]: time="2025-09-12T18:56:30.807237127Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-76f44846d5-pnf2g,Uid:e52696bd-4cda-4312-8869-f48d61792f6c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e9c7e85fad1c59116670c98eec91811ddfd4f3f51ad560f4dbe6be9fcf46c789\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 18:56:30.807326 kubelet[3285]: E0912 18:56:30.807305 3285 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e9c7e85fad1c59116670c98eec91811ddfd4f3f51ad560f4dbe6be9fcf46c789\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 18:56:30.807362 kubelet[3285]: E0912 18:56:30.807335 3285 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e9c7e85fad1c59116670c98eec91811ddfd4f3f51ad560f4dbe6be9fcf46c789\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-76f44846d5-pnf2g" Sep 12 18:56:30.807362 kubelet[3285]: E0912 18:56:30.807346 3285 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e9c7e85fad1c59116670c98eec91811ddfd4f3f51ad560f4dbe6be9fcf46c789\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-76f44846d5-pnf2g" Sep 12 18:56:30.807428 kubelet[3285]: E0912 18:56:30.807363 3285 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-76f44846d5-pnf2g_calico-system(e52696bd-4cda-4312-8869-f48d61792f6c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-76f44846d5-pnf2g_calico-system(e52696bd-4cda-4312-8869-f48d61792f6c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e9c7e85fad1c59116670c98eec91811ddfd4f3f51ad560f4dbe6be9fcf46c789\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-76f44846d5-pnf2g" podUID="e52696bd-4cda-4312-8869-f48d61792f6c" Sep 12 18:56:30.810486 containerd[1921]: time="2025-09-12T18:56:30.810446745Z" level=error msg="Failed to destroy network for sandbox \"b5245c2938d63d804680f7908f0e8d7e203d88bdd59eef29da8e59a1eb5b5417\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 18:56:30.811114 containerd[1921]: time="2025-09-12T18:56:30.811074032Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:goldmane-54d579b49d-9jkf4,Uid:450d1971-2f13-49d5-abdc-f1fd517af156,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b5245c2938d63d804680f7908f0e8d7e203d88bdd59eef29da8e59a1eb5b5417\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 18:56:30.811215 kubelet[3285]: E0912 18:56:30.811170 3285 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b5245c2938d63d804680f7908f0e8d7e203d88bdd59eef29da8e59a1eb5b5417\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 18:56:30.811215 kubelet[3285]: E0912 18:56:30.811191 3285 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b5245c2938d63d804680f7908f0e8d7e203d88bdd59eef29da8e59a1eb5b5417\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-9jkf4" Sep 12 18:56:30.811215 kubelet[3285]: E0912 18:56:30.811203 3285 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b5245c2938d63d804680f7908f0e8d7e203d88bdd59eef29da8e59a1eb5b5417\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-9jkf4" Sep 12 18:56:30.811280 kubelet[3285]: E0912 18:56:30.811222 3285 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-9jkf4_calico-system(450d1971-2f13-49d5-abdc-f1fd517af156)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-9jkf4_calico-system(450d1971-2f13-49d5-abdc-f1fd517af156)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b5245c2938d63d804680f7908f0e8d7e203d88bdd59eef29da8e59a1eb5b5417\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-9jkf4" podUID="450d1971-2f13-49d5-abdc-f1fd517af156" Sep 12 18:56:31.650707 containerd[1921]: time="2025-09-12T18:56:31.650623947Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 12 18:56:31.768078 systemd[1]: run-netns-cni\x2d78adc93b\x2d4fd2\x2d40c0\x2d3051\x2d0d2658f7899e.mount: Deactivated successfully. Sep 12 18:56:31.768132 systemd[1]: run-netns-cni\x2d62314f02\x2d6527\x2d3056\x2d686f\x2d30e428e7867f.mount: Deactivated successfully. Sep 12 18:56:31.768166 systemd[1]: run-netns-cni\x2d331f7f68\x2da2d7\x2deaf2\x2d1704\x2da54d19f1ae2d.mount: Deactivated successfully. Sep 12 18:56:37.133504 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1934477298.mount: Deactivated successfully. 
Sep 12 18:56:37.151284 containerd[1921]: time="2025-09-12T18:56:37.151232678Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:37.151482 containerd[1921]: time="2025-09-12T18:56:37.151349653Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 12 18:56:37.151736 containerd[1921]: time="2025-09-12T18:56:37.151724212Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:37.152503 containerd[1921]: time="2025-09-12T18:56:37.152492460Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:37.153075 containerd[1921]: time="2025-09-12T18:56:37.153036061Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 5.502343258s" Sep 12 18:56:37.153075 containerd[1921]: time="2025-09-12T18:56:37.153049968Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 12 18:56:37.156618 containerd[1921]: time="2025-09-12T18:56:37.156598333Z" level=info msg="CreateContainer within sandbox \"4e8df7852a0401b9e58f31e3d3b489951ede8ad01aeef3c8f0f752d97d817bb2\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 12 18:56:37.161044 containerd[1921]: time="2025-09-12T18:56:37.161003160Z" level=info msg="Container ae7b61648fb5a43692ef7067d17b1db1cbb2d2e67841eb8a64a732fc9db4061a: CDI devices from CRI Config.CDIDevices: []" Sep 12 18:56:37.164814 containerd[1921]: time="2025-09-12T18:56:37.164801293Z" level=info msg="CreateContainer within sandbox \"4e8df7852a0401b9e58f31e3d3b489951ede8ad01aeef3c8f0f752d97d817bb2\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"ae7b61648fb5a43692ef7067d17b1db1cbb2d2e67841eb8a64a732fc9db4061a\"" Sep 12 18:56:37.165065 containerd[1921]: time="2025-09-12T18:56:37.165019689Z" level=info msg="StartContainer for \"ae7b61648fb5a43692ef7067d17b1db1cbb2d2e67841eb8a64a732fc9db4061a\"" Sep 12 18:56:37.165808 containerd[1921]: time="2025-09-12T18:56:37.165786117Z" level=info msg="connecting to shim ae7b61648fb5a43692ef7067d17b1db1cbb2d2e67841eb8a64a732fc9db4061a" address="unix:///run/containerd/s/7ab0899356d612a35bf4c91bdeec8be29bf5e5f080ebe9778f348bc0a23de635" protocol=ttrpc version=3 Sep 12 18:56:37.179048 systemd[1]: Started cri-containerd-ae7b61648fb5a43692ef7067d17b1db1cbb2d2e67841eb8a64a732fc9db4061a.scope - libcontainer container ae7b61648fb5a43692ef7067d17b1db1cbb2d2e67841eb8a64a732fc9db4061a. Sep 12 18:56:37.219615 containerd[1921]: time="2025-09-12T18:56:37.219551214Z" level=info msg="StartContainer for \"ae7b61648fb5a43692ef7067d17b1db1cbb2d2e67841eb8a64a732fc9db4061a\" returns successfully" Sep 12 18:56:37.284407 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 12 18:56:37.284469 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . 
All Rights Reserved. Sep 12 18:56:37.366746 kubelet[3285]: I0912 18:56:37.366717 3285 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e52696bd-4cda-4312-8869-f48d61792f6c-whisker-backend-key-pair\") pod \"e52696bd-4cda-4312-8869-f48d61792f6c\" (UID: \"e52696bd-4cda-4312-8869-f48d61792f6c\") " Sep 12 18:56:37.367081 kubelet[3285]: I0912 18:56:37.366761 3285 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn5r6\" (UniqueName: \"kubernetes.io/projected/e52696bd-4cda-4312-8869-f48d61792f6c-kube-api-access-dn5r6\") pod \"e52696bd-4cda-4312-8869-f48d61792f6c\" (UID: \"e52696bd-4cda-4312-8869-f48d61792f6c\") " Sep 12 18:56:37.367081 kubelet[3285]: I0912 18:56:37.366784 3285 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e52696bd-4cda-4312-8869-f48d61792f6c-whisker-ca-bundle\") pod \"e52696bd-4cda-4312-8869-f48d61792f6c\" (UID: \"e52696bd-4cda-4312-8869-f48d61792f6c\") " Sep 12 18:56:37.367140 kubelet[3285]: I0912 18:56:37.367079 3285 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e52696bd-4cda-4312-8869-f48d61792f6c-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "e52696bd-4cda-4312-8869-f48d61792f6c" (UID: "e52696bd-4cda-4312-8869-f48d61792f6c"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 12 18:56:37.368327 kubelet[3285]: I0912 18:56:37.368284 3285 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e52696bd-4cda-4312-8869-f48d61792f6c-kube-api-access-dn5r6" (OuterVolumeSpecName: "kube-api-access-dn5r6") pod "e52696bd-4cda-4312-8869-f48d61792f6c" (UID: "e52696bd-4cda-4312-8869-f48d61792f6c"). InnerVolumeSpecName "kube-api-access-dn5r6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 12 18:56:37.368467 kubelet[3285]: I0912 18:56:37.368426 3285 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e52696bd-4cda-4312-8869-f48d61792f6c-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "e52696bd-4cda-4312-8869-f48d61792f6c" (UID: "e52696bd-4cda-4312-8869-f48d61792f6c"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 12 18:56:37.468274 kubelet[3285]: I0912 18:56:37.468023 3285 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e52696bd-4cda-4312-8869-f48d61792f6c-whisker-ca-bundle\") on node \"ci-4426.1.0-a-3db2d8461d\" DevicePath \"\"" Sep 12 18:56:37.468274 kubelet[3285]: I0912 18:56:37.468098 3285 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e52696bd-4cda-4312-8869-f48d61792f6c-whisker-backend-key-pair\") on node \"ci-4426.1.0-a-3db2d8461d\" DevicePath \"\"" Sep 12 18:56:37.468274 kubelet[3285]: I0912 18:56:37.468131 3285 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dn5r6\" (UniqueName: \"kubernetes.io/projected/e52696bd-4cda-4312-8869-f48d61792f6c-kube-api-access-dn5r6\") on node \"ci-4426.1.0-a-3db2d8461d\" DevicePath \"\"" Sep 12 18:56:37.553499 systemd[1]: Removed slice kubepods-besteffort-pode52696bd_4cda_4312_8869_f48d61792f6c.slice - libcontainer container kubepods-besteffort-pode52696bd_4cda_4312_8869_f48d61792f6c.slice. Sep 12 18:56:37.704029 kubelet[3285]: I0912 18:56:37.703872 3285 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-574mq" podStartSLOduration=1.458227854 podStartE2EDuration="17.703821423s" podCreationTimestamp="2025-09-12 18:56:20 +0000 UTC" firstStartedPulling="2025-09-12 18:56:20.907783145 +0000 UTC m=+15.457169533" lastFinishedPulling="2025-09-12 18:56:37.153376714 +0000 UTC m=+31.702763102" observedRunningTime="2025-09-12 18:56:37.702763241 +0000 UTC m=+32.252149685" watchObservedRunningTime="2025-09-12 18:56:37.703821423 +0000 UTC m=+32.253207854" Sep 12 18:56:37.774850 systemd[1]: Created slice kubepods-besteffort-pod4d4029de_01c6_4f6e_bf16_3dace831fac5.slice - libcontainer container kubepods-besteffort-pod4d4029de_01c6_4f6e_bf16_3dace831fac5.slice. 
Sep 12 18:56:37.870644 kubelet[3285]: I0912 18:56:37.870466 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/4d4029de-01c6-4f6e-bf16-3dace831fac5-whisker-backend-key-pair\") pod \"whisker-df8ff547b-29mjj\" (UID: \"4d4029de-01c6-4f6e-bf16-3dace831fac5\") " pod="calico-system/whisker-df8ff547b-29mjj" Sep 12 18:56:37.870644 kubelet[3285]: I0912 18:56:37.870639 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4ktn\" (UniqueName: \"kubernetes.io/projected/4d4029de-01c6-4f6e-bf16-3dace831fac5-kube-api-access-b4ktn\") pod \"whisker-df8ff547b-29mjj\" (UID: \"4d4029de-01c6-4f6e-bf16-3dace831fac5\") " pod="calico-system/whisker-df8ff547b-29mjj" Sep 12 18:56:37.871060 kubelet[3285]: I0912 18:56:37.870713 3285 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d4029de-01c6-4f6e-bf16-3dace831fac5-whisker-ca-bundle\") pod \"whisker-df8ff547b-29mjj\" (UID: \"4d4029de-01c6-4f6e-bf16-3dace831fac5\") " pod="calico-system/whisker-df8ff547b-29mjj" Sep 12 18:56:38.078693 containerd[1921]: time="2025-09-12T18:56:38.078452705Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-df8ff547b-29mjj,Uid:4d4029de-01c6-4f6e-bf16-3dace831fac5,Namespace:calico-system,Attempt:0,}" Sep 12 18:56:38.137522 systemd[1]: var-lib-kubelet-pods-e52696bd\x2d4cda\x2d4312\x2d8869\x2df48d61792f6c-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2ddn5r6.mount: Deactivated successfully. Sep 12 18:56:38.137610 systemd[1]: var-lib-kubelet-pods-e52696bd\x2d4cda\x2d4312\x2d8869\x2df48d61792f6c-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
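The escaped unit names in the systemd mount lines (run-netns-cni\x2d..., var-lib-kubelet-pods-...-kubernetes.io\x7eprojected-...) are ordinary paths run through systemd's path escaping: "/" becomes "-", while characters such as "-" and "~" inside a component are hex-escaped to \x2d and \x7e. A small decoder covering just those common escapes (the full systemd-escape grammar handles more cases than this sketch):

package main

import (
	"fmt"
	"strconv"
	"strings"
)

// unescapeUnitPath converts a systemd mount-unit name such as
// "run-netns-cni\x2d78adc93b....mount" back into the path it represents:
// "-" separates path components and "\xNN" encodes a literal byte.
func unescapeUnitPath(unit string) string {
	unit = strings.TrimSuffix(unit, ".mount")
	var b strings.Builder
	for i := 0; i < len(unit); {
		if unit[i] == '\\' && i+3 < len(unit) && unit[i+1] == 'x' {
			if n, err := strconv.ParseUint(unit[i+2:i+4], 16, 8); err == nil {
				b.WriteByte(byte(n))
				i += 4
				continue
			}
		}
		if unit[i] == '-' {
			b.WriteByte('/')
		} else {
			b.WriteByte(unit[i])
		}
		i++
	}
	return "/" + strings.TrimPrefix(b.String(), "/")
}

func main() {
	for _, u := range []string{
		`run-netns-cni\x2d78adc93b\x2d4fd2\x2d40c0\x2d3051\x2d0d2658f7899e.mount`,
		`var-lib-kubelet-pods-e52696bd\x2d4cda\x2d4312\x2d8869\x2df48d61792f6c-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2ddn5r6.mount`,
	} {
		fmt.Println(unescapeUnitPath(u))
	}
}

Run on the two units above this yields /run/netns/cni-78adc93b-... and /var/lib/kubelet/pods/e52696bd-.../volumes/kubernetes.io~projected/kube-api-access-dn5r6, i.e. a pod network namespace and the projected service-account token volume being cleaned up.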
Sep 12 18:56:38.163989 systemd-networkd[1836]: cali04b9641bc8c: Link UP Sep 12 18:56:38.164344 systemd-networkd[1836]: cali04b9641bc8c: Gained carrier Sep 12 18:56:38.176494 containerd[1921]: 2025-09-12 18:56:38.091 [INFO][4733] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 18:56:38.176494 containerd[1921]: 2025-09-12 18:56:38.099 [INFO][4733] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--3db2d8461d-k8s-whisker--df8ff547b--29mjj-eth0 whisker-df8ff547b- calico-system 4d4029de-01c6-4f6e-bf16-3dace831fac5 885 0 2025-09-12 18:56:37 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:df8ff547b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4426.1.0-a-3db2d8461d whisker-df8ff547b-29mjj eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali04b9641bc8c [] [] }} ContainerID="3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729" Namespace="calico-system" Pod="whisker-df8ff547b-29mjj" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-whisker--df8ff547b--29mjj-" Sep 12 18:56:38.176494 containerd[1921]: 2025-09-12 18:56:38.099 [INFO][4733] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729" Namespace="calico-system" Pod="whisker-df8ff547b-29mjj" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-whisker--df8ff547b--29mjj-eth0" Sep 12 18:56:38.176494 containerd[1921]: 2025-09-12 18:56:38.116 [INFO][4752] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729" HandleID="k8s-pod-network.3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729" Workload="ci--4426.1.0--a--3db2d8461d-k8s-whisker--df8ff547b--29mjj-eth0" Sep 12 18:56:38.177120 containerd[1921]: 2025-09-12 18:56:38.116 [INFO][4752] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729" HandleID="k8s-pod-network.3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729" Workload="ci--4426.1.0--a--3db2d8461d-k8s-whisker--df8ff547b--29mjj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001396c0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426.1.0-a-3db2d8461d", "pod":"whisker-df8ff547b-29mjj", "timestamp":"2025-09-12 18:56:38.116654739 +0000 UTC"}, Hostname:"ci-4426.1.0-a-3db2d8461d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 18:56:38.177120 containerd[1921]: 2025-09-12 18:56:38.116 [INFO][4752] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 18:56:38.177120 containerd[1921]: 2025-09-12 18:56:38.116 [INFO][4752] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 18:56:38.177120 containerd[1921]: 2025-09-12 18:56:38.116 [INFO][4752] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-3db2d8461d' Sep 12 18:56:38.177120 containerd[1921]: 2025-09-12 18:56:38.122 [INFO][4752] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:38.177120 containerd[1921]: 2025-09-12 18:56:38.126 [INFO][4752] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:38.177120 containerd[1921]: 2025-09-12 18:56:38.129 [INFO][4752] ipam/ipam.go 511: Trying affinity for 192.168.8.128/26 host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:38.177120 containerd[1921]: 2025-09-12 18:56:38.131 [INFO][4752] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.128/26 host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:38.177120 containerd[1921]: 2025-09-12 18:56:38.133 [INFO][4752] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.128/26 host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:38.177556 containerd[1921]: 2025-09-12 18:56:38.133 [INFO][4752] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.8.128/26 handle="k8s-pod-network.3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:38.177556 containerd[1921]: 2025-09-12 18:56:38.134 [INFO][4752] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729 Sep 12 18:56:38.177556 containerd[1921]: 2025-09-12 18:56:38.137 [INFO][4752] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.8.128/26 handle="k8s-pod-network.3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:38.177556 containerd[1921]: 2025-09-12 18:56:38.140 [INFO][4752] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.8.129/26] block=192.168.8.128/26 handle="k8s-pod-network.3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:38.177556 containerd[1921]: 2025-09-12 18:56:38.140 [INFO][4752] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.129/26] handle="k8s-pod-network.3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:38.177556 containerd[1921]: 2025-09-12 18:56:38.140 [INFO][4752] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
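The IPAM trace above walks Calico's per-node block allocation: the host's affine block 192.168.8.128/26 is loaded and the first free address in it, 192.168.8.129, is claimed for whisker-df8ff547b-29mjj. A /26 holds 64 addresses (192.168.8.128 through 192.168.8.191). A short standard-library sketch of that block arithmetic (illustrative only, not the Calico allocator):

package main

import (
	"fmt"
	"net/netip"
)

func main() {
	block := netip.MustParsePrefix("192.168.8.128/26") // per-node affine block from the IPAM log above
	size := 1 << (32 - block.Bits())                   // 2^(32-26) = 64 addresses

	first := block.Addr()
	last := first
	for i := 0; i < size-1; i++ {
		last = last.Next()
	}
	fmt.Printf("block %s: %d addresses, %s - %s\n", block, size, first, last)

	claimed := netip.MustParseAddr("192.168.8.129") // address assigned to whisker-df8ff547b-29mjj
	fmt.Println("claimed address inside block:", block.Contains(claimed))
}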
Sep 12 18:56:38.177556 containerd[1921]: 2025-09-12 18:56:38.140 [INFO][4752] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.8.129/26] IPv6=[] ContainerID="3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729" HandleID="k8s-pod-network.3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729" Workload="ci--4426.1.0--a--3db2d8461d-k8s-whisker--df8ff547b--29mjj-eth0" Sep 12 18:56:38.177861 containerd[1921]: 2025-09-12 18:56:38.142 [INFO][4733] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729" Namespace="calico-system" Pod="whisker-df8ff547b-29mjj" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-whisker--df8ff547b--29mjj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--3db2d8461d-k8s-whisker--df8ff547b--29mjj-eth0", GenerateName:"whisker-df8ff547b-", Namespace:"calico-system", SelfLink:"", UID:"4d4029de-01c6-4f6e-bf16-3dace831fac5", ResourceVersion:"885", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 18, 56, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"df8ff547b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-3db2d8461d", ContainerID:"", Pod:"whisker-df8ff547b-29mjj", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.8.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali04b9641bc8c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 18:56:38.177861 containerd[1921]: 2025-09-12 18:56:38.142 [INFO][4733] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.129/32] ContainerID="3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729" Namespace="calico-system" Pod="whisker-df8ff547b-29mjj" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-whisker--df8ff547b--29mjj-eth0" Sep 12 18:56:38.177999 containerd[1921]: 2025-09-12 18:56:38.142 [INFO][4733] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali04b9641bc8c ContainerID="3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729" Namespace="calico-system" Pod="whisker-df8ff547b-29mjj" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-whisker--df8ff547b--29mjj-eth0" Sep 12 18:56:38.177999 containerd[1921]: 2025-09-12 18:56:38.164 [INFO][4733] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729" Namespace="calico-system" Pod="whisker-df8ff547b-29mjj" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-whisker--df8ff547b--29mjj-eth0" Sep 12 18:56:38.178092 containerd[1921]: 2025-09-12 18:56:38.164 [INFO][4733] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729" Namespace="calico-system" 
Pod="whisker-df8ff547b-29mjj" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-whisker--df8ff547b--29mjj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--3db2d8461d-k8s-whisker--df8ff547b--29mjj-eth0", GenerateName:"whisker-df8ff547b-", Namespace:"calico-system", SelfLink:"", UID:"4d4029de-01c6-4f6e-bf16-3dace831fac5", ResourceVersion:"885", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 18, 56, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"df8ff547b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-3db2d8461d", ContainerID:"3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729", Pod:"whisker-df8ff547b-29mjj", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.8.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali04b9641bc8c", MAC:"7a:81:e4:f8:28:b6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 18:56:38.178192 containerd[1921]: 2025-09-12 18:56:38.173 [INFO][4733] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729" Namespace="calico-system" Pod="whisker-df8ff547b-29mjj" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-whisker--df8ff547b--29mjj-eth0" Sep 12 18:56:38.186498 containerd[1921]: time="2025-09-12T18:56:38.186475544Z" level=info msg="connecting to shim 3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729" address="unix:///run/containerd/s/2ddc1d6418c44a645490edce1010efd097b3e081e79ea132add4611b9b49b0a1" namespace=k8s.io protocol=ttrpc version=3 Sep 12 18:56:38.210834 systemd[1]: Started cri-containerd-3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729.scope - libcontainer container 3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729. 
Sep 12 18:56:38.231698 kubelet[3285]: I0912 18:56:38.231654 3285 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 18:56:38.246010 containerd[1921]: time="2025-09-12T18:56:38.245987035Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-df8ff547b-29mjj,Uid:4d4029de-01c6-4f6e-bf16-3dace831fac5,Namespace:calico-system,Attempt:0,} returns sandbox id \"3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729\"" Sep 12 18:56:38.246746 containerd[1921]: time="2025-09-12T18:56:38.246734621Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 12 18:56:38.641788 systemd-networkd[1836]: vxlan.calico: Link UP Sep 12 18:56:38.641792 systemd-networkd[1836]: vxlan.calico: Gained carrier Sep 12 18:56:38.717497 containerd[1921]: time="2025-09-12T18:56:38.717445908Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ae7b61648fb5a43692ef7067d17b1db1cbb2d2e67841eb8a64a732fc9db4061a\" id:\"59297b31600ad205df11b09f560049e49d90bdf5fc169124a3444ac3e4d9a1dc\" pid:5064 exit_status:1 exited_at:{seconds:1757703398 nanos:717242249}" Sep 12 18:56:39.543612 kubelet[3285]: I0912 18:56:39.543532 3285 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e52696bd-4cda-4312-8869-f48d61792f6c" path="/var/lib/kubelet/pods/e52696bd-4cda-4312-8869-f48d61792f6c/volumes" Sep 12 18:56:39.557901 systemd-networkd[1836]: cali04b9641bc8c: Gained IPv6LL Sep 12 18:56:39.726118 containerd[1921]: time="2025-09-12T18:56:39.726072285Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ae7b61648fb5a43692ef7067d17b1db1cbb2d2e67841eb8a64a732fc9db4061a\" id:\"7450d3fdb566479aa3d867af73f34d3a07787a867f7054cd06009967d4634d9a\" pid:5136 exit_status:1 exited_at:{seconds:1757703399 nanos:725866465}" Sep 12 18:56:39.941334 containerd[1921]: time="2025-09-12T18:56:39.941251362Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:39.941474 containerd[1921]: time="2025-09-12T18:56:39.941460315Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 12 18:56:39.941837 containerd[1921]: time="2025-09-12T18:56:39.941797044Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:39.942712 containerd[1921]: time="2025-09-12T18:56:39.942673870Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:39.943107 containerd[1921]: time="2025-09-12T18:56:39.943065772Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.696313985s" Sep 12 18:56:39.943107 containerd[1921]: time="2025-09-12T18:56:39.943081295Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 12 18:56:39.944022 containerd[1921]: time="2025-09-12T18:56:39.944012755Z" level=info 
msg="CreateContainer within sandbox \"3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 12 18:56:39.947380 containerd[1921]: time="2025-09-12T18:56:39.947365700Z" level=info msg="Container 83100cbe2e9f25c4dbcc0b8f65076d1a83e138c83ed8784cbe7263062ec65d6b: CDI devices from CRI Config.CDIDevices: []" Sep 12 18:56:39.952009 containerd[1921]: time="2025-09-12T18:56:39.951951291Z" level=info msg="CreateContainer within sandbox \"3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"83100cbe2e9f25c4dbcc0b8f65076d1a83e138c83ed8784cbe7263062ec65d6b\"" Sep 12 18:56:39.952229 containerd[1921]: time="2025-09-12T18:56:39.952173176Z" level=info msg="StartContainer for \"83100cbe2e9f25c4dbcc0b8f65076d1a83e138c83ed8784cbe7263062ec65d6b\"" Sep 12 18:56:39.952708 containerd[1921]: time="2025-09-12T18:56:39.952683133Z" level=info msg="connecting to shim 83100cbe2e9f25c4dbcc0b8f65076d1a83e138c83ed8784cbe7263062ec65d6b" address="unix:///run/containerd/s/2ddc1d6418c44a645490edce1010efd097b3e081e79ea132add4611b9b49b0a1" protocol=ttrpc version=3 Sep 12 18:56:39.973855 systemd[1]: Started cri-containerd-83100cbe2e9f25c4dbcc0b8f65076d1a83e138c83ed8784cbe7263062ec65d6b.scope - libcontainer container 83100cbe2e9f25c4dbcc0b8f65076d1a83e138c83ed8784cbe7263062ec65d6b. Sep 12 18:56:40.001095 containerd[1921]: time="2025-09-12T18:56:40.001051923Z" level=info msg="StartContainer for \"83100cbe2e9f25c4dbcc0b8f65076d1a83e138c83ed8784cbe7263062ec65d6b\" returns successfully" Sep 12 18:56:40.001611 containerd[1921]: time="2025-09-12T18:56:40.001581580Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 12 18:56:40.580925 systemd-networkd[1836]: vxlan.calico: Gained IPv6LL Sep 12 18:56:41.539502 containerd[1921]: time="2025-09-12T18:56:41.539470260Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6cdb495f56-2zsfm,Uid:39d4e67c-14c5-4675-8693-1f65bf0c3499,Namespace:calico-system,Attempt:0,}" Sep 12 18:56:41.596701 systemd-networkd[1836]: calif0e5f21f946: Link UP Sep 12 18:56:41.596920 systemd-networkd[1836]: calif0e5f21f946: Gained carrier Sep 12 18:56:41.605660 containerd[1921]: 2025-09-12 18:56:41.558 [INFO][5211] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--3db2d8461d-k8s-calico--kube--controllers--6cdb495f56--2zsfm-eth0 calico-kube-controllers-6cdb495f56- calico-system 39d4e67c-14c5-4675-8693-1f65bf0c3499 814 0 2025-09-12 18:56:20 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6cdb495f56 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4426.1.0-a-3db2d8461d calico-kube-controllers-6cdb495f56-2zsfm eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calif0e5f21f946 [] [] }} ContainerID="9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642" Namespace="calico-system" Pod="calico-kube-controllers-6cdb495f56-2zsfm" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-calico--kube--controllers--6cdb495f56--2zsfm-" Sep 12 18:56:41.605660 containerd[1921]: 2025-09-12 18:56:41.558 [INFO][5211] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642" Namespace="calico-system" Pod="calico-kube-controllers-6cdb495f56-2zsfm" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-calico--kube--controllers--6cdb495f56--2zsfm-eth0" Sep 12 18:56:41.605660 containerd[1921]: 2025-09-12 18:56:41.573 [INFO][5233] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642" HandleID="k8s-pod-network.9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642" Workload="ci--4426.1.0--a--3db2d8461d-k8s-calico--kube--controllers--6cdb495f56--2zsfm-eth0" Sep 12 18:56:41.605848 containerd[1921]: 2025-09-12 18:56:41.573 [INFO][5233] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642" HandleID="k8s-pod-network.9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642" Workload="ci--4426.1.0--a--3db2d8461d-k8s-calico--kube--controllers--6cdb495f56--2zsfm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00026e3d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426.1.0-a-3db2d8461d", "pod":"calico-kube-controllers-6cdb495f56-2zsfm", "timestamp":"2025-09-12 18:56:41.573146388 +0000 UTC"}, Hostname:"ci-4426.1.0-a-3db2d8461d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 18:56:41.605848 containerd[1921]: 2025-09-12 18:56:41.573 [INFO][5233] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 18:56:41.605848 containerd[1921]: 2025-09-12 18:56:41.573 [INFO][5233] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 18:56:41.605848 containerd[1921]: 2025-09-12 18:56:41.573 [INFO][5233] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-3db2d8461d' Sep 12 18:56:41.605848 containerd[1921]: 2025-09-12 18:56:41.578 [INFO][5233] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:41.605848 containerd[1921]: 2025-09-12 18:56:41.581 [INFO][5233] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:41.605848 containerd[1921]: 2025-09-12 18:56:41.584 [INFO][5233] ipam/ipam.go 511: Trying affinity for 192.168.8.128/26 host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:41.605848 containerd[1921]: 2025-09-12 18:56:41.585 [INFO][5233] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.128/26 host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:41.605848 containerd[1921]: 2025-09-12 18:56:41.586 [INFO][5233] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.128/26 host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:41.606088 containerd[1921]: 2025-09-12 18:56:41.586 [INFO][5233] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.8.128/26 handle="k8s-pod-network.9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:41.606088 containerd[1921]: 2025-09-12 18:56:41.587 [INFO][5233] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642 Sep 12 18:56:41.606088 containerd[1921]: 2025-09-12 18:56:41.590 [INFO][5233] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.8.128/26 handle="k8s-pod-network.9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:41.606088 containerd[1921]: 2025-09-12 18:56:41.593 [INFO][5233] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.8.130/26] block=192.168.8.128/26 handle="k8s-pod-network.9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:41.606088 containerd[1921]: 2025-09-12 18:56:41.593 [INFO][5233] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.130/26] handle="k8s-pod-network.9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:41.606088 containerd[1921]: 2025-09-12 18:56:41.593 [INFO][5233] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 18:56:41.606088 containerd[1921]: 2025-09-12 18:56:41.593 [INFO][5233] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.8.130/26] IPv6=[] ContainerID="9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642" HandleID="k8s-pod-network.9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642" Workload="ci--4426.1.0--a--3db2d8461d-k8s-calico--kube--controllers--6cdb495f56--2zsfm-eth0" Sep 12 18:56:41.606256 containerd[1921]: 2025-09-12 18:56:41.595 [INFO][5211] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642" Namespace="calico-system" Pod="calico-kube-controllers-6cdb495f56-2zsfm" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-calico--kube--controllers--6cdb495f56--2zsfm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--3db2d8461d-k8s-calico--kube--controllers--6cdb495f56--2zsfm-eth0", GenerateName:"calico-kube-controllers-6cdb495f56-", Namespace:"calico-system", SelfLink:"", UID:"39d4e67c-14c5-4675-8693-1f65bf0c3499", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 18, 56, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6cdb495f56", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-3db2d8461d", ContainerID:"", Pod:"calico-kube-controllers-6cdb495f56-2zsfm", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.8.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calif0e5f21f946", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 18:56:41.606322 containerd[1921]: 2025-09-12 18:56:41.595 [INFO][5211] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.130/32] ContainerID="9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642" Namespace="calico-system" Pod="calico-kube-controllers-6cdb495f56-2zsfm" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-calico--kube--controllers--6cdb495f56--2zsfm-eth0" Sep 12 18:56:41.606322 containerd[1921]: 2025-09-12 18:56:41.595 [INFO][5211] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif0e5f21f946 ContainerID="9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642" Namespace="calico-system" Pod="calico-kube-controllers-6cdb495f56-2zsfm" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-calico--kube--controllers--6cdb495f56--2zsfm-eth0" Sep 12 18:56:41.606322 containerd[1921]: 2025-09-12 18:56:41.596 [INFO][5211] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642" Namespace="calico-system" Pod="calico-kube-controllers-6cdb495f56-2zsfm" 
WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-calico--kube--controllers--6cdb495f56--2zsfm-eth0" Sep 12 18:56:41.606400 containerd[1921]: 2025-09-12 18:56:41.597 [INFO][5211] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642" Namespace="calico-system" Pod="calico-kube-controllers-6cdb495f56-2zsfm" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-calico--kube--controllers--6cdb495f56--2zsfm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--3db2d8461d-k8s-calico--kube--controllers--6cdb495f56--2zsfm-eth0", GenerateName:"calico-kube-controllers-6cdb495f56-", Namespace:"calico-system", SelfLink:"", UID:"39d4e67c-14c5-4675-8693-1f65bf0c3499", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 18, 56, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6cdb495f56", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-3db2d8461d", ContainerID:"9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642", Pod:"calico-kube-controllers-6cdb495f56-2zsfm", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.8.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calif0e5f21f946", MAC:"fa:45:fc:12:be:a1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 18:56:41.606466 containerd[1921]: 2025-09-12 18:56:41.603 [INFO][5211] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642" Namespace="calico-system" Pod="calico-kube-controllers-6cdb495f56-2zsfm" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-calico--kube--controllers--6cdb495f56--2zsfm-eth0" Sep 12 18:56:41.629225 containerd[1921]: time="2025-09-12T18:56:41.629201956Z" level=info msg="connecting to shim 9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642" address="unix:///run/containerd/s/d901449d9c95acf29ff8d5ed05c99412efb00e4502e597ecaf3ea80cc80a377e" namespace=k8s.io protocol=ttrpc version=3 Sep 12 18:56:41.655741 systemd[1]: Started cri-containerd-9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642.scope - libcontainer container 9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642. 
Sep 12 18:56:41.688428 containerd[1921]: time="2025-09-12T18:56:41.688408825Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6cdb495f56-2zsfm,Uid:39d4e67c-14c5-4675-8693-1f65bf0c3499,Namespace:calico-system,Attempt:0,} returns sandbox id \"9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642\"" Sep 12 18:56:42.324465 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3043128902.mount: Deactivated successfully. Sep 12 18:56:42.330164 containerd[1921]: time="2025-09-12T18:56:42.330146713Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:42.330405 containerd[1921]: time="2025-09-12T18:56:42.330393599Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 12 18:56:42.330950 containerd[1921]: time="2025-09-12T18:56:42.330925356Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:42.332418 containerd[1921]: time="2025-09-12T18:56:42.332375669Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:42.332840 containerd[1921]: time="2025-09-12T18:56:42.332828956Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 2.331225049s" Sep 12 18:56:42.332878 containerd[1921]: time="2025-09-12T18:56:42.332843072Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 12 18:56:42.333269 containerd[1921]: time="2025-09-12T18:56:42.333259710Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 12 18:56:42.333801 containerd[1921]: time="2025-09-12T18:56:42.333789509Z" level=info msg="CreateContainer within sandbox \"3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 12 18:56:42.336366 containerd[1921]: time="2025-09-12T18:56:42.336354644Z" level=info msg="Container 53a97ae6cae391f69987d2931355e588a850a9916d1e1d5ac33304f7ca3f8ba2: CDI devices from CRI Config.CDIDevices: []" Sep 12 18:56:42.339184 containerd[1921]: time="2025-09-12T18:56:42.339141853Z" level=info msg="CreateContainer within sandbox \"3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"53a97ae6cae391f69987d2931355e588a850a9916d1e1d5ac33304f7ca3f8ba2\"" Sep 12 18:56:42.339345 containerd[1921]: time="2025-09-12T18:56:42.339334920Z" level=info msg="StartContainer for \"53a97ae6cae391f69987d2931355e588a850a9916d1e1d5ac33304f7ca3f8ba2\"" Sep 12 18:56:42.339871 containerd[1921]: time="2025-09-12T18:56:42.339859296Z" level=info msg="connecting to shim 53a97ae6cae391f69987d2931355e588a850a9916d1e1d5ac33304f7ca3f8ba2" 
address="unix:///run/containerd/s/2ddc1d6418c44a645490edce1010efd097b3e081e79ea132add4611b9b49b0a1" protocol=ttrpc version=3 Sep 12 18:56:42.358867 systemd[1]: Started cri-containerd-53a97ae6cae391f69987d2931355e588a850a9916d1e1d5ac33304f7ca3f8ba2.scope - libcontainer container 53a97ae6cae391f69987d2931355e588a850a9916d1e1d5ac33304f7ca3f8ba2. Sep 12 18:56:42.399230 containerd[1921]: time="2025-09-12T18:56:42.399164408Z" level=info msg="StartContainer for \"53a97ae6cae391f69987d2931355e588a850a9916d1e1d5ac33304f7ca3f8ba2\" returns successfully" Sep 12 18:56:42.538287 containerd[1921]: time="2025-09-12T18:56:42.538140802Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-9jkf4,Uid:450d1971-2f13-49d5-abdc-f1fd517af156,Namespace:calico-system,Attempt:0,}" Sep 12 18:56:42.593419 systemd-networkd[1836]: cali7ce29528179: Link UP Sep 12 18:56:42.593582 systemd-networkd[1836]: cali7ce29528179: Gained carrier Sep 12 18:56:42.600440 containerd[1921]: 2025-09-12 18:56:42.557 [INFO][5358] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--3db2d8461d-k8s-goldmane--54d579b49d--9jkf4-eth0 goldmane-54d579b49d- calico-system 450d1971-2f13-49d5-abdc-f1fd517af156 818 0 2025-09-12 18:56:20 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4426.1.0-a-3db2d8461d goldmane-54d579b49d-9jkf4 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali7ce29528179 [] [] }} ContainerID="1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8" Namespace="calico-system" Pod="goldmane-54d579b49d-9jkf4" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-goldmane--54d579b49d--9jkf4-" Sep 12 18:56:42.600440 containerd[1921]: 2025-09-12 18:56:42.558 [INFO][5358] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8" Namespace="calico-system" Pod="goldmane-54d579b49d-9jkf4" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-goldmane--54d579b49d--9jkf4-eth0" Sep 12 18:56:42.600440 containerd[1921]: 2025-09-12 18:56:42.571 [INFO][5379] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8" HandleID="k8s-pod-network.1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8" Workload="ci--4426.1.0--a--3db2d8461d-k8s-goldmane--54d579b49d--9jkf4-eth0" Sep 12 18:56:42.600900 containerd[1921]: 2025-09-12 18:56:42.571 [INFO][5379] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8" HandleID="k8s-pod-network.1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8" Workload="ci--4426.1.0--a--3db2d8461d-k8s-goldmane--54d579b49d--9jkf4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002e7520), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426.1.0-a-3db2d8461d", "pod":"goldmane-54d579b49d-9jkf4", "timestamp":"2025-09-12 18:56:42.571446762 +0000 UTC"}, Hostname:"ci-4426.1.0-a-3db2d8461d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 18:56:42.600900 containerd[1921]: 2025-09-12 
18:56:42.571 [INFO][5379] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 18:56:42.600900 containerd[1921]: 2025-09-12 18:56:42.571 [INFO][5379] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 18:56:42.600900 containerd[1921]: 2025-09-12 18:56:42.571 [INFO][5379] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-3db2d8461d' Sep 12 18:56:42.600900 containerd[1921]: 2025-09-12 18:56:42.576 [INFO][5379] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:42.600900 containerd[1921]: 2025-09-12 18:56:42.579 [INFO][5379] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:42.600900 containerd[1921]: 2025-09-12 18:56:42.582 [INFO][5379] ipam/ipam.go 511: Trying affinity for 192.168.8.128/26 host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:42.600900 containerd[1921]: 2025-09-12 18:56:42.583 [INFO][5379] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.128/26 host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:42.600900 containerd[1921]: 2025-09-12 18:56:42.585 [INFO][5379] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.128/26 host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:42.601323 containerd[1921]: 2025-09-12 18:56:42.585 [INFO][5379] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.8.128/26 handle="k8s-pod-network.1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:42.601323 containerd[1921]: 2025-09-12 18:56:42.586 [INFO][5379] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8 Sep 12 18:56:42.601323 containerd[1921]: 2025-09-12 18:56:42.588 [INFO][5379] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.8.128/26 handle="k8s-pod-network.1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:42.601323 containerd[1921]: 2025-09-12 18:56:42.591 [INFO][5379] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.8.131/26] block=192.168.8.128/26 handle="k8s-pod-network.1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:42.601323 containerd[1921]: 2025-09-12 18:56:42.591 [INFO][5379] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.131/26] handle="k8s-pod-network.1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:42.601323 containerd[1921]: 2025-09-12 18:56:42.591 [INFO][5379] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 18:56:42.601323 containerd[1921]: 2025-09-12 18:56:42.591 [INFO][5379] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.8.131/26] IPv6=[] ContainerID="1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8" HandleID="k8s-pod-network.1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8" Workload="ci--4426.1.0--a--3db2d8461d-k8s-goldmane--54d579b49d--9jkf4-eth0" Sep 12 18:56:42.601575 containerd[1921]: 2025-09-12 18:56:42.592 [INFO][5358] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8" Namespace="calico-system" Pod="goldmane-54d579b49d-9jkf4" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-goldmane--54d579b49d--9jkf4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--3db2d8461d-k8s-goldmane--54d579b49d--9jkf4-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"450d1971-2f13-49d5-abdc-f1fd517af156", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 18, 56, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-3db2d8461d", ContainerID:"", Pod:"goldmane-54d579b49d-9jkf4", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.8.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali7ce29528179", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 18:56:42.601666 containerd[1921]: 2025-09-12 18:56:42.592 [INFO][5358] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.131/32] ContainerID="1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8" Namespace="calico-system" Pod="goldmane-54d579b49d-9jkf4" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-goldmane--54d579b49d--9jkf4-eth0" Sep 12 18:56:42.601666 containerd[1921]: 2025-09-12 18:56:42.592 [INFO][5358] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7ce29528179 ContainerID="1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8" Namespace="calico-system" Pod="goldmane-54d579b49d-9jkf4" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-goldmane--54d579b49d--9jkf4-eth0" Sep 12 18:56:42.601666 containerd[1921]: 2025-09-12 18:56:42.593 [INFO][5358] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8" Namespace="calico-system" Pod="goldmane-54d579b49d-9jkf4" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-goldmane--54d579b49d--9jkf4-eth0" Sep 12 18:56:42.601782 containerd[1921]: 2025-09-12 18:56:42.594 [INFO][5358] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8" 
Namespace="calico-system" Pod="goldmane-54d579b49d-9jkf4" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-goldmane--54d579b49d--9jkf4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--3db2d8461d-k8s-goldmane--54d579b49d--9jkf4-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"450d1971-2f13-49d5-abdc-f1fd517af156", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 18, 56, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-3db2d8461d", ContainerID:"1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8", Pod:"goldmane-54d579b49d-9jkf4", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.8.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali7ce29528179", MAC:"f2:05:08:cd:c4:f5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 18:56:42.601868 containerd[1921]: 2025-09-12 18:56:42.599 [INFO][5358] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8" Namespace="calico-system" Pod="goldmane-54d579b49d-9jkf4" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-goldmane--54d579b49d--9jkf4-eth0" Sep 12 18:56:42.610066 containerd[1921]: time="2025-09-12T18:56:42.610024568Z" level=info msg="connecting to shim 1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8" address="unix:///run/containerd/s/6d7776280056b89b21761fc1b2e96d7ca1aba79ae0096af697911c9fabb01068" namespace=k8s.io protocol=ttrpc version=3 Sep 12 18:56:42.627877 systemd[1]: Started cri-containerd-1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8.scope - libcontainer container 1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8. 
Sep 12 18:56:42.653873 containerd[1921]: time="2025-09-12T18:56:42.653854061Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-9jkf4,Uid:450d1971-2f13-49d5-abdc-f1fd517af156,Namespace:calico-system,Attempt:0,} returns sandbox id \"1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8\"" Sep 12 18:56:42.713770 kubelet[3285]: I0912 18:56:42.713725 3285 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-df8ff547b-29mjj" podStartSLOduration=1.627106333 podStartE2EDuration="5.713709413s" podCreationTimestamp="2025-09-12 18:56:37 +0000 UTC" firstStartedPulling="2025-09-12 18:56:38.246608556 +0000 UTC m=+32.795994944" lastFinishedPulling="2025-09-12 18:56:42.333211635 +0000 UTC m=+36.882598024" observedRunningTime="2025-09-12 18:56:42.713706891 +0000 UTC m=+37.263093287" watchObservedRunningTime="2025-09-12 18:56:42.713709413 +0000 UTC m=+37.263095802" Sep 12 18:56:43.204884 systemd-networkd[1836]: calif0e5f21f946: Gained IPv6LL Sep 12 18:56:43.538770 containerd[1921]: time="2025-09-12T18:56:43.538695486Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56857fb794-2bwf9,Uid:8ad1c2c3-91b8-4133-bd04-a7165ab7b049,Namespace:calico-apiserver,Attempt:0,}" Sep 12 18:56:43.539234 containerd[1921]: time="2025-09-12T18:56:43.539159197Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-drxkk,Uid:ea20e964-361f-4ead-8929-49881fdc393b,Namespace:kube-system,Attempt:0,}" Sep 12 18:56:43.604102 systemd-networkd[1836]: calie767e372761: Link UP Sep 12 18:56:43.604310 systemd-networkd[1836]: calie767e372761: Gained carrier Sep 12 18:56:43.610029 containerd[1921]: 2025-09-12 18:56:43.570 [INFO][5458] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--drxkk-eth0 coredns-668d6bf9bc- kube-system ea20e964-361f-4ead-8929-49881fdc393b 819 0 2025-09-12 18:56:11 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4426.1.0-a-3db2d8461d coredns-668d6bf9bc-drxkk eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calie767e372761 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9" Namespace="kube-system" Pod="coredns-668d6bf9bc-drxkk" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--drxkk-" Sep 12 18:56:43.610029 containerd[1921]: 2025-09-12 18:56:43.570 [INFO][5458] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9" Namespace="kube-system" Pod="coredns-668d6bf9bc-drxkk" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--drxkk-eth0" Sep 12 18:56:43.610029 containerd[1921]: 2025-09-12 18:56:43.584 [INFO][5500] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9" HandleID="k8s-pod-network.b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9" Workload="ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--drxkk-eth0" Sep 12 18:56:43.610304 containerd[1921]: 2025-09-12 18:56:43.584 [INFO][5500] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9" HandleID="k8s-pod-network.b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9" Workload="ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--drxkk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004ea40), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4426.1.0-a-3db2d8461d", "pod":"coredns-668d6bf9bc-drxkk", "timestamp":"2025-09-12 18:56:43.584230112 +0000 UTC"}, Hostname:"ci-4426.1.0-a-3db2d8461d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 18:56:43.610304 containerd[1921]: 2025-09-12 18:56:43.584 [INFO][5500] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 18:56:43.610304 containerd[1921]: 2025-09-12 18:56:43.584 [INFO][5500] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 18:56:43.610304 containerd[1921]: 2025-09-12 18:56:43.584 [INFO][5500] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-3db2d8461d' Sep 12 18:56:43.610304 containerd[1921]: 2025-09-12 18:56:43.588 [INFO][5500] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:43.610304 containerd[1921]: 2025-09-12 18:56:43.591 [INFO][5500] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:43.610304 containerd[1921]: 2025-09-12 18:56:43.593 [INFO][5500] ipam/ipam.go 511: Trying affinity for 192.168.8.128/26 host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:43.610304 containerd[1921]: 2025-09-12 18:56:43.594 [INFO][5500] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.128/26 host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:43.610304 containerd[1921]: 2025-09-12 18:56:43.595 [INFO][5500] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.128/26 host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:43.610449 containerd[1921]: 2025-09-12 18:56:43.595 [INFO][5500] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.8.128/26 handle="k8s-pod-network.b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:43.610449 containerd[1921]: 2025-09-12 18:56:43.596 [INFO][5500] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9 Sep 12 18:56:43.610449 containerd[1921]: 2025-09-12 18:56:43.598 [INFO][5500] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.8.128/26 handle="k8s-pod-network.b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:43.610449 containerd[1921]: 2025-09-12 18:56:43.601 [INFO][5500] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.8.132/26] block=192.168.8.128/26 handle="k8s-pod-network.b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:43.610449 containerd[1921]: 2025-09-12 18:56:43.601 [INFO][5500] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.132/26] handle="k8s-pod-network.b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:43.610449 containerd[1921]: 2025-09-12 18:56:43.602 [INFO][5500] 
ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 18:56:43.610449 containerd[1921]: 2025-09-12 18:56:43.602 [INFO][5500] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.8.132/26] IPv6=[] ContainerID="b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9" HandleID="k8s-pod-network.b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9" Workload="ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--drxkk-eth0" Sep 12 18:56:43.610559 containerd[1921]: 2025-09-12 18:56:43.603 [INFO][5458] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9" Namespace="kube-system" Pod="coredns-668d6bf9bc-drxkk" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--drxkk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--drxkk-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"ea20e964-361f-4ead-8929-49881fdc393b", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 18, 56, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-3db2d8461d", ContainerID:"", Pod:"coredns-668d6bf9bc-drxkk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.8.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie767e372761", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 18:56:43.610559 containerd[1921]: 2025-09-12 18:56:43.603 [INFO][5458] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.132/32] ContainerID="b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9" Namespace="kube-system" Pod="coredns-668d6bf9bc-drxkk" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--drxkk-eth0" Sep 12 18:56:43.610559 containerd[1921]: 2025-09-12 18:56:43.603 [INFO][5458] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie767e372761 ContainerID="b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9" Namespace="kube-system" Pod="coredns-668d6bf9bc-drxkk" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--drxkk-eth0" Sep 12 18:56:43.610559 containerd[1921]: 2025-09-12 18:56:43.604 [INFO][5458] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9" Namespace="kube-system" Pod="coredns-668d6bf9bc-drxkk" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--drxkk-eth0" Sep 12 18:56:43.610559 containerd[1921]: 2025-09-12 18:56:43.604 [INFO][5458] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9" Namespace="kube-system" Pod="coredns-668d6bf9bc-drxkk" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--drxkk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--drxkk-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"ea20e964-361f-4ead-8929-49881fdc393b", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 18, 56, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-3db2d8461d", ContainerID:"b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9", Pod:"coredns-668d6bf9bc-drxkk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.8.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie767e372761", MAC:"52:2a:b8:29:6e:2d", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 18:56:43.610559 containerd[1921]: 2025-09-12 18:56:43.609 [INFO][5458] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9" Namespace="kube-system" Pod="coredns-668d6bf9bc-drxkk" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--drxkk-eth0" Sep 12 18:56:43.618292 containerd[1921]: time="2025-09-12T18:56:43.618267708Z" level=info msg="connecting to shim b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9" address="unix:///run/containerd/s/6d6e8cdda40219dc99ed563680d319ce147ed1b05d0a0b2518764d391adf86b7" namespace=k8s.io protocol=ttrpc version=3 Sep 12 18:56:43.640855 systemd[1]: Started cri-containerd-b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9.scope - libcontainer container b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9. 
Sep 12 18:56:43.670549 containerd[1921]: time="2025-09-12T18:56:43.670527663Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-drxkk,Uid:ea20e964-361f-4ead-8929-49881fdc393b,Namespace:kube-system,Attempt:0,} returns sandbox id \"b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9\"" Sep 12 18:56:43.671649 containerd[1921]: time="2025-09-12T18:56:43.671637061Z" level=info msg="CreateContainer within sandbox \"b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 18:56:43.674816 containerd[1921]: time="2025-09-12T18:56:43.674801782Z" level=info msg="Container e79dcbe849287e521deebc131864b1349157078d762002a16d9e561a6e6b1244: CDI devices from CRI Config.CDIDevices: []" Sep 12 18:56:43.677637 containerd[1921]: time="2025-09-12T18:56:43.677613750Z" level=info msg="CreateContainer within sandbox \"b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e79dcbe849287e521deebc131864b1349157078d762002a16d9e561a6e6b1244\"" Sep 12 18:56:43.677892 containerd[1921]: time="2025-09-12T18:56:43.677877490Z" level=info msg="StartContainer for \"e79dcbe849287e521deebc131864b1349157078d762002a16d9e561a6e6b1244\"" Sep 12 18:56:43.678313 containerd[1921]: time="2025-09-12T18:56:43.678299426Z" level=info msg="connecting to shim e79dcbe849287e521deebc131864b1349157078d762002a16d9e561a6e6b1244" address="unix:///run/containerd/s/6d6e8cdda40219dc99ed563680d319ce147ed1b05d0a0b2518764d391adf86b7" protocol=ttrpc version=3 Sep 12 18:56:43.696729 systemd[1]: Started cri-containerd-e79dcbe849287e521deebc131864b1349157078d762002a16d9e561a6e6b1244.scope - libcontainer container e79dcbe849287e521deebc131864b1349157078d762002a16d9e561a6e6b1244. 
Sep 12 18:56:43.703813 systemd-networkd[1836]: cali1b4b0e36a3c: Link UP Sep 12 18:56:43.704101 systemd-networkd[1836]: cali1b4b0e36a3c: Gained carrier Sep 12 18:56:43.711825 containerd[1921]: time="2025-09-12T18:56:43.711765261Z" level=info msg="StartContainer for \"e79dcbe849287e521deebc131864b1349157078d762002a16d9e561a6e6b1244\" returns successfully" Sep 12 18:56:43.722813 containerd[1921]: 2025-09-12 18:56:43.570 [INFO][5456] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--2bwf9-eth0 calico-apiserver-56857fb794- calico-apiserver 8ad1c2c3-91b8-4133-bd04-a7165ab7b049 820 0 2025-09-12 18:56:18 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:56857fb794 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4426.1.0-a-3db2d8461d calico-apiserver-56857fb794-2bwf9 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali1b4b0e36a3c [] [] }} ContainerID="63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96" Namespace="calico-apiserver" Pod="calico-apiserver-56857fb794-2bwf9" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--2bwf9-" Sep 12 18:56:43.722813 containerd[1921]: 2025-09-12 18:56:43.570 [INFO][5456] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96" Namespace="calico-apiserver" Pod="calico-apiserver-56857fb794-2bwf9" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--2bwf9-eth0" Sep 12 18:56:43.722813 containerd[1921]: 2025-09-12 18:56:43.584 [INFO][5501] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96" HandleID="k8s-pod-network.63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96" Workload="ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--2bwf9-eth0" Sep 12 18:56:43.722813 containerd[1921]: 2025-09-12 18:56:43.584 [INFO][5501] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96" HandleID="k8s-pod-network.63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96" Workload="ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--2bwf9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002e71d0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4426.1.0-a-3db2d8461d", "pod":"calico-apiserver-56857fb794-2bwf9", "timestamp":"2025-09-12 18:56:43.584257949 +0000 UTC"}, Hostname:"ci-4426.1.0-a-3db2d8461d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 18:56:43.722813 containerd[1921]: 2025-09-12 18:56:43.584 [INFO][5501] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 18:56:43.722813 containerd[1921]: 2025-09-12 18:56:43.602 [INFO][5501] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 18:56:43.722813 containerd[1921]: 2025-09-12 18:56:43.602 [INFO][5501] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-3db2d8461d' Sep 12 18:56:43.722813 containerd[1921]: 2025-09-12 18:56:43.689 [INFO][5501] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:43.722813 containerd[1921]: 2025-09-12 18:56:43.691 [INFO][5501] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:43.722813 containerd[1921]: 2025-09-12 18:56:43.694 [INFO][5501] ipam/ipam.go 511: Trying affinity for 192.168.8.128/26 host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:43.722813 containerd[1921]: 2025-09-12 18:56:43.695 [INFO][5501] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.128/26 host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:43.722813 containerd[1921]: 2025-09-12 18:56:43.696 [INFO][5501] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.128/26 host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:43.722813 containerd[1921]: 2025-09-12 18:56:43.696 [INFO][5501] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.8.128/26 handle="k8s-pod-network.63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:43.722813 containerd[1921]: 2025-09-12 18:56:43.697 [INFO][5501] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96 Sep 12 18:56:43.722813 containerd[1921]: 2025-09-12 18:56:43.699 [INFO][5501] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.8.128/26 handle="k8s-pod-network.63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:43.722813 containerd[1921]: 2025-09-12 18:56:43.701 [INFO][5501] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.8.133/26] block=192.168.8.128/26 handle="k8s-pod-network.63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:43.722813 containerd[1921]: 2025-09-12 18:56:43.702 [INFO][5501] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.133/26] handle="k8s-pod-network.63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:43.722813 containerd[1921]: 2025-09-12 18:56:43.702 [INFO][5501] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 18:56:43.722813 containerd[1921]: 2025-09-12 18:56:43.702 [INFO][5501] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.8.133/26] IPv6=[] ContainerID="63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96" HandleID="k8s-pod-network.63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96" Workload="ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--2bwf9-eth0" Sep 12 18:56:43.723232 containerd[1921]: 2025-09-12 18:56:43.702 [INFO][5456] cni-plugin/k8s.go 418: Populated endpoint ContainerID="63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96" Namespace="calico-apiserver" Pod="calico-apiserver-56857fb794-2bwf9" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--2bwf9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--2bwf9-eth0", GenerateName:"calico-apiserver-56857fb794-", Namespace:"calico-apiserver", SelfLink:"", UID:"8ad1c2c3-91b8-4133-bd04-a7165ab7b049", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 18, 56, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"56857fb794", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-3db2d8461d", ContainerID:"", Pod:"calico-apiserver-56857fb794-2bwf9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.8.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1b4b0e36a3c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 18:56:43.723232 containerd[1921]: 2025-09-12 18:56:43.703 [INFO][5456] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.133/32] ContainerID="63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96" Namespace="calico-apiserver" Pod="calico-apiserver-56857fb794-2bwf9" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--2bwf9-eth0" Sep 12 18:56:43.723232 containerd[1921]: 2025-09-12 18:56:43.703 [INFO][5456] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1b4b0e36a3c ContainerID="63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96" Namespace="calico-apiserver" Pod="calico-apiserver-56857fb794-2bwf9" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--2bwf9-eth0" Sep 12 18:56:43.723232 containerd[1921]: 2025-09-12 18:56:43.704 [INFO][5456] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96" Namespace="calico-apiserver" Pod="calico-apiserver-56857fb794-2bwf9" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--2bwf9-eth0" Sep 12 18:56:43.723232 containerd[1921]: 2025-09-12 18:56:43.704 
[INFO][5456] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96" Namespace="calico-apiserver" Pod="calico-apiserver-56857fb794-2bwf9" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--2bwf9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--2bwf9-eth0", GenerateName:"calico-apiserver-56857fb794-", Namespace:"calico-apiserver", SelfLink:"", UID:"8ad1c2c3-91b8-4133-bd04-a7165ab7b049", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 18, 56, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"56857fb794", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-3db2d8461d", ContainerID:"63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96", Pod:"calico-apiserver-56857fb794-2bwf9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.8.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1b4b0e36a3c", MAC:"5e:ed:bc:05:8a:3e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 18:56:43.723232 containerd[1921]: 2025-09-12 18:56:43.721 [INFO][5456] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96" Namespace="calico-apiserver" Pod="calico-apiserver-56857fb794-2bwf9" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--2bwf9-eth0" Sep 12 18:56:43.749039 containerd[1921]: time="2025-09-12T18:56:43.749014099Z" level=info msg="connecting to shim 63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96" address="unix:///run/containerd/s/752601049801819d447dcb2f9e07c40ac0b8ad0dc1ac8b5eb4ca8c2152e70da4" namespace=k8s.io protocol=ttrpc version=3 Sep 12 18:56:43.772065 systemd[1]: Started cri-containerd-63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96.scope - libcontainer container 63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96. 
Sep 12 18:56:43.849630 containerd[1921]: time="2025-09-12T18:56:43.849576391Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56857fb794-2bwf9,Uid:8ad1c2c3-91b8-4133-bd04-a7165ab7b049,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96\"" Sep 12 18:56:44.100708 systemd-networkd[1836]: cali7ce29528179: Gained IPv6LL Sep 12 18:56:44.537459 containerd[1921]: time="2025-09-12T18:56:44.537402756Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-nzcks,Uid:ecd0baa7-57e9-4324-aeff-1245c967addc,Namespace:kube-system,Attempt:0,}" Sep 12 18:56:44.603566 systemd-networkd[1836]: calif6a60cd40cf: Link UP Sep 12 18:56:44.603747 systemd-networkd[1836]: calif6a60cd40cf: Gained carrier Sep 12 18:56:44.611224 containerd[1921]: 2025-09-12 18:56:44.557 [INFO][5692] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--nzcks-eth0 coredns-668d6bf9bc- kube-system ecd0baa7-57e9-4324-aeff-1245c967addc 810 0 2025-09-12 18:56:11 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4426.1.0-a-3db2d8461d coredns-668d6bf9bc-nzcks eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calif6a60cd40cf [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9" Namespace="kube-system" Pod="coredns-668d6bf9bc-nzcks" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--nzcks-" Sep 12 18:56:44.611224 containerd[1921]: 2025-09-12 18:56:44.557 [INFO][5692] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9" Namespace="kube-system" Pod="coredns-668d6bf9bc-nzcks" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--nzcks-eth0" Sep 12 18:56:44.611224 containerd[1921]: 2025-09-12 18:56:44.570 [INFO][5714] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9" HandleID="k8s-pod-network.26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9" Workload="ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--nzcks-eth0" Sep 12 18:56:44.611224 containerd[1921]: 2025-09-12 18:56:44.570 [INFO][5714] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9" HandleID="k8s-pod-network.26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9" Workload="ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--nzcks-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004ff00), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4426.1.0-a-3db2d8461d", "pod":"coredns-668d6bf9bc-nzcks", "timestamp":"2025-09-12 18:56:44.570669143 +0000 UTC"}, Hostname:"ci-4426.1.0-a-3db2d8461d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 18:56:44.611224 containerd[1921]: 2025-09-12 18:56:44.570 [INFO][5714] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 12 18:56:44.611224 containerd[1921]: 2025-09-12 18:56:44.570 [INFO][5714] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 18:56:44.611224 containerd[1921]: 2025-09-12 18:56:44.570 [INFO][5714] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-3db2d8461d' Sep 12 18:56:44.611224 containerd[1921]: 2025-09-12 18:56:44.575 [INFO][5714] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:44.611224 containerd[1921]: 2025-09-12 18:56:44.578 [INFO][5714] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:44.611224 containerd[1921]: 2025-09-12 18:56:44.582 [INFO][5714] ipam/ipam.go 511: Trying affinity for 192.168.8.128/26 host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:44.611224 containerd[1921]: 2025-09-12 18:56:44.583 [INFO][5714] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.128/26 host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:44.611224 containerd[1921]: 2025-09-12 18:56:44.587 [INFO][5714] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.128/26 host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:44.611224 containerd[1921]: 2025-09-12 18:56:44.587 [INFO][5714] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.8.128/26 handle="k8s-pod-network.26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:44.611224 containerd[1921]: 2025-09-12 18:56:44.590 [INFO][5714] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9 Sep 12 18:56:44.611224 containerd[1921]: 2025-09-12 18:56:44.597 [INFO][5714] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.8.128/26 handle="k8s-pod-network.26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:44.611224 containerd[1921]: 2025-09-12 18:56:44.601 [INFO][5714] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.8.134/26] block=192.168.8.128/26 handle="k8s-pod-network.26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:44.611224 containerd[1921]: 2025-09-12 18:56:44.601 [INFO][5714] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.134/26] handle="k8s-pod-network.26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:44.611224 containerd[1921]: 2025-09-12 18:56:44.601 [INFO][5714] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 18:56:44.611224 containerd[1921]: 2025-09-12 18:56:44.601 [INFO][5714] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.8.134/26] IPv6=[] ContainerID="26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9" HandleID="k8s-pod-network.26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9" Workload="ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--nzcks-eth0" Sep 12 18:56:44.611772 containerd[1921]: 2025-09-12 18:56:44.602 [INFO][5692] cni-plugin/k8s.go 418: Populated endpoint ContainerID="26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9" Namespace="kube-system" Pod="coredns-668d6bf9bc-nzcks" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--nzcks-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--nzcks-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"ecd0baa7-57e9-4324-aeff-1245c967addc", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 18, 56, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-3db2d8461d", ContainerID:"", Pod:"coredns-668d6bf9bc-nzcks", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.8.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif6a60cd40cf", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 18:56:44.611772 containerd[1921]: 2025-09-12 18:56:44.602 [INFO][5692] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.134/32] ContainerID="26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9" Namespace="kube-system" Pod="coredns-668d6bf9bc-nzcks" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--nzcks-eth0" Sep 12 18:56:44.611772 containerd[1921]: 2025-09-12 18:56:44.602 [INFO][5692] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif6a60cd40cf ContainerID="26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9" Namespace="kube-system" Pod="coredns-668d6bf9bc-nzcks" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--nzcks-eth0" Sep 12 18:56:44.611772 containerd[1921]: 2025-09-12 18:56:44.603 [INFO][5692] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-nzcks" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--nzcks-eth0" Sep 12 18:56:44.611772 containerd[1921]: 2025-09-12 18:56:44.604 [INFO][5692] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9" Namespace="kube-system" Pod="coredns-668d6bf9bc-nzcks" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--nzcks-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--nzcks-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"ecd0baa7-57e9-4324-aeff-1245c967addc", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 18, 56, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-3db2d8461d", ContainerID:"26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9", Pod:"coredns-668d6bf9bc-nzcks", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.8.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif6a60cd40cf", MAC:"42:90:b7:fb:e8:e9", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 18:56:44.611772 containerd[1921]: 2025-09-12 18:56:44.608 [INFO][5692] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9" Namespace="kube-system" Pod="coredns-668d6bf9bc-nzcks" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-coredns--668d6bf9bc--nzcks-eth0" Sep 12 18:56:44.621355 containerd[1921]: time="2025-09-12T18:56:44.621303000Z" level=info msg="connecting to shim 26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9" address="unix:///run/containerd/s/2c2022d1da5e0a0ef334a3d74b9cd484694db4a65734d3ba13d2284017c038c1" namespace=k8s.io protocol=ttrpc version=3 Sep 12 18:56:44.637756 systemd[1]: Started cri-containerd-26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9.scope - libcontainer container 26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9. 
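The IPAM entries above show the node holding an affinity for the block 192.168.8.128/26 and handing out consecutive addresses from it (192.168.8.133 for the apiserver pod, 192.168.8.134 for coredns). A small standard-library sketch that checks block membership and block capacity; the prefix and addresses are taken from the log, everything else is illustrative.

package main

import (
	"fmt"
	"net/netip"
)

func main() {
	// Affine block and assigned addresses as reported by ipam/ipam.go above.
	block := netip.MustParsePrefix("192.168.8.128/26")
	for _, s := range []string{"192.168.8.133", "192.168.8.134"} {
		addr := netip.MustParseAddr(s)
		fmt.Printf("%s in %s: %v\n", addr, block, block.Contains(addr))
	}
	// A /26 spans 2^(32-26) = 64 addresses, so one affine block covers at
	// most 64 pod IPs on this node before another block has to be claimed.
	fmt.Println("addresses per block:", 1<<(32-block.Bits()))
}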
Sep 12 18:56:44.663876 containerd[1921]: time="2025-09-12T18:56:44.663826114Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-nzcks,Uid:ecd0baa7-57e9-4324-aeff-1245c967addc,Namespace:kube-system,Attempt:0,} returns sandbox id \"26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9\"" Sep 12 18:56:44.664918 containerd[1921]: time="2025-09-12T18:56:44.664902546Z" level=info msg="CreateContainer within sandbox \"26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 18:56:44.668205 containerd[1921]: time="2025-09-12T18:56:44.668162210Z" level=info msg="Container 9e9ad4cfc6cc1b02f330941532a6ec3ae69fd4e052868535834ffebec89206c9: CDI devices from CRI Config.CDIDevices: []" Sep 12 18:56:44.670807 containerd[1921]: time="2025-09-12T18:56:44.670761384Z" level=info msg="CreateContainer within sandbox \"26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"9e9ad4cfc6cc1b02f330941532a6ec3ae69fd4e052868535834ffebec89206c9\"" Sep 12 18:56:44.670984 containerd[1921]: time="2025-09-12T18:56:44.670970190Z" level=info msg="StartContainer for \"9e9ad4cfc6cc1b02f330941532a6ec3ae69fd4e052868535834ffebec89206c9\"" Sep 12 18:56:44.671436 containerd[1921]: time="2025-09-12T18:56:44.671422993Z" level=info msg="connecting to shim 9e9ad4cfc6cc1b02f330941532a6ec3ae69fd4e052868535834ffebec89206c9" address="unix:///run/containerd/s/2c2022d1da5e0a0ef334a3d74b9cd484694db4a65734d3ba13d2284017c038c1" protocol=ttrpc version=3 Sep 12 18:56:44.687824 systemd[1]: Started cri-containerd-9e9ad4cfc6cc1b02f330941532a6ec3ae69fd4e052868535834ffebec89206c9.scope - libcontainer container 9e9ad4cfc6cc1b02f330941532a6ec3ae69fd4e052868535834ffebec89206c9. 
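In the coredns endpoint dumps above the container ports appear as Go hex literals (Port:0x35, Port:0x23c1), while the earlier plugin.go 340 entry lists the same ports in decimal (dns 53, dns-tcp 53, metrics 9153). A trivial conversion check, included only to make the two representations line up:

package main

import "fmt"

func main() {
	// Hex values as printed in the WorkloadEndpointPort dump above.
	ports := map[string]uint16{"dns": 0x35, "dns-tcp": 0x35, "metrics": 0x23c1}
	for name, p := range ports {
		fmt.Printf("%-8s %#x = %d\n", name, p, p) // 0x35 = 53, 0x23c1 = 9153
	}
}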
Sep 12 18:56:44.707193 kubelet[3285]: I0912 18:56:44.707141 3285 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-drxkk" podStartSLOduration=33.707117865 podStartE2EDuration="33.707117865s" podCreationTimestamp="2025-09-12 18:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 18:56:44.706954844 +0000 UTC m=+39.256341232" watchObservedRunningTime="2025-09-12 18:56:44.707117865 +0000 UTC m=+39.256504249" Sep 12 18:56:44.768821 containerd[1921]: time="2025-09-12T18:56:44.768778960Z" level=info msg="StartContainer for \"9e9ad4cfc6cc1b02f330941532a6ec3ae69fd4e052868535834ffebec89206c9\" returns successfully" Sep 12 18:56:44.932729 systemd-networkd[1836]: calie767e372761: Gained IPv6LL Sep 12 18:56:45.130456 containerd[1921]: time="2025-09-12T18:56:45.130400434Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:45.130826 containerd[1921]: time="2025-09-12T18:56:45.130807733Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 12 18:56:45.131670 containerd[1921]: time="2025-09-12T18:56:45.131651160Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:45.149944 containerd[1921]: time="2025-09-12T18:56:45.149884058Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:45.150358 containerd[1921]: time="2025-09-12T18:56:45.150309090Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 2.817032787s" Sep 12 18:56:45.150358 containerd[1921]: time="2025-09-12T18:56:45.150329815Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 12 18:56:45.150897 containerd[1921]: time="2025-09-12T18:56:45.150877925Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 12 18:56:45.153968 containerd[1921]: time="2025-09-12T18:56:45.153925350Z" level=info msg="CreateContainer within sandbox \"9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 12 18:56:45.156542 containerd[1921]: time="2025-09-12T18:56:45.156500570Z" level=info msg="Container 1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0: CDI devices from CRI Config.CDIDevices: []" Sep 12 18:56:45.159164 containerd[1921]: time="2025-09-12T18:56:45.159123989Z" level=info msg="CreateContainer within sandbox \"9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0\"" Sep 12 
18:56:45.159358 containerd[1921]: time="2025-09-12T18:56:45.159323852Z" level=info msg="StartContainer for \"1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0\"" Sep 12 18:56:45.159862 containerd[1921]: time="2025-09-12T18:56:45.159825022Z" level=info msg="connecting to shim 1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0" address="unix:///run/containerd/s/d901449d9c95acf29ff8d5ed05c99412efb00e4502e597ecaf3ea80cc80a377e" protocol=ttrpc version=3 Sep 12 18:56:45.175731 systemd[1]: Started cri-containerd-1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0.scope - libcontainer container 1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0. Sep 12 18:56:45.207845 containerd[1921]: time="2025-09-12T18:56:45.207750859Z" level=info msg="StartContainer for \"1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0\" returns successfully" Sep 12 18:56:45.540784 containerd[1921]: time="2025-09-12T18:56:45.540756131Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mkb45,Uid:19fb1996-8df9-4ece-9557-f77103d3c3c7,Namespace:calico-system,Attempt:0,}" Sep 12 18:56:45.540859 containerd[1921]: time="2025-09-12T18:56:45.540786864Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56857fb794-l2ttg,Uid:b76b76fa-54b7-4652-8227-b8f6b96853be,Namespace:calico-apiserver,Attempt:0,}" Sep 12 18:56:45.573762 systemd-networkd[1836]: cali1b4b0e36a3c: Gained IPv6LL Sep 12 18:56:45.602180 systemd-networkd[1836]: calicd33ef9797b: Link UP Sep 12 18:56:45.602309 systemd-networkd[1836]: calicd33ef9797b: Gained carrier Sep 12 18:56:45.607719 containerd[1921]: 2025-09-12 18:56:45.559 [INFO][5893] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--l2ttg-eth0 calico-apiserver-56857fb794- calico-apiserver b76b76fa-54b7-4652-8227-b8f6b96853be 816 0 2025-09-12 18:56:18 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:56857fb794 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4426.1.0-a-3db2d8461d calico-apiserver-56857fb794-l2ttg eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calicd33ef9797b [] [] }} ContainerID="9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5" Namespace="calico-apiserver" Pod="calico-apiserver-56857fb794-l2ttg" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--l2ttg-" Sep 12 18:56:45.607719 containerd[1921]: 2025-09-12 18:56:45.559 [INFO][5893] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5" Namespace="calico-apiserver" Pod="calico-apiserver-56857fb794-l2ttg" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--l2ttg-eth0" Sep 12 18:56:45.607719 containerd[1921]: 2025-09-12 18:56:45.573 [INFO][5932] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5" HandleID="k8s-pod-network.9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5" Workload="ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--l2ttg-eth0" Sep 12 18:56:45.607719 containerd[1921]: 2025-09-12 18:56:45.574 [INFO][5932] 
ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5" HandleID="k8s-pod-network.9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5" Workload="ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--l2ttg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fa10), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4426.1.0-a-3db2d8461d", "pod":"calico-apiserver-56857fb794-l2ttg", "timestamp":"2025-09-12 18:56:45.573919531 +0000 UTC"}, Hostname:"ci-4426.1.0-a-3db2d8461d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 18:56:45.607719 containerd[1921]: 2025-09-12 18:56:45.574 [INFO][5932] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 18:56:45.607719 containerd[1921]: 2025-09-12 18:56:45.574 [INFO][5932] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 18:56:45.607719 containerd[1921]: 2025-09-12 18:56:45.574 [INFO][5932] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-3db2d8461d' Sep 12 18:56:45.607719 containerd[1921]: 2025-09-12 18:56:45.578 [INFO][5932] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:45.607719 containerd[1921]: 2025-09-12 18:56:45.581 [INFO][5932] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:45.607719 containerd[1921]: 2025-09-12 18:56:45.591 [INFO][5932] ipam/ipam.go 511: Trying affinity for 192.168.8.128/26 host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:45.607719 containerd[1921]: 2025-09-12 18:56:45.592 [INFO][5932] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.128/26 host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:45.607719 containerd[1921]: 2025-09-12 18:56:45.594 [INFO][5932] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.128/26 host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:45.607719 containerd[1921]: 2025-09-12 18:56:45.594 [INFO][5932] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.8.128/26 handle="k8s-pod-network.9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:45.607719 containerd[1921]: 2025-09-12 18:56:45.594 [INFO][5932] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5 Sep 12 18:56:45.607719 containerd[1921]: 2025-09-12 18:56:45.597 [INFO][5932] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.8.128/26 handle="k8s-pod-network.9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:45.607719 containerd[1921]: 2025-09-12 18:56:45.600 [INFO][5932] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.8.135/26] block=192.168.8.128/26 handle="k8s-pod-network.9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:45.607719 containerd[1921]: 2025-09-12 18:56:45.600 [INFO][5932] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.135/26] handle="k8s-pod-network.9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:45.607719 
containerd[1921]: 2025-09-12 18:56:45.600 [INFO][5932] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 18:56:45.607719 containerd[1921]: 2025-09-12 18:56:45.600 [INFO][5932] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.8.135/26] IPv6=[] ContainerID="9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5" HandleID="k8s-pod-network.9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5" Workload="ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--l2ttg-eth0" Sep 12 18:56:45.608306 containerd[1921]: 2025-09-12 18:56:45.601 [INFO][5893] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5" Namespace="calico-apiserver" Pod="calico-apiserver-56857fb794-l2ttg" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--l2ttg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--l2ttg-eth0", GenerateName:"calico-apiserver-56857fb794-", Namespace:"calico-apiserver", SelfLink:"", UID:"b76b76fa-54b7-4652-8227-b8f6b96853be", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 18, 56, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"56857fb794", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-3db2d8461d", ContainerID:"", Pod:"calico-apiserver-56857fb794-l2ttg", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.8.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calicd33ef9797b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 18:56:45.608306 containerd[1921]: 2025-09-12 18:56:45.601 [INFO][5893] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.135/32] ContainerID="9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5" Namespace="calico-apiserver" Pod="calico-apiserver-56857fb794-l2ttg" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--l2ttg-eth0" Sep 12 18:56:45.608306 containerd[1921]: 2025-09-12 18:56:45.601 [INFO][5893] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicd33ef9797b ContainerID="9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5" Namespace="calico-apiserver" Pod="calico-apiserver-56857fb794-l2ttg" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--l2ttg-eth0" Sep 12 18:56:45.608306 containerd[1921]: 2025-09-12 18:56:45.602 [INFO][5893] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5" Namespace="calico-apiserver" Pod="calico-apiserver-56857fb794-l2ttg" 
WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--l2ttg-eth0" Sep 12 18:56:45.608306 containerd[1921]: 2025-09-12 18:56:45.602 [INFO][5893] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5" Namespace="calico-apiserver" Pod="calico-apiserver-56857fb794-l2ttg" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--l2ttg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--l2ttg-eth0", GenerateName:"calico-apiserver-56857fb794-", Namespace:"calico-apiserver", SelfLink:"", UID:"b76b76fa-54b7-4652-8227-b8f6b96853be", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 18, 56, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"56857fb794", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-3db2d8461d", ContainerID:"9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5", Pod:"calico-apiserver-56857fb794-l2ttg", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.8.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calicd33ef9797b", MAC:"d6:18:10:5b:71:8f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 18:56:45.608306 containerd[1921]: 2025-09-12 18:56:45.606 [INFO][5893] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5" Namespace="calico-apiserver" Pod="calico-apiserver-56857fb794-l2ttg" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-calico--apiserver--56857fb794--l2ttg-eth0" Sep 12 18:56:45.616214 containerd[1921]: time="2025-09-12T18:56:45.616188143Z" level=info msg="connecting to shim 9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5" address="unix:///run/containerd/s/edecdc0c537aa9b810754bec6619cdbf9c4b72713936f2d9c1ba6a425714d8e6" namespace=k8s.io protocol=ttrpc version=3 Sep 12 18:56:45.643165 systemd[1]: Started cri-containerd-9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5.scope - libcontainer container 9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5. 
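Each CNI ADD in this log emits the same sequence of dense entries (Populated endpoint, using IPs, veth name, endpoint write). A hedged sketch for pulling a few fields out of one such message with a regular expression; the pattern is an assumption tailored to the "Calico CNI using IPs" fragment quoted from the log, not a general parser for these entries.

package main

import (
	"fmt"
	"regexp"
)

func main() {
	// Fragment copied from the calico-apiserver-56857fb794-l2ttg entry above.
	line := `Calico CNI using IPs: [192.168.8.135/32] ContainerID="9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5" Namespace="calico-apiserver" Pod="calico-apiserver-56857fb794-l2ttg"`

	re := regexp.MustCompile(`Calico CNI using IPs: \[([^\]]+)\] ContainerID="([0-9a-f]+)".*Pod="([^"]+)"`)
	m := re.FindStringSubmatch(line)
	if m == nil {
		fmt.Println("no match")
		return
	}
	fmt.Printf("pod=%s ip=%s container=%s\n", m[3], m[1], m[2][:12])
}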
Sep 12 18:56:45.712973 systemd-networkd[1836]: cali63cd440c9ba: Link UP Sep 12 18:56:45.713204 systemd-networkd[1836]: cali63cd440c9ba: Gained carrier Sep 12 18:56:45.721102 containerd[1921]: 2025-09-12 18:56:45.559 [INFO][5886] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--3db2d8461d-k8s-csi--node--driver--mkb45-eth0 csi-node-driver- calico-system 19fb1996-8df9-4ece-9557-f77103d3c3c7 703 0 2025-09-12 18:56:20 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4426.1.0-a-3db2d8461d csi-node-driver-mkb45 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali63cd440c9ba [] [] }} ContainerID="3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b" Namespace="calico-system" Pod="csi-node-driver-mkb45" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-csi--node--driver--mkb45-" Sep 12 18:56:45.721102 containerd[1921]: 2025-09-12 18:56:45.559 [INFO][5886] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b" Namespace="calico-system" Pod="csi-node-driver-mkb45" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-csi--node--driver--mkb45-eth0" Sep 12 18:56:45.721102 containerd[1921]: 2025-09-12 18:56:45.574 [INFO][5930] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b" HandleID="k8s-pod-network.3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b" Workload="ci--4426.1.0--a--3db2d8461d-k8s-csi--node--driver--mkb45-eth0" Sep 12 18:56:45.721102 containerd[1921]: 2025-09-12 18:56:45.574 [INFO][5930] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b" HandleID="k8s-pod-network.3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b" Workload="ci--4426.1.0--a--3db2d8461d-k8s-csi--node--driver--mkb45-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00011bed0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426.1.0-a-3db2d8461d", "pod":"csi-node-driver-mkb45", "timestamp":"2025-09-12 18:56:45.574450046 +0000 UTC"}, Hostname:"ci-4426.1.0-a-3db2d8461d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 18:56:45.721102 containerd[1921]: 2025-09-12 18:56:45.574 [INFO][5930] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 18:56:45.721102 containerd[1921]: 2025-09-12 18:56:45.600 [INFO][5930] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 18:56:45.721102 containerd[1921]: 2025-09-12 18:56:45.600 [INFO][5930] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-3db2d8461d' Sep 12 18:56:45.721102 containerd[1921]: 2025-09-12 18:56:45.681 [INFO][5930] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:45.721102 containerd[1921]: 2025-09-12 18:56:45.689 [INFO][5930] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:45.721102 containerd[1921]: 2025-09-12 18:56:45.694 [INFO][5930] ipam/ipam.go 511: Trying affinity for 192.168.8.128/26 host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:45.721102 containerd[1921]: 2025-09-12 18:56:45.696 [INFO][5930] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.128/26 host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:45.721102 containerd[1921]: 2025-09-12 18:56:45.699 [INFO][5930] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.128/26 host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:45.721102 containerd[1921]: 2025-09-12 18:56:45.699 [INFO][5930] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.8.128/26 handle="k8s-pod-network.3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:45.721102 containerd[1921]: 2025-09-12 18:56:45.700 [INFO][5930] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b Sep 12 18:56:45.721102 containerd[1921]: 2025-09-12 18:56:45.703 [INFO][5930] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.8.128/26 handle="k8s-pod-network.3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:45.721102 containerd[1921]: 2025-09-12 18:56:45.710 [INFO][5930] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.8.136/26] block=192.168.8.128/26 handle="k8s-pod-network.3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:45.721102 containerd[1921]: 2025-09-12 18:56:45.710 [INFO][5930] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.136/26] handle="k8s-pod-network.3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b" host="ci-4426.1.0-a-3db2d8461d" Sep 12 18:56:45.721102 containerd[1921]: 2025-09-12 18:56:45.710 [INFO][5930] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 18:56:45.721102 containerd[1921]: 2025-09-12 18:56:45.710 [INFO][5930] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.8.136/26] IPv6=[] ContainerID="3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b" HandleID="k8s-pod-network.3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b" Workload="ci--4426.1.0--a--3db2d8461d-k8s-csi--node--driver--mkb45-eth0" Sep 12 18:56:45.721732 containerd[1921]: 2025-09-12 18:56:45.711 [INFO][5886] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b" Namespace="calico-system" Pod="csi-node-driver-mkb45" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-csi--node--driver--mkb45-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--3db2d8461d-k8s-csi--node--driver--mkb45-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"19fb1996-8df9-4ece-9557-f77103d3c3c7", ResourceVersion:"703", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 18, 56, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-3db2d8461d", ContainerID:"", Pod:"csi-node-driver-mkb45", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.8.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali63cd440c9ba", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 18:56:45.721732 containerd[1921]: 2025-09-12 18:56:45.711 [INFO][5886] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.136/32] ContainerID="3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b" Namespace="calico-system" Pod="csi-node-driver-mkb45" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-csi--node--driver--mkb45-eth0" Sep 12 18:56:45.721732 containerd[1921]: 2025-09-12 18:56:45.711 [INFO][5886] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali63cd440c9ba ContainerID="3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b" Namespace="calico-system" Pod="csi-node-driver-mkb45" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-csi--node--driver--mkb45-eth0" Sep 12 18:56:45.721732 containerd[1921]: 2025-09-12 18:56:45.713 [INFO][5886] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b" Namespace="calico-system" Pod="csi-node-driver-mkb45" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-csi--node--driver--mkb45-eth0" Sep 12 18:56:45.721732 containerd[1921]: 2025-09-12 18:56:45.713 [INFO][5886] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b" Namespace="calico-system" Pod="csi-node-driver-mkb45" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-csi--node--driver--mkb45-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--3db2d8461d-k8s-csi--node--driver--mkb45-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"19fb1996-8df9-4ece-9557-f77103d3c3c7", ResourceVersion:"703", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 18, 56, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-3db2d8461d", ContainerID:"3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b", Pod:"csi-node-driver-mkb45", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.8.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali63cd440c9ba", MAC:"2e:83:37:61:8c:7e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 18:56:45.721732 containerd[1921]: 2025-09-12 18:56:45.719 [INFO][5886] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b" Namespace="calico-system" Pod="csi-node-driver-mkb45" WorkloadEndpoint="ci--4426.1.0--a--3db2d8461d-k8s-csi--node--driver--mkb45-eth0" Sep 12 18:56:45.727359 containerd[1921]: time="2025-09-12T18:56:45.727340928Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56857fb794-l2ttg,Uid:b76b76fa-54b7-4652-8227-b8f6b96853be,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5\"" Sep 12 18:56:45.733194 containerd[1921]: time="2025-09-12T18:56:45.733145365Z" level=info msg="connecting to shim 3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b" address="unix:///run/containerd/s/9dc2c370928db2859c2fe2a24742bd8bec702433658c8224ba85f69d978e071b" namespace=k8s.io protocol=ttrpc version=3 Sep 12 18:56:45.755916 systemd[1]: Started cri-containerd-3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b.scope - libcontainer container 3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b. 
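The "Added Mac" entries above record one MAC per endpoint (5e:ed:bc:05:8a:3e, 42:90:b7:fb:e8:e9, d6:18:10:5b:71:8f, 2e:83:37:61:8c:7e); all of them are locally-administered unicast addresses (second-lowest bit of the first octet set, lowest bit clear). The log does not show how the plugin generates them, so the following is only a generic sketch of producing a MAC of that form, not Calico's code path.

package main

import (
	"crypto/rand"
	"fmt"
	"net"
)

func main() {
	mac := make(net.HardwareAddr, 6)
	if _, err := rand.Read(mac); err != nil {
		panic(err)
	}
	// Force the locally-administered bit on and the multicast bit off, the
	// same pattern seen in the endpoint MACs logged above.
	mac[0] = (mac[0] | 0x02) &^ 0x01
	fmt.Println("generated MAC:", mac.String())
}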
Sep 12 18:56:45.767080 containerd[1921]: time="2025-09-12T18:56:45.767055695Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mkb45,Uid:19fb1996-8df9-4ece-9557-f77103d3c3c7,Namespace:calico-system,Attempt:0,} returns sandbox id \"3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b\"" Sep 12 18:56:45.776888 kubelet[3285]: I0912 18:56:45.776854 3285 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-6cdb495f56-2zsfm" podStartSLOduration=22.314943512 podStartE2EDuration="25.776842676s" podCreationTimestamp="2025-09-12 18:56:20 +0000 UTC" firstStartedPulling="2025-09-12 18:56:41.68890106 +0000 UTC m=+36.238287448" lastFinishedPulling="2025-09-12 18:56:45.150800224 +0000 UTC m=+39.700186612" observedRunningTime="2025-09-12 18:56:45.776703282 +0000 UTC m=+40.326089671" watchObservedRunningTime="2025-09-12 18:56:45.776842676 +0000 UTC m=+40.326229062" Sep 12 18:56:46.148846 systemd-networkd[1836]: calif6a60cd40cf: Gained IPv6LL Sep 12 18:56:46.724910 systemd-networkd[1836]: calicd33ef9797b: Gained IPv6LL Sep 12 18:56:46.774898 kubelet[3285]: I0912 18:56:46.774869 3285 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 18:56:46.916762 systemd-networkd[1836]: cali63cd440c9ba: Gained IPv6LL Sep 12 18:56:47.531022 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2139677769.mount: Deactivated successfully. Sep 12 18:56:47.734385 containerd[1921]: time="2025-09-12T18:56:47.734329123Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:47.734610 containerd[1921]: time="2025-09-12T18:56:47.734472883Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 12 18:56:47.734956 containerd[1921]: time="2025-09-12T18:56:47.734908641Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:47.735859 containerd[1921]: time="2025-09-12T18:56:47.735832646Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:47.736614 containerd[1921]: time="2025-09-12T18:56:47.736599609Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 2.585702824s" Sep 12 18:56:47.736651 containerd[1921]: time="2025-09-12T18:56:47.736616632Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 12 18:56:47.737081 containerd[1921]: time="2025-09-12T18:56:47.737072354Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 18:56:47.737575 containerd[1921]: time="2025-09-12T18:56:47.737555763Z" level=info msg="CreateContainer within sandbox \"1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 12 18:56:47.740233 
containerd[1921]: time="2025-09-12T18:56:47.740194572Z" level=info msg="Container 5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0: CDI devices from CRI Config.CDIDevices: []" Sep 12 18:56:47.742884 containerd[1921]: time="2025-09-12T18:56:47.742843694Z" level=info msg="CreateContainer within sandbox \"1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0\"" Sep 12 18:56:47.743108 containerd[1921]: time="2025-09-12T18:56:47.743066230Z" level=info msg="StartContainer for \"5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0\"" Sep 12 18:56:47.743644 containerd[1921]: time="2025-09-12T18:56:47.743632890Z" level=info msg="connecting to shim 5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0" address="unix:///run/containerd/s/6d7776280056b89b21761fc1b2e96d7ca1aba79ae0096af697911c9fabb01068" protocol=ttrpc version=3 Sep 12 18:56:47.766871 systemd[1]: Started cri-containerd-5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0.scope - libcontainer container 5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0. Sep 12 18:56:47.801920 containerd[1921]: time="2025-09-12T18:56:47.801851024Z" level=info msg="StartContainer for \"5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0\" returns successfully" Sep 12 18:56:48.801442 kubelet[3285]: I0912 18:56:48.801401 3285 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-nzcks" podStartSLOduration=37.801388629 podStartE2EDuration="37.801388629s" podCreationTimestamp="2025-09-12 18:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 18:56:45.781470419 +0000 UTC m=+40.330856809" watchObservedRunningTime="2025-09-12 18:56:48.801388629 +0000 UTC m=+43.350775015" Sep 12 18:56:48.850917 containerd[1921]: time="2025-09-12T18:56:48.850864003Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0\" id:\"2cf2da074c55f0694ffb159e8dc0fb2655734e520ebeee625b10327cc217cde2\" pid:6155 exit_status:1 exited_at:{seconds:1757703408 nanos:850621115}" Sep 12 18:56:49.850009 containerd[1921]: time="2025-09-12T18:56:49.849981389Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0\" id:\"bfd99aaf91ea704f54105af0705d1a44b93c6eb293dca902367855f1747b88e6\" pid:6192 exit_status:1 exited_at:{seconds:1757703409 nanos:849284586}" Sep 12 18:56:50.403377 containerd[1921]: time="2025-09-12T18:56:50.403320855Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:50.403603 containerd[1921]: time="2025-09-12T18:56:50.403522056Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 12 18:56:50.403915 containerd[1921]: time="2025-09-12T18:56:50.403872726Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:50.404763 containerd[1921]: time="2025-09-12T18:56:50.404720091Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:50.405380 containerd[1921]: time="2025-09-12T18:56:50.405334315Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 2.668246406s" Sep 12 18:56:50.405380 containerd[1921]: time="2025-09-12T18:56:50.405351300Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 12 18:56:50.405789 containerd[1921]: time="2025-09-12T18:56:50.405776379Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 18:56:50.406257 containerd[1921]: time="2025-09-12T18:56:50.406241072Z" level=info msg="CreateContainer within sandbox \"63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 18:56:50.408972 containerd[1921]: time="2025-09-12T18:56:50.408929568Z" level=info msg="Container f5383962614876ec32b34b3819df0f74656ae9d47e0da2f6935ab0a0cde1101b: CDI devices from CRI Config.CDIDevices: []" Sep 12 18:56:50.411763 containerd[1921]: time="2025-09-12T18:56:50.411719818Z" level=info msg="CreateContainer within sandbox \"63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"f5383962614876ec32b34b3819df0f74656ae9d47e0da2f6935ab0a0cde1101b\"" Sep 12 18:56:50.411959 containerd[1921]: time="2025-09-12T18:56:50.411944196Z" level=info msg="StartContainer for \"f5383962614876ec32b34b3819df0f74656ae9d47e0da2f6935ab0a0cde1101b\"" Sep 12 18:56:50.412565 containerd[1921]: time="2025-09-12T18:56:50.412553729Z" level=info msg="connecting to shim f5383962614876ec32b34b3819df0f74656ae9d47e0da2f6935ab0a0cde1101b" address="unix:///run/containerd/s/752601049801819d447dcb2f9e07c40ac0b8ad0dc1ac8b5eb4ca8c2152e70da4" protocol=ttrpc version=3 Sep 12 18:56:50.438885 systemd[1]: Started cri-containerd-f5383962614876ec32b34b3819df0f74656ae9d47e0da2f6935ab0a0cde1101b.scope - libcontainer container f5383962614876ec32b34b3819df0f74656ae9d47e0da2f6935ab0a0cde1101b. 
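The kubelet pod_startup_latency_tracker entries above report both a podStartSLOduration and a podStartE2EDuration. For the calico-kube-controllers entry the numbers are internally consistent: E2E (25.776842676s) equals watchObservedRunningTime minus podCreationTimestamp, and SLO (22.314943512s) equals E2E minus the window between firstStartedPulling and lastFinishedPulling, so at least for this entry the image-pull time is excluded from the SLO figure. A small check using the timestamps copied from that log line:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout matching the timestamps quoted above (Go's default time.Time formatting).
	const layout = "2006-01-02 15:04:05 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}

	created := parse("2025-09-12 18:56:20 +0000 UTC")
	pullStart := parse("2025-09-12 18:56:41.68890106 +0000 UTC")
	pullEnd := parse("2025-09-12 18:56:45.150800224 +0000 UTC")
	observed := parse("2025-09-12 18:56:45.776842676 +0000 UTC")

	pull := pullEnd.Sub(pullStart)
	e2e := observed.Sub(created)
	fmt.Println("image pull window:  ", pull)     // 3.461899164s
	fmt.Println("podStartE2EDuration:", e2e)      // 25.776842676s
	fmt.Println("podStartSLOduration:", e2e-pull) // 22.314943512s
}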
Sep 12 18:56:50.476792 containerd[1921]: time="2025-09-12T18:56:50.476764481Z" level=info msg="StartContainer for \"f5383962614876ec32b34b3819df0f74656ae9d47e0da2f6935ab0a0cde1101b\" returns successfully" Sep 12 18:56:50.800859 kubelet[3285]: I0912 18:56:50.800813 3285 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-56857fb794-2bwf9" podStartSLOduration=26.245226231 podStartE2EDuration="32.800798228s" podCreationTimestamp="2025-09-12 18:56:18 +0000 UTC" firstStartedPulling="2025-09-12 18:56:43.850144623 +0000 UTC m=+38.399531011" lastFinishedPulling="2025-09-12 18:56:50.405716621 +0000 UTC m=+44.955103008" observedRunningTime="2025-09-12 18:56:50.800597972 +0000 UTC m=+45.349984365" watchObservedRunningTime="2025-09-12 18:56:50.800798228 +0000 UTC m=+45.350184616" Sep 12 18:56:50.801194 kubelet[3285]: I0912 18:56:50.800977 3285 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-9jkf4" podStartSLOduration=25.718376575 podStartE2EDuration="30.800972481s" podCreationTimestamp="2025-09-12 18:56:20 +0000 UTC" firstStartedPulling="2025-09-12 18:56:42.654418762 +0000 UTC m=+37.203805151" lastFinishedPulling="2025-09-12 18:56:47.737014668 +0000 UTC m=+42.286401057" observedRunningTime="2025-09-12 18:56:48.801519014 +0000 UTC m=+43.350905403" watchObservedRunningTime="2025-09-12 18:56:50.800972481 +0000 UTC m=+45.350358867" Sep 12 18:56:50.803014 containerd[1921]: time="2025-09-12T18:56:50.802991666Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:50.803177 containerd[1921]: time="2025-09-12T18:56:50.803142905Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 12 18:56:50.804404 containerd[1921]: time="2025-09-12T18:56:50.804367150Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 398.573895ms" Sep 12 18:56:50.804404 containerd[1921]: time="2025-09-12T18:56:50.804383137Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 12 18:56:50.804838 containerd[1921]: time="2025-09-12T18:56:50.804826441Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 12 18:56:50.805386 containerd[1921]: time="2025-09-12T18:56:50.805371871Z" level=info msg="CreateContainer within sandbox \"9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 18:56:50.808233 containerd[1921]: time="2025-09-12T18:56:50.808188361Z" level=info msg="Container 31a27cba20059c0755297bb61b89e7aa93155fbb8a9e126190a8a01871297c3a: CDI devices from CRI Config.CDIDevices: []" Sep 12 18:56:50.813179 containerd[1921]: time="2025-09-12T18:56:50.813161749Z" level=info msg="CreateContainer within sandbox \"9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"31a27cba20059c0755297bb61b89e7aa93155fbb8a9e126190a8a01871297c3a\"" Sep 12 
18:56:50.813477 containerd[1921]: time="2025-09-12T18:56:50.813462892Z" level=info msg="StartContainer for \"31a27cba20059c0755297bb61b89e7aa93155fbb8a9e126190a8a01871297c3a\"" Sep 12 18:56:50.814009 containerd[1921]: time="2025-09-12T18:56:50.813996865Z" level=info msg="connecting to shim 31a27cba20059c0755297bb61b89e7aa93155fbb8a9e126190a8a01871297c3a" address="unix:///run/containerd/s/edecdc0c537aa9b810754bec6619cdbf9c4b72713936f2d9c1ba6a425714d8e6" protocol=ttrpc version=3 Sep 12 18:56:50.831736 systemd[1]: Started cri-containerd-31a27cba20059c0755297bb61b89e7aa93155fbb8a9e126190a8a01871297c3a.scope - libcontainer container 31a27cba20059c0755297bb61b89e7aa93155fbb8a9e126190a8a01871297c3a. Sep 12 18:56:50.861945 containerd[1921]: time="2025-09-12T18:56:50.861923067Z" level=info msg="StartContainer for \"31a27cba20059c0755297bb61b89e7aa93155fbb8a9e126190a8a01871297c3a\" returns successfully" Sep 12 18:56:51.803043 kubelet[3285]: I0912 18:56:51.802992 3285 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-56857fb794-l2ttg" podStartSLOduration=28.726097597 podStartE2EDuration="33.802978226s" podCreationTimestamp="2025-09-12 18:56:18 +0000 UTC" firstStartedPulling="2025-09-12 18:56:45.72788286 +0000 UTC m=+40.277269247" lastFinishedPulling="2025-09-12 18:56:50.804763487 +0000 UTC m=+45.354149876" observedRunningTime="2025-09-12 18:56:51.80275185 +0000 UTC m=+46.352138239" watchObservedRunningTime="2025-09-12 18:56:51.802978226 +0000 UTC m=+46.352364611" Sep 12 18:56:52.375975 containerd[1921]: time="2025-09-12T18:56:52.375922446Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:52.376200 containerd[1921]: time="2025-09-12T18:56:52.376132126Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 12 18:56:52.376541 containerd[1921]: time="2025-09-12T18:56:52.376496306Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:52.377312 containerd[1921]: time="2025-09-12T18:56:52.377267088Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:52.377618 containerd[1921]: time="2025-09-12T18:56:52.377600782Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 1.572759517s" Sep 12 18:56:52.377618 containerd[1921]: time="2025-09-12T18:56:52.377615346Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 12 18:56:52.378547 containerd[1921]: time="2025-09-12T18:56:52.378533171Z" level=info msg="CreateContainer within sandbox \"3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 12 18:56:52.382513 containerd[1921]: time="2025-09-12T18:56:52.382499846Z" level=info msg="Container 
8b60d3164bc4be922eaed4b0437869205d526959c9f074efc877fa6eae9c1c7e: CDI devices from CRI Config.CDIDevices: []" Sep 12 18:56:52.387342 containerd[1921]: time="2025-09-12T18:56:52.387298247Z" level=info msg="CreateContainer within sandbox \"3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"8b60d3164bc4be922eaed4b0437869205d526959c9f074efc877fa6eae9c1c7e\"" Sep 12 18:56:52.387519 containerd[1921]: time="2025-09-12T18:56:52.387507859Z" level=info msg="StartContainer for \"8b60d3164bc4be922eaed4b0437869205d526959c9f074efc877fa6eae9c1c7e\"" Sep 12 18:56:52.388277 containerd[1921]: time="2025-09-12T18:56:52.388233345Z" level=info msg="connecting to shim 8b60d3164bc4be922eaed4b0437869205d526959c9f074efc877fa6eae9c1c7e" address="unix:///run/containerd/s/9dc2c370928db2859c2fe2a24742bd8bec702433658c8224ba85f69d978e071b" protocol=ttrpc version=3 Sep 12 18:56:52.405784 systemd[1]: Started cri-containerd-8b60d3164bc4be922eaed4b0437869205d526959c9f074efc877fa6eae9c1c7e.scope - libcontainer container 8b60d3164bc4be922eaed4b0437869205d526959c9f074efc877fa6eae9c1c7e. Sep 12 18:56:52.425494 containerd[1921]: time="2025-09-12T18:56:52.425472602Z" level=info msg="StartContainer for \"8b60d3164bc4be922eaed4b0437869205d526959c9f074efc877fa6eae9c1c7e\" returns successfully" Sep 12 18:56:52.426055 containerd[1921]: time="2025-09-12T18:56:52.426039015Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 12 18:56:52.802882 kubelet[3285]: I0912 18:56:52.802820 3285 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 18:56:54.227745 containerd[1921]: time="2025-09-12T18:56:54.227689695Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:54.228003 containerd[1921]: time="2025-09-12T18:56:54.227911401Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 12 18:56:54.228296 containerd[1921]: time="2025-09-12T18:56:54.228255072Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:54.229110 containerd[1921]: time="2025-09-12T18:56:54.229070424Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 18:56:54.229473 containerd[1921]: time="2025-09-12T18:56:54.229426296Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 1.803366752s" Sep 12 18:56:54.229473 containerd[1921]: time="2025-09-12T18:56:54.229443601Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 12 18:56:54.230455 containerd[1921]: time="2025-09-12T18:56:54.230442560Z" level=info msg="CreateContainer within sandbox 
\"3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 12 18:56:54.233790 containerd[1921]: time="2025-09-12T18:56:54.233748443Z" level=info msg="Container 3a2c491dca6d5af3b2182f3b4a915dc2329aa7149dfb8011ded8146284a83d14: CDI devices from CRI Config.CDIDevices: []" Sep 12 18:56:54.238776 containerd[1921]: time="2025-09-12T18:56:54.238738240Z" level=info msg="CreateContainer within sandbox \"3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"3a2c491dca6d5af3b2182f3b4a915dc2329aa7149dfb8011ded8146284a83d14\"" Sep 12 18:56:54.239128 containerd[1921]: time="2025-09-12T18:56:54.239043133Z" level=info msg="StartContainer for \"3a2c491dca6d5af3b2182f3b4a915dc2329aa7149dfb8011ded8146284a83d14\"" Sep 12 18:56:54.240168 containerd[1921]: time="2025-09-12T18:56:54.240155234Z" level=info msg="connecting to shim 3a2c491dca6d5af3b2182f3b4a915dc2329aa7149dfb8011ded8146284a83d14" address="unix:///run/containerd/s/9dc2c370928db2859c2fe2a24742bd8bec702433658c8224ba85f69d978e071b" protocol=ttrpc version=3 Sep 12 18:56:54.263737 systemd[1]: Started cri-containerd-3a2c491dca6d5af3b2182f3b4a915dc2329aa7149dfb8011ded8146284a83d14.scope - libcontainer container 3a2c491dca6d5af3b2182f3b4a915dc2329aa7149dfb8011ded8146284a83d14. Sep 12 18:56:54.284419 containerd[1921]: time="2025-09-12T18:56:54.284395849Z" level=info msg="StartContainer for \"3a2c491dca6d5af3b2182f3b4a915dc2329aa7149dfb8011ded8146284a83d14\" returns successfully" Sep 12 18:56:54.585119 kubelet[3285]: I0912 18:56:54.585067 3285 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 12 18:56:54.585119 kubelet[3285]: I0912 18:56:54.585133 3285 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 12 18:56:54.840451 kubelet[3285]: I0912 18:56:54.840198 3285 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-mkb45" podStartSLOduration=26.377905465 podStartE2EDuration="34.840161082s" podCreationTimestamp="2025-09-12 18:56:20 +0000 UTC" firstStartedPulling="2025-09-12 18:56:45.767547428 +0000 UTC m=+40.316933815" lastFinishedPulling="2025-09-12 18:56:54.229803045 +0000 UTC m=+48.779189432" observedRunningTime="2025-09-12 18:56:54.839488376 +0000 UTC m=+49.388874843" watchObservedRunningTime="2025-09-12 18:56:54.840161082 +0000 UTC m=+49.389547515" Sep 12 18:56:57.317498 kubelet[3285]: I0912 18:56:57.317428 3285 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 18:56:57.407880 containerd[1921]: time="2025-09-12T18:56:57.407836484Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0\" id:\"daa6090edeb524f4c3c05bbf47104b254804d25ec4ffaecf968c8a634e05e729\" pid:6419 exited_at:{seconds:1757703417 nanos:407651770}" Sep 12 18:56:57.440957 containerd[1921]: time="2025-09-12T18:56:57.440934168Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0\" id:\"98acc6633a5a41d9775f77ef90e3a129a03fc9069c36068e404c2b63dc183484\" pid:6440 exited_at:{seconds:1757703417 nanos:440829627}" Sep 12 18:57:08.294092 containerd[1921]: 
time="2025-09-12T18:57:08.294019610Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0\" id:\"fbec3a2b4a7b772bb9f04d14a688e54fcdce2677cf89141476626262adea4f0d\" pid:6474 exited_at:{seconds:1757703428 nanos:293910309}" Sep 12 18:57:09.753228 containerd[1921]: time="2025-09-12T18:57:09.753194762Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ae7b61648fb5a43692ef7067d17b1db1cbb2d2e67841eb8a64a732fc9db4061a\" id:\"989cc77230a91fc90d015b6ecfd110b38554bbd1448ef6eadb53a016afb91598\" pid:6496 exited_at:{seconds:1757703429 nanos:752847700}" Sep 12 18:57:19.910838 containerd[1921]: time="2025-09-12T18:57:19.910805022Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0\" id:\"78385e97dee3264ba5df65da15fe85dffe7d9902d43a659259b7bf2de85e4d4a\" pid:6542 exited_at:{seconds:1757703439 nanos:910621962}" Sep 12 18:57:27.512254 containerd[1921]: time="2025-09-12T18:57:27.512191959Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0\" id:\"f26c50a1d61ded118069cf1832ff8a4416a349fe4b1df415c3cd289c816711aa\" pid:6578 exited_at:{seconds:1757703447 nanos:512024241}" Sep 12 18:57:30.693629 kubelet[3285]: I0912 18:57:30.693526 3285 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 18:57:31.647859 containerd[1921]: time="2025-09-12T18:57:31.647833506Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0\" id:\"2afcfdc20e6f1d41cd8d6cd13b3eb82a6070d733d903b445289e0927dc70817e\" pid:6602 exited_at:{seconds:1757703451 nanos:647647439}" Sep 12 18:57:39.727771 containerd[1921]: time="2025-09-12T18:57:39.727745843Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ae7b61648fb5a43692ef7067d17b1db1cbb2d2e67841eb8a64a732fc9db4061a\" id:\"3d59f47c4331ae9d096e5a0ae84b80a1e02fb1151d1f61c033ce18332de0ce34\" pid:6636 exited_at:{seconds:1757703459 nanos:727559630}" Sep 12 18:57:49.896565 containerd[1921]: time="2025-09-12T18:57:49.896537458Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0\" id:\"0465a15672b1819bd5c0b01a8ea08121a973639e5b764e08a569e0223e9ec9ff\" pid:6673 exited_at:{seconds:1757703469 nanos:896338126}" Sep 12 18:57:57.515230 containerd[1921]: time="2025-09-12T18:57:57.515198506Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0\" id:\"756d13c2660ad786792bd51edeefe1a1689f27b19e4c1125c0ef732d13d8d827\" pid:6703 exited_at:{seconds:1757703477 nanos:515039683}" Sep 12 18:58:08.292593 containerd[1921]: time="2025-09-12T18:58:08.292568432Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0\" id:\"c9e9dc2529abb2e3ed807b3b5a8c84fde4ce7de920b0f6f13de280de7318ebe1\" pid:6736 exited_at:{seconds:1757703488 nanos:292469199}" Sep 12 18:58:09.794454 containerd[1921]: time="2025-09-12T18:58:09.794423428Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ae7b61648fb5a43692ef7067d17b1db1cbb2d2e67841eb8a64a732fc9db4061a\" id:\"7b64b6353b4f3a71ceb11979f215adb8e9be17da97d61eab74e8fd8c06201d58\" pid:6777 exited_at:{seconds:1757703489 nanos:794145384}" Sep 12 18:58:19.915157 
containerd[1921]: time="2025-09-12T18:58:19.915132693Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0\" id:\"0dde738bb3fde1844a159a33f26051af64dffe594b32f05533ebddf9f0e24608\" pid:6822 exited_at:{seconds:1757703499 nanos:914974705}" Sep 12 18:58:27.457495 containerd[1921]: time="2025-09-12T18:58:27.457466674Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0\" id:\"9c90242f71dafd9e13e16bae4976a71dd3a53da84997b69a7d1db636834c45e2\" pid:6857 exited_at:{seconds:1757703507 nanos:457309554}" Sep 12 18:58:31.655612 containerd[1921]: time="2025-09-12T18:58:31.655525649Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0\" id:\"e967e48481f9355ac244bb6c2a8429e346ced3f9dc639386ea9ac4f7333a68d3\" pid:6879 exited_at:{seconds:1757703511 nanos:655338306}" Sep 12 18:58:39.731456 containerd[1921]: time="2025-09-12T18:58:39.731429128Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ae7b61648fb5a43692ef7067d17b1db1cbb2d2e67841eb8a64a732fc9db4061a\" id:\"020c95bb21f1ff517f75e72435b712696d81eacd0b0b347404b2f3b2ffa0ad8e\" pid:6913 exited_at:{seconds:1757703519 nanos:731200439}" Sep 12 18:58:49.862715 containerd[1921]: time="2025-09-12T18:58:49.862688455Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0\" id:\"92bc991e743d31197f893fe7629830967f9e410e13079ed06f77259fca199c45\" pid:6950 exited_at:{seconds:1757703529 nanos:862513977}" Sep 12 18:58:57.500359 containerd[1921]: time="2025-09-12T18:58:57.500327571Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0\" id:\"fd3e0e658a728398de0f0e68757e10a7ca9a799f6f55ac7093a6234d90d7cc80\" pid:6984 exited_at:{seconds:1757703537 nanos:500128356}" Sep 12 18:59:08.328080 containerd[1921]: time="2025-09-12T18:59:08.328051564Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0\" id:\"8211d4fd2d5d68b6edb2f97d76e144e325bd60b11561e85c3467ea1165b0740f\" pid:7008 exited_at:{seconds:1757703548 nanos:327864352}" Sep 12 18:59:09.738006 containerd[1921]: time="2025-09-12T18:59:09.737977086Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ae7b61648fb5a43692ef7067d17b1db1cbb2d2e67841eb8a64a732fc9db4061a\" id:\"5fe8f12b887ab43f470526897c9fb9a330545ca4021e65560a27756716516d6e\" pid:7030 exited_at:{seconds:1757703549 nanos:737729972}" Sep 12 18:59:19.851449 containerd[1921]: time="2025-09-12T18:59:19.851415753Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0\" id:\"12e4c8d484bd94810a1a043fe5738f180eca260bdf2a53ecad1e679a0bd30d3b\" pid:7074 exited_at:{seconds:1757703559 nanos:851215038}" Sep 12 18:59:27.495738 containerd[1921]: time="2025-09-12T18:59:27.495702582Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0\" id:\"f7e88b955eb0466057fb1b6788ab5be27db0ee699cc658adf542c4fae2b7fd99\" pid:7109 exited_at:{seconds:1757703567 nanos:495506273}" Sep 12 18:59:31.662341 containerd[1921]: time="2025-09-12T18:59:31.662316654Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0\" id:\"58b1d84503e60f81769b90215f0a32edbf572599c6bb02e9e920d7e61d0fc596\" pid:7132 exited_at:{seconds:1757703571 nanos:662111746}" Sep 12 18:59:39.736382 containerd[1921]: time="2025-09-12T18:59:39.736352034Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ae7b61648fb5a43692ef7067d17b1db1cbb2d2e67841eb8a64a732fc9db4061a\" id:\"b83c0c0ad977438674953509d1a619b4a5c4aca887b7ffcfe771976554e51ce9\" pid:7166 exited_at:{seconds:1757703579 nanos:736131342}" Sep 12 18:59:49.846410 containerd[1921]: time="2025-09-12T18:59:49.846384432Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0\" id:\"3e23c71ffe0f05ec29c21229fb0a7eab35be71527c06b741239bbe8648f11a6c\" pid:7229 exited_at:{seconds:1757703589 nanos:846206683}" Sep 12 18:59:57.507685 containerd[1921]: time="2025-09-12T18:59:57.507626973Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0\" id:\"80905aca18069379be9d7788d14ca85251b7ff3d4d54864395dbcae843bee56f\" pid:7260 exited_at:{seconds:1757703597 nanos:507469568}" Sep 12 19:00:08.292443 containerd[1921]: time="2025-09-12T19:00:08.292390204Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0\" id:\"a4c3990595ccc09fea1d5ed2e9b53f1138c85f7cbcc9595dbfb63668ccb917c0\" pid:7284 exited_at:{seconds:1757703608 nanos:292253546}" Sep 12 19:00:09.734776 containerd[1921]: time="2025-09-12T19:00:09.734717893Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ae7b61648fb5a43692ef7067d17b1db1cbb2d2e67841eb8a64a732fc9db4061a\" id:\"afd9238f12472ad076d77cad6cce131ed629782fed0b0c2b3320e9020813e85f\" pid:7307 exited_at:{seconds:1757703609 nanos:734535647}" Sep 12 19:00:19.899878 containerd[1921]: time="2025-09-12T19:00:19.899820235Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0\" id:\"669eea28b80a336c02d2b08c773fe831e7864b95ee10e26f17c3af8843ca704c\" pid:7344 exited_at:{seconds:1757703619 nanos:899622737}" Sep 12 19:00:27.451354 containerd[1921]: time="2025-09-12T19:00:27.451330415Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0\" id:\"573de3a55cdb239bcc2dcb45c372192ba0c80861201669738f63f7a44d17e42e\" pid:7375 exited_at:{seconds:1757703627 nanos:451239500}" Sep 12 19:00:31.677519 containerd[1921]: time="2025-09-12T19:00:31.677445703Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0\" id:\"2fb7b0cf1e171bf2e8f51ae33e48af38ac0070dcc731b55ef42538621cf587f6\" pid:7397 exited_at:{seconds:1757703631 nanos:677269187}" Sep 12 19:00:39.736062 containerd[1921]: time="2025-09-12T19:00:39.736027837Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ae7b61648fb5a43692ef7067d17b1db1cbb2d2e67841eb8a64a732fc9db4061a\" id:\"61388ae6d5c0bde5ca774007ff06e3ea17b6f65a7d33e83bc49c546a11c621f6\" pid:7430 exited_at:{seconds:1757703639 nanos:735804521}" Sep 12 19:00:49.895544 containerd[1921]: time="2025-09-12T19:00:49.895517703Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0\" 
id:\"2c4fc1bf0334d6dd2ef2326b3bf878888c1dc9be67a57286e75c4195565e8a42\" pid:7467 exited_at:{seconds:1757703649 nanos:895346655}" Sep 12 19:00:57.459943 containerd[1921]: time="2025-09-12T19:00:57.459913416Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0\" id:\"057a04df3604489dca742f7b875f9c558a5ccdb7100fbbe83bb20e7e67a2e046\" pid:7504 exited_at:{seconds:1757703657 nanos:459738233}" Sep 12 19:01:01.732647 containerd[1921]: time="2025-09-12T19:01:01.732455536Z" level=warning msg="container event discarded" container=ceae479cf33779232ac6811e69d4eb4d894070cae0f4d4c0192a059115413047 type=CONTAINER_CREATED_EVENT Sep 12 19:01:01.732647 containerd[1921]: time="2025-09-12T19:01:01.732577063Z" level=warning msg="container event discarded" container=ceae479cf33779232ac6811e69d4eb4d894070cae0f4d4c0192a059115413047 type=CONTAINER_STARTED_EVENT Sep 12 19:01:01.748855 containerd[1921]: time="2025-09-12T19:01:01.748711804Z" level=warning msg="container event discarded" container=0508e8d8339b6f76996db3137578420a7e238172c818a4841839e3eac1315f23 type=CONTAINER_CREATED_EVENT Sep 12 19:01:01.748855 containerd[1921]: time="2025-09-12T19:01:01.748822182Z" level=warning msg="container event discarded" container=29445818f8aceccbefbaddde72e74376140db46c5e3dd6aade2b462d0f9d898e type=CONTAINER_CREATED_EVENT Sep 12 19:01:01.748855 containerd[1921]: time="2025-09-12T19:01:01.748853049Z" level=warning msg="container event discarded" container=29445818f8aceccbefbaddde72e74376140db46c5e3dd6aade2b462d0f9d898e type=CONTAINER_STARTED_EVENT Sep 12 19:01:01.748855 containerd[1921]: time="2025-09-12T19:01:01.748876750Z" level=warning msg="container event discarded" container=93c6c2ebb60a91cdd328d82a37cfb794a696b5dbe7ea39d9e8e943ac9c3a4d0c type=CONTAINER_CREATED_EVENT Sep 12 19:01:01.767602 containerd[1921]: time="2025-09-12T19:01:01.767409851Z" level=warning msg="container event discarded" container=03f7b83eaf4997eda3b66e31139b238645af354ae1923e39e41fc28803e57fe2 type=CONTAINER_CREATED_EVENT Sep 12 19:01:01.767602 containerd[1921]: time="2025-09-12T19:01:01.767526138Z" level=warning msg="container event discarded" container=03f7b83eaf4997eda3b66e31139b238645af354ae1923e39e41fc28803e57fe2 type=CONTAINER_STARTED_EVENT Sep 12 19:01:01.767602 containerd[1921]: time="2025-09-12T19:01:01.767555483Z" level=warning msg="container event discarded" container=90f27b6d4ceeffd4bb46e66c7cf2adea109c5a61aea5071015da3a0fe06811c8 type=CONTAINER_CREATED_EVENT Sep 12 19:01:01.802015 containerd[1921]: time="2025-09-12T19:01:01.801845149Z" level=warning msg="container event discarded" container=0508e8d8339b6f76996db3137578420a7e238172c818a4841839e3eac1315f23 type=CONTAINER_STARTED_EVENT Sep 12 19:01:01.802015 containerd[1921]: time="2025-09-12T19:01:01.801957716Z" level=warning msg="container event discarded" container=93c6c2ebb60a91cdd328d82a37cfb794a696b5dbe7ea39d9e8e943ac9c3a4d0c type=CONTAINER_STARTED_EVENT Sep 12 19:01:01.802015 containerd[1921]: time="2025-09-12T19:01:01.801991216Z" level=warning msg="container event discarded" container=90f27b6d4ceeffd4bb46e66c7cf2adea109c5a61aea5071015da3a0fe06811c8 type=CONTAINER_STARTED_EVENT Sep 12 19:01:08.290016 containerd[1921]: time="2025-09-12T19:01:08.289983659Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0\" id:\"41f46732b7c5f5217e1a07429cd59fcc2ab5cf5532ec6fa12f779ac0cf11a292\" pid:7528 exited_at:{seconds:1757703668 
nanos:289812587}" Sep 12 19:01:09.779662 containerd[1921]: time="2025-09-12T19:01:09.779600310Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ae7b61648fb5a43692ef7067d17b1db1cbb2d2e67841eb8a64a732fc9db4061a\" id:\"2f29a9b1baa5e9c99b732f59fca2c0af3446a056506bd19c90b5bf1a70eff25f\" pid:7549 exited_at:{seconds:1757703669 nanos:779312966}" Sep 12 19:01:11.809072 containerd[1921]: time="2025-09-12T19:01:11.808860176Z" level=warning msg="container event discarded" container=7fc5a3aa2b635b1f04ddc1d57e5932eb550f1740ecead0a7bf13896007214f61 type=CONTAINER_CREATED_EVENT Sep 12 19:01:11.809072 containerd[1921]: time="2025-09-12T19:01:11.809013617Z" level=warning msg="container event discarded" container=7fc5a3aa2b635b1f04ddc1d57e5932eb550f1740ecead0a7bf13896007214f61 type=CONTAINER_STARTED_EVENT Sep 12 19:01:11.809072 containerd[1921]: time="2025-09-12T19:01:11.809053287Z" level=warning msg="container event discarded" container=01dd60dc3b06ab58c6b3fedd63314ad8e89ac97b961bfab33bfb1340fcfb0b7e type=CONTAINER_CREATED_EVENT Sep 12 19:01:11.824789 containerd[1921]: time="2025-09-12T19:01:11.824653754Z" level=warning msg="container event discarded" container=103bd90771156adcce369946831a04728b719581889e59faf7d164bff9c7b7c4 type=CONTAINER_CREATED_EVENT Sep 12 19:01:11.824789 containerd[1921]: time="2025-09-12T19:01:11.824734762Z" level=warning msg="container event discarded" container=103bd90771156adcce369946831a04728b719581889e59faf7d164bff9c7b7c4 type=CONTAINER_STARTED_EVENT Sep 12 19:01:11.857366 containerd[1921]: time="2025-09-12T19:01:11.857201029Z" level=warning msg="container event discarded" container=01dd60dc3b06ab58c6b3fedd63314ad8e89ac97b961bfab33bfb1340fcfb0b7e type=CONTAINER_STARTED_EVENT Sep 12 19:01:13.548969 containerd[1921]: time="2025-09-12T19:01:13.548838431Z" level=warning msg="container event discarded" container=649ccd2af6188be53b82e09488e2a966209f1daa048ea8bea29c306b26e9b22d type=CONTAINER_CREATED_EVENT Sep 12 19:01:13.581490 containerd[1921]: time="2025-09-12T19:01:13.581314102Z" level=warning msg="container event discarded" container=649ccd2af6188be53b82e09488e2a966209f1daa048ea8bea29c306b26e9b22d type=CONTAINER_STARTED_EVENT Sep 12 19:01:19.887517 containerd[1921]: time="2025-09-12T19:01:19.887491290Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0\" id:\"f01987c8dd6cec5e35216b33a9a344f893e1ba7637b6bbf59d0771aedeea083a\" pid:7613 exited_at:{seconds:1757703679 nanos:887253323}" Sep 12 19:01:20.615872 containerd[1921]: time="2025-09-12T19:01:20.615672313Z" level=warning msg="container event discarded" container=d8ebb2c5869a55487d293aa43ee857f041c5ac173b48fd5136bbc74e6005fc56 type=CONTAINER_CREATED_EVENT Sep 12 19:01:20.615872 containerd[1921]: time="2025-09-12T19:01:20.615802249Z" level=warning msg="container event discarded" container=d8ebb2c5869a55487d293aa43ee857f041c5ac173b48fd5136bbc74e6005fc56 type=CONTAINER_STARTED_EVENT Sep 12 19:01:20.917678 containerd[1921]: time="2025-09-12T19:01:20.917356725Z" level=warning msg="container event discarded" container=4e8df7852a0401b9e58f31e3d3b489951ede8ad01aeef3c8f0f752d97d817bb2 type=CONTAINER_CREATED_EVENT Sep 12 19:01:20.917678 containerd[1921]: time="2025-09-12T19:01:20.917458283Z" level=warning msg="container event discarded" container=4e8df7852a0401b9e58f31e3d3b489951ede8ad01aeef3c8f0f752d97d817bb2 type=CONTAINER_STARTED_EVENT Sep 12 19:01:22.784582 containerd[1921]: time="2025-09-12T19:01:22.784429645Z" level=warning 
msg="container event discarded" container=56afa352140d3244cebeb0eaac6e0b8f4a7e6216524841e3647c2d138bfffa31 type=CONTAINER_CREATED_EVENT Sep 12 19:01:22.829918 containerd[1921]: time="2025-09-12T19:01:22.829761457Z" level=warning msg="container event discarded" container=56afa352140d3244cebeb0eaac6e0b8f4a7e6216524841e3647c2d138bfffa31 type=CONTAINER_STARTED_EVENT Sep 12 19:01:24.659252 containerd[1921]: time="2025-09-12T19:01:24.659086730Z" level=warning msg="container event discarded" container=9c8a082e897bec2ce345370f6acc50855f4752c1824238a20969e4a572000136 type=CONTAINER_CREATED_EVENT Sep 12 19:01:24.717576 containerd[1921]: time="2025-09-12T19:01:24.717436287Z" level=warning msg="container event discarded" container=9c8a082e897bec2ce345370f6acc50855f4752c1824238a20969e4a572000136 type=CONTAINER_STARTED_EVENT Sep 12 19:01:25.663000 containerd[1921]: time="2025-09-12T19:01:25.662892694Z" level=warning msg="container event discarded" container=9c8a082e897bec2ce345370f6acc50855f4752c1824238a20969e4a572000136 type=CONTAINER_STOPPED_EVENT Sep 12 19:01:27.456408 containerd[1921]: time="2025-09-12T19:01:27.456380025Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0\" id:\"b826de674941c8f9494eb0f91d528011b9d44d44aeb6e74dc515bea5911d418b\" pid:7646 exited_at:{seconds:1757703687 nanos:456244371}" Sep 12 19:01:29.776175 containerd[1921]: time="2025-09-12T19:01:29.776035093Z" level=warning msg="container event discarded" container=282c3d9e0794ccaa3a55bbace37cfd65a24cd8d98b59ae08e42d29cd3d74d930 type=CONTAINER_CREATED_EVENT Sep 12 19:01:29.811506 containerd[1921]: time="2025-09-12T19:01:29.811366906Z" level=warning msg="container event discarded" container=282c3d9e0794ccaa3a55bbace37cfd65a24cd8d98b59ae08e42d29cd3d74d930 type=CONTAINER_STARTED_EVENT Sep 12 19:01:30.744206 containerd[1921]: time="2025-09-12T19:01:30.744066283Z" level=warning msg="container event discarded" container=282c3d9e0794ccaa3a55bbace37cfd65a24cd8d98b59ae08e42d29cd3d74d930 type=CONTAINER_STOPPED_EVENT Sep 12 19:01:31.700459 containerd[1921]: time="2025-09-12T19:01:31.700406552Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0\" id:\"108565c765084a8cf7e27bc8c3a38c86eb0f10c2684fa4036178719656e04ef1\" pid:7669 exited_at:{seconds:1757703691 nanos:700153578}" Sep 12 19:01:37.175426 containerd[1921]: time="2025-09-12T19:01:37.175214924Z" level=warning msg="container event discarded" container=ae7b61648fb5a43692ef7067d17b1db1cbb2d2e67841eb8a64a732fc9db4061a type=CONTAINER_CREATED_EVENT Sep 12 19:01:37.229992 containerd[1921]: time="2025-09-12T19:01:37.229835361Z" level=warning msg="container event discarded" container=ae7b61648fb5a43692ef7067d17b1db1cbb2d2e67841eb8a64a732fc9db4061a type=CONTAINER_STARTED_EVENT Sep 12 19:01:38.256562 containerd[1921]: time="2025-09-12T19:01:38.256390379Z" level=warning msg="container event discarded" container=3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729 type=CONTAINER_CREATED_EVENT Sep 12 19:01:38.256562 containerd[1921]: time="2025-09-12T19:01:38.256507603Z" level=warning msg="container event discarded" container=3b5b9e778edb3b22d206cfb27e7de2afa5d747a31bc9ed68911613ef6408f729 type=CONTAINER_STARTED_EVENT Sep 12 19:01:39.736013 containerd[1921]: time="2025-09-12T19:01:39.735957140Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ae7b61648fb5a43692ef7067d17b1db1cbb2d2e67841eb8a64a732fc9db4061a\" 
id:\"d1582fbd683cda8526f2f18530bbce3b709e613db2ddfbf711da8944bd841016\" pid:7700 exited_at:{seconds:1757703699 nanos:735663887}" Sep 12 19:01:39.962410 containerd[1921]: time="2025-09-12T19:01:39.962267555Z" level=warning msg="container event discarded" container=83100cbe2e9f25c4dbcc0b8f65076d1a83e138c83ed8784cbe7263062ec65d6b type=CONTAINER_CREATED_EVENT Sep 12 19:01:40.011909 containerd[1921]: time="2025-09-12T19:01:40.011752522Z" level=warning msg="container event discarded" container=83100cbe2e9f25c4dbcc0b8f65076d1a83e138c83ed8784cbe7263062ec65d6b type=CONTAINER_STARTED_EVENT Sep 12 19:01:41.698962 containerd[1921]: time="2025-09-12T19:01:41.698795698Z" level=warning msg="container event discarded" container=9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642 type=CONTAINER_CREATED_EVENT Sep 12 19:01:41.698962 containerd[1921]: time="2025-09-12T19:01:41.698901566Z" level=warning msg="container event discarded" container=9225191c658f84b9389d53e6c6f9c96c388e711e60035fa3549484e909c98642 type=CONTAINER_STARTED_EVENT Sep 12 19:01:42.349477 containerd[1921]: time="2025-09-12T19:01:42.349334393Z" level=warning msg="container event discarded" container=53a97ae6cae391f69987d2931355e588a850a9916d1e1d5ac33304f7ca3f8ba2 type=CONTAINER_CREATED_EVENT Sep 12 19:01:42.408925 containerd[1921]: time="2025-09-12T19:01:42.408783880Z" level=warning msg="container event discarded" container=53a97ae6cae391f69987d2931355e588a850a9916d1e1d5ac33304f7ca3f8ba2 type=CONTAINER_STARTED_EVENT Sep 12 19:01:42.664781 containerd[1921]: time="2025-09-12T19:01:42.664512032Z" level=warning msg="container event discarded" container=1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8 type=CONTAINER_CREATED_EVENT Sep 12 19:01:42.664781 containerd[1921]: time="2025-09-12T19:01:42.664620294Z" level=warning msg="container event discarded" container=1063d98aa4a7990e49bcc0284a9b2aad7526946f3c1d68f0e5cc1c9c7a8a4ac8 type=CONTAINER_STARTED_EVENT Sep 12 19:01:43.681431 containerd[1921]: time="2025-09-12T19:01:43.681293386Z" level=warning msg="container event discarded" container=b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9 type=CONTAINER_CREATED_EVENT Sep 12 19:01:43.681431 containerd[1921]: time="2025-09-12T19:01:43.681387491Z" level=warning msg="container event discarded" container=b99a02dfa1ea681b275eb4122876541da3cc668cda6f7e8c2021e2270ea7f9b9 type=CONTAINER_STARTED_EVENT Sep 12 19:01:43.681431 containerd[1921]: time="2025-09-12T19:01:43.681416557Z" level=warning msg="container event discarded" container=e79dcbe849287e521deebc131864b1349157078d762002a16d9e561a6e6b1244 type=CONTAINER_CREATED_EVENT Sep 12 19:01:43.721964 containerd[1921]: time="2025-09-12T19:01:43.721825777Z" level=warning msg="container event discarded" container=e79dcbe849287e521deebc131864b1349157078d762002a16d9e561a6e6b1244 type=CONTAINER_STARTED_EVENT Sep 12 19:01:43.860672 containerd[1921]: time="2025-09-12T19:01:43.860511184Z" level=warning msg="container event discarded" container=63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96 type=CONTAINER_CREATED_EVENT Sep 12 19:01:43.860672 containerd[1921]: time="2025-09-12T19:01:43.860631737Z" level=warning msg="container event discarded" container=63de9f42ad1703d2084ae097a72f11e29074ea4046cdbdfe136c990fbbbb5b96 type=CONTAINER_STARTED_EVENT Sep 12 19:01:44.674654 containerd[1921]: time="2025-09-12T19:01:44.674501503Z" level=warning msg="container event discarded" container=26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9 
type=CONTAINER_CREATED_EVENT Sep 12 19:01:44.674654 containerd[1921]: time="2025-09-12T19:01:44.674612727Z" level=warning msg="container event discarded" container=26270dfd2146f5caa0c01dbf2de3098cf8b61a191c17598030b3331b1d4969b9 type=CONTAINER_STARTED_EVENT Sep 12 19:01:44.674654 containerd[1921]: time="2025-09-12T19:01:44.674644672Z" level=warning msg="container event discarded" container=9e9ad4cfc6cc1b02f330941532a6ec3ae69fd4e052868535834ffebec89206c9 type=CONTAINER_CREATED_EVENT Sep 12 19:01:44.778893 containerd[1921]: time="2025-09-12T19:01:44.778729866Z" level=warning msg="container event discarded" container=9e9ad4cfc6cc1b02f330941532a6ec3ae69fd4e052868535834ffebec89206c9 type=CONTAINER_STARTED_EVENT Sep 12 19:01:45.169037 containerd[1921]: time="2025-09-12T19:01:45.168874849Z" level=warning msg="container event discarded" container=1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0 type=CONTAINER_CREATED_EVENT Sep 12 19:01:45.217836 containerd[1921]: time="2025-09-12T19:01:45.217700255Z" level=warning msg="container event discarded" container=1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0 type=CONTAINER_STARTED_EVENT Sep 12 19:01:45.738554 containerd[1921]: time="2025-09-12T19:01:45.738388709Z" level=warning msg="container event discarded" container=9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5 type=CONTAINER_CREATED_EVENT Sep 12 19:01:45.738554 containerd[1921]: time="2025-09-12T19:01:45.738495079Z" level=warning msg="container event discarded" container=9d2482f0e377923c35871bb91a3ccc397b388b95be91691103b4deb88a2723a5 type=CONTAINER_STARTED_EVENT Sep 12 19:01:45.777679 containerd[1921]: time="2025-09-12T19:01:45.777526460Z" level=warning msg="container event discarded" container=3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b type=CONTAINER_CREATED_EVENT Sep 12 19:01:45.777679 containerd[1921]: time="2025-09-12T19:01:45.777619621Z" level=warning msg="container event discarded" container=3b8aa195216045c973298ed62df999ddb1f9e208940116433ff7aaf5e14c548b type=CONTAINER_STARTED_EVENT Sep 12 19:01:47.753010 containerd[1921]: time="2025-09-12T19:01:47.752855707Z" level=warning msg="container event discarded" container=5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0 type=CONTAINER_CREATED_EVENT Sep 12 19:01:47.811865 containerd[1921]: time="2025-09-12T19:01:47.811710508Z" level=warning msg="container event discarded" container=5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0 type=CONTAINER_STARTED_EVENT Sep 12 19:01:49.863334 containerd[1921]: time="2025-09-12T19:01:49.863279558Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0\" id:\"a1e0aee3f51ba1906edd1e5e7dc1178684ba68ece5763b0f88c45499d6231489\" pid:7737 exited_at:{seconds:1757703709 nanos:863047515}" Sep 12 19:01:50.421958 containerd[1921]: time="2025-09-12T19:01:50.421733901Z" level=warning msg="container event discarded" container=f5383962614876ec32b34b3819df0f74656ae9d47e0da2f6935ab0a0cde1101b type=CONTAINER_CREATED_EVENT Sep 12 19:01:50.486462 containerd[1921]: time="2025-09-12T19:01:50.486313035Z" level=warning msg="container event discarded" container=f5383962614876ec32b34b3819df0f74656ae9d47e0da2f6935ab0a0cde1101b type=CONTAINER_STARTED_EVENT Sep 12 19:01:50.823765 containerd[1921]: time="2025-09-12T19:01:50.823632866Z" level=warning msg="container event discarded" container=31a27cba20059c0755297bb61b89e7aa93155fbb8a9e126190a8a01871297c3a 
type=CONTAINER_CREATED_EVENT Sep 12 19:01:50.872201 containerd[1921]: time="2025-09-12T19:01:50.872051857Z" level=warning msg="container event discarded" container=31a27cba20059c0755297bb61b89e7aa93155fbb8a9e126190a8a01871297c3a type=CONTAINER_STARTED_EVENT Sep 12 19:01:52.397411 containerd[1921]: time="2025-09-12T19:01:52.397247606Z" level=warning msg="container event discarded" container=8b60d3164bc4be922eaed4b0437869205d526959c9f074efc877fa6eae9c1c7e type=CONTAINER_CREATED_EVENT Sep 12 19:01:52.435868 containerd[1921]: time="2025-09-12T19:01:52.435725586Z" level=warning msg="container event discarded" container=8b60d3164bc4be922eaed4b0437869205d526959c9f074efc877fa6eae9c1c7e type=CONTAINER_STARTED_EVENT Sep 12 19:01:54.248882 containerd[1921]: time="2025-09-12T19:01:54.248737886Z" level=warning msg="container event discarded" container=3a2c491dca6d5af3b2182f3b4a915dc2329aa7149dfb8011ded8146284a83d14 type=CONTAINER_CREATED_EVENT Sep 12 19:01:54.294210 containerd[1921]: time="2025-09-12T19:01:54.294152258Z" level=warning msg="container event discarded" container=3a2c491dca6d5af3b2182f3b4a915dc2329aa7149dfb8011ded8146284a83d14 type=CONTAINER_STARTED_EVENT Sep 12 19:01:57.465293 containerd[1921]: time="2025-09-12T19:01:57.465256781Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0\" id:\"3d563eb10eff60fe5af15b43c1e416c526676ee4190f7fd488c55dce48a833d9\" pid:7773 exited_at:{seconds:1757703717 nanos:465105651}" Sep 12 19:02:08.289155 containerd[1921]: time="2025-09-12T19:02:08.289125228Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0\" id:\"2f7883e73e12f69570bfce9e9bae6afa6cf143d2ac9f83ed603168bf0c66e9b9\" pid:7805 exited_at:{seconds:1757703728 nanos:288972313}" Sep 12 19:02:09.753580 containerd[1921]: time="2025-09-12T19:02:09.753540717Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ae7b61648fb5a43692ef7067d17b1db1cbb2d2e67841eb8a64a732fc9db4061a\" id:\"38751a6a6d55a717eba1b121ed2809d9d61da5728f11a4dd4524fe792f516e6c\" pid:7827 exited_at:{seconds:1757703729 nanos:753323486}" Sep 12 19:02:19.909837 containerd[1921]: time="2025-09-12T19:02:19.909812007Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0\" id:\"4e8ee13850f3480d17fbe4348f6f313780765467fe00a8cbcf0c319917e7e633\" pid:7864 exited_at:{seconds:1757703739 nanos:909655234}" Sep 12 19:02:27.457618 containerd[1921]: time="2025-09-12T19:02:27.457574787Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0\" id:\"bc2540f95ba3361f8a7d924063c99cacae7937dcf8001b50f0a875381629afb9\" pid:7897 exited_at:{seconds:1757703747 nanos:457420794}" Sep 12 19:02:31.669256 containerd[1921]: time="2025-09-12T19:02:31.669224220Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0\" id:\"0d8936f4d9c597b872742ee6b9edeb89e92d927d7c5c9e515f6b6f6a18d40c53\" pid:7919 exited_at:{seconds:1757703751 nanos:669057158}" Sep 12 19:02:35.098289 systemd[1]: Started sshd@9-139.178.94.145:22-139.178.89.65:38958.service - OpenSSH per-connection server daemon (139.178.89.65:38958). 
Sep 12 19:02:35.218123 sshd[7944]: Accepted publickey for core from 139.178.89.65 port 38958 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 19:02:35.219789 sshd-session[7944]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 19:02:35.226092 systemd-logind[1909]: New session 12 of user core. Sep 12 19:02:35.235117 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 12 19:02:35.393137 sshd[7947]: Connection closed by 139.178.89.65 port 38958 Sep 12 19:02:35.393347 sshd-session[7944]: pam_unix(sshd:session): session closed for user core Sep 12 19:02:35.395506 systemd[1]: sshd@9-139.178.94.145:22-139.178.89.65:38958.service: Deactivated successfully. Sep 12 19:02:35.396734 systemd[1]: session-12.scope: Deactivated successfully. Sep 12 19:02:35.398045 systemd-logind[1909]: Session 12 logged out. Waiting for processes to exit. Sep 12 19:02:35.398700 systemd-logind[1909]: Removed session 12. Sep 12 19:02:39.723853 containerd[1921]: time="2025-09-12T19:02:39.723825314Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ae7b61648fb5a43692ef7067d17b1db1cbb2d2e67841eb8a64a732fc9db4061a\" id:\"e5b649dfc5205c16deee426d72f72eb79e17b3b1fd8e4b9646cc5fdc69126c13\" pid:7986 exited_at:{seconds:1757703759 nanos:723602585}" Sep 12 19:02:40.418463 systemd[1]: Started sshd@10-139.178.94.145:22-139.178.89.65:53424.service - OpenSSH per-connection server daemon (139.178.89.65:53424). Sep 12 19:02:40.493131 sshd[8011]: Accepted publickey for core from 139.178.89.65 port 53424 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 19:02:40.494579 sshd-session[8011]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 19:02:40.499632 systemd-logind[1909]: New session 13 of user core. Sep 12 19:02:40.515803 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 12 19:02:40.647528 sshd[8014]: Connection closed by 139.178.89.65 port 53424 Sep 12 19:02:40.647756 sshd-session[8011]: pam_unix(sshd:session): session closed for user core Sep 12 19:02:40.649659 systemd[1]: sshd@10-139.178.94.145:22-139.178.89.65:53424.service: Deactivated successfully. Sep 12 19:02:40.650734 systemd[1]: session-13.scope: Deactivated successfully. Sep 12 19:02:40.651422 systemd-logind[1909]: Session 13 logged out. Waiting for processes to exit. Sep 12 19:02:40.652172 systemd-logind[1909]: Removed session 13. Sep 12 19:02:45.674462 systemd[1]: Started sshd@11-139.178.94.145:22-139.178.89.65:53434.service - OpenSSH per-connection server daemon (139.178.89.65:53434). Sep 12 19:02:45.791558 sshd[8043]: Accepted publickey for core from 139.178.89.65 port 53434 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 19:02:45.792606 sshd-session[8043]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 19:02:45.796890 systemd-logind[1909]: New session 14 of user core. Sep 12 19:02:45.814874 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 12 19:02:45.959479 sshd[8046]: Connection closed by 139.178.89.65 port 53434 Sep 12 19:02:45.959677 sshd-session[8043]: pam_unix(sshd:session): session closed for user core Sep 12 19:02:45.974289 systemd[1]: sshd@11-139.178.94.145:22-139.178.89.65:53434.service: Deactivated successfully. Sep 12 19:02:45.975396 systemd[1]: session-14.scope: Deactivated successfully. Sep 12 19:02:45.975994 systemd-logind[1909]: Session 14 logged out. Waiting for processes to exit. 
Sep 12 19:02:45.977283 systemd[1]: Started sshd@12-139.178.94.145:22-139.178.89.65:53440.service - OpenSSH per-connection server daemon (139.178.89.65:53440). Sep 12 19:02:45.978166 systemd-logind[1909]: Removed session 14. Sep 12 19:02:46.018991 sshd[8073]: Accepted publickey for core from 139.178.89.65 port 53440 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 19:02:46.019792 sshd-session[8073]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 19:02:46.023370 systemd-logind[1909]: New session 15 of user core. Sep 12 19:02:46.035851 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 12 19:02:46.180506 sshd[8077]: Connection closed by 139.178.89.65 port 53440 Sep 12 19:02:46.180811 sshd-session[8073]: pam_unix(sshd:session): session closed for user core Sep 12 19:02:46.195807 systemd[1]: sshd@12-139.178.94.145:22-139.178.89.65:53440.service: Deactivated successfully. Sep 12 19:02:46.196790 systemd[1]: session-15.scope: Deactivated successfully. Sep 12 19:02:46.197288 systemd-logind[1909]: Session 15 logged out. Waiting for processes to exit. Sep 12 19:02:46.198345 systemd[1]: Started sshd@13-139.178.94.145:22-139.178.89.65:53446.service - OpenSSH per-connection server daemon (139.178.89.65:53446). Sep 12 19:02:46.198976 systemd-logind[1909]: Removed session 15. Sep 12 19:02:46.231198 sshd[8100]: Accepted publickey for core from 139.178.89.65 port 53446 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 19:02:46.231877 sshd-session[8100]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 19:02:46.234935 systemd-logind[1909]: New session 16 of user core. Sep 12 19:02:46.242955 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 12 19:02:46.370245 sshd[8104]: Connection closed by 139.178.89.65 port 53446 Sep 12 19:02:46.370457 sshd-session[8100]: pam_unix(sshd:session): session closed for user core Sep 12 19:02:46.372380 systemd[1]: sshd@13-139.178.94.145:22-139.178.89.65:53446.service: Deactivated successfully. Sep 12 19:02:46.373426 systemd[1]: session-16.scope: Deactivated successfully. Sep 12 19:02:46.374172 systemd-logind[1909]: Session 16 logged out. Waiting for processes to exit. Sep 12 19:02:46.374774 systemd-logind[1909]: Removed session 16. Sep 12 19:02:49.854261 containerd[1921]: time="2025-09-12T19:02:49.854234322Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0\" id:\"6259b6bdc35a0e658f62440362c90d9cd12fd06bcf35531812d27728f4e23462\" pid:8137 exited_at:{seconds:1757703769 nanos:854035506}" Sep 12 19:02:51.392913 systemd[1]: Started sshd@14-139.178.94.145:22-139.178.89.65:35464.service - OpenSSH per-connection server daemon (139.178.89.65:35464). Sep 12 19:02:51.486026 sshd[8171]: Accepted publickey for core from 139.178.89.65 port 35464 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 19:02:51.487575 sshd-session[8171]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 19:02:51.492962 systemd-logind[1909]: New session 17 of user core. Sep 12 19:02:51.502866 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 12 19:02:51.653475 sshd[8175]: Connection closed by 139.178.89.65 port 35464 Sep 12 19:02:51.653654 sshd-session[8171]: pam_unix(sshd:session): session closed for user core Sep 12 19:02:51.655665 systemd[1]: sshd@14-139.178.94.145:22-139.178.89.65:35464.service: Deactivated successfully. 
Sep 12 19:02:51.656783 systemd[1]: session-17.scope: Deactivated successfully. Sep 12 19:02:51.657528 systemd-logind[1909]: Session 17 logged out. Waiting for processes to exit. Sep 12 19:02:51.658220 systemd-logind[1909]: Removed session 17. Sep 12 19:02:56.682275 systemd[1]: Started sshd@15-139.178.94.145:22-139.178.89.65:35480.service - OpenSSH per-connection server daemon (139.178.89.65:35480). Sep 12 19:02:56.726041 sshd[8217]: Accepted publickey for core from 139.178.89.65 port 35480 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 19:02:56.726669 sshd-session[8217]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 19:02:56.729478 systemd-logind[1909]: New session 18 of user core. Sep 12 19:02:56.745861 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 12 19:02:56.893824 sshd[8220]: Connection closed by 139.178.89.65 port 35480 Sep 12 19:02:56.894053 sshd-session[8217]: pam_unix(sshd:session): session closed for user core Sep 12 19:02:56.896149 systemd[1]: sshd@15-139.178.94.145:22-139.178.89.65:35480.service: Deactivated successfully. Sep 12 19:02:56.897208 systemd[1]: session-18.scope: Deactivated successfully. Sep 12 19:02:56.897745 systemd-logind[1909]: Session 18 logged out. Waiting for processes to exit. Sep 12 19:02:56.898383 systemd-logind[1909]: Removed session 18. Sep 12 19:02:57.458819 containerd[1921]: time="2025-09-12T19:02:57.458771612Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0\" id:\"e1afeca4cbcde89ab7e4665e4fd55a27a38ab61eb41663d3ee5fd74dc0d34c80\" pid:8255 exited_at:{seconds:1757703777 nanos:458613397}" Sep 12 19:03:01.916990 systemd[1]: Started sshd@16-139.178.94.145:22-139.178.89.65:53820.service - OpenSSH per-connection server daemon (139.178.89.65:53820). Sep 12 19:03:01.966925 sshd[8265]: Accepted publickey for core from 139.178.89.65 port 53820 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 19:03:01.967659 sshd-session[8265]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 19:03:01.970942 systemd-logind[1909]: New session 19 of user core. Sep 12 19:03:01.982855 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 12 19:03:02.073757 sshd[8268]: Connection closed by 139.178.89.65 port 53820 Sep 12 19:03:02.073958 sshd-session[8265]: pam_unix(sshd:session): session closed for user core Sep 12 19:03:02.076215 systemd[1]: sshd@16-139.178.94.145:22-139.178.89.65:53820.service: Deactivated successfully. Sep 12 19:03:02.077244 systemd[1]: session-19.scope: Deactivated successfully. Sep 12 19:03:02.077776 systemd-logind[1909]: Session 19 logged out. Waiting for processes to exit. Sep 12 19:03:02.078544 systemd-logind[1909]: Removed session 19. Sep 12 19:03:07.087609 systemd[1]: Started sshd@17-139.178.94.145:22-139.178.89.65:53830.service - OpenSSH per-connection server daemon (139.178.89.65:53830). Sep 12 19:03:07.120221 sshd[8295]: Accepted publickey for core from 139.178.89.65 port 53830 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 19:03:07.120990 sshd-session[8295]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 19:03:07.123993 systemd-logind[1909]: New session 20 of user core. Sep 12 19:03:07.141780 systemd[1]: Started session-20.scope - Session 20 of User core. 
Sep 12 19:03:07.234876 sshd[8298]: Connection closed by 139.178.89.65 port 53830 Sep 12 19:03:07.235078 sshd-session[8295]: pam_unix(sshd:session): session closed for user core Sep 12 19:03:07.247836 systemd[1]: sshd@17-139.178.94.145:22-139.178.89.65:53830.service: Deactivated successfully. Sep 12 19:03:07.248790 systemd[1]: session-20.scope: Deactivated successfully. Sep 12 19:03:07.249248 systemd-logind[1909]: Session 20 logged out. Waiting for processes to exit. Sep 12 19:03:07.250343 systemd[1]: Started sshd@18-139.178.94.145:22-139.178.89.65:53834.service - OpenSSH per-connection server daemon (139.178.89.65:53834). Sep 12 19:03:07.250888 systemd-logind[1909]: Removed session 20. Sep 12 19:03:07.283094 sshd[8323]: Accepted publickey for core from 139.178.89.65 port 53834 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 19:03:07.283881 sshd-session[8323]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 19:03:07.286695 systemd-logind[1909]: New session 21 of user core. Sep 12 19:03:07.301751 systemd[1]: Started session-21.scope - Session 21 of User core. Sep 12 19:03:07.443179 sshd[8326]: Connection closed by 139.178.89.65 port 53834 Sep 12 19:03:07.443422 sshd-session[8323]: pam_unix(sshd:session): session closed for user core Sep 12 19:03:07.458206 systemd[1]: sshd@18-139.178.94.145:22-139.178.89.65:53834.service: Deactivated successfully. Sep 12 19:03:07.460399 systemd[1]: session-21.scope: Deactivated successfully. Sep 12 19:03:07.461756 systemd-logind[1909]: Session 21 logged out. Waiting for processes to exit. Sep 12 19:03:07.465826 systemd[1]: Started sshd@19-139.178.94.145:22-139.178.89.65:53848.service - OpenSSH per-connection server daemon (139.178.89.65:53848). Sep 12 19:03:07.467006 systemd-logind[1909]: Removed session 21. Sep 12 19:03:07.571001 sshd[8349]: Accepted publickey for core from 139.178.89.65 port 53848 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 19:03:07.572346 sshd-session[8349]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 19:03:07.577280 systemd-logind[1909]: New session 22 of user core. Sep 12 19:03:07.588892 systemd[1]: Started session-22.scope - Session 22 of User core. Sep 12 19:03:08.225347 sshd[8354]: Connection closed by 139.178.89.65 port 53848 Sep 12 19:03:08.225576 sshd-session[8349]: pam_unix(sshd:session): session closed for user core Sep 12 19:03:08.235433 systemd[1]: sshd@19-139.178.94.145:22-139.178.89.65:53848.service: Deactivated successfully. Sep 12 19:03:08.236649 systemd[1]: session-22.scope: Deactivated successfully. Sep 12 19:03:08.237208 systemd-logind[1909]: Session 22 logged out. Waiting for processes to exit. Sep 12 19:03:08.238799 systemd[1]: Started sshd@20-139.178.94.145:22-139.178.89.65:53864.service - OpenSSH per-connection server daemon (139.178.89.65:53864). Sep 12 19:03:08.239338 systemd-logind[1909]: Removed session 22. 
Sep 12 19:03:08.272229 containerd[1921]: time="2025-09-12T19:03:08.272206708Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1218fd0407f42f5fe144f9982fe60831ca59c37b9a39a22a2cbe7377cfd439e0\" id:\"53ccece1e3dbad20ae9aa42cd4f1fda0b14efb32fcc5c1056a00caf86990b76b\" pid:8400 exited_at:{seconds:1757703788 nanos:272098166}" Sep 12 19:03:08.282542 sshd[8386]: Accepted publickey for core from 139.178.89.65 port 53864 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 19:03:08.283334 sshd-session[8386]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 19:03:08.286108 systemd-logind[1909]: New session 23 of user core. Sep 12 19:03:08.302741 systemd[1]: Started session-23.scope - Session 23 of User core. Sep 12 19:03:08.437553 sshd[8413]: Connection closed by 139.178.89.65 port 53864 Sep 12 19:03:08.437759 sshd-session[8386]: pam_unix(sshd:session): session closed for user core Sep 12 19:03:08.455233 systemd[1]: sshd@20-139.178.94.145:22-139.178.89.65:53864.service: Deactivated successfully. Sep 12 19:03:08.456374 systemd[1]: session-23.scope: Deactivated successfully. Sep 12 19:03:08.456972 systemd-logind[1909]: Session 23 logged out. Waiting for processes to exit. Sep 12 19:03:08.458400 systemd[1]: Started sshd@21-139.178.94.145:22-139.178.89.65:53872.service - OpenSSH per-connection server daemon (139.178.89.65:53872). Sep 12 19:03:08.458907 systemd-logind[1909]: Removed session 23. Sep 12 19:03:08.494508 sshd[8436]: Accepted publickey for core from 139.178.89.65 port 53872 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 19:03:08.495239 sshd-session[8436]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 19:03:08.498135 systemd-logind[1909]: New session 24 of user core. Sep 12 19:03:08.516777 systemd[1]: Started session-24.scope - Session 24 of User core. Sep 12 19:03:08.604563 sshd[8439]: Connection closed by 139.178.89.65 port 53872 Sep 12 19:03:08.604754 sshd-session[8436]: pam_unix(sshd:session): session closed for user core Sep 12 19:03:08.606654 systemd[1]: sshd@21-139.178.94.145:22-139.178.89.65:53872.service: Deactivated successfully. Sep 12 19:03:08.607703 systemd[1]: session-24.scope: Deactivated successfully. Sep 12 19:03:08.608669 systemd-logind[1909]: Session 24 logged out. Waiting for processes to exit. Sep 12 19:03:08.609214 systemd-logind[1909]: Removed session 24. Sep 12 19:03:09.736706 containerd[1921]: time="2025-09-12T19:03:09.736673415Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ae7b61648fb5a43692ef7067d17b1db1cbb2d2e67841eb8a64a732fc9db4061a\" id:\"f92568a90b66da79c771be7079a794263fdf3cf4953029bb8756d972f5b4926f\" pid:8475 exited_at:{seconds:1757703789 nanos:736457376}" Sep 12 19:03:13.622504 systemd[1]: Started sshd@22-139.178.94.145:22-139.178.89.65:48788.service - OpenSSH per-connection server daemon (139.178.89.65:48788). Sep 12 19:03:13.665718 sshd[8503]: Accepted publickey for core from 139.178.89.65 port 48788 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 19:03:13.666375 sshd-session[8503]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 19:03:13.669429 systemd-logind[1909]: New session 25 of user core. Sep 12 19:03:13.679738 systemd[1]: Started session-25.scope - Session 25 of User core. 
Sep 12 19:03:13.768828 sshd[8506]: Connection closed by 139.178.89.65 port 48788 Sep 12 19:03:13.769027 sshd-session[8503]: pam_unix(sshd:session): session closed for user core Sep 12 19:03:13.771021 systemd[1]: sshd@22-139.178.94.145:22-139.178.89.65:48788.service: Deactivated successfully. Sep 12 19:03:13.772093 systemd[1]: session-25.scope: Deactivated successfully. Sep 12 19:03:13.773050 systemd-logind[1909]: Session 25 logged out. Waiting for processes to exit. Sep 12 19:03:13.773679 systemd-logind[1909]: Removed session 25. Sep 12 19:03:18.786444 systemd[1]: Started sshd@23-139.178.94.145:22-139.178.89.65:48804.service - OpenSSH per-connection server daemon (139.178.89.65:48804). Sep 12 19:03:18.844156 sshd[8529]: Accepted publickey for core from 139.178.89.65 port 48804 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 19:03:18.845067 sshd-session[8529]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 19:03:18.848670 systemd-logind[1909]: New session 26 of user core. Sep 12 19:03:18.860738 systemd[1]: Started session-26.scope - Session 26 of User core. Sep 12 19:03:18.948222 sshd[8532]: Connection closed by 139.178.89.65 port 48804 Sep 12 19:03:18.948441 sshd-session[8529]: pam_unix(sshd:session): session closed for user core Sep 12 19:03:18.950521 systemd[1]: sshd@23-139.178.94.145:22-139.178.89.65:48804.service: Deactivated successfully. Sep 12 19:03:18.951703 systemd[1]: session-26.scope: Deactivated successfully. Sep 12 19:03:18.952737 systemd-logind[1909]: Session 26 logged out. Waiting for processes to exit. Sep 12 19:03:18.953452 systemd-logind[1909]: Removed session 26. Sep 12 19:03:19.856802 containerd[1921]: time="2025-09-12T19:03:19.856744733Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5bc6e18b3df62c52f60f8c0d16ba2c562d9177f380c103f13a9c543c774676e0\" id:\"6b7a958207632977ff75b957ea4effeefed996f168dba758bf8d90cfd37ca420\" pid:8567 exited_at:{seconds:1757703799 nanos:856539875}" Sep 12 19:03:23.966674 systemd[1]: Started sshd@24-139.178.94.145:22-139.178.89.65:48386.service - OpenSSH per-connection server daemon (139.178.89.65:48386). Sep 12 19:03:24.022915 sshd[8589]: Accepted publickey for core from 139.178.89.65 port 48386 ssh2: RSA SHA256:jhJE5yMhCjIg1NNlTVEYEeu45ef3XMX6vpKvmtEe/iU Sep 12 19:03:24.023901 sshd-session[8589]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 19:03:24.027892 systemd-logind[1909]: New session 27 of user core. Sep 12 19:03:24.047282 systemd[1]: Started session-27.scope - Session 27 of User core. Sep 12 19:03:24.137409 sshd[8592]: Connection closed by 139.178.89.65 port 48386 Sep 12 19:03:24.137607 sshd-session[8589]: pam_unix(sshd:session): session closed for user core Sep 12 19:03:24.139553 systemd[1]: sshd@24-139.178.94.145:22-139.178.89.65:48386.service: Deactivated successfully. Sep 12 19:03:24.140582 systemd[1]: session-27.scope: Deactivated successfully. Sep 12 19:03:24.141520 systemd-logind[1909]: Session 27 logged out. Waiting for processes to exit. Sep 12 19:03:24.142264 systemd-logind[1909]: Removed session 27.