Aug 13 01:45:01.893401 kernel: Linux version 6.12.40-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue Aug 12 21:42:48 -00 2025
Aug 13 01:45:01.893416 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=215bdedb8de38f6b96ec4f9db80853e25015f60454b867e319fdcb9244320a21
Aug 13 01:45:01.893422 kernel: BIOS-provided physical RAM map:
Aug 13 01:45:01.893426 kernel: BIOS-e820: [mem 0x0000000000000000-0x00000000000997ff] usable
Aug 13 01:45:01.893430 kernel: BIOS-e820: [mem 0x0000000000099800-0x000000000009ffff] reserved
Aug 13 01:45:01.893434 kernel: BIOS-e820: [mem 0x00000000000e0000-0x00000000000fffff] reserved
Aug 13 01:45:01.893439 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000003fffffff] usable
Aug 13 01:45:01.893444 kernel: BIOS-e820: [mem 0x0000000040000000-0x00000000403fffff] reserved
Aug 13 01:45:01.893448 kernel: BIOS-e820: [mem 0x0000000040400000-0x000000006dfb3fff] usable
Aug 13 01:45:01.893452 kernel: BIOS-e820: [mem 0x000000006dfb4000-0x000000006dfb4fff] ACPI NVS
Aug 13 01:45:01.893457 kernel: BIOS-e820: [mem 0x000000006dfb5000-0x000000006dfb5fff] reserved
Aug 13 01:45:01.893461 kernel: BIOS-e820: [mem 0x000000006dfb6000-0x0000000077fc6fff] usable
Aug 13 01:45:01.893465 kernel: BIOS-e820: [mem 0x0000000077fc7000-0x00000000790a9fff] reserved
Aug 13 01:45:01.893469 kernel: BIOS-e820: [mem 0x00000000790aa000-0x0000000079232fff] usable
Aug 13 01:45:01.893475 kernel: BIOS-e820: [mem 0x0000000079233000-0x0000000079664fff] ACPI NVS
Aug 13 01:45:01.893480 kernel: BIOS-e820: [mem 0x0000000079665000-0x000000007befefff] reserved
Aug 13 01:45:01.893484 kernel: BIOS-e820: [mem 0x000000007beff000-0x000000007befffff] usable
Aug 13 01:45:01.893489 kernel: BIOS-e820: [mem 0x000000007bf00000-0x000000007f7fffff] reserved
Aug 13 01:45:01.893494 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Aug 13 01:45:01.893498 kernel: BIOS-e820: [mem 0x00000000fe000000-0x00000000fe010fff] reserved
Aug 13 01:45:01.893503 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec00fff] reserved
Aug 13 01:45:01.893507 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Aug 13 01:45:01.893513 kernel: BIOS-e820: [mem 0x00000000ff000000-0x00000000ffffffff] reserved
Aug 13 01:45:01.893517 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000087f7fffff] usable
Aug 13 01:45:01.893522 kernel: NX (Execute Disable) protection: active
Aug 13 01:45:01.893527 kernel: APIC: Static calls initialized
Aug 13 01:45:01.893531 kernel: SMBIOS 3.2.1 present.
Aug 13 01:45:01.893536 kernel: DMI: Supermicro PIO-519C-MR-PH004/X11SCH-F, BIOS 1.5 11/17/2020
Aug 13 01:45:01.893541 kernel: DMI: Memory slots populated: 2/4
Aug 13 01:45:01.893545 kernel: tsc: Detected 3400.000 MHz processor
Aug 13 01:45:01.893550 kernel: tsc: Detected 3399.906 MHz TSC
Aug 13 01:45:01.893554 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Aug 13 01:45:01.893559 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Aug 13 01:45:01.893564 kernel: last_pfn = 0x87f800 max_arch_pfn = 0x400000000
Aug 13 01:45:01.893570 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 23), built from 10 variable MTRRs
Aug 13 01:45:01.893575 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Aug 13 01:45:01.893580 kernel: last_pfn = 0x7bf00 max_arch_pfn = 0x400000000
Aug 13 01:45:01.893584 kernel: Using GB pages for direct mapping
Aug 13 01:45:01.893589 kernel: ACPI: Early table checksum verification disabled
Aug 13 01:45:01.893594 kernel: ACPI: RSDP 0x00000000000F05B0 000024 (v02 SUPERM)
Aug 13 01:45:01.893600 kernel: ACPI: XSDT 0x00000000795460C8 00010C (v01 SUPERM SUPERM 01072009 AMI 00010013)
Aug 13 01:45:01.893606 kernel: ACPI: FACP 0x0000000079582620 000114 (v06 01072009 AMI 00010013)
Aug 13 01:45:01.893611 kernel: ACPI: DSDT 0x0000000079546268 03C3B7 (v02 SUPERM SMCI--MB 01072009 INTL 20160527)
Aug 13 01:45:01.893616 kernel: ACPI: FACS 0x0000000079664F80 000040
Aug 13 01:45:01.893621 kernel: ACPI: APIC 0x0000000079582738 00012C (v04 01072009 AMI 00010013)
Aug 13 01:45:01.893626 kernel: ACPI: FPDT 0x0000000079582868 000044 (v01 01072009 AMI 00010013)
Aug 13 01:45:01.893631 kernel: ACPI: FIDT 0x00000000795828B0 00009C (v01 SUPERM SMCI--MB 01072009 AMI 00010013)
Aug 13 01:45:01.893636 kernel: ACPI: MCFG 0x0000000079582950 00003C (v01 SUPERM SMCI--MB 01072009 MSFT 00000097)
Aug 13 01:45:01.893642 kernel: ACPI: SPMI 0x0000000079582990 000041 (v05 SUPERM SMCI--MB 00000000 AMI. 00000000)
Aug 13 01:45:01.893647 kernel: ACPI: SSDT 0x00000000795829D8 001B1C (v02 CpuRef CpuSsdt 00003000 INTL 20160527)
Aug 13 01:45:01.893652 kernel: ACPI: SSDT 0x00000000795844F8 0031C6 (v02 SaSsdt SaSsdt 00003000 INTL 20160527)
Aug 13 01:45:01.893657 kernel: ACPI: SSDT 0x00000000795876C0 00232B (v02 PegSsd PegSsdt 00001000 INTL 20160527)
Aug 13 01:45:01.893662 kernel: ACPI: HPET 0x00000000795899F0 000038 (v01 SUPERM SMCI--MB 00000002 01000013)
Aug 13 01:45:01.893666 kernel: ACPI: SSDT 0x0000000079589A28 000FAE (v02 SUPERM Ther_Rvp 00001000 INTL 20160527)
Aug 13 01:45:01.893671 kernel: ACPI: SSDT 0x000000007958A9D8 0008F7 (v02 INTEL xh_mossb 00000000 INTL 20160527)
Aug 13 01:45:01.893676 kernel: ACPI: UEFI 0x000000007958B2D0 000042 (v01 SUPERM SMCI--MB 00000002 01000013)
Aug 13 01:45:01.893682 kernel: ACPI: LPIT 0x000000007958B318 000094 (v01 SUPERM SMCI--MB 00000002 01000013)
Aug 13 01:45:01.893687 kernel: ACPI: SSDT 0x000000007958B3B0 0027DE (v02 SUPERM PtidDevc 00001000 INTL 20160527)
Aug 13 01:45:01.893692 kernel: ACPI: SSDT 0x000000007958DB90 0014E2 (v02 SUPERM TbtTypeC 00000000 INTL 20160527)
Aug 13 01:45:01.893697 kernel: ACPI: DBGP 0x000000007958F078 000034 (v01 SUPERM SMCI--MB 00000002 01000013)
Aug 13 01:45:01.893702 kernel: ACPI: DBG2 0x000000007958F0B0 000054 (v00 SUPERM SMCI--MB 00000002 01000013)
Aug 13 01:45:01.893707 kernel: ACPI: SSDT 0x000000007958F108 001B67 (v02 SUPERM UsbCTabl 00001000 INTL 20160527)
Aug 13 01:45:01.893712 kernel: ACPI: DMAR 0x0000000079590C70 0000A8 (v01 INTEL EDK2 00000002 01000013)
Aug 13 01:45:01.893717 kernel: ACPI: SSDT 0x0000000079590D18 000144 (v02 Intel ADebTabl 00001000 INTL 20160527)
Aug 13 01:45:01.893722 kernel: ACPI: TPM2 0x0000000079590E60 000034 (v04 SUPERM SMCI--MB 00000001 AMI 00000000)
Aug 13 01:45:01.893728 kernel: ACPI: SSDT 0x0000000079590E98 000D8F (v02 INTEL SpsNm 00000002 INTL 20160527)
Aug 13 01:45:01.893733 kernel: ACPI: WSMT 0x0000000079591C28 000028 (v01 \xf4m 01072009 AMI 00010013)
Aug 13 01:45:01.893738 kernel: ACPI: EINJ 0x0000000079591C50 000130 (v01 AMI AMI.EINJ 00000000 AMI. 00000000)
Aug 13 01:45:01.893743 kernel: ACPI: ERST 0x0000000079591D80 000230 (v01 AMIER AMI.ERST 00000000 AMI. 00000000)
Aug 13 01:45:01.893748 kernel: ACPI: BERT 0x0000000079591FB0 000030 (v01 AMI AMI.BERT 00000000 AMI. 00000000)
Aug 13 01:45:01.893753 kernel: ACPI: HEST 0x0000000079591FE0 00027C (v01 AMI AMI.HEST 00000000 AMI. 00000000)
Aug 13 01:45:01.893758 kernel: ACPI: SSDT 0x0000000079592260 000162 (v01 SUPERM SMCCDN 00000000 INTL 20181221)
Aug 13 01:45:01.893763 kernel: ACPI: Reserving FACP table memory at [mem 0x79582620-0x79582733]
Aug 13 01:45:01.893769 kernel: ACPI: Reserving DSDT table memory at [mem 0x79546268-0x7958261e]
Aug 13 01:45:01.893774 kernel: ACPI: Reserving FACS table memory at [mem 0x79664f80-0x79664fbf]
Aug 13 01:45:01.893778 kernel: ACPI: Reserving APIC table memory at [mem 0x79582738-0x79582863]
Aug 13 01:45:01.893783 kernel: ACPI: Reserving FPDT table memory at [mem 0x79582868-0x795828ab]
Aug 13 01:45:01.893788 kernel: ACPI: Reserving FIDT table memory at [mem 0x795828b0-0x7958294b]
Aug 13 01:45:01.893793 kernel: ACPI: Reserving MCFG table memory at [mem 0x79582950-0x7958298b]
Aug 13 01:45:01.893798 kernel: ACPI: Reserving SPMI table memory at [mem 0x79582990-0x795829d0]
Aug 13 01:45:01.893803 kernel: ACPI: Reserving SSDT table memory at [mem 0x795829d8-0x795844f3]
Aug 13 01:45:01.893808 kernel: ACPI: Reserving SSDT table memory at [mem 0x795844f8-0x795876bd]
Aug 13 01:45:01.893813 kernel: ACPI: Reserving SSDT table memory at [mem 0x795876c0-0x795899ea]
Aug 13 01:45:01.893818 kernel: ACPI: Reserving HPET table memory at [mem 0x795899f0-0x79589a27]
Aug 13 01:45:01.893823 kernel: ACPI: Reserving SSDT table memory at [mem 0x79589a28-0x7958a9d5]
Aug 13 01:45:01.893828 kernel: ACPI: Reserving SSDT table memory at [mem 0x7958a9d8-0x7958b2ce]
Aug 13 01:45:01.893833 kernel: ACPI: Reserving UEFI table memory at [mem 0x7958b2d0-0x7958b311]
Aug 13 01:45:01.893838 kernel: ACPI: Reserving LPIT table memory at [mem 0x7958b318-0x7958b3ab]
Aug 13 01:45:01.893843 kernel: ACPI: Reserving SSDT table memory at [mem 0x7958b3b0-0x7958db8d]
Aug 13 01:45:01.893847 kernel: ACPI: Reserving SSDT table memory at [mem 0x7958db90-0x7958f071]
Aug 13 01:45:01.893852 kernel: ACPI: Reserving DBGP table memory at [mem 0x7958f078-0x7958f0ab]
Aug 13 01:45:01.893857 kernel: ACPI: Reserving DBG2 table memory at [mem 0x7958f0b0-0x7958f103]
Aug 13 01:45:01.893866 kernel: ACPI: Reserving SSDT table memory at [mem 0x7958f108-0x79590c6e]
Aug 13 01:45:01.893871 kernel: ACPI: Reserving DMAR table memory at [mem 0x79590c70-0x79590d17]
Aug 13 01:45:01.893876 kernel: ACPI: Reserving SSDT table memory at [mem 0x79590d18-0x79590e5b]
Aug 13 01:45:01.893899 kernel: ACPI: Reserving TPM2 table memory at [mem 0x79590e60-0x79590e93]
Aug 13 01:45:01.893904 kernel: ACPI: Reserving SSDT table memory at [mem 0x79590e98-0x79591c26]
Aug 13 01:45:01.893909 kernel: ACPI: Reserving WSMT table memory at [mem 0x79591c28-0x79591c4f]
Aug 13 01:45:01.893928 kernel: ACPI: Reserving EINJ table memory at [mem 0x79591c50-0x79591d7f]
Aug 13 01:45:01.893933 kernel: ACPI: Reserving ERST table memory at [mem 0x79591d80-0x79591faf]
Aug 13 01:45:01.893937 kernel: ACPI: Reserving BERT table memory at [mem 0x79591fb0-0x79591fdf]
Aug 13 01:45:01.893943 kernel: ACPI: Reserving HEST table memory at [mem 0x79591fe0-0x7959225b]
Aug 13 01:45:01.893948 kernel: ACPI: Reserving SSDT table memory at [mem 0x79592260-0x795923c1]
Aug 13 01:45:01.893953 kernel: No NUMA configuration found
Aug 13 01:45:01.893958 kernel: Faking a node at [mem 0x0000000000000000-0x000000087f7fffff]
Aug 13 01:45:01.893963 kernel: NODE_DATA(0) allocated [mem 0x87f7f8dc0-0x87f7fffff]
Aug 13 01:45:01.893968 kernel: Zone ranges:
Aug 13 01:45:01.893973 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Aug 13 01:45:01.893978 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Aug 13 01:45:01.893983 kernel: Normal [mem 0x0000000100000000-0x000000087f7fffff]
Aug 13 01:45:01.893989 kernel: Device empty
Aug 13 01:45:01.893994 kernel: Movable zone start for each node
Aug 13 01:45:01.893999 kernel: Early memory node ranges
Aug 13 01:45:01.894004 kernel: node 0: [mem 0x0000000000001000-0x0000000000098fff]
Aug 13 01:45:01.894009 kernel: node 0: [mem 0x0000000000100000-0x000000003fffffff]
Aug 13 01:45:01.894013 kernel: node 0: [mem 0x0000000040400000-0x000000006dfb3fff]
Aug 13 01:45:01.894018 kernel: node 0: [mem 0x000000006dfb6000-0x0000000077fc6fff]
Aug 13 01:45:01.894024 kernel: node 0: [mem 0x00000000790aa000-0x0000000079232fff]
Aug 13 01:45:01.894032 kernel: node 0: [mem 0x000000007beff000-0x000000007befffff]
Aug 13 01:45:01.894038 kernel: node 0: [mem 0x0000000100000000-0x000000087f7fffff]
Aug 13 01:45:01.894043 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000087f7fffff]
Aug 13 01:45:01.894048 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Aug 13 01:45:01.894055 kernel: On node 0, zone DMA: 103 pages in unavailable ranges
Aug 13 01:45:01.894060 kernel: On node 0, zone DMA32: 1024 pages in unavailable ranges
Aug 13 01:45:01.894065 kernel: On node 0, zone DMA32: 2 pages in unavailable ranges
Aug 13 01:45:01.894070 kernel: On node 0, zone DMA32: 4323 pages in unavailable ranges
Aug 13 01:45:01.894076 kernel: On node 0, zone DMA32: 11468 pages in unavailable ranges
Aug 13 01:45:01.894081 kernel: On node 0, zone Normal: 16640 pages in unavailable ranges
Aug 13 01:45:01.894087 kernel: On node 0, zone Normal: 2048 pages in unavailable ranges
Aug 13 01:45:01.894092 kernel: ACPI: PM-Timer IO Port: 0x1808
Aug 13 01:45:01.894098 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Aug 13 01:45:01.894103 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Aug 13 01:45:01.894108 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Aug 13 01:45:01.894113 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Aug 13 01:45:01.894118 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Aug 13 01:45:01.894124 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Aug 13 01:45:01.894129 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Aug 13 01:45:01.894135 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Aug 13 01:45:01.894140 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Aug 13 01:45:01.894145 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Aug 13 01:45:01.894151 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Aug 13 01:45:01.894156 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Aug 13 01:45:01.894161 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Aug 13 01:45:01.894166 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Aug 13 01:45:01.894171 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Aug 13 01:45:01.894176 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Aug 13 01:45:01.894182 kernel: IOAPIC[0]: apic_id 2, version 32, address 0xfec00000, GSI 0-119
Aug 13 01:45:01.894187 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Aug 13 01:45:01.894193 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Aug 13 01:45:01.894198 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Aug 13 01:45:01.894203 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Aug 13 01:45:01.894208 kernel: TSC deadline timer available
Aug 13 01:45:01.894214 kernel: CPU topo: Max. logical packages: 1
Aug 13 01:45:01.894219 kernel: CPU topo: Max. logical dies: 1
Aug 13 01:45:01.894224 kernel: CPU topo: Max. dies per package: 1
Aug 13 01:45:01.894230 kernel: CPU topo: Max. threads per core: 2
Aug 13 01:45:01.894235 kernel: CPU topo: Num. cores per package: 8
Aug 13 01:45:01.894240 kernel: CPU topo: Num. threads per package: 16
Aug 13 01:45:01.894246 kernel: CPU topo: Allowing 16 present CPUs plus 0 hotplug CPUs
Aug 13 01:45:01.894251 kernel: [mem 0x7f800000-0xdfffffff] available for PCI devices
Aug 13 01:45:01.894256 kernel: Booting paravirtualized kernel on bare hardware
Aug 13 01:45:01.894262 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Aug 13 01:45:01.894267 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1
Aug 13 01:45:01.894272 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144
Aug 13 01:45:01.894278 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152
Aug 13 01:45:01.894283 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15
Aug 13 01:45:01.894290 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=215bdedb8de38f6b96ec4f9db80853e25015f60454b867e319fdcb9244320a21
Aug 13 01:45:01.894295 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Aug 13 01:45:01.894300 kernel: random: crng init done
Aug 13 01:45:01.894305 kernel: Dentry cache hash table entries: 4194304 (order: 13, 33554432 bytes, linear)
Aug 13 01:45:01.894311 kernel: Inode-cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Aug 13 01:45:01.894316 kernel: Fallback order for Node 0: 0
Aug 13 01:45:01.894322 kernel: Built 1 zonelists, mobility grouping on. Total pages: 8352999
Aug 13 01:45:01.894328 kernel: Policy zone: Normal
Aug 13 01:45:01.894333 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Aug 13 01:45:01.894338 kernel: software IO TLB: area num 16.
Aug 13 01:45:01.894343 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1
Aug 13 01:45:01.894349 kernel: ftrace: allocating 40098 entries in 157 pages
Aug 13 01:45:01.894354 kernel: ftrace: allocated 157 pages with 5 groups
Aug 13 01:45:01.894359 kernel: Dynamic Preempt: voluntary
Aug 13 01:45:01.894364 kernel: rcu: Preemptible hierarchical RCU implementation.
Aug 13 01:45:01.894370 kernel: rcu: RCU event tracing is enabled.
Aug 13 01:45:01.894376 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16.
Aug 13 01:45:01.894381 kernel: Trampoline variant of Tasks RCU enabled.
Aug 13 01:45:01.894387 kernel: Rude variant of Tasks RCU enabled.
Aug 13 01:45:01.894392 kernel: Tracing variant of Tasks RCU enabled.
Aug 13 01:45:01.894398 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Aug 13 01:45:01.894403 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16
Aug 13 01:45:01.894408 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Aug 13 01:45:01.894413 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Aug 13 01:45:01.894419 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Aug 13 01:45:01.894425 kernel: NR_IRQS: 33024, nr_irqs: 2184, preallocated irqs: 16
Aug 13 01:45:01.894430 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Aug 13 01:45:01.894435 kernel: Console: colour VGA+ 80x25
Aug 13 01:45:01.894441 kernel: printk: legacy console [tty0] enabled
Aug 13 01:45:01.894446 kernel: printk: legacy console [ttyS1] enabled
Aug 13 01:45:01.894451 kernel: ACPI: Core revision 20240827
Aug 13 01:45:01.894456 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 79635855245 ns
Aug 13 01:45:01.894462 kernel: APIC: Switch to symmetric I/O mode setup
Aug 13 01:45:01.894467 kernel: DMAR: Host address width 39
Aug 13 01:45:01.894473 kernel: DMAR: DRHD base: 0x000000fed90000 flags: 0x0
Aug 13 01:45:01.894478 kernel: DMAR: dmar0: reg_base_addr fed90000 ver 1:0 cap 1c0000c40660462 ecap 19e2ff0505e
Aug 13 01:45:01.894484 kernel: DMAR: DRHD base: 0x000000fed91000 flags: 0x1
Aug 13 01:45:01.894489 kernel: DMAR: dmar1: reg_base_addr fed91000 ver 1:0 cap d2008c40660462 ecap f050da
Aug 13 01:45:01.894494 kernel: DMAR: RMRR base: 0x00000079f11000 end: 0x0000007a15afff
Aug 13 01:45:01.894499 kernel: DMAR: RMRR base: 0x0000007d000000 end: 0x0000007f7fffff
Aug 13 01:45:01.894504 kernel: DMAR-IR: IOAPIC id 2 under DRHD base 0xfed91000 IOMMU 1
Aug 13 01:45:01.894510 kernel: DMAR-IR: HPET id 0 under DRHD base 0xfed91000
Aug 13 01:45:01.894515 kernel: DMAR-IR: Queued invalidation will be enabled to support x2apic and Intr-remapping.
Aug 13 01:45:01.894521 kernel: DMAR-IR: Enabled IRQ remapping in x2apic mode
Aug 13 01:45:01.894526 kernel: x2apic enabled
Aug 13 01:45:01.894532 kernel: APIC: Switched APIC routing to: cluster x2apic
Aug 13 01:45:01.894537 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Aug 13 01:45:01.894542 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x3101f59f5e6, max_idle_ns: 440795259996 ns
Aug 13 01:45:01.894547 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 6799.81 BogoMIPS (lpj=3399906)
Aug 13 01:45:01.894553 kernel: CPU0: Thermal monitoring enabled (TM1)
Aug 13 01:45:01.894558 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Aug 13 01:45:01.894563 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4
Aug 13 01:45:01.894569 kernel: process: using mwait in idle threads
Aug 13 01:45:01.894574 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Aug 13 01:45:01.894580 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit
Aug 13 01:45:01.894585 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Aug 13 01:45:01.894590 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Aug 13 01:45:01.894595 kernel: RETBleed: Mitigation: Enhanced IBRS
Aug 13 01:45:01.894601 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Aug 13 01:45:01.894606 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Aug 13 01:45:01.894612 kernel: TAA: Mitigation: Clear CPU buffers
Aug 13 01:45:01.894618 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Aug 13 01:45:01.894623 kernel: SRBDS: Mitigation: Microcode
Aug 13 01:45:01.894628 kernel: GDS: Vulnerable: No microcode
Aug 13 01:45:01.894633 kernel: ITS: Mitigation: Aligned branch/return thunks
Aug 13 01:45:01.894639 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Aug 13 01:45:01.894644 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Aug 13 01:45:01.894649 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Aug 13 01:45:01.894654 kernel: x86/fpu: Supporting XSAVE feature 0x008: 'MPX bounds registers'
Aug 13 01:45:01.894661 kernel: x86/fpu: Supporting XSAVE feature 0x010: 'MPX CSR'
Aug 13 01:45:01.894666 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Aug 13 01:45:01.894671 kernel: x86/fpu: xstate_offset[3]: 832, xstate_sizes[3]: 64
Aug 13 01:45:01.894676 kernel: x86/fpu: xstate_offset[4]: 896, xstate_sizes[4]: 64
Aug 13 01:45:01.894682 kernel: x86/fpu: Enabled xstate features 0x1f, context size is 960 bytes, using 'compacted' format.
Aug 13 01:45:01.894687 kernel: Freeing SMP alternatives memory: 32K
Aug 13 01:45:01.894692 kernel: pid_max: default: 32768 minimum: 301
Aug 13 01:45:01.894697 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Aug 13 01:45:01.894703 kernel: landlock: Up and running.
Aug 13 01:45:01.894709 kernel: SELinux: Initializing.
Aug 13 01:45:01.894714 kernel: Mount-cache hash table entries: 65536 (order: 7, 524288 bytes, linear)
Aug 13 01:45:01.894719 kernel: Mountpoint-cache hash table entries: 65536 (order: 7, 524288 bytes, linear)
Aug 13 01:45:01.894725 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Aug 13 01:45:01.894730 kernel: Performance Events: PEBS fmt3+, Skylake events, 32-deep LBR, full-width counters, Intel PMU driver.
Aug 13 01:45:01.894735 kernel: ... version:                4
Aug 13 01:45:01.894740 kernel: ... bit width:              48
Aug 13 01:45:01.894745 kernel: ... generic registers:      4
Aug 13 01:45:01.894751 kernel: ... value mask:             0000ffffffffffff
Aug 13 01:45:01.894757 kernel: ... max period:             00007fffffffffff
Aug 13 01:45:01.894762 kernel: ... fixed-purpose events:   3
Aug 13 01:45:01.894767 kernel: ... event mask:             000000070000000f
Aug 13 01:45:01.894773 kernel: signal: max sigframe size: 2032
Aug 13 01:45:01.894778 kernel: Estimated ratio of average max frequency by base frequency (times 1024): 1445
Aug 13 01:45:01.894783 kernel: rcu: Hierarchical SRCU implementation.
Aug 13 01:45:01.894788 kernel: rcu: Max phase no-delay instances is 400.
Aug 13 01:45:01.894794 kernel: Timer migration: 2 hierarchy levels; 8 children per group; 2 crossnode level
Aug 13 01:45:01.894799 kernel: NMI watchdog: Enabled. Permanently consumes one hw-PMU counter.
Aug 13 01:45:01.894805 kernel: smp: Bringing up secondary CPUs ...
Aug 13 01:45:01.894810 kernel: smpboot: x86: Booting SMP configuration:
Aug 13 01:45:01.894816 kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7 #8 #9 #10 #11 #12 #13 #14 #15
Aug 13 01:45:01.894821 kernel: Transient Scheduler Attacks: TAA CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/tsx_async_abort.html for more details.
Aug 13 01:45:01.894827 kernel: Transient Scheduler Attacks: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details.
Aug 13 01:45:01.894832 kernel: smp: Brought up 1 node, 16 CPUs
Aug 13 01:45:01.894838 kernel: smpboot: Total of 16 processors activated (108796.99 BogoMIPS)
Aug 13 01:45:01.894843 kernel: Memory: 32652116K/33411996K available (14336K kernel code, 2430K rwdata, 9960K rodata, 54444K init, 2524K bss, 732524K reserved, 0K cma-reserved)
Aug 13 01:45:01.894849 kernel: devtmpfs: initialized
Aug 13 01:45:01.894855 kernel: x86/mm: Memory block size: 128MB
Aug 13 01:45:01.894860 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x6dfb4000-0x6dfb4fff] (4096 bytes)
Aug 13 01:45:01.894867 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x79233000-0x79664fff] (4399104 bytes)
Aug 13 01:45:01.894873 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Aug 13 01:45:01.894878 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear)
Aug 13 01:45:01.894902 kernel: pinctrl core: initialized pinctrl subsystem
Aug 13 01:45:01.894907 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Aug 13 01:45:01.894927 kernel: audit: initializing netlink subsys (disabled)
Aug 13 01:45:01.894933 kernel: audit: type=2000 audit(1755049494.167:1): state=initialized audit_enabled=0 res=1
Aug 13 01:45:01.894938 kernel: thermal_sys: Registered thermal governor 'step_wise'
Aug 13 01:45:01.894944 kernel: thermal_sys: Registered thermal governor 'user_space'
Aug 13 01:45:01.894949 kernel: cpuidle: using governor menu
Aug 13 01:45:01.894954 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Aug 13 01:45:01.894959 kernel: dca service started, version 1.12.1
Aug 13 01:45:01.894964 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff]
Aug 13 01:45:01.894970 kernel: PCI: Using configuration type 1 for base access
Aug 13 01:45:01.894975 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Aug 13 01:45:01.894981 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Aug 13 01:45:01.894987 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Aug 13 01:45:01.894992 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Aug 13 01:45:01.894997 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Aug 13 01:45:01.895002 kernel: ACPI: Added _OSI(Module Device)
Aug 13 01:45:01.895008 kernel: ACPI: Added _OSI(Processor Device)
Aug 13 01:45:01.895013 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Aug 13 01:45:01.895018 kernel: ACPI: 12 ACPI AML tables successfully acquired and loaded
Aug 13 01:45:01.895023 kernel: ACPI: Dynamic OEM Table Load:
Aug 13 01:45:01.895030 kernel: ACPI: SSDT 0xFFFF9F7C82307400 000400 (v02 PmRef Cpu0Cst 00003001 INTL 20160527)
Aug 13 01:45:01.895035 kernel: ACPI: Dynamic OEM Table Load:
Aug 13 01:45:01.895040 kernel: ACPI: SSDT 0xFFFF9F7C822FA800 000683 (v02 PmRef Cpu0Ist 00003000 INTL 20160527)
Aug 13 01:45:01.895045 kernel: ACPI: Dynamic OEM Table Load:
Aug 13 01:45:01.895050 kernel: ACPI: SSDT 0xFFFF9F7C8182B400 0000F4 (v02 PmRef Cpu0Psd 00003000 INTL 20160527)
Aug 13 01:45:01.895056 kernel: ACPI: Dynamic OEM Table Load:
Aug 13 01:45:01.895061 kernel: ACPI: SSDT 0xFFFF9F7C81522800 0005FC (v02 PmRef ApIst 00003000 INTL 20160527)
Aug 13 01:45:01.895066 kernel: ACPI: Dynamic OEM Table Load:
Aug 13 01:45:01.895071 kernel: ACPI: SSDT 0xFFFF9F7C8152A000 000AB0 (v02 PmRef ApPsd 00003000 INTL 20160527)
Aug 13 01:45:01.895076 kernel: ACPI: Dynamic OEM Table Load:
Aug 13 01:45:01.895082 kernel: ACPI: SSDT 0xFFFF9F7C81D70000 00030A (v02 PmRef ApCst 00003000 INTL 20160527)
Aug 13 01:45:01.895088 kernel: ACPI: Interpreter enabled
Aug 13 01:45:01.895093 kernel: ACPI: PM: (supports S0 S5)
Aug 13 01:45:01.895098 kernel: ACPI: Using IOAPIC for interrupt routing
Aug 13 01:45:01.895103 kernel: HEST: Enabling Firmware First mode for corrected errors.
Aug 13 01:45:01.895109 kernel: mce: [Firmware Bug]: Ignoring request to disable invalid MCA bank 14.
Aug 13 01:45:01.895114 kernel: HEST: Table parsing has been initialized.
Aug 13 01:45:01.895119 kernel: GHES: APEI firmware first mode is enabled by APEI bit and WHEA _OSC.
Aug 13 01:45:01.895124 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Aug 13 01:45:01.895130 kernel: PCI: Using E820 reservations for host bridge windows
Aug 13 01:45:01.895136 kernel: ACPI: Enabled 9 GPEs in block 00 to 7F
Aug 13 01:45:01.895141 kernel: ACPI: \_SB_.PCI0.XDCI.USBC: New power resource
Aug 13 01:45:01.895147 kernel: ACPI: \_SB_.PCI0.SAT0.VOL0.V0PR: New power resource
Aug 13 01:45:01.895152 kernel: ACPI: \_SB_.PCI0.SAT0.VOL1.V1PR: New power resource
Aug 13 01:45:01.895157 kernel: ACPI: \_SB_.PCI0.SAT0.VOL2.V2PR: New power resource
Aug 13 01:45:01.895162 kernel: ACPI: \_SB_.PCI0.CNVW.WRST: New power resource
Aug 13 01:45:01.895167 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored
Aug 13 01:45:01.895173 kernel: ACPI: \_TZ_.FN00: New power resource
Aug 13 01:45:01.895179 kernel: ACPI: \_TZ_.FN01: New power resource
Aug 13 01:45:01.895184 kernel: ACPI: \_TZ_.FN02: New power resource
Aug 13 01:45:01.895189 kernel: ACPI: \_TZ_.FN03: New power resource
Aug 13 01:45:01.895195 kernel: ACPI: \_TZ_.FN04: New power resource
Aug 13 01:45:01.895200 kernel: ACPI: \PIN_: New power resource
Aug 13 01:45:01.895205 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-fe])
Aug 13 01:45:01.895291 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Aug 13 01:45:01.895349 kernel: acpi PNP0A08:00: _OSC: platform does not support [AER]
Aug 13 01:45:01.895404 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability LTR]
Aug 13 01:45:01.895412 kernel: PCI host bridge to bus 0000:00
Aug 13 01:45:01.895467 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Aug 13 01:45:01.895517 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Aug 13 01:45:01.895565 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Aug 13 01:45:01.895613 kernel: pci_bus 0000:00: root bus resource [mem 0x7f800000-0xdfffffff window]
Aug 13 01:45:01.895660 kernel: pci_bus 0000:00: root bus resource [mem 0xfc800000-0xfe7fffff window]
Aug 13 01:45:01.895709 kernel: pci_bus 0000:00: root bus resource [bus 00-fe]
Aug 13 01:45:01.895772 kernel: pci 0000:00:00.0: [8086:3e31] type 00 class 0x060000 conventional PCI endpoint
Aug 13 01:45:01.895838 kernel: pci 0000:00:01.0: [8086:1901] type 01 class 0x060400 PCIe Root Port
Aug 13 01:45:01.895937 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Aug 13 01:45:01.895992 kernel: pci 0000:00:01.0: PME# supported from D0 D3hot D3cold
Aug 13 01:45:01.896050 kernel: pci 0000:00:01.1: [8086:1905] type 01 class 0x060400 PCIe Root Port
Aug 13 01:45:01.896107 kernel: pci 0000:00:01.1: PCI bridge to [bus 02]
Aug 13 01:45:01.896162 kernel: pci 0000:00:01.1: bridge window [mem 0x96100000-0x962fffff]
Aug 13 01:45:01.896215 kernel: pci 0000:00:01.1: bridge window [mem 0x90000000-0x93ffffff 64bit pref]
Aug 13 01:45:01.896269 kernel: pci 0000:00:01.1: PME# supported from D0 D3hot D3cold
Aug 13 01:45:01.896326 kernel: pci 0000:00:02.0: [8086:3e9a] type 00 class 0x038000 PCIe Root Complex Integrated Endpoint
Aug 13 01:45:01.896380 kernel: pci 0000:00:02.0: BAR 0 [mem 0x94000000-0x94ffffff 64bit]
Aug 13 01:45:01.896443 kernel: pci 0000:00:02.0: BAR 2 [mem 0x80000000-0x8fffffff 64bit pref]
Aug 13 01:45:01.896496 kernel: pci 0000:00:02.0: BAR 4 [io 0x6000-0x603f]
Aug 13 01:45:01.896553 kernel: pci 0000:00:08.0: [8086:1911] type 00 class 0x088000 conventional PCI endpoint
Aug 13 01:45:01.896606 kernel: pci 0000:00:08.0: BAR 0 [mem 0x9651f000-0x9651ffff 64bit]
Aug 13 01:45:01.896664 kernel: pci 0000:00:12.0: [8086:a379] type 00 class 0x118000 conventional PCI endpoint
Aug 13 01:45:01.896717 kernel: pci 0000:00:12.0: BAR 0 [mem 0x9651e000-0x9651efff 64bit]
Aug 13 01:45:01.896775 kernel: pci 0000:00:14.0: [8086:a36d] type 00 class 0x0c0330 conventional PCI endpoint
Aug 13 01:45:01.896830 kernel: pci 0000:00:14.0: BAR 0 [mem 0x96500000-0x9650ffff 64bit]
Aug 13 01:45:01.896906 kernel: pci 0000:00:14.0: PME# supported from D3hot D3cold
Aug 13 01:45:01.896981 kernel: pci 0000:00:14.2: [8086:a36f] type 00 class 0x050000 conventional PCI endpoint
Aug 13 01:45:01.897035 kernel: pci 0000:00:14.2: BAR 0 [mem 0x96512000-0x96513fff 64bit]
Aug 13 01:45:01.897087 kernel: pci 0000:00:14.2: BAR 2 [mem 0x9651d000-0x9651dfff 64bit]
Aug 13 01:45:01.897144 kernel: pci 0000:00:15.0: [8086:a368] type 00 class 0x0c8000 conventional PCI endpoint
Aug 13 01:45:01.897200 kernel: pci 0000:00:15.0: BAR 0 [mem 0x00000000-0x00000fff 64bit]
Aug 13 01:45:01.897261 kernel: pci 0000:00:15.1: [8086:a369] type 00 class 0x0c8000 conventional PCI endpoint
Aug 13 01:45:01.897315 kernel: pci 0000:00:15.1: BAR 0 [mem 0x00000000-0x00000fff 64bit]
Aug 13 01:45:01.897372 kernel: pci 0000:00:16.0: [8086:a360] type 00 class 0x078000 conventional PCI endpoint
Aug 13 01:45:01.897427 kernel: pci 0000:00:16.0: BAR 0 [mem 0x9651a000-0x9651afff 64bit]
Aug 13 01:45:01.897480 kernel: pci 0000:00:16.0: PME# supported from D3hot
Aug 13 01:45:01.897537 kernel: pci 0000:00:16.1: [8086:a361] type 00 class 0x078000 conventional PCI endpoint
Aug 13 01:45:01.897592 kernel: pci 0000:00:16.1: BAR 0 [mem 0x96519000-0x96519fff 64bit]
Aug 13 01:45:01.897645 kernel: pci 0000:00:16.1: PME# supported from D3hot
Aug 13 01:45:01.897702 kernel: pci 0000:00:16.4: [8086:a364] type 00 class 0x078000 conventional PCI endpoint
Aug 13 01:45:01.897755 kernel: pci 0000:00:16.4: BAR 0 [mem 0x96518000-0x96518fff 64bit]
Aug 13 01:45:01.897810 kernel: pci 0000:00:16.4: PME# supported from D3hot
Aug 13 01:45:01.897869 kernel: pci 0000:00:17.0: [8086:a352] type 00 class 0x010601 conventional PCI endpoint
Aug 13 01:45:01.897960 kernel: pci 0000:00:17.0: BAR 0 [mem 0x96510000-0x96511fff]
Aug 13 01:45:01.898014 kernel: pci 0000:00:17.0: BAR 1 [mem 0x96517000-0x965170ff]
Aug 13 01:45:01.898066 kernel: pci 0000:00:17.0: BAR 2 [io 0x6090-0x6097]
Aug 13 01:45:01.898119 kernel: pci 0000:00:17.0: BAR 3 [io 0x6080-0x6083]
Aug 13 01:45:01.898174 kernel: pci 0000:00:17.0: BAR 4 [io 0x6060-0x607f]
Aug 13 01:45:01.898228 kernel: pci 0000:00:17.0: BAR 5 [mem 0x96516000-0x965167ff]
Aug 13 01:45:01.898281 kernel: pci 0000:00:17.0: PME# supported from D3hot
Aug 13 01:45:01.898341 kernel: pci 0000:00:1b.0: [8086:a340] type 01 class 0x060400 PCIe Root Port
Aug 13 01:45:01.898396 kernel: pci 0000:00:1b.0: PCI bridge to [bus 03]
Aug 13 01:45:01.898449 kernel: pci 0000:00:1b.0: PME# supported from D0 D3hot D3cold
Aug 13 01:45:01.898507 kernel: pci 0000:00:1b.4: [8086:a32c] type 01 class 0x060400 PCIe Root Port
Aug 13 01:45:01.898564 kernel: pci 0000:00:1b.4: PCI bridge to [bus 04]
Aug 13 01:45:01.898618 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff]
Aug 13 01:45:01.898672 kernel: pci 0000:00:1b.4: bridge window [mem 0x96400000-0x964fffff]
Aug 13 01:45:01.898725 kernel: pci 0000:00:1b.4: PME# supported from D0 D3hot D3cold
Aug 13 01:45:01.898783 kernel: pci 0000:00:1b.5: [8086:a32d] type 01 class 0x060400 PCIe Root Port
Aug 13 01:45:01.898838 kernel: pci 0000:00:1b.5: PCI bridge to [bus 05]
Aug 13 01:45:01.898929
kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff] Aug 13 01:45:01.898986 kernel: pci 0000:00:1b.5: bridge window [mem 0x96300000-0x963fffff] Aug 13 01:45:01.899040 kernel: pci 0000:00:1b.5: PME# supported from D0 D3hot D3cold Aug 13 01:45:01.899097 kernel: pci 0000:00:1c.0: [8086:a338] type 01 class 0x060400 PCIe Root Port Aug 13 01:45:01.899152 kernel: pci 0000:00:1c.0: PCI bridge to [bus 06] Aug 13 01:45:01.899206 kernel: pci 0000:00:1c.0: PME# supported from D0 D3hot D3cold Aug 13 01:45:01.899264 kernel: pci 0000:00:1c.1: [8086:a339] type 01 class 0x060400 PCIe Root Port Aug 13 01:45:01.899319 kernel: pci 0000:00:1c.1: PCI bridge to [bus 07-08] Aug 13 01:45:01.899375 kernel: pci 0000:00:1c.1: bridge window [io 0x3000-0x3fff] Aug 13 01:45:01.899429 kernel: pci 0000:00:1c.1: bridge window [mem 0x95000000-0x960fffff] Aug 13 01:45:01.899483 kernel: pci 0000:00:1c.1: PME# supported from D0 D3hot D3cold Aug 13 01:45:01.899540 kernel: pci 0000:00:1e.0: [8086:a328] type 00 class 0x078000 conventional PCI endpoint Aug 13 01:45:01.899594 kernel: pci 0000:00:1e.0: BAR 0 [mem 0x00000000-0x00000fff 64bit] Aug 13 01:45:01.899655 kernel: pci 0000:00:1f.0: [8086:a309] type 00 class 0x060100 conventional PCI endpoint Aug 13 01:45:01.899712 kernel: pci 0000:00:1f.4: [8086:a323] type 00 class 0x0c0500 conventional PCI endpoint Aug 13 01:45:01.899768 kernel: pci 0000:00:1f.4: BAR 0 [mem 0x96514000-0x965140ff 64bit] Aug 13 01:45:01.899822 kernel: pci 0000:00:1f.4: BAR 4 [io 0xefa0-0xefbf] Aug 13 01:45:01.899914 kernel: pci 0000:00:1f.5: [8086:a324] type 00 class 0x0c8000 conventional PCI endpoint Aug 13 01:45:01.899983 kernel: pci 0000:00:1f.5: BAR 0 [mem 0xfe010000-0xfe010fff] Aug 13 01:45:01.900037 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Aug 13 01:45:01.900099 kernel: pci 0000:02:00.0: [15b3:1015] type 00 class 0x020000 PCIe Endpoint Aug 13 01:45:01.900158 kernel: pci 0000:02:00.0: BAR 0 [mem 0x92000000-0x93ffffff 64bit pref] Aug 13 01:45:01.900214 kernel: pci 
0000:02:00.0: ROM [mem 0x96200000-0x962fffff pref] Aug 13 01:45:01.900269 kernel: pci 0000:02:00.0: PME# supported from D3cold Aug 13 01:45:01.900324 kernel: pci 0000:02:00.0: VF BAR 0 [mem 0x00000000-0x000fffff 64bit pref] Aug 13 01:45:01.900380 kernel: pci 0000:02:00.0: VF BAR 0 [mem 0x00000000-0x007fffff 64bit pref]: contains BAR 0 for 8 VFs Aug 13 01:45:01.900439 kernel: pci 0000:02:00.1: [15b3:1015] type 00 class 0x020000 PCIe Endpoint Aug 13 01:45:01.900495 kernel: pci 0000:02:00.1: BAR 0 [mem 0x90000000-0x91ffffff 64bit pref] Aug 13 01:45:01.900554 kernel: pci 0000:02:00.1: ROM [mem 0x96100000-0x961fffff pref] Aug 13 01:45:01.900609 kernel: pci 0000:02:00.1: PME# supported from D3cold Aug 13 01:45:01.900664 kernel: pci 0000:02:00.1: VF BAR 0 [mem 0x00000000-0x000fffff 64bit pref] Aug 13 01:45:01.900750 kernel: pci 0000:02:00.1: VF BAR 0 [mem 0x00000000-0x007fffff 64bit pref]: contains BAR 0 for 8 VFs Aug 13 01:45:01.900805 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Aug 13 01:45:01.900860 kernel: pci 0000:00:1b.0: PCI bridge to [bus 03] Aug 13 01:45:01.900953 kernel: pci 0000:04:00.0: working around ROM BAR overlap defect Aug 13 01:45:01.901012 kernel: pci 0000:04:00.0: [8086:1533] type 00 class 0x020000 PCIe Endpoint Aug 13 01:45:01.901068 kernel: pci 0000:04:00.0: BAR 0 [mem 0x96400000-0x9647ffff] Aug 13 01:45:01.901123 kernel: pci 0000:04:00.0: BAR 2 [io 0x5000-0x501f] Aug 13 01:45:01.901178 kernel: pci 0000:04:00.0: BAR 3 [mem 0x96480000-0x96483fff] Aug 13 01:45:01.901233 kernel: pci 0000:04:00.0: PME# supported from D0 D3hot D3cold Aug 13 01:45:01.901288 kernel: pci 0000:00:1b.4: PCI bridge to [bus 04] Aug 13 01:45:01.901347 kernel: pci 0000:05:00.0: working around ROM BAR overlap defect Aug 13 01:45:01.901405 kernel: pci 0000:05:00.0: [8086:1533] type 00 class 0x020000 PCIe Endpoint Aug 13 01:45:01.901461 kernel: pci 0000:05:00.0: BAR 0 [mem 0x96300000-0x9637ffff] Aug 13 01:45:01.901517 kernel: pci 0000:05:00.0: BAR 2 [io 0x4000-0x401f] Aug 13 
01:45:01.901572 kernel: pci 0000:05:00.0: BAR 3 [mem 0x96380000-0x96383fff] Aug 13 01:45:01.901626 kernel: pci 0000:05:00.0: PME# supported from D0 D3hot D3cold Aug 13 01:45:01.901680 kernel: pci 0000:00:1b.5: PCI bridge to [bus 05] Aug 13 01:45:01.901734 kernel: pci 0000:00:1c.0: PCI bridge to [bus 06] Aug 13 01:45:01.901797 kernel: pci 0000:07:00.0: [1a03:1150] type 01 class 0x060400 PCIe to PCI/PCI-X bridge Aug 13 01:45:01.901855 kernel: pci 0000:07:00.0: PCI bridge to [bus 08] Aug 13 01:45:01.901941 kernel: pci 0000:07:00.0: bridge window [io 0x3000-0x3fff] Aug 13 01:45:01.902012 kernel: pci 0000:07:00.0: bridge window [mem 0x95000000-0x960fffff] Aug 13 01:45:01.902067 kernel: pci 0000:07:00.0: enabling Extended Tags Aug 13 01:45:01.902122 kernel: pci 0000:07:00.0: supports D1 D2 Aug 13 01:45:01.902179 kernel: pci 0000:07:00.0: PME# supported from D0 D1 D2 D3hot D3cold Aug 13 01:45:01.902233 kernel: pci 0000:00:1c.1: PCI bridge to [bus 07-08] Aug 13 01:45:01.902294 kernel: pci_bus 0000:08: extended config space not accessible Aug 13 01:45:01.902359 kernel: pci 0000:08:00.0: [1a03:2000] type 00 class 0x030000 conventional PCI endpoint Aug 13 01:45:01.902418 kernel: pci 0000:08:00.0: BAR 0 [mem 0x95000000-0x95ffffff] Aug 13 01:45:01.902476 kernel: pci 0000:08:00.0: BAR 1 [mem 0x96000000-0x9601ffff] Aug 13 01:45:01.902533 kernel: pci 0000:08:00.0: BAR 2 [io 0x3000-0x307f] Aug 13 01:45:01.902590 kernel: pci 0000:08:00.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Aug 13 01:45:01.902647 kernel: pci 0000:08:00.0: supports D1 D2 Aug 13 01:45:01.902707 kernel: pci 0000:08:00.0: PME# supported from D0 D1 D2 D3hot D3cold Aug 13 01:45:01.902763 kernel: pci 0000:07:00.0: PCI bridge to [bus 08] Aug 13 01:45:01.902771 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 0 Aug 13 01:45:01.902777 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 1 Aug 13 01:45:01.902784 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 0 Aug 13 
01:45:01.902790 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 0 Aug 13 01:45:01.902795 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 0 Aug 13 01:45:01.902801 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 0 Aug 13 01:45:01.902806 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 0 Aug 13 01:45:01.902812 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 0 Aug 13 01:45:01.902817 kernel: iommu: Default domain type: Translated Aug 13 01:45:01.902823 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Aug 13 01:45:01.902829 kernel: PCI: Using ACPI for IRQ routing Aug 13 01:45:01.902835 kernel: PCI: pci_cache_line_size set to 64 bytes Aug 13 01:45:01.902841 kernel: e820: reserve RAM buffer [mem 0x00099800-0x0009ffff] Aug 13 01:45:01.902846 kernel: e820: reserve RAM buffer [mem 0x6dfb4000-0x6fffffff] Aug 13 01:45:01.902851 kernel: e820: reserve RAM buffer [mem 0x77fc7000-0x77ffffff] Aug 13 01:45:01.902857 kernel: e820: reserve RAM buffer [mem 0x79233000-0x7bffffff] Aug 13 01:45:01.902865 kernel: e820: reserve RAM buffer [mem 0x7bf00000-0x7bffffff] Aug 13 01:45:01.902871 kernel: e820: reserve RAM buffer [mem 0x87f800000-0x87fffffff] Aug 13 01:45:01.902928 kernel: pci 0000:08:00.0: vgaarb: setting as boot VGA device Aug 13 01:45:01.902987 kernel: pci 0000:08:00.0: vgaarb: bridge control possible Aug 13 01:45:01.903046 kernel: pci 0000:08:00.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Aug 13 01:45:01.903055 kernel: vgaarb: loaded Aug 13 01:45:01.903060 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0 Aug 13 01:45:01.903066 kernel: hpet0: 8 comparators, 64-bit 24.000000 MHz counter Aug 13 01:45:01.903072 kernel: clocksource: Switched to clocksource tsc-early Aug 13 01:45:01.903077 kernel: VFS: Disk quotas dquot_6.6.0 Aug 13 01:45:01.903083 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Aug 13 01:45:01.903089 kernel: pnp: PnP ACPI init Aug 13 01:45:01.903144 
kernel: system 00:00: [mem 0x40000000-0x403fffff] has been reserved Aug 13 01:45:01.903201 kernel: pnp 00:02: [dma 0 disabled] Aug 13 01:45:01.903256 kernel: pnp 00:03: [dma 0 disabled] Aug 13 01:45:01.903308 kernel: system 00:04: [io 0x0680-0x069f] has been reserved Aug 13 01:45:01.903359 kernel: system 00:04: [io 0x164e-0x164f] has been reserved Aug 13 01:45:01.903411 kernel: system 00:05: [mem 0xfed10000-0xfed17fff] has been reserved Aug 13 01:45:01.903461 kernel: system 00:05: [mem 0xfed18000-0xfed18fff] has been reserved Aug 13 01:45:01.903512 kernel: system 00:05: [mem 0xfed19000-0xfed19fff] has been reserved Aug 13 01:45:01.903561 kernel: system 00:05: [mem 0xe0000000-0xefffffff] has been reserved Aug 13 01:45:01.903610 kernel: system 00:05: [mem 0xfed20000-0xfed3ffff] has been reserved Aug 13 01:45:01.903658 kernel: system 00:05: [mem 0xfed90000-0xfed93fff] could not be reserved Aug 13 01:45:01.903707 kernel: system 00:05: [mem 0xfed45000-0xfed8ffff] has been reserved Aug 13 01:45:01.903756 kernel: system 00:05: [mem 0xfee00000-0xfeefffff] could not be reserved Aug 13 01:45:01.903807 kernel: system 00:06: [io 0x1800-0x18fe] could not be reserved Aug 13 01:45:01.903859 kernel: system 00:06: [mem 0xfd000000-0xfd69ffff] has been reserved Aug 13 01:45:01.903951 kernel: system 00:06: [mem 0xfd6c0000-0xfd6cffff] has been reserved Aug 13 01:45:01.904000 kernel: system 00:06: [mem 0xfd6f0000-0xfdffffff] has been reserved Aug 13 01:45:01.904049 kernel: system 00:06: [mem 0xfe000000-0xfe01ffff] could not be reserved Aug 13 01:45:01.904099 kernel: system 00:06: [mem 0xfe200000-0xfe7fffff] has been reserved Aug 13 01:45:01.904148 kernel: system 00:06: [mem 0xff000000-0xffffffff] has been reserved Aug 13 01:45:01.904201 kernel: system 00:07: [io 0x2000-0x20fe] has been reserved Aug 13 01:45:01.904211 kernel: pnp: PnP ACPI: found 9 devices Aug 13 01:45:01.904217 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Aug 13 
01:45:01.904222 kernel: NET: Registered PF_INET protocol family Aug 13 01:45:01.904228 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Aug 13 01:45:01.904234 kernel: tcp_listen_portaddr_hash hash table entries: 16384 (order: 6, 262144 bytes, linear) Aug 13 01:45:01.904240 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Aug 13 01:45:01.904245 kernel: TCP established hash table entries: 262144 (order: 9, 2097152 bytes, linear) Aug 13 01:45:01.904251 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Aug 13 01:45:01.904258 kernel: TCP: Hash tables configured (established 262144 bind 65536) Aug 13 01:45:01.904263 kernel: UDP hash table entries: 16384 (order: 7, 524288 bytes, linear) Aug 13 01:45:01.904269 kernel: UDP-Lite hash table entries: 16384 (order: 7, 524288 bytes, linear) Aug 13 01:45:01.904274 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Aug 13 01:45:01.904280 kernel: NET: Registered PF_XDP protocol family Aug 13 01:45:01.904334 kernel: pci 0000:00:15.0: BAR 0 [mem 0x7f800000-0x7f800fff 64bit]: assigned Aug 13 01:45:01.904388 kernel: pci 0000:00:15.1: BAR 0 [mem 0x7f801000-0x7f801fff 64bit]: assigned Aug 13 01:45:01.904443 kernel: pci 0000:00:1e.0: BAR 0 [mem 0x7f802000-0x7f802fff 64bit]: assigned Aug 13 01:45:01.904499 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Aug 13 01:45:01.904556 kernel: pci 0000:02:00.0: VF BAR 0 [mem size 0x00800000 64bit pref]: can't assign; no space Aug 13 01:45:01.904612 kernel: pci 0000:02:00.0: VF BAR 0 [mem size 0x00800000 64bit pref]: failed to assign Aug 13 01:45:01.904667 kernel: pci 0000:02:00.1: VF BAR 0 [mem size 0x00800000 64bit pref]: can't assign; no space Aug 13 01:45:01.904723 kernel: pci 0000:02:00.1: VF BAR 0 [mem size 0x00800000 64bit pref]: failed to assign Aug 13 01:45:01.904778 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Aug 13 01:45:01.904831 kernel: pci 0000:00:01.1: bridge window [mem 
0x96100000-0x962fffff] Aug 13 01:45:01.904888 kernel: pci 0000:00:01.1: bridge window [mem 0x90000000-0x93ffffff 64bit pref] Aug 13 01:45:01.904943 kernel: pci 0000:00:1b.0: PCI bridge to [bus 03] Aug 13 01:45:01.904999 kernel: pci 0000:00:1b.4: PCI bridge to [bus 04] Aug 13 01:45:01.905053 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff] Aug 13 01:45:01.905106 kernel: pci 0000:00:1b.4: bridge window [mem 0x96400000-0x964fffff] Aug 13 01:45:01.905160 kernel: pci 0000:00:1b.5: PCI bridge to [bus 05] Aug 13 01:45:01.905213 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff] Aug 13 01:45:01.905267 kernel: pci 0000:00:1b.5: bridge window [mem 0x96300000-0x963fffff] Aug 13 01:45:01.905320 kernel: pci 0000:00:1c.0: PCI bridge to [bus 06] Aug 13 01:45:01.905375 kernel: pci 0000:07:00.0: PCI bridge to [bus 08] Aug 13 01:45:01.905430 kernel: pci 0000:07:00.0: bridge window [io 0x3000-0x3fff] Aug 13 01:45:01.905485 kernel: pci 0000:07:00.0: bridge window [mem 0x95000000-0x960fffff] Aug 13 01:45:01.905540 kernel: pci 0000:00:1c.1: PCI bridge to [bus 07-08] Aug 13 01:45:01.905594 kernel: pci 0000:00:1c.1: bridge window [io 0x3000-0x3fff] Aug 13 01:45:01.905647 kernel: pci 0000:00:1c.1: bridge window [mem 0x95000000-0x960fffff] Aug 13 01:45:01.905696 kernel: pci_bus 0000:00: Some PCI device resources are unassigned, try booting with pci=realloc Aug 13 01:45:01.905744 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Aug 13 01:45:01.905792 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Aug 13 01:45:01.905840 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Aug 13 01:45:01.905915 kernel: pci_bus 0000:00: resource 7 [mem 0x7f800000-0xdfffffff window] Aug 13 01:45:01.905976 kernel: pci_bus 0000:00: resource 8 [mem 0xfc800000-0xfe7fffff window] Aug 13 01:45:01.906034 kernel: pci_bus 0000:02: resource 1 [mem 0x96100000-0x962fffff] Aug 13 01:45:01.906085 kernel: pci_bus 0000:02: resource 2 [mem 0x90000000-0x93ffffff 
64bit pref] Aug 13 01:45:01.906143 kernel: pci_bus 0000:04: resource 0 [io 0x5000-0x5fff] Aug 13 01:45:01.906193 kernel: pci_bus 0000:04: resource 1 [mem 0x96400000-0x964fffff] Aug 13 01:45:01.906246 kernel: pci_bus 0000:05: resource 0 [io 0x4000-0x4fff] Aug 13 01:45:01.906296 kernel: pci_bus 0000:05: resource 1 [mem 0x96300000-0x963fffff] Aug 13 01:45:01.906352 kernel: pci_bus 0000:07: resource 0 [io 0x3000-0x3fff] Aug 13 01:45:01.906401 kernel: pci_bus 0000:07: resource 1 [mem 0x95000000-0x960fffff] Aug 13 01:45:01.906454 kernel: pci_bus 0000:08: resource 0 [io 0x3000-0x3fff] Aug 13 01:45:01.906505 kernel: pci_bus 0000:08: resource 1 [mem 0x95000000-0x960fffff] Aug 13 01:45:01.906513 kernel: PCI: CLS 64 bytes, default 64 Aug 13 01:45:01.906519 kernel: DMAR: No ATSR found Aug 13 01:45:01.906525 kernel: DMAR: No SATC found Aug 13 01:45:01.906530 kernel: DMAR: IOMMU feature fl1gp_support inconsistent Aug 13 01:45:01.906538 kernel: DMAR: IOMMU feature pgsel_inv inconsistent Aug 13 01:45:01.906543 kernel: DMAR: IOMMU feature nwfs inconsistent Aug 13 01:45:01.906549 kernel: DMAR: IOMMU feature pasid inconsistent Aug 13 01:45:01.906554 kernel: DMAR: IOMMU feature eafs inconsistent Aug 13 01:45:01.906560 kernel: DMAR: IOMMU feature prs inconsistent Aug 13 01:45:01.906566 kernel: DMAR: IOMMU feature nest inconsistent Aug 13 01:45:01.906571 kernel: DMAR: IOMMU feature mts inconsistent Aug 13 01:45:01.906577 kernel: DMAR: IOMMU feature sc_support inconsistent Aug 13 01:45:01.906583 kernel: DMAR: IOMMU feature dev_iotlb_support inconsistent Aug 13 01:45:01.906589 kernel: DMAR: dmar0: Using Queued invalidation Aug 13 01:45:01.906595 kernel: DMAR: dmar1: Using Queued invalidation Aug 13 01:45:01.906648 kernel: pci 0000:00:02.0: Adding to iommu group 0 Aug 13 01:45:01.906704 kernel: pci 0000:00:00.0: Adding to iommu group 1 Aug 13 01:45:01.906758 kernel: pci 0000:00:01.0: Adding to iommu group 2 Aug 13 01:45:01.906812 kernel: pci 0000:00:01.1: Adding to iommu group 2 Aug 13 
01:45:01.906913 kernel: pci 0000:00:08.0: Adding to iommu group 3 Aug 13 01:45:01.906983 kernel: pci 0000:00:12.0: Adding to iommu group 4 Aug 13 01:45:01.907039 kernel: pci 0000:00:14.0: Adding to iommu group 5 Aug 13 01:45:01.907093 kernel: pci 0000:00:14.2: Adding to iommu group 5 Aug 13 01:45:01.907148 kernel: pci 0000:00:15.0: Adding to iommu group 6 Aug 13 01:45:01.907201 kernel: pci 0000:00:15.1: Adding to iommu group 6 Aug 13 01:45:01.907255 kernel: pci 0000:00:16.0: Adding to iommu group 7 Aug 13 01:45:01.907309 kernel: pci 0000:00:16.1: Adding to iommu group 7 Aug 13 01:45:01.907362 kernel: pci 0000:00:16.4: Adding to iommu group 7 Aug 13 01:45:01.907415 kernel: pci 0000:00:17.0: Adding to iommu group 8 Aug 13 01:45:01.907471 kernel: pci 0000:00:1b.0: Adding to iommu group 9 Aug 13 01:45:01.907525 kernel: pci 0000:00:1b.4: Adding to iommu group 10 Aug 13 01:45:01.907579 kernel: pci 0000:00:1b.5: Adding to iommu group 11 Aug 13 01:45:01.907634 kernel: pci 0000:00:1c.0: Adding to iommu group 12 Aug 13 01:45:01.907688 kernel: pci 0000:00:1c.1: Adding to iommu group 13 Aug 13 01:45:01.907742 kernel: pci 0000:00:1e.0: Adding to iommu group 14 Aug 13 01:45:01.907795 kernel: pci 0000:00:1f.0: Adding to iommu group 15 Aug 13 01:45:01.907849 kernel: pci 0000:00:1f.4: Adding to iommu group 15 Aug 13 01:45:01.907944 kernel: pci 0000:00:1f.5: Adding to iommu group 15 Aug 13 01:45:01.908000 kernel: pci 0000:02:00.0: Adding to iommu group 2 Aug 13 01:45:01.908055 kernel: pci 0000:02:00.1: Adding to iommu group 2 Aug 13 01:45:01.908111 kernel: pci 0000:04:00.0: Adding to iommu group 16 Aug 13 01:45:01.908167 kernel: pci 0000:05:00.0: Adding to iommu group 17 Aug 13 01:45:01.908222 kernel: pci 0000:07:00.0: Adding to iommu group 18 Aug 13 01:45:01.908279 kernel: pci 0000:08:00.0: Adding to iommu group 18 Aug 13 01:45:01.908287 kernel: DMAR: Intel(R) Virtualization Technology for Directed I/O Aug 13 01:45:01.908295 kernel: PCI-DMA: Using software bounce buffering for IO 
(SWIOTLB) Aug 13 01:45:01.908300 kernel: software IO TLB: mapped [mem 0x0000000073fc7000-0x0000000077fc7000] (64MB) Aug 13 01:45:01.908306 kernel: RAPL PMU: API unit is 2^-32 Joules, 4 fixed counters, 655360 ms ovfl timer Aug 13 01:45:01.908312 kernel: RAPL PMU: hw unit of domain pp0-core 2^-14 Joules Aug 13 01:45:01.908317 kernel: RAPL PMU: hw unit of domain package 2^-14 Joules Aug 13 01:45:01.908323 kernel: RAPL PMU: hw unit of domain dram 2^-14 Joules Aug 13 01:45:01.908328 kernel: RAPL PMU: hw unit of domain pp1-gpu 2^-14 Joules Aug 13 01:45:01.908385 kernel: platform rtc_cmos: registered platform RTC device (no PNP device found) Aug 13 01:45:01.908395 kernel: Initialise system trusted keyrings Aug 13 01:45:01.908400 kernel: workingset: timestamp_bits=39 max_order=23 bucket_order=0 Aug 13 01:45:01.908406 kernel: Key type asymmetric registered Aug 13 01:45:01.908411 kernel: Asymmetric key parser 'x509' registered Aug 13 01:45:01.908417 kernel: tsc: Refined TSC clocksource calibration: 3407.989 MHz Aug 13 01:45:01.908423 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fc9c9451, max_idle_ns: 440795361646 ns Aug 13 01:45:01.908428 kernel: clocksource: Switched to clocksource tsc Aug 13 01:45:01.908434 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Aug 13 01:45:01.908440 kernel: io scheduler mq-deadline registered Aug 13 01:45:01.908446 kernel: io scheduler kyber registered Aug 13 01:45:01.908452 kernel: io scheduler bfq registered Aug 13 01:45:01.908506 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 122 Aug 13 01:45:01.908561 kernel: pcieport 0000:00:01.1: PME: Signaling with IRQ 123 Aug 13 01:45:01.908616 kernel: pcieport 0000:00:1b.0: PME: Signaling with IRQ 124 Aug 13 01:45:01.908671 kernel: pcieport 0000:00:1b.4: PME: Signaling with IRQ 125 Aug 13 01:45:01.908725 kernel: pcieport 0000:00:1b.5: PME: Signaling with IRQ 126 Aug 13 01:45:01.908779 kernel: pcieport 0000:00:1c.0: PME: Signaling with IRQ 127 
Aug 13 01:45:01.908835 kernel: pcieport 0000:00:1c.1: PME: Signaling with IRQ 128 Aug 13 01:45:01.908939 kernel: thermal LNXTHERM:00: registered as thermal_zone0 Aug 13 01:45:01.908947 kernel: ACPI: thermal: Thermal Zone [TZ00] (28 C) Aug 13 01:45:01.908953 kernel: ERST: Error Record Serialization Table (ERST) support is initialized. Aug 13 01:45:01.908959 kernel: pstore: Using crash dump compression: deflate Aug 13 01:45:01.908964 kernel: pstore: Registered erst as persistent store backend Aug 13 01:45:01.908970 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Aug 13 01:45:01.908975 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Aug 13 01:45:01.908981 kernel: 00:02: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Aug 13 01:45:01.908988 kernel: 00:03: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A Aug 13 01:45:01.909045 kernel: tpm_tis MSFT0101:00: 2.0 TPM (device-id 0x1B, rev-id 16) Aug 13 01:45:01.909053 kernel: i8042: PNP: No PS/2 controller found. 
Aug 13 01:45:01.909103 kernel: rtc_cmos rtc_cmos: RTC can wake from S4 Aug 13 01:45:01.909154 kernel: rtc_cmos rtc_cmos: registered as rtc0 Aug 13 01:45:01.909204 kernel: rtc_cmos rtc_cmos: setting system clock to 2025-08-13T01:45:00 UTC (1755049500) Aug 13 01:45:01.909255 kernel: rtc_cmos rtc_cmos: alarms up to one month, y3k, 114 bytes nvram Aug 13 01:45:01.909265 kernel: intel_pstate: Intel P-state driver initializing Aug 13 01:45:01.909270 kernel: intel_pstate: Disabling energy efficiency optimization Aug 13 01:45:01.909276 kernel: intel_pstate: HWP enabled Aug 13 01:45:01.909282 kernel: NET: Registered PF_INET6 protocol family Aug 13 01:45:01.909287 kernel: Segment Routing with IPv6 Aug 13 01:45:01.909293 kernel: In-situ OAM (IOAM) with IPv6 Aug 13 01:45:01.909298 kernel: NET: Registered PF_PACKET protocol family Aug 13 01:45:01.909304 kernel: Key type dns_resolver registered Aug 13 01:45:01.909309 kernel: ENERGY_PERF_BIAS: Set to 'normal', was 'performance' Aug 13 01:45:01.909316 kernel: microcode: Current revision: 0x000000de Aug 13 01:45:01.909322 kernel: IPI shorthand broadcast: enabled Aug 13 01:45:01.909327 kernel: sched_clock: Marking stable (4905000693, 1504031428)->(6951462325, -542430204) Aug 13 01:45:01.909333 kernel: registered taskstats version 1 Aug 13 01:45:01.909338 kernel: Loading compiled-in X.509 certificates Aug 13 01:45:01.909344 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.40-flatcar: dee0b464d3f7f8d09744a2392f69dde258bc95c0' Aug 13 01:45:01.909349 kernel: Demotion targets for Node 0: null Aug 13 01:45:01.909355 kernel: Key type .fscrypt registered Aug 13 01:45:01.909361 kernel: Key type fscrypt-provisioning registered Aug 13 01:45:01.909367 kernel: ima: Allocated hash algorithm: sha1 Aug 13 01:45:01.909373 kernel: ima: No architecture policies found Aug 13 01:45:01.909378 kernel: clk: Disabling unused clocks Aug 13 01:45:01.909384 kernel: Warning: unable to open an initial console. 
Aug 13 01:45:01.909390 kernel: Freeing unused kernel image (initmem) memory: 54444K Aug 13 01:45:01.909395 kernel: Write protecting the kernel read-only data: 24576k Aug 13 01:45:01.909401 kernel: Freeing unused kernel image (rodata/data gap) memory: 280K Aug 13 01:45:01.909406 kernel: Run /init as init process Aug 13 01:45:01.909412 kernel: with arguments: Aug 13 01:45:01.909418 kernel: /init Aug 13 01:45:01.909424 kernel: with environment: Aug 13 01:45:01.909429 kernel: HOME=/ Aug 13 01:45:01.909434 kernel: TERM=linux Aug 13 01:45:01.909440 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Aug 13 01:45:01.909446 systemd[1]: Successfully made /usr/ read-only. Aug 13 01:45:01.909454 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Aug 13 01:45:01.909461 systemd[1]: Detected architecture x86-64. Aug 13 01:45:01.909467 systemd[1]: Running in initrd. Aug 13 01:45:01.909472 systemd[1]: No hostname configured, using default hostname. Aug 13 01:45:01.909478 systemd[1]: Hostname set to . Aug 13 01:45:01.909484 systemd[1]: Initializing machine ID from random generator. Aug 13 01:45:01.909490 systemd[1]: Queued start job for default target initrd.target. Aug 13 01:45:01.909496 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Aug 13 01:45:01.909501 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Aug 13 01:45:01.909508 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Aug 13 01:45:01.909515 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... 
Aug 13 01:45:01.909521 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Aug 13 01:45:01.909527 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Aug 13 01:45:01.909534 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Aug 13 01:45:01.909540 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Aug 13 01:45:01.909545 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Aug 13 01:45:01.909552 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Aug 13 01:45:01.909558 systemd[1]: Reached target paths.target - Path Units. Aug 13 01:45:01.909564 systemd[1]: Reached target slices.target - Slice Units. Aug 13 01:45:01.909570 systemd[1]: Reached target swap.target - Swaps. Aug 13 01:45:01.909576 systemd[1]: Reached target timers.target - Timer Units. Aug 13 01:45:01.909582 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Aug 13 01:45:01.909587 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Aug 13 01:45:01.909593 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Aug 13 01:45:01.909599 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Aug 13 01:45:01.909606 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Aug 13 01:45:01.909612 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Aug 13 01:45:01.909617 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Aug 13 01:45:01.909623 systemd[1]: Reached target sockets.target - Socket Units. Aug 13 01:45:01.909629 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... 
Aug 13 01:45:01.909635 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Aug 13 01:45:01.909641 systemd[1]: Finished network-cleanup.service - Network Cleanup. Aug 13 01:45:01.909647 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Aug 13 01:45:01.909653 systemd[1]: Starting systemd-fsck-usr.service... Aug 13 01:45:01.909659 systemd[1]: Starting systemd-journald.service - Journal Service... Aug 13 01:45:01.909676 systemd-journald[297]: Collecting audit messages is disabled. Aug 13 01:45:01.909691 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Aug 13 01:45:01.909699 systemd-journald[297]: Journal started Aug 13 01:45:01.909712 systemd-journald[297]: Runtime Journal (/run/log/journal/f0e343c2f6e44856a97b84585af76bba) is 8M, max 639.3M, 631.3M free. Aug 13 01:45:01.920869 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Aug 13 01:45:01.929869 systemd[1]: Started systemd-journald.service - Journal Service. Aug 13 01:45:01.930086 systemd-modules-load[299]: Inserted module 'overlay' Aug 13 01:45:01.969024 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Aug 13 01:45:01.969086 kernel: Bridge firewalling registered Aug 13 01:45:01.962976 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Aug 13 01:45:01.967931 systemd-modules-load[299]: Inserted module 'br_netfilter' Aug 13 01:45:01.979339 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Aug 13 01:45:01.988616 systemd[1]: Finished systemd-fsck-usr.service. Aug 13 01:45:01.988711 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Aug 13 01:45:01.989789 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... 
Aug 13 01:45:01.990228 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Aug 13 01:45:01.990655 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Aug 13 01:45:02.007943 systemd-tmpfiles[313]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Aug 13 01:45:02.009298 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Aug 13 01:45:02.014228 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Aug 13 01:45:02.124991 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Aug 13 01:45:02.156718 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Aug 13 01:45:02.182920 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Aug 13 01:45:02.198806 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Aug 13 01:45:02.214526 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Aug 13 01:45:02.233399 systemd-resolved[326]: Positive Trust Anchors: Aug 13 01:45:02.233404 systemd-resolved[326]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Aug 13 01:45:02.233426 systemd-resolved[326]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Aug 13 01:45:02.235094 systemd-resolved[326]: Defaulting to hostname 'linux'. Aug 13 01:45:02.249194 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Aug 13 01:45:02.249315 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Aug 13 01:45:02.275270 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Aug 13 01:45:02.293260 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Aug 13 01:45:02.304484 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Aug 13 01:45:02.351618 dracut-cmdline[342]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=215bdedb8de38f6b96ec4f9db80853e25015f60454b867e319fdcb9244320a21 Aug 13 01:45:02.452897 kernel: SCSI subsystem initialized Aug 13 01:45:02.465897 kernel: Loading iSCSI transport class v2.0-870. 
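The resolver output above shows systemd-resolved loading the root DNSSEC trust anchor plus a list of *negative* trust anchors: zones such as `home.arpa`, the RFC 1918 reverse zones, and `.local`, for which DNSSEC validation is deliberately skipped. A minimal sketch of how a resolver might decide that a queried name falls under one of these anchors (plain longest-suffix matching; the anchor set below is a hypothetical subset of the one in the log, and this is not systemd's actual code):

```python
# Illustrative sketch: suffix-match a DNS name against a set of
# negative trust anchors, i.e. zones where DNSSEC validation is
# skipped. Subset of the anchors listed in the log above.
NEGATIVE_TRUST_ANCHORS = {
    "home.arpa", "10.in-addr.arpa", "168.192.in-addr.arpa",
    "d.f.ip6.arpa", "ipv4only.arpa", "resolver.arpa",
    "corp", "home", "internal", "intranet", "lan", "local",
    "private", "test",
}

def under_negative_anchor(name: str) -> bool:
    """Return True if `name` equals or is a subdomain of any anchor."""
    labels = name.rstrip(".").lower().split(".")
    # Test every suffix of the name against the anchor set.
    for i in range(len(labels)):
        if ".".join(labels[i:]) in NEGATIVE_TRUST_ANCHORS:
            return True
    return False

print(under_negative_anchor("printer.lan"))   # True
print(under_negative_anchor("example.com"))   # False
```

Names under a negative anchor resolve without validation, which is why private zones like `4.3.2.10.in-addr.arpa` don't produce bogus-DNSSEC failures.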
Aug 13 01:45:02.477899 kernel: iscsi: registered transport (tcp) Aug 13 01:45:02.501428 kernel: iscsi: registered transport (qla4xxx) Aug 13 01:45:02.501447 kernel: QLogic iSCSI HBA Driver Aug 13 01:45:02.512162 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Aug 13 01:45:02.544491 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Aug 13 01:45:02.556168 systemd[1]: Reached target network-pre.target - Preparation for Network. Aug 13 01:45:02.673832 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Aug 13 01:45:02.686831 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Aug 13 01:45:02.812905 kernel: raid6: avx2x4 gen() 17339 MB/s Aug 13 01:45:02.833902 kernel: raid6: avx2x2 gen() 39254 MB/s Aug 13 01:45:02.859981 kernel: raid6: avx2x1 gen() 46242 MB/s Aug 13 01:45:02.859997 kernel: raid6: using algorithm avx2x1 gen() 46242 MB/s Aug 13 01:45:02.887024 kernel: raid6: .... xor() 24907 MB/s, rmw enabled Aug 13 01:45:02.887040 kernel: raid6: using avx2x2 recovery algorithm Aug 13 01:45:02.909882 kernel: xor: automatically using best checksumming function avx Aug 13 01:45:03.015903 kernel: Btrfs loaded, zoned=no, fsverity=no Aug 13 01:45:03.019170 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Aug 13 01:45:03.029059 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Aug 13 01:45:03.069672 systemd-udevd[554]: Using default interface naming scheme 'v255'. Aug 13 01:45:03.073420 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Aug 13 01:45:03.098784 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Aug 13 01:45:03.138421 dracut-pre-trigger[566]: rd.md=0: removing MD RAID activation Aug 13 01:45:03.154167 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. 
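The `raid6:` lines above record the kernel benchmarking each available parity-generation implementation at boot and keeping the fastest one (here `avx2x1` at 46242 MB/s for gen(), with `avx2x2` chosen separately for recovery). The selection logic reduces to a maximum over measured throughput; a toy sketch using the figures from this log:

```python
# Sketch of the selection the kernel performs at boot: benchmark each
# available RAID6 parity ("gen") implementation and keep the fastest.
# Throughput figures (MB/s) are the ones reported in this log.
benchmarks = {
    "avx2x4": 17339,
    "avx2x2": 39254,
    "avx2x1": 46242,
}

best = max(benchmarks, key=benchmarks.get)
print(f"raid6: using algorithm {best} gen() {benchmarks[best]} MB/s")
# prints: raid6: using algorithm avx2x1 gen() 46242 MB/s
```

Note the counterintuitive ordering: on this CPU the single-lane `avx2x1` kernel outruns the wider unrolled variants, which is exactly why the kernel measures rather than assumes.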
Aug 13 01:45:03.166153 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Aug 13 01:45:03.250489 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Aug 13 01:45:03.273971 kernel: cryptd: max_cpu_qlen set to 1000 Aug 13 01:45:03.263593 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Aug 13 01:45:03.316435 kernel: pps_core: LinuxPPS API ver. 1 registered Aug 13 01:45:03.316470 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Aug 13 01:45:03.316498 kernel: ACPI: bus type USB registered Aug 13 01:45:03.316519 kernel: usbcore: registered new interface driver usbfs Aug 13 01:45:03.316526 kernel: usbcore: registered new interface driver hub Aug 13 01:45:03.316533 kernel: usbcore: registered new device driver usb Aug 13 01:45:03.312354 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Aug 13 01:45:03.524146 kernel: PTP clock support registered Aug 13 01:45:03.524167 kernel: libata version 3.00 loaded. 
Aug 13 01:45:03.524176 kernel: AES CTR mode by8 optimization enabled Aug 13 01:45:03.524188 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Aug 13 01:45:03.524312 kernel: ahci 0000:00:17.0: version 3.0 Aug 13 01:45:03.524406 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 1 Aug 13 01:45:03.524510 kernel: xhci_hcd 0000:00:14.0: hcc params 0x200077c1 hci version 0x110 quirks 0x0000000000009810 Aug 13 01:45:03.524626 kernel: ahci 0000:00:17.0: AHCI vers 0001.0301, 32 command slots, 6 Gbps, SATA mode Aug 13 01:45:03.524738 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Aug 13 01:45:03.524845 kernel: ahci 0000:00:17.0: 8/8 ports implemented (port mask 0xff) Aug 13 01:45:03.524943 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 2 Aug 13 01:45:03.525022 kernel: ahci 0000:00:17.0: flags: 64bit ncq sntf clo only pio slum part ems deso sadm sds apst Aug 13 01:45:03.525130 kernel: xhci_hcd 0000:00:14.0: Host supports USB 3.1 Enhanced SuperSpeed Aug 13 01:45:03.525238 kernel: hub 1-0:1.0: USB hub found Aug 13 01:45:03.525355 kernel: hub 1-0:1.0: 16 ports detected Aug 13 01:45:03.525452 kernel: hub 2-0:1.0: USB hub found Aug 13 01:45:03.525552 kernel: scsi host0: ahci Aug 13 01:45:03.525659 kernel: scsi host1: ahci Aug 13 01:45:03.525764 kernel: scsi host2: ahci Aug 13 01:45:03.525867 kernel: scsi host3: ahci Aug 13 01:45:03.525970 kernel: scsi host4: ahci Aug 13 01:45:03.526073 kernel: scsi host5: ahci Aug 13 01:45:03.526175 kernel: scsi host6: ahci Aug 13 01:45:03.526277 kernel: scsi host7: ahci Aug 13 01:45:03.526381 kernel: ata1: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516100 irq 129 lpm-pol 0 Aug 13 01:45:03.526394 kernel: ata2: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516180 irq 129 lpm-pol 0 Aug 13 01:45:03.526406 kernel: ata3: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516200 irq 129 lpm-pol 0 Aug 13 01:45:03.526418 kernel: ata4: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516280 
irq 129 lpm-pol 0 Aug 13 01:45:03.526429 kernel: ata5: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516300 irq 129 lpm-pol 0 Aug 13 01:45:03.526441 kernel: ata6: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516380 irq 129 lpm-pol 0 Aug 13 01:45:03.526452 kernel: ata7: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516400 irq 129 lpm-pol 0 Aug 13 01:45:03.526465 kernel: ata8: SATA max UDMA/133 abar m2048@0x96516000 port 0x96516480 irq 129 lpm-pol 0 Aug 13 01:45:03.526479 kernel: hub 2-0:1.0: 10 ports detected Aug 13 01:45:03.312435 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Aug 13 01:45:03.562062 kernel: igb: Intel(R) Gigabit Ethernet Network Driver Aug 13 01:45:03.562074 kernel: igb: Copyright (c) 2007-2014 Intel Corporation. Aug 13 01:45:03.316737 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Aug 13 01:45:03.524826 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Aug 13 01:45:03.610970 kernel: igb 0000:04:00.0: added PHC on eth0 Aug 13 01:45:03.611068 kernel: igb 0000:04:00.0: Intel(R) Gigabit Ethernet Network Connection Aug 13 01:45:03.611146 kernel: igb 0000:04:00.0: eth0: (PCIe:2.5Gb/s:Width x1) 3c:ec:ef:73:0f:30 Aug 13 01:45:03.611220 kernel: igb 0000:04:00.0: eth0: PBA No: 010000-000 Aug 13 01:45:03.611295 kernel: igb 0000:04:00.0: Using MSI-X interrupts. 4 rx queue(s), 4 tx queue(s) Aug 13 01:45:03.553415 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Aug 13 01:45:03.642929 kernel: igb 0000:05:00.0: added PHC on eth1 Aug 13 01:45:03.643034 kernel: igb 0000:05:00.0: Intel(R) Gigabit Ethernet Network Connection Aug 13 01:45:03.643114 kernel: igb 0000:05:00.0: eth1: (PCIe:2.5Gb/s:Width x1) 3c:ec:ef:73:0f:31 Aug 13 01:45:03.643190 kernel: igb 0000:05:00.0: eth1: PBA No: 010000-000 Aug 13 01:45:03.643262 kernel: igb 0000:05:00.0: Using MSI-X interrupts. 
4 rx queue(s), 4 tx queue(s) Aug 13 01:45:03.681810 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Aug 13 01:45:03.724895 kernel: ata2: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Aug 13 01:45:03.724914 kernel: ata3: SATA link down (SStatus 0 SControl 300) Aug 13 01:45:03.730896 kernel: ata8: SATA link down (SStatus 0 SControl 300) Aug 13 01:45:03.736869 kernel: ata5: SATA link down (SStatus 0 SControl 300) Aug 13 01:45:03.748492 kernel: usb 1-14: new high-speed USB device number 2 using xhci_hcd Aug 13 01:45:03.748523 kernel: ata4: SATA link down (SStatus 0 SControl 300) Aug 13 01:45:03.754867 kernel: ata7: SATA link down (SStatus 0 SControl 300) Aug 13 01:45:03.760904 kernel: ata2.00: Model 'Micron_5300_MTFDDAK480TDT', rev ' D3MU001', applying quirks: zeroaftertrim Aug 13 01:45:03.777062 kernel: ata2.00: ATA-11: Micron_5300_MTFDDAK480TDT, D3MU001, max UDMA/133 Aug 13 01:45:03.777893 kernel: ata6: SATA link down (SStatus 0 SControl 300) Aug 13 01:45:03.783900 kernel: ata1: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Aug 13 01:45:03.806980 kernel: ata1.00: Model 'Micron_5300_MTFDDAK480TDT', rev ' D3MU001', applying quirks: zeroaftertrim Aug 13 01:45:03.806996 kernel: ata1.00: ATA-11: Micron_5300_MTFDDAK480TDT, D3MU001, max UDMA/133 Aug 13 01:45:03.818075 kernel: ata2.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Aug 13 01:45:03.826086 kernel: ata1.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Aug 13 01:45:03.836086 kernel: ata2.00: Features: NCQ-prio Aug 13 01:45:03.841869 kernel: ata1.00: Features: NCQ-prio Aug 13 01:45:03.858909 kernel: ata2.00: configured for UDMA/133 Aug 13 01:45:03.858925 kernel: ata1.00: configured for UDMA/133 Aug 13 01:45:03.862914 kernel: scsi 0:0:0:0: Direct-Access ATA Micron_5300_MTFD U001 PQ: 0 ANSI: 5 Aug 13 01:45:03.871927 kernel: scsi 1:0:0:0: Direct-Access ATA Micron_5300_MTFD U001 PQ: 0 ANSI: 5 Aug 13 01:45:03.879870 kernel: hub 1-14:1.0: USB hub found Aug 13 
01:45:03.888868 kernel: hub 1-14:1.0: 4 ports detected Aug 13 01:45:03.888965 kernel: igb 0000:04:00.0 eno1: renamed from eth0 Aug 13 01:45:03.898832 kernel: ata2.00: Enabling discard_zeroes_data Aug 13 01:45:03.898852 kernel: igb 0000:05:00.0 eno2: renamed from eth1 Aug 13 01:45:03.898976 kernel: ata1.00: Enabling discard_zeroes_data Aug 13 01:45:03.898986 kernel: sd 1:0:0:0: [sda] 937703088 512-byte logical blocks: (480 GB/447 GiB) Aug 13 01:45:03.899080 kernel: sd 0:0:0:0: [sdb] 937703088 512-byte logical blocks: (480 GB/447 GiB) Aug 13 01:45:03.899158 kernel: sd 0:0:0:0: [sdb] 4096-byte physical blocks Aug 13 01:45:03.899228 kernel: sd 0:0:0:0: [sdb] Write Protect is off Aug 13 01:45:03.899299 kernel: sd 0:0:0:0: [sdb] Mode Sense: 00 3a 00 00 Aug 13 01:45:03.899367 kernel: sd 0:0:0:0: [sdb] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Aug 13 01:45:03.899435 kernel: sd 0:0:0:0: [sdb] Preferred minimum I/O size 4096 bytes Aug 13 01:45:03.899502 kernel: ata1.00: Enabling discard_zeroes_data Aug 13 01:45:03.961615 kernel: sd 1:0:0:0: [sda] 4096-byte physical blocks Aug 13 01:45:03.976045 kernel: sd 1:0:0:0: [sda] Write Protect is off Aug 13 01:45:03.976142 kernel: sd 1:0:0:0: [sda] Mode Sense: 00 3a 00 00 Aug 13 01:45:03.976218 kernel: sd 1:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Aug 13 01:45:03.982649 kernel: sd 1:0:0:0: [sda] Preferred minimum I/O size 4096 bytes Aug 13 01:45:03.988084 kernel: ata2.00: Enabling discard_zeroes_data Aug 13 01:45:04.002603 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Aug 13 01:45:04.002620 kernel: GPT:9289727 != 937703087 Aug 13 01:45:04.008890 kernel: GPT:Alternate GPT header not at the end of the disk. Aug 13 01:45:04.012748 kernel: GPT:9289727 != 937703087 Aug 13 01:45:04.018156 kernel: GPT: Use GNU Parted to correct GPT errors. 
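The `GPT:9289727 != 937703087` complaint above is the kernel noticing that the primary GPT header's AlternateLBA field (the sector where the backup header should live) does not point at the disk's last sector: the Flatcar image was built for a smaller disk and then written to a 937703088-sector SSD, leaving the backup header stranded mid-disk. A self-contained sketch of that consistency check (not kernel code; the fake header below is fabricated for illustration):

```python
import struct

# Sketch of the check behind "Primary header thinks Alt. header is not
# at the end of the disk": per the UEFI spec, the GPT header stores
# AlternateLBA (offset 32, little-endian u64), which must equal the
# disk's last LBA, i.e. total sectors minus one.

def backup_header_misplaced(gpt_header: bytes, disk_sectors: int) -> bool:
    sig, = struct.unpack_from("<8s", gpt_header, 0)
    assert sig == b"EFI PART", "not a GPT header"
    alternate_lba, = struct.unpack_from("<Q", gpt_header, 32)
    return alternate_lba != disk_sectors - 1

# Minimal fake header mimicking this log: AlternateLBA = 9289727 on a
# 937703088-sector disk.
header = bytearray(92)
header[0:8] = b"EFI PART"
struct.pack_into("<Q", header, 32, 9289727)

print(backup_header_misplaced(bytes(header), 937703088))  # True
```

On a live system this is typically repaired by relocating the backup header to the disk's end (e.g. `sgdisk -e`, or answering parted's "Fix" prompt, as the log itself suggests); here it is harmless because the disk is about to be repartitioned anyway.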
Aug 13 01:45:04.023418 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Aug 13 01:45:04.028453 kernel: sd 0:0:0:0: [sdb] Attached SCSI disk Aug 13 01:45:04.147870 kernel: sd 1:0:0:0: [sda] Attached SCSI disk Aug 13 01:45:04.147965 kernel: mlx5_core 0000:02:00.0: PTM is not supported by PCIe Aug 13 01:45:04.160325 kernel: mlx5_core 0000:02:00.0: firmware version: 14.29.2002 Aug 13 01:45:04.169410 kernel: mlx5_core 0000:02:00.0: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Aug 13 01:45:04.186924 kernel: usb 1-14.1: new low-speed USB device number 3 using xhci_hcd Aug 13 01:45:04.192174 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Micron_5300_MTFDDAK480TDT ROOT. Aug 13 01:45:04.218533 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Micron_5300_MTFDDAK480TDT EFI-SYSTEM. Aug 13 01:45:04.235816 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Micron_5300_MTFDDAK480TDT OEM. Aug 13 01:45:04.255691 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Micron_5300_MTFDDAK480TDT USR-A. Aug 13 01:45:04.269940 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Micron_5300_MTFDDAK480TDT USR-A. Aug 13 01:45:04.340935 kernel: hid: raw HID events driver (C) Jiri Kosina Aug 13 01:45:04.340949 kernel: usbcore: registered new interface driver usbhid Aug 13 01:45:04.340957 kernel: usbhid: USB HID core driver Aug 13 01:45:04.340964 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.0/0003:0557:2419.0001/input/input0 Aug 13 01:45:04.270462 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Aug 13 01:45:04.355031 disk-uuid[762]: Primary Header is updated. Aug 13 01:45:04.355031 disk-uuid[762]: Secondary Entries is updated. Aug 13 01:45:04.355031 disk-uuid[762]: Secondary Header is updated. 
Aug 13 01:45:04.377966 kernel: ata1.00: Enabling discard_zeroes_data Aug 13 01:45:04.377979 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Aug 13 01:45:04.407815 kernel: hid-generic 0003:0557:2419.0001: input,hidraw0: USB HID v1.00 Keyboard [HID 0557:2419] on usb-0000:00:14.0-14.1/input0 Aug 13 01:45:04.408015 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.1/0003:0557:2419.0002/input/input1 Aug 13 01:45:04.419767 kernel: hid-generic 0003:0557:2419.0002: input,hidraw1: USB HID v1.00 Mouse [HID 0557:2419] on usb-0000:00:14.0-14.1/input1 Aug 13 01:45:04.419941 kernel: mlx5_core 0000:02:00.0: E-Switch: Total vports 10, per vport: max uc(1024) max mc(16384) Aug 13 01:45:04.440545 kernel: mlx5_core 0000:02:00.0: Port module event: module 0, Cable plugged Aug 13 01:45:04.672955 kernel: mlx5_core 0000:02:00.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic) Aug 13 01:45:04.691488 kernel: mlx5_core 0000:02:00.1: PTM is not supported by PCIe Aug 13 01:45:04.692053 kernel: mlx5_core 0000:02:00.1: firmware version: 14.29.2002 Aug 13 01:45:04.692460 kernel: mlx5_core 0000:02:00.1: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Aug 13 01:45:04.977900 kernel: mlx5_core 0000:02:00.1: E-Switch: Total vports 10, per vport: max uc(1024) max mc(16384) Aug 13 01:45:04.990193 kernel: mlx5_core 0000:02:00.1: Port module event: module 1, Cable plugged Aug 13 01:45:05.249926 kernel: mlx5_core 0000:02:00.1: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic) Aug 13 01:45:05.260921 kernel: mlx5_core 0000:02:00.0 enp2s0f0np0: renamed from eth0 Aug 13 01:45:05.261035 kernel: mlx5_core 0000:02:00.1 enp2s0f1np1: renamed from eth1 Aug 13 01:45:05.273830 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Aug 13 01:45:05.284517 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. 
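The renames above (`eth0` → `enp2s0f0np0`, `eth1` → `enp2s0f1np1`; earlier the onboard igb ports became `eno1`/`eno2`) are udev's predictable interface naming at work: the name is derived from the stable PCI topology rather than probe order. A heavily simplified sketch of how a `enp<bus>s<slot>f<function>np<port>` name is composed from a PCI address (hypothetical helper; the real policy in systemd has many more cases, including onboard `eno*` names taken from firmware indices):

```python
# Simplified sketch of predictable PCI NIC naming: "en" (ethernet) +
# "p<bus>s<slot>" from the PCI address, plus "f<function>" and
# "np<physical port>" for multi-port devices like the mlx5 card above.

def pci_name(pci_addr, phys_port=None):
    _domain, bus, slot_func = pci_addr.split(":")
    slot, func = slot_func.split(".")
    name = f"enp{int(bus, 16)}s{int(slot, 16)}"
    if int(func) or phys_port is not None:
        name += f"f{int(func)}"
    if phys_port is not None:
        name += f"np{phys_port}"
    return name

print(pci_name("0000:02:00.0", phys_port=0))  # enp2s0f0np0
print(pci_name("0000:02:00.1", phys_port=1))  # enp2s0f1np1
```

Because the PCI path survives reboots and driver load reordering, `systemd-networkd` can safely match units against these names, as it does later in this log.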
Aug 13 01:45:05.293085 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Aug 13 01:45:05.313110 systemd[1]: Reached target remote-fs.target - Remote File Systems. Aug 13 01:45:05.343325 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Aug 13 01:45:05.379931 kernel: ata1.00: Enabling discard_zeroes_data Aug 13 01:45:05.381880 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Aug 13 01:45:05.403970 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Aug 13 01:45:05.403988 disk-uuid[763]: The operation has completed successfully. Aug 13 01:45:05.435659 systemd[1]: disk-uuid.service: Deactivated successfully. Aug 13 01:45:05.435711 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Aug 13 01:45:05.473249 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Aug 13 01:45:05.500868 sh[833]: Success Aug 13 01:45:05.529543 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Aug 13 01:45:05.529567 kernel: device-mapper: uevent: version 1.0.3 Aug 13 01:45:05.529898 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Aug 13 01:45:05.551932 kernel: device-mapper: verity: sha256 using shash "sha256-avx2" Aug 13 01:45:05.596009 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Aug 13 01:45:05.606189 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Aug 13 01:45:05.640261 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
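`verity-setup.service` above opens `/dev/mapper/usr` with dm-verity: the read-only /usr image is hashed block by block into a hash tree (using the `sha256-avx2` implementation the kernel reports) whose root must match the `verity.usrhash=` value on the kernel command line. A toy single-level sketch of that scheme, for illustration only (real dm-verity uses 4K blocks, a salt, and a multi-level tree):

```python
import hashlib

# Toy sketch of dm-verity's integrity scheme: hash fixed-size blocks,
# then hash the concatenated leaf digests into a root hash. Block size
# and the sample "image" bytes are fabricated for illustration.
BLOCK = 4

def toy_root_hash(data: bytes) -> str:
    leaves = b"".join(
        hashlib.sha256(data[i:i + BLOCK]).digest()
        for i in range(0, len(data), BLOCK)
    )
    return hashlib.sha256(leaves).hexdigest()

image = b"usr-partition-contents"
expected = toy_root_hash(image)

# Any change to the image changes the root, so a tampered /usr would
# fail verification against verity.usrhash= at mount time.
assert toy_root_hash(image) == expected
assert toy_root_hash(b"Usr-partition-contents") != expected
```

This is why Flatcar can ship /usr read-only and boot it without re-checking every file: verification happens lazily, per block, as data is read through the device-mapper target.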
Aug 13 01:45:05.702984 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' Aug 13 01:45:05.702998 kernel: BTRFS: device fsid 0c0338fb-9434-41c1-99a2-737cbe2351c4 devid 1 transid 44 /dev/mapper/usr (254:0) scanned by mount (846) Aug 13 01:45:05.703006 kernel: BTRFS info (device dm-0): first mount of filesystem 0c0338fb-9434-41c1-99a2-737cbe2351c4 Aug 13 01:45:05.703013 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Aug 13 01:45:05.703021 kernel: BTRFS info (device dm-0): using free-space-tree Aug 13 01:45:05.708196 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Aug 13 01:45:05.715275 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Aug 13 01:45:05.740125 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Aug 13 01:45:05.740670 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Aug 13 01:45:05.766277 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Aug 13 01:45:05.823970 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sdb6 (8:22) scanned by mount (869) Aug 13 01:45:05.823995 kernel: BTRFS info (device sdb6): first mount of filesystem 900bf3f4-cc50-4925-b275-d85854bb916f Aug 13 01:45:05.824008 kernel: BTRFS info (device sdb6): using crc32c (crc32c-intel) checksum algorithm Aug 13 01:45:05.824020 kernel: BTRFS info (device sdb6): using free-space-tree Aug 13 01:45:05.824014 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Aug 13 01:45:05.854294 kernel: BTRFS info (device sdb6): last unmount of filesystem 900bf3f4-cc50-4925-b275-d85854bb916f Aug 13 01:45:05.844090 systemd[1]: Finished ignition-setup.service - Ignition (setup). Aug 13 01:45:05.866304 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... 
Aug 13 01:45:05.884914 systemd[1]: Starting systemd-networkd.service - Network Configuration... Aug 13 01:45:05.936442 systemd-networkd[1015]: lo: Link UP Aug 13 01:45:05.936445 systemd-networkd[1015]: lo: Gained carrier Aug 13 01:45:05.939312 systemd-networkd[1015]: Enumeration completed Aug 13 01:45:05.939373 systemd[1]: Started systemd-networkd.service - Network Configuration. Aug 13 01:45:05.939839 systemd-networkd[1015]: eno1: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 13 01:45:05.951087 systemd[1]: Reached target network.target - Network. Aug 13 01:45:05.968162 systemd-networkd[1015]: eno2: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 13 01:45:05.995336 systemd-networkd[1015]: enp2s0f0np0: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 13 01:45:06.005215 ignition[1014]: Ignition 2.21.0 Aug 13 01:45:06.007316 unknown[1014]: fetched base config from "system" Aug 13 01:45:06.005220 ignition[1014]: Stage: fetch-offline Aug 13 01:45:06.007321 unknown[1014]: fetched user config from "system" Aug 13 01:45:06.005239 ignition[1014]: no configs at "/usr/lib/ignition/base.d" Aug 13 01:45:06.008364 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Aug 13 01:45:06.005244 ignition[1014]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Aug 13 01:45:06.029377 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Aug 13 01:45:06.005330 ignition[1014]: parsed url from cmdline: "" Aug 13 01:45:06.029923 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... 
Aug 13 01:45:06.005332 ignition[1014]: no config URL provided Aug 13 01:45:06.005335 ignition[1014]: reading system config file "/usr/lib/ignition/user.ign" Aug 13 01:45:06.005360 ignition[1014]: parsing config with SHA512: 030c9abadb3e9431cbecf7b8cd29b7400879e6a04ec3c11b1d1596c4644ffca8bdec2f61235238a68c01a15d6c084f817681c8c8f3d3ab16fe9dec7894c6965b Aug 13 01:45:06.007500 ignition[1014]: fetch-offline: fetch-offline passed Aug 13 01:45:06.007503 ignition[1014]: POST message to Packet Timeline Aug 13 01:45:06.007505 ignition[1014]: POST Status error: resource requires networking Aug 13 01:45:06.007538 ignition[1014]: Ignition finished successfully Aug 13 01:45:06.165056 kernel: mlx5_core 0000:02:00.0 enp2s0f0np0: Link up Aug 13 01:45:06.159411 systemd-networkd[1015]: enp2s0f1np1: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 13 01:45:06.057406 ignition[1031]: Ignition 2.21.0 Aug 13 01:45:06.057410 ignition[1031]: Stage: kargs Aug 13 01:45:06.057506 ignition[1031]: no configs at "/usr/lib/ignition/base.d" Aug 13 01:45:06.057512 ignition[1031]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Aug 13 01:45:06.059023 ignition[1031]: kargs: kargs passed Aug 13 01:45:06.059027 ignition[1031]: POST message to Packet Timeline Aug 13 01:45:06.059042 ignition[1031]: GET https://metadata.packet.net/metadata: attempt #1 Aug 13 01:45:06.059384 ignition[1031]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:50095->[::1]:53: read: connection refused Aug 13 01:45:06.260389 ignition[1031]: GET https://metadata.packet.net/metadata: attempt #2 Aug 13 01:45:06.260727 ignition[1031]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:47495->[::1]:53: read: connection refused Aug 13 01:45:06.350909 kernel: mlx5_core 0000:02:00.1 enp2s0f1np1: Link up Aug 13 01:45:06.352406 systemd-networkd[1015]: eno1: Link UP Aug 13 
01:45:06.352570 systemd-networkd[1015]: eno2: Link UP Aug 13 01:45:06.352724 systemd-networkd[1015]: enp2s0f0np0: Link UP Aug 13 01:45:06.352921 systemd-networkd[1015]: enp2s0f0np0: Gained carrier Aug 13 01:45:06.363182 systemd-networkd[1015]: enp2s0f1np1: Link UP Aug 13 01:45:06.363999 systemd-networkd[1015]: enp2s0f1np1: Gained carrier Aug 13 01:45:06.398041 systemd-networkd[1015]: enp2s0f0np0: DHCPv4 address 147.75.71.211/31, gateway 147.75.71.210 acquired from 145.40.83.140 Aug 13 01:45:06.661837 ignition[1031]: GET https://metadata.packet.net/metadata: attempt #3 Aug 13 01:45:06.663029 ignition[1031]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:50653->[::1]:53: read: connection refused Aug 13 01:45:07.463438 ignition[1031]: GET https://metadata.packet.net/metadata: attempt #4 Aug 13 01:45:07.464659 ignition[1031]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:45924->[::1]:53: read: connection refused Aug 13 01:45:08.104413 systemd-networkd[1015]: enp2s0f0np0: Gained IPv6LL Aug 13 01:45:08.105283 systemd-networkd[1015]: enp2s0f1np1: Gained IPv6LL Aug 13 01:45:09.066173 ignition[1031]: GET https://metadata.packet.net/metadata: attempt #5 Aug 13 01:45:09.067419 ignition[1031]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:54760->[::1]:53: read: connection refused Aug 13 01:45:12.270904 ignition[1031]: GET https://metadata.packet.net/metadata: attempt #6 Aug 13 01:45:13.289721 ignition[1031]: GET result: OK Aug 13 01:45:13.742756 ignition[1031]: Ignition finished successfully Aug 13 01:45:13.748974 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Aug 13 01:45:13.758810 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
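The Ignition attempts #1 through #6 above fail with DNS errors until the NICs link up and DHCP configures `enp2s0f0np0`, after which attempt #6 returns `GET result: OK` — a classic retry-with-backoff loop riding out early-boot races. A small sketch of that pattern (illustrative only, not Ignition's source; `fetch` stands in for the real HTTP call and the timings are compressed):

```python
import itertools
import time

# Sketch of a retry loop like the one visible above: keep re-issuing
# the metadata GET with growing delays until the network comes up.

def fetch_with_backoff(fetch, base=0.05, cap=1.0, max_attempts=10):
    for attempt in itertools.count(1):
        try:
            return fetch()
        except OSError as err:
            if attempt >= max_attempts:
                raise
            delay = min(cap, base * 2 ** (attempt - 1))
            print(f"GET error on attempt #{attempt}: {err}; retrying in {delay:.2f}s")
            time.sleep(delay)

# Simulate a network that only becomes reachable on the 3rd attempt.
state = {"calls": 0}
def fake_fetch():
    state["calls"] += 1
    if state["calls"] < 3:
        raise OSError("dial tcp: lookup metadata.packet.net: connection refused")
    return "OK"

print(fetch_with_backoff(fake_fetch))  # OK
```

Capping the delay keeps the loop responsive once connectivity arrives, while the exponential growth avoids hammering the metadata service while the link is still down.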
Aug 13 01:45:13.803433 ignition[1049]: Ignition 2.21.0 Aug 13 01:45:13.803438 ignition[1049]: Stage: disks Aug 13 01:45:13.803520 ignition[1049]: no configs at "/usr/lib/ignition/base.d" Aug 13 01:45:13.803526 ignition[1049]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Aug 13 01:45:13.804616 ignition[1049]: disks: disks passed Aug 13 01:45:13.804620 ignition[1049]: POST message to Packet Timeline Aug 13 01:45:13.804633 ignition[1049]: GET https://metadata.packet.net/metadata: attempt #1 Aug 13 01:45:14.871073 ignition[1049]: GET result: OK Aug 13 01:45:16.267018 ignition[1049]: Ignition finished successfully Aug 13 01:45:16.272359 systemd[1]: Finished ignition-disks.service - Ignition (disks). Aug 13 01:45:16.284263 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Aug 13 01:45:16.292172 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Aug 13 01:45:16.310178 systemd[1]: Reached target local-fs.target - Local File Systems. Aug 13 01:45:16.340290 systemd[1]: Reached target sysinit.target - System Initialization. Aug 13 01:45:16.358314 systemd[1]: Reached target basic.target - Basic System. Aug 13 01:45:16.378161 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Aug 13 01:45:16.428828 systemd-fsck[1067]: ROOT: clean, 15/553520 files, 52789/553472 blocks Aug 13 01:45:16.438287 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Aug 13 01:45:16.452781 systemd[1]: Mounting sysroot.mount - /sysroot... Aug 13 01:45:16.557919 kernel: EXT4-fs (sdb9): mounted filesystem 069caac6-7833-4acd-8940-01a7ff7d1281 r/w with ordered data mode. Quota mode: none. Aug 13 01:45:16.558486 systemd[1]: Mounted sysroot.mount - /sysroot. Aug 13 01:45:16.566306 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Aug 13 01:45:16.573212 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... 
Aug 13 01:45:16.611616 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Aug 13 01:45:16.656833 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/sdb6 (8:22) scanned by mount (1076) Aug 13 01:45:16.656851 kernel: BTRFS info (device sdb6): first mount of filesystem 900bf3f4-cc50-4925-b275-d85854bb916f Aug 13 01:45:16.656868 kernel: BTRFS info (device sdb6): using crc32c (crc32c-intel) checksum algorithm Aug 13 01:45:16.656878 kernel: BTRFS info (device sdb6): using free-space-tree Aug 13 01:45:16.619778 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Aug 13 01:45:16.672300 systemd[1]: Starting flatcar-static-network.service - Flatcar Static Network Agent... Aug 13 01:45:16.686109 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Aug 13 01:45:16.686128 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Aug 13 01:45:16.738127 coreos-metadata[1078]: Aug 13 01:45:16.726 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Aug 13 01:45:16.704994 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Aug 13 01:45:16.758144 coreos-metadata[1088]: Aug 13 01:45:16.726 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Aug 13 01:45:16.729223 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Aug 13 01:45:16.747027 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Aug 13 01:45:16.805195 initrd-setup-root[1108]: cut: /sysroot/etc/passwd: No such file or directory Aug 13 01:45:16.814016 initrd-setup-root[1115]: cut: /sysroot/etc/group: No such file or directory Aug 13 01:45:16.822985 initrd-setup-root[1122]: cut: /sysroot/etc/shadow: No such file or directory Aug 13 01:45:16.831977 initrd-setup-root[1129]: cut: /sysroot/etc/gshadow: No such file or directory Aug 13 01:45:16.868477 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Aug 13 01:45:16.877846 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Aug 13 01:45:16.886678 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Aug 13 01:45:16.925296 systemd[1]: sysroot-oem.mount: Deactivated successfully. Aug 13 01:45:16.933002 kernel: BTRFS info (device sdb6): last unmount of filesystem 900bf3f4-cc50-4925-b275-d85854bb916f Aug 13 01:45:16.940409 ignition[1196]: INFO : Ignition 2.21.0 Aug 13 01:45:16.940409 ignition[1196]: INFO : Stage: mount Aug 13 01:45:16.947043 ignition[1196]: INFO : no configs at "/usr/lib/ignition/base.d" Aug 13 01:45:16.947043 ignition[1196]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Aug 13 01:45:16.947043 ignition[1196]: INFO : mount: mount passed Aug 13 01:45:16.947043 ignition[1196]: INFO : POST message to Packet Timeline Aug 13 01:45:16.947043 ignition[1196]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Aug 13 01:45:16.944225 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Aug 13 01:45:17.825101 coreos-metadata[1088]: Aug 13 01:45:17.825 INFO Fetch successful Aug 13 01:45:17.861064 systemd[1]: flatcar-static-network.service: Deactivated successfully. Aug 13 01:45:17.861123 systemd[1]: Finished flatcar-static-network.service - Flatcar Static Network Agent. 
Aug 13 01:45:17.985263 ignition[1196]: INFO : GET result: OK Aug 13 01:45:18.156973 coreos-metadata[1078]: Aug 13 01:45:18.156 INFO Fetch successful Aug 13 01:45:18.233576 coreos-metadata[1078]: Aug 13 01:45:18.233 INFO wrote hostname ci-4372.1.0-a-4296cabafa to /sysroot/etc/hostname Aug 13 01:45:18.234992 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Aug 13 01:45:18.846480 ignition[1196]: INFO : Ignition finished successfully Aug 13 01:45:18.850817 systemd[1]: Finished ignition-mount.service - Ignition (mount). Aug 13 01:45:18.866830 systemd[1]: Starting ignition-files.service - Ignition (files)... Aug 13 01:45:18.900663 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Aug 13 01:45:18.940899 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/sdb6 (8:22) scanned by mount (1222) Aug 13 01:45:18.958340 kernel: BTRFS info (device sdb6): first mount of filesystem 900bf3f4-cc50-4925-b275-d85854bb916f Aug 13 01:45:18.958356 kernel: BTRFS info (device sdb6): using crc32c (crc32c-intel) checksum algorithm Aug 13 01:45:18.964241 kernel: BTRFS info (device sdb6): using free-space-tree Aug 13 01:45:18.968407 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Aug 13 01:45:18.998257 ignition[1239]: INFO : Ignition 2.21.0
Aug 13 01:45:18.998257 ignition[1239]: INFO : Stage: files
Aug 13 01:45:19.011166 ignition[1239]: INFO : no configs at "/usr/lib/ignition/base.d"
Aug 13 01:45:19.011166 ignition[1239]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet"
Aug 13 01:45:19.011166 ignition[1239]: DEBUG : files: compiled without relabeling support, skipping
Aug 13 01:45:19.011166 ignition[1239]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Aug 13 01:45:19.011166 ignition[1239]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Aug 13 01:45:19.011166 ignition[1239]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Aug 13 01:45:19.011166 ignition[1239]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Aug 13 01:45:19.011166 ignition[1239]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Aug 13 01:45:19.011166 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Aug 13 01:45:19.011166 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1
Aug 13 01:45:19.002049 unknown[1239]: wrote ssh authorized keys file for user: core
Aug 13 01:45:19.137953 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Aug 13 01:45:19.222404 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Aug 13 01:45:19.222404 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Aug 13 01:45:19.252091 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Aug 13 01:45:19.252091 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Aug 13 01:45:19.252091 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Aug 13 01:45:19.252091 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Aug 13 01:45:19.252091 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Aug 13 01:45:19.252091 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Aug 13 01:45:19.252091 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Aug 13 01:45:19.252091 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Aug 13 01:45:19.252091 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Aug 13 01:45:19.252091 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Aug 13 01:45:19.252091 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Aug 13 01:45:19.252091 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Aug 13 01:45:19.252091 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1
Aug 13 01:45:19.753117 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Aug 13 01:45:20.315418 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Aug 13 01:45:20.315418 ignition[1239]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Aug 13 01:45:20.346132 ignition[1239]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Aug 13 01:45:20.346132 ignition[1239]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Aug 13 01:45:20.346132 ignition[1239]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Aug 13 01:45:20.346132 ignition[1239]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Aug 13 01:45:20.346132 ignition[1239]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Aug 13 01:45:20.346132 ignition[1239]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Aug 13 01:45:20.346132 ignition[1239]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Aug 13 01:45:20.346132 ignition[1239]: INFO : files: files passed
Aug 13 01:45:20.346132 ignition[1239]: INFO : POST message to Packet Timeline
Aug 13 01:45:20.346132 ignition[1239]: INFO : GET https://metadata.packet.net/metadata: attempt #1
Aug 13 01:45:21.730480 ignition[1239]: INFO : GET result: OK
Aug 13 01:45:22.590691 ignition[1239]: INFO : Ignition finished successfully
Aug 13 01:45:22.594929 systemd[1]: Finished ignition-files.service - Ignition (files).
Aug 13 01:45:22.610390 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Aug 13 01:45:22.625513 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Aug 13 01:45:22.667597 systemd[1]: ignition-quench.service: Deactivated successfully.
Aug 13 01:45:22.667709 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Aug 13 01:45:22.685564 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Aug 13 01:45:22.715146 initrd-setup-root-after-ignition[1279]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Aug 13 01:45:22.715146 initrd-setup-root-after-ignition[1279]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Aug 13 01:45:22.706075 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Aug 13 01:45:22.760182 initrd-setup-root-after-ignition[1283]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Aug 13 01:45:22.726496 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Aug 13 01:45:22.839105 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Aug 13 01:45:22.839160 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Aug 13 01:45:22.857322 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Aug 13 01:45:22.867218 systemd[1]: Reached target initrd.target - Initrd Default Target.
Aug 13 01:45:22.892315 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Aug 13 01:45:22.893978 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Aug 13 01:45:22.971195 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Aug 13 01:45:22.985211 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Aug 13 01:45:23.028779 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Aug 13 01:45:23.039268 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Aug 13 01:45:23.059676 systemd[1]: Stopped target timers.target - Timer Units.
Aug 13 01:45:23.077653 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Aug 13 01:45:23.078099 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Aug 13 01:45:23.113380 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Aug 13 01:45:23.122626 systemd[1]: Stopped target basic.target - Basic System.
Aug 13 01:45:23.139583 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Aug 13 01:45:23.156582 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Aug 13 01:45:23.176599 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Aug 13 01:45:23.185739 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Aug 13 01:45:23.212738 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Aug 13 01:45:23.221742 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Aug 13 01:45:23.247760 systemd[1]: Stopped target sysinit.target - System Initialization.
Aug 13 01:45:23.267736 systemd[1]: Stopped target local-fs.target - Local File Systems.
Aug 13 01:45:23.285724 systemd[1]: Stopped target swap.target - Swaps.
Aug 13 01:45:23.294647 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Aug 13 01:45:23.295087 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Aug 13 01:45:23.325622 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Aug 13 01:45:23.343590 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Aug 13 01:45:23.352771 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Aug 13 01:45:23.353246 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Aug 13 01:45:23.382472 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Aug 13 01:45:23.382903 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Aug 13 01:45:23.411532 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Aug 13 01:45:23.411989 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Aug 13 01:45:23.430783 systemd[1]: Stopped target paths.target - Path Units.
Aug 13 01:45:23.447426 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Aug 13 01:45:23.447834 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Aug 13 01:45:23.468592 systemd[1]: Stopped target slices.target - Slice Units.
Aug 13 01:45:23.477740 systemd[1]: Stopped target sockets.target - Socket Units.
Aug 13 01:45:23.492859 systemd[1]: iscsid.socket: Deactivated successfully.
Aug 13 01:45:23.493188 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Aug 13 01:45:23.508806 systemd[1]: iscsiuio.socket: Deactivated successfully.
Aug 13 01:45:23.509127 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Aug 13 01:45:23.537699 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Aug 13 01:45:23.538145 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Aug 13 01:45:23.555570 systemd[1]: ignition-files.service: Deactivated successfully.
Aug 13 01:45:23.555965 systemd[1]: Stopped ignition-files.service - Ignition (files).
Aug 13 01:45:23.678052 ignition[1305]: INFO : Ignition 2.21.0
Aug 13 01:45:23.678052 ignition[1305]: INFO : Stage: umount
Aug 13 01:45:23.678052 ignition[1305]: INFO : no configs at "/usr/lib/ignition/base.d"
Aug 13 01:45:23.678052 ignition[1305]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet"
Aug 13 01:45:23.678052 ignition[1305]: INFO : umount: umount passed
Aug 13 01:45:23.678052 ignition[1305]: INFO : POST message to Packet Timeline
Aug 13 01:45:23.678052 ignition[1305]: INFO : GET https://metadata.packet.net/metadata: attempt #1
Aug 13 01:45:23.571678 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Aug 13 01:45:23.572112 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Aug 13 01:45:23.591260 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Aug 13 01:45:23.603768 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Aug 13 01:45:23.610242 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Aug 13 01:45:23.610331 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Aug 13 01:45:23.652142 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Aug 13 01:45:23.652217 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Aug 13 01:45:23.680278 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Aug 13 01:45:23.681108 systemd[1]: sysroot-boot.service: Deactivated successfully.
Aug 13 01:45:23.681180 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Aug 13 01:45:23.700104 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Aug 13 01:45:23.700229 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Aug 13 01:45:24.763529 ignition[1305]: INFO : GET result: OK
Aug 13 01:45:25.617185 ignition[1305]: INFO : Ignition finished successfully
Aug 13 01:45:25.621651 systemd[1]: ignition-mount.service: Deactivated successfully.
Aug 13 01:45:25.621993 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Aug 13 01:45:25.637154 systemd[1]: Stopped target network.target - Network.
Aug 13 01:45:25.650300 systemd[1]: ignition-disks.service: Deactivated successfully.
Aug 13 01:45:25.650492 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Aug 13 01:45:25.667405 systemd[1]: ignition-kargs.service: Deactivated successfully.
Aug 13 01:45:25.667580 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Aug 13 01:45:25.683384 systemd[1]: ignition-setup.service: Deactivated successfully.
Aug 13 01:45:25.683571 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Aug 13 01:45:25.699393 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Aug 13 01:45:25.699560 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Aug 13 01:45:25.717365 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Aug 13 01:45:25.717565 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Aug 13 01:45:25.733755 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Aug 13 01:45:25.751444 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Aug 13 01:45:25.769161 systemd[1]: systemd-resolved.service: Deactivated successfully.
Aug 13 01:45:25.769555 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Aug 13 01:45:25.792470 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Aug 13 01:45:25.793396 systemd[1]: systemd-networkd.service: Deactivated successfully.
Aug 13 01:45:25.793695 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Aug 13 01:45:25.807924 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Aug 13 01:45:25.809931 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Aug 13 01:45:25.822235 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Aug 13 01:45:25.822373 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Aug 13 01:45:25.844568 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Aug 13 01:45:25.868070 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Aug 13 01:45:25.868142 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Aug 13 01:45:25.877371 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Aug 13 01:45:25.877430 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Aug 13 01:45:25.903406 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Aug 13 01:45:25.903507 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Aug 13 01:45:25.921395 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Aug 13 01:45:25.921573 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Aug 13 01:45:25.943918 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Aug 13 01:45:25.963492 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Aug 13 01:45:25.963575 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Aug 13 01:45:25.964104 systemd[1]: systemd-udevd.service: Deactivated successfully.
Aug 13 01:45:25.964287 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Aug 13 01:45:25.985102 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Aug 13 01:45:25.985279 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Aug 13 01:45:26.000245 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Aug 13 01:45:26.000356 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Aug 13 01:45:26.025148 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Aug 13 01:45:26.025312 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Aug 13 01:45:26.054415 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Aug 13 01:45:26.054594 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Aug 13 01:45:26.092093 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Aug 13 01:45:26.092286 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Aug 13 01:45:26.132397 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Aug 13 01:45:26.139170 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Aug 13 01:45:26.139199 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Aug 13 01:45:26.167188 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Aug 13 01:45:26.167224 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Aug 13 01:45:26.448999 systemd-journald[297]: Received SIGTERM from PID 1 (systemd).
Aug 13 01:45:26.198305 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Aug 13 01:45:26.198384 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Aug 13 01:45:26.219497 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Aug 13 01:45:26.219643 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Aug 13 01:45:26.236190 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Aug 13 01:45:26.236336 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Aug 13 01:45:26.260851 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Aug 13 01:45:26.261044 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
Aug 13 01:45:26.261166 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Aug 13 01:45:26.261295 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Aug 13 01:45:26.262606 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Aug 13 01:45:26.262852 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Aug 13 01:45:26.276767 systemd[1]: network-cleanup.service: Deactivated successfully.
Aug 13 01:45:26.277058 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Aug 13 01:45:26.297057 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Aug 13 01:45:26.315337 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Aug 13 01:45:26.378011 systemd[1]: Switching root.
Aug 13 01:45:26.574130 systemd-journald[297]: Journal stopped
Aug 13 01:45:28.317312 kernel: SELinux: policy capability network_peer_controls=1
Aug 13 01:45:28.317330 kernel: SELinux: policy capability open_perms=1
Aug 13 01:45:28.317338 kernel: SELinux: policy capability extended_socket_class=1
Aug 13 01:45:28.317343 kernel: SELinux: policy capability always_check_network=0
Aug 13 01:45:28.317349 kernel: SELinux: policy capability cgroup_seclabel=1
Aug 13 01:45:28.317354 kernel: SELinux: policy capability nnp_nosuid_transition=1
Aug 13 01:45:28.317361 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Aug 13 01:45:28.317368 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Aug 13 01:45:28.317375 kernel: SELinux: policy capability userspace_initial_context=0
Aug 13 01:45:28.317381 kernel: audit: type=1403 audit(1755049526.690:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Aug 13 01:45:28.317388 systemd[1]: Successfully loaded SELinux policy in 89.339ms.
Aug 13 01:45:28.317396 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 7.527ms.
Aug 13 01:45:28.317403 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Aug 13 01:45:28.317409 systemd[1]: Detected architecture x86-64.
Aug 13 01:45:28.317418 systemd[1]: Detected first boot.
Aug 13 01:45:28.317424 systemd[1]: Hostname set to .
Aug 13 01:45:28.317431 systemd[1]: Initializing machine ID from random generator.
Aug 13 01:45:28.317438 zram_generator::config[1362]: No configuration found.
Aug 13 01:45:28.317446 systemd[1]: Populated /etc with preset unit settings.
Aug 13 01:45:28.317453 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Aug 13 01:45:28.317459 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Aug 13 01:45:28.317466 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Aug 13 01:45:28.317472 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Aug 13 01:45:28.317479 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Aug 13 01:45:28.317486 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Aug 13 01:45:28.317494 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Aug 13 01:45:28.317501 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Aug 13 01:45:28.317507 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Aug 13 01:45:28.317514 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Aug 13 01:45:28.317521 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Aug 13 01:45:28.317528 systemd[1]: Created slice user.slice - User and Session Slice.
Aug 13 01:45:28.317534 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Aug 13 01:45:28.317541 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Aug 13 01:45:28.317549 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Aug 13 01:45:28.317556 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Aug 13 01:45:28.317563 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Aug 13 01:45:28.317570 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Aug 13 01:45:28.317576 systemd[1]: Expecting device dev-ttyS1.device - /dev/ttyS1...
Aug 13 01:45:28.317583 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Aug 13 01:45:28.317590 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Aug 13 01:45:28.317598 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Aug 13 01:45:28.317605 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Aug 13 01:45:28.317613 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Aug 13 01:45:28.317690 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Aug 13 01:45:28.317697 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Aug 13 01:45:28.317704 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Aug 13 01:45:28.317711 systemd[1]: Reached target slices.target - Slice Units.
Aug 13 01:45:28.317718 systemd[1]: Reached target swap.target - Swaps.
Aug 13 01:45:28.317725 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Aug 13 01:45:28.317733 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Aug 13 01:45:28.317741 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Aug 13 01:45:28.317748 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Aug 13 01:45:28.317755 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Aug 13 01:45:28.317762 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Aug 13 01:45:28.317770 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Aug 13 01:45:28.317777 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Aug 13 01:45:28.317784 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Aug 13 01:45:28.317791 systemd[1]: Mounting media.mount - External Media Directory...
Aug 13 01:45:28.317798 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 13 01:45:28.317805 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Aug 13 01:45:28.317812 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Aug 13 01:45:28.317819 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Aug 13 01:45:28.317827 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Aug 13 01:45:28.317835 systemd[1]: Reached target machines.target - Containers.
Aug 13 01:45:28.317842 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Aug 13 01:45:28.317849 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Aug 13 01:45:28.317856 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Aug 13 01:45:28.317866 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Aug 13 01:45:28.317873 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Aug 13 01:45:28.317880 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Aug 13 01:45:28.317889 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Aug 13 01:45:28.317896 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Aug 13 01:45:28.317903 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Aug 13 01:45:28.317910 kernel: ACPI: bus type drm_connector registered
Aug 13 01:45:28.317916 kernel: loop: module loaded
Aug 13 01:45:28.317923 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Aug 13 01:45:28.317930 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Aug 13 01:45:28.317937 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Aug 13 01:45:28.317944 kernel: fuse: init (API version 7.41)
Aug 13 01:45:28.317951 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Aug 13 01:45:28.317958 systemd[1]: Stopped systemd-fsck-usr.service.
Aug 13 01:45:28.317966 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Aug 13 01:45:28.317973 systemd[1]: Starting systemd-journald.service - Journal Service...
Aug 13 01:45:28.317980 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Aug 13 01:45:28.317987 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Aug 13 01:45:28.318006 systemd-journald[1465]: Collecting audit messages is disabled.
Aug 13 01:45:28.318024 systemd-journald[1465]: Journal started
Aug 13 01:45:28.318038 systemd-journald[1465]: Runtime Journal (/run/log/journal/6468c303ef9e426696169beb4de2cfb6) is 8M, max 639.3M, 631.3M free.
Aug 13 01:45:27.155455 systemd[1]: Queued start job for default target multi-user.target.
Aug 13 01:45:27.173873 systemd[1]: Unnecessary job was removed for dev-sdb6.device - /dev/sdb6.
Aug 13 01:45:27.174198 systemd[1]: systemd-journald.service: Deactivated successfully.
Aug 13 01:45:28.333945 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Aug 13 01:45:28.356924 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Aug 13 01:45:28.377915 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Aug 13 01:45:28.397997 systemd[1]: verity-setup.service: Deactivated successfully.
Aug 13 01:45:28.398022 systemd[1]: Stopped verity-setup.service.
Aug 13 01:45:28.421924 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 13 01:45:28.429957 systemd[1]: Started systemd-journald.service - Journal Service.
Aug 13 01:45:28.438374 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Aug 13 01:45:28.447001 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Aug 13 01:45:28.456066 systemd[1]: Mounted media.mount - External Media Directory.
Aug 13 01:45:28.466198 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Aug 13 01:45:28.476179 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Aug 13 01:45:28.485144 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Aug 13 01:45:28.494243 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Aug 13 01:45:28.506242 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Aug 13 01:45:28.516255 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Aug 13 01:45:28.516391 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Aug 13 01:45:28.527194 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Aug 13 01:45:28.527350 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Aug 13 01:45:28.538345 systemd[1]: modprobe@drm.service: Deactivated successfully.
Aug 13 01:45:28.538552 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Aug 13 01:45:28.547471 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Aug 13 01:45:28.547761 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Aug 13 01:45:28.558779 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Aug 13 01:45:28.559258 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Aug 13 01:45:28.568893 systemd[1]: modprobe@loop.service: Deactivated successfully.
Aug 13 01:45:28.569430 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Aug 13 01:45:28.578946 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Aug 13 01:45:28.588951 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Aug 13 01:45:28.599901 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Aug 13 01:45:28.610988 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Aug 13 01:45:28.621896 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Aug 13 01:45:28.640851 systemd[1]: Reached target network-pre.target - Preparation for Network.
Aug 13 01:45:28.650854 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Aug 13 01:45:28.667233 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Aug 13 01:45:28.676086 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Aug 13 01:45:28.676111 systemd[1]: Reached target local-fs.target - Local File Systems.
Aug 13 01:45:28.677051 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Aug 13 01:45:28.696911 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Aug 13 01:45:28.706232 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Aug 13 01:45:28.724275 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Aug 13 01:45:28.743132 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Aug 13 01:45:28.752983 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Aug 13 01:45:28.759144 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Aug 13 01:45:28.761562 systemd-journald[1465]: Time spent on flushing to /var/log/journal/6468c303ef9e426696169beb4de2cfb6 is 13.044ms for 1419 entries.
Aug 13 01:45:28.761562 systemd-journald[1465]: System Journal (/var/log/journal/6468c303ef9e426696169beb4de2cfb6) is 8M, max 195.6M, 187.6M free.
Aug 13 01:45:28.784972 systemd-journald[1465]: Received client request to flush runtime journal.
Aug 13 01:45:28.776979 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Aug 13 01:45:28.783445 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Aug 13 01:45:28.792699 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Aug 13 01:45:28.803690 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Aug 13 01:45:28.813960 kernel: loop0: detected capacity change from 0 to 8
Aug 13 01:45:28.820470 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Aug 13 01:45:28.826871 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Aug 13 01:45:28.836584 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Aug 13 01:45:28.840201 systemd-tmpfiles[1504]: ACLs are not supported, ignoring.
Aug 13 01:45:28.840211 systemd-tmpfiles[1504]: ACLs are not supported, ignoring.
Aug 13 01:45:28.847262 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Aug 13 01:45:28.861908 kernel: loop1: detected capacity change from 0 to 113872
Aug 13 01:45:28.864237 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Aug 13 01:45:28.874116 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Aug 13 01:45:28.883102 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Aug 13 01:45:28.894423 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Aug 13 01:45:28.904694 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Aug 13 01:45:28.919871 kernel: loop2: detected capacity change from 0 to 146240
Aug 13 01:45:28.930081 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Aug 13 01:45:28.942194 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Aug 13 01:45:28.943013 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Aug 13 01:45:28.961781 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Aug 13 01:45:28.970969 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Aug 13 01:45:28.979871 kernel: loop3: detected capacity change from 0 to 224512
Aug 13 01:45:29.003980 systemd-tmpfiles[1522]: ACLs are not supported, ignoring.
Aug 13 01:45:29.003991 systemd-tmpfiles[1522]: ACLs are not supported, ignoring.
Aug 13 01:45:29.006241 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Aug 13 01:45:29.022871 kernel: loop4: detected capacity change from 0 to 8
Aug 13 01:45:29.029870 kernel: loop5: detected capacity change from 0 to 113872
Aug 13 01:45:29.051927 kernel: loop6: detected capacity change from 0 to 146240
Aug 13 01:45:29.075905 kernel: loop7: detected capacity change from 0 to 224512
Aug 13 01:45:29.088857 (sd-merge)[1526]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-packet'.
Aug 13 01:45:29.089147 (sd-merge)[1526]: Merged extensions into '/usr'.
Aug 13 01:45:29.091604 systemd[1]: Reload requested from client PID 1501 ('systemd-sysext') (unit systemd-sysext.service)...
Aug 13 01:45:29.091612 systemd[1]: Reloading...
Aug 13 01:45:29.116929 zram_generator::config[1553]: No configuration found.
Aug 13 01:45:29.124348 ldconfig[1495]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Aug 13 01:45:29.179914 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Aug 13 01:45:29.245155 systemd[1]: Reloading finished in 153 ms.
Aug 13 01:45:29.262830 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Aug 13 01:45:29.272273 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Aug 13 01:45:29.283271 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Aug 13 01:45:29.307341 systemd[1]: Starting ensure-sysext.service...
Aug 13 01:45:29.314005 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Aug 13 01:45:29.325062 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Aug 13 01:45:29.337752 systemd-tmpfiles[1611]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Aug 13 01:45:29.337786 systemd-tmpfiles[1611]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Aug 13 01:45:29.338092 systemd-tmpfiles[1611]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Aug 13 01:45:29.338394 systemd-tmpfiles[1611]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Aug 13 01:45:29.339358 systemd-tmpfiles[1611]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Aug 13 01:45:29.339676 systemd-tmpfiles[1611]: ACLs are not supported, ignoring.
Aug 13 01:45:29.339753 systemd-tmpfiles[1611]: ACLs are not supported, ignoring.
Aug 13 01:45:29.343357 systemd-tmpfiles[1611]: Detected autofs mount point /boot during canonicalization of boot.
Aug 13 01:45:29.343366 systemd-tmpfiles[1611]: Skipping /boot
Aug 13 01:45:29.350740 systemd[1]: Reload requested from client PID 1610 ('systemctl') (unit ensure-sysext.service)...
Aug 13 01:45:29.350757 systemd[1]: Reloading...
Aug 13 01:45:29.354891 systemd-tmpfiles[1611]: Detected autofs mount point /boot during canonicalization of boot.
Aug 13 01:45:29.354900 systemd-tmpfiles[1611]: Skipping /boot
Aug 13 01:45:29.369876 systemd-udevd[1612]: Using default interface naming scheme 'v255'.
Aug 13 01:45:29.382875 zram_generator::config[1639]: No configuration found.
Aug 13 01:45:29.431893 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input2
Aug 13 01:45:29.431962 kernel: ACPI: button: Sleep Button [SLPB]
Aug 13 01:45:29.445968 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
Aug 13 01:45:29.446026 kernel: IPMI message handler: version 39.2
Aug 13 01:45:29.450871 kernel: mousedev: PS/2 mouse device common for all mice
Aug 13 01:45:29.456875 kernel: ACPI: button: Power Button [PWRF]
Aug 13 01:45:29.472872 kernel: mei_me 0000:00:16.0: Device doesn't have valid ME Interface
Aug 13 01:45:29.473085 kernel: mei_me 0000:00:16.4: Device doesn't have valid ME Interface
Aug 13 01:45:29.474176 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Aug 13 01:45:29.491875 kernel: ipmi device interface
Aug 13 01:45:29.498897 kernel: i801_smbus 0000:00:1f.4: SPD Write Disable is set
Aug 13 01:45:29.499104 kernel: i801_smbus 0000:00:1f.4: SMBus using PCI interrupt
Aug 13 01:45:29.538722 kernel: ACPI: video: Video Device [GFX0] (multi-head: yes rom: no post: no)
Aug 13 01:45:29.538800 kernel: input: Video Bus as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0A08:00/LNXVIDEO:00/input/input4
Aug 13 01:45:29.547874 kernel: iTCO_vendor_support: vendor-support=0
Aug 13 01:45:29.557885 kernel: MACsec IEEE 802.1AE
Aug 13 01:45:29.557906 kernel: ipmi_si: IPMI System Interface driver
Aug 13 01:45:29.569059 kernel: ipmi_si dmi-ipmi-si.0: ipmi_platform: probing via SMBIOS
Aug 13 01:45:29.576618 kernel: ipmi_platform: ipmi_si: SMBIOS: io 0xca2 regsize 1 spacing 1 irq 0
Aug 13 01:45:29.582814 kernel: ipmi_si: Adding SMBIOS-specified kcs state machine
Aug 13 01:45:29.589132 kernel: ipmi_si IPI0001:00: ipmi_platform: probing via ACPI
Aug 13 01:45:29.590894 systemd[1]: Condition check resulted in dev-ttyS1.device - /dev/ttyS1 being skipped.
Aug 13 01:45:29.590992 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Micron_5300_MTFDDAK480TDT OEM.
Aug 13 01:45:29.597467 kernel: ipmi_si IPI0001:00: ipmi_platform: [io 0x0ca2] regsize 1 spacing 1 irq 0
Aug 13 01:45:29.615128 kernel: ipmi_si dmi-ipmi-si.0: Removing SMBIOS-specified kcs state machine in favor of ACPI
Aug 13 01:45:29.615250 kernel: ipmi_si: Adding ACPI-specified kcs state machine
Aug 13 01:45:29.626400 kernel: ipmi_si: Trying ACPI-specified kcs state machine at i/o address 0xca2, slave address 0x20, irq 0
Aug 13 01:45:29.632519 systemd[1]: Reloading finished in 281 ms.
Aug 13 01:45:29.643887 kernel: iTCO_wdt iTCO_wdt: unable to reset NO_REBOOT flag, device disabled by hardware/BIOS
Aug 13 01:45:29.659966 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Aug 13 01:45:29.679790 kernel: intel_rapl_common: Found RAPL domain package
Aug 13 01:45:29.679837 kernel: intel_rapl_common: Found RAPL domain core
Aug 13 01:45:29.683068 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Aug 13 01:45:29.691671 kernel: ipmi_si IPI0001:00: The BMC does not support clearing the recv irq bit, compensating, but the BMC needs to be fixed.
Aug 13 01:45:29.691794 kernel: intel_rapl_common: Found RAPL domain uncore
Aug 13 01:45:29.702582 kernel: intel_rapl_common: Found RAPL domain dram
Aug 13 01:45:29.728913 kernel: ipmi_si IPI0001:00: IPMI message handler: Found new BMC (man_id: 0x002a7c, prod_id: 0x1b11, dev_id: 0x20)
Aug 13 01:45:29.741655 systemd[1]: Finished ensure-sysext.service.
Aug 13 01:45:29.752928 systemd[1]: Reached target tpm2.target - Trusted Platform Module.
Aug 13 01:45:29.760955 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 13 01:45:29.761897 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Aug 13 01:45:29.813869 kernel: ipmi_si IPI0001:00: IPMI kcs interface initialized
Aug 13 01:45:30.020674 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Aug 13 01:45:30.033872 kernel: ipmi_ssif: IPMI SSIF Interface driver
Aug 13 01:45:30.033922 kernel: i915 0000:00:02.0: can't derive routing for PCI INT A
Aug 13 01:45:30.044531 augenrules[1836]: No rules
Aug 13 01:45:30.045661 kernel: i915 0000:00:02.0: PCI INT A: not connected
Aug 13 01:45:30.045822 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Aug 13 01:45:30.046477 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Aug 13 01:45:30.055454 kernel: i915 0000:00:02.0: [drm] Found COFFEELAKE (device ID 3e9a) display version 9.00 stepping N/A
Aug 13 01:45:30.056532 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Aug 13 01:45:30.070632 kernel: i915 0000:00:02.0: [drm] VT-d active for gfx access
Aug 13 01:45:30.070748 kernel: i915 0000:00:02.0: [drm] Using Transparent Hugepages
Aug 13 01:45:30.090464 kernel: i915 0000:00:02.0: ROM [??? 0x00000000 flags 0x20000000]: can't assign; bogus alignment
Aug 13 01:45:30.090565 kernel: i915 0000:00:02.0: [drm] Failed to find VBIOS tables (VBT)
Aug 13 01:45:30.103868 kernel: i915 0000:00:02.0: [drm] Finished loading DMC firmware i915/kbl_dmc_ver1_04.bin (v1.4)
Aug 13 01:45:30.109305 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Aug 13 01:45:30.121485 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Aug 13 01:45:30.129984 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Aug 13 01:45:30.130516 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Aug 13 01:45:30.139900 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Aug 13 01:45:30.140508 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Aug 13 01:45:30.150902 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Aug 13 01:45:30.152041 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Aug 13 01:45:30.159967 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Aug 13 01:45:30.178504 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Aug 13 01:45:30.203998 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Aug 13 01:45:30.213905 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 13 01:45:30.214662 systemd[1]: audit-rules.service: Deactivated successfully.
Aug 13 01:45:30.214802 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Aug 13 01:45:30.224264 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Aug 13 01:45:30.224447 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Aug 13 01:45:30.224554 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Aug 13 01:45:30.224725 systemd[1]: modprobe@drm.service: Deactivated successfully.
Aug 13 01:45:30.224831 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Aug 13 01:45:30.225013 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Aug 13 01:45:30.225112 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Aug 13 01:45:30.225281 systemd[1]: modprobe@loop.service: Deactivated successfully.
Aug 13 01:45:30.225388 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Aug 13 01:45:30.225570 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Aug 13 01:45:30.225795 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Aug 13 01:45:30.230311 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Aug 13 01:45:30.230390 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Aug 13 01:45:30.231258 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Aug 13 01:45:30.232198 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Aug 13 01:45:30.232227 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Aug 13 01:45:30.232454 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Aug 13 01:45:30.251614 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Aug 13 01:45:30.276683 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Aug 13 01:45:30.331558 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Aug 13 01:45:30.333050 systemd-resolved[1853]: Positive Trust Anchors:
Aug 13 01:45:30.333057 systemd-resolved[1853]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Aug 13 01:45:30.333086 systemd-resolved[1853]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Aug 13 01:45:30.336056 systemd-resolved[1853]: Using system hostname 'ci-4372.1.0-a-4296cabafa'.
Aug 13 01:45:30.336515 systemd-networkd[1852]: lo: Link UP
Aug 13 01:45:30.336519 systemd-networkd[1852]: lo: Gained carrier
Aug 13 01:45:30.340056 systemd-networkd[1852]: bond0: netdev ready
Aug 13 01:45:30.341267 systemd-networkd[1852]: Enumeration completed
Aug 13 01:45:30.341279 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Aug 13 01:45:30.342302 systemd-networkd[1852]: enp2s0f0np0: Configuring with /etc/systemd/network/10-04:3f:72:d9:a3:34.network.
Aug 13 01:45:30.350940 systemd[1]: Started systemd-networkd.service - Network Configuration.
Aug 13 01:45:30.360093 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Aug 13 01:45:30.371146 systemd[1]: Reached target network.target - Network.
Aug 13 01:45:30.377979 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Aug 13 01:45:30.387914 systemd[1]: Reached target sysinit.target - System Initialization.
Aug 13 01:45:30.395961 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Aug 13 01:45:30.405923 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Aug 13 01:45:30.415903 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Aug 13 01:45:30.426919 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Aug 13 01:45:30.436529 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Aug 13 01:45:30.436552 systemd[1]: Reached target paths.target - Path Units.
Aug 13 01:45:30.443905 systemd[1]: Reached target time-set.target - System Time Set.
Aug 13 01:45:30.453009 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Aug 13 01:45:30.462971 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Aug 13 01:45:30.471868 kernel: mlx5_core 0000:02:00.0 enp2s0f0np0: Link up
Aug 13 01:45:30.483868 kernel: bond0: (slave enp2s0f0np0): Enslaving as a backup interface with an up link
Aug 13 01:45:30.484585 systemd-networkd[1852]: enp2s0f1np1: Configuring with /etc/systemd/network/10-04:3f:72:d9:a3:35.network.
Aug 13 01:45:30.486907 systemd[1]: Reached target timers.target - Timer Units.
Aug 13 01:45:30.495829 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Aug 13 01:45:30.506765 systemd[1]: Starting docker.socket - Docker Socket for the API...
Aug 13 01:45:30.516378 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Aug 13 01:45:30.528032 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Aug 13 01:45:30.538144 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Aug 13 01:45:30.549683 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Aug 13 01:45:30.561502 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Aug 13 01:45:30.572241 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Aug 13 01:45:30.582546 systemd[1]: Reached target sockets.target - Socket Units.
Aug 13 01:45:30.591918 systemd[1]: Reached target basic.target - Basic System.
Aug 13 01:45:30.599936 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Aug 13 01:45:30.599953 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Aug 13 01:45:30.600561 systemd[1]: Starting containerd.service - containerd container runtime...
Aug 13 01:45:30.613869 kernel: mlx5_core 0000:02:00.1 enp2s0f1np1: Link up
Aug 13 01:45:30.624762 systemd-networkd[1852]: bond0: Configuring with /etc/systemd/network/05-bond0.network.
Aug 13 01:45:30.624868 kernel: bond0: (slave enp2s0f1np1): Enslaving as a backup interface with an up link
Aug 13 01:45:30.625799 systemd-networkd[1852]: enp2s0f0np0: Link UP
Aug 13 01:45:30.626009 systemd-networkd[1852]: enp2s0f0np0: Gained carrier
Aug 13 01:45:30.634867 kernel: bond0: Warning: No 802.3ad response from the link partner for any adapters in the bond
Aug 13 01:45:30.634993 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Aug 13 01:45:30.643290 systemd-networkd[1852]: enp2s0f1np1: Reconfiguring with /etc/systemd/network/10-04:3f:72:d9:a3:34.network.
Aug 13 01:45:30.643456 systemd-networkd[1852]: enp2s0f1np1: Link UP
Aug 13 01:45:30.643646 systemd-networkd[1852]: enp2s0f1np1: Gained carrier
Aug 13 01:45:30.644509 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Aug 13 01:45:30.653521 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Aug 13 01:45:30.658011 systemd-networkd[1852]: bond0: Link UP
Aug 13 01:45:30.658239 systemd-networkd[1852]: bond0: Gained carrier
Aug 13 01:45:30.658380 systemd-timesyncd[1854]: Network configuration changed, trying to establish connection.
Aug 13 01:45:30.658825 systemd-timesyncd[1854]: Network configuration changed, trying to establish connection.
Aug 13 01:45:30.659080 systemd-timesyncd[1854]: Network configuration changed, trying to establish connection.
Aug 13 01:45:30.659186 systemd-timesyncd[1854]: Network configuration changed, trying to establish connection.
Aug 13 01:45:30.659583 coreos-metadata[1893]: Aug 13 01:45:30.659 INFO Fetching https://metadata.packet.net/metadata: Attempt #1
Aug 13 01:45:30.679966 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Aug 13 01:45:30.701996 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Aug 13 01:45:30.703995 jq[1899]: false
Aug 13 01:45:30.711904 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Aug 13 01:45:30.722979 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Aug 13 01:45:30.727778 extend-filesystems[1900]: Found /dev/sdb6
Aug 13 01:45:30.750430 kernel: bond0: (slave enp2s0f0np0): link status definitely up, 10000 Mbps full duplex
Aug 13 01:45:30.750451 kernel: bond0: active interface up!
Aug 13 01:45:30.750477 kernel: EXT4-fs (sdb9): resizing filesystem from 553472 to 116605649 blocks
Aug 13 01:45:30.743213 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Aug 13 01:45:30.750539 extend-filesystems[1900]: Found /dev/sdb9
Aug 13 01:45:30.750539 extend-filesystems[1900]: Checking size of /dev/sdb9
Aug 13 01:45:30.750539 extend-filesystems[1900]: Resized partition /dev/sdb9
Aug 13 01:45:30.766136 oslogin_cache_refresh[1901]: Refreshing passwd entry cache
Aug 13 01:45:30.785106 extend-filesystems[1911]: resize2fs 1.47.2 (1-Jan-2025)
Aug 13 01:45:30.825091 kernel: i915 0000:00:02.0: [drm] [ENCODER:98:DDI A/PHY A] failed to retrieve link info, disabling eDP
Aug 13 01:45:30.825291 kernel: [drm] Initialized i915 1.6.0 for 0000:00:02.0 on minor 0
Aug 13 01:45:30.761993 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Aug 13 01:45:30.825375 google_oslogin_nss_cache[1901]: oslogin_cache_refresh[1901]: Refreshing passwd entry cache
Aug 13 01:45:30.777631 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Aug 13 01:45:30.793984 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Aug 13 01:45:30.827209 systemd[1]: Starting systemd-logind.service - User Login Management...
Aug 13 01:45:30.857869 kernel: bond0: (slave enp2s0f1np1): link status definitely up, 10000 Mbps full duplex
Aug 13 01:45:30.859393 systemd[1]: Starting tcsd.service - TCG Core Services Daemon...
Aug 13 01:45:30.868235 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Aug 13 01:45:30.868650 systemd[1]: Starting update-engine.service - Update Engine...
Aug 13 01:45:30.886502 systemd-logind[1926]: New seat seat0.
Aug 13 01:45:30.887920 systemd-logind[1926]: Watching system buttons on /dev/input/event3 (Power Button)
Aug 13 01:45:30.888383 systemd-logind[1926]: Watching system buttons on /dev/input/event2 (Sleep Button)
Aug 13 01:45:30.888400 systemd-logind[1926]: Watching system buttons on /dev/input/event0 (HID 0557:2419)
Aug 13 01:45:30.891165 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Aug 13 01:45:30.898161 update_engine[1931]: I20250813 01:45:30.898093 1931 main.cc:92] Flatcar Update Engine starting
Aug 13 01:45:30.901966 systemd[1]: Started systemd-logind.service - User Login Management.
Aug 13 01:45:30.903123 jq[1932]: true
Aug 13 01:45:30.911216 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Aug 13 01:45:30.922473 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Aug 13 01:45:30.932136 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Aug 13 01:45:30.932264 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Aug 13 01:45:30.932465 systemd[1]: motdgen.service: Deactivated successfully.
Aug 13 01:45:30.948027 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Aug 13 01:45:30.958568 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Aug 13 01:45:30.958700 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Aug 13 01:45:30.980709 (ntainerd)[1938]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Aug 13 01:45:30.982018 jq[1937]: true
Aug 13 01:45:30.993788 tar[1936]: linux-amd64/LICENSE
Aug 13 01:45:30.993950 tar[1936]: linux-amd64/helm
Aug 13 01:45:31.000174 systemd[1]: tcsd.service: Skipped due to 'exec-condition'.
Aug 13 01:45:31.000297 systemd[1]: Condition check resulted in tcsd.service - TCG Core Services Daemon being skipped.
Aug 13 01:45:31.029399 bash[1964]: Updated "/home/core/.ssh/authorized_keys"
Aug 13 01:45:31.030289 dbus-daemon[1894]: [system] SELinux support is enabled
Aug 13 01:45:31.030438 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Aug 13 01:45:31.032111 update_engine[1931]: I20250813 01:45:31.032060 1931 update_check_scheduler.cc:74] Next update check in 8m10s
Aug 13 01:45:31.040027 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Aug 13 01:45:31.051520 dbus-daemon[1894]: [system] Successfully activated service 'org.freedesktop.systemd1'
Aug 13 01:45:31.051999 systemd[1]: Starting sshkeys.service...
Aug 13 01:45:31.057930 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Aug 13 01:45:31.057949 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Aug 13 01:45:31.068941 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Aug 13 01:45:31.068954 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Aug 13 01:45:31.082073 systemd[1]: Started update-engine.service - Update Engine.
Aug 13 01:45:31.093166 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Aug 13 01:45:31.103635 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Aug 13 01:45:31.121188 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Aug 13 01:45:31.134548 coreos-metadata[1976]: Aug 13 01:45:31.134 INFO Fetching https://metadata.packet.net/metadata: Attempt #1
Aug 13 01:45:31.141314 sshd_keygen[1930]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Aug 13 01:45:31.150842 containerd[1938]: time="2025-08-13T01:45:31Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Aug 13 01:45:31.152354 containerd[1938]: time="2025-08-13T01:45:31.152336292Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4
Aug 13 01:45:31.155432 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Aug 13 01:45:31.158093 containerd[1938]: time="2025-08-13T01:45:31.158045619Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="6.12µs"
Aug 13 01:45:31.158093 containerd[1938]: time="2025-08-13T01:45:31.158065012Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Aug 13 01:45:31.158093 containerd[1938]: time="2025-08-13T01:45:31.158078476Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Aug 13 01:45:31.158199 containerd[1938]: time="2025-08-13T01:45:31.158160383Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Aug 13 01:45:31.158199 containerd[1938]: time="2025-08-13T01:45:31.158171439Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Aug 13 01:45:31.158199 containerd[1938]: time="2025-08-13T01:45:31.158186268Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Aug 13 01:45:31.158242 containerd[1938]: time="2025-08-13T01:45:31.158217630Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Aug 13 01:45:31.158242 containerd[1938]: time="2025-08-13T01:45:31.158225571Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Aug 13 01:45:31.158389 containerd[1938]: time="2025-08-13T01:45:31.158350381Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Aug 13 01:45:31.158389 containerd[1938]: time="2025-08-13T01:45:31.158358715Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Aug 13 01:45:31.158389 containerd[1938]: time="2025-08-13T01:45:31.158364322Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Aug 13 01:45:31.158389 containerd[1938]: time="2025-08-13T01:45:31.158368802Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Aug 13 01:45:31.158450 containerd[1938]: time="2025-08-13T01:45:31.158411178Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Aug 13 01:45:31.158552 containerd[1938]: time="2025-08-13T01:45:31.158518703Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Aug 13 01:45:31.158552 containerd[1938]: time="2025-08-13T01:45:31.158534738Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Aug 13 01:45:31.158552 containerd[1938]: time="2025-08-13T01:45:31.158541466Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Aug 13 01:45:31.158600 containerd[1938]: time="2025-08-13T01:45:31.158558637Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Aug 13 01:45:31.158698 containerd[1938]: time="2025-08-13T01:45:31.158690426Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Aug 13 01:45:31.158727 containerd[1938]: time="2025-08-13T01:45:31.158721027Z" level=info msg="metadata content store policy set" policy=shared
Aug 13 01:45:31.165171 systemd[1]: Starting issuegen.service - Generate /run/issue...
Aug 13 01:45:31.169634 locksmithd[1977]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Aug 13 01:45:31.172824 containerd[1938]: time="2025-08-13T01:45:31.172804258Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Aug 13 01:45:31.172865 containerd[1938]: time="2025-08-13T01:45:31.172840214Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Aug 13 01:45:31.172865 containerd[1938]: time="2025-08-13T01:45:31.172849254Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Aug 13 01:45:31.172865 containerd[1938]: time="2025-08-13T01:45:31.172856043Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Aug 13 01:45:31.172921 containerd[1938]: time="2025-08-13T01:45:31.172868146Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Aug 13 01:45:31.172921 containerd[1938]: time="2025-08-13T01:45:31.172875066Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Aug 13 01:45:31.172921 containerd[1938]: time="2025-08-13T01:45:31.172882152Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Aug 13 01:45:31.172921 containerd[1938]: time="2025-08-13T01:45:31.172888761Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Aug 13 01:45:31.172921 containerd[1938]: time="2025-08-13T01:45:31.172896383Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Aug 13 01:45:31.172921 containerd[1938]: time="2025-08-13T01:45:31.172907224Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Aug 13 01:45:31.172921 containerd[1938]: 
time="2025-08-13T01:45:31.172912938Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Aug 13 01:45:31.172921 containerd[1938]: time="2025-08-13T01:45:31.172920987Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Aug 13 01:45:31.173046 containerd[1938]: time="2025-08-13T01:45:31.172989565Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Aug 13 01:45:31.173046 containerd[1938]: time="2025-08-13T01:45:31.173001124Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Aug 13 01:45:31.173046 containerd[1938]: time="2025-08-13T01:45:31.173009588Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Aug 13 01:45:31.173046 containerd[1938]: time="2025-08-13T01:45:31.173015485Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Aug 13 01:45:31.173046 containerd[1938]: time="2025-08-13T01:45:31.173020996Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Aug 13 01:45:31.173046 containerd[1938]: time="2025-08-13T01:45:31.173027185Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Aug 13 01:45:31.173046 containerd[1938]: time="2025-08-13T01:45:31.173034029Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Aug 13 01:45:31.173046 containerd[1938]: time="2025-08-13T01:45:31.173039901Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Aug 13 01:45:31.173046 containerd[1938]: time="2025-08-13T01:45:31.173047691Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Aug 13 01:45:31.173171 containerd[1938]: time="2025-08-13T01:45:31.173057252Z" level=info msg="loading plugin" 
id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Aug 13 01:45:31.173171 containerd[1938]: time="2025-08-13T01:45:31.173063788Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Aug 13 01:45:31.173171 containerd[1938]: time="2025-08-13T01:45:31.173101160Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Aug 13 01:45:31.173171 containerd[1938]: time="2025-08-13T01:45:31.173109070Z" level=info msg="Start snapshots syncer" Aug 13 01:45:31.173171 containerd[1938]: time="2025-08-13T01:45:31.173122721Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Aug 13 01:45:31.173280 containerd[1938]: time="2025-08-13T01:45:31.173260788Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController
\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Aug 13 01:45:31.173358 containerd[1938]: time="2025-08-13T01:45:31.173290563Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Aug 13 01:45:31.173664 containerd[1938]: time="2025-08-13T01:45:31.173653978Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Aug 13 01:45:31.173716 containerd[1938]: time="2025-08-13T01:45:31.173707385Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Aug 13 01:45:31.173734 containerd[1938]: time="2025-08-13T01:45:31.173720872Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Aug 13 01:45:31.173734 containerd[1938]: time="2025-08-13T01:45:31.173728364Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Aug 13 01:45:31.173764 containerd[1938]: time="2025-08-13T01:45:31.173734085Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Aug 13 01:45:31.173764 containerd[1938]: time="2025-08-13T01:45:31.173742137Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Aug 13 01:45:31.173764 containerd[1938]: time="2025-08-13T01:45:31.173747750Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks 
type=io.containerd.grpc.v1 Aug 13 01:45:31.173764 containerd[1938]: time="2025-08-13T01:45:31.173753716Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Aug 13 01:45:31.173834 containerd[1938]: time="2025-08-13T01:45:31.173766287Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Aug 13 01:45:31.173834 containerd[1938]: time="2025-08-13T01:45:31.173772680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Aug 13 01:45:31.173834 containerd[1938]: time="2025-08-13T01:45:31.173778674Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Aug 13 01:45:31.173834 containerd[1938]: time="2025-08-13T01:45:31.173793637Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Aug 13 01:45:31.173834 containerd[1938]: time="2025-08-13T01:45:31.173803642Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Aug 13 01:45:31.173834 containerd[1938]: time="2025-08-13T01:45:31.173808563Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Aug 13 01:45:31.173834 containerd[1938]: time="2025-08-13T01:45:31.173814366Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Aug 13 01:45:31.173834 containerd[1938]: time="2025-08-13T01:45:31.173818774Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Aug 13 01:45:31.173834 containerd[1938]: time="2025-08-13T01:45:31.173827856Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Aug 13 01:45:31.173834 containerd[1938]: 
time="2025-08-13T01:45:31.173834369Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Aug 13 01:45:31.173978 containerd[1938]: time="2025-08-13T01:45:31.173843877Z" level=info msg="runtime interface created" Aug 13 01:45:31.173978 containerd[1938]: time="2025-08-13T01:45:31.173846910Z" level=info msg="created NRI interface" Aug 13 01:45:31.173978 containerd[1938]: time="2025-08-13T01:45:31.173851390Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Aug 13 01:45:31.173978 containerd[1938]: time="2025-08-13T01:45:31.173857337Z" level=info msg="Connect containerd service" Aug 13 01:45:31.173978 containerd[1938]: time="2025-08-13T01:45:31.173875603Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Aug 13 01:45:31.174255 containerd[1938]: time="2025-08-13T01:45:31.174241695Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Aug 13 01:45:31.191165 systemd[1]: issuegen.service: Deactivated successfully. Aug 13 01:45:31.191336 systemd[1]: Finished issuegen.service - Generate /run/issue. Aug 13 01:45:31.201383 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Aug 13 01:45:31.226490 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Aug 13 01:45:31.237349 systemd[1]: Started getty@tty1.service - Getty on tty1. Aug 13 01:45:31.245957 systemd[1]: Started serial-getty@ttyS1.service - Serial Getty on ttyS1. Aug 13 01:45:31.256147 systemd[1]: Reached target getty.target - Login Prompts. 
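The containerd records above are logfmt-style `key=value` pairs (`time=`, `level=`, `msg=`, `id=`, `type=`). A minimal sketch of splitting one such record into a field map, assuming only the double-quoting visible in the log (Python's `shlex` handles the quoted values):

```python
import shlex

# One containerd record from the boot log above, minus the journal prefix.
rec = 'time="2025-08-13T01:45:31.173843877Z" level=info msg="runtime interface created"'

# shlex.split respects the double quotes, yielding key=value tokens;
# splitting each token once on "=" recovers the field map.
fields = dict(tok.split("=", 1) for tok in shlex.split(rec))
print(fields["level"], "-", fields["msg"])
```

This naive split is enough for the records shown here; a production parser would also have to cope with bare values containing `=` and with escaped quotes.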
Aug 13 01:45:31.274553 containerd[1938]: time="2025-08-13T01:45:31.274528372Z" level=info msg="Start subscribing containerd event" Aug 13 01:45:31.274629 containerd[1938]: time="2025-08-13T01:45:31.274558378Z" level=info msg="Start recovering state" Aug 13 01:45:31.274629 containerd[1938]: time="2025-08-13T01:45:31.274605152Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Aug 13 01:45:31.274629 containerd[1938]: time="2025-08-13T01:45:31.274622619Z" level=info msg="Start event monitor" Aug 13 01:45:31.274629 containerd[1938]: time="2025-08-13T01:45:31.274631748Z" level=info msg="Start cni network conf syncer for default" Aug 13 01:45:31.274707 containerd[1938]: time="2025-08-13T01:45:31.274633890Z" level=info msg=serving... address=/run/containerd/containerd.sock Aug 13 01:45:31.274707 containerd[1938]: time="2025-08-13T01:45:31.274641256Z" level=info msg="Start streaming server" Aug 13 01:45:31.274707 containerd[1938]: time="2025-08-13T01:45:31.274671522Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Aug 13 01:45:31.274707 containerd[1938]: time="2025-08-13T01:45:31.274675969Z" level=info msg="runtime interface starting up..." Aug 13 01:45:31.274707 containerd[1938]: time="2025-08-13T01:45:31.274679317Z" level=info msg="starting plugins..." Aug 13 01:45:31.274707 containerd[1938]: time="2025-08-13T01:45:31.274687793Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Aug 13 01:45:31.274870 containerd[1938]: time="2025-08-13T01:45:31.274752860Z" level=info msg="containerd successfully booted in 0.124139s" Aug 13 01:45:31.274803 systemd[1]: Started containerd.service - containerd container runtime. Aug 13 01:45:31.280923 tar[1936]: linux-amd64/README.md Aug 13 01:45:31.306944 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. 
Aug 13 01:45:31.308914 kernel: EXT4-fs (sdb9): resized filesystem to 116605649 Aug 13 01:45:31.333518 extend-filesystems[1911]: Filesystem at /dev/sdb9 is mounted on /; on-line resizing required Aug 13 01:45:31.333518 extend-filesystems[1911]: old_desc_blocks = 1, new_desc_blocks = 56 Aug 13 01:45:31.333518 extend-filesystems[1911]: The filesystem on /dev/sdb9 is now 116605649 (4k) blocks long. Aug 13 01:45:31.370941 extend-filesystems[1900]: Resized filesystem in /dev/sdb9 Aug 13 01:45:31.333969 systemd[1]: extend-filesystems.service: Deactivated successfully. Aug 13 01:45:31.334096 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Aug 13 01:45:31.912234 systemd-timesyncd[1854]: Network configuration changed, trying to establish connection. Aug 13 01:45:32.040259 systemd-networkd[1852]: bond0: Gained IPv6LL Aug 13 01:45:32.041651 systemd-timesyncd[1854]: Network configuration changed, trying to establish connection. Aug 13 01:45:32.046923 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Aug 13 01:45:32.058827 systemd[1]: Reached target network-online.target - Network is Online. Aug 13 01:45:32.072378 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 01:45:32.091252 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Aug 13 01:45:32.112239 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Aug 13 01:45:32.793718 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Aug 13 01:45:32.819069 (kubelet)[2051]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 13 01:45:33.222939 kubelet[2051]: E0813 01:45:33.222855 2051 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 13 01:45:33.224027 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 13 01:45:33.224110 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 13 01:45:33.224290 systemd[1]: kubelet.service: Consumed 595ms CPU time, 276.9M memory peak. Aug 13 01:45:33.438681 kernel: mlx5_core 0000:02:00.0: lag map: port 1:1 port 2:2 Aug 13 01:45:33.438827 kernel: mlx5_core 0000:02:00.0: shared_fdb:0 mode:queue_affinity Aug 13 01:45:34.378544 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Aug 13 01:45:34.387838 systemd[1]: Started sshd@0-147.75.71.211:22-139.178.89.65:55312.service - OpenSSH per-connection server daemon (139.178.89.65:55312). Aug 13 01:45:34.467593 sshd[2072]: Accepted publickey for core from 139.178.89.65 port 55312 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI Aug 13 01:45:34.468279 sshd-session[2072]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 01:45:34.476159 systemd-logind[1926]: New session 1 of user core. Aug 13 01:45:34.477020 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Aug 13 01:45:34.486859 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Aug 13 01:45:34.516767 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. 
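The kubelet exit above is caused by a missing `/var/lib/kubelet/config.yaml`. As a hedged sketch, the offending path can be pulled out of such an error record, assuming the `path: <file>,` wording that appears in the log:

```python
import re

# kubelet error text from the record above (journal prefix omitted, abridged).
err = ('E0813 01:45:33.222855 2051 run.go:72] "command failed" '
       'err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, '
       'error: failed to load Kubelet config file /var/lib/kubelet/config.yaml"')

# The message names the file after "path: " and before the next comma.
m = re.search(r"path: ([^,]+),", err)
missing = m.group(1) if m else None
print(missing)
```

On a node that has not yet been joined with kubeadm this failure is expected: the file is written during `kubeadm init`/`kubeadm join`, and systemd keeps restarting the unit until then (the restart at 01:45:43 below fails the same way).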
Aug 13 01:45:34.528536 systemd[1]: Starting user@500.service - User Manager for UID 500... Aug 13 01:45:34.552426 (systemd)[2076]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Aug 13 01:45:34.555290 systemd-logind[1926]: New session c1 of user core. Aug 13 01:45:34.676680 systemd[2076]: Queued start job for default target default.target. Aug 13 01:45:34.690559 systemd[2076]: Created slice app.slice - User Application Slice. Aug 13 01:45:34.690592 systemd[2076]: Reached target paths.target - Paths. Aug 13 01:45:34.690614 systemd[2076]: Reached target timers.target - Timers. Aug 13 01:45:34.691277 systemd[2076]: Starting dbus.socket - D-Bus User Message Bus Socket... Aug 13 01:45:34.696917 systemd[2076]: Listening on dbus.socket - D-Bus User Message Bus Socket. Aug 13 01:45:34.696945 systemd[2076]: Reached target sockets.target - Sockets. Aug 13 01:45:34.696968 systemd[2076]: Reached target basic.target - Basic System. Aug 13 01:45:34.696992 systemd[2076]: Reached target default.target - Main User Target. Aug 13 01:45:34.697009 systemd[2076]: Startup finished in 134ms. Aug 13 01:45:34.697071 systemd[1]: Started user@500.service - User Manager for UID 500. Aug 13 01:45:34.707029 systemd[1]: Started session-1.scope - Session 1 of User core. Aug 13 01:45:34.769512 google_oslogin_nss_cache[1901]: oslogin_cache_refresh[1901]: Failure getting users, quitting Aug 13 01:45:34.769512 google_oslogin_nss_cache[1901]: oslogin_cache_refresh[1901]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Aug 13 01:45:34.769497 oslogin_cache_refresh[1901]: Failure getting users, quitting Aug 13 01:45:34.770175 google_oslogin_nss_cache[1901]: oslogin_cache_refresh[1901]: Refreshing group entry cache Aug 13 01:45:34.769518 oslogin_cache_refresh[1901]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. 
Aug 13 01:45:34.769567 oslogin_cache_refresh[1901]: Refreshing group entry cache Aug 13 01:45:34.770374 google_oslogin_nss_cache[1901]: oslogin_cache_refresh[1901]: Failure getting groups, quitting Aug 13 01:45:34.770374 google_oslogin_nss_cache[1901]: oslogin_cache_refresh[1901]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Aug 13 01:45:34.770367 oslogin_cache_refresh[1901]: Failure getting groups, quitting Aug 13 01:45:34.770379 oslogin_cache_refresh[1901]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Aug 13 01:45:34.776682 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Aug 13 01:45:34.776910 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Aug 13 01:45:34.788667 systemd[1]: Started sshd@1-147.75.71.211:22-139.178.89.65:55320.service - OpenSSH per-connection server daemon (139.178.89.65:55320). Aug 13 01:45:34.827516 coreos-metadata[1976]: Aug 13 01:45:34.827 INFO Fetch successful Aug 13 01:45:34.847896 sshd[2088]: Accepted publickey for core from 139.178.89.65 port 55320 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI Aug 13 01:45:34.849252 sshd-session[2088]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 01:45:34.854913 systemd-logind[1926]: New session 2 of user core. Aug 13 01:45:34.865108 systemd[1]: Started session-2.scope - Session 2 of User core. Aug 13 01:45:34.889297 unknown[1976]: wrote ssh authorized keys file for user: core Aug 13 01:45:34.915997 update-ssh-keys[2091]: Updated "/home/core/.ssh/authorized_keys" Aug 13 01:45:34.916318 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Aug 13 01:45:34.928779 systemd[1]: Finished sshkeys.service. 
Aug 13 01:45:34.936542 sshd[2090]: Connection closed by 139.178.89.65 port 55320 Aug 13 01:45:34.936724 sshd-session[2088]: pam_unix(sshd:session): session closed for user core Aug 13 01:45:34.950557 systemd[1]: sshd@1-147.75.71.211:22-139.178.89.65:55320.service: Deactivated successfully. Aug 13 01:45:34.951635 systemd[1]: session-2.scope: Deactivated successfully. Aug 13 01:45:34.952347 systemd-logind[1926]: Session 2 logged out. Waiting for processes to exit. Aug 13 01:45:34.954018 systemd[1]: Started sshd@2-147.75.71.211:22-139.178.89.65:55322.service - OpenSSH per-connection server daemon (139.178.89.65:55322). Aug 13 01:45:34.964774 systemd-logind[1926]: Removed session 2. Aug 13 01:45:35.002833 sshd[2100]: Accepted publickey for core from 139.178.89.65 port 55322 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI Aug 13 01:45:35.003467 sshd-session[2100]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 01:45:35.006546 systemd-logind[1926]: New session 3 of user core. Aug 13 01:45:35.023078 systemd[1]: Started session-3.scope - Session 3 of User core. Aug 13 01:45:35.086374 sshd[2102]: Connection closed by 139.178.89.65 port 55322 Aug 13 01:45:35.086544 sshd-session[2100]: pam_unix(sshd:session): session closed for user core Aug 13 01:45:35.088156 systemd[1]: sshd@2-147.75.71.211:22-139.178.89.65:55322.service: Deactivated successfully. Aug 13 01:45:35.089079 systemd[1]: session-3.scope: Deactivated successfully. Aug 13 01:45:35.089939 systemd-logind[1926]: Session 3 logged out. Waiting for processes to exit. Aug 13 01:45:35.090707 systemd-logind[1926]: Removed session 3. Aug 13 01:45:36.061519 coreos-metadata[1893]: Aug 13 01:45:36.061 INFO Fetch successful Aug 13 01:45:36.167964 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Aug 13 01:45:36.179173 systemd[1]: Starting packet-phone-home.service - Report Success to Packet... 
Aug 13 01:45:36.337060 login[2018]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Aug 13 01:45:36.338454 login[2019]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Aug 13 01:45:36.351998 systemd-logind[1926]: New session 5 of user core. Aug 13 01:45:36.367343 systemd[1]: Started session-5.scope - Session 5 of User core. Aug 13 01:45:36.375266 systemd-logind[1926]: New session 4 of user core. Aug 13 01:45:36.386316 systemd[1]: Started session-4.scope - Session 4 of User core. Aug 13 01:45:36.613385 systemd[1]: Finished packet-phone-home.service - Report Success to Packet. Aug 13 01:45:36.615027 systemd[1]: Reached target multi-user.target - Multi-User System. Aug 13 01:45:36.615568 systemd[1]: Startup finished in 5.548s (kernel) + 25.381s (initrd) + 10.013s (userspace) = 40.943s. Aug 13 01:45:37.339789 systemd-timesyncd[1854]: Network configuration changed, trying to establish connection. Aug 13 01:45:43.371776 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Aug 13 01:45:43.374793 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 01:45:43.636173 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
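The `Startup finished` record above breaks the boot into kernel, initrd, and userspace phases. Summing the printed components gives 40.942 s against the reported 40.943 s; the 1 ms gap is consistent with each phase having been rounded independently from higher-precision counters. A quick check:

```python
# Boot phases as printed in the log above (seconds, rounded to 1 ms).
phases = {"kernel": 5.548, "initrd": 25.381, "userspace": 10.013}
reported_total = 40.943

total = round(sum(phases.values()), 3)
print(total)  # 40.942 from the rounded parts
# The 1 ms difference from the reported 40.943 s is plausibly just
# per-phase rounding, not a measurement inconsistency.
assert abs(total - reported_total) <= 0.002
```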
Aug 13 01:45:43.638333 (kubelet)[2148]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 13 01:45:43.660980 kubelet[2148]: E0813 01:45:43.660928 2148 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 13 01:45:43.662885 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 13 01:45:43.662974 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 13 01:45:43.663145 systemd[1]: kubelet.service: Consumed 157ms CPU time, 115.1M memory peak. Aug 13 01:45:45.108730 systemd[1]: Started sshd@3-147.75.71.211:22-139.178.89.65:34586.service - OpenSSH per-connection server daemon (139.178.89.65:34586). Aug 13 01:45:45.152004 sshd[2166]: Accepted publickey for core from 139.178.89.65 port 34586 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI Aug 13 01:45:45.152732 sshd-session[2166]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 01:45:45.155925 systemd-logind[1926]: New session 6 of user core. Aug 13 01:45:45.168114 systemd[1]: Started session-6.scope - Session 6 of User core. Aug 13 01:45:45.223171 sshd[2168]: Connection closed by 139.178.89.65 port 34586 Aug 13 01:45:45.223358 sshd-session[2166]: pam_unix(sshd:session): session closed for user core Aug 13 01:45:45.236193 systemd[1]: sshd@3-147.75.71.211:22-139.178.89.65:34586.service: Deactivated successfully. Aug 13 01:45:45.237069 systemd[1]: session-6.scope: Deactivated successfully. Aug 13 01:45:45.237604 systemd-logind[1926]: Session 6 logged out. Waiting for processes to exit. 
Aug 13 01:45:45.239197 systemd[1]: Started sshd@4-147.75.71.211:22-139.178.89.65:34592.service - OpenSSH per-connection server daemon (139.178.89.65:34592). Aug 13 01:45:45.239620 systemd-logind[1926]: Removed session 6. Aug 13 01:45:45.279338 sshd[2174]: Accepted publickey for core from 139.178.89.65 port 34592 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI Aug 13 01:45:45.280030 sshd-session[2174]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 01:45:45.282853 systemd-logind[1926]: New session 7 of user core. Aug 13 01:45:45.300106 systemd[1]: Started session-7.scope - Session 7 of User core. Aug 13 01:45:45.351010 sshd[2176]: Connection closed by 139.178.89.65 port 34592 Aug 13 01:45:45.351161 sshd-session[2174]: pam_unix(sshd:session): session closed for user core Aug 13 01:45:45.377342 systemd[1]: sshd@4-147.75.71.211:22-139.178.89.65:34592.service: Deactivated successfully. Aug 13 01:45:45.381497 systemd[1]: session-7.scope: Deactivated successfully. Aug 13 01:45:45.383928 systemd-logind[1926]: Session 7 logged out. Waiting for processes to exit. Aug 13 01:45:45.390331 systemd[1]: Started sshd@5-147.75.71.211:22-139.178.89.65:34600.service - OpenSSH per-connection server daemon (139.178.89.65:34600). Aug 13 01:45:45.392208 systemd-logind[1926]: Removed session 7. Aug 13 01:45:45.435601 sshd[2182]: Accepted publickey for core from 139.178.89.65 port 34600 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI Aug 13 01:45:45.436326 sshd-session[2182]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 01:45:45.439559 systemd-logind[1926]: New session 8 of user core. Aug 13 01:45:45.455115 systemd[1]: Started session-8.scope - Session 8 of User core. 
Aug 13 01:45:45.519770 sshd[2185]: Connection closed by 139.178.89.65 port 34600 Aug 13 01:45:45.520612 sshd-session[2182]: pam_unix(sshd:session): session closed for user core Aug 13 01:45:45.541940 systemd[1]: sshd@5-147.75.71.211:22-139.178.89.65:34600.service: Deactivated successfully. Aug 13 01:45:45.545914 systemd[1]: session-8.scope: Deactivated successfully. Aug 13 01:45:45.548238 systemd-logind[1926]: Session 8 logged out. Waiting for processes to exit. Aug 13 01:45:45.554629 systemd[1]: Started sshd@6-147.75.71.211:22-139.178.89.65:34614.service - OpenSSH per-connection server daemon (139.178.89.65:34614). Aug 13 01:45:45.556529 systemd-logind[1926]: Removed session 8. Aug 13 01:45:45.640393 sshd[2191]: Accepted publickey for core from 139.178.89.65 port 34614 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI Aug 13 01:45:45.641372 sshd-session[2191]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 01:45:45.645573 systemd-logind[1926]: New session 9 of user core. Aug 13 01:45:45.655120 systemd[1]: Started session-9.scope - Session 9 of User core. Aug 13 01:45:45.722121 sudo[2194]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Aug 13 01:45:45.722278 sudo[2194]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 13 01:45:45.736359 sudo[2194]: pam_unix(sudo:session): session closed for user root Aug 13 01:45:45.737330 sshd[2193]: Connection closed by 139.178.89.65 port 34614 Aug 13 01:45:45.737533 sshd-session[2191]: pam_unix(sshd:session): session closed for user core Aug 13 01:45:45.749646 systemd[1]: sshd@6-147.75.71.211:22-139.178.89.65:34614.service: Deactivated successfully. Aug 13 01:45:45.750712 systemd[1]: session-9.scope: Deactivated successfully. Aug 13 01:45:45.751368 systemd-logind[1926]: Session 9 logged out. Waiting for processes to exit. 
Aug 13 01:45:45.753135 systemd[1]: Started sshd@7-147.75.71.211:22-139.178.89.65:34620.service - OpenSSH per-connection server daemon (139.178.89.65:34620). Aug 13 01:45:45.753478 systemd-logind[1926]: Removed session 9. Aug 13 01:45:45.793371 sshd[2200]: Accepted publickey for core from 139.178.89.65 port 34620 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI Aug 13 01:45:45.794217 sshd-session[2200]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 01:45:45.797702 systemd-logind[1926]: New session 10 of user core. Aug 13 01:45:45.817193 systemd[1]: Started session-10.scope - Session 10 of User core. Aug 13 01:45:45.872011 sudo[2204]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Aug 13 01:45:45.872235 sudo[2204]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 13 01:45:45.874900 sudo[2204]: pam_unix(sudo:session): session closed for user root Aug 13 01:45:45.877645 sudo[2203]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Aug 13 01:45:45.877810 sudo[2203]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 13 01:45:45.883738 systemd[1]: Starting audit-rules.service - Load Audit Rules... Aug 13 01:45:45.923468 augenrules[2226]: No rules Aug 13 01:45:45.923845 systemd[1]: audit-rules.service: Deactivated successfully. Aug 13 01:45:45.923991 systemd[1]: Finished audit-rules.service - Load Audit Rules. Aug 13 01:45:45.924504 sudo[2203]: pam_unix(sudo:session): session closed for user root Aug 13 01:45:45.925530 sshd[2202]: Connection closed by 139.178.89.65 port 34620 Aug 13 01:45:45.925685 sshd-session[2200]: pam_unix(sshd:session): session closed for user core Aug 13 01:45:45.936188 systemd[1]: sshd@7-147.75.71.211:22-139.178.89.65:34620.service: Deactivated successfully. 
Aug 13 01:45:45.937046 systemd[1]: session-10.scope: Deactivated successfully. Aug 13 01:45:45.937623 systemd-logind[1926]: Session 10 logged out. Waiting for processes to exit. Aug 13 01:45:45.938896 systemd[1]: Started sshd@8-147.75.71.211:22-139.178.89.65:34626.service - OpenSSH per-connection server daemon (139.178.89.65:34626). Aug 13 01:45:45.939591 systemd-logind[1926]: Removed session 10. Aug 13 01:45:45.973745 sshd[2235]: Accepted publickey for core from 139.178.89.65 port 34626 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI Aug 13 01:45:45.974633 sshd-session[2235]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 01:45:45.978322 systemd-logind[1926]: New session 11 of user core. Aug 13 01:45:45.998244 systemd[1]: Started session-11.scope - Session 11 of User core. Aug 13 01:45:46.052603 sudo[2238]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Aug 13 01:45:46.052759 sudo[2238]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 13 01:45:46.349419 systemd[1]: Starting docker.service - Docker Application Container Engine... Aug 13 01:45:46.363248 (dockerd)[2264]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Aug 13 01:45:46.584047 dockerd[2264]: time="2025-08-13T01:45:46.583995143Z" level=info msg="Starting up" Aug 13 01:45:46.584753 dockerd[2264]: time="2025-08-13T01:45:46.584713353Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Aug 13 01:45:46.627410 dockerd[2264]: time="2025-08-13T01:45:46.627366787Z" level=info msg="Loading containers: start." Aug 13 01:45:46.639924 kernel: Initializing XFRM netlink socket Aug 13 01:45:46.778268 systemd-timesyncd[1854]: Network configuration changed, trying to establish connection. 
Aug 13 01:45:46.821288 systemd-networkd[1852]: docker0: Link UP Aug 13 01:45:46.822914 dockerd[2264]: time="2025-08-13T01:45:46.822897713Z" level=info msg="Loading containers: done." Aug 13 01:45:46.829784 dockerd[2264]: time="2025-08-13T01:45:46.829765254Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Aug 13 01:45:46.829855 dockerd[2264]: time="2025-08-13T01:45:46.829803255Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 Aug 13 01:45:46.829884 dockerd[2264]: time="2025-08-13T01:45:46.829856346Z" level=info msg="Initializing buildkit" Aug 13 01:45:46.841433 dockerd[2264]: time="2025-08-13T01:45:46.841422474Z" level=info msg="Completed buildkit initialization" Aug 13 01:45:46.844802 dockerd[2264]: time="2025-08-13T01:45:46.844760776Z" level=info msg="Daemon has completed initialization" Aug 13 01:45:46.844802 dockerd[2264]: time="2025-08-13T01:45:46.844787269Z" level=info msg="API listen on /run/docker.sock" Aug 13 01:45:46.844909 systemd[1]: Started docker.service - Docker Application Container Engine. Aug 13 01:45:47.185102 systemd-timesyncd[1854]: Contacted time server [2607:f710:35::29c:0:7]:123 (2.flatcar.pool.ntp.org). Aug 13 01:45:47.185142 systemd-timesyncd[1854]: Initial clock synchronization to Wed 2025-08-13 01:45:47.539968 UTC. Aug 13 01:45:47.655544 containerd[1938]: time="2025-08-13T01:45:47.655523205Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.7\"" Aug 13 01:45:48.274270 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3966357639.mount: Deactivated successfully. 
Aug 13 01:45:49.091192 containerd[1938]: time="2025-08-13T01:45:49.091166418Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:45:49.091430 containerd[1938]: time="2025-08-13T01:45:49.091365468Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.7: active requests=0, bytes read=28799994" Aug 13 01:45:49.091673 containerd[1938]: time="2025-08-13T01:45:49.091662218Z" level=info msg="ImageCreate event name:\"sha256:761ae2258f1825c2079bd41bcc1da2c9bda8b5e902aa147c14896491dfca0f16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:45:49.093045 containerd[1938]: time="2025-08-13T01:45:49.093032412Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:e04f6223d52f8041c46ef4545ccaf07894b1ca5851506a9142706d4206911f64\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:45:49.093560 containerd[1938]: time="2025-08-13T01:45:49.093546662Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.7\" with image id \"sha256:761ae2258f1825c2079bd41bcc1da2c9bda8b5e902aa147c14896491dfca0f16\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.7\", repo digest \"registry.k8s.io/kube-apiserver@sha256:e04f6223d52f8041c46ef4545ccaf07894b1ca5851506a9142706d4206911f64\", size \"28796794\" in 1.438000797s" Aug 13 01:45:49.093586 containerd[1938]: time="2025-08-13T01:45:49.093568978Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.7\" returns image reference \"sha256:761ae2258f1825c2079bd41bcc1da2c9bda8b5e902aa147c14896491dfca0f16\"" Aug 13 01:45:49.093963 containerd[1938]: time="2025-08-13T01:45:49.093923733Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.7\"" Aug 13 01:45:50.066438 containerd[1938]: time="2025-08-13T01:45:50.066412654Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.7\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:45:50.066657 containerd[1938]: time="2025-08-13T01:45:50.066644202Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.7: active requests=0, bytes read=24783636" Aug 13 01:45:50.067066 containerd[1938]: time="2025-08-13T01:45:50.067055859Z" level=info msg="ImageCreate event name:\"sha256:87f922d0bde0db7ffcb2174ba37bdab8fdd169a41e1882fe5aa308bb57e44fda\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:45:50.068313 containerd[1938]: time="2025-08-13T01:45:50.068265538Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:6c7f288ab0181e496606a43dbade954819af2b1e1c0552becf6903436e16ea75\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:45:50.068844 containerd[1938]: time="2025-08-13T01:45:50.068803930Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.7\" with image id \"sha256:87f922d0bde0db7ffcb2174ba37bdab8fdd169a41e1882fe5aa308bb57e44fda\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.7\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:6c7f288ab0181e496606a43dbade954819af2b1e1c0552becf6903436e16ea75\", size \"26385470\" in 974.864855ms" Aug 13 01:45:50.068844 containerd[1938]: time="2025-08-13T01:45:50.068822656Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.7\" returns image reference \"sha256:87f922d0bde0db7ffcb2174ba37bdab8fdd169a41e1882fe5aa308bb57e44fda\"" Aug 13 01:45:50.069104 containerd[1938]: time="2025-08-13T01:45:50.069089631Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.7\"" Aug 13 01:45:51.089691 containerd[1938]: time="2025-08-13T01:45:51.089660571Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:45:51.089935 containerd[1938]: time="2025-08-13T01:45:51.089869297Z" 
level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.7: active requests=0, bytes read=19176921" Aug 13 01:45:51.090184 containerd[1938]: time="2025-08-13T01:45:51.090172849Z" level=info msg="ImageCreate event name:\"sha256:36cc9c80994ebf29b8e1a366d7e736b273a6c6a60bacb5446944cc0953416245\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:45:51.091480 containerd[1938]: time="2025-08-13T01:45:51.091467106Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:1c35a970b4450b4285531495be82cda1f6549952f70d6e3de8db57c20a3da4ce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:45:51.092486 containerd[1938]: time="2025-08-13T01:45:51.092445994Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.7\" with image id \"sha256:36cc9c80994ebf29b8e1a366d7e736b273a6c6a60bacb5446944cc0953416245\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.7\", repo digest \"registry.k8s.io/kube-scheduler@sha256:1c35a970b4450b4285531495be82cda1f6549952f70d6e3de8db57c20a3da4ce\", size \"20778773\" in 1.023338333s" Aug 13 01:45:51.092486 containerd[1938]: time="2025-08-13T01:45:51.092464150Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.7\" returns image reference \"sha256:36cc9c80994ebf29b8e1a366d7e736b273a6c6a60bacb5446944cc0953416245\"" Aug 13 01:45:51.092767 containerd[1938]: time="2025-08-13T01:45:51.092723836Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.7\"" Aug 13 01:45:51.905122 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1847399455.mount: Deactivated successfully. 
Aug 13 01:45:52.107041 containerd[1938]: time="2025-08-13T01:45:52.107017144Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:45:52.107285 containerd[1938]: time="2025-08-13T01:45:52.107120743Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.7: active requests=0, bytes read=30895380" Aug 13 01:45:52.107530 containerd[1938]: time="2025-08-13T01:45:52.107517909Z" level=info msg="ImageCreate event name:\"sha256:d5bc66d8682fdab0735e869a3f77730df378af7fd2505c1f4d6374ad3dbd181c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:45:52.108998 containerd[1938]: time="2025-08-13T01:45:52.108874152Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:8d589a18b5424f77a784ef2f00feffac0ef210414100822f1c120f0d7221def3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:45:52.109297 containerd[1938]: time="2025-08-13T01:45:52.109282656Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.7\" with image id \"sha256:d5bc66d8682fdab0735e869a3f77730df378af7fd2505c1f4d6374ad3dbd181c\", repo tag \"registry.k8s.io/kube-proxy:v1.32.7\", repo digest \"registry.k8s.io/kube-proxy@sha256:8d589a18b5424f77a784ef2f00feffac0ef210414100822f1c120f0d7221def3\", size \"30894399\" in 1.016541864s" Aug 13 01:45:52.109339 containerd[1938]: time="2025-08-13T01:45:52.109301450Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.7\" returns image reference \"sha256:d5bc66d8682fdab0735e869a3f77730df378af7fd2505c1f4d6374ad3dbd181c\"" Aug 13 01:45:52.109622 containerd[1938]: time="2025-08-13T01:45:52.109610953Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Aug 13 01:45:52.649868 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1167763159.mount: Deactivated successfully. 
Aug 13 01:45:53.224323 containerd[1938]: time="2025-08-13T01:45:53.224265661Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:45:53.224550 containerd[1938]: time="2025-08-13T01:45:53.224524095Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241" Aug 13 01:45:53.225109 containerd[1938]: time="2025-08-13T01:45:53.225066602Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:45:53.226334 containerd[1938]: time="2025-08-13T01:45:53.226296808Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:45:53.226921 containerd[1938]: time="2025-08-13T01:45:53.226858306Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.11723131s" Aug 13 01:45:53.226921 containerd[1938]: time="2025-08-13T01:45:53.226877207Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Aug 13 01:45:53.227130 containerd[1938]: time="2025-08-13T01:45:53.227108303Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Aug 13 01:45:53.740379 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Aug 13 01:45:53.741508 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Aug 13 01:45:53.742533 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3480896122.mount: Deactivated successfully. Aug 13 01:45:53.743592 containerd[1938]: time="2025-08-13T01:45:53.743574216Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 13 01:45:53.743761 containerd[1938]: time="2025-08-13T01:45:53.743749497Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Aug 13 01:45:53.744216 containerd[1938]: time="2025-08-13T01:45:53.744202655Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 13 01:45:53.745106 containerd[1938]: time="2025-08-13T01:45:53.745095446Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 13 01:45:53.745487 containerd[1938]: time="2025-08-13T01:45:53.745457315Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 518.334947ms" Aug 13 01:45:53.745524 containerd[1938]: time="2025-08-13T01:45:53.745491035Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Aug 13 01:45:53.745830 containerd[1938]: time="2025-08-13T01:45:53.745818500Z" level=info msg="PullImage 
\"registry.k8s.io/etcd:3.5.16-0\"" Aug 13 01:45:54.039841 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 01:45:54.042203 (kubelet)[2631]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 13 01:45:54.062678 kubelet[2631]: E0813 01:45:54.062602 2631 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 13 01:45:54.063755 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 13 01:45:54.063860 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 13 01:45:54.064106 systemd[1]: kubelet.service: Consumed 124ms CPU time, 115.2M memory peak. Aug 13 01:45:54.277522 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1309009629.mount: Deactivated successfully. 
Aug 13 01:45:55.381369 containerd[1938]: time="2025-08-13T01:45:55.381343218Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:45:55.381602 containerd[1938]: time="2025-08-13T01:45:55.381557652Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57551360" Aug 13 01:45:55.381933 containerd[1938]: time="2025-08-13T01:45:55.381921667Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:45:55.383325 containerd[1938]: time="2025-08-13T01:45:55.383313408Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:45:55.384004 containerd[1938]: time="2025-08-13T01:45:55.383958913Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 1.638125201s" Aug 13 01:45:55.384004 containerd[1938]: time="2025-08-13T01:45:55.383980013Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Aug 13 01:45:57.039711 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 01:45:57.039963 systemd[1]: kubelet.service: Consumed 124ms CPU time, 115.2M memory peak. Aug 13 01:45:57.041384 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 01:45:57.056224 systemd[1]: Reload requested from client PID 2750 ('systemctl') (unit session-11.scope)... 
Aug 13 01:45:57.056233 systemd[1]: Reloading... Aug 13 01:45:57.101959 zram_generator::config[2796]: No configuration found. Aug 13 01:45:57.164078 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Aug 13 01:45:57.260417 systemd[1]: Reloading finished in 203 ms. Aug 13 01:45:57.286971 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Aug 13 01:45:57.287017 systemd[1]: kubelet.service: Failed with result 'signal'. Aug 13 01:45:57.287144 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 01:45:57.287169 systemd[1]: kubelet.service: Consumed 50ms CPU time, 92.7M memory peak. Aug 13 01:45:57.288480 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 01:45:57.596411 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 01:45:57.598665 (kubelet)[2860]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Aug 13 01:45:57.620466 kubelet[2860]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 13 01:45:57.620466 kubelet[2860]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Aug 13 01:45:57.620466 kubelet[2860]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Aug 13 01:45:57.620687 kubelet[2860]: I0813 01:45:57.620501 2860 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Aug 13 01:45:57.725984 kubelet[2860]: I0813 01:45:57.725968 2860 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Aug 13 01:45:57.725984 kubelet[2860]: I0813 01:45:57.725979 2860 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Aug 13 01:45:57.726149 kubelet[2860]: I0813 01:45:57.726139 2860 server.go:954] "Client rotation is on, will bootstrap in background" Aug 13 01:45:57.746067 kubelet[2860]: E0813 01:45:57.746049 2860 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://147.75.71.211:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 147.75.71.211:6443: connect: connection refused" logger="UnhandledError" Aug 13 01:45:57.746067 kubelet[2860]: I0813 01:45:57.746056 2860 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Aug 13 01:45:57.750310 kubelet[2860]: I0813 01:45:57.750300 2860 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Aug 13 01:45:57.759376 kubelet[2860]: I0813 01:45:57.759365 2860 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Aug 13 01:45:57.760474 kubelet[2860]: I0813 01:45:57.760429 2860 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Aug 13 01:45:57.760569 kubelet[2860]: I0813 01:45:57.760444 2860 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4372.1.0-a-4296cabafa","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Aug 13 01:45:57.760569 kubelet[2860]: I0813 01:45:57.760542 2860 topology_manager.go:138] "Creating topology manager 
with none policy" Aug 13 01:45:57.760569 kubelet[2860]: I0813 01:45:57.760548 2860 container_manager_linux.go:304] "Creating device plugin manager" Aug 13 01:45:57.760669 kubelet[2860]: I0813 01:45:57.760619 2860 state_mem.go:36] "Initialized new in-memory state store" Aug 13 01:45:57.763680 kubelet[2860]: I0813 01:45:57.763655 2860 kubelet.go:446] "Attempting to sync node with API server" Aug 13 01:45:57.763680 kubelet[2860]: I0813 01:45:57.763670 2860 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Aug 13 01:45:57.763680 kubelet[2860]: I0813 01:45:57.763684 2860 kubelet.go:352] "Adding apiserver pod source" Aug 13 01:45:57.763760 kubelet[2860]: I0813 01:45:57.763691 2860 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Aug 13 01:45:57.765484 kubelet[2860]: I0813 01:45:57.765454 2860 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Aug 13 01:45:57.765734 kubelet[2860]: I0813 01:45:57.765726 2860 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Aug 13 01:45:57.766733 kubelet[2860]: W0813 01:45:57.766725 2860 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Aug 13 01:45:57.768127 kubelet[2860]: I0813 01:45:57.768118 2860 watchdog_linux.go:99] "Systemd watchdog is not enabled" Aug 13 01:45:57.768183 kubelet[2860]: I0813 01:45:57.768134 2860 server.go:1287] "Started kubelet" Aug 13 01:45:57.768210 kubelet[2860]: I0813 01:45:57.768188 2860 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Aug 13 01:45:57.768696 kubelet[2860]: W0813 01:45:57.768672 2860 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://147.75.71.211:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 147.75.71.211:6443: connect: connection refused Aug 13 01:45:57.768737 kubelet[2860]: E0813 01:45:57.768702 2860 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://147.75.71.211:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 147.75.71.211:6443: connect: connection refused" logger="UnhandledError" Aug 13 01:45:57.768797 kubelet[2860]: W0813 01:45:57.768775 2860 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://147.75.71.211:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4372.1.0-a-4296cabafa&limit=500&resourceVersion=0": dial tcp 147.75.71.211:6443: connect: connection refused Aug 13 01:45:57.768832 kubelet[2860]: E0813 01:45:57.768807 2860 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://147.75.71.211:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4372.1.0-a-4296cabafa&limit=500&resourceVersion=0\": dial tcp 147.75.71.211:6443: connect: connection refused" logger="UnhandledError" Aug 13 01:45:57.768957 kubelet[2860]: I0813 01:45:57.768893 2860 server.go:479] "Adding debug handlers to kubelet server" Aug 13 01:45:57.771323 kubelet[2860]: 
I0813 01:45:57.771284 2860 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Aug 13 01:45:57.771323 kubelet[2860]: I0813 01:45:57.771294 2860 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Aug 13 01:45:57.771403 kubelet[2860]: I0813 01:45:57.771337 2860 volume_manager.go:297] "Starting Kubelet Volume Manager" Aug 13 01:45:57.771403 kubelet[2860]: E0813 01:45:57.771367 2860 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4372.1.0-a-4296cabafa\" not found" Aug 13 01:45:57.771467 kubelet[2860]: I0813 01:45:57.771409 2860 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Aug 13 01:45:57.771612 kubelet[2860]: E0813 01:45:57.771589 2860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://147.75.71.211:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372.1.0-a-4296cabafa?timeout=10s\": dial tcp 147.75.71.211:6443: connect: connection refused" interval="200ms" Aug 13 01:45:57.773790 kubelet[2860]: W0813 01:45:57.771611 2860 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://147.75.71.211:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 147.75.71.211:6443: connect: connection refused Aug 13 01:45:57.773790 kubelet[2860]: I0813 01:45:57.773774 2860 reconciler.go:26] "Reconciler: start to sync state" Aug 13 01:45:57.773790 kubelet[2860]: E0813 01:45:57.773770 2860 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://147.75.71.211:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 147.75.71.211:6443: connect: connection refused" logger="UnhandledError" Aug 13 01:45:57.774091 kubelet[2860]: I0813 01:45:57.773801 2860 
ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Aug 13 01:45:57.774284 kubelet[2860]: I0813 01:45:57.774271 2860 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Aug 13 01:45:57.774326 kubelet[2860]: I0813 01:45:57.774317 2860 factory.go:221] Registration of the systemd container factory successfully Aug 13 01:45:57.774406 kubelet[2860]: I0813 01:45:57.774389 2860 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Aug 13 01:45:57.774996 kubelet[2860]: I0813 01:45:57.774987 2860 factory.go:221] Registration of the containerd container factory successfully Aug 13 01:45:57.775278 kubelet[2860]: E0813 01:45:57.775266 2860 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Aug 13 01:45:57.776617 kubelet[2860]: E0813 01:45:57.775396 2860 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://147.75.71.211:6443/api/v1/namespaces/default/events\": dial tcp 147.75.71.211:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4372.1.0-a-4296cabafa.185b30462df9b706 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4372.1.0-a-4296cabafa,UID:ci-4372.1.0-a-4296cabafa,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4372.1.0-a-4296cabafa,},FirstTimestamp:2025-08-13 01:45:57.768124166 +0000 UTC m=+0.167436741,LastTimestamp:2025-08-13 01:45:57.768124166 +0000 UTC m=+0.167436741,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4372.1.0-a-4296cabafa,}" Aug 13 
01:45:57.782670 kubelet[2860]: I0813 01:45:57.782660 2860 cpu_manager.go:221] "Starting CPU manager" policy="none" Aug 13 01:45:57.782670 kubelet[2860]: I0813 01:45:57.782668 2860 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Aug 13 01:45:57.782727 kubelet[2860]: I0813 01:45:57.782677 2860 state_mem.go:36] "Initialized new in-memory state store" Aug 13 01:45:57.783549 kubelet[2860]: I0813 01:45:57.783542 2860 policy_none.go:49] "None policy: Start" Aug 13 01:45:57.783574 kubelet[2860]: I0813 01:45:57.783551 2860 memory_manager.go:186] "Starting memorymanager" policy="None" Aug 13 01:45:57.783574 kubelet[2860]: I0813 01:45:57.783557 2860 state_mem.go:35] "Initializing new in-memory state store" Aug 13 01:45:57.783767 kubelet[2860]: I0813 01:45:57.783757 2860 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Aug 13 01:45:57.784455 kubelet[2860]: I0813 01:45:57.784445 2860 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Aug 13 01:45:57.784455 kubelet[2860]: I0813 01:45:57.784457 2860 status_manager.go:227] "Starting to sync pod status with apiserver" Aug 13 01:45:57.784514 kubelet[2860]: I0813 01:45:57.784470 2860 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Aug 13 01:45:57.784514 kubelet[2860]: I0813 01:45:57.784478 2860 kubelet.go:2382] "Starting kubelet main sync loop" Aug 13 01:45:57.784548 kubelet[2860]: E0813 01:45:57.784513 2860 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Aug 13 01:45:57.784698 kubelet[2860]: W0813 01:45:57.784684 2860 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://147.75.71.211:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 147.75.71.211:6443: connect: connection refused Aug 13 01:45:57.784736 kubelet[2860]: E0813 01:45:57.784705 2860 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://147.75.71.211:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 147.75.71.211:6443: connect: connection refused" logger="UnhandledError" Aug 13 01:45:57.786251 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Aug 13 01:45:57.810324 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Aug 13 01:45:57.819695 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Aug 13 01:45:57.836138 kubelet[2860]: I0813 01:45:57.836083 2860 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Aug 13 01:45:57.836497 kubelet[2860]: I0813 01:45:57.836461 2860 eviction_manager.go:189] "Eviction manager: starting control loop" Aug 13 01:45:57.836633 kubelet[2860]: I0813 01:45:57.836493 2860 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Aug 13 01:45:57.836980 kubelet[2860]: I0813 01:45:57.836864 2860 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Aug 13 01:45:57.838279 kubelet[2860]: E0813 01:45:57.838183 2860 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Aug 13 01:45:57.838279 kubelet[2860]: E0813 01:45:57.838271 2860 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4372.1.0-a-4296cabafa\" not found" Aug 13 01:45:57.893082 systemd[1]: Created slice kubepods-burstable-podb1af5fb9ce4a4dbf9dc4531d1aff0477.slice - libcontainer container kubepods-burstable-podb1af5fb9ce4a4dbf9dc4531d1aff0477.slice. Aug 13 01:45:57.915415 kubelet[2860]: E0813 01:45:57.915352 2860 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372.1.0-a-4296cabafa\" not found" node="ci-4372.1.0-a-4296cabafa" Aug 13 01:45:57.918987 systemd[1]: Created slice kubepods-burstable-podb447eb39a22467875dcaa27290d2d65c.slice - libcontainer container kubepods-burstable-podb447eb39a22467875dcaa27290d2d65c.slice. 
Aug 13 01:45:57.933232 kubelet[2860]: E0813 01:45:57.933138 2860 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372.1.0-a-4296cabafa\" not found" node="ci-4372.1.0-a-4296cabafa" Aug 13 01:45:57.940941 kubelet[2860]: I0813 01:45:57.940860 2860 kubelet_node_status.go:75] "Attempting to register node" node="ci-4372.1.0-a-4296cabafa" Aug 13 01:45:57.941265 systemd[1]: Created slice kubepods-burstable-pode1a0cd0cec53222f532ddfd5eab0ec4f.slice - libcontainer container kubepods-burstable-pode1a0cd0cec53222f532ddfd5eab0ec4f.slice. Aug 13 01:45:57.941759 kubelet[2860]: E0813 01:45:57.941673 2860 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://147.75.71.211:6443/api/v1/nodes\": dial tcp 147.75.71.211:6443: connect: connection refused" node="ci-4372.1.0-a-4296cabafa" Aug 13 01:45:57.945700 kubelet[2860]: E0813 01:45:57.945609 2860 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372.1.0-a-4296cabafa\" not found" node="ci-4372.1.0-a-4296cabafa" Aug 13 01:45:57.973075 kubelet[2860]: E0813 01:45:57.972975 2860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://147.75.71.211:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372.1.0-a-4296cabafa?timeout=10s\": dial tcp 147.75.71.211:6443: connect: connection refused" interval="400ms" Aug 13 01:45:57.975468 kubelet[2860]: I0813 01:45:57.975352 2860 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b1af5fb9ce4a4dbf9dc4531d1aff0477-k8s-certs\") pod \"kube-apiserver-ci-4372.1.0-a-4296cabafa\" (UID: \"b1af5fb9ce4a4dbf9dc4531d1aff0477\") " pod="kube-system/kube-apiserver-ci-4372.1.0-a-4296cabafa" Aug 13 01:45:57.975468 kubelet[2860]: I0813 01:45:57.975458 2860 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b447eb39a22467875dcaa27290d2d65c-kubeconfig\") pod \"kube-controller-manager-ci-4372.1.0-a-4296cabafa\" (UID: \"b447eb39a22467875dcaa27290d2d65c\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-a-4296cabafa" Aug 13 01:45:57.975760 kubelet[2860]: I0813 01:45:57.975523 2860 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b1af5fb9ce4a4dbf9dc4531d1aff0477-ca-certs\") pod \"kube-apiserver-ci-4372.1.0-a-4296cabafa\" (UID: \"b1af5fb9ce4a4dbf9dc4531d1aff0477\") " pod="kube-system/kube-apiserver-ci-4372.1.0-a-4296cabafa" Aug 13 01:45:57.975760 kubelet[2860]: I0813 01:45:57.975580 2860 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b1af5fb9ce4a4dbf9dc4531d1aff0477-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4372.1.0-a-4296cabafa\" (UID: \"b1af5fb9ce4a4dbf9dc4531d1aff0477\") " pod="kube-system/kube-apiserver-ci-4372.1.0-a-4296cabafa" Aug 13 01:45:57.975760 kubelet[2860]: I0813 01:45:57.975632 2860 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b447eb39a22467875dcaa27290d2d65c-ca-certs\") pod \"kube-controller-manager-ci-4372.1.0-a-4296cabafa\" (UID: \"b447eb39a22467875dcaa27290d2d65c\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-a-4296cabafa" Aug 13 01:45:57.975760 kubelet[2860]: I0813 01:45:57.975681 2860 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/b447eb39a22467875dcaa27290d2d65c-flexvolume-dir\") pod \"kube-controller-manager-ci-4372.1.0-a-4296cabafa\" (UID: \"b447eb39a22467875dcaa27290d2d65c\") " 
pod="kube-system/kube-controller-manager-ci-4372.1.0-a-4296cabafa" Aug 13 01:45:57.975760 kubelet[2860]: I0813 01:45:57.975731 2860 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b447eb39a22467875dcaa27290d2d65c-k8s-certs\") pod \"kube-controller-manager-ci-4372.1.0-a-4296cabafa\" (UID: \"b447eb39a22467875dcaa27290d2d65c\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-a-4296cabafa" Aug 13 01:45:57.976297 kubelet[2860]: I0813 01:45:57.975784 2860 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b447eb39a22467875dcaa27290d2d65c-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4372.1.0-a-4296cabafa\" (UID: \"b447eb39a22467875dcaa27290d2d65c\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-a-4296cabafa" Aug 13 01:45:57.976297 kubelet[2860]: I0813 01:45:57.975836 2860 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e1a0cd0cec53222f532ddfd5eab0ec4f-kubeconfig\") pod \"kube-scheduler-ci-4372.1.0-a-4296cabafa\" (UID: \"e1a0cd0cec53222f532ddfd5eab0ec4f\") " pod="kube-system/kube-scheduler-ci-4372.1.0-a-4296cabafa" Aug 13 01:45:58.146974 kubelet[2860]: I0813 01:45:58.146778 2860 kubelet_node_status.go:75] "Attempting to register node" node="ci-4372.1.0-a-4296cabafa" Aug 13 01:45:58.147620 kubelet[2860]: E0813 01:45:58.147543 2860 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://147.75.71.211:6443/api/v1/nodes\": dial tcp 147.75.71.211:6443: connect: connection refused" node="ci-4372.1.0-a-4296cabafa" Aug 13 01:45:58.217374 containerd[1938]: time="2025-08-13T01:45:58.217235021Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4372.1.0-a-4296cabafa,Uid:b1af5fb9ce4a4dbf9dc4531d1aff0477,Namespace:kube-system,Attempt:0,}" Aug 13 01:45:58.227389 containerd[1938]: time="2025-08-13T01:45:58.227344324Z" level=info msg="connecting to shim 64ff6af9143ee2e106db5438c5cd33855ca0e01b02150572c59b7f5f35bb1cb8" address="unix:///run/containerd/s/2bdd0e01cd1171dd88a0314f94f72bd4ffa678ff9382e8ecba12d61f4d0a6a46" namespace=k8s.io protocol=ttrpc version=3 Aug 13 01:45:58.234776 containerd[1938]: time="2025-08-13T01:45:58.234753355Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4372.1.0-a-4296cabafa,Uid:b447eb39a22467875dcaa27290d2d65c,Namespace:kube-system,Attempt:0,}" Aug 13 01:45:58.242530 containerd[1938]: time="2025-08-13T01:45:58.242479755Z" level=info msg="connecting to shim 1f0f3e5aa67a7ccf8fe204023802ee2664dc783ef145035f70cd84a082e3a1ce" address="unix:///run/containerd/s/4f043cf00b8c39dd823ff6e27d067ad334537a11ee591b4f65152eeb8e9a14eb" namespace=k8s.io protocol=ttrpc version=3 Aug 13 01:45:58.247265 containerd[1938]: time="2025-08-13T01:45:58.247217565Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4372.1.0-a-4296cabafa,Uid:e1a0cd0cec53222f532ddfd5eab0ec4f,Namespace:kube-system,Attempt:0,}" Aug 13 01:45:58.249981 systemd[1]: Started cri-containerd-64ff6af9143ee2e106db5438c5cd33855ca0e01b02150572c59b7f5f35bb1cb8.scope - libcontainer container 64ff6af9143ee2e106db5438c5cd33855ca0e01b02150572c59b7f5f35bb1cb8. 
Aug 13 01:45:58.255474 containerd[1938]: time="2025-08-13T01:45:58.255431720Z" level=info msg="connecting to shim c6cad9ef8f4e993785c2c5cfde03dd888fa5d3d25c41d00896c72b236a33feb9" address="unix:///run/containerd/s/7f4005e50824d15960e065fc5d729ddfb6c9879d4c5e333e7bf4886f542ab910" namespace=k8s.io protocol=ttrpc version=3 Aug 13 01:45:58.256758 systemd[1]: Started cri-containerd-1f0f3e5aa67a7ccf8fe204023802ee2664dc783ef145035f70cd84a082e3a1ce.scope - libcontainer container 1f0f3e5aa67a7ccf8fe204023802ee2664dc783ef145035f70cd84a082e3a1ce. Aug 13 01:45:58.264169 systemd[1]: Started cri-containerd-c6cad9ef8f4e993785c2c5cfde03dd888fa5d3d25c41d00896c72b236a33feb9.scope - libcontainer container c6cad9ef8f4e993785c2c5cfde03dd888fa5d3d25c41d00896c72b236a33feb9. Aug 13 01:45:58.279066 containerd[1938]: time="2025-08-13T01:45:58.279038979Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4372.1.0-a-4296cabafa,Uid:b1af5fb9ce4a4dbf9dc4531d1aff0477,Namespace:kube-system,Attempt:0,} returns sandbox id \"64ff6af9143ee2e106db5438c5cd33855ca0e01b02150572c59b7f5f35bb1cb8\"" Aug 13 01:45:58.280466 containerd[1938]: time="2025-08-13T01:45:58.280450918Z" level=info msg="CreateContainer within sandbox \"64ff6af9143ee2e106db5438c5cd33855ca0e01b02150572c59b7f5f35bb1cb8\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Aug 13 01:45:58.300082 containerd[1938]: time="2025-08-13T01:45:58.300057937Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4372.1.0-a-4296cabafa,Uid:b447eb39a22467875dcaa27290d2d65c,Namespace:kube-system,Attempt:0,} returns sandbox id \"1f0f3e5aa67a7ccf8fe204023802ee2664dc783ef145035f70cd84a082e3a1ce\"" Aug 13 01:45:58.300367 containerd[1938]: time="2025-08-13T01:45:58.300351931Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4372.1.0-a-4296cabafa,Uid:e1a0cd0cec53222f532ddfd5eab0ec4f,Namespace:kube-system,Attempt:0,} returns sandbox id 
\"c6cad9ef8f4e993785c2c5cfde03dd888fa5d3d25c41d00896c72b236a33feb9\"" Aug 13 01:45:58.301665 containerd[1938]: time="2025-08-13T01:45:58.301652241Z" level=info msg="CreateContainer within sandbox \"c6cad9ef8f4e993785c2c5cfde03dd888fa5d3d25c41d00896c72b236a33feb9\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Aug 13 01:45:58.301703 containerd[1938]: time="2025-08-13T01:45:58.301664445Z" level=info msg="CreateContainer within sandbox \"1f0f3e5aa67a7ccf8fe204023802ee2664dc783ef145035f70cd84a082e3a1ce\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Aug 13 01:45:58.302681 containerd[1938]: time="2025-08-13T01:45:58.302643487Z" level=info msg="Container aab4af456155b502395cc8f20fb7f2c46bebf5e15062561ec4a941d3b153e25e: CDI devices from CRI Config.CDIDevices: []" Aug 13 01:45:58.305339 containerd[1938]: time="2025-08-13T01:45:58.305326464Z" level=info msg="CreateContainer within sandbox \"64ff6af9143ee2e106db5438c5cd33855ca0e01b02150572c59b7f5f35bb1cb8\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"aab4af456155b502395cc8f20fb7f2c46bebf5e15062561ec4a941d3b153e25e\"" Aug 13 01:45:58.305590 containerd[1938]: time="2025-08-13T01:45:58.305578340Z" level=info msg="StartContainer for \"aab4af456155b502395cc8f20fb7f2c46bebf5e15062561ec4a941d3b153e25e\"" Aug 13 01:45:58.305839 containerd[1938]: time="2025-08-13T01:45:58.305827127Z" level=info msg="Container ddf6e6fab286b061b4ef76c293aa6383466b2117da2d8352970ad3acac319915: CDI devices from CRI Config.CDIDevices: []" Aug 13 01:45:58.306162 containerd[1938]: time="2025-08-13T01:45:58.306136185Z" level=info msg="connecting to shim aab4af456155b502395cc8f20fb7f2c46bebf5e15062561ec4a941d3b153e25e" address="unix:///run/containerd/s/2bdd0e01cd1171dd88a0314f94f72bd4ffa678ff9382e8ecba12d61f4d0a6a46" protocol=ttrpc version=3 Aug 13 01:45:58.306462 containerd[1938]: time="2025-08-13T01:45:58.306421849Z" level=info msg="Container 
e5a999bfdeb08c7e882a677fbd4a4534e9078dbf8c776ed96737accd15b59002: CDI devices from CRI Config.CDIDevices: []" Aug 13 01:45:58.308828 containerd[1938]: time="2025-08-13T01:45:58.308789435Z" level=info msg="CreateContainer within sandbox \"1f0f3e5aa67a7ccf8fe204023802ee2664dc783ef145035f70cd84a082e3a1ce\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"e5a999bfdeb08c7e882a677fbd4a4534e9078dbf8c776ed96737accd15b59002\"" Aug 13 01:45:58.309033 containerd[1938]: time="2025-08-13T01:45:58.308991527Z" level=info msg="StartContainer for \"e5a999bfdeb08c7e882a677fbd4a4534e9078dbf8c776ed96737accd15b59002\"" Aug 13 01:45:58.309290 containerd[1938]: time="2025-08-13T01:45:58.309243908Z" level=info msg="CreateContainer within sandbox \"c6cad9ef8f4e993785c2c5cfde03dd888fa5d3d25c41d00896c72b236a33feb9\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"ddf6e6fab286b061b4ef76c293aa6383466b2117da2d8352970ad3acac319915\"" Aug 13 01:45:58.309440 containerd[1938]: time="2025-08-13T01:45:58.309398787Z" level=info msg="StartContainer for \"ddf6e6fab286b061b4ef76c293aa6383466b2117da2d8352970ad3acac319915\"" Aug 13 01:45:58.309583 containerd[1938]: time="2025-08-13T01:45:58.309533803Z" level=info msg="connecting to shim e5a999bfdeb08c7e882a677fbd4a4534e9078dbf8c776ed96737accd15b59002" address="unix:///run/containerd/s/4f043cf00b8c39dd823ff6e27d067ad334537a11ee591b4f65152eeb8e9a14eb" protocol=ttrpc version=3 Aug 13 01:45:58.309960 containerd[1938]: time="2025-08-13T01:45:58.309942846Z" level=info msg="connecting to shim ddf6e6fab286b061b4ef76c293aa6383466b2117da2d8352970ad3acac319915" address="unix:///run/containerd/s/7f4005e50824d15960e065fc5d729ddfb6c9879d4c5e333e7bf4886f542ab910" protocol=ttrpc version=3 Aug 13 01:45:58.328160 systemd[1]: Started cri-containerd-aab4af456155b502395cc8f20fb7f2c46bebf5e15062561ec4a941d3b153e25e.scope - libcontainer container aab4af456155b502395cc8f20fb7f2c46bebf5e15062561ec4a941d3b153e25e. 
Aug 13 01:45:58.331019 systemd[1]: Started cri-containerd-ddf6e6fab286b061b4ef76c293aa6383466b2117da2d8352970ad3acac319915.scope - libcontainer container ddf6e6fab286b061b4ef76c293aa6383466b2117da2d8352970ad3acac319915. Aug 13 01:45:58.331641 systemd[1]: Started cri-containerd-e5a999bfdeb08c7e882a677fbd4a4534e9078dbf8c776ed96737accd15b59002.scope - libcontainer container e5a999bfdeb08c7e882a677fbd4a4534e9078dbf8c776ed96737accd15b59002. Aug 13 01:45:58.360558 containerd[1938]: time="2025-08-13T01:45:58.360533882Z" level=info msg="StartContainer for \"aab4af456155b502395cc8f20fb7f2c46bebf5e15062561ec4a941d3b153e25e\" returns successfully" Aug 13 01:45:58.361457 containerd[1938]: time="2025-08-13T01:45:58.361443670Z" level=info msg="StartContainer for \"ddf6e6fab286b061b4ef76c293aa6383466b2117da2d8352970ad3acac319915\" returns successfully" Aug 13 01:45:58.363904 containerd[1938]: time="2025-08-13T01:45:58.363874376Z" level=info msg="StartContainer for \"e5a999bfdeb08c7e882a677fbd4a4534e9078dbf8c776ed96737accd15b59002\" returns successfully" Aug 13 01:45:58.373465 kubelet[2860]: E0813 01:45:58.373420 2860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://147.75.71.211:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372.1.0-a-4296cabafa?timeout=10s\": dial tcp 147.75.71.211:6443: connect: connection refused" interval="800ms" Aug 13 01:45:58.549323 kubelet[2860]: I0813 01:45:58.549208 2860 kubelet_node_status.go:75] "Attempting to register node" node="ci-4372.1.0-a-4296cabafa" Aug 13 01:45:58.788136 kubelet[2860]: E0813 01:45:58.788122 2860 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372.1.0-a-4296cabafa\" not found" node="ci-4372.1.0-a-4296cabafa" Aug 13 01:45:58.788611 kubelet[2860]: E0813 01:45:58.788602 2860 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372.1.0-a-4296cabafa\" 
not found" node="ci-4372.1.0-a-4296cabafa" Aug 13 01:45:58.790039 kubelet[2860]: E0813 01:45:58.790029 2860 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372.1.0-a-4296cabafa\" not found" node="ci-4372.1.0-a-4296cabafa" Aug 13 01:45:59.177196 kubelet[2860]: E0813 01:45:59.177178 2860 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4372.1.0-a-4296cabafa\" not found" node="ci-4372.1.0-a-4296cabafa" Aug 13 01:45:59.283654 kubelet[2860]: I0813 01:45:59.283581 2860 kubelet_node_status.go:78] "Successfully registered node" node="ci-4372.1.0-a-4296cabafa" Aug 13 01:45:59.374064 kubelet[2860]: I0813 01:45:59.373960 2860 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4372.1.0-a-4296cabafa" Aug 13 01:45:59.383772 kubelet[2860]: E0813 01:45:59.383660 2860 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4372.1.0-a-4296cabafa\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4372.1.0-a-4296cabafa" Aug 13 01:45:59.383772 kubelet[2860]: I0813 01:45:59.383713 2860 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4372.1.0-a-4296cabafa" Aug 13 01:45:59.387219 kubelet[2860]: E0813 01:45:59.387126 2860 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4372.1.0-a-4296cabafa\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4372.1.0-a-4296cabafa" Aug 13 01:45:59.387219 kubelet[2860]: I0813 01:45:59.387174 2860 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4372.1.0-a-4296cabafa" Aug 13 01:45:59.390795 kubelet[2860]: E0813 01:45:59.390701 2860 kubelet.go:3196] "Failed creating a mirror pod" err="pods 
\"kube-apiserver-ci-4372.1.0-a-4296cabafa\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4372.1.0-a-4296cabafa" Aug 13 01:45:59.765001 kubelet[2860]: I0813 01:45:59.764883 2860 apiserver.go:52] "Watching apiserver" Aug 13 01:45:59.772241 kubelet[2860]: I0813 01:45:59.772151 2860 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Aug 13 01:45:59.791006 kubelet[2860]: I0813 01:45:59.790959 2860 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4372.1.0-a-4296cabafa" Aug 13 01:45:59.791740 kubelet[2860]: I0813 01:45:59.791219 2860 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4372.1.0-a-4296cabafa" Aug 13 01:45:59.791740 kubelet[2860]: I0813 01:45:59.791341 2860 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4372.1.0-a-4296cabafa" Aug 13 01:45:59.794066 kubelet[2860]: E0813 01:45:59.794032 2860 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4372.1.0-a-4296cabafa\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4372.1.0-a-4296cabafa" Aug 13 01:45:59.794353 kubelet[2860]: E0813 01:45:59.794302 2860 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4372.1.0-a-4296cabafa\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4372.1.0-a-4296cabafa" Aug 13 01:45:59.794540 kubelet[2860]: E0813 01:45:59.794500 2860 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4372.1.0-a-4296cabafa\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4372.1.0-a-4296cabafa" Aug 13 01:46:01.046812 systemd[1]: Reload requested from client PID 3181 ('systemctl') (unit 
session-11.scope)... Aug 13 01:46:01.046820 systemd[1]: Reloading... Aug 13 01:46:01.090879 zram_generator::config[3226]: No configuration found. Aug 13 01:46:01.159984 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Aug 13 01:46:01.264670 systemd[1]: Reloading finished in 217 ms. Aug 13 01:46:01.286346 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 01:46:01.293999 systemd[1]: kubelet.service: Deactivated successfully. Aug 13 01:46:01.294149 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 01:46:01.294188 systemd[1]: kubelet.service: Consumed 631ms CPU time, 137.3M memory peak. Aug 13 01:46:01.295227 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 01:46:01.618377 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 01:46:01.620673 (kubelet)[3290]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Aug 13 01:46:01.642069 kubelet[3290]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 13 01:46:01.642069 kubelet[3290]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Aug 13 01:46:01.642069 kubelet[3290]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
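The deprecation warnings above say `--container-runtime-endpoint` and `--volume-plugin-dir` should move into the file passed via `--config`. A minimal `KubeletConfiguration` fragment showing that migration; the socket and directory paths are illustrative defaults, not values taken from this node:

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# Replaces --container-runtime-endpoint:
containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
# Replaces --volume-plugin-dir:
volumePluginDir: /var/lib/kubelet/volumeplugins
```

`--pod-infra-container-image` has no config-file equivalent; per the log it is removed in 1.35, after which the sandbox image comes from the CRI runtime's own configuration.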
Aug 13 01:46:01.642282 kubelet[3290]: I0813 01:46:01.642082 3290 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Aug 13 01:46:01.645470 kubelet[3290]: I0813 01:46:01.645430 3290 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Aug 13 01:46:01.645470 kubelet[3290]: I0813 01:46:01.645440 3290 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Aug 13 01:46:01.645588 kubelet[3290]: I0813 01:46:01.645558 3290 server.go:954] "Client rotation is on, will bootstrap in background" Aug 13 01:46:01.646238 kubelet[3290]: I0813 01:46:01.646201 3290 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Aug 13 01:46:01.647423 kubelet[3290]: I0813 01:46:01.647387 3290 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Aug 13 01:46:01.649742 kubelet[3290]: I0813 01:46:01.649731 3290 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Aug 13 01:46:01.657065 kubelet[3290]: I0813 01:46:01.657019 3290 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Aug 13 01:46:01.657183 kubelet[3290]: I0813 01:46:01.657135 3290 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Aug 13 01:46:01.657294 kubelet[3290]: I0813 01:46:01.657150 3290 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4372.1.0-a-4296cabafa","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Aug 13 01:46:01.657294 kubelet[3290]: I0813 01:46:01.657280 3290 topology_manager.go:138] "Creating topology manager 
with none policy" Aug 13 01:46:01.657294 kubelet[3290]: I0813 01:46:01.657287 3290 container_manager_linux.go:304] "Creating device plugin manager" Aug 13 01:46:01.657379 kubelet[3290]: I0813 01:46:01.657315 3290 state_mem.go:36] "Initialized new in-memory state store" Aug 13 01:46:01.657487 kubelet[3290]: I0813 01:46:01.657456 3290 kubelet.go:446] "Attempting to sync node with API server" Aug 13 01:46:01.657487 kubelet[3290]: I0813 01:46:01.657472 3290 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Aug 13 01:46:01.657487 kubelet[3290]: I0813 01:46:01.657484 3290 kubelet.go:352] "Adding apiserver pod source" Aug 13 01:46:01.657539 kubelet[3290]: I0813 01:46:01.657490 3290 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Aug 13 01:46:01.657847 kubelet[3290]: I0813 01:46:01.657826 3290 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Aug 13 01:46:01.658618 kubelet[3290]: I0813 01:46:01.658544 3290 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Aug 13 01:46:01.659097 kubelet[3290]: I0813 01:46:01.659086 3290 watchdog_linux.go:99] "Systemd watchdog is not enabled" Aug 13 01:46:01.659135 kubelet[3290]: I0813 01:46:01.659110 3290 server.go:1287] "Started kubelet" Aug 13 01:46:01.659183 kubelet[3290]: I0813 01:46:01.659147 3290 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Aug 13 01:46:01.659218 kubelet[3290]: I0813 01:46:01.659166 3290 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Aug 13 01:46:01.659345 kubelet[3290]: I0813 01:46:01.659334 3290 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Aug 13 01:46:01.659847 kubelet[3290]: I0813 01:46:01.659836 3290 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Aug 13 01:46:01.659932 kubelet[3290]: E0813 
01:46:01.659912 3290 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4372.1.0-a-4296cabafa\" not found" Aug 13 01:46:01.659970 kubelet[3290]: I0813 01:46:01.659947 3290 volume_manager.go:297] "Starting Kubelet Volume Manager" Aug 13 01:46:01.659970 kubelet[3290]: I0813 01:46:01.659958 3290 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Aug 13 01:46:01.660055 kubelet[3290]: I0813 01:46:01.660046 3290 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Aug 13 01:46:01.660168 kubelet[3290]: I0813 01:46:01.660157 3290 reconciler.go:26] "Reconciler: start to sync state" Aug 13 01:46:01.660408 kubelet[3290]: I0813 01:46:01.660396 3290 server.go:479] "Adding debug handlers to kubelet server" Aug 13 01:46:01.660488 kubelet[3290]: E0813 01:46:01.660471 3290 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Aug 13 01:46:01.660559 kubelet[3290]: I0813 01:46:01.660550 3290 factory.go:221] Registration of the systemd container factory successfully Aug 13 01:46:01.660630 kubelet[3290]: I0813 01:46:01.660616 3290 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Aug 13 01:46:01.661259 kubelet[3290]: I0813 01:46:01.661245 3290 factory.go:221] Registration of the containerd container factory successfully Aug 13 01:46:01.666104 kubelet[3290]: I0813 01:46:01.666078 3290 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Aug 13 01:46:01.666756 kubelet[3290]: I0813 01:46:01.666747 3290 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Aug 13 01:46:01.666790 kubelet[3290]: I0813 01:46:01.666763 3290 status_manager.go:227] "Starting to sync pod status with apiserver" Aug 13 01:46:01.666790 kubelet[3290]: I0813 01:46:01.666774 3290 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Aug 13 01:46:01.666790 kubelet[3290]: I0813 01:46:01.666778 3290 kubelet.go:2382] "Starting kubelet main sync loop" Aug 13 01:46:01.666845 kubelet[3290]: E0813 01:46:01.666802 3290 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Aug 13 01:46:01.676852 kubelet[3290]: I0813 01:46:01.676837 3290 cpu_manager.go:221] "Starting CPU manager" policy="none" Aug 13 01:46:01.676852 kubelet[3290]: I0813 01:46:01.676846 3290 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Aug 13 01:46:01.676852 kubelet[3290]: I0813 01:46:01.676858 3290 state_mem.go:36] "Initialized new in-memory state store" Aug 13 01:46:01.676960 kubelet[3290]: I0813 01:46:01.676955 3290 state_mem.go:88] "Updated default CPUSet" cpuSet="" Aug 13 01:46:01.676978 kubelet[3290]: I0813 01:46:01.676961 3290 state_mem.go:96] "Updated CPUSet assignments" assignments={} Aug 13 01:46:01.676978 kubelet[3290]: I0813 01:46:01.676973 3290 policy_none.go:49] "None policy: Start" Aug 13 01:46:01.677006 kubelet[3290]: I0813 01:46:01.676978 3290 memory_manager.go:186] "Starting memorymanager" policy="None" Aug 13 01:46:01.677006 kubelet[3290]: I0813 01:46:01.676984 3290 state_mem.go:35] "Initializing new in-memory state store" Aug 13 01:46:01.677048 kubelet[3290]: I0813 01:46:01.677042 3290 state_mem.go:75] "Updated machine memory state" Aug 13 01:46:01.679236 kubelet[3290]: I0813 01:46:01.679228 3290 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Aug 13 01:46:01.679351 kubelet[3290]: I0813 
01:46:01.679314 3290 eviction_manager.go:189] "Eviction manager: starting control loop" Aug 13 01:46:01.679351 kubelet[3290]: I0813 01:46:01.679321 3290 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Aug 13 01:46:01.679404 kubelet[3290]: I0813 01:46:01.679398 3290 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Aug 13 01:46:01.679679 kubelet[3290]: E0813 01:46:01.679671 3290 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Aug 13 01:46:01.768638 kubelet[3290]: I0813 01:46:01.768514 3290 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4372.1.0-a-4296cabafa" Aug 13 01:46:01.768638 kubelet[3290]: I0813 01:46:01.768557 3290 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4372.1.0-a-4296cabafa" Aug 13 01:46:01.769060 kubelet[3290]: I0813 01:46:01.768675 3290 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4372.1.0-a-4296cabafa" Aug 13 01:46:01.776549 kubelet[3290]: W0813 01:46:01.776491 3290 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Aug 13 01:46:01.776726 kubelet[3290]: W0813 01:46:01.776587 3290 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Aug 13 01:46:01.776726 kubelet[3290]: W0813 01:46:01.776696 3290 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Aug 13 01:46:01.786483 kubelet[3290]: I0813 01:46:01.786432 3290 kubelet_node_status.go:75] "Attempting to register node" node="ci-4372.1.0-a-4296cabafa" Aug 13 
01:46:01.795837 kubelet[3290]: I0813 01:46:01.795777 3290 kubelet_node_status.go:124] "Node was previously registered" node="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:01.796040 kubelet[3290]: I0813 01:46:01.795953 3290 kubelet_node_status.go:78] "Successfully registered node" node="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:01.962015 kubelet[3290]: I0813 01:46:01.961949 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/b447eb39a22467875dcaa27290d2d65c-flexvolume-dir\") pod \"kube-controller-manager-ci-4372.1.0-a-4296cabafa\" (UID: \"b447eb39a22467875dcaa27290d2d65c\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-a-4296cabafa" Aug 13 01:46:01.962279 kubelet[3290]: I0813 01:46:01.962040 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b447eb39a22467875dcaa27290d2d65c-k8s-certs\") pod \"kube-controller-manager-ci-4372.1.0-a-4296cabafa\" (UID: \"b447eb39a22467875dcaa27290d2d65c\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-a-4296cabafa" Aug 13 01:46:01.962279 kubelet[3290]: I0813 01:46:01.962154 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b447eb39a22467875dcaa27290d2d65c-kubeconfig\") pod \"kube-controller-manager-ci-4372.1.0-a-4296cabafa\" (UID: \"b447eb39a22467875dcaa27290d2d65c\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-a-4296cabafa" Aug 13 01:46:01.962528 kubelet[3290]: I0813 01:46:01.962270 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e1a0cd0cec53222f532ddfd5eab0ec4f-kubeconfig\") pod \"kube-scheduler-ci-4372.1.0-a-4296cabafa\" (UID: \"e1a0cd0cec53222f532ddfd5eab0ec4f\") " 
pod="kube-system/kube-scheduler-ci-4372.1.0-a-4296cabafa" Aug 13 01:46:01.962528 kubelet[3290]: I0813 01:46:01.962366 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b1af5fb9ce4a4dbf9dc4531d1aff0477-ca-certs\") pod \"kube-apiserver-ci-4372.1.0-a-4296cabafa\" (UID: \"b1af5fb9ce4a4dbf9dc4531d1aff0477\") " pod="kube-system/kube-apiserver-ci-4372.1.0-a-4296cabafa" Aug 13 01:46:01.962528 kubelet[3290]: I0813 01:46:01.962436 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b1af5fb9ce4a4dbf9dc4531d1aff0477-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4372.1.0-a-4296cabafa\" (UID: \"b1af5fb9ce4a4dbf9dc4531d1aff0477\") " pod="kube-system/kube-apiserver-ci-4372.1.0-a-4296cabafa" Aug 13 01:46:01.962528 kubelet[3290]: I0813 01:46:01.962497 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b447eb39a22467875dcaa27290d2d65c-ca-certs\") pod \"kube-controller-manager-ci-4372.1.0-a-4296cabafa\" (UID: \"b447eb39a22467875dcaa27290d2d65c\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-a-4296cabafa" Aug 13 01:46:01.962989 kubelet[3290]: I0813 01:46:01.962552 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b447eb39a22467875dcaa27290d2d65c-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4372.1.0-a-4296cabafa\" (UID: \"b447eb39a22467875dcaa27290d2d65c\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-a-4296cabafa" Aug 13 01:46:01.962989 kubelet[3290]: I0813 01:46:01.962608 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: 
\"kubernetes.io/host-path/b1af5fb9ce4a4dbf9dc4531d1aff0477-k8s-certs\") pod \"kube-apiserver-ci-4372.1.0-a-4296cabafa\" (UID: \"b1af5fb9ce4a4dbf9dc4531d1aff0477\") " pod="kube-system/kube-apiserver-ci-4372.1.0-a-4296cabafa" Aug 13 01:46:02.658508 kubelet[3290]: I0813 01:46:02.658489 3290 apiserver.go:52] "Watching apiserver" Aug 13 01:46:02.660106 kubelet[3290]: I0813 01:46:02.660092 3290 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Aug 13 01:46:02.671047 kubelet[3290]: I0813 01:46:02.671031 3290 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4372.1.0-a-4296cabafa" Aug 13 01:46:02.671171 kubelet[3290]: I0813 01:46:02.671161 3290 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4372.1.0-a-4296cabafa" Aug 13 01:46:02.674031 kubelet[3290]: W0813 01:46:02.674020 3290 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Aug 13 01:46:02.674073 kubelet[3290]: E0813 01:46:02.674050 3290 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4372.1.0-a-4296cabafa\" already exists" pod="kube-system/kube-controller-manager-ci-4372.1.0-a-4296cabafa" Aug 13 01:46:02.674307 kubelet[3290]: W0813 01:46:02.674298 3290 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Aug 13 01:46:02.674343 kubelet[3290]: E0813 01:46:02.674319 3290 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4372.1.0-a-4296cabafa\" already exists" pod="kube-system/kube-apiserver-ci-4372.1.0-a-4296cabafa" Aug 13 01:46:02.685204 kubelet[3290]: I0813 01:46:02.685174 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-controller-manager-ci-4372.1.0-a-4296cabafa" podStartSLOduration=1.685161594 podStartE2EDuration="1.685161594s" podCreationTimestamp="2025-08-13 01:46:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 01:46:02.681252064 +0000 UTC m=+1.058576328" watchObservedRunningTime="2025-08-13 01:46:02.685161594 +0000 UTC m=+1.062485854" Aug 13 01:46:02.688576 kubelet[3290]: I0813 01:46:02.688544 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4372.1.0-a-4296cabafa" podStartSLOduration=1.688533036 podStartE2EDuration="1.688533036s" podCreationTimestamp="2025-08-13 01:46:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 01:46:02.685252128 +0000 UTC m=+1.062576390" watchObservedRunningTime="2025-08-13 01:46:02.688533036 +0000 UTC m=+1.065857299" Aug 13 01:46:02.688682 kubelet[3290]: I0813 01:46:02.688643 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4372.1.0-a-4296cabafa" podStartSLOduration=1.6886360759999999 podStartE2EDuration="1.688636076s" podCreationTimestamp="2025-08-13 01:46:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 01:46:02.688601618 +0000 UTC m=+1.065925881" watchObservedRunningTime="2025-08-13 01:46:02.688636076 +0000 UTC m=+1.065960337" Aug 13 01:46:06.444565 kubelet[3290]: I0813 01:46:06.444533 3290 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Aug 13 01:46:06.444957 containerd[1938]: time="2025-08-13T01:46:06.444810764Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Aug 13 01:46:06.445171 kubelet[3290]: I0813 01:46:06.444981 3290 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Aug 13 01:46:07.288248 systemd[1]: Created slice kubepods-besteffort-podd8487f4b_21fb_4974_8506_a617f3b29e5c.slice - libcontainer container kubepods-besteffort-podd8487f4b_21fb_4974_8506_a617f3b29e5c.slice. Aug 13 01:46:07.296146 kubelet[3290]: I0813 01:46:07.296044 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d8487f4b-21fb-4974-8506-a617f3b29e5c-xtables-lock\") pod \"kube-proxy-7zqn6\" (UID: \"d8487f4b-21fb-4974-8506-a617f3b29e5c\") " pod="kube-system/kube-proxy-7zqn6" Aug 13 01:46:07.296146 kubelet[3290]: I0813 01:46:07.296132 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d8487f4b-21fb-4974-8506-a617f3b29e5c-lib-modules\") pod \"kube-proxy-7zqn6\" (UID: \"d8487f4b-21fb-4974-8506-a617f3b29e5c\") " pod="kube-system/kube-proxy-7zqn6" Aug 13 01:46:07.296525 kubelet[3290]: I0813 01:46:07.296197 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/d8487f4b-21fb-4974-8506-a617f3b29e5c-kube-proxy\") pod \"kube-proxy-7zqn6\" (UID: \"d8487f4b-21fb-4974-8506-a617f3b29e5c\") " pod="kube-system/kube-proxy-7zqn6" Aug 13 01:46:07.296525 kubelet[3290]: I0813 01:46:07.296257 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6gjw\" (UniqueName: \"kubernetes.io/projected/d8487f4b-21fb-4974-8506-a617f3b29e5c-kube-api-access-q6gjw\") pod \"kube-proxy-7zqn6\" (UID: \"d8487f4b-21fb-4974-8506-a617f3b29e5c\") " pod="kube-system/kube-proxy-7zqn6" Aug 13 01:46:07.607512 containerd[1938]: time="2025-08-13T01:46:07.607239375Z" level=info msg="RunPodSandbox 
for &PodSandboxMetadata{Name:kube-proxy-7zqn6,Uid:d8487f4b-21fb-4974-8506-a617f3b29e5c,Namespace:kube-system,Attempt:0,}" Aug 13 01:46:07.612941 systemd[1]: Created slice kubepods-besteffort-pod80c5c93f_94ed_43b2_8bbf_6828013ceb4d.slice - libcontainer container kubepods-besteffort-pod80c5c93f_94ed_43b2_8bbf_6828013ceb4d.slice. Aug 13 01:46:07.616207 containerd[1938]: time="2025-08-13T01:46:07.616180129Z" level=info msg="connecting to shim 7859634f416d68d09c80c2970b82474d3f59f643048c47afe27b6c314a9ce8bb" address="unix:///run/containerd/s/535a8221d52db138e21fed4b0f61e88b16f841ca66a88912b7454915a694c784" namespace=k8s.io protocol=ttrpc version=3 Aug 13 01:46:07.635211 systemd[1]: Started cri-containerd-7859634f416d68d09c80c2970b82474d3f59f643048c47afe27b6c314a9ce8bb.scope - libcontainer container 7859634f416d68d09c80c2970b82474d3f59f643048c47afe27b6c314a9ce8bb. Aug 13 01:46:07.647116 containerd[1938]: time="2025-08-13T01:46:07.647070802Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-7zqn6,Uid:d8487f4b-21fb-4974-8506-a617f3b29e5c,Namespace:kube-system,Attempt:0,} returns sandbox id \"7859634f416d68d09c80c2970b82474d3f59f643048c47afe27b6c314a9ce8bb\"" Aug 13 01:46:07.648414 containerd[1938]: time="2025-08-13T01:46:07.648399578Z" level=info msg="CreateContainer within sandbox \"7859634f416d68d09c80c2970b82474d3f59f643048c47afe27b6c314a9ce8bb\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Aug 13 01:46:07.652650 containerd[1938]: time="2025-08-13T01:46:07.652635094Z" level=info msg="Container b002d54badd9b7ce06e0eefd79f028d19247dde5990c253dad250289b101c8f1: CDI devices from CRI Config.CDIDevices: []" Aug 13 01:46:07.655934 containerd[1938]: time="2025-08-13T01:46:07.655832293Z" level=info msg="CreateContainer within sandbox \"7859634f416d68d09c80c2970b82474d3f59f643048c47afe27b6c314a9ce8bb\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"b002d54badd9b7ce06e0eefd79f028d19247dde5990c253dad250289b101c8f1\"" 
Aug 13 01:46:07.656216 containerd[1938]: time="2025-08-13T01:46:07.656171695Z" level=info msg="StartContainer for \"b002d54badd9b7ce06e0eefd79f028d19247dde5990c253dad250289b101c8f1\"" Aug 13 01:46:07.657152 containerd[1938]: time="2025-08-13T01:46:07.657108118Z" level=info msg="connecting to shim b002d54badd9b7ce06e0eefd79f028d19247dde5990c253dad250289b101c8f1" address="unix:///run/containerd/s/535a8221d52db138e21fed4b0f61e88b16f841ca66a88912b7454915a694c784" protocol=ttrpc version=3 Aug 13 01:46:07.676132 systemd[1]: Started cri-containerd-b002d54badd9b7ce06e0eefd79f028d19247dde5990c253dad250289b101c8f1.scope - libcontainer container b002d54badd9b7ce06e0eefd79f028d19247dde5990c253dad250289b101c8f1. Aug 13 01:46:07.695785 containerd[1938]: time="2025-08-13T01:46:07.695757403Z" level=info msg="StartContainer for \"b002d54badd9b7ce06e0eefd79f028d19247dde5990c253dad250289b101c8f1\" returns successfully" Aug 13 01:46:07.699808 kubelet[3290]: I0813 01:46:07.699789 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/80c5c93f-94ed-43b2-8bbf-6828013ceb4d-var-lib-calico\") pod \"tigera-operator-747864d56d-54dsc\" (UID: \"80c5c93f-94ed-43b2-8bbf-6828013ceb4d\") " pod="tigera-operator/tigera-operator-747864d56d-54dsc" Aug 13 01:46:07.700108 kubelet[3290]: I0813 01:46:07.699812 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh48f\" (UniqueName: \"kubernetes.io/projected/80c5c93f-94ed-43b2-8bbf-6828013ceb4d-kube-api-access-jh48f\") pod \"tigera-operator-747864d56d-54dsc\" (UID: \"80c5c93f-94ed-43b2-8bbf-6828013ceb4d\") " pod="tigera-operator/tigera-operator-747864d56d-54dsc" Aug 13 01:46:07.918587 containerd[1938]: time="2025-08-13T01:46:07.918465498Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:tigera-operator-747864d56d-54dsc,Uid:80c5c93f-94ed-43b2-8bbf-6828013ceb4d,Namespace:tigera-operator,Attempt:0,}" Aug 13 01:46:07.926122 containerd[1938]: time="2025-08-13T01:46:07.926077195Z" level=info msg="connecting to shim 89e62afa32a88a86b8db0e74301c090e82bfd266b47b02a4c26e803e803a2286" address="unix:///run/containerd/s/270d7c4f26d96d7239eaf78879efc5940c86f6a84671f487e59cb8b707d085c5" namespace=k8s.io protocol=ttrpc version=3 Aug 13 01:46:07.959335 systemd[1]: Started cri-containerd-89e62afa32a88a86b8db0e74301c090e82bfd266b47b02a4c26e803e803a2286.scope - libcontainer container 89e62afa32a88a86b8db0e74301c090e82bfd266b47b02a4c26e803e803a2286. Aug 13 01:46:08.037556 containerd[1938]: time="2025-08-13T01:46:08.037525094Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-54dsc,Uid:80c5c93f-94ed-43b2-8bbf-6828013ceb4d,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"89e62afa32a88a86b8db0e74301c090e82bfd266b47b02a4c26e803e803a2286\"" Aug 13 01:46:08.038962 containerd[1938]: time="2025-08-13T01:46:08.038946941Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\"" Aug 13 01:46:08.702488 kubelet[3290]: I0813 01:46:08.702398 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-7zqn6" podStartSLOduration=1.702369574 podStartE2EDuration="1.702369574s" podCreationTimestamp="2025-08-13 01:46:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 01:46:08.702290098 +0000 UTC m=+7.079614360" watchObservedRunningTime="2025-08-13 01:46:08.702369574 +0000 UTC m=+7.079693836" Aug 13 01:46:09.689978 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1841134332.mount: Deactivated successfully. 
Aug 13 01:46:10.059317 containerd[1938]: time="2025-08-13T01:46:10.059237486Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:10.059545 containerd[1938]: time="2025-08-13T01:46:10.059389021Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.3: active requests=0, bytes read=25056543" Aug 13 01:46:10.059815 containerd[1938]: time="2025-08-13T01:46:10.059805467Z" level=info msg="ImageCreate event name:\"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:10.060698 containerd[1938]: time="2025-08-13T01:46:10.060686150Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:10.061377 containerd[1938]: time="2025-08-13T01:46:10.061360975Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.3\" with image id \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\", repo tag \"quay.io/tigera/operator:v1.38.3\", repo digest \"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\", size \"25052538\" in 2.022394775s" Aug 13 01:46:10.061425 containerd[1938]: time="2025-08-13T01:46:10.061379542Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\" returns image reference \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\"" Aug 13 01:46:10.062283 containerd[1938]: time="2025-08-13T01:46:10.062269452Z" level=info msg="CreateContainer within sandbox \"89e62afa32a88a86b8db0e74301c090e82bfd266b47b02a4c26e803e803a2286\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Aug 13 01:46:10.065008 containerd[1938]: time="2025-08-13T01:46:10.064997643Z" level=info msg="Container 
4a45d2c80b92f285dd5a843fd9d6cdd12fb3078c664fef6fa321892471dfbaad: CDI devices from CRI Config.CDIDevices: []" Aug 13 01:46:10.067379 containerd[1938]: time="2025-08-13T01:46:10.067355807Z" level=info msg="CreateContainer within sandbox \"89e62afa32a88a86b8db0e74301c090e82bfd266b47b02a4c26e803e803a2286\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"4a45d2c80b92f285dd5a843fd9d6cdd12fb3078c664fef6fa321892471dfbaad\"" Aug 13 01:46:10.067764 containerd[1938]: time="2025-08-13T01:46:10.067751891Z" level=info msg="StartContainer for \"4a45d2c80b92f285dd5a843fd9d6cdd12fb3078c664fef6fa321892471dfbaad\"" Aug 13 01:46:10.068179 containerd[1938]: time="2025-08-13T01:46:10.068166856Z" level=info msg="connecting to shim 4a45d2c80b92f285dd5a843fd9d6cdd12fb3078c664fef6fa321892471dfbaad" address="unix:///run/containerd/s/270d7c4f26d96d7239eaf78879efc5940c86f6a84671f487e59cb8b707d085c5" protocol=ttrpc version=3 Aug 13 01:46:10.082157 systemd[1]: Started cri-containerd-4a45d2c80b92f285dd5a843fd9d6cdd12fb3078c664fef6fa321892471dfbaad.scope - libcontainer container 4a45d2c80b92f285dd5a843fd9d6cdd12fb3078c664fef6fa321892471dfbaad. 
Aug 13 01:46:10.095849 containerd[1938]: time="2025-08-13T01:46:10.095827461Z" level=info msg="StartContainer for \"4a45d2c80b92f285dd5a843fd9d6cdd12fb3078c664fef6fa321892471dfbaad\" returns successfully" Aug 13 01:46:10.709557 kubelet[3290]: I0813 01:46:10.709527 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-747864d56d-54dsc" podStartSLOduration=1.686473365 podStartE2EDuration="3.70951672s" podCreationTimestamp="2025-08-13 01:46:07 +0000 UTC" firstStartedPulling="2025-08-13 01:46:08.038722912 +0000 UTC m=+6.416047174" lastFinishedPulling="2025-08-13 01:46:10.061766267 +0000 UTC m=+8.439090529" observedRunningTime="2025-08-13 01:46:10.709465157 +0000 UTC m=+9.086789424" watchObservedRunningTime="2025-08-13 01:46:10.70951672 +0000 UTC m=+9.086840979" Aug 13 01:46:14.479977 sudo[2238]: pam_unix(sudo:session): session closed for user root Aug 13 01:46:14.480632 sshd[2237]: Connection closed by 139.178.89.65 port 34626 Aug 13 01:46:14.480838 sshd-session[2235]: pam_unix(sshd:session): session closed for user core Aug 13 01:46:14.482693 systemd[1]: sshd@8-147.75.71.211:22-139.178.89.65:34626.service: Deactivated successfully. Aug 13 01:46:14.483829 systemd[1]: session-11.scope: Deactivated successfully. Aug 13 01:46:14.483940 systemd[1]: session-11.scope: Consumed 2.971s CPU time, 230M memory peak. Aug 13 01:46:14.485390 systemd-logind[1926]: Session 11 logged out. Waiting for processes to exit. Aug 13 01:46:14.486153 systemd-logind[1926]: Removed session 11. Aug 13 01:46:16.673978 update_engine[1931]: I20250813 01:46:16.673919 1931 update_attempter.cc:509] Updating boot flags... Aug 13 01:46:18.321665 systemd[1]: Created slice kubepods-besteffort-pod08106673_5c56_497a_9b87_f2aeed5c835c.slice - libcontainer container kubepods-besteffort-pod08106673_5c56_497a_9b87_f2aeed5c835c.slice. 
Aug 13 01:46:18.372505 kubelet[3290]: I0813 01:46:18.372430 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltkxf\" (UniqueName: \"kubernetes.io/projected/08106673-5c56-497a-9b87-f2aeed5c835c-kube-api-access-ltkxf\") pod \"calico-typha-567cb885c6-fw265\" (UID: \"08106673-5c56-497a-9b87-f2aeed5c835c\") " pod="calico-system/calico-typha-567cb885c6-fw265" Aug 13 01:46:18.373541 kubelet[3290]: I0813 01:46:18.372541 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08106673-5c56-497a-9b87-f2aeed5c835c-tigera-ca-bundle\") pod \"calico-typha-567cb885c6-fw265\" (UID: \"08106673-5c56-497a-9b87-f2aeed5c835c\") " pod="calico-system/calico-typha-567cb885c6-fw265" Aug 13 01:46:18.373541 kubelet[3290]: I0813 01:46:18.372604 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/08106673-5c56-497a-9b87-f2aeed5c835c-typha-certs\") pod \"calico-typha-567cb885c6-fw265\" (UID: \"08106673-5c56-497a-9b87-f2aeed5c835c\") " pod="calico-system/calico-typha-567cb885c6-fw265" Aug 13 01:46:18.626133 containerd[1938]: time="2025-08-13T01:46:18.626063341Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-567cb885c6-fw265,Uid:08106673-5c56-497a-9b87-f2aeed5c835c,Namespace:calico-system,Attempt:0,}" Aug 13 01:46:18.634159 containerd[1938]: time="2025-08-13T01:46:18.634101210Z" level=info msg="connecting to shim 8bc358a2dbe157e3ee8ed4e176274353d021f349957d0816d998280039a83973" address="unix:///run/containerd/s/82abcece01e9804a75b9894756810957ca9bc4ac2f3a9e56898d4319d315282d" namespace=k8s.io protocol=ttrpc version=3 Aug 13 01:46:18.650044 systemd[1]: Started cri-containerd-8bc358a2dbe157e3ee8ed4e176274353d021f349957d0816d998280039a83973.scope - libcontainer container 
8bc358a2dbe157e3ee8ed4e176274353d021f349957d0816d998280039a83973. Aug 13 01:46:18.675897 containerd[1938]: time="2025-08-13T01:46:18.675839893Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-567cb885c6-fw265,Uid:08106673-5c56-497a-9b87-f2aeed5c835c,Namespace:calico-system,Attempt:0,} returns sandbox id \"8bc358a2dbe157e3ee8ed4e176274353d021f349957d0816d998280039a83973\"" Aug 13 01:46:18.676493 containerd[1938]: time="2025-08-13T01:46:18.676480429Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\"" Aug 13 01:46:18.717019 systemd[1]: Created slice kubepods-besteffort-podb9456226_8c07_4eb0_b523_776a7baa3f0d.slice - libcontainer container kubepods-besteffort-podb9456226_8c07_4eb0_b523_776a7baa3f0d.slice. Aug 13 01:46:18.774282 kubelet[3290]: I0813 01:46:18.774189 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/b9456226-8c07-4eb0-b523-776a7baa3f0d-node-certs\") pod \"calico-node-89f4b\" (UID: \"b9456226-8c07-4eb0-b523-776a7baa3f0d\") " pod="calico-system/calico-node-89f4b" Aug 13 01:46:18.774561 kubelet[3290]: I0813 01:46:18.774383 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/b9456226-8c07-4eb0-b523-776a7baa3f0d-cni-net-dir\") pod \"calico-node-89f4b\" (UID: \"b9456226-8c07-4eb0-b523-776a7baa3f0d\") " pod="calico-system/calico-node-89f4b" Aug 13 01:46:18.774561 kubelet[3290]: I0813 01:46:18.774475 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/b9456226-8c07-4eb0-b523-776a7baa3f0d-flexvol-driver-host\") pod \"calico-node-89f4b\" (UID: \"b9456226-8c07-4eb0-b523-776a7baa3f0d\") " pod="calico-system/calico-node-89f4b" Aug 13 01:46:18.774797 kubelet[3290]: I0813 01:46:18.774568 3290 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r5c9\" (UniqueName: \"kubernetes.io/projected/b9456226-8c07-4eb0-b523-776a7baa3f0d-kube-api-access-2r5c9\") pod \"calico-node-89f4b\" (UID: \"b9456226-8c07-4eb0-b523-776a7baa3f0d\") " pod="calico-system/calico-node-89f4b" Aug 13 01:46:18.774797 kubelet[3290]: I0813 01:46:18.774631 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b9456226-8c07-4eb0-b523-776a7baa3f0d-lib-modules\") pod \"calico-node-89f4b\" (UID: \"b9456226-8c07-4eb0-b523-776a7baa3f0d\") " pod="calico-system/calico-node-89f4b" Aug 13 01:46:18.774797 kubelet[3290]: I0813 01:46:18.774680 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9456226-8c07-4eb0-b523-776a7baa3f0d-tigera-ca-bundle\") pod \"calico-node-89f4b\" (UID: \"b9456226-8c07-4eb0-b523-776a7baa3f0d\") " pod="calico-system/calico-node-89f4b" Aug 13 01:46:18.774797 kubelet[3290]: I0813 01:46:18.774730 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/b9456226-8c07-4eb0-b523-776a7baa3f0d-var-run-calico\") pod \"calico-node-89f4b\" (UID: \"b9456226-8c07-4eb0-b523-776a7baa3f0d\") " pod="calico-system/calico-node-89f4b" Aug 13 01:46:18.774797 kubelet[3290]: I0813 01:46:18.774781 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/b9456226-8c07-4eb0-b523-776a7baa3f0d-cni-bin-dir\") pod \"calico-node-89f4b\" (UID: \"b9456226-8c07-4eb0-b523-776a7baa3f0d\") " pod="calico-system/calico-node-89f4b" Aug 13 01:46:18.775310 kubelet[3290]: I0813 01:46:18.774832 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/b9456226-8c07-4eb0-b523-776a7baa3f0d-cni-log-dir\") pod \"calico-node-89f4b\" (UID: \"b9456226-8c07-4eb0-b523-776a7baa3f0d\") " pod="calico-system/calico-node-89f4b" Aug 13 01:46:18.775310 kubelet[3290]: I0813 01:46:18.774932 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/b9456226-8c07-4eb0-b523-776a7baa3f0d-policysync\") pod \"calico-node-89f4b\" (UID: \"b9456226-8c07-4eb0-b523-776a7baa3f0d\") " pod="calico-system/calico-node-89f4b" Aug 13 01:46:18.775310 kubelet[3290]: I0813 01:46:18.774983 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/b9456226-8c07-4eb0-b523-776a7baa3f0d-var-lib-calico\") pod \"calico-node-89f4b\" (UID: \"b9456226-8c07-4eb0-b523-776a7baa3f0d\") " pod="calico-system/calico-node-89f4b" Aug 13 01:46:18.775310 kubelet[3290]: I0813 01:46:18.775029 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b9456226-8c07-4eb0-b523-776a7baa3f0d-xtables-lock\") pod \"calico-node-89f4b\" (UID: \"b9456226-8c07-4eb0-b523-776a7baa3f0d\") " pod="calico-system/calico-node-89f4b" Aug 13 01:46:18.879444 kubelet[3290]: E0813 01:46:18.879271 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:18.879444 kubelet[3290]: W0813 01:46:18.879320 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:18.879444 kubelet[3290]: E0813 01:46:18.879397 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory 
nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:18.884774 kubelet[3290]: E0813 01:46:18.884706 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:18.884774 kubelet[3290]: W0813 01:46:18.884746 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:18.885122 kubelet[3290]: E0813 01:46:18.884786 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:18.897702 kubelet[3290]: E0813 01:46:18.897651 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:18.897702 kubelet[3290]: W0813 01:46:18.897697 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:18.898047 kubelet[3290]: E0813 01:46:18.897743 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:18.985256 kubelet[3290]: E0813 01:46:18.985147 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fd57q" podUID="bde1b4b3-33c0-4855-9196-5fe2289fff45" Aug 13 01:46:19.019247 containerd[1938]: time="2025-08-13T01:46:19.019167246Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-89f4b,Uid:b9456226-8c07-4eb0-b523-776a7baa3f0d,Namespace:calico-system,Attempt:0,}" Aug 13 01:46:19.026792 containerd[1938]: time="2025-08-13T01:46:19.026769054Z" level=info msg="connecting to shim 9360514811ec3057aaabdb8bc4c74261d7b00940d25584a389ec70dd388ede4e" address="unix:///run/containerd/s/1a3b6de1faddb0eb4abe97e2c67d188e65513d0e50b9df11d00977705b971dab" namespace=k8s.io protocol=ttrpc version=3 Aug 13 01:46:19.052058 systemd[1]: Started cri-containerd-9360514811ec3057aaabdb8bc4c74261d7b00940d25584a389ec70dd388ede4e.scope - libcontainer container 9360514811ec3057aaabdb8bc4c74261d7b00940d25584a389ec70dd388ede4e. Aug 13 01:46:19.060419 kubelet[3290]: E0813 01:46:19.060401 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.060419 kubelet[3290]: W0813 01:46:19.060415 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.060532 kubelet[3290]: E0813 01:46:19.060429 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.060566 kubelet[3290]: E0813 01:46:19.060554 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.060601 kubelet[3290]: W0813 01:46:19.060567 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.060601 kubelet[3290]: E0813 01:46:19.060575 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.060737 kubelet[3290]: E0813 01:46:19.060728 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.060775 kubelet[3290]: W0813 01:46:19.060739 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.060775 kubelet[3290]: E0813 01:46:19.060751 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.060916 kubelet[3290]: E0813 01:46:19.060905 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.060916 kubelet[3290]: W0813 01:46:19.060912 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.061011 kubelet[3290]: E0813 01:46:19.060919 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.061056 kubelet[3290]: E0813 01:46:19.061050 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.061096 kubelet[3290]: W0813 01:46:19.061056 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.061096 kubelet[3290]: E0813 01:46:19.061063 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.061187 kubelet[3290]: E0813 01:46:19.061174 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.061187 kubelet[3290]: W0813 01:46:19.061187 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.061270 kubelet[3290]: E0813 01:46:19.061194 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.061312 kubelet[3290]: E0813 01:46:19.061305 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.061312 kubelet[3290]: W0813 01:46:19.061311 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.061394 kubelet[3290]: E0813 01:46:19.061317 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.061436 kubelet[3290]: E0813 01:46:19.061424 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.061436 kubelet[3290]: W0813 01:46:19.061430 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.061436 kubelet[3290]: E0813 01:46:19.061436 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.061553 kubelet[3290]: E0813 01:46:19.061543 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.061553 kubelet[3290]: W0813 01:46:19.061550 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.061619 kubelet[3290]: E0813 01:46:19.061557 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.061667 kubelet[3290]: E0813 01:46:19.061658 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.061667 kubelet[3290]: W0813 01:46:19.061665 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.061713 kubelet[3290]: E0813 01:46:19.061670 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.061771 kubelet[3290]: E0813 01:46:19.061763 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.061771 kubelet[3290]: W0813 01:46:19.061769 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.061824 kubelet[3290]: E0813 01:46:19.061780 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.061898 kubelet[3290]: E0813 01:46:19.061889 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.061898 kubelet[3290]: W0813 01:46:19.061896 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.061954 kubelet[3290]: E0813 01:46:19.061901 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.062017 kubelet[3290]: E0813 01:46:19.062011 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.062046 kubelet[3290]: W0813 01:46:19.062017 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.062046 kubelet[3290]: E0813 01:46:19.062025 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.062132 kubelet[3290]: E0813 01:46:19.062125 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.062132 kubelet[3290]: W0813 01:46:19.062131 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.062178 kubelet[3290]: E0813 01:46:19.062137 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.062242 kubelet[3290]: E0813 01:46:19.062235 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.062269 kubelet[3290]: W0813 01:46:19.062242 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.062269 kubelet[3290]: E0813 01:46:19.062248 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.062358 kubelet[3290]: E0813 01:46:19.062351 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.062382 kubelet[3290]: W0813 01:46:19.062358 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.062382 kubelet[3290]: E0813 01:46:19.062364 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.062472 kubelet[3290]: E0813 01:46:19.062466 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.062500 kubelet[3290]: W0813 01:46:19.062472 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.062500 kubelet[3290]: E0813 01:46:19.062478 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.062586 kubelet[3290]: E0813 01:46:19.062579 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.062609 kubelet[3290]: W0813 01:46:19.062585 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.062609 kubelet[3290]: E0813 01:46:19.062591 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.062695 kubelet[3290]: E0813 01:46:19.062688 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.062724 kubelet[3290]: W0813 01:46:19.062695 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.062724 kubelet[3290]: E0813 01:46:19.062700 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.062810 kubelet[3290]: E0813 01:46:19.062803 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.062833 kubelet[3290]: W0813 01:46:19.062810 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.062833 kubelet[3290]: E0813 01:46:19.062816 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.065941 containerd[1938]: time="2025-08-13T01:46:19.065925259Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-89f4b,Uid:b9456226-8c07-4eb0-b523-776a7baa3f0d,Namespace:calico-system,Attempt:0,} returns sandbox id \"9360514811ec3057aaabdb8bc4c74261d7b00940d25584a389ec70dd388ede4e\"" Aug 13 01:46:19.078252 kubelet[3290]: E0813 01:46:19.078240 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.078252 kubelet[3290]: W0813 01:46:19.078249 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.078315 kubelet[3290]: E0813 01:46:19.078258 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.078315 kubelet[3290]: I0813 01:46:19.078274 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqtdp\" (UniqueName: \"kubernetes.io/projected/bde1b4b3-33c0-4855-9196-5fe2289fff45-kube-api-access-tqtdp\") pod \"csi-node-driver-fd57q\" (UID: \"bde1b4b3-33c0-4855-9196-5fe2289fff45\") " pod="calico-system/csi-node-driver-fd57q" Aug 13 01:46:19.078438 kubelet[3290]: E0813 01:46:19.078427 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.078438 kubelet[3290]: W0813 01:46:19.078436 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.078492 kubelet[3290]: E0813 01:46:19.078446 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.078492 kubelet[3290]: I0813 01:46:19.078458 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/bde1b4b3-33c0-4855-9196-5fe2289fff45-varrun\") pod \"csi-node-driver-fd57q\" (UID: \"bde1b4b3-33c0-4855-9196-5fe2289fff45\") " pod="calico-system/csi-node-driver-fd57q" Aug 13 01:46:19.078608 kubelet[3290]: E0813 01:46:19.078598 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.078608 kubelet[3290]: W0813 01:46:19.078606 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.078657 kubelet[3290]: E0813 01:46:19.078615 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.078657 kubelet[3290]: I0813 01:46:19.078628 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bde1b4b3-33c0-4855-9196-5fe2289fff45-registration-dir\") pod \"csi-node-driver-fd57q\" (UID: \"bde1b4b3-33c0-4855-9196-5fe2289fff45\") " pod="calico-system/csi-node-driver-fd57q" Aug 13 01:46:19.078763 kubelet[3290]: E0813 01:46:19.078754 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.078763 kubelet[3290]: W0813 01:46:19.078761 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.078809 kubelet[3290]: E0813 01:46:19.078769 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.078809 kubelet[3290]: I0813 01:46:19.078780 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bde1b4b3-33c0-4855-9196-5fe2289fff45-socket-dir\") pod \"csi-node-driver-fd57q\" (UID: \"bde1b4b3-33c0-4855-9196-5fe2289fff45\") " pod="calico-system/csi-node-driver-fd57q" Aug 13 01:46:19.078967 kubelet[3290]: E0813 01:46:19.078957 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.078967 kubelet[3290]: W0813 01:46:19.078966 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.079022 kubelet[3290]: E0813 01:46:19.078977 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.079022 kubelet[3290]: I0813 01:46:19.078989 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bde1b4b3-33c0-4855-9196-5fe2289fff45-kubelet-dir\") pod \"csi-node-driver-fd57q\" (UID: \"bde1b4b3-33c0-4855-9196-5fe2289fff45\") " pod="calico-system/csi-node-driver-fd57q" Aug 13 01:46:19.079154 kubelet[3290]: E0813 01:46:19.079144 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.079154 kubelet[3290]: W0813 01:46:19.079153 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.079222 kubelet[3290]: E0813 01:46:19.079162 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.079351 kubelet[3290]: E0813 01:46:19.079343 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.079387 kubelet[3290]: W0813 01:46:19.079351 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.079387 kubelet[3290]: E0813 01:46:19.079360 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.079518 kubelet[3290]: E0813 01:46:19.079511 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.079544 kubelet[3290]: W0813 01:46:19.079518 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.079544 kubelet[3290]: E0813 01:46:19.079525 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.079633 kubelet[3290]: E0813 01:46:19.079627 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.079633 kubelet[3290]: W0813 01:46:19.079633 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.079681 kubelet[3290]: E0813 01:46:19.079670 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.079741 kubelet[3290]: E0813 01:46:19.079735 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.079769 kubelet[3290]: W0813 01:46:19.079741 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.079769 kubelet[3290]: E0813 01:46:19.079753 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.079849 kubelet[3290]: E0813 01:46:19.079842 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.079849 kubelet[3290]: W0813 01:46:19.079848 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.079903 kubelet[3290]: E0813 01:46:19.079859 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.079958 kubelet[3290]: E0813 01:46:19.079951 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.079958 kubelet[3290]: W0813 01:46:19.079957 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.080006 kubelet[3290]: E0813 01:46:19.079964 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.080093 kubelet[3290]: E0813 01:46:19.080086 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.080093 kubelet[3290]: W0813 01:46:19.080092 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.080137 kubelet[3290]: E0813 01:46:19.080098 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.080199 kubelet[3290]: E0813 01:46:19.080192 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.080199 kubelet[3290]: W0813 01:46:19.080198 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.080246 kubelet[3290]: E0813 01:46:19.080204 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.080301 kubelet[3290]: E0813 01:46:19.080294 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.080326 kubelet[3290]: W0813 01:46:19.080300 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.080326 kubelet[3290]: E0813 01:46:19.080305 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.180565 kubelet[3290]: E0813 01:46:19.180400 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.180565 kubelet[3290]: W0813 01:46:19.180440 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.180565 kubelet[3290]: E0813 01:46:19.180471 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.180969 kubelet[3290]: E0813 01:46:19.180896 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.180969 kubelet[3290]: W0813 01:46:19.180922 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.180969 kubelet[3290]: E0813 01:46:19.180948 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.181493 kubelet[3290]: E0813 01:46:19.181426 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.181493 kubelet[3290]: W0813 01:46:19.181453 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.181493 kubelet[3290]: E0813 01:46:19.181492 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.182040 kubelet[3290]: E0813 01:46:19.181998 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.182040 kubelet[3290]: W0813 01:46:19.182036 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.182277 kubelet[3290]: E0813 01:46:19.182076 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.182568 kubelet[3290]: E0813 01:46:19.182520 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.182568 kubelet[3290]: W0813 01:46:19.182545 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.182767 kubelet[3290]: E0813 01:46:19.182576 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.182990 kubelet[3290]: E0813 01:46:19.182963 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.182990 kubelet[3290]: W0813 01:46:19.182985 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.183199 kubelet[3290]: E0813 01:46:19.183066 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.183382 kubelet[3290]: E0813 01:46:19.183359 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.183382 kubelet[3290]: W0813 01:46:19.183380 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.183562 kubelet[3290]: E0813 01:46:19.183454 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.183799 kubelet[3290]: E0813 01:46:19.183776 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.183921 kubelet[3290]: W0813 01:46:19.183797 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.183921 kubelet[3290]: E0813 01:46:19.183856 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.184183 kubelet[3290]: E0813 01:46:19.184158 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.184183 kubelet[3290]: W0813 01:46:19.184180 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.184365 kubelet[3290]: E0813 01:46:19.184238 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.184620 kubelet[3290]: E0813 01:46:19.184585 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.184772 kubelet[3290]: W0813 01:46:19.184622 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.184772 kubelet[3290]: E0813 01:46:19.184721 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.185190 kubelet[3290]: E0813 01:46:19.185150 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.185190 kubelet[3290]: W0813 01:46:19.185187 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.185503 kubelet[3290]: E0813 01:46:19.185249 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.185662 kubelet[3290]: E0813 01:46:19.185633 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.185662 kubelet[3290]: W0813 01:46:19.185658 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.185970 kubelet[3290]: E0813 01:46:19.185739 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.186111 kubelet[3290]: E0813 01:46:19.186074 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.186111 kubelet[3290]: W0813 01:46:19.186106 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.186323 kubelet[3290]: E0813 01:46:19.186164 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.186572 kubelet[3290]: E0813 01:46:19.186544 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.186673 kubelet[3290]: W0813 01:46:19.186570 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.186673 kubelet[3290]: E0813 01:46:19.186657 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.187101 kubelet[3290]: E0813 01:46:19.187066 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.187101 kubelet[3290]: W0813 01:46:19.187091 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.187362 kubelet[3290]: E0813 01:46:19.187144 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.187523 kubelet[3290]: E0813 01:46:19.187490 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.187523 kubelet[3290]: W0813 01:46:19.187516 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.187740 kubelet[3290]: E0813 01:46:19.187568 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.187991 kubelet[3290]: E0813 01:46:19.187947 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.187991 kubelet[3290]: W0813 01:46:19.187972 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.188391 kubelet[3290]: E0813 01:46:19.188038 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.188391 kubelet[3290]: E0813 01:46:19.188372 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.188718 kubelet[3290]: W0813 01:46:19.188402 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.188718 kubelet[3290]: E0813 01:46:19.188457 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.188952 kubelet[3290]: E0813 01:46:19.188859 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.188952 kubelet[3290]: W0813 01:46:19.188905 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.189184 kubelet[3290]: E0813 01:46:19.188980 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.189438 kubelet[3290]: E0813 01:46:19.189380 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.189438 kubelet[3290]: W0813 01:46:19.189406 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.189638 kubelet[3290]: E0813 01:46:19.189447 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.189858 kubelet[3290]: E0813 01:46:19.189826 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.189858 kubelet[3290]: W0813 01:46:19.189852 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.190110 kubelet[3290]: E0813 01:46:19.189942 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.190413 kubelet[3290]: E0813 01:46:19.190381 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.190413 kubelet[3290]: W0813 01:46:19.190409 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.190597 kubelet[3290]: E0813 01:46:19.190460 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.190845 kubelet[3290]: E0813 01:46:19.190817 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.191005 kubelet[3290]: W0813 01:46:19.190843 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.191005 kubelet[3290]: E0813 01:46:19.190916 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.191636 kubelet[3290]: E0813 01:46:19.191572 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.191636 kubelet[3290]: W0813 01:46:19.191623 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.191853 kubelet[3290]: E0813 01:46:19.191671 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:19.192403 kubelet[3290]: E0813 01:46:19.192332 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.192403 kubelet[3290]: W0813 01:46:19.192369 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.192403 kubelet[3290]: E0813 01:46:19.192403 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:19.209984 kubelet[3290]: E0813 01:46:19.209885 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:19.209984 kubelet[3290]: W0813 01:46:19.209936 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:19.209984 kubelet[3290]: E0813 01:46:19.209974 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:20.229663 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1655072072.mount: Deactivated successfully. 
Aug 13 01:46:20.668190 kubelet[3290]: E0813 01:46:20.668094 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fd57q" podUID="bde1b4b3-33c0-4855-9196-5fe2289fff45" Aug 13 01:46:20.923786 containerd[1938]: time="2025-08-13T01:46:20.923732965Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:20.924006 containerd[1938]: time="2025-08-13T01:46:20.923927270Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.2: active requests=0, bytes read=35233364" Aug 13 01:46:20.924311 containerd[1938]: time="2025-08-13T01:46:20.924299606Z" level=info msg="ImageCreate event name:\"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:20.925168 containerd[1938]: time="2025-08-13T01:46:20.925124433Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:20.925518 containerd[1938]: time="2025-08-13T01:46:20.925470067Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.2\" with image id \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\", size \"35233218\" in 2.24897388s" Aug 13 01:46:20.925518 containerd[1938]: time="2025-08-13T01:46:20.925486389Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\" returns image reference 
\"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\"" Aug 13 01:46:20.925948 containerd[1938]: time="2025-08-13T01:46:20.925938312Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\"" Aug 13 01:46:20.928923 containerd[1938]: time="2025-08-13T01:46:20.928907855Z" level=info msg="CreateContainer within sandbox \"8bc358a2dbe157e3ee8ed4e176274353d021f349957d0816d998280039a83973\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Aug 13 01:46:20.931617 containerd[1938]: time="2025-08-13T01:46:20.931579481Z" level=info msg="Container 43d7b08538c2fb9babdca744c8acf8d9f63530f1dc197c674c1418b7f27a0e74: CDI devices from CRI Config.CDIDevices: []" Aug 13 01:46:20.934287 containerd[1938]: time="2025-08-13T01:46:20.934246842Z" level=info msg="CreateContainer within sandbox \"8bc358a2dbe157e3ee8ed4e176274353d021f349957d0816d998280039a83973\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"43d7b08538c2fb9babdca744c8acf8d9f63530f1dc197c674c1418b7f27a0e74\"" Aug 13 01:46:20.934486 containerd[1938]: time="2025-08-13T01:46:20.934442509Z" level=info msg="StartContainer for \"43d7b08538c2fb9babdca744c8acf8d9f63530f1dc197c674c1418b7f27a0e74\"" Aug 13 01:46:20.934973 containerd[1938]: time="2025-08-13T01:46:20.934938166Z" level=info msg="connecting to shim 43d7b08538c2fb9babdca744c8acf8d9f63530f1dc197c674c1418b7f27a0e74" address="unix:///run/containerd/s/82abcece01e9804a75b9894756810957ca9bc4ac2f3a9e56898d4319d315282d" protocol=ttrpc version=3 Aug 13 01:46:20.948950 systemd[1]: Started cri-containerd-43d7b08538c2fb9babdca744c8acf8d9f63530f1dc197c674c1418b7f27a0e74.scope - libcontainer container 43d7b08538c2fb9babdca744c8acf8d9f63530f1dc197c674c1418b7f27a0e74. 
Aug 13 01:46:20.977074 containerd[1938]: time="2025-08-13T01:46:20.977050553Z" level=info msg="StartContainer for \"43d7b08538c2fb9babdca744c8acf8d9f63530f1dc197c674c1418b7f27a0e74\" returns successfully" Aug 13 01:46:21.750093 kubelet[3290]: I0813 01:46:21.749964 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-567cb885c6-fw265" podStartSLOduration=1.5004304990000001 podStartE2EDuration="3.749937362s" podCreationTimestamp="2025-08-13 01:46:18 +0000 UTC" firstStartedPulling="2025-08-13 01:46:18.676362014 +0000 UTC m=+17.053686277" lastFinishedPulling="2025-08-13 01:46:20.925868878 +0000 UTC m=+19.303193140" observedRunningTime="2025-08-13 01:46:21.749716766 +0000 UTC m=+20.127041071" watchObservedRunningTime="2025-08-13 01:46:21.749937362 +0000 UTC m=+20.127261653" Aug 13 01:46:21.786122 kubelet[3290]: E0813 01:46:21.786003 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.786122 kubelet[3290]: W0813 01:46:21.786067 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.786122 kubelet[3290]: E0813 01:46:21.786105 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:21.786701 kubelet[3290]: E0813 01:46:21.786623 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.786701 kubelet[3290]: W0813 01:46:21.786655 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.786701 kubelet[3290]: E0813 01:46:21.786686 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:21.787294 kubelet[3290]: E0813 01:46:21.787215 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.787294 kubelet[3290]: W0813 01:46:21.787251 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.787294 kubelet[3290]: E0813 01:46:21.787280 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:21.787902 kubelet[3290]: E0813 01:46:21.787835 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.787902 kubelet[3290]: W0813 01:46:21.787861 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.787902 kubelet[3290]: E0813 01:46:21.787902 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:21.788483 kubelet[3290]: E0813 01:46:21.788406 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.788483 kubelet[3290]: W0813 01:46:21.788437 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.788483 kubelet[3290]: E0813 01:46:21.788465 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:21.789023 kubelet[3290]: E0813 01:46:21.788946 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.789023 kubelet[3290]: W0813 01:46:21.788970 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.789023 kubelet[3290]: E0813 01:46:21.788992 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:21.789539 kubelet[3290]: E0813 01:46:21.789465 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.789539 kubelet[3290]: W0813 01:46:21.789490 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.789539 kubelet[3290]: E0813 01:46:21.789515 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:21.789982 kubelet[3290]: E0813 01:46:21.789952 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.789982 kubelet[3290]: W0813 01:46:21.789976 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.790166 kubelet[3290]: E0813 01:46:21.789996 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:21.790438 kubelet[3290]: E0813 01:46:21.790412 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.790438 kubelet[3290]: W0813 01:46:21.790435 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.790606 kubelet[3290]: E0813 01:46:21.790456 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:21.790879 kubelet[3290]: E0813 01:46:21.790846 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.790980 kubelet[3290]: W0813 01:46:21.790886 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.790980 kubelet[3290]: E0813 01:46:21.790910 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:21.791461 kubelet[3290]: E0813 01:46:21.791411 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.791461 kubelet[3290]: W0813 01:46:21.791438 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.791684 kubelet[3290]: E0813 01:46:21.791464 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:21.792057 kubelet[3290]: E0813 01:46:21.792000 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.792057 kubelet[3290]: W0813 01:46:21.792028 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.792057 kubelet[3290]: E0813 01:46:21.792057 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:21.792625 kubelet[3290]: E0813 01:46:21.792573 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.792625 kubelet[3290]: W0813 01:46:21.792601 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.792625 kubelet[3290]: E0813 01:46:21.792627 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:21.793239 kubelet[3290]: E0813 01:46:21.793175 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.793239 kubelet[3290]: W0813 01:46:21.793208 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.793239 kubelet[3290]: E0813 01:46:21.793235 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:21.793788 kubelet[3290]: E0813 01:46:21.793732 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.793788 kubelet[3290]: W0813 01:46:21.793759 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.793788 kubelet[3290]: E0813 01:46:21.793785 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:21.806821 kubelet[3290]: E0813 01:46:21.806766 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.806821 kubelet[3290]: W0813 01:46:21.806809 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.807278 kubelet[3290]: E0813 01:46:21.806846 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:21.807627 kubelet[3290]: E0813 01:46:21.807545 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.807627 kubelet[3290]: W0813 01:46:21.807584 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.807966 kubelet[3290]: E0813 01:46:21.807632 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:21.808394 kubelet[3290]: E0813 01:46:21.808333 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.808394 kubelet[3290]: W0813 01:46:21.808386 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.808705 kubelet[3290]: E0813 01:46:21.808444 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:21.809076 kubelet[3290]: E0813 01:46:21.808997 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.809076 kubelet[3290]: W0813 01:46:21.809040 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.809376 kubelet[3290]: E0813 01:46:21.809092 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:21.809698 kubelet[3290]: E0813 01:46:21.809656 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.809698 kubelet[3290]: W0813 01:46:21.809689 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.810040 kubelet[3290]: E0813 01:46:21.809730 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:21.810299 kubelet[3290]: E0813 01:46:21.810262 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.810299 kubelet[3290]: W0813 01:46:21.810292 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.810585 kubelet[3290]: E0813 01:46:21.810382 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:21.810775 kubelet[3290]: E0813 01:46:21.810740 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.810775 kubelet[3290]: W0813 01:46:21.810768 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.811106 kubelet[3290]: E0813 01:46:21.810824 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:21.811272 kubelet[3290]: E0813 01:46:21.811238 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.811272 kubelet[3290]: W0813 01:46:21.811267 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.811521 kubelet[3290]: E0813 01:46:21.811354 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:21.811730 kubelet[3290]: E0813 01:46:21.811694 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.811730 kubelet[3290]: W0813 01:46:21.811719 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.812041 kubelet[3290]: E0813 01:46:21.811752 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:21.812439 kubelet[3290]: E0813 01:46:21.812408 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.812439 kubelet[3290]: W0813 01:46:21.812435 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.812669 kubelet[3290]: E0813 01:46:21.812551 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:21.812926 kubelet[3290]: E0813 01:46:21.812859 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.812926 kubelet[3290]: W0813 01:46:21.812920 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.813228 kubelet[3290]: E0813 01:46:21.813033 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:21.813359 kubelet[3290]: E0813 01:46:21.813333 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.813494 kubelet[3290]: W0813 01:46:21.813357 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.813494 kubelet[3290]: E0813 01:46:21.813393 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:21.813948 kubelet[3290]: E0813 01:46:21.813907 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.814075 kubelet[3290]: W0813 01:46:21.813956 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.814075 kubelet[3290]: E0813 01:46:21.814009 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:21.814617 kubelet[3290]: E0813 01:46:21.814580 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.814617 kubelet[3290]: W0813 01:46:21.814608 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.814905 kubelet[3290]: E0813 01:46:21.814643 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:21.815433 kubelet[3290]: E0813 01:46:21.815391 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.815433 kubelet[3290]: W0813 01:46:21.815429 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.815728 kubelet[3290]: E0813 01:46:21.815475 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:21.816024 kubelet[3290]: E0813 01:46:21.815989 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.816024 kubelet[3290]: W0813 01:46:21.816017 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.816309 kubelet[3290]: E0813 01:46:21.816051 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:21.816575 kubelet[3290]: E0813 01:46:21.816540 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.816575 kubelet[3290]: W0813 01:46:21.816568 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.816969 kubelet[3290]: E0813 01:46:21.816600 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 01:46:21.817204 kubelet[3290]: E0813 01:46:21.817167 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 01:46:21.817204 kubelet[3290]: W0813 01:46:21.817195 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 01:46:21.817511 kubelet[3290]: E0813 01:46:21.817224 3290 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 01:46:22.460635 containerd[1938]: time="2025-08-13T01:46:22.460610386Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:22.460851 containerd[1938]: time="2025-08-13T01:46:22.460755156Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2: active requests=0, bytes read=4446956" Aug 13 01:46:22.461216 containerd[1938]: time="2025-08-13T01:46:22.461192013Z" level=info msg="ImageCreate event name:\"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:22.461947 containerd[1938]: time="2025-08-13T01:46:22.461910876Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:22.462252 containerd[1938]: time="2025-08-13T01:46:22.462236761Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" with image id \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\", size \"5939619\" in 1.536284385s" Aug 13 01:46:22.462296 containerd[1938]: time="2025-08-13T01:46:22.462253853Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" returns image reference \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\"" Aug 13 01:46:22.463187 containerd[1938]: time="2025-08-13T01:46:22.463173889Z" level=info msg="CreateContainer within sandbox \"9360514811ec3057aaabdb8bc4c74261d7b00940d25584a389ec70dd388ede4e\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Aug 13 01:46:22.466419 containerd[1938]: time="2025-08-13T01:46:22.466407745Z" level=info msg="Container ff8bdda17377b74d3e12f4e497b9fca2abd23e4bc39d2e5a37a1b8cf0a1be28d: CDI devices from CRI Config.CDIDevices: []" Aug 13 01:46:22.469530 containerd[1938]: time="2025-08-13T01:46:22.469492227Z" level=info msg="CreateContainer within sandbox \"9360514811ec3057aaabdb8bc4c74261d7b00940d25584a389ec70dd388ede4e\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"ff8bdda17377b74d3e12f4e497b9fca2abd23e4bc39d2e5a37a1b8cf0a1be28d\"" Aug 13 01:46:22.469711 containerd[1938]: time="2025-08-13T01:46:22.469700186Z" level=info msg="StartContainer for \"ff8bdda17377b74d3e12f4e497b9fca2abd23e4bc39d2e5a37a1b8cf0a1be28d\"" Aug 13 01:46:22.470524 containerd[1938]: time="2025-08-13T01:46:22.470477306Z" level=info msg="connecting to shim ff8bdda17377b74d3e12f4e497b9fca2abd23e4bc39d2e5a37a1b8cf0a1be28d" address="unix:///run/containerd/s/1a3b6de1faddb0eb4abe97e2c67d188e65513d0e50b9df11d00977705b971dab" protocol=ttrpc version=3 Aug 13 01:46:22.499003 systemd[1]: Started cri-containerd-ff8bdda17377b74d3e12f4e497b9fca2abd23e4bc39d2e5a37a1b8cf0a1be28d.scope - libcontainer container ff8bdda17377b74d3e12f4e497b9fca2abd23e4bc39d2e5a37a1b8cf0a1be28d. Aug 13 01:46:22.524710 containerd[1938]: time="2025-08-13T01:46:22.524647079Z" level=info msg="StartContainer for \"ff8bdda17377b74d3e12f4e497b9fca2abd23e4bc39d2e5a37a1b8cf0a1be28d\" returns successfully" Aug 13 01:46:22.530009 systemd[1]: cri-containerd-ff8bdda17377b74d3e12f4e497b9fca2abd23e4bc39d2e5a37a1b8cf0a1be28d.scope: Deactivated successfully. 
Aug 13 01:46:22.531309 containerd[1938]: time="2025-08-13T01:46:22.531283086Z" level=info msg="received exit event container_id:\"ff8bdda17377b74d3e12f4e497b9fca2abd23e4bc39d2e5a37a1b8cf0a1be28d\" id:\"ff8bdda17377b74d3e12f4e497b9fca2abd23e4bc39d2e5a37a1b8cf0a1be28d\" pid:4117 exited_at:{seconds:1755049582 nanos:531021601}" Aug 13 01:46:22.531309 containerd[1938]: time="2025-08-13T01:46:22.531301890Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ff8bdda17377b74d3e12f4e497b9fca2abd23e4bc39d2e5a37a1b8cf0a1be28d\" id:\"ff8bdda17377b74d3e12f4e497b9fca2abd23e4bc39d2e5a37a1b8cf0a1be28d\" pid:4117 exited_at:{seconds:1755049582 nanos:531021601}" Aug 13 01:46:22.547709 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ff8bdda17377b74d3e12f4e497b9fca2abd23e4bc39d2e5a37a1b8cf0a1be28d-rootfs.mount: Deactivated successfully. Aug 13 01:46:22.667559 kubelet[3290]: E0813 01:46:22.667408 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fd57q" podUID="bde1b4b3-33c0-4855-9196-5fe2289fff45" Aug 13 01:46:23.742296 containerd[1938]: time="2025-08-13T01:46:23.742200319Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\"" Aug 13 01:46:24.667842 kubelet[3290]: E0813 01:46:24.667710 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fd57q" podUID="bde1b4b3-33c0-4855-9196-5fe2289fff45" Aug 13 01:46:26.667362 kubelet[3290]: E0813 01:46:26.667342 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fd57q" podUID="bde1b4b3-33c0-4855-9196-5fe2289fff45" Aug 13 01:46:26.893354 containerd[1938]: time="2025-08-13T01:46:26.893328722Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:26.893596 containerd[1938]: time="2025-08-13T01:46:26.893555104Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.2: active requests=0, bytes read=70436221" Aug 13 01:46:26.893989 containerd[1938]: time="2025-08-13T01:46:26.893945805Z" level=info msg="ImageCreate event name:\"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:26.894836 containerd[1938]: time="2025-08-13T01:46:26.894823049Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:26.895255 containerd[1938]: time="2025-08-13T01:46:26.895242518Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.2\" with image id \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\", size \"71928924\" in 3.152966845s" Aug 13 01:46:26.895255 containerd[1938]: time="2025-08-13T01:46:26.895255301Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\" returns image reference \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\"" Aug 13 01:46:26.896247 containerd[1938]: time="2025-08-13T01:46:26.896235112Z" level=info msg="CreateContainer within sandbox \"9360514811ec3057aaabdb8bc4c74261d7b00940d25584a389ec70dd388ede4e\" for container 
&ContainerMetadata{Name:install-cni,Attempt:0,}" Aug 13 01:46:26.899631 containerd[1938]: time="2025-08-13T01:46:26.899594956Z" level=info msg="Container 51f79ee67dad551521478cecf82f463cd47474488209fe040730bad1f0706aaf: CDI devices from CRI Config.CDIDevices: []" Aug 13 01:46:26.903112 containerd[1938]: time="2025-08-13T01:46:26.903069420Z" level=info msg="CreateContainer within sandbox \"9360514811ec3057aaabdb8bc4c74261d7b00940d25584a389ec70dd388ede4e\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"51f79ee67dad551521478cecf82f463cd47474488209fe040730bad1f0706aaf\"" Aug 13 01:46:26.903353 containerd[1938]: time="2025-08-13T01:46:26.903340421Z" level=info msg="StartContainer for \"51f79ee67dad551521478cecf82f463cd47474488209fe040730bad1f0706aaf\"" Aug 13 01:46:26.904113 containerd[1938]: time="2025-08-13T01:46:26.904077116Z" level=info msg="connecting to shim 51f79ee67dad551521478cecf82f463cd47474488209fe040730bad1f0706aaf" address="unix:///run/containerd/s/1a3b6de1faddb0eb4abe97e2c67d188e65513d0e50b9df11d00977705b971dab" protocol=ttrpc version=3 Aug 13 01:46:26.918147 systemd[1]: Started cri-containerd-51f79ee67dad551521478cecf82f463cd47474488209fe040730bad1f0706aaf.scope - libcontainer container 51f79ee67dad551521478cecf82f463cd47474488209fe040730bad1f0706aaf. Aug 13 01:46:26.938743 containerd[1938]: time="2025-08-13T01:46:26.938717739Z" level=info msg="StartContainer for \"51f79ee67dad551521478cecf82f463cd47474488209fe040730bad1f0706aaf\" returns successfully" Aug 13 01:46:27.505278 systemd[1]: cri-containerd-51f79ee67dad551521478cecf82f463cd47474488209fe040730bad1f0706aaf.scope: Deactivated successfully. Aug 13 01:46:27.505436 systemd[1]: cri-containerd-51f79ee67dad551521478cecf82f463cd47474488209fe040730bad1f0706aaf.scope: Consumed 371ms CPU time, 193.8M memory peak, 171.2M written to disk. 
Aug 13 01:46:27.505829 containerd[1938]: time="2025-08-13T01:46:27.505810646Z" level=info msg="received exit event container_id:\"51f79ee67dad551521478cecf82f463cd47474488209fe040730bad1f0706aaf\" id:\"51f79ee67dad551521478cecf82f463cd47474488209fe040730bad1f0706aaf\" pid:4178 exited_at:{seconds:1755049587 nanos:505681575}" Aug 13 01:46:27.505861 containerd[1938]: time="2025-08-13T01:46:27.505831420Z" level=info msg="TaskExit event in podsandbox handler container_id:\"51f79ee67dad551521478cecf82f463cd47474488209fe040730bad1f0706aaf\" id:\"51f79ee67dad551521478cecf82f463cd47474488209fe040730bad1f0706aaf\" pid:4178 exited_at:{seconds:1755049587 nanos:505681575}" Aug 13 01:46:27.516696 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-51f79ee67dad551521478cecf82f463cd47474488209fe040730bad1f0706aaf-rootfs.mount: Deactivated successfully. Aug 13 01:46:27.587652 kubelet[3290]: I0813 01:46:27.587546 3290 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Aug 13 01:46:27.636090 systemd[1]: Created slice kubepods-besteffort-podde755d1b_da9c_40d7_8dfc_e44c18abf305.slice - libcontainer container kubepods-besteffort-podde755d1b_da9c_40d7_8dfc_e44c18abf305.slice. Aug 13 01:46:27.643657 systemd[1]: Created slice kubepods-besteffort-podb3feb002_5e3f_48a9_a537_288b8166141e.slice - libcontainer container kubepods-besteffort-podb3feb002_5e3f_48a9_a537_288b8166141e.slice. Aug 13 01:46:27.650538 systemd[1]: Created slice kubepods-burstable-pode53e116f_77da_4082_88d4_1cdc062c5b36.slice - libcontainer container kubepods-burstable-pode53e116f_77da_4082_88d4_1cdc062c5b36.slice. 
Aug 13 01:46:27.650706 kubelet[3290]: I0813 01:46:27.650676 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/b3feb002-5e3f-48a9-a537-288b8166141e-calico-apiserver-certs\") pod \"calico-apiserver-795979ddf-wcmd7\" (UID: \"b3feb002-5e3f-48a9-a537-288b8166141e\") " pod="calico-apiserver/calico-apiserver-795979ddf-wcmd7" Aug 13 01:46:27.650772 kubelet[3290]: I0813 01:46:27.650722 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bafd23a2-669f-44d7-adaf-3f7ce6d9df14-whisker-ca-bundle\") pod \"whisker-5584fbdb67-lqhr4\" (UID: \"bafd23a2-669f-44d7-adaf-3f7ce6d9df14\") " pod="calico-system/whisker-5584fbdb67-lqhr4" Aug 13 01:46:27.650772 kubelet[3290]: I0813 01:46:27.650750 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29f8l\" (UniqueName: \"kubernetes.io/projected/de755d1b-da9c-40d7-8dfc-e44c18abf305-kube-api-access-29f8l\") pod \"goldmane-768f4c5c69-g4k5b\" (UID: \"de755d1b-da9c-40d7-8dfc-e44c18abf305\") " pod="calico-system/goldmane-768f4c5c69-g4k5b" Aug 13 01:46:27.650848 kubelet[3290]: I0813 01:46:27.650771 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e53e116f-77da-4082-88d4-1cdc062c5b36-config-volume\") pod \"coredns-668d6bf9bc-7qzrh\" (UID: \"e53e116f-77da-4082-88d4-1cdc062c5b36\") " pod="kube-system/coredns-668d6bf9bc-7qzrh" Aug 13 01:46:27.650848 kubelet[3290]: I0813 01:46:27.650807 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/de755d1b-da9c-40d7-8dfc-e44c18abf305-goldmane-key-pair\") pod \"goldmane-768f4c5c69-g4k5b\" (UID: 
\"de755d1b-da9c-40d7-8dfc-e44c18abf305\") " pod="calico-system/goldmane-768f4c5c69-g4k5b" Aug 13 01:46:27.650848 kubelet[3290]: I0813 01:46:27.650827 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m59qb\" (UniqueName: \"kubernetes.io/projected/e53e116f-77da-4082-88d4-1cdc062c5b36-kube-api-access-m59qb\") pod \"coredns-668d6bf9bc-7qzrh\" (UID: \"e53e116f-77da-4082-88d4-1cdc062c5b36\") " pod="kube-system/coredns-668d6bf9bc-7qzrh" Aug 13 01:46:27.650988 kubelet[3290]: I0813 01:46:27.650847 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrhpt\" (UniqueName: \"kubernetes.io/projected/bafd23a2-669f-44d7-adaf-3f7ce6d9df14-kube-api-access-zrhpt\") pod \"whisker-5584fbdb67-lqhr4\" (UID: \"bafd23a2-669f-44d7-adaf-3f7ce6d9df14\") " pod="calico-system/whisker-5584fbdb67-lqhr4" Aug 13 01:46:27.650988 kubelet[3290]: I0813 01:46:27.650935 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de755d1b-da9c-40d7-8dfc-e44c18abf305-config\") pod \"goldmane-768f4c5c69-g4k5b\" (UID: \"de755d1b-da9c-40d7-8dfc-e44c18abf305\") " pod="calico-system/goldmane-768f4c5c69-g4k5b" Aug 13 01:46:27.650988 kubelet[3290]: I0813 01:46:27.650977 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de755d1b-da9c-40d7-8dfc-e44c18abf305-goldmane-ca-bundle\") pod \"goldmane-768f4c5c69-g4k5b\" (UID: \"de755d1b-da9c-40d7-8dfc-e44c18abf305\") " pod="calico-system/goldmane-768f4c5c69-g4k5b" Aug 13 01:46:27.651098 kubelet[3290]: I0813 01:46:27.651022 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: 
\"kubernetes.io/secret/bafd23a2-669f-44d7-adaf-3f7ce6d9df14-whisker-backend-key-pair\") pod \"whisker-5584fbdb67-lqhr4\" (UID: \"bafd23a2-669f-44d7-adaf-3f7ce6d9df14\") " pod="calico-system/whisker-5584fbdb67-lqhr4" Aug 13 01:46:27.651098 kubelet[3290]: I0813 01:46:27.651045 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55rfl\" (UniqueName: \"kubernetes.io/projected/b3feb002-5e3f-48a9-a537-288b8166141e-kube-api-access-55rfl\") pod \"calico-apiserver-795979ddf-wcmd7\" (UID: \"b3feb002-5e3f-48a9-a537-288b8166141e\") " pod="calico-apiserver/calico-apiserver-795979ddf-wcmd7" Aug 13 01:46:27.657220 systemd[1]: Created slice kubepods-besteffort-podbafd23a2_669f_44d7_adaf_3f7ce6d9df14.slice - libcontainer container kubepods-besteffort-podbafd23a2_669f_44d7_adaf_3f7ce6d9df14.slice. Aug 13 01:46:27.662470 systemd[1]: Created slice kubepods-besteffort-pod105c1b16_b046_4e8f_a08b_e099d1063650.slice - libcontainer container kubepods-besteffort-pod105c1b16_b046_4e8f_a08b_e099d1063650.slice. Aug 13 01:46:27.667217 systemd[1]: Created slice kubepods-burstable-pod0aac1d2c_a1ff_4714_8f26_f62a9c1ad472.slice - libcontainer container kubepods-burstable-pod0aac1d2c_a1ff_4714_8f26_f62a9c1ad472.slice. Aug 13 01:46:27.671048 systemd[1]: Created slice kubepods-besteffort-pod5a16da63_a958_45dd_907e_53b0c801fd41.slice - libcontainer container kubepods-besteffort-pod5a16da63_a958_45dd_907e_53b0c801fd41.slice. 
Aug 13 01:46:27.752537 kubelet[3290]: I0813 01:46:27.752420 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z5lk\" (UniqueName: \"kubernetes.io/projected/5a16da63-a958-45dd-907e-53b0c801fd41-kube-api-access-4z5lk\") pod \"calico-kube-controllers-7846546c6c-fbv5w\" (UID: \"5a16da63-a958-45dd-907e-53b0c801fd41\") " pod="calico-system/calico-kube-controllers-7846546c6c-fbv5w" Aug 13 01:46:27.754019 kubelet[3290]: I0813 01:46:27.752752 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dljtd\" (UniqueName: \"kubernetes.io/projected/105c1b16-b046-4e8f-a08b-e099d1063650-kube-api-access-dljtd\") pod \"calico-apiserver-795979ddf-pkw5n\" (UID: \"105c1b16-b046-4e8f-a08b-e099d1063650\") " pod="calico-apiserver/calico-apiserver-795979ddf-pkw5n" Aug 13 01:46:27.754019 kubelet[3290]: I0813 01:46:27.752895 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a16da63-a958-45dd-907e-53b0c801fd41-tigera-ca-bundle\") pod \"calico-kube-controllers-7846546c6c-fbv5w\" (UID: \"5a16da63-a958-45dd-907e-53b0c801fd41\") " pod="calico-system/calico-kube-controllers-7846546c6c-fbv5w" Aug 13 01:46:27.754019 kubelet[3290]: I0813 01:46:27.753113 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz4g8\" (UniqueName: \"kubernetes.io/projected/0aac1d2c-a1ff-4714-8f26-f62a9c1ad472-kube-api-access-zz4g8\") pod \"coredns-668d6bf9bc-47grb\" (UID: \"0aac1d2c-a1ff-4714-8f26-f62a9c1ad472\") " pod="kube-system/coredns-668d6bf9bc-47grb" Aug 13 01:46:27.754019 kubelet[3290]: I0813 01:46:27.753565 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/0aac1d2c-a1ff-4714-8f26-f62a9c1ad472-config-volume\") pod \"coredns-668d6bf9bc-47grb\" (UID: \"0aac1d2c-a1ff-4714-8f26-f62a9c1ad472\") " pod="kube-system/coredns-668d6bf9bc-47grb" Aug 13 01:46:27.754019 kubelet[3290]: I0813 01:46:27.753701 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/105c1b16-b046-4e8f-a08b-e099d1063650-calico-apiserver-certs\") pod \"calico-apiserver-795979ddf-pkw5n\" (UID: \"105c1b16-b046-4e8f-a08b-e099d1063650\") " pod="calico-apiserver/calico-apiserver-795979ddf-pkw5n" Aug 13 01:46:27.940745 containerd[1938]: time="2025-08-13T01:46:27.940643936Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-g4k5b,Uid:de755d1b-da9c-40d7-8dfc-e44c18abf305,Namespace:calico-system,Attempt:0,}" Aug 13 01:46:27.947273 containerd[1938]: time="2025-08-13T01:46:27.947223713Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-795979ddf-wcmd7,Uid:b3feb002-5e3f-48a9-a537-288b8166141e,Namespace:calico-apiserver,Attempt:0,}" Aug 13 01:46:27.954840 containerd[1938]: time="2025-08-13T01:46:27.954818905Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-7qzrh,Uid:e53e116f-77da-4082-88d4-1cdc062c5b36,Namespace:kube-system,Attempt:0,}" Aug 13 01:46:27.960380 containerd[1938]: time="2025-08-13T01:46:27.960349361Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5584fbdb67-lqhr4,Uid:bafd23a2-669f-44d7-adaf-3f7ce6d9df14,Namespace:calico-system,Attempt:0,}" Aug 13 01:46:27.964818 containerd[1938]: time="2025-08-13T01:46:27.964795799Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-795979ddf-pkw5n,Uid:105c1b16-b046-4e8f-a08b-e099d1063650,Namespace:calico-apiserver,Attempt:0,}" Aug 13 01:46:27.970271 containerd[1938]: time="2025-08-13T01:46:27.970243750Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-668d6bf9bc-47grb,Uid:0aac1d2c-a1ff-4714-8f26-f62a9c1ad472,Namespace:kube-system,Attempt:0,}" Aug 13 01:46:27.970371 containerd[1938]: time="2025-08-13T01:46:27.970281784Z" level=error msg="Failed to destroy network for sandbox \"756b19702937786f2cce7e05946804200e79084af4d983670fed96216e0401b3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 01:46:27.970976 containerd[1938]: time="2025-08-13T01:46:27.970943185Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-g4k5b,Uid:de755d1b-da9c-40d7-8dfc-e44c18abf305,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"756b19702937786f2cce7e05946804200e79084af4d983670fed96216e0401b3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 01:46:27.971123 kubelet[3290]: E0813 01:46:27.971091 3290 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"756b19702937786f2cce7e05946804200e79084af4d983670fed96216e0401b3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 01:46:27.971168 kubelet[3290]: E0813 01:46:27.971152 3290 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"756b19702937786f2cce7e05946804200e79084af4d983670fed96216e0401b3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/goldmane-768f4c5c69-g4k5b" Aug 13 01:46:27.971204 kubelet[3290]: E0813 01:46:27.971175 3290 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"756b19702937786f2cce7e05946804200e79084af4d983670fed96216e0401b3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-g4k5b" Aug 13 01:46:27.971252 kubelet[3290]: E0813 01:46:27.971232 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-768f4c5c69-g4k5b_calico-system(de755d1b-da9c-40d7-8dfc-e44c18abf305)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-768f4c5c69-g4k5b_calico-system(de755d1b-da9c-40d7-8dfc-e44c18abf305)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"756b19702937786f2cce7e05946804200e79084af4d983670fed96216e0401b3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-768f4c5c69-g4k5b" podUID="de755d1b-da9c-40d7-8dfc-e44c18abf305" Aug 13 01:46:27.973632 containerd[1938]: time="2025-08-13T01:46:27.973602566Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7846546c6c-fbv5w,Uid:5a16da63-a958-45dd-907e-53b0c801fd41,Namespace:calico-system,Attempt:0,}" Aug 13 01:46:27.973846 containerd[1938]: time="2025-08-13T01:46:27.973827423Z" level=error msg="Failed to destroy network for sandbox \"0196a211b6df712d3b470d4387bfaa1bfb4f85ad72c35db7c6b5bcf474fa4b6e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 
13 01:46:27.974330 containerd[1938]: time="2025-08-13T01:46:27.974312589Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-795979ddf-wcmd7,Uid:b3feb002-5e3f-48a9-a537-288b8166141e,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0196a211b6df712d3b470d4387bfaa1bfb4f85ad72c35db7c6b5bcf474fa4b6e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 01:46:27.974438 kubelet[3290]: E0813 01:46:27.974418 3290 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0196a211b6df712d3b470d4387bfaa1bfb4f85ad72c35db7c6b5bcf474fa4b6e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 01:46:27.974492 kubelet[3290]: E0813 01:46:27.974455 3290 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0196a211b6df712d3b470d4387bfaa1bfb4f85ad72c35db7c6b5bcf474fa4b6e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-795979ddf-wcmd7" Aug 13 01:46:27.974492 kubelet[3290]: E0813 01:46:27.974474 3290 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0196a211b6df712d3b470d4387bfaa1bfb4f85ad72c35db7c6b5bcf474fa4b6e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-795979ddf-wcmd7" Aug 13 01:46:27.974551 kubelet[3290]: E0813 01:46:27.974508 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-795979ddf-wcmd7_calico-apiserver(b3feb002-5e3f-48a9-a537-288b8166141e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-795979ddf-wcmd7_calico-apiserver(b3feb002-5e3f-48a9-a537-288b8166141e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0196a211b6df712d3b470d4387bfaa1bfb4f85ad72c35db7c6b5bcf474fa4b6e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-795979ddf-wcmd7" podUID="b3feb002-5e3f-48a9-a537-288b8166141e" Aug 13 01:46:27.981456 containerd[1938]: time="2025-08-13T01:46:27.981428363Z" level=error msg="Failed to destroy network for sandbox \"95f7ec947a98ae1282a157d3f30f13c77dadd7ad556b83ae7466e98ad76106ec\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 01:46:27.981843 containerd[1938]: time="2025-08-13T01:46:27.981825132Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-7qzrh,Uid:e53e116f-77da-4082-88d4-1cdc062c5b36,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"95f7ec947a98ae1282a157d3f30f13c77dadd7ad556b83ae7466e98ad76106ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 01:46:27.981993 kubelet[3290]: E0813 01:46:27.981971 3290 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: 
code = Unknown desc = failed to setup network for sandbox \"95f7ec947a98ae1282a157d3f30f13c77dadd7ad556b83ae7466e98ad76106ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 01:46:27.982025 kubelet[3290]: E0813 01:46:27.982010 3290 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"95f7ec947a98ae1282a157d3f30f13c77dadd7ad556b83ae7466e98ad76106ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-7qzrh" Aug 13 01:46:27.982046 kubelet[3290]: E0813 01:46:27.982029 3290 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"95f7ec947a98ae1282a157d3f30f13c77dadd7ad556b83ae7466e98ad76106ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-7qzrh" Aug 13 01:46:27.982069 kubelet[3290]: E0813 01:46:27.982055 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-7qzrh_kube-system(e53e116f-77da-4082-88d4-1cdc062c5b36)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-7qzrh_kube-system(e53e116f-77da-4082-88d4-1cdc062c5b36)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"95f7ec947a98ae1282a157d3f30f13c77dadd7ad556b83ae7466e98ad76106ec\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="kube-system/coredns-668d6bf9bc-7qzrh" podUID="e53e116f-77da-4082-88d4-1cdc062c5b36" Aug 13 01:46:27.990569 containerd[1938]: time="2025-08-13T01:46:27.990531127Z" level=error msg="Failed to destroy network for sandbox \"d354eb565d3d1f94ebe4f680e72fe47978a8f3edd9931c022016e5316a518209\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 01:46:27.991000 containerd[1938]: time="2025-08-13T01:46:27.990977350Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5584fbdb67-lqhr4,Uid:bafd23a2-669f-44d7-adaf-3f7ce6d9df14,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d354eb565d3d1f94ebe4f680e72fe47978a8f3edd9931c022016e5316a518209\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 01:46:27.991176 kubelet[3290]: E0813 01:46:27.991143 3290 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d354eb565d3d1f94ebe4f680e72fe47978a8f3edd9931c022016e5316a518209\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 01:46:27.991238 kubelet[3290]: E0813 01:46:27.991197 3290 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d354eb565d3d1f94ebe4f680e72fe47978a8f3edd9931c022016e5316a518209\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5584fbdb67-lqhr4" Aug 13 
01:46:27.991238 kubelet[3290]: E0813 01:46:27.991217 3290 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d354eb565d3d1f94ebe4f680e72fe47978a8f3edd9931c022016e5316a518209\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5584fbdb67-lqhr4" Aug 13 01:46:27.991307 kubelet[3290]: E0813 01:46:27.991260 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5584fbdb67-lqhr4_calico-system(bafd23a2-669f-44d7-adaf-3f7ce6d9df14)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5584fbdb67-lqhr4_calico-system(bafd23a2-669f-44d7-adaf-3f7ce6d9df14)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d354eb565d3d1f94ebe4f680e72fe47978a8f3edd9931c022016e5316a518209\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5584fbdb67-lqhr4" podUID="bafd23a2-669f-44d7-adaf-3f7ce6d9df14" Aug 13 01:46:27.994282 containerd[1938]: time="2025-08-13T01:46:27.994255388Z" level=error msg="Failed to destroy network for sandbox \"b10191f7c097495c1b6f034fad81e085725f148cd257d13587d0cf787ec14c07\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 01:46:28.001307 containerd[1938]: time="2025-08-13T01:46:28.001281089Z" level=error msg="Failed to destroy network for sandbox \"b07f8e7232cdffc92a685675319fb795999c7f0087a2500e8eab76a076d200db\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 01:46:28.001376 containerd[1938]: time="2025-08-13T01:46:28.001353238Z" level=error msg="Failed to destroy network for sandbox \"5367df713a347056dea2b7da9293bf10a9fb24503802882c388125ed888d65b2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 01:46:28.010218 containerd[1938]: time="2025-08-13T01:46:28.010198745Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-795979ddf-pkw5n,Uid:105c1b16-b046-4e8f-a08b-e099d1063650,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b10191f7c097495c1b6f034fad81e085725f148cd257d13587d0cf787ec14c07\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 01:46:28.010341 kubelet[3290]: E0813 01:46:28.010321 3290 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b10191f7c097495c1b6f034fad81e085725f148cd257d13587d0cf787ec14c07\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 01:46:28.010375 kubelet[3290]: E0813 01:46:28.010355 3290 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b10191f7c097495c1b6f034fad81e085725f148cd257d13587d0cf787ec14c07\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-795979ddf-pkw5n" Aug 13 01:46:28.010375 kubelet[3290]: E0813 01:46:28.010367 3290 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b10191f7c097495c1b6f034fad81e085725f148cd257d13587d0cf787ec14c07\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-795979ddf-pkw5n" Aug 13 01:46:28.010416 kubelet[3290]: E0813 01:46:28.010393 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-795979ddf-pkw5n_calico-apiserver(105c1b16-b046-4e8f-a08b-e099d1063650)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-795979ddf-pkw5n_calico-apiserver(105c1b16-b046-4e8f-a08b-e099d1063650)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b10191f7c097495c1b6f034fad81e085725f148cd257d13587d0cf787ec14c07\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-795979ddf-pkw5n" podUID="105c1b16-b046-4e8f-a08b-e099d1063650" Aug 13 01:46:28.010650 containerd[1938]: time="2025-08-13T01:46:28.010632794Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7846546c6c-fbv5w,Uid:5a16da63-a958-45dd-907e-53b0c801fd41,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b07f8e7232cdffc92a685675319fb795999c7f0087a2500e8eab76a076d200db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 
01:46:28.010751 kubelet[3290]: E0813 01:46:28.010737 3290 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b07f8e7232cdffc92a685675319fb795999c7f0087a2500e8eab76a076d200db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 01:46:28.010776 kubelet[3290]: E0813 01:46:28.010759 3290 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b07f8e7232cdffc92a685675319fb795999c7f0087a2500e8eab76a076d200db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7846546c6c-fbv5w" Aug 13 01:46:28.010796 kubelet[3290]: E0813 01:46:28.010770 3290 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b07f8e7232cdffc92a685675319fb795999c7f0087a2500e8eab76a076d200db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7846546c6c-fbv5w" Aug 13 01:46:28.010818 kubelet[3290]: E0813 01:46:28.010792 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7846546c6c-fbv5w_calico-system(5a16da63-a958-45dd-907e-53b0c801fd41)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7846546c6c-fbv5w_calico-system(5a16da63-a958-45dd-907e-53b0c801fd41)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"b07f8e7232cdffc92a685675319fb795999c7f0087a2500e8eab76a076d200db\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7846546c6c-fbv5w" podUID="5a16da63-a958-45dd-907e-53b0c801fd41" Aug 13 01:46:28.010947 containerd[1938]: time="2025-08-13T01:46:28.010933484Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-47grb,Uid:0aac1d2c-a1ff-4714-8f26-f62a9c1ad472,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5367df713a347056dea2b7da9293bf10a9fb24503802882c388125ed888d65b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 01:46:28.011007 kubelet[3290]: E0813 01:46:28.010995 3290 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5367df713a347056dea2b7da9293bf10a9fb24503802882c388125ed888d65b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 01:46:28.011028 kubelet[3290]: E0813 01:46:28.011014 3290 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5367df713a347056dea2b7da9293bf10a9fb24503802882c388125ed888d65b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-47grb" Aug 13 01:46:28.011051 kubelet[3290]: E0813 01:46:28.011026 3290 kuberuntime_manager.go:1237] "CreatePodSandbox for pod 
failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5367df713a347056dea2b7da9293bf10a9fb24503802882c388125ed888d65b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-47grb" Aug 13 01:46:28.011051 kubelet[3290]: E0813 01:46:28.011043 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-47grb_kube-system(0aac1d2c-a1ff-4714-8f26-f62a9c1ad472)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-47grb_kube-system(0aac1d2c-a1ff-4714-8f26-f62a9c1ad472)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5367df713a347056dea2b7da9293bf10a9fb24503802882c388125ed888d65b2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-47grb" podUID="0aac1d2c-a1ff-4714-8f26-f62a9c1ad472" Aug 13 01:46:28.685393 systemd[1]: Created slice kubepods-besteffort-podbde1b4b3_33c0_4855_9196_5fe2289fff45.slice - libcontainer container kubepods-besteffort-podbde1b4b3_33c0_4855_9196_5fe2289fff45.slice. 
Aug 13 01:46:28.691641 containerd[1938]: time="2025-08-13T01:46:28.691551763Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fd57q,Uid:bde1b4b3-33c0-4855-9196-5fe2289fff45,Namespace:calico-system,Attempt:0,}" Aug 13 01:46:28.718134 containerd[1938]: time="2025-08-13T01:46:28.718074683Z" level=error msg="Failed to destroy network for sandbox \"f12ffb129d7321b5dd1f657d0cba7c963d5c9d749e105a1d548b279c72feae32\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 01:46:28.718594 containerd[1938]: time="2025-08-13T01:46:28.718548110Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fd57q,Uid:bde1b4b3-33c0-4855-9196-5fe2289fff45,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f12ffb129d7321b5dd1f657d0cba7c963d5c9d749e105a1d548b279c72feae32\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 01:46:28.718742 kubelet[3290]: E0813 01:46:28.718714 3290 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f12ffb129d7321b5dd1f657d0cba7c963d5c9d749e105a1d548b279c72feae32\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 01:46:28.718775 kubelet[3290]: E0813 01:46:28.718758 3290 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f12ffb129d7321b5dd1f657d0cba7c963d5c9d749e105a1d548b279c72feae32\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fd57q" Aug 13 01:46:28.718795 kubelet[3290]: E0813 01:46:28.718772 3290 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f12ffb129d7321b5dd1f657d0cba7c963d5c9d749e105a1d548b279c72feae32\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fd57q" Aug 13 01:46:28.718817 kubelet[3290]: E0813 01:46:28.718802 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-fd57q_calico-system(bde1b4b3-33c0-4855-9196-5fe2289fff45)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-fd57q_calico-system(bde1b4b3-33c0-4855-9196-5fe2289fff45)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f12ffb129d7321b5dd1f657d0cba7c963d5c9d749e105a1d548b279c72feae32\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fd57q" podUID="bde1b4b3-33c0-4855-9196-5fe2289fff45" Aug 13 01:46:28.769222 containerd[1938]: time="2025-08-13T01:46:28.769096114Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\"" Aug 13 01:46:28.905949 systemd[1]: run-netns-cni\x2d79cd2765\x2deeb6\x2d4518\x2d5a6a\x2dd45cdaf4d02e.mount: Deactivated successfully. Aug 13 01:46:28.906007 systemd[1]: run-netns-cni\x2d66fcd9ee\x2d93ad\x2de08f\x2d3ac1\x2d57aefbf46236.mount: Deactivated successfully. Aug 13 01:46:28.906043 systemd[1]: run-netns-cni\x2dc08af868\x2df1df\x2d4cfd\x2d224b\x2d86a8e0f9efc7.mount: Deactivated successfully. 
Aug 13 01:46:28.906076 systemd[1]: run-netns-cni\x2d5ce76be8\x2d11f7\x2dd114\x2d2479\x2d11f780f2b24f.mount: Deactivated successfully. Aug 13 01:46:33.955920 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount594038422.mount: Deactivated successfully. Aug 13 01:46:33.978177 containerd[1938]: time="2025-08-13T01:46:33.978122921Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:33.978402 containerd[1938]: time="2025-08-13T01:46:33.978286711Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.2: active requests=0, bytes read=158500163" Aug 13 01:46:33.978700 containerd[1938]: time="2025-08-13T01:46:33.978652299Z" level=info msg="ImageCreate event name:\"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:33.979813 containerd[1938]: time="2025-08-13T01:46:33.979803977Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:33.980040 containerd[1938]: time="2025-08-13T01:46:33.980028542Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.2\" with image id \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\", size \"158500025\" in 5.210863316s" Aug 13 01:46:33.980065 containerd[1938]: time="2025-08-13T01:46:33.980044225Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\" returns image reference \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\"" Aug 13 01:46:33.983415 containerd[1938]: time="2025-08-13T01:46:33.983396259Z" level=info 
msg="CreateContainer within sandbox \"9360514811ec3057aaabdb8bc4c74261d7b00940d25584a389ec70dd388ede4e\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Aug 13 01:46:33.986855 containerd[1938]: time="2025-08-13T01:46:33.986843695Z" level=info msg="Container 1777baaf771cf1ecf00b0f19ed001fb62e80fdc050273cf47f861b18c7d7436e: CDI devices from CRI Config.CDIDevices: []" Aug 13 01:46:33.991544 containerd[1938]: time="2025-08-13T01:46:33.991502897Z" level=info msg="CreateContainer within sandbox \"9360514811ec3057aaabdb8bc4c74261d7b00940d25584a389ec70dd388ede4e\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"1777baaf771cf1ecf00b0f19ed001fb62e80fdc050273cf47f861b18c7d7436e\"" Aug 13 01:46:33.991730 containerd[1938]: time="2025-08-13T01:46:33.991715466Z" level=info msg="StartContainer for \"1777baaf771cf1ecf00b0f19ed001fb62e80fdc050273cf47f861b18c7d7436e\"" Aug 13 01:46:33.992471 containerd[1938]: time="2025-08-13T01:46:33.992458520Z" level=info msg="connecting to shim 1777baaf771cf1ecf00b0f19ed001fb62e80fdc050273cf47f861b18c7d7436e" address="unix:///run/containerd/s/1a3b6de1faddb0eb4abe97e2c67d188e65513d0e50b9df11d00977705b971dab" protocol=ttrpc version=3 Aug 13 01:46:34.018143 systemd[1]: Started cri-containerd-1777baaf771cf1ecf00b0f19ed001fb62e80fdc050273cf47f861b18c7d7436e.scope - libcontainer container 1777baaf771cf1ecf00b0f19ed001fb62e80fdc050273cf47f861b18c7d7436e. Aug 13 01:46:34.045484 containerd[1938]: time="2025-08-13T01:46:34.045431132Z" level=info msg="StartContainer for \"1777baaf771cf1ecf00b0f19ed001fb62e80fdc050273cf47f861b18c7d7436e\" returns successfully" Aug 13 01:46:34.112166 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Aug 13 01:46:34.112220 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Aug 13 01:46:34.298644 kubelet[3290]: I0813 01:46:34.298423 3290 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bafd23a2-669f-44d7-adaf-3f7ce6d9df14-whisker-ca-bundle\") pod \"bafd23a2-669f-44d7-adaf-3f7ce6d9df14\" (UID: \"bafd23a2-669f-44d7-adaf-3f7ce6d9df14\") " Aug 13 01:46:34.298644 kubelet[3290]: I0813 01:46:34.298550 3290 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrhpt\" (UniqueName: \"kubernetes.io/projected/bafd23a2-669f-44d7-adaf-3f7ce6d9df14-kube-api-access-zrhpt\") pod \"bafd23a2-669f-44d7-adaf-3f7ce6d9df14\" (UID: \"bafd23a2-669f-44d7-adaf-3f7ce6d9df14\") " Aug 13 01:46:34.298644 kubelet[3290]: I0813 01:46:34.298621 3290 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/bafd23a2-669f-44d7-adaf-3f7ce6d9df14-whisker-backend-key-pair\") pod \"bafd23a2-669f-44d7-adaf-3f7ce6d9df14\" (UID: \"bafd23a2-669f-44d7-adaf-3f7ce6d9df14\") " Aug 13 01:46:34.299737 kubelet[3290]: I0813 01:46:34.299296 3290 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bafd23a2-669f-44d7-adaf-3f7ce6d9df14-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "bafd23a2-669f-44d7-adaf-3f7ce6d9df14" (UID: "bafd23a2-669f-44d7-adaf-3f7ce6d9df14"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Aug 13 01:46:34.304938 kubelet[3290]: I0813 01:46:34.304827 3290 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bafd23a2-669f-44d7-adaf-3f7ce6d9df14-kube-api-access-zrhpt" (OuterVolumeSpecName: "kube-api-access-zrhpt") pod "bafd23a2-669f-44d7-adaf-3f7ce6d9df14" (UID: "bafd23a2-669f-44d7-adaf-3f7ce6d9df14"). InnerVolumeSpecName "kube-api-access-zrhpt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Aug 13 01:46:34.305196 kubelet[3290]: I0813 01:46:34.304997 3290 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bafd23a2-669f-44d7-adaf-3f7ce6d9df14-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "bafd23a2-669f-44d7-adaf-3f7ce6d9df14" (UID: "bafd23a2-669f-44d7-adaf-3f7ce6d9df14"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Aug 13 01:46:34.400143 kubelet[3290]: I0813 01:46:34.400012 3290 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bafd23a2-669f-44d7-adaf-3f7ce6d9df14-whisker-ca-bundle\") on node \"ci-4372.1.0-a-4296cabafa\" DevicePath \"\"" Aug 13 01:46:34.400143 kubelet[3290]: I0813 01:46:34.400087 3290 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zrhpt\" (UniqueName: \"kubernetes.io/projected/bafd23a2-669f-44d7-adaf-3f7ce6d9df14-kube-api-access-zrhpt\") on node \"ci-4372.1.0-a-4296cabafa\" DevicePath \"\"" Aug 13 01:46:34.400143 kubelet[3290]: I0813 01:46:34.400142 3290 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/bafd23a2-669f-44d7-adaf-3f7ce6d9df14-whisker-backend-key-pair\") on node \"ci-4372.1.0-a-4296cabafa\" DevicePath \"\"" Aug 13 01:46:34.793627 systemd[1]: Removed slice kubepods-besteffort-podbafd23a2_669f_44d7_adaf_3f7ce6d9df14.slice - libcontainer container kubepods-besteffort-podbafd23a2_669f_44d7_adaf_3f7ce6d9df14.slice. 
Aug 13 01:46:34.800469 kubelet[3290]: I0813 01:46:34.800417 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-89f4b" podStartSLOduration=1.8864751690000001 podStartE2EDuration="16.800404577s" podCreationTimestamp="2025-08-13 01:46:18 +0000 UTC" firstStartedPulling="2025-08-13 01:46:19.066411964 +0000 UTC m=+17.443736226" lastFinishedPulling="2025-08-13 01:46:33.980341372 +0000 UTC m=+32.357665634" observedRunningTime="2025-08-13 01:46:34.800087426 +0000 UTC m=+33.177411699" watchObservedRunningTime="2025-08-13 01:46:34.800404577 +0000 UTC m=+33.177728838" Aug 13 01:46:34.821158 systemd[1]: Created slice kubepods-besteffort-podf45fa281_09c0_4e4a_9494_62fbd49ab63e.slice - libcontainer container kubepods-besteffort-podf45fa281_09c0_4e4a_9494_62fbd49ab63e.slice. Aug 13 01:46:34.903697 kubelet[3290]: I0813 01:46:34.903573 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f45fa281-09c0-4e4a-9494-62fbd49ab63e-whisker-ca-bundle\") pod \"whisker-6d4846c74f-7hvcw\" (UID: \"f45fa281-09c0-4e4a-9494-62fbd49ab63e\") " pod="calico-system/whisker-6d4846c74f-7hvcw" Aug 13 01:46:34.904036 kubelet[3290]: I0813 01:46:34.903715 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5lnj\" (UniqueName: \"kubernetes.io/projected/f45fa281-09c0-4e4a-9494-62fbd49ab63e-kube-api-access-q5lnj\") pod \"whisker-6d4846c74f-7hvcw\" (UID: \"f45fa281-09c0-4e4a-9494-62fbd49ab63e\") " pod="calico-system/whisker-6d4846c74f-7hvcw" Aug 13 01:46:34.904036 kubelet[3290]: I0813 01:46:34.903942 3290 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/f45fa281-09c0-4e4a-9494-62fbd49ab63e-whisker-backend-key-pair\") pod \"whisker-6d4846c74f-7hvcw\" (UID: 
\"f45fa281-09c0-4e4a-9494-62fbd49ab63e\") " pod="calico-system/whisker-6d4846c74f-7hvcw" Aug 13 01:46:34.961109 systemd[1]: var-lib-kubelet-pods-bafd23a2\x2d669f\x2d44d7\x2dadaf\x2d3f7ce6d9df14-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dzrhpt.mount: Deactivated successfully. Aug 13 01:46:34.961172 systemd[1]: var-lib-kubelet-pods-bafd23a2\x2d669f\x2d44d7\x2dadaf\x2d3f7ce6d9df14-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Aug 13 01:46:35.124650 containerd[1938]: time="2025-08-13T01:46:35.124535468Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6d4846c74f-7hvcw,Uid:f45fa281-09c0-4e4a-9494-62fbd49ab63e,Namespace:calico-system,Attempt:0,}" Aug 13 01:46:35.219589 systemd-networkd[1852]: calia5c8681ee2a: Link UP Aug 13 01:46:35.220408 systemd-networkd[1852]: calia5c8681ee2a: Gained carrier Aug 13 01:46:35.247281 containerd[1938]: 2025-08-13 01:46:35.136 [INFO][4689] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 13 01:46:35.247281 containerd[1938]: 2025-08-13 01:46:35.143 [INFO][4689] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--a--4296cabafa-k8s-whisker--6d4846c74f--7hvcw-eth0 whisker-6d4846c74f- calico-system f45fa281-09c0-4e4a-9494-62fbd49ab63e 857 0 2025-08-13 01:46:34 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6d4846c74f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4372.1.0-a-4296cabafa whisker-6d4846c74f-7hvcw eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calia5c8681ee2a [] [] }} ContainerID="3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330" Namespace="calico-system" Pod="whisker-6d4846c74f-7hvcw" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-whisker--6d4846c74f--7hvcw-" Aug 13 01:46:35.247281 
containerd[1938]: 2025-08-13 01:46:35.143 [INFO][4689] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330" Namespace="calico-system" Pod="whisker-6d4846c74f-7hvcw" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-whisker--6d4846c74f--7hvcw-eth0" Aug 13 01:46:35.247281 containerd[1938]: 2025-08-13 01:46:35.161 [INFO][4708] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330" HandleID="k8s-pod-network.3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330" Workload="ci--4372.1.0--a--4296cabafa-k8s-whisker--6d4846c74f--7hvcw-eth0" Aug 13 01:46:35.247847 containerd[1938]: 2025-08-13 01:46:35.161 [INFO][4708] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330" HandleID="k8s-pod-network.3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330" Workload="ci--4372.1.0--a--4296cabafa-k8s-whisker--6d4846c74f--7hvcw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000596540), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372.1.0-a-4296cabafa", "pod":"whisker-6d4846c74f-7hvcw", "timestamp":"2025-08-13 01:46:35.161312058 +0000 UTC"}, Hostname:"ci-4372.1.0-a-4296cabafa", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 01:46:35.247847 containerd[1938]: 2025-08-13 01:46:35.161 [INFO][4708] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 01:46:35.247847 containerd[1938]: 2025-08-13 01:46:35.161 [INFO][4708] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 01:46:35.247847 containerd[1938]: 2025-08-13 01:46:35.161 [INFO][4708] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-a-4296cabafa' Aug 13 01:46:35.247847 containerd[1938]: 2025-08-13 01:46:35.167 [INFO][4708] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:35.247847 containerd[1938]: 2025-08-13 01:46:35.171 [INFO][4708] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:35.247847 containerd[1938]: 2025-08-13 01:46:35.176 [INFO][4708] ipam/ipam.go 511: Trying affinity for 192.168.89.0/26 host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:35.247847 containerd[1938]: 2025-08-13 01:46:35.178 [INFO][4708] ipam/ipam.go 158: Attempting to load block cidr=192.168.89.0/26 host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:35.247847 containerd[1938]: 2025-08-13 01:46:35.181 [INFO][4708] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.89.0/26 host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:35.248783 containerd[1938]: 2025-08-13 01:46:35.181 [INFO][4708] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.89.0/26 handle="k8s-pod-network.3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:35.248783 containerd[1938]: 2025-08-13 01:46:35.183 [INFO][4708] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330 Aug 13 01:46:35.248783 containerd[1938]: 2025-08-13 01:46:35.188 [INFO][4708] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.89.0/26 handle="k8s-pod-network.3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:35.248783 containerd[1938]: 2025-08-13 01:46:35.196 [INFO][4708] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.89.1/26] block=192.168.89.0/26 handle="k8s-pod-network.3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:35.248783 containerd[1938]: 2025-08-13 01:46:35.196 [INFO][4708] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.89.1/26] handle="k8s-pod-network.3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:35.248783 containerd[1938]: 2025-08-13 01:46:35.196 [INFO][4708] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 01:46:35.248783 containerd[1938]: 2025-08-13 01:46:35.196 [INFO][4708] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.89.1/26] IPv6=[] ContainerID="3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330" HandleID="k8s-pod-network.3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330" Workload="ci--4372.1.0--a--4296cabafa-k8s-whisker--6d4846c74f--7hvcw-eth0" Aug 13 01:46:35.249478 containerd[1938]: 2025-08-13 01:46:35.203 [INFO][4689] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330" Namespace="calico-system" Pod="whisker-6d4846c74f-7hvcw" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-whisker--6d4846c74f--7hvcw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--4296cabafa-k8s-whisker--6d4846c74f--7hvcw-eth0", GenerateName:"whisker-6d4846c74f-", Namespace:"calico-system", SelfLink:"", UID:"f45fa281-09c0-4e4a-9494-62fbd49ab63e", ResourceVersion:"857", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 1, 46, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6d4846c74f", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-4296cabafa", ContainerID:"", Pod:"whisker-6d4846c74f-7hvcw", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.89.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calia5c8681ee2a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 01:46:35.249478 containerd[1938]: 2025-08-13 01:46:35.204 [INFO][4689] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.89.1/32] ContainerID="3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330" Namespace="calico-system" Pod="whisker-6d4846c74f-7hvcw" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-whisker--6d4846c74f--7hvcw-eth0" Aug 13 01:46:35.249818 containerd[1938]: 2025-08-13 01:46:35.204 [INFO][4689] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia5c8681ee2a ContainerID="3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330" Namespace="calico-system" Pod="whisker-6d4846c74f-7hvcw" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-whisker--6d4846c74f--7hvcw-eth0" Aug 13 01:46:35.249818 containerd[1938]: 2025-08-13 01:46:35.220 [INFO][4689] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330" Namespace="calico-system" Pod="whisker-6d4846c74f-7hvcw" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-whisker--6d4846c74f--7hvcw-eth0" Aug 13 01:46:35.250033 containerd[1938]: 2025-08-13 01:46:35.221 [INFO][4689] cni-plugin/k8s.go 446: Added Mac, interface name, and active container 
ID to endpoint ContainerID="3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330" Namespace="calico-system" Pod="whisker-6d4846c74f-7hvcw" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-whisker--6d4846c74f--7hvcw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--4296cabafa-k8s-whisker--6d4846c74f--7hvcw-eth0", GenerateName:"whisker-6d4846c74f-", Namespace:"calico-system", SelfLink:"", UID:"f45fa281-09c0-4e4a-9494-62fbd49ab63e", ResourceVersion:"857", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 1, 46, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6d4846c74f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-4296cabafa", ContainerID:"3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330", Pod:"whisker-6d4846c74f-7hvcw", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.89.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calia5c8681ee2a", MAC:"42:ff:73:89:a0:89", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 01:46:35.250227 containerd[1938]: 2025-08-13 01:46:35.240 [INFO][4689] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330" Namespace="calico-system" 
Pod="whisker-6d4846c74f-7hvcw" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-whisker--6d4846c74f--7hvcw-eth0" Aug 13 01:46:35.259915 containerd[1938]: time="2025-08-13T01:46:35.259894565Z" level=info msg="connecting to shim 3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330" address="unix:///run/containerd/s/62f3f76bd777a5717bcfec618a600a56ec88e8e1fe5dc2dea9a49ae42e50ce11" namespace=k8s.io protocol=ttrpc version=3 Aug 13 01:46:35.281185 systemd[1]: Started cri-containerd-3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330.scope - libcontainer container 3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330. Aug 13 01:46:35.312136 containerd[1938]: time="2025-08-13T01:46:35.312117288Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6d4846c74f-7hvcw,Uid:f45fa281-09c0-4e4a-9494-62fbd49ab63e,Namespace:calico-system,Attempt:0,} returns sandbox id \"3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330\"" Aug 13 01:46:35.313359 containerd[1938]: time="2025-08-13T01:46:35.313344913Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\"" Aug 13 01:46:35.641990 systemd-networkd[1852]: vxlan.calico: Link UP Aug 13 01:46:35.641994 systemd-networkd[1852]: vxlan.calico: Gained carrier Aug 13 01:46:35.669242 kubelet[3290]: I0813 01:46:35.669195 3290 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bafd23a2-669f-44d7-adaf-3f7ce6d9df14" path="/var/lib/kubelet/pods/bafd23a2-669f-44d7-adaf-3f7ce6d9df14/volumes" Aug 13 01:46:35.841729 containerd[1938]: time="2025-08-13T01:46:35.841703444Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1777baaf771cf1ecf00b0f19ed001fb62e80fdc050273cf47f861b18c7d7436e\" id:\"dc2b813c3c8c0f7c82e48d950759268db94c5016b9d060dbacb077c732df86cd\" pid:5025 exit_status:1 exited_at:{seconds:1755049595 nanos:841534029}" Aug 13 01:46:36.848861 containerd[1938]: time="2025-08-13T01:46:36.848831103Z" level=info msg="TaskExit event in 
podsandbox handler container_id:\"1777baaf771cf1ecf00b0f19ed001fb62e80fdc050273cf47f861b18c7d7436e\" id:\"f88cf97b8bcbc66d85295a05f7f80b15db68f9181008b6ae5dc0702ea935f974\" pid:5099 exit_status:1 exited_at:{seconds:1755049596 nanos:848553386}" Aug 13 01:46:36.972103 containerd[1938]: time="2025-08-13T01:46:36.972051083Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:36.972276 containerd[1938]: time="2025-08-13T01:46:36.972230842Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.2: active requests=0, bytes read=4661207" Aug 13 01:46:36.972606 containerd[1938]: time="2025-08-13T01:46:36.972566401Z" level=info msg="ImageCreate event name:\"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:36.973491 containerd[1938]: time="2025-08-13T01:46:36.973448778Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:36.973847 containerd[1938]: time="2025-08-13T01:46:36.973808530Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.2\" with image id \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\", size \"6153902\" in 1.660446599s" Aug 13 01:46:36.973847 containerd[1938]: time="2025-08-13T01:46:36.973822465Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\" returns image reference \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\"" Aug 13 01:46:36.974784 containerd[1938]: time="2025-08-13T01:46:36.974739843Z" level=info 
msg="CreateContainer within sandbox \"3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Aug 13 01:46:36.977473 containerd[1938]: time="2025-08-13T01:46:36.977429215Z" level=info msg="Container d0f392e5243f2245e0c034e3885584dd74266edc5ac71cdac1cb0c8f561acf73: CDI devices from CRI Config.CDIDevices: []" Aug 13 01:46:36.980322 containerd[1938]: time="2025-08-13T01:46:36.980310537Z" level=info msg="CreateContainer within sandbox \"3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"d0f392e5243f2245e0c034e3885584dd74266edc5ac71cdac1cb0c8f561acf73\"" Aug 13 01:46:36.980506 containerd[1938]: time="2025-08-13T01:46:36.980494680Z" level=info msg="StartContainer for \"d0f392e5243f2245e0c034e3885584dd74266edc5ac71cdac1cb0c8f561acf73\"" Aug 13 01:46:36.981093 containerd[1938]: time="2025-08-13T01:46:36.981048567Z" level=info msg="connecting to shim d0f392e5243f2245e0c034e3885584dd74266edc5ac71cdac1cb0c8f561acf73" address="unix:///run/containerd/s/62f3f76bd777a5717bcfec618a600a56ec88e8e1fe5dc2dea9a49ae42e50ce11" protocol=ttrpc version=3 Aug 13 01:46:36.997343 systemd[1]: Started cri-containerd-d0f392e5243f2245e0c034e3885584dd74266edc5ac71cdac1cb0c8f561acf73.scope - libcontainer container d0f392e5243f2245e0c034e3885584dd74266edc5ac71cdac1cb0c8f561acf73. 
Aug 13 01:46:37.063960 systemd-networkd[1852]: calia5c8681ee2a: Gained IPv6LL Aug 13 01:46:37.064137 systemd-networkd[1852]: vxlan.calico: Gained IPv6LL Aug 13 01:46:37.069577 containerd[1938]: time="2025-08-13T01:46:37.069555421Z" level=info msg="StartContainer for \"d0f392e5243f2245e0c034e3885584dd74266edc5ac71cdac1cb0c8f561acf73\" returns successfully" Aug 13 01:46:37.070137 containerd[1938]: time="2025-08-13T01:46:37.070098895Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\"" Aug 13 01:46:38.668486 containerd[1938]: time="2025-08-13T01:46:38.668428322Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-795979ddf-wcmd7,Uid:b3feb002-5e3f-48a9-a537-288b8166141e,Namespace:calico-apiserver,Attempt:0,}" Aug 13 01:46:38.730622 systemd-networkd[1852]: calie7fa91c1722: Link UP Aug 13 01:46:38.730952 systemd-networkd[1852]: calie7fa91c1722: Gained carrier Aug 13 01:46:38.739925 containerd[1938]: 2025-08-13 01:46:38.688 [INFO][5176] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--wcmd7-eth0 calico-apiserver-795979ddf- calico-apiserver b3feb002-5e3f-48a9-a537-288b8166141e 791 0 2025-08-13 01:46:14 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:795979ddf projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372.1.0-a-4296cabafa calico-apiserver-795979ddf-wcmd7 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calie7fa91c1722 [] [] }} ContainerID="3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005" Namespace="calico-apiserver" Pod="calico-apiserver-795979ddf-wcmd7" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--wcmd7-" Aug 13 01:46:38.739925 
containerd[1938]: 2025-08-13 01:46:38.689 [INFO][5176] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005" Namespace="calico-apiserver" Pod="calico-apiserver-795979ddf-wcmd7" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--wcmd7-eth0" Aug 13 01:46:38.739925 containerd[1938]: 2025-08-13 01:46:38.703 [INFO][5199] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005" HandleID="k8s-pod-network.3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005" Workload="ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--wcmd7-eth0" Aug 13 01:46:38.740143 containerd[1938]: 2025-08-13 01:46:38.703 [INFO][5199] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005" HandleID="k8s-pod-network.3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005" Workload="ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--wcmd7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002e2780), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372.1.0-a-4296cabafa", "pod":"calico-apiserver-795979ddf-wcmd7", "timestamp":"2025-08-13 01:46:38.703338364 +0000 UTC"}, Hostname:"ci-4372.1.0-a-4296cabafa", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 01:46:38.740143 containerd[1938]: 2025-08-13 01:46:38.703 [INFO][5199] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 01:46:38.740143 containerd[1938]: 2025-08-13 01:46:38.703 [INFO][5199] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 01:46:38.740143 containerd[1938]: 2025-08-13 01:46:38.703 [INFO][5199] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-a-4296cabafa' Aug 13 01:46:38.740143 containerd[1938]: 2025-08-13 01:46:38.708 [INFO][5199] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:38.740143 containerd[1938]: 2025-08-13 01:46:38.712 [INFO][5199] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:38.740143 containerd[1938]: 2025-08-13 01:46:38.715 [INFO][5199] ipam/ipam.go 511: Trying affinity for 192.168.89.0/26 host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:38.740143 containerd[1938]: 2025-08-13 01:46:38.717 [INFO][5199] ipam/ipam.go 158: Attempting to load block cidr=192.168.89.0/26 host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:38.740143 containerd[1938]: 2025-08-13 01:46:38.718 [INFO][5199] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.89.0/26 host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:38.740504 containerd[1938]: 2025-08-13 01:46:38.719 [INFO][5199] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.89.0/26 handle="k8s-pod-network.3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:38.740504 containerd[1938]: 2025-08-13 01:46:38.720 [INFO][5199] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005 Aug 13 01:46:38.740504 containerd[1938]: 2025-08-13 01:46:38.723 [INFO][5199] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.89.0/26 handle="k8s-pod-network.3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:38.740504 containerd[1938]: 2025-08-13 01:46:38.727 [INFO][5199] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.89.2/26] block=192.168.89.0/26 handle="k8s-pod-network.3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:38.740504 containerd[1938]: 2025-08-13 01:46:38.727 [INFO][5199] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.89.2/26] handle="k8s-pod-network.3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:38.740504 containerd[1938]: 2025-08-13 01:46:38.727 [INFO][5199] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 01:46:38.740504 containerd[1938]: 2025-08-13 01:46:38.727 [INFO][5199] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.89.2/26] IPv6=[] ContainerID="3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005" HandleID="k8s-pod-network.3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005" Workload="ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--wcmd7-eth0" Aug 13 01:46:38.740729 containerd[1938]: 2025-08-13 01:46:38.729 [INFO][5176] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005" Namespace="calico-apiserver" Pod="calico-apiserver-795979ddf-wcmd7" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--wcmd7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--wcmd7-eth0", GenerateName:"calico-apiserver-795979ddf-", Namespace:"calico-apiserver", SelfLink:"", UID:"b3feb002-5e3f-48a9-a537-288b8166141e", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 1, 46, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"795979ddf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-4296cabafa", ContainerID:"", Pod:"calico-apiserver-795979ddf-wcmd7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.89.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie7fa91c1722", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 01:46:38.740804 containerd[1938]: 2025-08-13 01:46:38.729 [INFO][5176] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.89.2/32] ContainerID="3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005" Namespace="calico-apiserver" Pod="calico-apiserver-795979ddf-wcmd7" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--wcmd7-eth0" Aug 13 01:46:38.740804 containerd[1938]: 2025-08-13 01:46:38.729 [INFO][5176] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie7fa91c1722 ContainerID="3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005" Namespace="calico-apiserver" Pod="calico-apiserver-795979ddf-wcmd7" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--wcmd7-eth0" Aug 13 01:46:38.740804 containerd[1938]: 2025-08-13 01:46:38.731 [INFO][5176] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005" Namespace="calico-apiserver" Pod="calico-apiserver-795979ddf-wcmd7" 
WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--wcmd7-eth0" Aug 13 01:46:38.740932 containerd[1938]: 2025-08-13 01:46:38.731 [INFO][5176] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005" Namespace="calico-apiserver" Pod="calico-apiserver-795979ddf-wcmd7" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--wcmd7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--wcmd7-eth0", GenerateName:"calico-apiserver-795979ddf-", Namespace:"calico-apiserver", SelfLink:"", UID:"b3feb002-5e3f-48a9-a537-288b8166141e", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 1, 46, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"795979ddf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-4296cabafa", ContainerID:"3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005", Pod:"calico-apiserver-795979ddf-wcmd7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.89.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie7fa91c1722", MAC:"fe:b6:4d:a0:d2:69", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 01:46:38.741008 containerd[1938]: 2025-08-13 01:46:38.737 [INFO][5176] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005" Namespace="calico-apiserver" Pod="calico-apiserver-795979ddf-wcmd7" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--wcmd7-eth0" Aug 13 01:46:38.749875 containerd[1938]: time="2025-08-13T01:46:38.749847567Z" level=info msg="connecting to shim 3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005" address="unix:///run/containerd/s/c19a51256830670824098be82cc8bb94503de8d8846677bb78a73fcffbdc1729" namespace=k8s.io protocol=ttrpc version=3 Aug 13 01:46:38.767170 systemd[1]: Started cri-containerd-3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005.scope - libcontainer container 3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005. 
Aug 13 01:46:38.793943 containerd[1938]: time="2025-08-13T01:46:38.793917998Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-795979ddf-wcmd7,Uid:b3feb002-5e3f-48a9-a537-288b8166141e,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005\"" Aug 13 01:46:39.667957 containerd[1938]: time="2025-08-13T01:46:39.667930010Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-47grb,Uid:0aac1d2c-a1ff-4714-8f26-f62a9c1ad472,Namespace:kube-system,Attempt:0,}" Aug 13 01:46:39.733280 systemd-networkd[1852]: cali39df9890bbd: Link UP Aug 13 01:46:39.733435 systemd-networkd[1852]: cali39df9890bbd: Gained carrier Aug 13 01:46:39.740786 containerd[1938]: 2025-08-13 01:46:39.687 [INFO][5272] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--47grb-eth0 coredns-668d6bf9bc- kube-system 0aac1d2c-a1ff-4714-8f26-f62a9c1ad472 792 0 2025-08-13 01:46:07 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4372.1.0-a-4296cabafa coredns-668d6bf9bc-47grb eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali39df9890bbd [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0" Namespace="kube-system" Pod="coredns-668d6bf9bc-47grb" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--47grb-" Aug 13 01:46:39.740786 containerd[1938]: 2025-08-13 01:46:39.687 [INFO][5272] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0" Namespace="kube-system" Pod="coredns-668d6bf9bc-47grb" 
WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--47grb-eth0" Aug 13 01:46:39.740786 containerd[1938]: 2025-08-13 01:46:39.702 [INFO][5295] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0" HandleID="k8s-pod-network.3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0" Workload="ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--47grb-eth0" Aug 13 01:46:39.741104 containerd[1938]: 2025-08-13 01:46:39.702 [INFO][5295] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0" HandleID="k8s-pod-network.3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0" Workload="ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--47grb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f630), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4372.1.0-a-4296cabafa", "pod":"coredns-668d6bf9bc-47grb", "timestamp":"2025-08-13 01:46:39.702840104 +0000 UTC"}, Hostname:"ci-4372.1.0-a-4296cabafa", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 01:46:39.741104 containerd[1938]: 2025-08-13 01:46:39.702 [INFO][5295] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 01:46:39.741104 containerd[1938]: 2025-08-13 01:46:39.703 [INFO][5295] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 01:46:39.741104 containerd[1938]: 2025-08-13 01:46:39.703 [INFO][5295] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-a-4296cabafa' Aug 13 01:46:39.741104 containerd[1938]: 2025-08-13 01:46:39.708 [INFO][5295] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:39.741104 containerd[1938]: 2025-08-13 01:46:39.711 [INFO][5295] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:39.741104 containerd[1938]: 2025-08-13 01:46:39.714 [INFO][5295] ipam/ipam.go 511: Trying affinity for 192.168.89.0/26 host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:39.741104 containerd[1938]: 2025-08-13 01:46:39.716 [INFO][5295] ipam/ipam.go 158: Attempting to load block cidr=192.168.89.0/26 host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:39.741104 containerd[1938]: 2025-08-13 01:46:39.717 [INFO][5295] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.89.0/26 host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:39.741362 containerd[1938]: 2025-08-13 01:46:39.717 [INFO][5295] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.89.0/26 handle="k8s-pod-network.3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:39.741362 containerd[1938]: 2025-08-13 01:46:39.718 [INFO][5295] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0 Aug 13 01:46:39.741362 containerd[1938]: 2025-08-13 01:46:39.721 [INFO][5295] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.89.0/26 handle="k8s-pod-network.3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:39.741362 containerd[1938]: 2025-08-13 01:46:39.728 [INFO][5295] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.89.3/26] block=192.168.89.0/26 handle="k8s-pod-network.3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:39.741362 containerd[1938]: 2025-08-13 01:46:39.728 [INFO][5295] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.89.3/26] handle="k8s-pod-network.3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:39.741362 containerd[1938]: 2025-08-13 01:46:39.728 [INFO][5295] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 01:46:39.741362 containerd[1938]: 2025-08-13 01:46:39.728 [INFO][5295] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.89.3/26] IPv6=[] ContainerID="3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0" HandleID="k8s-pod-network.3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0" Workload="ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--47grb-eth0" Aug 13 01:46:39.741532 containerd[1938]: 2025-08-13 01:46:39.732 [INFO][5272] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0" Namespace="kube-system" Pod="coredns-668d6bf9bc-47grb" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--47grb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--47grb-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"0aac1d2c-a1ff-4714-8f26-f62a9c1ad472", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 1, 46, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-4296cabafa", ContainerID:"", Pod:"coredns-668d6bf9bc-47grb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.89.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali39df9890bbd", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 01:46:39.741532 containerd[1938]: 2025-08-13 01:46:39.732 [INFO][5272] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.89.3/32] ContainerID="3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0" Namespace="kube-system" Pod="coredns-668d6bf9bc-47grb" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--47grb-eth0" Aug 13 01:46:39.741532 containerd[1938]: 2025-08-13 01:46:39.732 [INFO][5272] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali39df9890bbd ContainerID="3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0" Namespace="kube-system" Pod="coredns-668d6bf9bc-47grb" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--47grb-eth0" Aug 13 01:46:39.741532 containerd[1938]: 2025-08-13 01:46:39.733 [INFO][5272] cni-plugin/dataplane_linux.go 508: Disabling IPv4 
forwarding ContainerID="3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0" Namespace="kube-system" Pod="coredns-668d6bf9bc-47grb" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--47grb-eth0" Aug 13 01:46:39.741532 containerd[1938]: 2025-08-13 01:46:39.734 [INFO][5272] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0" Namespace="kube-system" Pod="coredns-668d6bf9bc-47grb" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--47grb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--47grb-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"0aac1d2c-a1ff-4714-8f26-f62a9c1ad472", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 1, 46, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-4296cabafa", ContainerID:"3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0", Pod:"coredns-668d6bf9bc-47grb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.89.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali39df9890bbd", MAC:"76:c5:30:d2:9d:33", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 01:46:39.741532 containerd[1938]: 2025-08-13 01:46:39.739 [INFO][5272] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0" Namespace="kube-system" Pod="coredns-668d6bf9bc-47grb" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--47grb-eth0" Aug 13 01:46:39.749102 containerd[1938]: time="2025-08-13T01:46:39.749078642Z" level=info msg="connecting to shim 3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0" address="unix:///run/containerd/s/5b1ac493eba805c13a298c6fc7fbf3078d5e12c6fd5ff2fbe3b85c9824e4ee87" namespace=k8s.io protocol=ttrpc version=3 Aug 13 01:46:39.749175 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3685954899.mount: Deactivated successfully. 
Aug 13 01:46:39.752140 containerd[1938]: time="2025-08-13T01:46:39.752123969Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:39.752333 containerd[1938]: time="2025-08-13T01:46:39.752319334Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.2: active requests=0, bytes read=33083477" Aug 13 01:46:39.753089 containerd[1938]: time="2025-08-13T01:46:39.752847116Z" level=info msg="ImageCreate event name:\"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:39.754752 containerd[1938]: time="2025-08-13T01:46:39.754734030Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:39.755118 containerd[1938]: time="2025-08-13T01:46:39.755104532Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" with image id \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\", size \"33083307\" in 2.684988882s" Aug 13 01:46:39.755162 containerd[1938]: time="2025-08-13T01:46:39.755121492Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" returns image reference \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\"" Aug 13 01:46:39.755683 containerd[1938]: time="2025-08-13T01:46:39.755669750Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 13 01:46:39.756274 containerd[1938]: time="2025-08-13T01:46:39.756262174Z" level=info msg="CreateContainer within sandbox 
\"3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Aug 13 01:46:39.759059 containerd[1938]: time="2025-08-13T01:46:39.759042827Z" level=info msg="Container f42758ee2e8b95398e595efd2fc981f88ebb01fe0ee9cd8618588777698b265c: CDI devices from CRI Config.CDIDevices: []" Aug 13 01:46:39.761739 containerd[1938]: time="2025-08-13T01:46:39.761697599Z" level=info msg="CreateContainer within sandbox \"3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"f42758ee2e8b95398e595efd2fc981f88ebb01fe0ee9cd8618588777698b265c\"" Aug 13 01:46:39.761976 containerd[1938]: time="2025-08-13T01:46:39.761928233Z" level=info msg="StartContainer for \"f42758ee2e8b95398e595efd2fc981f88ebb01fe0ee9cd8618588777698b265c\"" Aug 13 01:46:39.762493 containerd[1938]: time="2025-08-13T01:46:39.762456531Z" level=info msg="connecting to shim f42758ee2e8b95398e595efd2fc981f88ebb01fe0ee9cd8618588777698b265c" address="unix:///run/containerd/s/62f3f76bd777a5717bcfec618a600a56ec88e8e1fe5dc2dea9a49ae42e50ce11" protocol=ttrpc version=3 Aug 13 01:46:39.768059 systemd[1]: Started cri-containerd-3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0.scope - libcontainer container 3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0. Aug 13 01:46:39.772370 systemd[1]: Started cri-containerd-f42758ee2e8b95398e595efd2fc981f88ebb01fe0ee9cd8618588777698b265c.scope - libcontainer container f42758ee2e8b95398e595efd2fc981f88ebb01fe0ee9cd8618588777698b265c. 
Aug 13 01:46:39.794858 containerd[1938]: time="2025-08-13T01:46:39.794834818Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-47grb,Uid:0aac1d2c-a1ff-4714-8f26-f62a9c1ad472,Namespace:kube-system,Attempt:0,} returns sandbox id \"3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0\"" Aug 13 01:46:39.795941 containerd[1938]: time="2025-08-13T01:46:39.795926541Z" level=info msg="CreateContainer within sandbox \"3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 13 01:46:39.799260 containerd[1938]: time="2025-08-13T01:46:39.799240116Z" level=info msg="Container d2e0932ba7283bab5e37017f4a036c925049eefb1d65283bfd80f2e52a3e5812: CDI devices from CRI Config.CDIDevices: []" Aug 13 01:46:39.800068 containerd[1938]: time="2025-08-13T01:46:39.800053917Z" level=info msg="StartContainer for \"f42758ee2e8b95398e595efd2fc981f88ebb01fe0ee9cd8618588777698b265c\" returns successfully" Aug 13 01:46:39.801743 containerd[1938]: time="2025-08-13T01:46:39.801725225Z" level=info msg="CreateContainer within sandbox \"3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"d2e0932ba7283bab5e37017f4a036c925049eefb1d65283bfd80f2e52a3e5812\"" Aug 13 01:46:39.801900 containerd[1938]: time="2025-08-13T01:46:39.801888023Z" level=info msg="StartContainer for \"d2e0932ba7283bab5e37017f4a036c925049eefb1d65283bfd80f2e52a3e5812\"" Aug 13 01:46:39.802326 containerd[1938]: time="2025-08-13T01:46:39.802312908Z" level=info msg="connecting to shim d2e0932ba7283bab5e37017f4a036c925049eefb1d65283bfd80f2e52a3e5812" address="unix:///run/containerd/s/5b1ac493eba805c13a298c6fc7fbf3078d5e12c6fd5ff2fbe3b85c9824e4ee87" protocol=ttrpc version=3 Aug 13 01:46:39.807329 kubelet[3290]: I0813 01:46:39.807287 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="calico-system/whisker-6d4846c74f-7hvcw" podStartSLOduration=1.36481011 podStartE2EDuration="5.807270822s" podCreationTimestamp="2025-08-13 01:46:34 +0000 UTC" firstStartedPulling="2025-08-13 01:46:35.313122364 +0000 UTC m=+33.690446625" lastFinishedPulling="2025-08-13 01:46:39.755583073 +0000 UTC m=+38.132907337" observedRunningTime="2025-08-13 01:46:39.806980183 +0000 UTC m=+38.184304446" watchObservedRunningTime="2025-08-13 01:46:39.807270822 +0000 UTC m=+38.184595082" Aug 13 01:46:39.815988 systemd[1]: Started cri-containerd-d2e0932ba7283bab5e37017f4a036c925049eefb1d65283bfd80f2e52a3e5812.scope - libcontainer container d2e0932ba7283bab5e37017f4a036c925049eefb1d65283bfd80f2e52a3e5812. Aug 13 01:46:39.829079 containerd[1938]: time="2025-08-13T01:46:39.829057694Z" level=info msg="StartContainer for \"d2e0932ba7283bab5e37017f4a036c925049eefb1d65283bfd80f2e52a3e5812\" returns successfully" Aug 13 01:46:39.880196 systemd-networkd[1852]: calie7fa91c1722: Gained IPv6LL Aug 13 01:46:40.669280 containerd[1938]: time="2025-08-13T01:46:40.669188781Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-795979ddf-pkw5n,Uid:105c1b16-b046-4e8f-a08b-e099d1063650,Namespace:calico-apiserver,Attempt:0,}" Aug 13 01:46:40.669638 containerd[1938]: time="2025-08-13T01:46:40.669405873Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7846546c6c-fbv5w,Uid:5a16da63-a958-45dd-907e-53b0c801fd41,Namespace:calico-system,Attempt:0,}" Aug 13 01:46:40.726248 systemd-networkd[1852]: cali24e1e5ec6e0: Link UP Aug 13 01:46:40.726417 systemd-networkd[1852]: cali24e1e5ec6e0: Gained carrier Aug 13 01:46:40.733676 containerd[1938]: 2025-08-13 01:46:40.688 [INFO][5462] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--pkw5n-eth0 calico-apiserver-795979ddf- calico-apiserver 105c1b16-b046-4e8f-a08b-e099d1063650 789 0 2025-08-13 
01:46:14 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:795979ddf projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372.1.0-a-4296cabafa calico-apiserver-795979ddf-pkw5n eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali24e1e5ec6e0 [] [] }} ContainerID="15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71" Namespace="calico-apiserver" Pod="calico-apiserver-795979ddf-pkw5n" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--pkw5n-" Aug 13 01:46:40.733676 containerd[1938]: 2025-08-13 01:46:40.688 [INFO][5462] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71" Namespace="calico-apiserver" Pod="calico-apiserver-795979ddf-pkw5n" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--pkw5n-eth0" Aug 13 01:46:40.733676 containerd[1938]: 2025-08-13 01:46:40.702 [INFO][5510] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71" HandleID="k8s-pod-network.15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71" Workload="ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--pkw5n-eth0" Aug 13 01:46:40.733676 containerd[1938]: 2025-08-13 01:46:40.702 [INFO][5510] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71" HandleID="k8s-pod-network.15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71" Workload="ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--pkw5n-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000351860), Attrs:map[string]string{"namespace":"calico-apiserver", 
"node":"ci-4372.1.0-a-4296cabafa", "pod":"calico-apiserver-795979ddf-pkw5n", "timestamp":"2025-08-13 01:46:40.702084449 +0000 UTC"}, Hostname:"ci-4372.1.0-a-4296cabafa", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 01:46:40.733676 containerd[1938]: 2025-08-13 01:46:40.702 [INFO][5510] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 01:46:40.733676 containerd[1938]: 2025-08-13 01:46:40.702 [INFO][5510] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 01:46:40.733676 containerd[1938]: 2025-08-13 01:46:40.702 [INFO][5510] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-a-4296cabafa' Aug 13 01:46:40.733676 containerd[1938]: 2025-08-13 01:46:40.707 [INFO][5510] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:40.733676 containerd[1938]: 2025-08-13 01:46:40.710 [INFO][5510] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:40.733676 containerd[1938]: 2025-08-13 01:46:40.714 [INFO][5510] ipam/ipam.go 511: Trying affinity for 192.168.89.0/26 host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:40.733676 containerd[1938]: 2025-08-13 01:46:40.715 [INFO][5510] ipam/ipam.go 158: Attempting to load block cidr=192.168.89.0/26 host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:40.733676 containerd[1938]: 2025-08-13 01:46:40.717 [INFO][5510] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.89.0/26 host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:40.733676 containerd[1938]: 2025-08-13 01:46:40.717 [INFO][5510] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.89.0/26 
handle="k8s-pod-network.15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:40.733676 containerd[1938]: 2025-08-13 01:46:40.718 [INFO][5510] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71 Aug 13 01:46:40.733676 containerd[1938]: 2025-08-13 01:46:40.721 [INFO][5510] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.89.0/26 handle="k8s-pod-network.15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:40.733676 containerd[1938]: 2025-08-13 01:46:40.724 [INFO][5510] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.89.4/26] block=192.168.89.0/26 handle="k8s-pod-network.15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:40.733676 containerd[1938]: 2025-08-13 01:46:40.724 [INFO][5510] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.89.4/26] handle="k8s-pod-network.15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:40.733676 containerd[1938]: 2025-08-13 01:46:40.724 [INFO][5510] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 13 01:46:40.733676 containerd[1938]: 2025-08-13 01:46:40.724 [INFO][5510] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.89.4/26] IPv6=[] ContainerID="15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71" HandleID="k8s-pod-network.15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71" Workload="ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--pkw5n-eth0" Aug 13 01:46:40.734411 containerd[1938]: 2025-08-13 01:46:40.725 [INFO][5462] cni-plugin/k8s.go 418: Populated endpoint ContainerID="15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71" Namespace="calico-apiserver" Pod="calico-apiserver-795979ddf-pkw5n" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--pkw5n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--pkw5n-eth0", GenerateName:"calico-apiserver-795979ddf-", Namespace:"calico-apiserver", SelfLink:"", UID:"105c1b16-b046-4e8f-a08b-e099d1063650", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 1, 46, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"795979ddf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-4296cabafa", ContainerID:"", Pod:"calico-apiserver-795979ddf-pkw5n", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.89.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali24e1e5ec6e0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 01:46:40.734411 containerd[1938]: 2025-08-13 01:46:40.725 [INFO][5462] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.89.4/32] ContainerID="15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71" Namespace="calico-apiserver" Pod="calico-apiserver-795979ddf-pkw5n" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--pkw5n-eth0" Aug 13 01:46:40.734411 containerd[1938]: 2025-08-13 01:46:40.725 [INFO][5462] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali24e1e5ec6e0 ContainerID="15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71" Namespace="calico-apiserver" Pod="calico-apiserver-795979ddf-pkw5n" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--pkw5n-eth0" Aug 13 01:46:40.734411 containerd[1938]: 2025-08-13 01:46:40.726 [INFO][5462] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71" Namespace="calico-apiserver" Pod="calico-apiserver-795979ddf-pkw5n" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--pkw5n-eth0" Aug 13 01:46:40.734411 containerd[1938]: 2025-08-13 01:46:40.726 [INFO][5462] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71" Namespace="calico-apiserver" Pod="calico-apiserver-795979ddf-pkw5n" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--pkw5n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--pkw5n-eth0", GenerateName:"calico-apiserver-795979ddf-", Namespace:"calico-apiserver", SelfLink:"", UID:"105c1b16-b046-4e8f-a08b-e099d1063650", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 1, 46, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"795979ddf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-4296cabafa", ContainerID:"15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71", Pod:"calico-apiserver-795979ddf-pkw5n", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.89.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali24e1e5ec6e0", MAC:"fa:19:70:4c:24:fa", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 01:46:40.734411 containerd[1938]: 2025-08-13 01:46:40.732 [INFO][5462] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71" Namespace="calico-apiserver" Pod="calico-apiserver-795979ddf-pkw5n" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-calico--apiserver--795979ddf--pkw5n-eth0" Aug 13 01:46:40.763443 containerd[1938]: time="2025-08-13T01:46:40.763416457Z" level=info 
msg="connecting to shim 15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71" address="unix:///run/containerd/s/73cecb4e6e92a0402388254277ac479d0623e02e22488bd0843b59190adfb733" namespace=k8s.io protocol=ttrpc version=3 Aug 13 01:46:40.779038 systemd[1]: Started cri-containerd-15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71.scope - libcontainer container 15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71. Aug 13 01:46:40.805290 containerd[1938]: time="2025-08-13T01:46:40.805267505Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-795979ddf-pkw5n,Uid:105c1b16-b046-4e8f-a08b-e099d1063650,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71\"" Aug 13 01:46:40.810263 kubelet[3290]: I0813 01:46:40.810232 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-47grb" podStartSLOduration=33.810221136 podStartE2EDuration="33.810221136s" podCreationTimestamp="2025-08-13 01:46:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 01:46:40.809953517 +0000 UTC m=+39.187277782" watchObservedRunningTime="2025-08-13 01:46:40.810221136 +0000 UTC m=+39.187545395" Aug 13 01:46:40.822492 systemd-networkd[1852]: calibde8d6cfca0: Link UP Aug 13 01:46:40.822675 systemd-networkd[1852]: calibde8d6cfca0: Gained carrier Aug 13 01:46:40.828924 containerd[1938]: 2025-08-13 01:46:40.689 [INFO][5465] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--a--4296cabafa-k8s-calico--kube--controllers--7846546c6c--fbv5w-eth0 calico-kube-controllers-7846546c6c- calico-system 5a16da63-a958-45dd-907e-53b0c801fd41 790 0 2025-08-13 01:46:19 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers 
pod-template-hash:7846546c6c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4372.1.0-a-4296cabafa calico-kube-controllers-7846546c6c-fbv5w eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calibde8d6cfca0 [] [] }} ContainerID="dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895" Namespace="calico-system" Pod="calico-kube-controllers-7846546c6c-fbv5w" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-calico--kube--controllers--7846546c6c--fbv5w-" Aug 13 01:46:40.828924 containerd[1938]: 2025-08-13 01:46:40.689 [INFO][5465] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895" Namespace="calico-system" Pod="calico-kube-controllers-7846546c6c-fbv5w" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-calico--kube--controllers--7846546c6c--fbv5w-eth0" Aug 13 01:46:40.828924 containerd[1938]: 2025-08-13 01:46:40.702 [INFO][5508] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895" HandleID="k8s-pod-network.dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895" Workload="ci--4372.1.0--a--4296cabafa-k8s-calico--kube--controllers--7846546c6c--fbv5w-eth0" Aug 13 01:46:40.828924 containerd[1938]: 2025-08-13 01:46:40.702 [INFO][5508] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895" HandleID="k8s-pod-network.dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895" Workload="ci--4372.1.0--a--4296cabafa-k8s-calico--kube--controllers--7846546c6c--fbv5w-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000123750), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372.1.0-a-4296cabafa", 
"pod":"calico-kube-controllers-7846546c6c-fbv5w", "timestamp":"2025-08-13 01:46:40.702484748 +0000 UTC"}, Hostname:"ci-4372.1.0-a-4296cabafa", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 01:46:40.828924 containerd[1938]: 2025-08-13 01:46:40.702 [INFO][5508] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 01:46:40.828924 containerd[1938]: 2025-08-13 01:46:40.724 [INFO][5508] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 01:46:40.828924 containerd[1938]: 2025-08-13 01:46:40.724 [INFO][5508] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-a-4296cabafa' Aug 13 01:46:40.828924 containerd[1938]: 2025-08-13 01:46:40.807 [INFO][5508] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:40.828924 containerd[1938]: 2025-08-13 01:46:40.810 [INFO][5508] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:40.828924 containerd[1938]: 2025-08-13 01:46:40.814 [INFO][5508] ipam/ipam.go 511: Trying affinity for 192.168.89.0/26 host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:40.828924 containerd[1938]: 2025-08-13 01:46:40.815 [INFO][5508] ipam/ipam.go 158: Attempting to load block cidr=192.168.89.0/26 host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:40.828924 containerd[1938]: 2025-08-13 01:46:40.816 [INFO][5508] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.89.0/26 host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:40.828924 containerd[1938]: 2025-08-13 01:46:40.816 [INFO][5508] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.89.0/26 handle="k8s-pod-network.dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895" 
host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:40.828924 containerd[1938]: 2025-08-13 01:46:40.816 [INFO][5508] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895 Aug 13 01:46:40.828924 containerd[1938]: 2025-08-13 01:46:40.818 [INFO][5508] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.89.0/26 handle="k8s-pod-network.dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:40.828924 containerd[1938]: 2025-08-13 01:46:40.820 [INFO][5508] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.89.5/26] block=192.168.89.0/26 handle="k8s-pod-network.dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:40.828924 containerd[1938]: 2025-08-13 01:46:40.820 [INFO][5508] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.89.5/26] handle="k8s-pod-network.dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:40.828924 containerd[1938]: 2025-08-13 01:46:40.820 [INFO][5508] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 13 01:46:40.828924 containerd[1938]: 2025-08-13 01:46:40.820 [INFO][5508] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.89.5/26] IPv6=[] ContainerID="dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895" HandleID="k8s-pod-network.dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895" Workload="ci--4372.1.0--a--4296cabafa-k8s-calico--kube--controllers--7846546c6c--fbv5w-eth0" Aug 13 01:46:40.829312 containerd[1938]: 2025-08-13 01:46:40.821 [INFO][5465] cni-plugin/k8s.go 418: Populated endpoint ContainerID="dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895" Namespace="calico-system" Pod="calico-kube-controllers-7846546c6c-fbv5w" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-calico--kube--controllers--7846546c6c--fbv5w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--4296cabafa-k8s-calico--kube--controllers--7846546c6c--fbv5w-eth0", GenerateName:"calico-kube-controllers-7846546c6c-", Namespace:"calico-system", SelfLink:"", UID:"5a16da63-a958-45dd-907e-53b0c801fd41", ResourceVersion:"790", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 1, 46, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7846546c6c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-4296cabafa", ContainerID:"", Pod:"calico-kube-controllers-7846546c6c-fbv5w", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.89.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calibde8d6cfca0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 01:46:40.829312 containerd[1938]: 2025-08-13 01:46:40.821 [INFO][5465] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.89.5/32] ContainerID="dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895" Namespace="calico-system" Pod="calico-kube-controllers-7846546c6c-fbv5w" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-calico--kube--controllers--7846546c6c--fbv5w-eth0" Aug 13 01:46:40.829312 containerd[1938]: 2025-08-13 01:46:40.821 [INFO][5465] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibde8d6cfca0 ContainerID="dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895" Namespace="calico-system" Pod="calico-kube-controllers-7846546c6c-fbv5w" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-calico--kube--controllers--7846546c6c--fbv5w-eth0" Aug 13 01:46:40.829312 containerd[1938]: 2025-08-13 01:46:40.822 [INFO][5465] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895" Namespace="calico-system" Pod="calico-kube-controllers-7846546c6c-fbv5w" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-calico--kube--controllers--7846546c6c--fbv5w-eth0" Aug 13 01:46:40.829312 containerd[1938]: 2025-08-13 01:46:40.823 [INFO][5465] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895" Namespace="calico-system" Pod="calico-kube-controllers-7846546c6c-fbv5w" 
WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-calico--kube--controllers--7846546c6c--fbv5w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--4296cabafa-k8s-calico--kube--controllers--7846546c6c--fbv5w-eth0", GenerateName:"calico-kube-controllers-7846546c6c-", Namespace:"calico-system", SelfLink:"", UID:"5a16da63-a958-45dd-907e-53b0c801fd41", ResourceVersion:"790", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 1, 46, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7846546c6c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-4296cabafa", ContainerID:"dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895", Pod:"calico-kube-controllers-7846546c6c-fbv5w", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.89.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calibde8d6cfca0", MAC:"7e:d5:ce:b9:d7:93", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 01:46:40.829312 containerd[1938]: 2025-08-13 01:46:40.827 [INFO][5465] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895" Namespace="calico-system" 
Pod="calico-kube-controllers-7846546c6c-fbv5w" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-calico--kube--controllers--7846546c6c--fbv5w-eth0" Aug 13 01:46:40.836420 containerd[1938]: time="2025-08-13T01:46:40.836362544Z" level=info msg="connecting to shim dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895" address="unix:///run/containerd/s/b22ef5d0f963c78ff232312e950c71b99e9ca9fdd363eba1a7b408f3ae30d7fc" namespace=k8s.io protocol=ttrpc version=3 Aug 13 01:46:40.858283 systemd[1]: Started cri-containerd-dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895.scope - libcontainer container dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895. Aug 13 01:46:40.886141 containerd[1938]: time="2025-08-13T01:46:40.886092262Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7846546c6c-fbv5w,Uid:5a16da63-a958-45dd-907e-53b0c801fd41,Namespace:calico-system,Attempt:0,} returns sandbox id \"dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895\"" Aug 13 01:46:41.544009 systemd-networkd[1852]: cali39df9890bbd: Gained IPv6LL Aug 13 01:46:42.120943 systemd-networkd[1852]: cali24e1e5ec6e0: Gained IPv6LL Aug 13 01:46:42.375781 containerd[1938]: time="2025-08-13T01:46:42.375697215Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:42.375990 containerd[1938]: time="2025-08-13T01:46:42.375930829Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=47317977" Aug 13 01:46:42.376344 containerd[1938]: time="2025-08-13T01:46:42.376300256Z" level=info msg="ImageCreate event name:\"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:42.377201 containerd[1938]: time="2025-08-13T01:46:42.377156020Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:42.377597 containerd[1938]: time="2025-08-13T01:46:42.377554184Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 2.62186431s" Aug 13 01:46:42.377597 containerd[1938]: time="2025-08-13T01:46:42.377571157Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Aug 13 01:46:42.378013 containerd[1938]: time="2025-08-13T01:46:42.378003594Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 13 01:46:42.378472 containerd[1938]: time="2025-08-13T01:46:42.378460673Z" level=info msg="CreateContainer within sandbox \"3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 13 01:46:42.381027 containerd[1938]: time="2025-08-13T01:46:42.381016455Z" level=info msg="Container 2bcc2e126dc8f1297bcec02870fa8ea3ec4973d26587abfcc57461b7d01e65a3: CDI devices from CRI Config.CDIDevices: []" Aug 13 01:46:42.383817 containerd[1938]: time="2025-08-13T01:46:42.383801740Z" level=info msg="CreateContainer within sandbox \"3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"2bcc2e126dc8f1297bcec02870fa8ea3ec4973d26587abfcc57461b7d01e65a3\"" Aug 13 01:46:42.384048 containerd[1938]: time="2025-08-13T01:46:42.384035365Z" level=info msg="StartContainer for 
\"2bcc2e126dc8f1297bcec02870fa8ea3ec4973d26587abfcc57461b7d01e65a3\"" Aug 13 01:46:42.384792 containerd[1938]: time="2025-08-13T01:46:42.384780362Z" level=info msg="connecting to shim 2bcc2e126dc8f1297bcec02870fa8ea3ec4973d26587abfcc57461b7d01e65a3" address="unix:///run/containerd/s/c19a51256830670824098be82cc8bb94503de8d8846677bb78a73fcffbdc1729" protocol=ttrpc version=3 Aug 13 01:46:42.404209 systemd[1]: Started cri-containerd-2bcc2e126dc8f1297bcec02870fa8ea3ec4973d26587abfcc57461b7d01e65a3.scope - libcontainer container 2bcc2e126dc8f1297bcec02870fa8ea3ec4973d26587abfcc57461b7d01e65a3. Aug 13 01:46:42.435003 containerd[1938]: time="2025-08-13T01:46:42.434982900Z" level=info msg="StartContainer for \"2bcc2e126dc8f1297bcec02870fa8ea3ec4973d26587abfcc57461b7d01e65a3\" returns successfully" Aug 13 01:46:42.669216 containerd[1938]: time="2025-08-13T01:46:42.669002254Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-7qzrh,Uid:e53e116f-77da-4082-88d4-1cdc062c5b36,Namespace:kube-system,Attempt:0,}" Aug 13 01:46:42.722974 systemd-networkd[1852]: cali1df3e822126: Link UP Aug 13 01:46:42.723133 systemd-networkd[1852]: cali1df3e822126: Gained carrier Aug 13 01:46:42.727971 containerd[1938]: 2025-08-13 01:46:42.694 [INFO][5725] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--7qzrh-eth0 coredns-668d6bf9bc- kube-system e53e116f-77da-4082-88d4-1cdc062c5b36 787 0 2025-08-13 01:46:07 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4372.1.0-a-4296cabafa coredns-668d6bf9bc-7qzrh eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali1df3e822126 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} 
ContainerID="0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15" Namespace="kube-system" Pod="coredns-668d6bf9bc-7qzrh" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--7qzrh-" Aug 13 01:46:42.727971 containerd[1938]: 2025-08-13 01:46:42.694 [INFO][5725] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15" Namespace="kube-system" Pod="coredns-668d6bf9bc-7qzrh" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--7qzrh-eth0" Aug 13 01:46:42.727971 containerd[1938]: 2025-08-13 01:46:42.707 [INFO][5747] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15" HandleID="k8s-pod-network.0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15" Workload="ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--7qzrh-eth0" Aug 13 01:46:42.727971 containerd[1938]: 2025-08-13 01:46:42.707 [INFO][5747] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15" HandleID="k8s-pod-network.0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15" Workload="ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--7qzrh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f850), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4372.1.0-a-4296cabafa", "pod":"coredns-668d6bf9bc-7qzrh", "timestamp":"2025-08-13 01:46:42.707766444 +0000 UTC"}, Hostname:"ci-4372.1.0-a-4296cabafa", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 01:46:42.727971 containerd[1938]: 2025-08-13 01:46:42.707 [INFO][5747] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Aug 13 01:46:42.727971 containerd[1938]: 2025-08-13 01:46:42.707 [INFO][5747] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 01:46:42.727971 containerd[1938]: 2025-08-13 01:46:42.707 [INFO][5747] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-a-4296cabafa' Aug 13 01:46:42.727971 containerd[1938]: 2025-08-13 01:46:42.710 [INFO][5747] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:42.727971 containerd[1938]: 2025-08-13 01:46:42.712 [INFO][5747] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:42.727971 containerd[1938]: 2025-08-13 01:46:42.714 [INFO][5747] ipam/ipam.go 511: Trying affinity for 192.168.89.0/26 host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:42.727971 containerd[1938]: 2025-08-13 01:46:42.715 [INFO][5747] ipam/ipam.go 158: Attempting to load block cidr=192.168.89.0/26 host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:42.727971 containerd[1938]: 2025-08-13 01:46:42.716 [INFO][5747] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.89.0/26 host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:42.727971 containerd[1938]: 2025-08-13 01:46:42.716 [INFO][5747] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.89.0/26 handle="k8s-pod-network.0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:42.727971 containerd[1938]: 2025-08-13 01:46:42.716 [INFO][5747] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15 Aug 13 01:46:42.727971 containerd[1938]: 2025-08-13 01:46:42.718 [INFO][5747] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.89.0/26 handle="k8s-pod-network.0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15" 
host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:42.727971 containerd[1938]: 2025-08-13 01:46:42.721 [INFO][5747] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.89.6/26] block=192.168.89.0/26 handle="k8s-pod-network.0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:42.727971 containerd[1938]: 2025-08-13 01:46:42.721 [INFO][5747] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.89.6/26] handle="k8s-pod-network.0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:42.727971 containerd[1938]: 2025-08-13 01:46:42.721 [INFO][5747] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 01:46:42.727971 containerd[1938]: 2025-08-13 01:46:42.721 [INFO][5747] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.89.6/26] IPv6=[] ContainerID="0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15" HandleID="k8s-pod-network.0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15" Workload="ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--7qzrh-eth0" Aug 13 01:46:42.728622 containerd[1938]: 2025-08-13 01:46:42.722 [INFO][5725] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15" Namespace="kube-system" Pod="coredns-668d6bf9bc-7qzrh" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--7qzrh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--7qzrh-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"e53e116f-77da-4082-88d4-1cdc062c5b36", ResourceVersion:"787", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 1, 46, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-4296cabafa", ContainerID:"", Pod:"coredns-668d6bf9bc-7qzrh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.89.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1df3e822126", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 01:46:42.728622 containerd[1938]: 2025-08-13 01:46:42.722 [INFO][5725] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.89.6/32] ContainerID="0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15" Namespace="kube-system" Pod="coredns-668d6bf9bc-7qzrh" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--7qzrh-eth0" Aug 13 01:46:42.728622 containerd[1938]: 2025-08-13 01:46:42.722 [INFO][5725] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1df3e822126 ContainerID="0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15" Namespace="kube-system" Pod="coredns-668d6bf9bc-7qzrh" 
WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--7qzrh-eth0" Aug 13 01:46:42.728622 containerd[1938]: 2025-08-13 01:46:42.723 [INFO][5725] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15" Namespace="kube-system" Pod="coredns-668d6bf9bc-7qzrh" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--7qzrh-eth0" Aug 13 01:46:42.728622 containerd[1938]: 2025-08-13 01:46:42.723 [INFO][5725] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15" Namespace="kube-system" Pod="coredns-668d6bf9bc-7qzrh" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--7qzrh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--7qzrh-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"e53e116f-77da-4082-88d4-1cdc062c5b36", ResourceVersion:"787", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 1, 46, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-4296cabafa", ContainerID:"0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15", Pod:"coredns-668d6bf9bc-7qzrh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.89.6/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1df3e822126", MAC:"ce:db:0a:4b:5d:15", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 01:46:42.728622 containerd[1938]: 2025-08-13 01:46:42.726 [INFO][5725] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15" Namespace="kube-system" Pod="coredns-668d6bf9bc-7qzrh" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-coredns--668d6bf9bc--7qzrh-eth0" Aug 13 01:46:42.736757 containerd[1938]: time="2025-08-13T01:46:42.736731268Z" level=info msg="connecting to shim 0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15" address="unix:///run/containerd/s/9674c0784eeb1e324b0eeefc7cd7df6cdd70b398c407bef4dc2b08cc67b6519b" namespace=k8s.io protocol=ttrpc version=3 Aug 13 01:46:42.755992 systemd[1]: Started cri-containerd-0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15.scope - libcontainer container 0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15. 
Aug 13 01:46:42.777459 containerd[1938]: time="2025-08-13T01:46:42.777438738Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:42.777644 containerd[1938]: time="2025-08-13T01:46:42.777624143Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Aug 13 01:46:42.778786 containerd[1938]: time="2025-08-13T01:46:42.778772734Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 400.75574ms" Aug 13 01:46:42.778826 containerd[1938]: time="2025-08-13T01:46:42.778790787Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Aug 13 01:46:42.779241 containerd[1938]: time="2025-08-13T01:46:42.779230006Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\"" Aug 13 01:46:42.779817 containerd[1938]: time="2025-08-13T01:46:42.779801339Z" level=info msg="CreateContainer within sandbox \"15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 13 01:46:42.782698 containerd[1938]: time="2025-08-13T01:46:42.782681356Z" level=info msg="Container 393f7acbdd831c7ec35b8ecdd8bc4687aa654dca440f325a124238205eb8aef3: CDI devices from CRI Config.CDIDevices: []" Aug 13 01:46:42.783517 containerd[1938]: time="2025-08-13T01:46:42.783505834Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-7qzrh,Uid:e53e116f-77da-4082-88d4-1cdc062c5b36,Namespace:kube-system,Attempt:0,} 
returns sandbox id \"0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15\"" Aug 13 01:46:42.784653 containerd[1938]: time="2025-08-13T01:46:42.784639108Z" level=info msg="CreateContainer within sandbox \"0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 13 01:46:42.785334 containerd[1938]: time="2025-08-13T01:46:42.785320413Z" level=info msg="CreateContainer within sandbox \"15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"393f7acbdd831c7ec35b8ecdd8bc4687aa654dca440f325a124238205eb8aef3\"" Aug 13 01:46:42.785470 containerd[1938]: time="2025-08-13T01:46:42.785460529Z" level=info msg="StartContainer for \"393f7acbdd831c7ec35b8ecdd8bc4687aa654dca440f325a124238205eb8aef3\"" Aug 13 01:46:42.785986 containerd[1938]: time="2025-08-13T01:46:42.785974987Z" level=info msg="connecting to shim 393f7acbdd831c7ec35b8ecdd8bc4687aa654dca440f325a124238205eb8aef3" address="unix:///run/containerd/s/73cecb4e6e92a0402388254277ac479d0623e02e22488bd0843b59190adfb733" protocol=ttrpc version=3 Aug 13 01:46:42.787812 containerd[1938]: time="2025-08-13T01:46:42.787768329Z" level=info msg="Container 516d22b09f3851a1de701f26926b2579393a85e7c216d7cf429a28b0f542cdc2: CDI devices from CRI Config.CDIDevices: []" Aug 13 01:46:42.790164 containerd[1938]: time="2025-08-13T01:46:42.790153176Z" level=info msg="CreateContainer within sandbox \"0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"516d22b09f3851a1de701f26926b2579393a85e7c216d7cf429a28b0f542cdc2\"" Aug 13 01:46:42.790358 containerd[1938]: time="2025-08-13T01:46:42.790348022Z" level=info msg="StartContainer for \"516d22b09f3851a1de701f26926b2579393a85e7c216d7cf429a28b0f542cdc2\"" Aug 13 01:46:42.790744 containerd[1938]: time="2025-08-13T01:46:42.790733405Z" level=info 
msg="connecting to shim 516d22b09f3851a1de701f26926b2579393a85e7c216d7cf429a28b0f542cdc2" address="unix:///run/containerd/s/9674c0784eeb1e324b0eeefc7cd7df6cdd70b398c407bef4dc2b08cc67b6519b" protocol=ttrpc version=3 Aug 13 01:46:42.807081 systemd[1]: Started cri-containerd-393f7acbdd831c7ec35b8ecdd8bc4687aa654dca440f325a124238205eb8aef3.scope - libcontainer container 393f7acbdd831c7ec35b8ecdd8bc4687aa654dca440f325a124238205eb8aef3. Aug 13 01:46:42.808897 systemd[1]: Started cri-containerd-516d22b09f3851a1de701f26926b2579393a85e7c216d7cf429a28b0f542cdc2.scope - libcontainer container 516d22b09f3851a1de701f26926b2579393a85e7c216d7cf429a28b0f542cdc2. Aug 13 01:46:42.815652 kubelet[3290]: I0813 01:46:42.815603 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-795979ddf-wcmd7" podStartSLOduration=25.232163434 podStartE2EDuration="28.815586361s" podCreationTimestamp="2025-08-13 01:46:14 +0000 UTC" firstStartedPulling="2025-08-13 01:46:38.794519106 +0000 UTC m=+37.171843369" lastFinishedPulling="2025-08-13 01:46:42.377942034 +0000 UTC m=+40.755266296" observedRunningTime="2025-08-13 01:46:42.815419955 +0000 UTC m=+41.192744227" watchObservedRunningTime="2025-08-13 01:46:42.815586361 +0000 UTC m=+41.192910624" Aug 13 01:46:42.823236 containerd[1938]: time="2025-08-13T01:46:42.823215427Z" level=info msg="StartContainer for \"516d22b09f3851a1de701f26926b2579393a85e7c216d7cf429a28b0f542cdc2\" returns successfully" Aug 13 01:46:42.823936 systemd-networkd[1852]: calibde8d6cfca0: Gained IPv6LL Aug 13 01:46:42.836637 containerd[1938]: time="2025-08-13T01:46:42.836611705Z" level=info msg="StartContainer for \"393f7acbdd831c7ec35b8ecdd8bc4687aa654dca440f325a124238205eb8aef3\" returns successfully" Aug 13 01:46:43.669619 containerd[1938]: time="2025-08-13T01:46:43.669498580Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:goldmane-768f4c5c69-g4k5b,Uid:de755d1b-da9c-40d7-8dfc-e44c18abf305,Namespace:calico-system,Attempt:0,}" Aug 13 01:46:43.669619 containerd[1938]: time="2025-08-13T01:46:43.669543803Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fd57q,Uid:bde1b4b3-33c0-4855-9196-5fe2289fff45,Namespace:calico-system,Attempt:0,}" Aug 13 01:46:43.726416 systemd-networkd[1852]: cali16d3921c7ce: Link UP Aug 13 01:46:43.726590 systemd-networkd[1852]: cali16d3921c7ce: Gained carrier Aug 13 01:46:43.731566 containerd[1938]: 2025-08-13 01:46:43.691 [INFO][5912] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--a--4296cabafa-k8s-csi--node--driver--fd57q-eth0 csi-node-driver- calico-system bde1b4b3-33c0-4855-9196-5fe2289fff45 678 0 2025-08-13 01:46:18 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:8967bcb6f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4372.1.0-a-4296cabafa csi-node-driver-fd57q eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali16d3921c7ce [] [] }} ContainerID="a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c" Namespace="calico-system" Pod="csi-node-driver-fd57q" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-csi--node--driver--fd57q-" Aug 13 01:46:43.731566 containerd[1938]: 2025-08-13 01:46:43.691 [INFO][5912] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c" Namespace="calico-system" Pod="csi-node-driver-fd57q" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-csi--node--driver--fd57q-eth0" Aug 13 01:46:43.731566 containerd[1938]: 2025-08-13 01:46:43.705 [INFO][5957] ipam/ipam_plugin.go 225: Calico CNI IPAM 
request count IPv4=1 IPv6=0 ContainerID="a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c" HandleID="k8s-pod-network.a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c" Workload="ci--4372.1.0--a--4296cabafa-k8s-csi--node--driver--fd57q-eth0" Aug 13 01:46:43.731566 containerd[1938]: 2025-08-13 01:46:43.705 [INFO][5957] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c" HandleID="k8s-pod-network.a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c" Workload="ci--4372.1.0--a--4296cabafa-k8s-csi--node--driver--fd57q-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fe70), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372.1.0-a-4296cabafa", "pod":"csi-node-driver-fd57q", "timestamp":"2025-08-13 01:46:43.705166048 +0000 UTC"}, Hostname:"ci-4372.1.0-a-4296cabafa", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 01:46:43.731566 containerd[1938]: 2025-08-13 01:46:43.705 [INFO][5957] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 01:46:43.731566 containerd[1938]: 2025-08-13 01:46:43.705 [INFO][5957] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 01:46:43.731566 containerd[1938]: 2025-08-13 01:46:43.705 [INFO][5957] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-a-4296cabafa' Aug 13 01:46:43.731566 containerd[1938]: 2025-08-13 01:46:43.709 [INFO][5957] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:43.731566 containerd[1938]: 2025-08-13 01:46:43.712 [INFO][5957] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:43.731566 containerd[1938]: 2025-08-13 01:46:43.715 [INFO][5957] ipam/ipam.go 511: Trying affinity for 192.168.89.0/26 host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:43.731566 containerd[1938]: 2025-08-13 01:46:43.717 [INFO][5957] ipam/ipam.go 158: Attempting to load block cidr=192.168.89.0/26 host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:43.731566 containerd[1938]: 2025-08-13 01:46:43.718 [INFO][5957] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.89.0/26 host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:43.731566 containerd[1938]: 2025-08-13 01:46:43.718 [INFO][5957] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.89.0/26 handle="k8s-pod-network.a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:43.731566 containerd[1938]: 2025-08-13 01:46:43.719 [INFO][5957] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c Aug 13 01:46:43.731566 containerd[1938]: 2025-08-13 01:46:43.721 [INFO][5957] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.89.0/26 handle="k8s-pod-network.a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:43.731566 containerd[1938]: 2025-08-13 01:46:43.724 [INFO][5957] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.89.7/26] block=192.168.89.0/26 handle="k8s-pod-network.a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:43.731566 containerd[1938]: 2025-08-13 01:46:43.724 [INFO][5957] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.89.7/26] handle="k8s-pod-network.a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:43.731566 containerd[1938]: 2025-08-13 01:46:43.724 [INFO][5957] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 01:46:43.731566 containerd[1938]: 2025-08-13 01:46:43.724 [INFO][5957] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.89.7/26] IPv6=[] ContainerID="a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c" HandleID="k8s-pod-network.a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c" Workload="ci--4372.1.0--a--4296cabafa-k8s-csi--node--driver--fd57q-eth0" Aug 13 01:46:43.732136 containerd[1938]: 2025-08-13 01:46:43.725 [INFO][5912] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c" Namespace="calico-system" Pod="csi-node-driver-fd57q" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-csi--node--driver--fd57q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--4296cabafa-k8s-csi--node--driver--fd57q-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"bde1b4b3-33c0-4855-9196-5fe2289fff45", ResourceVersion:"678", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 1, 46, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", 
"pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-4296cabafa", ContainerID:"", Pod:"csi-node-driver-fd57q", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.89.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali16d3921c7ce", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 01:46:43.732136 containerd[1938]: 2025-08-13 01:46:43.725 [INFO][5912] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.89.7/32] ContainerID="a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c" Namespace="calico-system" Pod="csi-node-driver-fd57q" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-csi--node--driver--fd57q-eth0" Aug 13 01:46:43.732136 containerd[1938]: 2025-08-13 01:46:43.725 [INFO][5912] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali16d3921c7ce ContainerID="a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c" Namespace="calico-system" Pod="csi-node-driver-fd57q" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-csi--node--driver--fd57q-eth0" Aug 13 01:46:43.732136 containerd[1938]: 2025-08-13 01:46:43.726 [INFO][5912] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c" Namespace="calico-system" Pod="csi-node-driver-fd57q" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-csi--node--driver--fd57q-eth0" Aug 13 01:46:43.732136 containerd[1938]: 2025-08-13 01:46:43.726 
[INFO][5912] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c" Namespace="calico-system" Pod="csi-node-driver-fd57q" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-csi--node--driver--fd57q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--4296cabafa-k8s-csi--node--driver--fd57q-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"bde1b4b3-33c0-4855-9196-5fe2289fff45", ResourceVersion:"678", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 1, 46, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-4296cabafa", ContainerID:"a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c", Pod:"csi-node-driver-fd57q", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.89.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali16d3921c7ce", MAC:"9e:59:5f:e6:d4:fa", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 01:46:43.732136 containerd[1938]: 2025-08-13 01:46:43.730 [INFO][5912] 
cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c" Namespace="calico-system" Pod="csi-node-driver-fd57q" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-csi--node--driver--fd57q-eth0" Aug 13 01:46:43.739211 containerd[1938]: time="2025-08-13T01:46:43.739186010Z" level=info msg="connecting to shim a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c" address="unix:///run/containerd/s/1a8aed39702b9f47a1e09a380ebba3b0f59dc8f3b75261b170da4c34d2dc32ab" namespace=k8s.io protocol=ttrpc version=3 Aug 13 01:46:43.763182 systemd[1]: Started cri-containerd-a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c.scope - libcontainer container a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c. Aug 13 01:46:43.776303 containerd[1938]: time="2025-08-13T01:46:43.776257571Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fd57q,Uid:bde1b4b3-33c0-4855-9196-5fe2289fff45,Namespace:calico-system,Attempt:0,} returns sandbox id \"a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c\"" Aug 13 01:46:43.816665 kubelet[3290]: I0813 01:46:43.816634 3290 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 13 01:46:43.843730 systemd-networkd[1852]: cali387620a51b5: Link UP Aug 13 01:46:43.843942 systemd-networkd[1852]: cali387620a51b5: Gained carrier Aug 13 01:46:43.844820 kubelet[3290]: I0813 01:46:43.844786 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-7qzrh" podStartSLOduration=36.844773502 podStartE2EDuration="36.844773502s" podCreationTimestamp="2025-08-13 01:46:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 01:46:43.844446068 +0000 UTC m=+42.221770340" watchObservedRunningTime="2025-08-13 01:46:43.844773502 +0000 UTC m=+42.222097761" Aug 
13 01:46:43.844944 kubelet[3290]: I0813 01:46:43.844849 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-795979ddf-pkw5n" podStartSLOduration=27.871327282 podStartE2EDuration="29.84484574s" podCreationTimestamp="2025-08-13 01:46:14 +0000 UTC" firstStartedPulling="2025-08-13 01:46:40.805642062 +0000 UTC m=+39.182966324" lastFinishedPulling="2025-08-13 01:46:42.779160519 +0000 UTC m=+41.156484782" observedRunningTime="2025-08-13 01:46:43.838829171 +0000 UTC m=+42.216153478" watchObservedRunningTime="2025-08-13 01:46:43.84484574 +0000 UTC m=+42.222169999" Aug 13 01:46:43.850293 containerd[1938]: 2025-08-13 01:46:43.691 [INFO][5910] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--a--4296cabafa-k8s-goldmane--768f4c5c69--g4k5b-eth0 goldmane-768f4c5c69- calico-system de755d1b-da9c-40d7-8dfc-e44c18abf305 783 0 2025-08-13 01:46:18 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:768f4c5c69 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4372.1.0-a-4296cabafa goldmane-768f4c5c69-g4k5b eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali387620a51b5 [] [] }} ContainerID="bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa" Namespace="calico-system" Pod="goldmane-768f4c5c69-g4k5b" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-goldmane--768f4c5c69--g4k5b-" Aug 13 01:46:43.850293 containerd[1938]: 2025-08-13 01:46:43.691 [INFO][5910] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa" Namespace="calico-system" Pod="goldmane-768f4c5c69-g4k5b" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-goldmane--768f4c5c69--g4k5b-eth0" Aug 13 01:46:43.850293 containerd[1938]: 2025-08-13 01:46:43.705 [INFO][5955] 
ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa" HandleID="k8s-pod-network.bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa" Workload="ci--4372.1.0--a--4296cabafa-k8s-goldmane--768f4c5c69--g4k5b-eth0" Aug 13 01:46:43.850293 containerd[1938]: 2025-08-13 01:46:43.705 [INFO][5955] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa" HandleID="k8s-pod-network.bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa" Workload="ci--4372.1.0--a--4296cabafa-k8s-goldmane--768f4c5c69--g4k5b-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003cf340), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372.1.0-a-4296cabafa", "pod":"goldmane-768f4c5c69-g4k5b", "timestamp":"2025-08-13 01:46:43.705169189 +0000 UTC"}, Hostname:"ci-4372.1.0-a-4296cabafa", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 01:46:43.850293 containerd[1938]: 2025-08-13 01:46:43.705 [INFO][5955] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 01:46:43.850293 containerd[1938]: 2025-08-13 01:46:43.724 [INFO][5955] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 01:46:43.850293 containerd[1938]: 2025-08-13 01:46:43.724 [INFO][5955] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-a-4296cabafa' Aug 13 01:46:43.850293 containerd[1938]: 2025-08-13 01:46:43.810 [INFO][5955] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:43.850293 containerd[1938]: 2025-08-13 01:46:43.814 [INFO][5955] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:43.850293 containerd[1938]: 2025-08-13 01:46:43.817 [INFO][5955] ipam/ipam.go 511: Trying affinity for 192.168.89.0/26 host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:43.850293 containerd[1938]: 2025-08-13 01:46:43.819 [INFO][5955] ipam/ipam.go 158: Attempting to load block cidr=192.168.89.0/26 host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:43.850293 containerd[1938]: 2025-08-13 01:46:43.821 [INFO][5955] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.89.0/26 host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:43.850293 containerd[1938]: 2025-08-13 01:46:43.821 [INFO][5955] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.89.0/26 handle="k8s-pod-network.bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:43.850293 containerd[1938]: 2025-08-13 01:46:43.823 [INFO][5955] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa Aug 13 01:46:43.850293 containerd[1938]: 2025-08-13 01:46:43.839 [INFO][5955] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.89.0/26 handle="k8s-pod-network.bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:43.850293 containerd[1938]: 2025-08-13 01:46:43.841 [INFO][5955] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.89.8/26] block=192.168.89.0/26 handle="k8s-pod-network.bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:43.850293 containerd[1938]: 2025-08-13 01:46:43.841 [INFO][5955] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.89.8/26] handle="k8s-pod-network.bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa" host="ci-4372.1.0-a-4296cabafa" Aug 13 01:46:43.850293 containerd[1938]: 2025-08-13 01:46:43.841 [INFO][5955] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 01:46:43.850293 containerd[1938]: 2025-08-13 01:46:43.841 [INFO][5955] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.89.8/26] IPv6=[] ContainerID="bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa" HandleID="k8s-pod-network.bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa" Workload="ci--4372.1.0--a--4296cabafa-k8s-goldmane--768f4c5c69--g4k5b-eth0" Aug 13 01:46:43.850716 containerd[1938]: 2025-08-13 01:46:43.842 [INFO][5910] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa" Namespace="calico-system" Pod="goldmane-768f4c5c69-g4k5b" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-goldmane--768f4c5c69--g4k5b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--4296cabafa-k8s-goldmane--768f4c5c69--g4k5b-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"de755d1b-da9c-40d7-8dfc-e44c18abf305", ResourceVersion:"783", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 1, 46, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-4296cabafa", ContainerID:"", Pod:"goldmane-768f4c5c69-g4k5b", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.89.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali387620a51b5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 01:46:43.850716 containerd[1938]: 2025-08-13 01:46:43.842 [INFO][5910] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.89.8/32] ContainerID="bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa" Namespace="calico-system" Pod="goldmane-768f4c5c69-g4k5b" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-goldmane--768f4c5c69--g4k5b-eth0" Aug 13 01:46:43.850716 containerd[1938]: 2025-08-13 01:46:43.843 [INFO][5910] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali387620a51b5 ContainerID="bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa" Namespace="calico-system" Pod="goldmane-768f4c5c69-g4k5b" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-goldmane--768f4c5c69--g4k5b-eth0" Aug 13 01:46:43.850716 containerd[1938]: 2025-08-13 01:46:43.844 [INFO][5910] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa" Namespace="calico-system" Pod="goldmane-768f4c5c69-g4k5b" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-goldmane--768f4c5c69--g4k5b-eth0" Aug 13 01:46:43.850716 containerd[1938]: 2025-08-13 01:46:43.844 [INFO][5910] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa" Namespace="calico-system" Pod="goldmane-768f4c5c69-g4k5b" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-goldmane--768f4c5c69--g4k5b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--4296cabafa-k8s-goldmane--768f4c5c69--g4k5b-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"de755d1b-da9c-40d7-8dfc-e44c18abf305", ResourceVersion:"783", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 1, 46, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-4296cabafa", ContainerID:"bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa", Pod:"goldmane-768f4c5c69-g4k5b", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.89.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali387620a51b5", MAC:"1a:18:9d:18:df:fa", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 01:46:43.850716 containerd[1938]: 2025-08-13 01:46:43.849 [INFO][5910] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa" Namespace="calico-system" Pod="goldmane-768f4c5c69-g4k5b" WorkloadEndpoint="ci--4372.1.0--a--4296cabafa-k8s-goldmane--768f4c5c69--g4k5b-eth0" Aug 13 01:46:43.857729 containerd[1938]: time="2025-08-13T01:46:43.857704623Z" level=info msg="connecting to shim bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa" address="unix:///run/containerd/s/f4c1fe135894fc69086f97ee95a1859e863e49fd12a8b423321cdbb471f606ad" namespace=k8s.io protocol=ttrpc version=3 Aug 13 01:46:43.884184 systemd[1]: Started cri-containerd-bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa.scope - libcontainer container bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa. Aug 13 01:46:43.915719 containerd[1938]: time="2025-08-13T01:46:43.915696999Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-g4k5b,Uid:de755d1b-da9c-40d7-8dfc-e44c18abf305,Namespace:calico-system,Attempt:0,} returns sandbox id \"bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa\"" Aug 13 01:46:43.977205 systemd-networkd[1852]: cali1df3e822126: Gained IPv6LL Aug 13 01:46:44.819325 kubelet[3290]: I0813 01:46:44.819307 3290 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 13 01:46:45.000169 systemd-networkd[1852]: cali387620a51b5: Gained IPv6LL Aug 13 01:46:45.255944 systemd-networkd[1852]: cali16d3921c7ce: Gained IPv6LL Aug 13 01:46:45.441146 containerd[1938]: time="2025-08-13T01:46:45.441121630Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:45.441387 containerd[1938]: time="2025-08-13T01:46:45.441344139Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.2: active requests=0, bytes read=51276688" Aug 13 01:46:45.441715 containerd[1938]: time="2025-08-13T01:46:45.441702258Z" level=info 
msg="ImageCreate event name:\"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:45.442502 containerd[1938]: time="2025-08-13T01:46:45.442489485Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:45.442852 containerd[1938]: time="2025-08-13T01:46:45.442838886Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" with image id \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\", size \"52769359\" in 2.663593071s" Aug 13 01:46:45.442886 containerd[1938]: time="2025-08-13T01:46:45.442856157Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" returns image reference \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\"" Aug 13 01:46:45.443339 containerd[1938]: time="2025-08-13T01:46:45.443327429Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\"" Aug 13 01:46:45.446492 containerd[1938]: time="2025-08-13T01:46:45.446478518Z" level=info msg="CreateContainer within sandbox \"dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Aug 13 01:46:45.449544 containerd[1938]: time="2025-08-13T01:46:45.449530434Z" level=info msg="Container ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4: CDI devices from CRI Config.CDIDevices: []" Aug 13 01:46:45.452164 containerd[1938]: time="2025-08-13T01:46:45.452151979Z" level=info msg="CreateContainer within sandbox 
\"dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4\"" Aug 13 01:46:45.452358 containerd[1938]: time="2025-08-13T01:46:45.452342448Z" level=info msg="StartContainer for \"ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4\"" Aug 13 01:46:45.452925 containerd[1938]: time="2025-08-13T01:46:45.452884079Z" level=info msg="connecting to shim ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4" address="unix:///run/containerd/s/b22ef5d0f963c78ff232312e950c71b99e9ca9fdd363eba1a7b408f3ae30d7fc" protocol=ttrpc version=3 Aug 13 01:46:45.467205 systemd[1]: Started cri-containerd-ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4.scope - libcontainer container ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4. Aug 13 01:46:45.494622 containerd[1938]: time="2025-08-13T01:46:45.494603083Z" level=info msg="StartContainer for \"ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4\" returns successfully" Aug 13 01:46:45.848716 kubelet[3290]: I0813 01:46:45.848631 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7846546c6c-fbv5w" podStartSLOduration=22.291965083 podStartE2EDuration="26.848601537s" podCreationTimestamp="2025-08-13 01:46:19 +0000 UTC" firstStartedPulling="2025-08-13 01:46:40.886632016 +0000 UTC m=+39.263956278" lastFinishedPulling="2025-08-13 01:46:45.443268468 +0000 UTC m=+43.820592732" observedRunningTime="2025-08-13 01:46:45.847792541 +0000 UTC m=+44.225116849" watchObservedRunningTime="2025-08-13 01:46:45.848601537 +0000 UTC m=+44.225925829" Aug 13 01:46:46.876253 containerd[1938]: time="2025-08-13T01:46:46.876229030Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4\" 
id:\"be23a25489f5635339aedf8119203c0bf7ee58128aadc5b0af2a0ee51f3a267b\" pid:6177 exited_at:{seconds:1755049606 nanos:876053802}" Aug 13 01:46:47.092734 containerd[1938]: time="2025-08-13T01:46:47.092708624Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:47.092923 containerd[1938]: time="2025-08-13T01:46:47.092908661Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.2: active requests=0, bytes read=8759190" Aug 13 01:46:47.093470 containerd[1938]: time="2025-08-13T01:46:47.093457436Z" level=info msg="ImageCreate event name:\"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:47.094798 containerd[1938]: time="2025-08-13T01:46:47.094666076Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:47.095712 containerd[1938]: time="2025-08-13T01:46:47.095697444Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.2\" with image id \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\", size \"10251893\" in 1.652352111s" Aug 13 01:46:47.095755 containerd[1938]: time="2025-08-13T01:46:47.095714647Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\" returns image reference \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\"" Aug 13 01:46:47.096128 containerd[1938]: time="2025-08-13T01:46:47.096115211Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\"" Aug 13 01:46:47.096681 containerd[1938]: time="2025-08-13T01:46:47.096664626Z" level=info 
msg="CreateContainer within sandbox \"a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Aug 13 01:46:47.100560 containerd[1938]: time="2025-08-13T01:46:47.100546343Z" level=info msg="Container 738c311c0963c37eba56b873c652beb7d95194a8f1ae9b9ae4278e079dac6c37: CDI devices from CRI Config.CDIDevices: []" Aug 13 01:46:47.103590 containerd[1938]: time="2025-08-13T01:46:47.103578194Z" level=info msg="CreateContainer within sandbox \"a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"738c311c0963c37eba56b873c652beb7d95194a8f1ae9b9ae4278e079dac6c37\"" Aug 13 01:46:47.103770 containerd[1938]: time="2025-08-13T01:46:47.103759593Z" level=info msg="StartContainer for \"738c311c0963c37eba56b873c652beb7d95194a8f1ae9b9ae4278e079dac6c37\"" Aug 13 01:46:47.104513 containerd[1938]: time="2025-08-13T01:46:47.104474082Z" level=info msg="connecting to shim 738c311c0963c37eba56b873c652beb7d95194a8f1ae9b9ae4278e079dac6c37" address="unix:///run/containerd/s/1a8aed39702b9f47a1e09a380ebba3b0f59dc8f3b75261b170da4c34d2dc32ab" protocol=ttrpc version=3 Aug 13 01:46:47.122201 systemd[1]: Started cri-containerd-738c311c0963c37eba56b873c652beb7d95194a8f1ae9b9ae4278e079dac6c37.scope - libcontainer container 738c311c0963c37eba56b873c652beb7d95194a8f1ae9b9ae4278e079dac6c37. Aug 13 01:46:47.143357 containerd[1938]: time="2025-08-13T01:46:47.143300906Z" level=info msg="StartContainer for \"738c311c0963c37eba56b873c652beb7d95194a8f1ae9b9ae4278e079dac6c37\" returns successfully" Aug 13 01:46:48.382185 kubelet[3290]: I0813 01:46:48.382126 3290 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 13 01:46:49.537961 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount862835854.mount: Deactivated successfully. 
Aug 13 01:46:49.771658 containerd[1938]: time="2025-08-13T01:46:49.771633315Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:49.771886 containerd[1938]: time="2025-08-13T01:46:49.771809943Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.2: active requests=0, bytes read=66352308" Aug 13 01:46:49.772118 containerd[1938]: time="2025-08-13T01:46:49.772105318Z" level=info msg="ImageCreate event name:\"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:49.773139 containerd[1938]: time="2025-08-13T01:46:49.773126937Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:49.773522 containerd[1938]: time="2025-08-13T01:46:49.773507798Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" with image id \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\", size \"66352154\" in 2.677375646s" Aug 13 01:46:49.773555 containerd[1938]: time="2025-08-13T01:46:49.773526250Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" returns image reference \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\"" Aug 13 01:46:49.774016 containerd[1938]: time="2025-08-13T01:46:49.774003440Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\"" Aug 13 01:46:49.774533 containerd[1938]: time="2025-08-13T01:46:49.774519710Z" level=info msg="CreateContainer within sandbox 
\"bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Aug 13 01:46:49.777637 containerd[1938]: time="2025-08-13T01:46:49.777618582Z" level=info msg="Container faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37: CDI devices from CRI Config.CDIDevices: []" Aug 13 01:46:49.781608 containerd[1938]: time="2025-08-13T01:46:49.781563381Z" level=info msg="CreateContainer within sandbox \"bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37\"" Aug 13 01:46:49.781837 containerd[1938]: time="2025-08-13T01:46:49.781824570Z" level=info msg="StartContainer for \"faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37\"" Aug 13 01:46:49.782388 containerd[1938]: time="2025-08-13T01:46:49.782376137Z" level=info msg="connecting to shim faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37" address="unix:///run/containerd/s/f4c1fe135894fc69086f97ee95a1859e863e49fd12a8b423321cdbb471f606ad" protocol=ttrpc version=3 Aug 13 01:46:49.797960 systemd[1]: Started cri-containerd-faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37.scope - libcontainer container faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37. 
Aug 13 01:46:49.826846 containerd[1938]: time="2025-08-13T01:46:49.826823357Z" level=info msg="StartContainer for \"faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37\" returns successfully" Aug 13 01:46:49.876378 kubelet[3290]: I0813 01:46:49.876338 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-768f4c5c69-g4k5b" podStartSLOduration=26.018604335 podStartE2EDuration="31.876322275s" podCreationTimestamp="2025-08-13 01:46:18 +0000 UTC" firstStartedPulling="2025-08-13 01:46:43.916240214 +0000 UTC m=+42.293564476" lastFinishedPulling="2025-08-13 01:46:49.773958154 +0000 UTC m=+48.151282416" observedRunningTime="2025-08-13 01:46:49.876231586 +0000 UTC m=+48.253555866" watchObservedRunningTime="2025-08-13 01:46:49.876322275 +0000 UTC m=+48.253646538" Aug 13 01:46:49.929211 containerd[1938]: time="2025-08-13T01:46:49.929184073Z" level=info msg="TaskExit event in podsandbox handler container_id:\"faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37\" id:\"514692083ee15755c96003ef2c4768ab633b77bc760ee1bd7bef3ef3704845a9\" pid:6290 exit_status:1 exited_at:{seconds:1755049609 nanos:928962454}" Aug 13 01:46:50.930478 containerd[1938]: time="2025-08-13T01:46:50.930447354Z" level=info msg="TaskExit event in podsandbox handler container_id:\"faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37\" id:\"fbc47f6219cd5db651a0c67495b833735ce491e13658c5f2743ddb1e8a88698f\" pid:6325 exit_status:1 exited_at:{seconds:1755049610 nanos:930220619}" Aug 13 01:46:51.701761 containerd[1938]: time="2025-08-13T01:46:51.701737988Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:51.702011 containerd[1938]: time="2025-08-13T01:46:51.701997933Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2: active requests=0, bytes read=14703784" Aug 13 
01:46:51.702323 containerd[1938]: time="2025-08-13T01:46:51.702310499Z" level=info msg="ImageCreate event name:\"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:51.703063 containerd[1938]: time="2025-08-13T01:46:51.703052116Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 01:46:51.703719 containerd[1938]: time="2025-08-13T01:46:51.703708043Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" with image id \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\", size \"16196439\" in 1.929687961s" Aug 13 01:46:51.703748 containerd[1938]: time="2025-08-13T01:46:51.703721961Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" returns image reference \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\"" Aug 13 01:46:51.704517 containerd[1938]: time="2025-08-13T01:46:51.704506306Z" level=info msg="CreateContainer within sandbox \"a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Aug 13 01:46:51.707273 containerd[1938]: time="2025-08-13T01:46:51.707258730Z" level=info msg="Container 70dd06d2d17b84bde4077fb04c83d71334fc61c038b4164dfbcedd5e254879f6: CDI devices from CRI Config.CDIDevices: []" Aug 13 01:46:51.710945 containerd[1938]: time="2025-08-13T01:46:51.710931304Z" level=info msg="CreateContainer within sandbox \"a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c\" for 
&ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"70dd06d2d17b84bde4077fb04c83d71334fc61c038b4164dfbcedd5e254879f6\"" Aug 13 01:46:51.711191 containerd[1938]: time="2025-08-13T01:46:51.711179251Z" level=info msg="StartContainer for \"70dd06d2d17b84bde4077fb04c83d71334fc61c038b4164dfbcedd5e254879f6\"" Aug 13 01:46:51.711897 containerd[1938]: time="2025-08-13T01:46:51.711884395Z" level=info msg="connecting to shim 70dd06d2d17b84bde4077fb04c83d71334fc61c038b4164dfbcedd5e254879f6" address="unix:///run/containerd/s/1a8aed39702b9f47a1e09a380ebba3b0f59dc8f3b75261b170da4c34d2dc32ab" protocol=ttrpc version=3 Aug 13 01:46:51.740329 systemd[1]: Started cri-containerd-70dd06d2d17b84bde4077fb04c83d71334fc61c038b4164dfbcedd5e254879f6.scope - libcontainer container 70dd06d2d17b84bde4077fb04c83d71334fc61c038b4164dfbcedd5e254879f6. Aug 13 01:46:51.803818 containerd[1938]: time="2025-08-13T01:46:51.803788290Z" level=info msg="StartContainer for \"70dd06d2d17b84bde4077fb04c83d71334fc61c038b4164dfbcedd5e254879f6\" returns successfully" Aug 13 01:46:51.895418 kubelet[3290]: I0813 01:46:51.895288 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-fd57q" podStartSLOduration=25.968101578 podStartE2EDuration="33.89524967s" podCreationTimestamp="2025-08-13 01:46:18 +0000 UTC" firstStartedPulling="2025-08-13 01:46:43.776802531 +0000 UTC m=+42.154126792" lastFinishedPulling="2025-08-13 01:46:51.703950624 +0000 UTC m=+50.081274884" observedRunningTime="2025-08-13 01:46:51.894966165 +0000 UTC m=+50.272290499" watchObservedRunningTime="2025-08-13 01:46:51.89524967 +0000 UTC m=+50.272573984" Aug 13 01:46:52.710934 kubelet[3290]: I0813 01:46:52.710831 3290 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Aug 13 01:46:52.711263 kubelet[3290]: I0813 01:46:52.710960 3290 
csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Aug 13 01:47:01.979720 kubelet[3290]: I0813 01:47:01.979513 3290 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 13 01:47:06.921492 containerd[1938]: time="2025-08-13T01:47:06.921417117Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1777baaf771cf1ecf00b0f19ed001fb62e80fdc050273cf47f861b18c7d7436e\" id:\"b0617b40cff83f50895da6874f50cf1282d0958edb1985ace47ab5532da87f3f\" pid:6421 exited_at:{seconds:1755049626 nanos:921147294}" Aug 13 01:47:16.887634 containerd[1938]: time="2025-08-13T01:47:16.887609503Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4\" id:\"deabad680522332f3b4c2b3a4fc372767715ad22ad6de369fba6191a4b094a3a\" pid:6466 exited_at:{seconds:1755049636 nanos:887475653}" Aug 13 01:47:20.989707 containerd[1938]: time="2025-08-13T01:47:20.989681116Z" level=info msg="TaskExit event in podsandbox handler container_id:\"faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37\" id:\"1dd5d827eb35d155549bbe8ca2cfc3de19c43be18b3a1f4218681d9016f37c76\" pid:6487 exited_at:{seconds:1755049640 nanos:989489107}" Aug 13 01:47:33.726820 containerd[1938]: time="2025-08-13T01:47:33.726773454Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4\" id:\"699a1d50b339462a6dfd1773b8cf66ced992436d53ff3733b5ad247a65dfaf83\" pid:6526 exited_at:{seconds:1755049653 nanos:726598670}" Aug 13 01:47:36.892473 containerd[1938]: time="2025-08-13T01:47:36.892417799Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1777baaf771cf1ecf00b0f19ed001fb62e80fdc050273cf47f861b18c7d7436e\" id:\"77e24408f84b9ae5e0b8a2e4af45ae5ab7f416ace426c4839d50562e946303cd\" pid:6548 exited_at:{seconds:1755049656 nanos:892240306}" 
Aug 13 01:47:39.745364 containerd[1938]: time="2025-08-13T01:47:39.745339776Z" level=info msg="TaskExit event in podsandbox handler container_id:\"faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37\" id:\"b2093ba76eae8a6c78751f4e8e49c0bb8e3cef20bda91eb98f5b6677d9bde882\" pid:6587 exited_at:{seconds:1755049659 nanos:745070391}" Aug 13 01:47:46.873203 containerd[1938]: time="2025-08-13T01:47:46.873176986Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4\" id:\"a932362a8cc843099f6185729a1b8e3d67a0ee3c00a933affe67602897d97eb1\" pid:6621 exited_at:{seconds:1755049666 nanos:873077161}" Aug 13 01:47:50.917326 containerd[1938]: time="2025-08-13T01:47:50.917284973Z" level=info msg="TaskExit event in podsandbox handler container_id:\"faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37\" id:\"24fc3335306fd3ef0224faef8e4cc1c50dabbb887686e2357e9b989f0d160e09\" pid:6643 exited_at:{seconds:1755049670 nanos:916999178}" Aug 13 01:48:06.902977 containerd[1938]: time="2025-08-13T01:48:06.902931682Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1777baaf771cf1ecf00b0f19ed001fb62e80fdc050273cf47f861b18c7d7436e\" id:\"9ff1618433ddc11f67ac8ed8e1fb5ba324cc25b387bb93c737356bcd2d0692b5\" pid:6686 exited_at:{seconds:1755049686 nanos:902743187}" Aug 13 01:48:16.883001 containerd[1938]: time="2025-08-13T01:48:16.882974362Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4\" id:\"f3c5ccc3b0f1d283519907d6069a7f104d934022ab3ddfe49fbe8b5fc8f7d14b\" pid:6748 exited_at:{seconds:1755049696 nanos:882813841}" Aug 13 01:48:20.974526 containerd[1938]: time="2025-08-13T01:48:20.974500062Z" level=info msg="TaskExit event in podsandbox handler container_id:\"faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37\" 
id:\"e60abb46b960bbc0c1b746b0481f0d71730fcb65d060c47b3033c4ca88b819f4\" pid:6770 exited_at:{seconds:1755049700 nanos:974307842}" Aug 13 01:48:33.719778 containerd[1938]: time="2025-08-13T01:48:33.719715446Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4\" id:\"9dc867a0b9d62064d78b6f98a2e05d705c42a5ed609ec6fba704f5653cbcaf99\" pid:6803 exited_at:{seconds:1755049713 nanos:719559179}" Aug 13 01:48:36.908106 containerd[1938]: time="2025-08-13T01:48:36.908051523Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1777baaf771cf1ecf00b0f19ed001fb62e80fdc050273cf47f861b18c7d7436e\" id:\"693eff885f341513b32c759d7a2947cc9f4d451acd76a0d73f97e53b5f691248\" pid:6824 exited_at:{seconds:1755049716 nanos:907881119}" Aug 13 01:48:39.753001 containerd[1938]: time="2025-08-13T01:48:39.752978327Z" level=info msg="TaskExit event in podsandbox handler container_id:\"faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37\" id:\"3e6b2ee3c5e69b2df453b7a7ebb9bbfcea4e340a1cbe7a65d1b49da736293e32\" pid:6858 exited_at:{seconds:1755049719 nanos:752772670}" Aug 13 01:48:46.873983 containerd[1938]: time="2025-08-13T01:48:46.873953761Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4\" id:\"de98833a3e6e85aa4f818ab75d078a9284308fa9cd8ecaf69c746b1729f4eabe\" pid:6890 exited_at:{seconds:1755049726 nanos:873779854}" Aug 13 01:48:50.985592 containerd[1938]: time="2025-08-13T01:48:50.985564298Z" level=info msg="TaskExit event in podsandbox handler container_id:\"faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37\" id:\"4a6c2c552106c1b35c6ce8deef4ae8eb70f1ba23a4b88004bb31fadfe9a799df\" pid:6912 exited_at:{seconds:1755049730 nanos:985353656}" Aug 13 01:49:06.859767 containerd[1938]: time="2025-08-13T01:49:06.859738922Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"1777baaf771cf1ecf00b0f19ed001fb62e80fdc050273cf47f861b18c7d7436e\" id:\"464d3b66ac7936b93e31552d540d1748b30d60e83927d87452e3c9cee9932722\" pid:6948 exited_at:{seconds:1755049746 nanos:859504142}" Aug 13 01:49:16.882571 containerd[1938]: time="2025-08-13T01:49:16.882547316Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4\" id:\"9f04e4b8f530ad1238e09327a9f5e0f0b318fcdee74ba29739c506e68bf1293d\" pid:6992 exited_at:{seconds:1755049756 nanos:882419644}" Aug 13 01:49:20.930713 containerd[1938]: time="2025-08-13T01:49:20.930689809Z" level=info msg="TaskExit event in podsandbox handler container_id:\"faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37\" id:\"7b478ed428838602b8ccb9d0d589918148ce96be0bb27e55963bd168f9f9e57b\" pid:7014 exited_at:{seconds:1755049760 nanos:930473715}" Aug 13 01:49:33.710230 containerd[1938]: time="2025-08-13T01:49:33.710188790Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4\" id:\"8baf6888c2407b3cce5756faff889ee3306e7d6d50b14d7855bba22715e18011\" pid:7049 exited_at:{seconds:1755049773 nanos:709993900}" Aug 13 01:49:36.861416 containerd[1938]: time="2025-08-13T01:49:36.861340831Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1777baaf771cf1ecf00b0f19ed001fb62e80fdc050273cf47f861b18c7d7436e\" id:\"c1377afb394f154a5865d1d57c39da728fdfb8400708d7b2b16acf881426a9dd\" pid:7072 exited_at:{seconds:1755049776 nanos:861114979}" Aug 13 01:49:39.756456 containerd[1938]: time="2025-08-13T01:49:39.756431454Z" level=info msg="TaskExit event in podsandbox handler container_id:\"faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37\" id:\"1d9f407d5e18be28c04fb145c29984e34f086abe7f0f73c25bf9acffef4a1d94\" pid:7108 exited_at:{seconds:1755049779 nanos:756110845}" Aug 13 01:49:46.874416 containerd[1938]: 
time="2025-08-13T01:49:46.874393927Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4\" id:\"72605efff31dbea66fab759bf1c9cf01b5324504ddcd6f004ceee8335387fd22\" pid:7155 exited_at:{seconds:1755049786 nanos:874308105}" Aug 13 01:49:50.933706 containerd[1938]: time="2025-08-13T01:49:50.933677937Z" level=info msg="TaskExit event in podsandbox handler container_id:\"faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37\" id:\"b36d73c8a1eef7e6690e848f84597d654014bd14ac603596247b310a38b243e0\" pid:7183 exited_at:{seconds:1755049790 nanos:933482382}" Aug 13 01:50:06.847545 containerd[1938]: time="2025-08-13T01:50:06.847494431Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1777baaf771cf1ecf00b0f19ed001fb62e80fdc050273cf47f861b18c7d7436e\" id:\"0e2d23da23ab277762f6e5779c2e9b0058321e751e28cf2b1aa313f2b8b02857\" pid:7217 exited_at:{seconds:1755049806 nanos:847212230}" Aug 13 01:50:16.874235 containerd[1938]: time="2025-08-13T01:50:16.874203723Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4\" id:\"6aabca0d8aebc569cd4209a63368d659bf14e65e56f8fb9c94c55266ed352ef6\" pid:7253 exited_at:{seconds:1755049816 nanos:874066532}" Aug 13 01:50:20.998398 containerd[1938]: time="2025-08-13T01:50:20.998336218Z" level=info msg="TaskExit event in podsandbox handler container_id:\"faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37\" id:\"ecf2799cdd35546a01d10576899eb0b0235ff3d9c381a7c6c0a971bd08b7e9a5\" pid:7275 exited_at:{seconds:1755049820 nanos:998157326}" Aug 13 01:50:33.706411 containerd[1938]: time="2025-08-13T01:50:33.706387899Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4\" id:\"eee890be6e92a511ddae122c1a7d925468424cd776523be75c92a4d391ba20c8\" pid:7310 exited_at:{seconds:1755049833 
nanos:706258295}" Aug 13 01:50:36.902041 containerd[1938]: time="2025-08-13T01:50:36.901982242Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1777baaf771cf1ecf00b0f19ed001fb62e80fdc050273cf47f861b18c7d7436e\" id:\"13e2a0869bc0fb905330fc3fa9cd908d2c19ea2f832a2783ebe057c88ddf033a\" pid:7331 exited_at:{seconds:1755049836 nanos:901744505}" Aug 13 01:50:39.769193 containerd[1938]: time="2025-08-13T01:50:39.769164994Z" level=info msg="TaskExit event in podsandbox handler container_id:\"faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37\" id:\"ac1e624450daee7c843e3df9d9ab1d9a26d27027c4d0691507ff2f53dc652645\" pid:7369 exited_at:{seconds:1755049839 nanos:768978436}" Aug 13 01:50:46.874222 containerd[1938]: time="2025-08-13T01:50:46.874200612Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4\" id:\"29019aec08e35579a4b1dc9d8dd9ab7cd53185b087bf6fec6cc1eb04cfabc486\" pid:7402 exited_at:{seconds:1755049846 nanos:874095491}" Aug 13 01:50:50.948124 containerd[1938]: time="2025-08-13T01:50:50.948093285Z" level=info msg="TaskExit event in podsandbox handler container_id:\"faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37\" id:\"f6bdf5493337b71c37ee55b4b6df4d74bc8066fc74415d1b435e1821ef9a08f1\" pid:7427 exited_at:{seconds:1755049850 nanos:947888056}" Aug 13 01:50:58.279362 containerd[1938]: time="2025-08-13T01:50:58.279236514Z" level=warning msg="container event discarded" container=64ff6af9143ee2e106db5438c5cd33855ca0e01b02150572c59b7f5f35bb1cb8 type=CONTAINER_CREATED_EVENT Aug 13 01:50:58.279362 containerd[1938]: time="2025-08-13T01:50:58.279314069Z" level=warning msg="container event discarded" container=64ff6af9143ee2e106db5438c5cd33855ca0e01b02150572c59b7f5f35bb1cb8 type=CONTAINER_STARTED_EVENT Aug 13 01:50:58.310921 containerd[1938]: time="2025-08-13T01:50:58.310755757Z" level=warning msg="container event discarded" 
container=1f0f3e5aa67a7ccf8fe204023802ee2664dc783ef145035f70cd84a082e3a1ce type=CONTAINER_CREATED_EVENT Aug 13 01:50:58.310921 containerd[1938]: time="2025-08-13T01:50:58.310839529Z" level=warning msg="container event discarded" container=1f0f3e5aa67a7ccf8fe204023802ee2664dc783ef145035f70cd84a082e3a1ce type=CONTAINER_STARTED_EVENT Aug 13 01:50:58.310921 containerd[1938]: time="2025-08-13T01:50:58.310908613Z" level=warning msg="container event discarded" container=c6cad9ef8f4e993785c2c5cfde03dd888fa5d3d25c41d00896c72b236a33feb9 type=CONTAINER_CREATED_EVENT Aug 13 01:50:58.311389 containerd[1938]: time="2025-08-13T01:50:58.310949445Z" level=warning msg="container event discarded" container=c6cad9ef8f4e993785c2c5cfde03dd888fa5d3d25c41d00896c72b236a33feb9 type=CONTAINER_STARTED_EVENT Aug 13 01:50:58.311389 containerd[1938]: time="2025-08-13T01:50:58.310972029Z" level=warning msg="container event discarded" container=aab4af456155b502395cc8f20fb7f2c46bebf5e15062561ec4a941d3b153e25e type=CONTAINER_CREATED_EVENT Aug 13 01:50:58.311389 containerd[1938]: time="2025-08-13T01:50:58.310992612Z" level=warning msg="container event discarded" container=e5a999bfdeb08c7e882a677fbd4a4534e9078dbf8c776ed96737accd15b59002 type=CONTAINER_CREATED_EVENT Aug 13 01:50:58.311389 containerd[1938]: time="2025-08-13T01:50:58.311019632Z" level=warning msg="container event discarded" container=ddf6e6fab286b061b4ef76c293aa6383466b2117da2d8352970ad3acac319915 type=CONTAINER_CREATED_EVENT Aug 13 01:50:58.370636 containerd[1938]: time="2025-08-13T01:50:58.370472330Z" level=warning msg="container event discarded" container=aab4af456155b502395cc8f20fb7f2c46bebf5e15062561ec4a941d3b153e25e type=CONTAINER_STARTED_EVENT Aug 13 01:50:58.370636 containerd[1938]: time="2025-08-13T01:50:58.370616486Z" level=warning msg="container event discarded" container=ddf6e6fab286b061b4ef76c293aa6383466b2117da2d8352970ad3acac319915 type=CONTAINER_STARTED_EVENT Aug 13 01:50:58.370636 containerd[1938]: 
time="2025-08-13T01:50:58.370654005Z" level=warning msg="container event discarded" container=e5a999bfdeb08c7e882a677fbd4a4534e9078dbf8c776ed96737accd15b59002 type=CONTAINER_STARTED_EVENT Aug 13 01:51:06.855945 containerd[1938]: time="2025-08-13T01:51:06.855917558Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1777baaf771cf1ecf00b0f19ed001fb62e80fdc050273cf47f861b18c7d7436e\" id:\"4af3b4d20ad99fab1dee8392c1e9b77d1c81f2a30f15b70939ade378967330d2\" pid:7465 exited_at:{seconds:1755049866 nanos:855441882}" Aug 13 01:51:07.658187 containerd[1938]: time="2025-08-13T01:51:07.658021612Z" level=warning msg="container event discarded" container=7859634f416d68d09c80c2970b82474d3f59f643048c47afe27b6c314a9ce8bb type=CONTAINER_CREATED_EVENT Aug 13 01:51:07.658187 containerd[1938]: time="2025-08-13T01:51:07.658161235Z" level=warning msg="container event discarded" container=7859634f416d68d09c80c2970b82474d3f59f643048c47afe27b6c314a9ce8bb type=CONTAINER_STARTED_EVENT Aug 13 01:51:07.658187 containerd[1938]: time="2025-08-13T01:51:07.658194316Z" level=warning msg="container event discarded" container=b002d54badd9b7ce06e0eefd79f028d19247dde5990c253dad250289b101c8f1 type=CONTAINER_CREATED_EVENT Aug 13 01:51:07.705580 containerd[1938]: time="2025-08-13T01:51:07.705465049Z" level=warning msg="container event discarded" container=b002d54badd9b7ce06e0eefd79f028d19247dde5990c253dad250289b101c8f1 type=CONTAINER_STARTED_EVENT Aug 13 01:51:08.048928 containerd[1938]: time="2025-08-13T01:51:08.048626029Z" level=warning msg="container event discarded" container=89e62afa32a88a86b8db0e74301c090e82bfd266b47b02a4c26e803e803a2286 type=CONTAINER_CREATED_EVENT Aug 13 01:51:08.048928 containerd[1938]: time="2025-08-13T01:51:08.048739663Z" level=warning msg="container event discarded" container=89e62afa32a88a86b8db0e74301c090e82bfd266b47b02a4c26e803e803a2286 type=CONTAINER_STARTED_EVENT Aug 13 01:51:10.077909 containerd[1938]: time="2025-08-13T01:51:10.077722722Z" level=warning 
msg="container event discarded" container=4a45d2c80b92f285dd5a843fd9d6cdd12fb3078c664fef6fa321892471dfbaad type=CONTAINER_CREATED_EVENT Aug 13 01:51:10.106332 containerd[1938]: time="2025-08-13T01:51:10.106162112Z" level=warning msg="container event discarded" container=4a45d2c80b92f285dd5a843fd9d6cdd12fb3078c664fef6fa321892471dfbaad type=CONTAINER_STARTED_EVENT Aug 13 01:51:16.879806 containerd[1938]: time="2025-08-13T01:51:16.879751767Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4\" id:\"c8156b2f4fff71baf67077c3da663c5dd94c9a262caabdc664f9248746f50bc0\" pid:7504 exited_at:{seconds:1755049876 nanos:879585015}" Aug 13 01:51:18.686995 containerd[1938]: time="2025-08-13T01:51:18.686816944Z" level=warning msg="container event discarded" container=8bc358a2dbe157e3ee8ed4e176274353d021f349957d0816d998280039a83973 type=CONTAINER_CREATED_EVENT Aug 13 01:51:18.686995 containerd[1938]: time="2025-08-13T01:51:18.686960590Z" level=warning msg="container event discarded" container=8bc358a2dbe157e3ee8ed4e176274353d021f349957d0816d998280039a83973 type=CONTAINER_STARTED_EVENT Aug 13 01:51:19.077051 containerd[1938]: time="2025-08-13T01:51:19.076759987Z" level=warning msg="container event discarded" container=9360514811ec3057aaabdb8bc4c74261d7b00940d25584a389ec70dd388ede4e type=CONTAINER_CREATED_EVENT Aug 13 01:51:19.077051 containerd[1938]: time="2025-08-13T01:51:19.076850994Z" level=warning msg="container event discarded" container=9360514811ec3057aaabdb8bc4c74261d7b00940d25584a389ec70dd388ede4e type=CONTAINER_STARTED_EVENT Aug 13 01:51:20.933693 containerd[1938]: time="2025-08-13T01:51:20.933668770Z" level=info msg="TaskExit event in podsandbox handler container_id:\"faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37\" id:\"acdb950fbea6aeff10b131fef839cdc2ed8371443c0a6acdc5737b2976fd3b4e\" pid:7532 exited_at:{seconds:1755049880 nanos:933488497}" Aug 13 01:51:20.944592 
containerd[1938]: time="2025-08-13T01:51:20.944538442Z" level=warning msg="container event discarded" container=43d7b08538c2fb9babdca744c8acf8d9f63530f1dc197c674c1418b7f27a0e74 type=CONTAINER_CREATED_EVENT Aug 13 01:51:20.986922 containerd[1938]: time="2025-08-13T01:51:20.986705573Z" level=warning msg="container event discarded" container=43d7b08538c2fb9babdca744c8acf8d9f63530f1dc197c674c1418b7f27a0e74 type=CONTAINER_STARTED_EVENT Aug 13 01:51:22.479296 containerd[1938]: time="2025-08-13T01:51:22.479122344Z" level=warning msg="container event discarded" container=ff8bdda17377b74d3e12f4e497b9fca2abd23e4bc39d2e5a37a1b8cf0a1be28d type=CONTAINER_CREATED_EVENT Aug 13 01:51:22.535007 containerd[1938]: time="2025-08-13T01:51:22.534898887Z" level=warning msg="container event discarded" container=ff8bdda17377b74d3e12f4e497b9fca2abd23e4bc39d2e5a37a1b8cf0a1be28d type=CONTAINER_STARTED_EVENT Aug 13 01:51:23.459560 containerd[1938]: time="2025-08-13T01:51:23.459409480Z" level=warning msg="container event discarded" container=ff8bdda17377b74d3e12f4e497b9fca2abd23e4bc39d2e5a37a1b8cf0a1be28d type=CONTAINER_STOPPED_EVENT Aug 13 01:51:26.913771 containerd[1938]: time="2025-08-13T01:51:26.913649674Z" level=warning msg="container event discarded" container=51f79ee67dad551521478cecf82f463cd47474488209fe040730bad1f0706aaf type=CONTAINER_CREATED_EVENT Aug 13 01:51:26.948289 containerd[1938]: time="2025-08-13T01:51:26.948116523Z" level=warning msg="container event discarded" container=51f79ee67dad551521478cecf82f463cd47474488209fe040730bad1f0706aaf type=CONTAINER_STARTED_EVENT Aug 13 01:51:27.906127 containerd[1938]: time="2025-08-13T01:51:27.905973219Z" level=warning msg="container event discarded" container=51f79ee67dad551521478cecf82f463cd47474488209fe040730bad1f0706aaf type=CONTAINER_STOPPED_EVENT Aug 13 01:51:33.715285 containerd[1938]: time="2025-08-13T01:51:33.715257817Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4\" id:\"96d2d84aae4e02110b6393ea423c2d9246878b816c8f79c6a29deb97999c72ef\" pid:7582 exited_at:{seconds:1755049893 nanos:715092954}" Aug 13 01:51:34.001955 containerd[1938]: time="2025-08-13T01:51:34.001593041Z" level=warning msg="container event discarded" container=1777baaf771cf1ecf00b0f19ed001fb62e80fdc050273cf47f861b18c7d7436e type=CONTAINER_CREATED_EVENT Aug 13 01:51:34.055315 containerd[1938]: time="2025-08-13T01:51:34.055199761Z" level=warning msg="container event discarded" container=1777baaf771cf1ecf00b0f19ed001fb62e80fdc050273cf47f861b18c7d7436e type=CONTAINER_STARTED_EVENT Aug 13 01:51:35.323397 containerd[1938]: time="2025-08-13T01:51:35.323259634Z" level=warning msg="container event discarded" container=3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330 type=CONTAINER_CREATED_EVENT Aug 13 01:51:35.323397 containerd[1938]: time="2025-08-13T01:51:35.323373602Z" level=warning msg="container event discarded" container=3f71eaa1e411ae12ab8962522d0b872fa135ac1b143bec001d6a493ca624a330 type=CONTAINER_STARTED_EVENT Aug 13 01:51:36.858582 containerd[1938]: time="2025-08-13T01:51:36.858553385Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1777baaf771cf1ecf00b0f19ed001fb62e80fdc050273cf47f861b18c7d7436e\" id:\"9e699c0100ec4c94dee4e112e3c8e1408700e726fc3444660ee559fd394cbffa\" pid:7604 exited_at:{seconds:1755049896 nanos:858334857}" Aug 13 01:51:36.990075 containerd[1938]: time="2025-08-13T01:51:36.990046288Z" level=warning msg="container event discarded" container=d0f392e5243f2245e0c034e3885584dd74266edc5ac71cdac1cb0c8f561acf73 type=CONTAINER_CREATED_EVENT Aug 13 01:51:37.079664 containerd[1938]: time="2025-08-13T01:51:37.079522711Z" level=warning msg="container event discarded" container=d0f392e5243f2245e0c034e3885584dd74266edc5ac71cdac1cb0c8f561acf73 type=CONTAINER_STARTED_EVENT Aug 13 01:51:38.804906 containerd[1938]: 
time="2025-08-13T01:51:38.804699662Z" level=warning msg="container event discarded" container=3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005 type=CONTAINER_CREATED_EVENT Aug 13 01:51:38.804906 containerd[1938]: time="2025-08-13T01:51:38.804813743Z" level=warning msg="container event discarded" container=3221fea2844d8966f34ff103fdbfaa40ccd57e014bf96ae67323be59687d6005 type=CONTAINER_STARTED_EVENT Aug 13 01:51:39.771758 containerd[1938]: time="2025-08-13T01:51:39.771723634Z" level=warning msg="container event discarded" container=f42758ee2e8b95398e595efd2fc981f88ebb01fe0ee9cd8618588777698b265c type=CONTAINER_CREATED_EVENT Aug 13 01:51:39.783632 containerd[1938]: time="2025-08-13T01:51:39.783600777Z" level=info msg="TaskExit event in podsandbox handler container_id:\"faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37\" id:\"cd1991528caf251a89118cd17ac1642b5fad69b19813c3a260b249c8b6c6cee0\" pid:7640 exited_at:{seconds:1755049899 nanos:783398158}" Aug 13 01:51:39.805357 containerd[1938]: time="2025-08-13T01:51:39.805301759Z" level=warning msg="container event discarded" container=3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0 type=CONTAINER_CREATED_EVENT Aug 13 01:51:39.805357 containerd[1938]: time="2025-08-13T01:51:39.805315873Z" level=warning msg="container event discarded" container=3a32616a3abb50de405fa5c6411414c147254cda1eecd01bbe42684d63a29ac0 type=CONTAINER_STARTED_EVENT Aug 13 01:51:39.805357 containerd[1938]: time="2025-08-13T01:51:39.805320639Z" level=warning msg="container event discarded" container=f42758ee2e8b95398e595efd2fc981f88ebb01fe0ee9cd8618588777698b265c type=CONTAINER_STARTED_EVENT Aug 13 01:51:39.805357 containerd[1938]: time="2025-08-13T01:51:39.805324431Z" level=warning msg="container event discarded" container=d2e0932ba7283bab5e37017f4a036c925049eefb1d65283bfd80f2e52a3e5812 type=CONTAINER_CREATED_EVENT Aug 13 01:51:39.839437 containerd[1938]: time="2025-08-13T01:51:39.839259596Z" level=warning 
msg="container event discarded" container=d2e0932ba7283bab5e37017f4a036c925049eefb1d65283bfd80f2e52a3e5812 type=CONTAINER_STARTED_EVENT
Aug 13 01:51:40.815537 containerd[1938]: time="2025-08-13T01:51:40.815400031Z" level=warning msg="container event discarded" container=15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71 type=CONTAINER_CREATED_EVENT
Aug 13 01:51:40.815537 containerd[1938]: time="2025-08-13T01:51:40.815478437Z" level=warning msg="container event discarded" container=15b8e3cbd29f1d23a0d23f1d50b1ba4de17639aba39d5315361b76ac4abe1a71 type=CONTAINER_STARTED_EVENT
Aug 13 01:51:40.897143 containerd[1938]: time="2025-08-13T01:51:40.896983510Z" level=warning msg="container event discarded" container=dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895 type=CONTAINER_CREATED_EVENT
Aug 13 01:51:40.897143 containerd[1938]: time="2025-08-13T01:51:40.897079897Z" level=warning msg="container event discarded" container=dfb01fb6680cd972330b914c46931519170d6f285c6b5dc5bd7621d764046895 type=CONTAINER_STARTED_EVENT
Aug 13 01:51:42.393637 containerd[1938]: time="2025-08-13T01:51:42.393418304Z" level=warning msg="container event discarded" container=2bcc2e126dc8f1297bcec02870fa8ea3ec4973d26587abfcc57461b7d01e65a3 type=CONTAINER_CREATED_EVENT
Aug 13 01:51:42.445162 containerd[1938]: time="2025-08-13T01:51:42.444978522Z" level=warning msg="container event discarded" container=2bcc2e126dc8f1297bcec02870fa8ea3ec4973d26587abfcc57461b7d01e65a3 type=CONTAINER_STARTED_EVENT
Aug 13 01:51:42.793947 containerd[1938]: time="2025-08-13T01:51:42.793636078Z" level=warning msg="container event discarded" container=0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15 type=CONTAINER_CREATED_EVENT
Aug 13 01:51:42.793947 containerd[1938]: time="2025-08-13T01:51:42.793727699Z" level=warning msg="container event discarded" container=0cec6820f42632b453a3b1fc2620c9975e5e194d99dba23b295def59ae6fee15 type=CONTAINER_STARTED_EVENT
Aug 13 01:51:42.793947 containerd[1938]: time="2025-08-13T01:51:42.793772478Z" level=warning msg="container event discarded" container=393f7acbdd831c7ec35b8ecdd8bc4687aa654dca440f325a124238205eb8aef3 type=CONTAINER_CREATED_EVENT
Aug 13 01:51:42.793947 containerd[1938]: time="2025-08-13T01:51:42.793813489Z" level=warning msg="container event discarded" container=516d22b09f3851a1de701f26926b2579393a85e7c216d7cf429a28b0f542cdc2 type=CONTAINER_CREATED_EVENT
Aug 13 01:51:42.833132 containerd[1938]: time="2025-08-13T01:51:42.832962402Z" level=warning msg="container event discarded" container=516d22b09f3851a1de701f26926b2579393a85e7c216d7cf429a28b0f542cdc2 type=CONTAINER_STARTED_EVENT
Aug 13 01:51:42.846524 containerd[1938]: time="2025-08-13T01:51:42.846385584Z" level=warning msg="container event discarded" container=393f7acbdd831c7ec35b8ecdd8bc4687aa654dca440f325a124238205eb8aef3 type=CONTAINER_STARTED_EVENT
Aug 13 01:51:43.786483 containerd[1938]: time="2025-08-13T01:51:43.786321856Z" level=warning msg="container event discarded" container=a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c type=CONTAINER_CREATED_EVENT
Aug 13 01:51:43.786483 containerd[1938]: time="2025-08-13T01:51:43.786418353Z" level=warning msg="container event discarded" container=a094e427f4e825a42889be2198c95c382022908447a3d3fe4ad8c516cc39ff7c type=CONTAINER_STARTED_EVENT
Aug 13 01:51:43.926798 containerd[1938]: time="2025-08-13T01:51:43.926641569Z" level=warning msg="container event discarded" container=bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa type=CONTAINER_CREATED_EVENT
Aug 13 01:51:43.926798 containerd[1938]: time="2025-08-13T01:51:43.926733619Z" level=warning msg="container event discarded" container=bb8ae3001c80c9e3fde5cc8a91dfb0e602eaef05f938bb9b23da5dfb06bed2fa type=CONTAINER_STARTED_EVENT
Aug 13 01:51:45.462723 containerd[1938]: time="2025-08-13T01:51:45.462541943Z" level=warning msg="container event discarded" container=ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4 type=CONTAINER_CREATED_EVENT
Aug 13 01:51:45.505161 containerd[1938]: time="2025-08-13T01:51:45.505004669Z" level=warning msg="container event discarded" container=ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4 type=CONTAINER_STARTED_EVENT
Aug 13 01:51:46.880595 containerd[1938]: time="2025-08-13T01:51:46.880574031Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4\" id:\"30298fe201ce8d9354a12203970b40733897c67fd681a06f2b0527cfbc773a49\" pid:7674 exited_at:{seconds:1755049906 nanos:880456275}"
Aug 13 01:51:47.114030 containerd[1938]: time="2025-08-13T01:51:47.113908634Z" level=warning msg="container event discarded" container=738c311c0963c37eba56b873c652beb7d95194a8f1ae9b9ae4278e079dac6c37 type=CONTAINER_CREATED_EVENT
Aug 13 01:51:47.153693 containerd[1938]: time="2025-08-13T01:51:47.153417990Z" level=warning msg="container event discarded" container=738c311c0963c37eba56b873c652beb7d95194a8f1ae9b9ae4278e079dac6c37 type=CONTAINER_STARTED_EVENT
Aug 13 01:51:49.791268 containerd[1938]: time="2025-08-13T01:51:49.791108242Z" level=warning msg="container event discarded" container=faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37 type=CONTAINER_CREATED_EVENT
Aug 13 01:51:49.836641 containerd[1938]: time="2025-08-13T01:51:49.836484279Z" level=warning msg="container event discarded" container=faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37 type=CONTAINER_STARTED_EVENT
Aug 13 01:51:50.928481 containerd[1938]: time="2025-08-13T01:51:50.928457569Z" level=info msg="TaskExit event in podsandbox handler container_id:\"faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37\" id:\"b66ff83b46d097014631a403adaf6618e92747255751566dfbe447882198b691\" pid:7697 exited_at:{seconds:1755049910 nanos:928284382}"
Aug 13 01:51:51.721049 containerd[1938]: time="2025-08-13T01:51:51.720929280Z" level=warning msg="container event discarded" container=70dd06d2d17b84bde4077fb04c83d71334fc61c038b4164dfbcedd5e254879f6 type=CONTAINER_CREATED_EVENT
Aug 13 01:51:51.813646 containerd[1938]: time="2025-08-13T01:51:51.813505723Z" level=warning msg="container event discarded" container=70dd06d2d17b84bde4077fb04c83d71334fc61c038b4164dfbcedd5e254879f6 type=CONTAINER_STARTED_EVENT
Aug 13 01:52:06.855369 containerd[1938]: time="2025-08-13T01:52:06.855340523Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1777baaf771cf1ecf00b0f19ed001fb62e80fdc050273cf47f861b18c7d7436e\" id:\"b1799bbfc57e1aaa6ad8f4b9076a1bb577df33b4a8509b58aa63f83734459c54\" pid:7740 exited_at:{seconds:1755049926 nanos:855076620}"
Aug 13 01:52:16.926835 containerd[1938]: time="2025-08-13T01:52:16.926806053Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4\" id:\"b9b022398f3c33c269ab73c28befbbe57c2c4b0031694b19603f6b2b3566564b\" pid:7779 exited_at:{seconds:1755049936 nanos:926650180}"
Aug 13 01:52:19.897441 systemd[1]: Started sshd@9-147.75.71.211:22-139.178.89.65:36500.service - OpenSSH per-connection server daemon (139.178.89.65:36500).
Aug 13 01:52:19.961687 sshd[7793]: Accepted publickey for core from 139.178.89.65 port 36500 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 01:52:19.963440 sshd-session[7793]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 01:52:19.970369 systemd-logind[1926]: New session 12 of user core.
Aug 13 01:52:19.988115 systemd[1]: Started session-12.scope - Session 12 of User core.
Aug 13 01:52:20.085709 sshd[7795]: Connection closed by 139.178.89.65 port 36500
Aug 13 01:52:20.085898 sshd-session[7793]: pam_unix(sshd:session): session closed for user core
Aug 13 01:52:20.087850 systemd[1]: sshd@9-147.75.71.211:22-139.178.89.65:36500.service: Deactivated successfully.
Aug 13 01:52:20.088959 systemd[1]: session-12.scope: Deactivated successfully.
Aug 13 01:52:20.089780 systemd-logind[1926]: Session 12 logged out. Waiting for processes to exit.
Aug 13 01:52:20.090667 systemd-logind[1926]: Removed session 12.
Aug 13 01:52:20.937820 containerd[1938]: time="2025-08-13T01:52:20.937794247Z" level=info msg="TaskExit event in podsandbox handler container_id:\"faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37\" id:\"82b3fb13e7c60dba31405682ab808aeb2f748b4c44759f3d4b5260372c626925\" pid:7837 exited_at:{seconds:1755049940 nanos:937567086}"
Aug 13 01:52:25.110055 systemd[1]: Started sshd@10-147.75.71.211:22-139.178.89.65:36516.service - OpenSSH per-connection server daemon (139.178.89.65:36516).
Aug 13 01:52:25.152755 sshd[7858]: Accepted publickey for core from 139.178.89.65 port 36516 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 01:52:25.154091 sshd-session[7858]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 01:52:25.159037 systemd-logind[1926]: New session 13 of user core.
Aug 13 01:52:25.170254 systemd[1]: Started session-13.scope - Session 13 of User core.
Aug 13 01:52:25.293038 sshd[7860]: Connection closed by 139.178.89.65 port 36516
Aug 13 01:52:25.293205 sshd-session[7858]: pam_unix(sshd:session): session closed for user core
Aug 13 01:52:25.295029 systemd[1]: sshd@10-147.75.71.211:22-139.178.89.65:36516.service: Deactivated successfully.
Aug 13 01:52:25.296084 systemd[1]: session-13.scope: Deactivated successfully.
Aug 13 01:52:25.297113 systemd-logind[1926]: Session 13 logged out. Waiting for processes to exit.
Aug 13 01:52:25.297810 systemd-logind[1926]: Removed session 13.
Aug 13 01:52:30.316912 systemd[1]: Started sshd@11-147.75.71.211:22-139.178.89.65:33492.service - OpenSSH per-connection server daemon (139.178.89.65:33492).
Aug 13 01:52:30.405224 sshd[7886]: Accepted publickey for core from 139.178.89.65 port 33492 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 01:52:30.408990 sshd-session[7886]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 01:52:30.412607 systemd-logind[1926]: New session 14 of user core.
Aug 13 01:52:30.426116 systemd[1]: Started session-14.scope - Session 14 of User core.
Aug 13 01:52:30.518638 sshd[7888]: Connection closed by 139.178.89.65 port 33492
Aug 13 01:52:30.518831 sshd-session[7886]: pam_unix(sshd:session): session closed for user core
Aug 13 01:52:30.533195 systemd[1]: sshd@11-147.75.71.211:22-139.178.89.65:33492.service: Deactivated successfully.
Aug 13 01:52:30.534160 systemd[1]: session-14.scope: Deactivated successfully.
Aug 13 01:52:30.534705 systemd-logind[1926]: Session 14 logged out. Waiting for processes to exit.
Aug 13 01:52:30.536264 systemd[1]: Started sshd@12-147.75.71.211:22-139.178.89.65:33508.service - OpenSSH per-connection server daemon (139.178.89.65:33508).
Aug 13 01:52:30.536665 systemd-logind[1926]: Removed session 14.
Aug 13 01:52:30.568988 sshd[7913]: Accepted publickey for core from 139.178.89.65 port 33508 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 01:52:30.572315 sshd-session[7913]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 01:52:30.585945 systemd-logind[1926]: New session 15 of user core.
Aug 13 01:52:30.602295 systemd[1]: Started session-15.scope - Session 15 of User core.
Aug 13 01:52:30.768301 sshd[7915]: Connection closed by 139.178.89.65 port 33508
Aug 13 01:52:30.768517 sshd-session[7913]: pam_unix(sshd:session): session closed for user core
Aug 13 01:52:30.781374 systemd[1]: sshd@12-147.75.71.211:22-139.178.89.65:33508.service: Deactivated successfully.
Aug 13 01:52:30.782378 systemd[1]: session-15.scope: Deactivated successfully.
Aug 13 01:52:30.782813 systemd-logind[1926]: Session 15 logged out. Waiting for processes to exit.
Aug 13 01:52:30.784316 systemd[1]: Started sshd@13-147.75.71.211:22-139.178.89.65:33510.service - OpenSSH per-connection server daemon (139.178.89.65:33510).
Aug 13 01:52:30.784698 systemd-logind[1926]: Removed session 15.
Aug 13 01:52:30.814032 sshd[7938]: Accepted publickey for core from 139.178.89.65 port 33510 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 01:52:30.814713 sshd-session[7938]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 01:52:30.817782 systemd-logind[1926]: New session 16 of user core.
Aug 13 01:52:30.832124 systemd[1]: Started session-16.scope - Session 16 of User core.
Aug 13 01:52:30.978188 sshd[7940]: Connection closed by 139.178.89.65 port 33510
Aug 13 01:52:30.978401 sshd-session[7938]: pam_unix(sshd:session): session closed for user core
Aug 13 01:52:30.980348 systemd[1]: sshd@13-147.75.71.211:22-139.178.89.65:33510.service: Deactivated successfully.
Aug 13 01:52:30.981337 systemd[1]: session-16.scope: Deactivated successfully.
Aug 13 01:52:30.981751 systemd-logind[1926]: Session 16 logged out. Waiting for processes to exit.
Aug 13 01:52:30.982399 systemd-logind[1926]: Removed session 16.
Aug 13 01:52:33.711004 containerd[1938]: time="2025-08-13T01:52:33.710980028Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4\" id:\"0943ec5e02c974ba5902492965437521cfc45eb6867ad16eb1c028347998b632\" pid:7976 exited_at:{seconds:1755049953 nanos:710852788}"
Aug 13 01:52:35.993835 systemd[1]: Started sshd@14-147.75.71.211:22-139.178.89.65:33522.service - OpenSSH per-connection server daemon (139.178.89.65:33522).
Aug 13 01:52:36.026978 sshd[7992]: Accepted publickey for core from 139.178.89.65 port 33522 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 01:52:36.027935 sshd-session[7992]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 01:52:36.031699 systemd-logind[1926]: New session 17 of user core.
Aug 13 01:52:36.040112 systemd[1]: Started session-17.scope - Session 17 of User core.
Aug 13 01:52:36.130840 sshd[7994]: Connection closed by 139.178.89.65 port 33522
Aug 13 01:52:36.131036 sshd-session[7992]: pam_unix(sshd:session): session closed for user core
Aug 13 01:52:36.132810 systemd[1]: sshd@14-147.75.71.211:22-139.178.89.65:33522.service: Deactivated successfully.
Aug 13 01:52:36.133845 systemd[1]: session-17.scope: Deactivated successfully.
Aug 13 01:52:36.134595 systemd-logind[1926]: Session 17 logged out. Waiting for processes to exit.
Aug 13 01:52:36.135257 systemd-logind[1926]: Removed session 17.
Aug 13 01:52:36.907222 containerd[1938]: time="2025-08-13T01:52:36.907190659Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1777baaf771cf1ecf00b0f19ed001fb62e80fdc050273cf47f861b18c7d7436e\" id:\"6f07d2c38d730d221fb30a826885f72a098f6ee351d67a8ddd2e51b67379723c\" pid:8026 exit_status:1 exited_at:{seconds:1755049956 nanos:907002208}"
Aug 13 01:52:39.759895 containerd[1938]: time="2025-08-13T01:52:39.759869993Z" level=info msg="TaskExit event in podsandbox handler container_id:\"faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37\" id:\"d5011aac504a71a34abd261ada23cdb16cafee5496b020ac4bd638bb23ada3a0\" pid:8064 exited_at:{seconds:1755049959 nanos:759710078}"
Aug 13 01:52:41.153818 systemd[1]: Started sshd@15-147.75.71.211:22-139.178.89.65:37230.service - OpenSSH per-connection server daemon (139.178.89.65:37230).
Aug 13 01:52:41.197830 sshd[8086]: Accepted publickey for core from 139.178.89.65 port 37230 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 01:52:41.198562 sshd-session[8086]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 01:52:41.201900 systemd-logind[1926]: New session 18 of user core.
Aug 13 01:52:41.224059 systemd[1]: Started session-18.scope - Session 18 of User core.
Aug 13 01:52:41.312613 sshd[8088]: Connection closed by 139.178.89.65 port 37230
Aug 13 01:52:41.312804 sshd-session[8086]: pam_unix(sshd:session): session closed for user core
Aug 13 01:52:41.314716 systemd[1]: sshd@15-147.75.71.211:22-139.178.89.65:37230.service: Deactivated successfully.
Aug 13 01:52:41.315794 systemd[1]: session-18.scope: Deactivated successfully.
Aug 13 01:52:41.316574 systemd-logind[1926]: Session 18 logged out. Waiting for processes to exit.
Aug 13 01:52:41.317229 systemd-logind[1926]: Removed session 18.
Aug 13 01:52:46.340148 systemd[1]: Started sshd@16-147.75.71.211:22-139.178.89.65:37236.service - OpenSSH per-connection server daemon (139.178.89.65:37236).
Aug 13 01:52:46.382855 sshd[8114]: Accepted publickey for core from 139.178.89.65 port 37236 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 01:52:46.383594 sshd-session[8114]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 01:52:46.386996 systemd-logind[1926]: New session 19 of user core.
Aug 13 01:52:46.400044 systemd[1]: Started session-19.scope - Session 19 of User core.
Aug 13 01:52:46.488653 sshd[8116]: Connection closed by 139.178.89.65 port 37236
Aug 13 01:52:46.488848 sshd-session[8114]: pam_unix(sshd:session): session closed for user core
Aug 13 01:52:46.490799 systemd[1]: sshd@16-147.75.71.211:22-139.178.89.65:37236.service: Deactivated successfully.
Aug 13 01:52:46.491900 systemd[1]: session-19.scope: Deactivated successfully.
Aug 13 01:52:46.492635 systemd-logind[1926]: Session 19 logged out. Waiting for processes to exit.
Aug 13 01:52:46.493337 systemd-logind[1926]: Removed session 19.
Aug 13 01:52:46.892610 containerd[1938]: time="2025-08-13T01:52:46.892551900Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ccab189ddf2fd07bfcc0fc9f832a09af4495687414feec07172aee54331d4eb4\" id:\"c34de7bb5970be45f0894ca675a6f5e78afc6829b622e85441651d1f78901223\" pid:8151 exited_at:{seconds:1755049966 nanos:892381504}"
Aug 13 01:52:50.970858 containerd[1938]: time="2025-08-13T01:52:50.970833421Z" level=info msg="TaskExit event in podsandbox handler container_id:\"faceaf44ee1b24f07a8acfa6266e31a0c31201e79538236d219eaadd67bafb37\" id:\"f6495793dfc6a5b35bd1ad4c673d172080c07d7789407ab18d21d0680bb30eb2\" pid:8173 exited_at:{seconds:1755049970 nanos:970665899}"
Aug 13 01:52:51.515274 systemd[1]: Started sshd@17-147.75.71.211:22-139.178.89.65:54180.service - OpenSSH per-connection server daemon (139.178.89.65:54180).
Aug 13 01:52:51.545046 sshd[8196]: Accepted publickey for core from 139.178.89.65 port 54180 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 01:52:51.545902 sshd-session[8196]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 01:52:51.549335 systemd-logind[1926]: New session 20 of user core.
Aug 13 01:52:51.560144 systemd[1]: Started session-20.scope - Session 20 of User core.
Aug 13 01:52:51.658516 sshd[8198]: Connection closed by 139.178.89.65 port 54180
Aug 13 01:52:51.659309 sshd-session[8196]: pam_unix(sshd:session): session closed for user core
Aug 13 01:52:51.684661 systemd[1]: sshd@17-147.75.71.211:22-139.178.89.65:54180.service: Deactivated successfully.
Aug 13 01:52:51.689249 systemd[1]: session-20.scope: Deactivated successfully.
Aug 13 01:52:51.691647 systemd-logind[1926]: Session 20 logged out. Waiting for processes to exit.
Aug 13 01:52:51.698446 systemd[1]: Started sshd@18-147.75.71.211:22-139.178.89.65:54184.service - OpenSSH per-connection server daemon (139.178.89.65:54184).
Aug 13 01:52:51.700494 systemd-logind[1926]: Removed session 20.
Aug 13 01:52:51.785707 sshd[8223]: Accepted publickey for core from 139.178.89.65 port 54184 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 01:52:51.788180 sshd-session[8223]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 01:52:51.798087 systemd-logind[1926]: New session 21 of user core.
Aug 13 01:52:51.820340 systemd[1]: Started session-21.scope - Session 21 of User core.
Aug 13 01:52:51.966437 sshd[8225]: Connection closed by 139.178.89.65 port 54184
Aug 13 01:52:51.966631 sshd-session[8223]: pam_unix(sshd:session): session closed for user core
Aug 13 01:52:51.990425 systemd[1]: sshd@18-147.75.71.211:22-139.178.89.65:54184.service: Deactivated successfully.
Aug 13 01:52:51.994707 systemd[1]: session-21.scope: Deactivated successfully.
Aug 13 01:52:51.997075 systemd-logind[1926]: Session 21 logged out. Waiting for processes to exit.
Aug 13 01:52:52.001422 systemd-logind[1926]: Removed session 21.
Aug 13 01:52:52.004905 systemd[1]: Started sshd@19-147.75.71.211:22-139.178.89.65:54196.service - OpenSSH per-connection server daemon (139.178.89.65:54196).
Aug 13 01:52:52.091036 sshd[8246]: Accepted publickey for core from 139.178.89.65 port 54196 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 01:52:52.091935 sshd-session[8246]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 01:52:52.095873 systemd-logind[1926]: New session 22 of user core.
Aug 13 01:52:52.111339 systemd[1]: Started session-22.scope - Session 22 of User core.
Aug 13 01:52:52.910525 sshd[8248]: Connection closed by 139.178.89.65 port 54196
Aug 13 01:52:52.910957 sshd-session[8246]: pam_unix(sshd:session): session closed for user core
Aug 13 01:52:52.928882 systemd[1]: sshd@19-147.75.71.211:22-139.178.89.65:54196.service: Deactivated successfully.
Aug 13 01:52:52.930403 systemd[1]: session-22.scope: Deactivated successfully.
Aug 13 01:52:52.930992 systemd-logind[1926]: Session 22 logged out. Waiting for processes to exit.
Aug 13 01:52:52.932899 systemd[1]: Started sshd@20-147.75.71.211:22-139.178.89.65:54204.service - OpenSSH per-connection server daemon (139.178.89.65:54204).
Aug 13 01:52:52.933408 systemd-logind[1926]: Removed session 22.
Aug 13 01:52:52.984593 sshd[8278]: Accepted publickey for core from 139.178.89.65 port 54204 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 01:52:52.985472 sshd-session[8278]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 01:52:52.988798 systemd-logind[1926]: New session 23 of user core.
Aug 13 01:52:53.009013 systemd[1]: Started session-23.scope - Session 23 of User core.
Aug 13 01:52:53.221021 sshd[8283]: Connection closed by 139.178.89.65 port 54204
Aug 13 01:52:53.221238 sshd-session[8278]: pam_unix(sshd:session): session closed for user core
Aug 13 01:52:53.234728 systemd[1]: sshd@20-147.75.71.211:22-139.178.89.65:54204.service: Deactivated successfully.
Aug 13 01:52:53.235990 systemd[1]: session-23.scope: Deactivated successfully.
Aug 13 01:52:53.236596 systemd-logind[1926]: Session 23 logged out. Waiting for processes to exit.
Aug 13 01:52:53.238431 systemd[1]: Started sshd@21-147.75.71.211:22-139.178.89.65:54212.service - OpenSSH per-connection server daemon (139.178.89.65:54212).
Aug 13 01:52:53.238825 systemd-logind[1926]: Removed session 23.
Aug 13 01:52:53.282215 sshd[8306]: Accepted publickey for core from 139.178.89.65 port 54212 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 01:52:53.283040 sshd-session[8306]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 01:52:53.286106 systemd-logind[1926]: New session 24 of user core.
Aug 13 01:52:53.299061 systemd[1]: Started session-24.scope - Session 24 of User core.
Aug 13 01:52:53.380427 sshd[8308]: Connection closed by 139.178.89.65 port 54212
Aug 13 01:52:53.380643 sshd-session[8306]: pam_unix(sshd:session): session closed for user core
Aug 13 01:52:53.382562 systemd[1]: sshd@21-147.75.71.211:22-139.178.89.65:54212.service: Deactivated successfully.
Aug 13 01:52:53.383622 systemd[1]: session-24.scope: Deactivated successfully.
Aug 13 01:52:53.384702 systemd-logind[1926]: Session 24 logged out. Waiting for processes to exit.
Aug 13 01:52:53.385471 systemd-logind[1926]: Removed session 24.
Aug 13 01:52:58.408098 systemd[1]: Started sshd@22-147.75.71.211:22-139.178.89.65:54224.service - OpenSSH per-connection server daemon (139.178.89.65:54224).
Aug 13 01:52:58.454998 sshd[8344]: Accepted publickey for core from 139.178.89.65 port 54224 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 01:52:58.455774 sshd-session[8344]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 01:52:58.458954 systemd-logind[1926]: New session 25 of user core.
Aug 13 01:52:58.475099 systemd[1]: Started session-25.scope - Session 25 of User core.
Aug 13 01:52:58.567620 sshd[8346]: Connection closed by 139.178.89.65 port 54224
Aug 13 01:52:58.567821 sshd-session[8344]: pam_unix(sshd:session): session closed for user core
Aug 13 01:52:58.570200 systemd[1]: sshd@22-147.75.71.211:22-139.178.89.65:54224.service: Deactivated successfully.
Aug 13 01:52:58.571195 systemd[1]: session-25.scope: Deactivated successfully.
Aug 13 01:52:58.571658 systemd-logind[1926]: Session 25 logged out. Waiting for processes to exit.
Aug 13 01:52:58.572341 systemd-logind[1926]: Removed session 25.
Aug 13 01:53:03.584975 systemd[1]: Started sshd@23-147.75.71.211:22-139.178.89.65:54674.service - OpenSSH per-connection server daemon (139.178.89.65:54674).
Aug 13 01:53:03.638077 sshd[8390]: Accepted publickey for core from 139.178.89.65 port 54674 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 01:53:03.639155 sshd-session[8390]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 01:53:03.643683 systemd-logind[1926]: New session 26 of user core.
Aug 13 01:53:03.656307 systemd[1]: Started session-26.scope - Session 26 of User core.
Aug 13 01:53:03.753991 sshd[8392]: Connection closed by 139.178.89.65 port 54674
Aug 13 01:53:03.754183 sshd-session[8390]: pam_unix(sshd:session): session closed for user core
Aug 13 01:53:03.755848 systemd[1]: sshd@23-147.75.71.211:22-139.178.89.65:54674.service: Deactivated successfully.
Aug 13 01:53:03.756880 systemd[1]: session-26.scope: Deactivated successfully.
Aug 13 01:53:03.757624 systemd-logind[1926]: Session 26 logged out. Waiting for processes to exit.
Aug 13 01:53:03.758402 systemd-logind[1926]: Removed session 26.
Aug 13 01:53:06.900048 containerd[1938]: time="2025-08-13T01:53:06.899996727Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1777baaf771cf1ecf00b0f19ed001fb62e80fdc050273cf47f861b18c7d7436e\" id:\"ef5386a700db2d7e236b44380476f8f05805e8c07e24670c0582534b804fd9ff\" pid:8427 exited_at:{seconds:1755049986 nanos:899819031}"
Aug 13 01:53:08.781416 systemd[1]: Started sshd@24-147.75.71.211:22-139.178.89.65:54688.service - OpenSSH per-connection server daemon (139.178.89.65:54688).
Aug 13 01:53:08.865810 sshd[8453]: Accepted publickey for core from 139.178.89.65 port 54688 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 01:53:08.866791 sshd-session[8453]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 01:53:08.870717 systemd-logind[1926]: New session 27 of user core.
Aug 13 01:53:08.891230 systemd[1]: Started session-27.scope - Session 27 of User core.
Aug 13 01:53:08.984678 sshd[8455]: Connection closed by 139.178.89.65 port 54688
Aug 13 01:53:08.984893 sshd-session[8453]: pam_unix(sshd:session): session closed for user core
Aug 13 01:53:08.986793 systemd[1]: sshd@24-147.75.71.211:22-139.178.89.65:54688.service: Deactivated successfully.
Aug 13 01:53:08.987856 systemd[1]: session-27.scope: Deactivated successfully.
Aug 13 01:53:08.988623 systemd-logind[1926]: Session 27 logged out. Waiting for processes to exit.
Aug 13 01:53:08.989358 systemd-logind[1926]: Removed session 27.