Aug 13 00:33:17.899751 kernel: Linux version 6.12.40-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue Aug 12 21:42:48 -00 2025
Aug 13 00:33:17.899765 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=215bdedb8de38f6b96ec4f9db80853e25015f60454b867e319fdcb9244320a21
Aug 13 00:33:17.899772 kernel: BIOS-provided physical RAM map:
Aug 13 00:33:17.899776 kernel: BIOS-e820: [mem 0x0000000000000000-0x00000000000997ff] usable
Aug 13 00:33:17.899779 kernel: BIOS-e820: [mem 0x0000000000099800-0x000000000009ffff] reserved
Aug 13 00:33:17.899786 kernel: BIOS-e820: [mem 0x00000000000e0000-0x00000000000fffff] reserved
Aug 13 00:33:17.899807 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000003fffffff] usable
Aug 13 00:33:17.899811 kernel: BIOS-e820: [mem 0x0000000040000000-0x00000000403fffff] reserved
Aug 13 00:33:17.899815 kernel: BIOS-e820: [mem 0x0000000040400000-0x0000000081b24fff] usable
Aug 13 00:33:17.899820 kernel: BIOS-e820: [mem 0x0000000081b25000-0x0000000081b25fff] ACPI NVS
Aug 13 00:33:17.899824 kernel: BIOS-e820: [mem 0x0000000081b26000-0x0000000081b26fff] reserved
Aug 13 00:33:17.899841 kernel: BIOS-e820: [mem 0x0000000081b27000-0x000000008afccfff] usable
Aug 13 00:33:17.899845 kernel: BIOS-e820: [mem 0x000000008afcd000-0x000000008c0b1fff] reserved
Aug 13 00:33:17.899850 kernel: BIOS-e820: [mem 0x000000008c0b2000-0x000000008c23afff] usable
Aug 13 00:33:17.899855 kernel: BIOS-e820: [mem 0x000000008c23b000-0x000000008c66cfff] ACPI NVS
Aug 13 00:33:17.899860 kernel: BIOS-e820: [mem 0x000000008c66d000-0x000000008eefefff] reserved
Aug 13 00:33:17.899864 kernel: BIOS-e820: [mem 0x000000008eeff000-0x000000008eefffff] usable
Aug 13 00:33:17.899869 kernel: BIOS-e820: [mem 0x000000008ef00000-0x000000008fffffff] reserved
Aug 13 00:33:17.899873 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Aug 13 00:33:17.899878 kernel: BIOS-e820: [mem 0x00000000fe000000-0x00000000fe010fff] reserved
Aug 13 00:33:17.899882 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec00fff] reserved
Aug 13 00:33:17.899887 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Aug 13 00:33:17.899891 kernel: BIOS-e820: [mem 0x00000000ff000000-0x00000000ffffffff] reserved
Aug 13 00:33:17.899895 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000086effffff] usable
Aug 13 00:33:17.899900 kernel: NX (Execute Disable) protection: active
Aug 13 00:33:17.899904 kernel: APIC: Static calls initialized
Aug 13 00:33:17.899909 kernel: SMBIOS 3.2.1 present.
Aug 13 00:33:17.899914 kernel: DMI: Supermicro SYS-5019C-MR-PH004/X11SCM-F, BIOS 1.9 09/16/2022
Aug 13 00:33:17.899919 kernel: DMI: Memory slots populated: 2/4
Aug 13 00:33:17.899923 kernel: tsc: Detected 3400.000 MHz processor
Aug 13 00:33:17.899927 kernel: tsc: Detected 3399.906 MHz TSC
Aug 13 00:33:17.899932 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Aug 13 00:33:17.899937 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Aug 13 00:33:17.899941 kernel: last_pfn = 0x86f000 max_arch_pfn = 0x400000000
Aug 13 00:33:17.899946 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 23), built from 10 variable MTRRs
Aug 13 00:33:17.899950 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Aug 13 00:33:17.899956 kernel: last_pfn = 0x8ef00 max_arch_pfn = 0x400000000
Aug 13 00:33:17.899960 kernel: Using GB pages for direct mapping
Aug 13 00:33:17.899965 kernel: ACPI: Early table checksum verification disabled
Aug 13 00:33:17.899970 kernel: ACPI: RSDP 0x00000000000F05B0 000024 (v02 SUPERM)
Aug 13 00:33:17.899976 kernel: ACPI: XSDT 0x000000008C54E0C8 00010C (v01 SUPERM SUPERM 01072009 AMI 00010013)
Aug 13 00:33:17.899981 kernel: ACPI: FACP 0x000000008C58A670 000114 (v06 01072009 AMI 00010013)
Aug 13 00:33:17.899986 kernel: ACPI: DSDT 0x000000008C54E268 03C404 (v02 SUPERM SMCI--MB 01072009 INTL 20160527)
Aug 13 00:33:17.899991 kernel: ACPI: FACS 0x000000008C66CF80 000040
Aug 13 00:33:17.899996 kernel: ACPI: APIC 0x000000008C58A788 00012C (v04 01072009 AMI 00010013)
Aug 13 00:33:17.900001 kernel: ACPI: FPDT 0x000000008C58A8B8 000044 (v01 01072009 AMI 00010013)
Aug 13 00:33:17.900006 kernel: ACPI: FIDT 0x000000008C58A900 00009C (v01 SUPERM SMCI--MB 01072009 AMI 00010013)
Aug 13 00:33:17.900011 kernel: ACPI: MCFG 0x000000008C58A9A0 00003C (v01 SUPERM SMCI--MB 01072009 MSFT 00000097)
Aug 13 00:33:17.900015 kernel: ACPI: SPMI 0x000000008C58A9E0 000041 (v05 SUPERM SMCI--MB 00000000 AMI. 00000000)
Aug 13 00:33:17.900020 kernel: ACPI: SSDT 0x000000008C58AA28 001B1C (v02 CpuRef CpuSsdt 00003000 INTL 20160527)
Aug 13 00:33:17.900026 kernel: ACPI: SSDT 0x000000008C58C548 0031C6 (v02 SaSsdt SaSsdt 00003000 INTL 20160527)
Aug 13 00:33:17.900031 kernel: ACPI: SSDT 0x000000008C58F710 00232B (v02 PegSsd PegSsdt 00001000 INTL 20160527)
Aug 13 00:33:17.900035 kernel: ACPI: HPET 0x000000008C591A40 000038 (v01 SUPERM SMCI--MB 00000002 01000013)
Aug 13 00:33:17.900040 kernel: ACPI: SSDT 0x000000008C591A78 000FAE (v02 SUPERM Ther_Rvp 00001000 INTL 20160527)
Aug 13 00:33:17.900045 kernel: ACPI: SSDT 0x000000008C592A28 0008F4 (v02 INTEL xh_mossb 00000000 INTL 20160527)
Aug 13 00:33:17.900050 kernel: ACPI: UEFI 0x000000008C593320 000042 (v01 SUPERM SMCI--MB 00000002 01000013)
Aug 13 00:33:17.900055 kernel: ACPI: LPIT 0x000000008C593368 000094 (v01 SUPERM SMCI--MB 00000002 01000013)
Aug 13 00:33:17.900060 kernel: ACPI: SSDT 0x000000008C593400 0027DE (v02 SUPERM PtidDevc 00001000 INTL 20160527)
Aug 13 00:33:17.900065 kernel: ACPI: SSDT 0x000000008C595BE0 0014E2 (v02 SUPERM TbtTypeC 00000000 INTL 20160527)
Aug 13 00:33:17.900070 kernel: ACPI: DBGP 0x000000008C5970C8 000034 (v01 SUPERM SMCI--MB 00000002 01000013)
Aug 13 00:33:17.900075 kernel: ACPI: DBG2 0x000000008C597100 000054 (v00 SUPERM SMCI--MB 00000002 01000013)
Aug 13 00:33:17.900080 kernel: ACPI: SSDT 0x000000008C597158 001B67 (v02 SUPERM UsbCTabl 00001000 INTL 20160527)
Aug 13 00:33:17.900085 kernel: ACPI: DMAR 0x000000008C598CC0 000070 (v01 INTEL EDK2 00000002 01000013)
Aug 13 00:33:17.900090 kernel: ACPI: SSDT 0x000000008C598D30 000144 (v02 Intel ADebTabl 00001000 INTL 20160527)
Aug 13 00:33:17.900095 kernel: ACPI: TPM2 0x000000008C598E78 000034 (v04 SUPERM SMCI--MB 00000001 AMI 00000000)
Aug 13 00:33:17.900099 kernel: ACPI: SSDT 0x000000008C598EB0 000D8F (v02 INTEL SpsNm 00000002 INTL 20160527)
Aug 13 00:33:17.900104 kernel: ACPI: WSMT 0x000000008C599C40 000028 (v01 SUPERM 01072009 AMI 00010013)
Aug 13 00:33:17.900110 kernel: ACPI: EINJ 0x000000008C599C68 000130 (v01 AMI AMI.EINJ 00000000 AMI. 00000000)
Aug 13 00:33:17.900115 kernel: ACPI: ERST 0x000000008C599D98 000230 (v01 AMIER AMI.ERST 00000000 AMI. 00000000)
Aug 13 00:33:17.900120 kernel: ACPI: BERT 0x000000008C599FC8 000030 (v01 AMI AMI.BERT 00000000 AMI. 00000000)
Aug 13 00:33:17.900124 kernel: ACPI: HEST 0x000000008C599FF8 00027C (v01 AMI AMI.HEST 00000000 AMI. 00000000)
Aug 13 00:33:17.900129 kernel: ACPI: SSDT 0x000000008C59A278 000162 (v01 SUPERM SMCCDN 00000000 INTL 20181221)
Aug 13 00:33:17.900134 kernel: ACPI: Reserving FACP table memory at [mem 0x8c58a670-0x8c58a783]
Aug 13 00:33:17.900139 kernel: ACPI: Reserving DSDT table memory at [mem 0x8c54e268-0x8c58a66b]
Aug 13 00:33:17.900144 kernel: ACPI: Reserving FACS table memory at [mem 0x8c66cf80-0x8c66cfbf]
Aug 13 00:33:17.900148 kernel: ACPI: Reserving APIC table memory at [mem 0x8c58a788-0x8c58a8b3]
Aug 13 00:33:17.900154 kernel: ACPI: Reserving FPDT table memory at [mem 0x8c58a8b8-0x8c58a8fb]
Aug 13 00:33:17.900159 kernel: ACPI: Reserving FIDT table memory at [mem 0x8c58a900-0x8c58a99b]
Aug 13 00:33:17.900164 kernel: ACPI: Reserving MCFG table memory at [mem 0x8c58a9a0-0x8c58a9db]
Aug 13 00:33:17.900168 kernel: ACPI: Reserving SPMI table memory at [mem 0x8c58a9e0-0x8c58aa20]
Aug 13 00:33:17.900173 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58aa28-0x8c58c543]
Aug 13 00:33:17.900178 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58c548-0x8c58f70d]
Aug 13 00:33:17.900183 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58f710-0x8c591a3a]
Aug 13 00:33:17.900187 kernel: ACPI: Reserving HPET table memory at [mem 0x8c591a40-0x8c591a77]
Aug 13 00:33:17.900192 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c591a78-0x8c592a25]
Aug 13 00:33:17.900198 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c592a28-0x8c59331b]
Aug 13 00:33:17.900203 kernel: ACPI: Reserving UEFI table memory at [mem 0x8c593320-0x8c593361]
Aug 13 00:33:17.900208 kernel: ACPI: Reserving LPIT table memory at [mem 0x8c593368-0x8c5933fb]
Aug 13 00:33:17.900212 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c593400-0x8c595bdd]
Aug 13 00:33:17.900217 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c595be0-0x8c5970c1]
Aug 13 00:33:17.900222 kernel: ACPI: Reserving DBGP table memory at [mem 0x8c5970c8-0x8c5970fb]
Aug 13 00:33:17.900227 kernel: ACPI: Reserving DBG2 table memory at [mem 0x8c597100-0x8c597153]
Aug 13 00:33:17.900231 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c597158-0x8c598cbe]
Aug 13 00:33:17.900236 kernel: ACPI: Reserving DMAR table memory at [mem 0x8c598cc0-0x8c598d2f]
Aug 13 00:33:17.900242 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c598d30-0x8c598e73]
Aug 13 00:33:17.900246 kernel: ACPI: Reserving TPM2 table memory at [mem 0x8c598e78-0x8c598eab]
Aug 13 00:33:17.900251 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c598eb0-0x8c599c3e]
Aug 13 00:33:17.900256 kernel: ACPI: Reserving WSMT table memory at [mem 0x8c599c40-0x8c599c67]
Aug 13 00:33:17.900261 kernel: ACPI: Reserving EINJ table memory at [mem 0x8c599c68-0x8c599d97]
Aug 13 00:33:17.900265 kernel: ACPI: Reserving ERST table memory at [mem 0x8c599d98-0x8c599fc7]
Aug 13 00:33:17.900270 kernel: ACPI: Reserving BERT table memory at [mem 0x8c599fc8-0x8c599ff7]
Aug 13 00:33:17.900275 kernel: ACPI: Reserving HEST table memory at [mem 0x8c599ff8-0x8c59a273]
Aug 13 00:33:17.900280 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c59a278-0x8c59a3d9]
Aug 13 00:33:17.900284 kernel: No NUMA configuration found
Aug 13 00:33:17.900290 kernel: Faking a node at [mem 0x0000000000000000-0x000000086effffff]
Aug 13 00:33:17.900295 kernel: NODE_DATA(0) allocated [mem 0x86eff8dc0-0x86effffff]
Aug 13 00:33:17.900300 kernel: Zone ranges:
Aug 13 00:33:17.900305 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Aug 13 00:33:17.900309 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Aug 13 00:33:17.900314 kernel: Normal [mem 0x0000000100000000-0x000000086effffff]
Aug 13 00:33:17.900319 kernel: Device empty
Aug 13 00:33:17.900324 kernel: Movable zone start for each node
Aug 13 00:33:17.900329 kernel: Early memory node ranges
Aug 13 00:33:17.900334 kernel: node 0: [mem 0x0000000000001000-0x0000000000098fff]
Aug 13 00:33:17.900339 kernel: node 0: [mem 0x0000000000100000-0x000000003fffffff]
Aug 13 00:33:17.900344 kernel: node 0: [mem 0x0000000040400000-0x0000000081b24fff]
Aug 13 00:33:17.900349 kernel: node 0: [mem 0x0000000081b27000-0x000000008afccfff]
Aug 13 00:33:17.900354 kernel: node 0: [mem 0x000000008c0b2000-0x000000008c23afff]
Aug 13 00:33:17.900362 kernel: node 0: [mem 0x000000008eeff000-0x000000008eefffff]
Aug 13 00:33:17.900367 kernel: node 0: [mem 0x0000000100000000-0x000000086effffff]
Aug 13 00:33:17.900372 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000086effffff]
Aug 13 00:33:17.900378 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Aug 13 00:33:17.900384 kernel: On node 0, zone DMA: 103 pages in unavailable ranges
Aug 13 00:33:17.900389 kernel: On node 0, zone DMA32: 1024 pages in unavailable ranges
Aug 13 00:33:17.900394 kernel: On node 0, zone DMA32: 2 pages in unavailable ranges
Aug 13 00:33:17.900399 kernel: On node 0, zone DMA32: 4325 pages in unavailable ranges
Aug 13 00:33:17.900404 kernel: On node 0, zone DMA32: 11460 pages in unavailable ranges
Aug 13 00:33:17.900409 kernel: On node 0, zone Normal: 4352 pages in unavailable ranges
Aug 13 00:33:17.900414 kernel: On node 0, zone Normal: 4096 pages in unavailable ranges
Aug 13 00:33:17.900419 kernel: ACPI: PM-Timer IO Port: 0x1808
Aug 13 00:33:17.900425 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Aug 13 00:33:17.900430 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Aug 13 00:33:17.900435 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Aug 13 00:33:17.900440 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Aug 13 00:33:17.900445 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Aug 13 00:33:17.900451 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Aug 13 00:33:17.900456 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Aug 13 00:33:17.900461 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Aug 13 00:33:17.900466 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Aug 13 00:33:17.900472 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Aug 13 00:33:17.900477 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Aug 13 00:33:17.900482 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Aug 13 00:33:17.900487 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Aug 13 00:33:17.900492 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Aug 13 00:33:17.900497 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Aug 13 00:33:17.900502 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Aug 13 00:33:17.900507 kernel: IOAPIC[0]: apic_id 2, version 32, address 0xfec00000, GSI 0-119
Aug 13 00:33:17.900512 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Aug 13 00:33:17.900518 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Aug 13 00:33:17.900523 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Aug 13 00:33:17.900528 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Aug 13 00:33:17.900533 kernel: TSC deadline timer available
Aug 13 00:33:17.900538 kernel: CPU topo: Max. logical packages: 1
Aug 13 00:33:17.900543 kernel: CPU topo: Max. logical dies: 1
Aug 13 00:33:17.900548 kernel: CPU topo: Max. dies per package: 1
Aug 13 00:33:17.900553 kernel: CPU topo: Max. threads per core: 2
Aug 13 00:33:17.900558 kernel: CPU topo: Num. cores per package: 8
Aug 13 00:33:17.900563 kernel: CPU topo: Num. threads per package: 16
Aug 13 00:33:17.900569 kernel: CPU topo: Allowing 16 present CPUs plus 0 hotplug CPUs
Aug 13 00:33:17.900575 kernel: [mem 0x90000000-0xdfffffff] available for PCI devices
Aug 13 00:33:17.900580 kernel: Booting paravirtualized kernel on bare hardware
Aug 13 00:33:17.900585 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Aug 13 00:33:17.900590 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1
Aug 13 00:33:17.900595 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144
Aug 13 00:33:17.900600 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152
Aug 13 00:33:17.900605 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15
Aug 13 00:33:17.900611 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=215bdedb8de38f6b96ec4f9db80853e25015f60454b867e319fdcb9244320a21
Aug 13 00:33:17.900618 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Aug 13 00:33:17.900623 kernel: random: crng init done
Aug 13 00:33:17.900628 kernel: Dentry cache hash table entries: 4194304 (order: 13, 33554432 bytes, linear)
Aug 13 00:33:17.900633 kernel: Inode-cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Aug 13 00:33:17.900638 kernel: Fallback order for Node 0: 0
Aug 13 00:33:17.900643 kernel: Built 1 zonelists, mobility grouping on. Total pages: 8363245
Aug 13 00:33:17.900648 kernel: Policy zone: Normal
Aug 13 00:33:17.900653 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Aug 13 00:33:17.900659 kernel: software IO TLB: area num 16.
Aug 13 00:33:17.900664 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1
Aug 13 00:33:17.900669 kernel: ftrace: allocating 40098 entries in 157 pages
Aug 13 00:33:17.900675 kernel: ftrace: allocated 157 pages with 5 groups
Aug 13 00:33:17.900680 kernel: Dynamic Preempt: voluntary
Aug 13 00:33:17.900685 kernel: rcu: Preemptible hierarchical RCU implementation.
Aug 13 00:33:17.900690 kernel: rcu: RCU event tracing is enabled.
Aug 13 00:33:17.900695 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16.
Aug 13 00:33:17.900701 kernel: Trampoline variant of Tasks RCU enabled.
Aug 13 00:33:17.900707 kernel: Rude variant of Tasks RCU enabled.
Aug 13 00:33:17.900712 kernel: Tracing variant of Tasks RCU enabled.
Aug 13 00:33:17.900717 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Aug 13 00:33:17.900722 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16
Aug 13 00:33:17.900727 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Aug 13 00:33:17.900732 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Aug 13 00:33:17.900737 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Aug 13 00:33:17.900743 kernel: NR_IRQS: 33024, nr_irqs: 2184, preallocated irqs: 16
Aug 13 00:33:17.900748 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Aug 13 00:33:17.900754 kernel: Console: colour VGA+ 80x25
Aug 13 00:33:17.900759 kernel: printk: legacy console [tty0] enabled
Aug 13 00:33:17.900764 kernel: printk: legacy console [ttyS1] enabled
Aug 13 00:33:17.900769 kernel: ACPI: Core revision 20240827
Aug 13 00:33:17.900774 kernel: hpet: HPET dysfunctional in PC10. Force disabled.
Aug 13 00:33:17.900779 kernel: APIC: Switch to symmetric I/O mode setup
Aug 13 00:33:17.900786 kernel: DMAR: Host address width 39
Aug 13 00:33:17.900792 kernel: DMAR: DRHD base: 0x000000fed91000 flags: 0x1
Aug 13 00:33:17.900797 kernel: DMAR: dmar0: reg_base_addr fed91000 ver 1:0 cap d2008c40660462 ecap f050da
Aug 13 00:33:17.900803 kernel: DMAR: RMRR base: 0x0000008cf18000 end: 0x0000008d161fff
Aug 13 00:33:17.900808 kernel: DMAR-IR: IOAPIC id 2 under DRHD base 0xfed91000 IOMMU 0
Aug 13 00:33:17.900813 kernel: DMAR-IR: HPET id 0 under DRHD base 0xfed91000
Aug 13 00:33:17.900818 kernel: DMAR-IR: Queued invalidation will be enabled to support x2apic and Intr-remapping.
Aug 13 00:33:17.900823 kernel: DMAR-IR: Enabled IRQ remapping in x2apic mode
Aug 13 00:33:17.900829 kernel: x2apic enabled
Aug 13 00:33:17.900834 kernel: APIC: Switched APIC routing to: cluster x2apic
Aug 13 00:33:17.900839 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x3101f59f5e6, max_idle_ns: 440795259996 ns
Aug 13 00:33:17.900844 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 6799.81 BogoMIPS (lpj=3399906)
Aug 13 00:33:17.900850 kernel: CPU0: Thermal monitoring enabled (TM1)
Aug 13 00:33:17.900855 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Aug 13 00:33:17.900860 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4
Aug 13 00:33:17.900865 kernel: process: using mwait in idle threads
Aug 13 00:33:17.900870 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Aug 13 00:33:17.900876 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit
Aug 13 00:33:17.900881 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Aug 13 00:33:17.900886 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Aug 13 00:33:17.900891 kernel: RETBleed: Mitigation: Enhanced IBRS
Aug 13 00:33:17.900896 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Aug 13 00:33:17.900901 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Aug 13 00:33:17.900906 kernel: TAA: Mitigation: TSX disabled
Aug 13 00:33:17.900911 kernel: MMIO Stale Data: Mitigation: Clear CPU buffers
Aug 13 00:33:17.900916 kernel: SRBDS: Mitigation: Microcode
Aug 13 00:33:17.900921 kernel: GDS: Vulnerable: No microcode
Aug 13 00:33:17.900926 kernel: ITS: Mitigation: Aligned branch/return thunks
Aug 13 00:33:17.900931 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Aug 13 00:33:17.900936 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Aug 13 00:33:17.900941 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Aug 13 00:33:17.900946 kernel: x86/fpu: Supporting XSAVE feature 0x008: 'MPX bounds registers'
Aug 13 00:33:17.900951 kernel: x86/fpu: Supporting XSAVE feature 0x010: 'MPX CSR'
Aug 13 00:33:17.900956 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Aug 13 00:33:17.900962 kernel: x86/fpu: xstate_offset[3]: 832, xstate_sizes[3]: 64
Aug 13 00:33:17.900967 kernel: x86/fpu: xstate_offset[4]: 896, xstate_sizes[4]: 64
Aug 13 00:33:17.900972 kernel: x86/fpu: Enabled xstate features 0x1f, context size is 960 bytes, using 'compacted' format.
Aug 13 00:33:17.900977 kernel: Freeing SMP alternatives memory: 32K
Aug 13 00:33:17.900982 kernel: pid_max: default: 32768 minimum: 301
Aug 13 00:33:17.900987 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Aug 13 00:33:17.900992 kernel: landlock: Up and running.
Aug 13 00:33:17.900997 kernel: SELinux: Initializing.
Aug 13 00:33:17.901002 kernel: Mount-cache hash table entries: 65536 (order: 7, 524288 bytes, linear)
Aug 13 00:33:17.901007 kernel: Mountpoint-cache hash table entries: 65536 (order: 7, 524288 bytes, linear)
Aug 13 00:33:17.901012 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Aug 13 00:33:17.901018 kernel: Performance Events: PEBS fmt3+, Skylake events, 32-deep LBR, full-width counters, Intel PMU driver.
Aug 13 00:33:17.901023 kernel: ... version:                4
Aug 13 00:33:17.901028 kernel: ... bit width:              48
Aug 13 00:33:17.901033 kernel: ... generic registers:      4
Aug 13 00:33:17.901038 kernel: ... value mask:             0000ffffffffffff
Aug 13 00:33:17.901044 kernel: ... max period:             00007fffffffffff
Aug 13 00:33:17.901049 kernel: ... fixed-purpose events:   3
Aug 13 00:33:17.901054 kernel: ... event mask:             000000070000000f
Aug 13 00:33:17.901059 kernel: signal: max sigframe size: 2032
Aug 13 00:33:17.901064 kernel: Estimated ratio of average max frequency by base frequency (times 1024): 1445
Aug 13 00:33:17.901070 kernel: rcu: Hierarchical SRCU implementation.
Aug 13 00:33:17.901075 kernel: rcu: Max phase no-delay instances is 400.
Aug 13 00:33:17.901080 kernel: Timer migration: 2 hierarchy levels; 8 children per group; 2 crossnode level
Aug 13 00:33:17.901085 kernel: NMI watchdog: Enabled. Permanently consumes one hw-PMU counter.
Aug 13 00:33:17.901091 kernel: smp: Bringing up secondary CPUs ...
Aug 13 00:33:17.901096 kernel: smpboot: x86: Booting SMP configuration:
Aug 13 00:33:17.901101 kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7 #8 #9 #10 #11 #12 #13 #14 #15
Aug 13 00:33:17.901106 kernel: Transient Scheduler Attacks: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details.
Aug 13 00:33:17.901112 kernel: smp: Brought up 1 node, 16 CPUs
Aug 13 00:33:17.901117 kernel: smpboot: Total of 16 processors activated (108796.99 BogoMIPS)
Aug 13 00:33:17.901123 kernel: Memory: 32695188K/33452980K available (14336K kernel code, 2430K rwdata, 9960K rodata, 54444K init, 2524K bss, 732516K reserved, 0K cma-reserved)
Aug 13 00:33:17.901128 kernel: devtmpfs: initialized
Aug 13 00:33:17.901133 kernel: x86/mm: Memory block size: 128MB
Aug 13 00:33:17.901138 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x81b25000-0x81b25fff] (4096 bytes)
Aug 13 00:33:17.901143 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x8c23b000-0x8c66cfff] (4399104 bytes)
Aug 13 00:33:17.901148 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Aug 13 00:33:17.901154 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear)
Aug 13 00:33:17.901160 kernel: pinctrl core: initialized pinctrl subsystem
Aug 13 00:33:17.901165 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Aug 13 00:33:17.901170 kernel: audit: initializing netlink subsys (disabled)
Aug 13 00:33:17.901175 kernel: audit: type=2000 audit(1755045189.041:1): state=initialized audit_enabled=0 res=1
Aug 13 00:33:17.901180 kernel: thermal_sys: Registered thermal governor 'step_wise'
Aug 13 00:33:17.901185 kernel: thermal_sys: Registered thermal governor 'user_space'
Aug 13 00:33:17.901190 kernel: cpuidle: using governor menu
Aug 13 00:33:17.901195 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Aug 13 00:33:17.901200 kernel: dca service started, version 1.12.1
Aug 13 00:33:17.901206 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff]
Aug 13 00:33:17.901211 kernel: PCI: Using configuration type 1 for base access
Aug 13 00:33:17.901216 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Aug 13 00:33:17.901222 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Aug 13 00:33:17.901227 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Aug 13 00:33:17.901232 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Aug 13 00:33:17.901237 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Aug 13 00:33:17.901242 kernel: ACPI: Added _OSI(Module Device)
Aug 13 00:33:17.901247 kernel: ACPI: Added _OSI(Processor Device)
Aug 13 00:33:17.901253 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Aug 13 00:33:17.901258 kernel: ACPI: 12 ACPI AML tables successfully acquired and loaded
Aug 13 00:33:17.901263 kernel: ACPI: Dynamic OEM Table Load:
Aug 13 00:33:17.901268 kernel: ACPI: SSDT 0xFFFF8FC0420D2800 000400 (v02 PmRef Cpu0Cst 00003001 INTL 20160527)
Aug 13 00:33:17.901274 kernel: ACPI: Dynamic OEM Table Load:
Aug 13 00:33:17.901279 kernel: ACPI: SSDT 0xFFFF8FC0421A1800 000683 (v02 PmRef Cpu0Ist 00003000 INTL 20160527)
Aug 13 00:33:17.901284 kernel: ACPI: Dynamic OEM Table Load:
Aug 13 00:33:17.901289 kernel: ACPI: SSDT 0xFFFF8FC040249700 0000F4 (v02 PmRef Cpu0Psd 00003000 INTL 20160527)
Aug 13 00:33:17.901294 kernel: ACPI: Dynamic OEM Table Load:
Aug 13 00:33:17.901300 kernel: ACPI: SSDT 0xFFFF8FC0421A4800 0005FC (v02 PmRef ApIst 00003000 INTL 20160527)
Aug 13 00:33:17.901305 kernel: ACPI: Dynamic OEM Table Load:
Aug 13 00:33:17.901310 kernel: ACPI: SSDT 0xFFFF8FC0401A3000 000AB0 (v02 PmRef ApPsd 00003000 INTL 20160527)
Aug 13 00:33:17.901315 kernel: ACPI: Dynamic OEM Table Load:
Aug 13 00:33:17.901320 kernel: ACPI: SSDT 0xFFFF8FC0420D7C00 00030A (v02 PmRef ApCst 00003000 INTL 20160527)
Aug 13 00:33:17.901325 kernel: ACPI: Interpreter enabled
Aug 13 00:33:17.901330 kernel: ACPI: PM: (supports S0 S5)
Aug 13 00:33:17.901335 kernel: ACPI: Using IOAPIC for interrupt routing
Aug 13 00:33:17.901340 kernel: HEST: Enabling Firmware First mode for corrected errors.
Aug 13 00:33:17.901346 kernel: mce: [Firmware Bug]: Ignoring request to disable invalid MCA bank 14.
Aug 13 00:33:17.901351 kernel: HEST: Table parsing has been initialized.
Aug 13 00:33:17.901356 kernel: GHES: APEI firmware first mode is enabled by APEI bit and WHEA _OSC.
Aug 13 00:33:17.901362 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Aug 13 00:33:17.901367 kernel: PCI: Using E820 reservations for host bridge windows
Aug 13 00:33:17.901372 kernel: ACPI: Enabled 9 GPEs in block 00 to 7F
Aug 13 00:33:17.901377 kernel: ACPI: \_SB_.PCI0.XDCI.USBC: New power resource
Aug 13 00:33:17.901382 kernel: ACPI: \_SB_.PCI0.SAT0.VOL0.V0PR: New power resource
Aug 13 00:33:17.901388 kernel: ACPI: \_SB_.PCI0.SAT0.VOL1.V1PR: New power resource
Aug 13 00:33:17.901393 kernel: ACPI: \_SB_.PCI0.SAT0.VOL2.V2PR: New power resource
Aug 13 00:33:17.901399 kernel: ACPI: \_SB_.PCI0.CNVW.WRST: New power resource
Aug 13 00:33:17.901404 kernel: ACPI: \_TZ_.FN00: New power resource
Aug 13 00:33:17.901409 kernel: ACPI: \_TZ_.FN01: New power resource
Aug 13 00:33:17.901414 kernel: ACPI: \_TZ_.FN02: New power resource
Aug 13 00:33:17.901419 kernel: ACPI: \_TZ_.FN03: New power resource
Aug 13 00:33:17.901424 kernel: ACPI: \_TZ_.FN04: New power resource
Aug 13 00:33:17.901429 kernel: ACPI: \PIN_: New power resource
Aug 13 00:33:17.901435 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-fe])
Aug 13 00:33:17.901509 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Aug 13 00:33:17.901558 kernel: acpi PNP0A08:00: _OSC: platform does not support [AER]
Aug 13 00:33:17.901603 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability LTR]
Aug 13 00:33:17.901611 kernel: PCI host bridge to bus 0000:00
Aug 13 00:33:17.901658 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Aug 13 00:33:17.901700 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Aug 13 00:33:17.901739 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Aug 13 00:33:17.901781 kernel: pci_bus 0000:00: root bus resource [mem 0x90000000-0xdfffffff window]
Aug 13 00:33:17.901864 kernel: pci_bus 0000:00: root bus resource [mem 0xfc800000-0xfe7fffff window]
Aug 13 00:33:17.901904 kernel: pci_bus 0000:00: root bus resource [bus 00-fe]
Aug 13 00:33:17.901963 kernel: pci 0000:00:00.0: [8086:3e31] type 00 class 0x060000 conventional PCI endpoint
Aug 13 00:33:17.902016 kernel: pci 0000:00:01.0: [8086:1901] type 01 class 0x060400 PCIe Root Port
Aug 13 00:33:17.902065 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Aug 13 00:33:17.902114 kernel: pci 0000:00:01.0: bridge window [mem 0x95100000-0x952fffff]
Aug 13 00:33:17.902160 kernel: pci 0000:00:01.0: bridge window [mem 0x90000000-0x93ffffff 64bit pref]
Aug 13 00:33:17.902205 kernel: pci 0000:00:01.0: PME# supported from D0 D3hot D3cold
Aug 13 00:33:17.902255 kernel: pci 0000:00:08.0: [8086:1911] type 00 class 0x088000 conventional PCI endpoint
Aug 13 00:33:17.902301 kernel: pci 0000:00:08.0: BAR 0 [mem 0x9551f000-0x9551ffff 64bit]
Aug 13 00:33:17.902350 kernel: pci 0000:00:12.0: [8086:a379] type 00 class 0x118000 conventional PCI endpoint
Aug 13 00:33:17.902396 kernel: pci 0000:00:12.0: BAR 0 [mem 0x9551e000-0x9551efff 64bit]
Aug 13 00:33:17.902448 kernel: pci 0000:00:14.0: [8086:a36d] type 00 class 0x0c0330 conventional PCI endpoint
Aug 13 00:33:17.902494 kernel: pci 0000:00:14.0: BAR 0 [mem 0x95500000-0x9550ffff 64bit]
Aug 13 00:33:17.902539 kernel: pci 0000:00:14.0: PME# supported from D3hot D3cold
Aug 13 00:33:17.902587 kernel: pci 0000:00:14.2: [8086:a36f] type 00 class 0x050000 conventional PCI endpoint
Aug 13 00:33:17.902633 kernel: pci 0000:00:14.2: BAR 0 [mem 0x95512000-0x95513fff 64bit]
Aug 13 00:33:17.902678 kernel: pci 0000:00:14.2: BAR 2 [mem 0x9551d000-0x9551dfff 64bit]
Aug 13 00:33:17.902731 kernel: pci 0000:00:15.0: [8086:a368] type 00 class 0x0c8000 conventional PCI endpoint
Aug 13 00:33:17.902791 kernel: pci 0000:00:15.0: BAR 0 [mem 0x00000000-0x00000fff 64bit]
Aug 13 00:33:17.902886 kernel: pci 0000:00:15.1: [8086:a369] type 00 class 0x0c8000 conventional PCI endpoint
Aug 13 00:33:17.902931 kernel: pci 0000:00:15.1: BAR 0 [mem 0x00000000-0x00000fff 64bit]
Aug 13 00:33:17.902980 kernel: pci 0000:00:16.0: [8086:a360] type 00 class 0x078000 conventional PCI endpoint
Aug 13 00:33:17.903025 kernel: pci 0000:00:16.0: BAR 0 [mem 0x9551a000-0x9551afff 64bit]
Aug 13 00:33:17.903072 kernel: pci 0000:00:16.0: PME# supported from D3hot
Aug 13 00:33:17.903121 kernel: pci 0000:00:16.1: [8086:a361] type 00 class 0x078000 conventional PCI endpoint
Aug 13 00:33:17.903167 kernel: pci 0000:00:16.1: BAR 0 [mem 0x95519000-0x95519fff 64bit]
Aug 13 00:33:17.903212 kernel: pci 0000:00:16.1: PME# supported from D3hot
Aug 13 00:33:17.903260 kernel: pci 0000:00:16.4: [8086:a364] type 00 class 0x078000 conventional PCI endpoint
Aug 13 00:33:17.903305 kernel: pci 0000:00:16.4: BAR 0 [mem 0x95518000-0x95518fff 64bit]
Aug 13 00:33:17.903352 kernel: pci 0000:00:16.4: PME# supported from D3hot
Aug 13 00:33:17.903402 kernel: pci 0000:00:17.0: [8086:a352] type 00 class 0x010601 conventional PCI endpoint
Aug 13 00:33:17.903447 kernel: pci 0000:00:17.0: BAR 0 [mem 0x95510000-0x95511fff]
Aug 13 00:33:17.903491 kernel: pci 0000:00:17.0: BAR 1 [mem 0x95517000-0x955170ff]
Aug 13 00:33:17.903536 kernel: pci 0000:00:17.0: BAR 2 [io 0x6050-0x6057]
Aug 13 00:33:17.903584 kernel: pci 0000:00:17.0: BAR 3 [io 0x6040-0x6043]
Aug 13 00:33:17.903628 kernel: pci 0000:00:17.0: BAR 4 [io 0x6020-0x603f]
Aug 13 00:33:17.903673 kernel: pci 0000:00:17.0: BAR 5 [mem 0x95516000-0x955167ff]
Aug 13 00:33:17.903717 kernel: pci 0000:00:17.0: PME# supported from D3hot
Aug 13 00:33:17.903771 kernel: pci 0000:00:1b.0: [8086:a340] type 01 class 0x060400 PCIe Root Port
Aug 13 00:33:17.903841 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02]
Aug 13 00:33:17.903902 kernel: pci 0000:00:1b.0: PME# supported from D0 D3hot D3cold
Aug 13 00:33:17.903954 kernel: pci 0000:00:1b.4: [8086:a32c] type 01 class 0x060400 PCIe Root Port
Aug 13 00:33:17.904001 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03]
Aug 13 00:33:17.904045 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff]
Aug 13 00:33:17.904091 kernel: pci 0000:00:1b.4: bridge window [mem 0x95400000-0x954fffff]
Aug 13 00:33:17.904137 kernel: pci 0000:00:1b.4: PME# supported from D0 D3hot D3cold
Aug 13 00:33:17.904187 kernel: pci 0000:00:1b.5: [8086:a32d] type 01 class 0x060400 PCIe Root Port
Aug 13 00:33:17.904235 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04]
Aug 13 00:33:17.904280 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff]
Aug 13 00:33:17.904326 kernel: pci 0000:00:1b.5: bridge window [mem 0x95300000-0x953fffff]
Aug 13 00:33:17.904371 kernel: pci 0000:00:1b.5: PME# supported from D0 D3hot D3cold
Aug 13 00:33:17.904421 kernel: pci 0000:00:1c.0: [8086:a338] type 01 class 0x060400 PCIe Root Port
Aug 13 00:33:17.904469 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05]
Aug 13 00:33:17.904514 kernel: pci 0000:00:1c.0: PME# supported from D0 D3hot D3cold
Aug 13 00:33:17.904566 kernel: pci 0000:00:1c.3: [8086:a33b] type 01 class 0x060400 PCIe Root Port
Aug 13 00:33:17.904612 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07]
Aug 13 00:33:17.904657 kernel: pci 0000:00:1c.3: bridge window [io 0x3000-0x3fff]
Aug 13 00:33:17.904702 kernel: pci 0000:00:1c.3: bridge window [mem 0x94000000-0x950fffff]
Aug 13 00:33:17.904747 kernel: pci 0000:00:1c.3: PME# supported from D0 D3hot D3cold
Aug 13 00:33:17.904801 kernel: pci 0000:00:1e.0: [8086:a328] type 00 class 0x078000 conventional PCI endpoint
Aug 13 00:33:17.904848 kernel: pci 0000:00:1e.0: BAR 0 [mem 0x00000000-0x00000fff 64bit]
Aug 13 00:33:17.904899 kernel: pci 0000:00:1f.0: [8086:a309] type 00 class 0x060100 conventional PCI endpoint
Aug 13 00:33:17.904952 kernel: pci 0000:00:1f.4: [8086:a323] type 00 class 0x0c0500 conventional PCI endpoint
Aug 13 00:33:17.904997 kernel: pci 0000:00:1f.4: BAR 0 [mem 0x95514000-0x955140ff 64bit]
Aug 13 00:33:17.905042 kernel: pci 0000:00:1f.4: BAR 4 [io 0xefa0-0xefbf]
Aug 13 00:33:17.905091 kernel: pci 0000:00:1f.5: [8086:a324] type 00 class 0x0c8000 conventional PCI endpoint
Aug 13 00:33:17.905137 kernel: pci 0000:00:1f.5: BAR 0 [mem 0xfe010000-0xfe010fff]
Aug 13 00:33:17.905191 kernel: pci 0000:01:00.0: [15b3:1015] type 00 class 0x020000 PCIe Endpoint
Aug 13 00:33:17.905238 kernel: pci 0000:01:00.0: BAR 0 [mem 0x92000000-0x93ffffff 64bit pref]
Aug 13 00:33:17.905285 kernel: pci 0000:01:00.0: ROM [mem 0x95200000-0x952fffff pref]
Aug 13 00:33:17.905330 kernel: pci 0000:01:00.0: PME# supported from D3cold
Aug 13 00:33:17.905376 kernel: pci 0000:01:00.0: VF BAR 0 [mem 0x00000000-0x000fffff 64bit pref]
Aug 13 00:33:17.905422 kernel: pci 0000:01:00.0: VF BAR 0 [mem 0x00000000-0x007fffff 64bit pref]: contains BAR 0 for 8 VFs
Aug 13 00:33:17.905474 kernel: pci 0000:01:00.1: [15b3:1015] type 00 class 0x020000 PCIe Endpoint
Aug 13 00:33:17.905524 kernel: pci 0000:01:00.1: BAR 0 [mem 0x90000000-0x91ffffff 64bit pref]
Aug 13 00:33:17.905570 kernel: pci 0000:01:00.1: ROM [mem 0x95100000-0x951fffff pref]
Aug 13 00:33:17.905616 kernel: pci 0000:01:00.1: PME# supported from D3cold
Aug 13 00:33:17.905662 kernel: pci 0000:01:00.1: VF BAR 0 [mem 0x00000000-0x000fffff 64bit pref]
Aug 13 00:33:17.905710 kernel: pci 0000:01:00.1: VF BAR 0 [mem 0x00000000-0x007fffff 64bit pref]: contains BAR 0 for 8 VFs
Aug 13 00:33:17.905756 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Aug 13 00:33:17.905806 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02]
Aug 13
00:33:17.905897 kernel: pci 0000:03:00.0: working around ROM BAR overlap defect Aug 13 00:33:17.905944 kernel: pci 0000:03:00.0: [8086:1533] type 00 class 0x020000 PCIe Endpoint Aug 13 00:33:17.905990 kernel: pci 0000:03:00.0: BAR 0 [mem 0x95400000-0x9547ffff] Aug 13 00:33:17.906036 kernel: pci 0000:03:00.0: BAR 2 [io 0x5000-0x501f] Aug 13 00:33:17.906082 kernel: pci 0000:03:00.0: BAR 3 [mem 0x95480000-0x95483fff] Aug 13 00:33:17.906129 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Aug 13 00:33:17.906176 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03] Aug 13 00:33:17.906229 kernel: pci 0000:04:00.0: working around ROM BAR overlap defect Aug 13 00:33:17.906276 kernel: pci 0000:04:00.0: [8086:1533] type 00 class 0x020000 PCIe Endpoint Aug 13 00:33:17.906322 kernel: pci 0000:04:00.0: BAR 0 [mem 0x95300000-0x9537ffff] Aug 13 00:33:17.906368 kernel: pci 0000:04:00.0: BAR 2 [io 0x4000-0x401f] Aug 13 00:33:17.906414 kernel: pci 0000:04:00.0: BAR 3 [mem 0x95380000-0x95383fff] Aug 13 00:33:17.906459 kernel: pci 0000:04:00.0: PME# supported from D0 D3hot D3cold Aug 13 00:33:17.906505 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04] Aug 13 00:33:17.906553 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05] Aug 13 00:33:17.906607 kernel: pci 0000:06:00.0: [1a03:1150] type 01 class 0x060400 PCIe to PCI/PCI-X bridge Aug 13 00:33:17.906654 kernel: pci 0000:06:00.0: PCI bridge to [bus 07] Aug 13 00:33:17.906701 kernel: pci 0000:06:00.0: bridge window [io 0x3000-0x3fff] Aug 13 00:33:17.906747 kernel: pci 0000:06:00.0: bridge window [mem 0x94000000-0x950fffff] Aug 13 00:33:17.906797 kernel: pci 0000:06:00.0: enabling Extended Tags Aug 13 00:33:17.906890 kernel: pci 0000:06:00.0: supports D1 D2 Aug 13 00:33:17.906936 kernel: pci 0000:06:00.0: PME# supported from D0 D1 D2 D3hot D3cold Aug 13 00:33:17.906985 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07] Aug 13 00:33:17.907036 kernel: pci_bus 0000:07: extended config space not accessible Aug 13 00:33:17.907091 
kernel: pci 0000:07:00.0: [1a03:2000] type 00 class 0x030000 conventional PCI endpoint Aug 13 00:33:17.907140 kernel: pci 0000:07:00.0: BAR 0 [mem 0x94000000-0x94ffffff] Aug 13 00:33:17.907189 kernel: pci 0000:07:00.0: BAR 1 [mem 0x95000000-0x9501ffff] Aug 13 00:33:17.907237 kernel: pci 0000:07:00.0: BAR 2 [io 0x3000-0x307f] Aug 13 00:33:17.907284 kernel: pci 0000:07:00.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Aug 13 00:33:17.907335 kernel: pci 0000:07:00.0: supports D1 D2 Aug 13 00:33:17.907383 kernel: pci 0000:07:00.0: PME# supported from D0 D1 D2 D3hot D3cold Aug 13 00:33:17.907429 kernel: pci 0000:06:00.0: PCI bridge to [bus 07] Aug 13 00:33:17.907437 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 0 Aug 13 00:33:17.907443 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 1 Aug 13 00:33:17.907448 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 0 Aug 13 00:33:17.907454 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 0 Aug 13 00:33:17.907459 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 0 Aug 13 00:33:17.907466 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 0 Aug 13 00:33:17.907472 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 0 Aug 13 00:33:17.907477 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 0 Aug 13 00:33:17.907483 kernel: iommu: Default domain type: Translated Aug 13 00:33:17.907488 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Aug 13 00:33:17.907494 kernel: PCI: Using ACPI for IRQ routing Aug 13 00:33:17.907499 kernel: PCI: pci_cache_line_size set to 64 bytes Aug 13 00:33:17.907505 kernel: e820: reserve RAM buffer [mem 0x00099800-0x0009ffff] Aug 13 00:33:17.907510 kernel: e820: reserve RAM buffer [mem 0x81b25000-0x83ffffff] Aug 13 00:33:17.907516 kernel: e820: reserve RAM buffer [mem 0x8afcd000-0x8bffffff] Aug 13 00:33:17.907521 kernel: e820: reserve RAM buffer [mem 0x8c23b000-0x8fffffff] Aug 13 00:33:17.907527 kernel: e820: reserve 
RAM buffer [mem 0x8ef00000-0x8fffffff] Aug 13 00:33:17.907532 kernel: e820: reserve RAM buffer [mem 0x86f000000-0x86fffffff] Aug 13 00:33:17.907579 kernel: pci 0000:07:00.0: vgaarb: setting as boot VGA device Aug 13 00:33:17.907627 kernel: pci 0000:07:00.0: vgaarb: bridge control possible Aug 13 00:33:17.907675 kernel: pci 0000:07:00.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Aug 13 00:33:17.907683 kernel: vgaarb: loaded Aug 13 00:33:17.907688 kernel: clocksource: Switched to clocksource tsc-early Aug 13 00:33:17.907695 kernel: VFS: Disk quotas dquot_6.6.0 Aug 13 00:33:17.907701 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Aug 13 00:33:17.907707 kernel: pnp: PnP ACPI init Aug 13 00:33:17.907753 kernel: system 00:00: [mem 0x40000000-0x403fffff] has been reserved Aug 13 00:33:17.907802 kernel: pnp 00:02: [dma 0 disabled] Aug 13 00:33:17.907881 kernel: pnp 00:03: [dma 0 disabled] Aug 13 00:33:17.907929 kernel: system 00:04: [io 0x0680-0x069f] has been reserved Aug 13 00:33:17.907973 kernel: system 00:04: [io 0x164e-0x164f] has been reserved Aug 13 00:33:17.908018 kernel: system 00:05: [mem 0xfed10000-0xfed17fff] has been reserved Aug 13 00:33:17.908059 kernel: system 00:05: [mem 0xfed18000-0xfed18fff] has been reserved Aug 13 00:33:17.908100 kernel: system 00:05: [mem 0xfed19000-0xfed19fff] has been reserved Aug 13 00:33:17.908140 kernel: system 00:05: [mem 0xe0000000-0xefffffff] has been reserved Aug 13 00:33:17.908181 kernel: system 00:05: [mem 0xfed20000-0xfed3ffff] has been reserved Aug 13 00:33:17.908221 kernel: system 00:05: [mem 0xfed90000-0xfed93fff] could not be reserved Aug 13 00:33:17.908264 kernel: system 00:05: [mem 0xfed45000-0xfed8ffff] has been reserved Aug 13 00:33:17.908304 kernel: system 00:05: [mem 0xfee00000-0xfeefffff] could not be reserved Aug 13 00:33:17.908350 kernel: system 00:06: [io 0x1800-0x18fe] could not be reserved Aug 13 00:33:17.908392 kernel: system 00:06: [mem 0xfd000000-0xfd69ffff] 
has been reserved Aug 13 00:33:17.908433 kernel: system 00:06: [mem 0xfd6c0000-0xfd6cffff] has been reserved Aug 13 00:33:17.908473 kernel: system 00:06: [mem 0xfd6f0000-0xfdffffff] has been reserved Aug 13 00:33:17.908516 kernel: system 00:06: [mem 0xfe000000-0xfe01ffff] could not be reserved Aug 13 00:33:17.908557 kernel: system 00:06: [mem 0xfe200000-0xfe7fffff] has been reserved Aug 13 00:33:17.908598 kernel: system 00:06: [mem 0xff000000-0xffffffff] has been reserved Aug 13 00:33:17.908641 kernel: system 00:07: [io 0x2000-0x20fe] has been reserved Aug 13 00:33:17.908649 kernel: pnp: PnP ACPI: found 9 devices Aug 13 00:33:17.908655 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Aug 13 00:33:17.908661 kernel: NET: Registered PF_INET protocol family Aug 13 00:33:17.908666 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Aug 13 00:33:17.908673 kernel: tcp_listen_portaddr_hash hash table entries: 16384 (order: 6, 262144 bytes, linear) Aug 13 00:33:17.908679 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Aug 13 00:33:17.908685 kernel: TCP established hash table entries: 262144 (order: 9, 2097152 bytes, linear) Aug 13 00:33:17.908690 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Aug 13 00:33:17.908695 kernel: TCP: Hash tables configured (established 262144 bind 65536) Aug 13 00:33:17.908701 kernel: UDP hash table entries: 16384 (order: 7, 524288 bytes, linear) Aug 13 00:33:17.908706 kernel: UDP-Lite hash table entries: 16384 (order: 7, 524288 bytes, linear) Aug 13 00:33:17.908712 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Aug 13 00:33:17.908717 kernel: NET: Registered PF_XDP protocol family Aug 13 00:33:17.908764 kernel: pci 0000:00:15.0: BAR 0 [mem 0x95515000-0x95515fff 64bit]: assigned Aug 13 00:33:17.908873 kernel: pci 0000:00:15.1: BAR 0 [mem 0x9551b000-0x9551bfff 64bit]: assigned Aug 13 00:33:17.908920 
kernel: pci 0000:00:1e.0: BAR 0 [mem 0x9551c000-0x9551cfff 64bit]: assigned Aug 13 00:33:17.908968 kernel: pci 0000:01:00.0: VF BAR 0 [mem size 0x00800000 64bit pref]: can't assign; no space Aug 13 00:33:17.909015 kernel: pci 0000:01:00.0: VF BAR 0 [mem size 0x00800000 64bit pref]: failed to assign Aug 13 00:33:17.909065 kernel: pci 0000:01:00.1: VF BAR 0 [mem size 0x00800000 64bit pref]: can't assign; no space Aug 13 00:33:17.909112 kernel: pci 0000:01:00.1: VF BAR 0 [mem size 0x00800000 64bit pref]: failed to assign Aug 13 00:33:17.909158 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Aug 13 00:33:17.909204 kernel: pci 0000:00:01.0: bridge window [mem 0x95100000-0x952fffff] Aug 13 00:33:17.909250 kernel: pci 0000:00:01.0: bridge window [mem 0x90000000-0x93ffffff 64bit pref] Aug 13 00:33:17.909295 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02] Aug 13 00:33:17.909341 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03] Aug 13 00:33:17.909387 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff] Aug 13 00:33:17.909435 kernel: pci 0000:00:1b.4: bridge window [mem 0x95400000-0x954fffff] Aug 13 00:33:17.909480 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04] Aug 13 00:33:17.909526 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff] Aug 13 00:33:17.909571 kernel: pci 0000:00:1b.5: bridge window [mem 0x95300000-0x953fffff] Aug 13 00:33:17.909616 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05] Aug 13 00:33:17.909663 kernel: pci 0000:06:00.0: PCI bridge to [bus 07] Aug 13 00:33:17.909709 kernel: pci 0000:06:00.0: bridge window [io 0x3000-0x3fff] Aug 13 00:33:17.909755 kernel: pci 0000:06:00.0: bridge window [mem 0x94000000-0x950fffff] Aug 13 00:33:17.909825 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07] Aug 13 00:33:17.909884 kernel: pci 0000:00:1c.3: bridge window [io 0x3000-0x3fff] Aug 13 00:33:17.909932 kernel: pci 0000:00:1c.3: bridge window [mem 0x94000000-0x950fffff] Aug 13 00:33:17.909973 kernel: pci_bus 0000:00: Some PCI device resources are 
unassigned, try booting with pci=realloc Aug 13 00:33:17.910013 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Aug 13 00:33:17.910053 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Aug 13 00:33:17.910092 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Aug 13 00:33:17.910131 kernel: pci_bus 0000:00: resource 7 [mem 0x90000000-0xdfffffff window] Aug 13 00:33:17.910170 kernel: pci_bus 0000:00: resource 8 [mem 0xfc800000-0xfe7fffff window] Aug 13 00:33:17.910216 kernel: pci_bus 0000:01: resource 1 [mem 0x95100000-0x952fffff] Aug 13 00:33:17.910261 kernel: pci_bus 0000:01: resource 2 [mem 0x90000000-0x93ffffff 64bit pref] Aug 13 00:33:17.910306 kernel: pci_bus 0000:03: resource 0 [io 0x5000-0x5fff] Aug 13 00:33:17.910347 kernel: pci_bus 0000:03: resource 1 [mem 0x95400000-0x954fffff] Aug 13 00:33:17.910395 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff] Aug 13 00:33:17.910437 kernel: pci_bus 0000:04: resource 1 [mem 0x95300000-0x953fffff] Aug 13 00:33:17.910483 kernel: pci_bus 0000:06: resource 0 [io 0x3000-0x3fff] Aug 13 00:33:17.910527 kernel: pci_bus 0000:06: resource 1 [mem 0x94000000-0x950fffff] Aug 13 00:33:17.910571 kernel: pci_bus 0000:07: resource 0 [io 0x3000-0x3fff] Aug 13 00:33:17.910614 kernel: pci_bus 0000:07: resource 1 [mem 0x94000000-0x950fffff] Aug 13 00:33:17.910622 kernel: PCI: CLS 64 bytes, default 64 Aug 13 00:33:17.910627 kernel: DMAR: No ATSR found Aug 13 00:33:17.910633 kernel: DMAR: No SATC found Aug 13 00:33:17.910639 kernel: DMAR: dmar0: Using Queued invalidation Aug 13 00:33:17.910684 kernel: pci 0000:00:00.0: Adding to iommu group 0 Aug 13 00:33:17.910732 kernel: pci 0000:00:01.0: Adding to iommu group 1 Aug 13 00:33:17.910778 kernel: pci 0000:00:08.0: Adding to iommu group 2 Aug 13 00:33:17.910828 kernel: pci 0000:00:12.0: Adding to iommu group 3 Aug 13 00:33:17.910874 kernel: pci 0000:00:14.0: Adding to iommu group 4 Aug 13 00:33:17.910919 kernel: pci 0000:00:14.2: Adding to 
iommu group 4 Aug 13 00:33:17.910964 kernel: pci 0000:00:15.0: Adding to iommu group 5 Aug 13 00:33:17.911008 kernel: pci 0000:00:15.1: Adding to iommu group 5 Aug 13 00:33:17.911054 kernel: pci 0000:00:16.0: Adding to iommu group 6 Aug 13 00:33:17.911101 kernel: pci 0000:00:16.1: Adding to iommu group 6 Aug 13 00:33:17.911146 kernel: pci 0000:00:16.4: Adding to iommu group 6 Aug 13 00:33:17.911191 kernel: pci 0000:00:17.0: Adding to iommu group 7 Aug 13 00:33:17.911237 kernel: pci 0000:00:1b.0: Adding to iommu group 8 Aug 13 00:33:17.911282 kernel: pci 0000:00:1b.4: Adding to iommu group 9 Aug 13 00:33:17.911328 kernel: pci 0000:00:1b.5: Adding to iommu group 10 Aug 13 00:33:17.911373 kernel: pci 0000:00:1c.0: Adding to iommu group 11 Aug 13 00:33:17.911418 kernel: pci 0000:00:1c.3: Adding to iommu group 12 Aug 13 00:33:17.911465 kernel: pci 0000:00:1e.0: Adding to iommu group 13 Aug 13 00:33:17.911510 kernel: pci 0000:00:1f.0: Adding to iommu group 14 Aug 13 00:33:17.911555 kernel: pci 0000:00:1f.4: Adding to iommu group 14 Aug 13 00:33:17.911600 kernel: pci 0000:00:1f.5: Adding to iommu group 14 Aug 13 00:33:17.911646 kernel: pci 0000:01:00.0: Adding to iommu group 1 Aug 13 00:33:17.911692 kernel: pci 0000:01:00.1: Adding to iommu group 1 Aug 13 00:33:17.911739 kernel: pci 0000:03:00.0: Adding to iommu group 15 Aug 13 00:33:17.911790 kernel: pci 0000:04:00.0: Adding to iommu group 16 Aug 13 00:33:17.911877 kernel: pci 0000:06:00.0: Adding to iommu group 17 Aug 13 00:33:17.911926 kernel: pci 0000:07:00.0: Adding to iommu group 17 Aug 13 00:33:17.911934 kernel: DMAR: Intel(R) Virtualization Technology for Directed I/O Aug 13 00:33:17.911939 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Aug 13 00:33:17.911945 kernel: software IO TLB: mapped [mem 0x0000000086fcd000-0x000000008afcd000] (64MB) Aug 13 00:33:17.911951 kernel: RAPL PMU: API unit is 2^-32 Joules, 3 fixed counters, 655360 ms ovfl timer Aug 13 00:33:17.911956 kernel: RAPL PMU: hw unit of 
domain pp0-core 2^-14 Joules Aug 13 00:33:17.911962 kernel: RAPL PMU: hw unit of domain package 2^-14 Joules Aug 13 00:33:17.911969 kernel: RAPL PMU: hw unit of domain dram 2^-14 Joules Aug 13 00:33:17.912017 kernel: platform rtc_cmos: registered platform RTC device (no PNP device found) Aug 13 00:33:17.912026 kernel: Initialise system trusted keyrings Aug 13 00:33:17.912031 kernel: workingset: timestamp_bits=39 max_order=23 bucket_order=0 Aug 13 00:33:17.912037 kernel: Key type asymmetric registered Aug 13 00:33:17.912042 kernel: Asymmetric key parser 'x509' registered Aug 13 00:33:17.912047 kernel: tsc: Refined TSC clocksource calibration: 3408.000 MHz Aug 13 00:33:17.912053 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Aug 13 00:33:17.912058 kernel: clocksource: Switched to clocksource tsc Aug 13 00:33:17.912066 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Aug 13 00:33:17.912071 kernel: io scheduler mq-deadline registered Aug 13 00:33:17.912076 kernel: io scheduler kyber registered Aug 13 00:33:17.912082 kernel: io scheduler bfq registered Aug 13 00:33:17.912127 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 121 Aug 13 00:33:17.912172 kernel: pcieport 0000:00:1b.0: PME: Signaling with IRQ 122 Aug 13 00:33:17.912218 kernel: pcieport 0000:00:1b.4: PME: Signaling with IRQ 123 Aug 13 00:33:17.912264 kernel: pcieport 0000:00:1b.5: PME: Signaling with IRQ 124 Aug 13 00:33:17.912311 kernel: pcieport 0000:00:1c.0: PME: Signaling with IRQ 125 Aug 13 00:33:17.912357 kernel: pcieport 0000:00:1c.3: PME: Signaling with IRQ 126 Aug 13 00:33:17.912407 kernel: thermal LNXTHERM:00: registered as thermal_zone0 Aug 13 00:33:17.912416 kernel: ACPI: thermal: Thermal Zone [TZ00] (28 C) Aug 13 00:33:17.912421 kernel: ERST: Error Record Serialization Table (ERST) support is initialized. 
Aug 13 00:33:17.912427 kernel: pstore: Using crash dump compression: deflate Aug 13 00:33:17.912432 kernel: pstore: Registered erst as persistent store backend Aug 13 00:33:17.912438 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Aug 13 00:33:17.912443 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Aug 13 00:33:17.912450 kernel: 00:02: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Aug 13 00:33:17.912455 kernel: 00:03: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A Aug 13 00:33:17.912461 kernel: hpet_acpi_add: no address or irqs in _CRS Aug 13 00:33:17.912506 kernel: tpm_tis MSFT0101:00: 2.0 TPM (device-id 0x1B, rev-id 16) Aug 13 00:33:17.912514 kernel: i8042: PNP: No PS/2 controller found. Aug 13 00:33:17.912556 kernel: rtc_cmos rtc_cmos: RTC can wake from S4 Aug 13 00:33:17.912599 kernel: rtc_cmos rtc_cmos: registered as rtc0 Aug 13 00:33:17.912642 kernel: rtc_cmos rtc_cmos: setting system clock to 2025-08-13T00:33:16 UTC (1755045196) Aug 13 00:33:17.912684 kernel: rtc_cmos rtc_cmos: alarms up to one month, y3k, 114 bytes nvram Aug 13 00:33:17.912692 kernel: intel_pstate: Intel P-state driver initializing Aug 13 00:33:17.912697 kernel: intel_pstate: Disabling energy efficiency optimization Aug 13 00:33:17.912703 kernel: intel_pstate: HWP enabled Aug 13 00:33:17.912708 kernel: NET: Registered PF_INET6 protocol family Aug 13 00:33:17.912714 kernel: Segment Routing with IPv6 Aug 13 00:33:17.912719 kernel: In-situ OAM (IOAM) with IPv6 Aug 13 00:33:17.912724 kernel: NET: Registered PF_PACKET protocol family Aug 13 00:33:17.912731 kernel: Key type dns_resolver registered Aug 13 00:33:17.912737 kernel: ENERGY_PERF_BIAS: Set to 'normal', was 'performance' Aug 13 00:33:17.912742 kernel: microcode: Current revision: 0x000000f4 Aug 13 00:33:17.912747 kernel: IPI shorthand broadcast: enabled Aug 13 00:33:17.912753 kernel: sched_clock: Marking stable (3722000742, 1496100708)->(6810837257, -1592735807) Aug 13 
00:33:17.912758 kernel: registered taskstats version 1 Aug 13 00:33:17.912763 kernel: Loading compiled-in X.509 certificates Aug 13 00:33:17.912769 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.40-flatcar: dee0b464d3f7f8d09744a2392f69dde258bc95c0' Aug 13 00:33:17.912774 kernel: Demotion targets for Node 0: null Aug 13 00:33:17.912781 kernel: Key type .fscrypt registered Aug 13 00:33:17.912789 kernel: Key type fscrypt-provisioning registered Aug 13 00:33:17.912794 kernel: ima: Allocated hash algorithm: sha1 Aug 13 00:33:17.912800 kernel: ima: No architecture policies found Aug 13 00:33:17.912823 kernel: clk: Disabling unused clocks Aug 13 00:33:17.912829 kernel: Warning: unable to open an initial console. Aug 13 00:33:17.912834 kernel: Freeing unused kernel image (initmem) memory: 54444K Aug 13 00:33:17.912853 kernel: Write protecting the kernel read-only data: 24576k Aug 13 00:33:17.912859 kernel: Freeing unused kernel image (rodata/data gap) memory: 280K Aug 13 00:33:17.912865 kernel: Run /init as init process Aug 13 00:33:17.912871 kernel: with arguments: Aug 13 00:33:17.912876 kernel: /init Aug 13 00:33:17.912881 kernel: with environment: Aug 13 00:33:17.912887 kernel: HOME=/ Aug 13 00:33:17.912892 kernel: TERM=linux Aug 13 00:33:17.912897 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Aug 13 00:33:17.912903 systemd[1]: Successfully made /usr/ read-only. Aug 13 00:33:17.912912 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Aug 13 00:33:17.912918 systemd[1]: Detected architecture x86-64. Aug 13 00:33:17.912923 systemd[1]: Running in initrd. Aug 13 00:33:17.912929 systemd[1]: No hostname configured, using default hostname. 
Aug 13 00:33:17.912934 systemd[1]: Hostname set to . Aug 13 00:33:17.912940 systemd[1]: Initializing machine ID from random generator. Aug 13 00:33:17.912946 systemd[1]: Queued start job for default target initrd.target. Aug 13 00:33:17.912951 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Aug 13 00:33:17.912958 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Aug 13 00:33:17.912964 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Aug 13 00:33:17.912970 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Aug 13 00:33:17.912976 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Aug 13 00:33:17.912981 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Aug 13 00:33:17.912988 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Aug 13 00:33:17.912995 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Aug 13 00:33:17.913001 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Aug 13 00:33:17.913006 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Aug 13 00:33:17.913012 systemd[1]: Reached target paths.target - Path Units. Aug 13 00:33:17.913018 systemd[1]: Reached target slices.target - Slice Units. Aug 13 00:33:17.913023 systemd[1]: Reached target swap.target - Swaps. Aug 13 00:33:17.913029 systemd[1]: Reached target timers.target - Timer Units. Aug 13 00:33:17.913034 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Aug 13 00:33:17.913040 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. 
Aug 13 00:33:17.913047 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Aug 13 00:33:17.913052 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Aug 13 00:33:17.913058 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Aug 13 00:33:17.913064 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Aug 13 00:33:17.913069 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Aug 13 00:33:17.913075 systemd[1]: Reached target sockets.target - Socket Units. Aug 13 00:33:17.913081 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Aug 13 00:33:17.913086 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Aug 13 00:33:17.913093 systemd[1]: Finished network-cleanup.service - Network Cleanup. Aug 13 00:33:17.913099 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Aug 13 00:33:17.913105 systemd[1]: Starting systemd-fsck-usr.service... Aug 13 00:33:17.913110 systemd[1]: Starting systemd-journald.service - Journal Service... Aug 13 00:33:17.913126 systemd-journald[296]: Collecting audit messages is disabled. Aug 13 00:33:17.913141 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Aug 13 00:33:17.913147 systemd-journald[296]: Journal started Aug 13 00:33:17.913160 systemd-journald[296]: Runtime Journal (/run/log/journal/81850aafb2644717b51d2df502e72815) is 8M, max 640.1M, 632.1M free. Aug 13 00:33:17.931806 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Aug 13 00:33:17.933420 systemd-modules-load[300]: Inserted module 'overlay' Aug 13 00:33:17.954846 systemd[1]: Started systemd-journald.service - Journal Service. Aug 13 00:33:17.955222 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. 
Aug 13 00:33:17.955338 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Aug 13 00:33:17.955423 systemd[1]: Finished systemd-fsck-usr.service. Aug 13 00:33:17.969788 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Aug 13 00:33:17.970658 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Aug 13 00:33:17.971235 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Aug 13 00:33:17.976599 systemd-modules-load[300]: Inserted module 'br_netfilter' Aug 13 00:33:17.976791 kernel: Bridge firewalling registered Aug 13 00:33:17.991036 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Aug 13 00:33:17.993634 systemd-tmpfiles[312]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Aug 13 00:33:17.997380 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Aug 13 00:33:18.112689 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Aug 13 00:33:18.134468 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Aug 13 00:33:18.157826 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Aug 13 00:33:18.174809 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Aug 13 00:33:18.214504 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Aug 13 00:33:18.219740 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Aug 13 00:33:18.221090 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Aug 13 00:33:18.223491 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
Aug 13 00:33:18.228070 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Aug 13 00:33:18.239978 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Aug 13 00:33:18.261393 systemd-resolved[337]: Positive Trust Anchors: Aug 13 00:33:18.261401 systemd-resolved[337]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Aug 13 00:33:18.261433 systemd-resolved[337]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Aug 13 00:33:18.263661 systemd-resolved[337]: Defaulting to hostname 'linux'. Aug 13 00:33:18.264387 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Aug 13 00:33:18.290986 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Aug 13 00:33:18.394373 dracut-cmdline[343]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=215bdedb8de38f6b96ec4f9db80853e25015f60454b867e319fdcb9244320a21 Aug 13 00:33:18.602816 kernel: SCSI subsystem initialized Aug 13 00:33:18.615822 kernel: Loading iSCSI transport class v2.0-870. 
Aug 13 00:33:18.628816 kernel: iscsi: registered transport (tcp) Aug 13 00:33:18.651917 kernel: iscsi: registered transport (qla4xxx) Aug 13 00:33:18.651934 kernel: QLogic iSCSI HBA Driver Aug 13 00:33:18.662095 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Aug 13 00:33:18.695598 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Aug 13 00:33:18.709603 systemd[1]: Reached target network-pre.target - Preparation for Network. Aug 13 00:33:18.747207 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Aug 13 00:33:18.758688 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Aug 13 00:33:18.876817 kernel: raid6: avx2x4 gen() 48591 MB/s Aug 13 00:33:18.897821 kernel: raid6: avx2x2 gen() 54820 MB/s Aug 13 00:33:18.923853 kernel: raid6: avx2x1 gen() 46054 MB/s Aug 13 00:33:18.923869 kernel: raid6: using algorithm avx2x2 gen() 54820 MB/s Aug 13 00:33:18.950956 kernel: raid6: .... xor() 32962 MB/s, rmw enabled Aug 13 00:33:18.950975 kernel: raid6: using avx2x2 recovery algorithm Aug 13 00:33:18.971817 kernel: xor: automatically using best checksumming function avx Aug 13 00:33:19.078837 kernel: Btrfs loaded, zoned=no, fsverity=no Aug 13 00:33:19.081912 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Aug 13 00:33:19.091850 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Aug 13 00:33:19.137182 systemd-udevd[556]: Using default interface naming scheme 'v255'. Aug 13 00:33:19.141583 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Aug 13 00:33:19.146622 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Aug 13 00:33:19.203224 dracut-pre-trigger[568]: rd.md=0: removing MD RAID activation Aug 13 00:33:19.217327 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. 
Aug 13 00:33:19.230077 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Aug 13 00:33:19.323462 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Aug 13 00:33:19.342810 kernel: cryptd: max_cpu_qlen set to 1000 Aug 13 00:33:19.359037 kernel: pps_core: LinuxPPS API ver. 1 registered Aug 13 00:33:19.359059 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Aug 13 00:33:19.363398 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Aug 13 00:33:19.399314 kernel: ACPI: bus type USB registered Aug 13 00:33:19.399332 kernel: usbcore: registered new interface driver usbfs Aug 13 00:33:19.399345 kernel: usbcore: registered new interface driver hub Aug 13 00:33:19.399358 kernel: usbcore: registered new device driver usb Aug 13 00:33:19.399365 kernel: libata version 3.00 loaded. Aug 13 00:33:19.399372 kernel: PTP clock support registered Aug 13 00:33:19.399382 kernel: AES CTR mode by8 optimization enabled Aug 13 00:33:19.398823 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Aug 13 00:33:19.604053 kernel: ahci 0000:00:17.0: version 3.0 Aug 13 00:33:19.604154 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Aug 13 00:33:19.604220 kernel: ahci 0000:00:17.0: AHCI vers 0001.0301, 32 command slots, 6 Gbps, SATA mode Aug 13 00:33:19.604282 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 1 Aug 13 00:33:19.604342 kernel: ahci 0000:00:17.0: 7/7 ports implemented (port mask 0x7f) Aug 13 00:33:19.604403 kernel: xhci_hcd 0000:00:14.0: hcc params 0x200077c1 hci version 0x110 quirks 0x0000000000009810 Aug 13 00:33:19.604464 kernel: ahci 0000:00:17.0: flags: 64bit ncq sntf clo only pio slum part ems deso sadm sds apst Aug 13 00:33:19.604524 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Aug 13 00:33:19.604582 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 2 Aug 13 00:33:19.604641 kernel: scsi host0: ahci Aug 13 00:33:19.604704 kernel: xhci_hcd 0000:00:14.0: Host supports USB 3.1 Enhanced SuperSpeed Aug 13 00:33:19.604764 kernel: scsi host1: ahci Aug 13 00:33:19.604829 kernel: hub 1-0:1.0: USB hub found Aug 13 00:33:19.604903 kernel: scsi host2: ahci Aug 13 00:33:19.604965 kernel: hub 1-0:1.0: 16 ports detected Aug 13 00:33:19.605029 kernel: scsi host3: ahci Aug 13 00:33:19.605114 kernel: hub 2-0:1.0: USB hub found Aug 13 00:33:19.605188 kernel: scsi host4: ahci Aug 13 00:33:19.605247 kernel: hub 2-0:1.0: 10 ports detected Aug 13 00:33:19.605312 kernel: scsi host5: ahci Aug 13 00:33:19.605371 kernel: scsi host6: ahci Aug 13 00:33:19.605429 kernel: ata1: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516100 irq 127 lpm-pol 0 Aug 13 00:33:19.605438 kernel: ata2: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516180 irq 127 lpm-pol 0 Aug 13 00:33:19.605445 kernel: ata3: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516200 irq 127 lpm-pol 0 Aug 13 00:33:19.605452 kernel: ata4: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516280 irq 127 lpm-pol 0 Aug 13 00:33:19.605458 kernel: ata5: 
SATA max UDMA/133 abar m2048@0x95516000 port 0x95516300 irq 127 lpm-pol 0 Aug 13 00:33:19.605465 kernel: ata6: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516380 irq 127 lpm-pol 0 Aug 13 00:33:19.605474 kernel: ata7: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516400 irq 127 lpm-pol 0 Aug 13 00:33:19.605481 kernel: igb: Intel(R) Gigabit Ethernet Network Driver Aug 13 00:33:19.605487 kernel: igb: Copyright (c) 2007-2014 Intel Corporation. Aug 13 00:33:19.398994 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Aug 13 00:33:19.399396 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Aug 13 00:33:19.584400 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Aug 13 00:33:19.604343 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Aug 13 00:33:19.646284 kernel: igb 0000:03:00.0: added PHC on eth0 Aug 13 00:33:19.646642 kernel: igb 0000:03:00.0: Intel(R) Gigabit Ethernet Network Connection Aug 13 00:33:19.646880 kernel: igb 0000:03:00.0: eth0: (PCIe:2.5Gb/s:Width x1) 3c:ec:ef:70:d2:1e Aug 13 00:33:19.647074 kernel: igb 0000:03:00.0: eth0: PBA No: 010000-000 Aug 13 00:33:19.660836 kernel: igb 0000:03:00.0: Using MSI-X interrupts. 4 rx queue(s), 4 tx queue(s) Aug 13 00:33:19.688724 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Aug 13 00:33:19.744676 kernel: igb 0000:04:00.0: added PHC on eth1 Aug 13 00:33:19.744761 kernel: igb 0000:04:00.0: Intel(R) Gigabit Ethernet Network Connection Aug 13 00:33:19.744879 kernel: igb 0000:04:00.0: eth1: (PCIe:2.5Gb/s:Width x1) 3c:ec:ef:70:d2:1f Aug 13 00:33:19.744942 kernel: igb 0000:04:00.0: eth1: PBA No: 010000-000 Aug 13 00:33:19.745004 kernel: igb 0000:04:00.0: Using MSI-X interrupts. 
4 rx queue(s), 4 tx queue(s) Aug 13 00:33:19.745064 kernel: usb 1-14: new high-speed USB device number 2 using xhci_hcd Aug 13 00:33:19.859848 kernel: ata4: SATA link down (SStatus 0 SControl 300) Aug 13 00:33:19.859996 kernel: ata1: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Aug 13 00:33:19.866819 kernel: ata5: SATA link down (SStatus 0 SControl 300) Aug 13 00:33:19.872840 kernel: ata3: SATA link down (SStatus 0 SControl 300) Aug 13 00:33:19.877806 kernel: ata6: SATA link down (SStatus 0 SControl 300) Aug 13 00:33:19.883808 kernel: hub 1-14:1.0: USB hub found Aug 13 00:33:19.884355 kernel: ata7: SATA link down (SStatus 0 SControl 300) Aug 13 00:33:19.892361 kernel: hub 1-14:1.0: 4 ports detected Aug 13 00:33:19.892614 kernel: ata2: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Aug 13 00:33:19.904797 kernel: ata1.00: Model 'Micron_5300_MTFDDAK480TDT', rev ' D3MU001', applying quirks: zeroaftertrim Aug 13 00:33:19.920898 kernel: ata1.00: ATA-11: Micron_5300_MTFDDAK480TDT, D3MU001, max UDMA/133 Aug 13 00:33:19.921836 kernel: ata2.00: Model 'Micron_5300_MTFDDAK480TDT', rev ' D3MU001', applying quirks: zeroaftertrim Aug 13 00:33:19.938076 kernel: ata2.00: ATA-11: Micron_5300_MTFDDAK480TDT, D3MU001, max UDMA/133 Aug 13 00:33:19.949824 kernel: ata1.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Aug 13 00:33:19.949842 kernel: ata2.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Aug 13 00:33:19.967992 kernel: ata1.00: Features: NCQ-prio Aug 13 00:33:19.972816 kernel: ata2.00: Features: NCQ-prio Aug 13 00:33:19.989861 kernel: ata1.00: configured for UDMA/133 Aug 13 00:33:19.989908 kernel: scsi 0:0:0:0: Direct-Access ATA Micron_5300_MTFD U001 PQ: 0 ANSI: 5 Aug 13 00:33:19.997848 kernel: ata2.00: configured for UDMA/133 Aug 13 00:33:20.002842 kernel: scsi 1:0:0:0: Direct-Access ATA Micron_5300_MTFD U001 PQ: 0 ANSI: 5 Aug 13 00:33:20.017790 kernel: igb 0000:04:00.0 eno2: renamed from eth1 Aug 13 00:33:20.017892 kernel: igb 0000:03:00.0 eno1: 
renamed from eth0 Aug 13 00:33:20.023833 kernel: ata1.00: Enabling discard_zeroes_data Aug 13 00:33:20.028213 kernel: ata2.00: Enabling discard_zeroes_data Aug 13 00:33:20.028233 kernel: sd 0:0:0:0: [sdb] 937703088 512-byte logical blocks: (480 GB/447 GiB) Aug 13 00:33:20.032950 kernel: sd 1:0:0:0: [sda] 937703088 512-byte logical blocks: (480 GB/447 GiB) Aug 13 00:33:20.047907 kernel: sd 0:0:0:0: [sdb] 4096-byte physical blocks Aug 13 00:33:20.048027 kernel: sd 1:0:0:0: [sda] 4096-byte physical blocks Aug 13 00:33:20.053134 kernel: sd 0:0:0:0: [sdb] Write Protect is off Aug 13 00:33:20.058365 kernel: sd 1:0:0:0: [sda] Write Protect is off Aug 13 00:33:20.063159 kernel: sd 0:0:0:0: [sdb] Mode Sense: 00 3a 00 00 Aug 13 00:33:20.067953 kernel: sd 1:0:0:0: [sda] Mode Sense: 00 3a 00 00 Aug 13 00:33:20.068092 kernel: sd 1:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Aug 13 00:33:20.077706 kernel: sd 0:0:0:0: [sdb] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Aug 13 00:33:20.077860 kernel: sd 1:0:0:0: [sda] Preferred minimum I/O size 4096 bytes Aug 13 00:33:20.077939 kernel: sd 0:0:0:0: [sdb] Preferred minimum I/O size 4096 bytes Aug 13 00:33:20.107050 kernel: ata2.00: Enabling discard_zeroes_data Aug 13 00:33:20.112372 kernel: ata1.00: Enabling discard_zeroes_data Aug 13 00:33:20.184838 kernel: usb 1-14.1: new low-speed USB device number 3 using xhci_hcd Aug 13 00:33:20.242805 kernel: sd 0:0:0:0: [sdb] Attached SCSI disk Aug 13 00:33:20.242900 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Aug 13 00:33:20.254396 kernel: GPT:9289727 != 937703087 Aug 13 00:33:20.260730 kernel: GPT:Alternate GPT header not at the end of the disk. Aug 13 00:33:20.264662 kernel: GPT:9289727 != 937703087 Aug 13 00:33:20.270069 kernel: GPT: Use GNU Parted to correct GPT errors. 
Aug 13 00:33:20.275437 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Aug 13 00:33:20.281791 kernel: sd 1:0:0:0: [sda] Attached SCSI disk Aug 13 00:33:20.288838 kernel: hid: raw HID events driver (C) Jiri Kosina Aug 13 00:33:20.288871 kernel: mlx5_core 0000:01:00.0: PTM is not supported by PCIe Aug 13 00:33:20.300428 kernel: mlx5_core 0000:01:00.0: firmware version: 14.28.2006 Aug 13 00:33:20.309507 kernel: mlx5_core 0000:01:00.0: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Aug 13 00:33:20.313771 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Micron_5300_MTFDDAK480TDT EFI-SYSTEM. Aug 13 00:33:20.353907 kernel: usbcore: registered new interface driver usbhid Aug 13 00:33:20.353920 kernel: usbhid: USB HID core driver Aug 13 00:33:20.353927 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.0/0003:0557:2419.0001/input/input0 Aug 13 00:33:20.333094 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Micron_5300_MTFDDAK480TDT ROOT. Aug 13 00:33:20.375056 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Micron_5300_MTFDDAK480TDT OEM. Aug 13 00:33:20.429903 kernel: hid-generic 0003:0557:2419.0001: input,hidraw0: USB HID v1.00 Keyboard [HID 0557:2419] on usb-0000:00:14.0-14.1/input0 Aug 13 00:33:20.430069 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.1/0003:0557:2419.0002/input/input1 Aug 13 00:33:20.430078 kernel: hid-generic 0003:0557:2419.0002: input,hidraw1: USB HID v1.00 Mouse [HID 0557:2419] on usb-0000:00:14.0-14.1/input1 Aug 13 00:33:20.390342 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Micron_5300_MTFDDAK480TDT USR-A. Aug 13 00:33:20.439921 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Micron_5300_MTFDDAK480TDT USR-A. Aug 13 00:33:20.440439 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... 
Aug 13 00:33:20.496430 disk-uuid[754]: Primary Header is updated. Aug 13 00:33:20.496430 disk-uuid[754]: Secondary Entries is updated. Aug 13 00:33:20.496430 disk-uuid[754]: Secondary Header is updated. Aug 13 00:33:20.523813 kernel: ata2.00: Enabling discard_zeroes_data Aug 13 00:33:20.523825 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Aug 13 00:33:20.575818 kernel: mlx5_core 0000:01:00.0: E-Switch: Total vports 10, per vport: max uc(1024) max mc(16384) Aug 13 00:33:20.585600 kernel: mlx5_core 0000:01:00.0: Port module event: module 0, Cable plugged Aug 13 00:33:20.810924 kernel: mlx5_core 0000:01:00.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic) Aug 13 00:33:20.825256 kernel: mlx5_core 0000:01:00.1: PTM is not supported by PCIe Aug 13 00:33:20.825688 kernel: mlx5_core 0000:01:00.1: firmware version: 14.28.2006 Aug 13 00:33:20.825911 kernel: mlx5_core 0000:01:00.1: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Aug 13 00:33:21.119789 kernel: mlx5_core 0000:01:00.1: E-Switch: Total vports 10, per vport: max uc(1024) max mc(16384) Aug 13 00:33:21.132187 kernel: mlx5_core 0000:01:00.1: Port module event: module 1, Cable plugged Aug 13 00:33:21.396853 kernel: mlx5_core 0000:01:00.1: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic) Aug 13 00:33:21.409797 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: renamed from eth1 Aug 13 00:33:21.410082 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: renamed from eth0 Aug 13 00:33:21.423175 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Aug 13 00:33:21.432358 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Aug 13 00:33:21.460975 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Aug 13 00:33:21.470942 systemd[1]: Reached target remote-fs.target - Remote File Systems. Aug 13 00:33:21.489153 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... 
Aug 13 00:33:21.527032 kernel: ata2.00: Enabling discard_zeroes_data Aug 13 00:33:21.538221 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Aug 13 00:33:21.558877 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Aug 13 00:33:21.558891 disk-uuid[755]: The operation has completed successfully. Aug 13 00:33:21.579830 systemd[1]: disk-uuid.service: Deactivated successfully. Aug 13 00:33:21.579926 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Aug 13 00:33:21.621790 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Aug 13 00:33:21.656699 sh[807]: Success Aug 13 00:33:21.681838 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Aug 13 00:33:21.681861 kernel: device-mapper: uevent: version 1.0.3 Aug 13 00:33:21.695685 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Aug 13 00:33:21.707823 kernel: device-mapper: verity: sha256 using shash "sha256-avx2" Aug 13 00:33:21.745307 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Aug 13 00:33:21.754937 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Aug 13 00:33:21.784994 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Aug 13 00:33:21.846865 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' Aug 13 00:33:21.846878 kernel: BTRFS: device fsid 0c0338fb-9434-41c1-99a2-737cbe2351c4 devid 1 transid 44 /dev/mapper/usr (254:0) scanned by mount (819) Aug 13 00:33:21.846885 kernel: BTRFS info (device dm-0): first mount of filesystem 0c0338fb-9434-41c1-99a2-737cbe2351c4 Aug 13 00:33:21.846892 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Aug 13 00:33:21.846899 kernel: BTRFS info (device dm-0): using free-space-tree Aug 13 00:33:21.852992 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. 
Aug 13 00:33:21.860146 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Aug 13 00:33:21.884022 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Aug 13 00:33:21.884525 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Aug 13 00:33:21.921138 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Aug 13 00:33:21.961684 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (842) Aug 13 00:33:21.961704 kernel: BTRFS info (device sda6): first mount of filesystem 900bf3f4-cc50-4925-b275-d85854bb916f Aug 13 00:33:21.969825 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Aug 13 00:33:21.975763 kernel: BTRFS info (device sda6): using free-space-tree Aug 13 00:33:21.991792 kernel: BTRFS info (device sda6): last unmount of filesystem 900bf3f4-cc50-4925-b275-d85854bb916f Aug 13 00:33:21.992207 systemd[1]: Finished ignition-setup.service - Ignition (setup). Aug 13 00:33:21.993010 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Aug 13 00:33:22.008037 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Aug 13 00:33:22.038550 systemd[1]: Starting systemd-networkd.service - Network Configuration... Aug 13 00:33:22.085853 systemd-networkd[989]: lo: Link UP Aug 13 00:33:22.085856 systemd-networkd[989]: lo: Gained carrier Aug 13 00:33:22.089196 systemd-networkd[989]: Enumeration completed Aug 13 00:33:22.089276 systemd[1]: Started systemd-networkd.service - Network Configuration. Aug 13 00:33:22.090734 systemd-networkd[989]: eno1: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 13 00:33:22.126621 ignition[956]: Ignition 2.21.0 Aug 13 00:33:22.092963 systemd[1]: Reached target network.target - Network. 
Aug 13 00:33:22.126625 ignition[956]: Stage: fetch-offline Aug 13 00:33:22.117470 systemd-networkd[989]: eno2: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 13 00:33:22.126643 ignition[956]: no configs at "/usr/lib/ignition/base.d" Aug 13 00:33:22.128637 unknown[956]: fetched base config from "system" Aug 13 00:33:22.126648 ignition[956]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Aug 13 00:33:22.128641 unknown[956]: fetched user config from "system" Aug 13 00:33:22.126695 ignition[956]: parsed url from cmdline: "" Aug 13 00:33:22.129694 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Aug 13 00:33:22.126697 ignition[956]: no config URL provided Aug 13 00:33:22.145910 systemd-networkd[989]: enp1s0f0np0: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 13 00:33:22.126700 ignition[956]: reading system config file "/usr/lib/ignition/user.ign" Aug 13 00:33:22.149114 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Aug 13 00:33:22.126724 ignition[956]: parsing config with SHA512: 00f56e737eb183a13fb8200cf0365167887a36846649cead36c9526af1daf9f67e3e16dab35a9699a25a0c0ca68300a6c1f3a09e820390373c94d4a82bac66f6 Aug 13 00:33:22.149596 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Aug 13 00:33:22.128849 ignition[956]: fetch-offline: fetch-offline passed Aug 13 00:33:22.128851 ignition[956]: POST message to Packet Timeline Aug 13 00:33:22.128854 ignition[956]: POST Status error: resource requires networking Aug 13 00:33:22.128884 ignition[956]: Ignition finished successfully Aug 13 00:33:22.307032 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: Link up Aug 13 00:33:22.306603 systemd-networkd[989]: enp1s0f1np1: Configuring with /usr/lib/systemd/network/zz-default.network. 
Aug 13 00:33:22.179367 ignition[1008]: Ignition 2.21.0 Aug 13 00:33:22.179372 ignition[1008]: Stage: kargs Aug 13 00:33:22.179456 ignition[1008]: no configs at "/usr/lib/ignition/base.d" Aug 13 00:33:22.179461 ignition[1008]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Aug 13 00:33:22.180445 ignition[1008]: kargs: kargs passed Aug 13 00:33:22.180449 ignition[1008]: POST message to Packet Timeline Aug 13 00:33:22.180464 ignition[1008]: GET https://metadata.packet.net/metadata: attempt #1 Aug 13 00:33:22.180825 ignition[1008]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:37062->[::1]:53: read: connection refused Aug 13 00:33:22.381101 ignition[1008]: GET https://metadata.packet.net/metadata: attempt #2 Aug 13 00:33:22.381371 ignition[1008]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:47398->[::1]:53: read: connection refused Aug 13 00:33:22.477879 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Aug 13 00:33:22.479769 systemd-networkd[989]: eno1: Link UP Aug 13 00:33:22.480229 systemd-networkd[989]: eno2: Link UP Aug 13 00:33:22.480627 systemd-networkd[989]: enp1s0f0np0: Link UP Aug 13 00:33:22.481106 systemd-networkd[989]: enp1s0f0np0: Gained carrier Aug 13 00:33:22.497343 systemd-networkd[989]: enp1s0f1np1: Link UP Aug 13 00:33:22.498771 systemd-networkd[989]: enp1s0f1np1: Gained carrier Aug 13 00:33:22.538154 systemd-networkd[989]: enp1s0f0np0: DHCPv4 address 147.75.71.77/31, gateway 147.75.71.76 acquired from 145.40.83.140 Aug 13 00:33:22.781723 ignition[1008]: GET https://metadata.packet.net/metadata: attempt #3 Aug 13 00:33:22.783075 ignition[1008]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:46475->[::1]:53: read: connection refused Aug 13 00:33:23.495407 systemd-networkd[989]: enp1s0f0np0: Gained IPv6LL Aug 13 
00:33:23.583398 ignition[1008]: GET https://metadata.packet.net/metadata: attempt #4 Aug 13 00:33:23.584682 ignition[1008]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:46344->[::1]:53: read: connection refused Aug 13 00:33:23.943436 systemd-networkd[989]: enp1s0f1np1: Gained IPv6LL Aug 13 00:33:25.186171 ignition[1008]: GET https://metadata.packet.net/metadata: attempt #5 Aug 13 00:33:25.187442 ignition[1008]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:48712->[::1]:53: read: connection refused Aug 13 00:33:28.390912 ignition[1008]: GET https://metadata.packet.net/metadata: attempt #6 Aug 13 00:33:29.543915 ignition[1008]: GET result: OK Aug 13 00:33:30.094728 ignition[1008]: Ignition finished successfully Aug 13 00:33:30.100980 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Aug 13 00:33:30.111771 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Aug 13 00:33:30.158937 ignition[1024]: Ignition 2.21.0 Aug 13 00:33:30.158943 ignition[1024]: Stage: disks Aug 13 00:33:30.159028 ignition[1024]: no configs at "/usr/lib/ignition/base.d" Aug 13 00:33:30.159034 ignition[1024]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Aug 13 00:33:30.160079 ignition[1024]: disks: disks passed Aug 13 00:33:30.160082 ignition[1024]: POST message to Packet Timeline Aug 13 00:33:30.160094 ignition[1024]: GET https://metadata.packet.net/metadata: attempt #1 Aug 13 00:33:31.299253 ignition[1024]: GET result: OK Aug 13 00:33:31.960411 ignition[1024]: Ignition finished successfully Aug 13 00:33:31.965352 systemd[1]: Finished ignition-disks.service - Ignition (disks). Aug 13 00:33:31.977045 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Aug 13 00:33:31.995175 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. 
Aug 13 00:33:32.014236 systemd[1]: Reached target local-fs.target - Local File Systems. Aug 13 00:33:32.033237 systemd[1]: Reached target sysinit.target - System Initialization. Aug 13 00:33:32.051215 systemd[1]: Reached target basic.target - Basic System. Aug 13 00:33:32.071145 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Aug 13 00:33:32.118443 systemd-fsck[1042]: ROOT: clean, 15/553520 files, 52789/553472 blocks Aug 13 00:33:32.127294 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Aug 13 00:33:32.128132 systemd[1]: Mounting sysroot.mount - /sysroot... Aug 13 00:33:32.246582 systemd[1]: Mounted sysroot.mount - /sysroot. Aug 13 00:33:32.260074 kernel: EXT4-fs (sda9): mounted filesystem 069caac6-7833-4acd-8940-01a7ff7d1281 r/w with ordered data mode. Quota mode: none. Aug 13 00:33:32.246889 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Aug 13 00:33:32.271997 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Aug 13 00:33:32.280509 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Aug 13 00:33:32.301404 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Aug 13 00:33:32.319790 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/sda6 (8:6) scanned by mount (1051) Aug 13 00:33:32.339239 kernel: BTRFS info (device sda6): first mount of filesystem 900bf3f4-cc50-4925-b275-d85854bb916f Aug 13 00:33:32.339399 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Aug 13 00:33:32.346253 kernel: BTRFS info (device sda6): using free-space-tree Aug 13 00:33:32.358334 systemd[1]: Starting flatcar-static-network.service - Flatcar Static Network Agent... Aug 13 00:33:32.368951 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). 
Aug 13 00:33:32.368971 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Aug 13 00:33:32.382340 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Aug 13 00:33:32.437045 coreos-metadata[1069]: Aug 13 00:33:32.427 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Aug 13 00:33:32.412982 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Aug 13 00:33:32.466837 coreos-metadata[1053]: Aug 13 00:33:32.427 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Aug 13 00:33:32.430747 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Aug 13 00:33:32.487933 initrd-setup-root[1083]: cut: /sysroot/etc/passwd: No such file or directory Aug 13 00:33:32.497910 initrd-setup-root[1090]: cut: /sysroot/etc/group: No such file or directory Aug 13 00:33:32.507915 initrd-setup-root[1097]: cut: /sysroot/etc/shadow: No such file or directory Aug 13 00:33:32.516918 initrd-setup-root[1104]: cut: /sysroot/etc/gshadow: No such file or directory Aug 13 00:33:32.542492 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Aug 13 00:33:32.552709 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Aug 13 00:33:32.561592 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Aug 13 00:33:32.595657 systemd[1]: sysroot-oem.mount: Deactivated successfully. Aug 13 00:33:32.613063 kernel: BTRFS info (device sda6): last unmount of filesystem 900bf3f4-cc50-4925-b275-d85854bb916f Aug 13 00:33:32.610984 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Aug 13 00:33:32.628954 ignition[1173]: INFO : Ignition 2.21.0 Aug 13 00:33:32.628954 ignition[1173]: INFO : Stage: mount Aug 13 00:33:32.628954 ignition[1173]: INFO : no configs at "/usr/lib/ignition/base.d" Aug 13 00:33:32.628954 ignition[1173]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Aug 13 00:33:32.628954 ignition[1173]: INFO : mount: mount passed Aug 13 00:33:32.628954 ignition[1173]: INFO : POST message to Packet Timeline Aug 13 00:33:32.628954 ignition[1173]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Aug 13 00:33:33.416224 coreos-metadata[1069]: Aug 13 00:33:33.416 INFO Fetch successful Aug 13 00:33:33.496101 systemd[1]: flatcar-static-network.service: Deactivated successfully. Aug 13 00:33:33.496168 systemd[1]: Finished flatcar-static-network.service - Flatcar Static Network Agent. Aug 13 00:33:33.569040 ignition[1173]: INFO : GET result: OK Aug 13 00:33:33.712198 coreos-metadata[1053]: Aug 13 00:33:33.712 INFO Fetch successful Aug 13 00:33:33.788206 coreos-metadata[1053]: Aug 13 00:33:33.788 INFO wrote hostname ci-4372.1.0-a-083aa5303b to /sysroot/etc/hostname Aug 13 00:33:33.789643 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Aug 13 00:33:34.005972 ignition[1173]: INFO : Ignition finished successfully Aug 13 00:33:34.006917 systemd[1]: Finished ignition-mount.service - Ignition (mount). Aug 13 00:33:34.023976 systemd[1]: Starting ignition-files.service - Ignition (files)... Aug 13 00:33:34.069194 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... 
Aug 13 00:33:34.116704 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/sda6 (8:6) scanned by mount (1198) Aug 13 00:33:34.116722 kernel: BTRFS info (device sda6): first mount of filesystem 900bf3f4-cc50-4925-b275-d85854bb916f Aug 13 00:33:34.124796 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Aug 13 00:33:34.130732 kernel: BTRFS info (device sda6): using free-space-tree Aug 13 00:33:34.135376 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Aug 13 00:33:34.168892 ignition[1215]: INFO : Ignition 2.21.0 Aug 13 00:33:34.168892 ignition[1215]: INFO : Stage: files Aug 13 00:33:34.181044 ignition[1215]: INFO : no configs at "/usr/lib/ignition/base.d" Aug 13 00:33:34.181044 ignition[1215]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Aug 13 00:33:34.181044 ignition[1215]: DEBUG : files: compiled without relabeling support, skipping Aug 13 00:33:34.181044 ignition[1215]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Aug 13 00:33:34.181044 ignition[1215]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Aug 13 00:33:34.181044 ignition[1215]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Aug 13 00:33:34.181044 ignition[1215]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Aug 13 00:33:34.181044 ignition[1215]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Aug 13 00:33:34.181044 ignition[1215]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Aug 13 00:33:34.181044 ignition[1215]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Aug 13 00:33:34.172457 unknown[1215]: wrote ssh authorized keys file for user: core Aug 13 00:33:34.305044 ignition[1215]: INFO : files: createFilesystemsFiles: createFiles: 
op(3): GET result: OK Aug 13 00:33:34.400119 ignition[1215]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Aug 13 00:33:34.400119 ignition[1215]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Aug 13 00:33:34.431132 ignition[1215]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Aug 13 00:33:34.431132 ignition[1215]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Aug 13 00:33:34.431132 ignition[1215]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Aug 13 00:33:34.431132 ignition[1215]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Aug 13 00:33:34.431132 ignition[1215]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Aug 13 00:33:34.431132 ignition[1215]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Aug 13 00:33:34.431132 ignition[1215]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Aug 13 00:33:34.431132 ignition[1215]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Aug 13 00:33:34.431132 ignition[1215]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Aug 13 00:33:34.431132 ignition[1215]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Aug 13 00:33:34.431132 ignition[1215]: INFO : 
files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Aug 13 00:33:34.431132 ignition[1215]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Aug 13 00:33:34.431132 ignition[1215]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw: attempt #1
Aug 13 00:33:35.012836 ignition[1215]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Aug 13 00:33:35.643423 ignition[1215]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Aug 13 00:33:35.643423 ignition[1215]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Aug 13 00:33:35.672044 ignition[1215]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Aug 13 00:33:35.672044 ignition[1215]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Aug 13 00:33:35.672044 ignition[1215]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Aug 13 00:33:35.672044 ignition[1215]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Aug 13 00:33:35.672044 ignition[1215]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Aug 13 00:33:35.672044 ignition[1215]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Aug 13 00:33:35.672044 ignition[1215]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Aug 13 00:33:35.672044 ignition[1215]: INFO : files: files passed
Aug 13 00:33:35.672044 ignition[1215]: INFO : POST message to Packet Timeline
Aug 13 00:33:35.672044 ignition[1215]: INFO : GET https://metadata.packet.net/metadata: attempt #1
Aug 13 00:33:36.735201 ignition[1215]: INFO : GET result: OK
Aug 13 00:33:37.233256 ignition[1215]: INFO : Ignition finished successfully
Aug 13 00:33:37.237553 systemd[1]: Finished ignition-files.service - Ignition (files).
Aug 13 00:33:37.253276 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Aug 13 00:33:37.259342 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Aug 13 00:33:37.288217 systemd[1]: ignition-quench.service: Deactivated successfully.
Aug 13 00:33:37.288295 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Aug 13 00:33:37.321003 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Aug 13 00:33:37.336382 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Aug 13 00:33:37.357068 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Aug 13 00:33:37.386088 initrd-setup-root-after-ignition[1257]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Aug 13 00:33:37.386088 initrd-setup-root-after-ignition[1257]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Aug 13 00:33:37.412057 initrd-setup-root-after-ignition[1261]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Aug 13 00:33:37.472886 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Aug 13 00:33:37.472941 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Aug 13 00:33:37.490145 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Aug 13 00:33:37.508990 systemd[1]: Reached target initrd.target - Initrd Default Target.
Aug 13 00:33:37.527139 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Aug 13 00:33:37.528940 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Aug 13 00:33:37.609164 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Aug 13 00:33:37.622858 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Aug 13 00:33:37.668650 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Aug 13 00:33:37.679072 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Aug 13 00:33:37.699488 systemd[1]: Stopped target timers.target - Timer Units.
Aug 13 00:33:37.716560 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Aug 13 00:33:37.716994 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Aug 13 00:33:37.753243 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Aug 13 00:33:37.763673 systemd[1]: Stopped target basic.target - Basic System.
Aug 13 00:33:37.781432 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Aug 13 00:33:37.799364 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Aug 13 00:33:37.819386 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Aug 13 00:33:37.839363 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Aug 13 00:33:37.859386 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Aug 13 00:33:37.878379 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Aug 13 00:33:37.898430 systemd[1]: Stopped target sysinit.target - System Initialization.
Aug 13 00:33:37.918406 systemd[1]: Stopped target local-fs.target - Local File Systems.
Aug 13 00:33:37.936391 systemd[1]: Stopped target swap.target - Swaps.
Aug 13 00:33:37.953312 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Aug 13 00:33:37.953706 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Aug 13 00:33:37.977416 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Aug 13 00:33:37.995527 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Aug 13 00:33:38.015382 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Aug 13 00:33:38.015827 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Aug 13 00:33:38.037278 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Aug 13 00:33:38.037672 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Aug 13 00:33:38.066396 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Aug 13 00:33:38.066858 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Aug 13 00:33:38.085574 systemd[1]: Stopped target paths.target - Path Units.
Aug 13 00:33:38.101240 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Aug 13 00:33:38.101695 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Aug 13 00:33:38.121382 systemd[1]: Stopped target slices.target - Slice Units.
Aug 13 00:33:38.139509 systemd[1]: Stopped target sockets.target - Socket Units.
Aug 13 00:33:38.156360 systemd[1]: iscsid.socket: Deactivated successfully.
Aug 13 00:33:38.156655 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Aug 13 00:33:38.174412 systemd[1]: iscsiuio.socket: Deactivated successfully.
Aug 13 00:33:38.174695 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Aug 13 00:33:38.196520 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Aug 13 00:33:38.196947 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Aug 13 00:33:38.214474 systemd[1]: ignition-files.service: Deactivated successfully.
Aug 13 00:33:38.214894 systemd[1]: Stopped ignition-files.service - Ignition (files).
Aug 13 00:33:38.338035 ignition[1281]: INFO : Ignition 2.21.0
Aug 13 00:33:38.338035 ignition[1281]: INFO : Stage: umount
Aug 13 00:33:38.338035 ignition[1281]: INFO : no configs at "/usr/lib/ignition/base.d"
Aug 13 00:33:38.338035 ignition[1281]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet"
Aug 13 00:33:38.338035 ignition[1281]: INFO : umount: umount passed
Aug 13 00:33:38.338035 ignition[1281]: INFO : POST message to Packet Timeline
Aug 13 00:33:38.338035 ignition[1281]: INFO : GET https://metadata.packet.net/metadata: attempt #1
Aug 13 00:33:38.230358 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Aug 13 00:33:38.230712 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Aug 13 00:33:38.249969 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Aug 13 00:33:38.262980 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Aug 13 00:33:38.263051 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Aug 13 00:33:38.270944 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Aug 13 00:33:38.302943 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Aug 13 00:33:38.303132 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Aug 13 00:33:38.331178 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Aug 13 00:33:38.331248 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Aug 13 00:33:38.372710 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Aug 13 00:33:38.373644 systemd[1]: sysroot-boot.service: Deactivated successfully.
Aug 13 00:33:38.373733 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Aug 13 00:33:38.400131 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Aug 13 00:33:38.400379 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Aug 13 00:33:39.569175 ignition[1281]: INFO : GET result: OK
Aug 13 00:33:40.508628 ignition[1281]: INFO : Ignition finished successfully
Aug 13 00:33:40.512266 systemd[1]: ignition-mount.service: Deactivated successfully.
Aug 13 00:33:40.512555 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Aug 13 00:33:40.527069 systemd[1]: Stopped target network.target - Network.
Aug 13 00:33:40.541055 systemd[1]: ignition-disks.service: Deactivated successfully.
Aug 13 00:33:40.541325 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Aug 13 00:33:40.559210 systemd[1]: ignition-kargs.service: Deactivated successfully.
Aug 13 00:33:40.559360 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Aug 13 00:33:40.575275 systemd[1]: ignition-setup.service: Deactivated successfully.
Aug 13 00:33:40.575451 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Aug 13 00:33:40.591271 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Aug 13 00:33:40.591443 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Aug 13 00:33:40.609267 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Aug 13 00:33:40.609451 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Aug 13 00:33:40.625600 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Aug 13 00:33:40.643387 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Aug 13 00:33:40.660019 systemd[1]: systemd-resolved.service: Deactivated successfully.
Aug 13 00:33:40.660301 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Aug 13 00:33:40.682871 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Aug 13 00:33:40.683473 systemd[1]: systemd-networkd.service: Deactivated successfully.
Aug 13 00:33:40.683759 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Aug 13 00:33:40.699778 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Aug 13 00:33:40.701371 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Aug 13 00:33:40.714020 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Aug 13 00:33:40.714046 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Aug 13 00:33:40.740756 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Aug 13 00:33:40.755955 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Aug 13 00:33:40.756043 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Aug 13 00:33:40.781218 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Aug 13 00:33:40.781382 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Aug 13 00:33:40.799579 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Aug 13 00:33:40.799740 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Aug 13 00:33:40.819292 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Aug 13 00:33:40.819472 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Aug 13 00:33:40.837813 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Aug 13 00:33:40.859529 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Aug 13 00:33:40.859733 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Aug 13 00:33:40.860811 systemd[1]: systemd-udevd.service: Deactivated successfully.
Aug 13 00:33:40.861180 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Aug 13 00:33:40.879696 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Aug 13 00:33:40.879850 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Aug 13 00:33:40.894989 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Aug 13 00:33:40.895008 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Aug 13 00:33:40.910960 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Aug 13 00:33:40.910999 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Aug 13 00:33:40.936306 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Aug 13 00:33:40.936403 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Aug 13 00:33:40.973993 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Aug 13 00:33:40.974177 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Aug 13 00:33:41.013204 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Aug 13 00:33:41.037862 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Aug 13 00:33:41.272936 systemd-journald[296]: Received SIGTERM from PID 1 (systemd).
Aug 13 00:33:41.037895 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Aug 13 00:33:41.038123 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Aug 13 00:33:41.038153 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Aug 13 00:33:41.069090 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Aug 13 00:33:41.069141 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Aug 13 00:33:41.094240 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Aug 13 00:33:41.094383 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Aug 13 00:33:41.094493 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Aug 13 00:33:41.095543 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Aug 13 00:33:41.095753 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Aug 13 00:33:41.128557 systemd[1]: network-cleanup.service: Deactivated successfully.
Aug 13 00:33:41.128870 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Aug 13 00:33:41.140737 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Aug 13 00:33:41.160267 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Aug 13 00:33:41.213505 systemd[1]: Switching root.
Aug 13 00:33:41.414904 systemd-journald[296]: Journal stopped
Aug 13 00:33:43.109459 kernel: SELinux: policy capability network_peer_controls=1
Aug 13 00:33:43.109475 kernel: SELinux: policy capability open_perms=1
Aug 13 00:33:43.109483 kernel: SELinux: policy capability extended_socket_class=1
Aug 13 00:33:43.109488 kernel: SELinux: policy capability always_check_network=0
Aug 13 00:33:43.109493 kernel: SELinux: policy capability cgroup_seclabel=1
Aug 13 00:33:43.109498 kernel: SELinux: policy capability nnp_nosuid_transition=1
Aug 13 00:33:43.109504 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Aug 13 00:33:43.109509 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Aug 13 00:33:43.109514 kernel: SELinux: policy capability userspace_initial_context=0
Aug 13 00:33:43.109521 kernel: audit: type=1403 audit(1755045221.524:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Aug 13 00:33:43.109530 systemd[1]: Successfully loaded SELinux policy in 81.787ms.
Aug 13 00:33:43.109537 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 7.194ms.
Aug 13 00:33:43.109543 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Aug 13 00:33:43.109550 systemd[1]: Detected architecture x86-64.
Aug 13 00:33:43.109557 systemd[1]: Detected first boot.
Aug 13 00:33:43.109564 systemd[1]: Hostname set to .
Aug 13 00:33:43.109570 systemd[1]: Initializing machine ID from random generator.
Aug 13 00:33:43.109577 zram_generator::config[1335]: No configuration found.
Aug 13 00:33:43.109583 systemd[1]: Populated /etc with preset unit settings.
Aug 13 00:33:43.109590 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Aug 13 00:33:43.109597 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Aug 13 00:33:43.109604 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Aug 13 00:33:43.109610 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Aug 13 00:33:43.109616 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Aug 13 00:33:43.109623 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Aug 13 00:33:43.109629 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Aug 13 00:33:43.109635 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Aug 13 00:33:43.109643 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Aug 13 00:33:43.109650 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Aug 13 00:33:43.109656 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Aug 13 00:33:43.109662 systemd[1]: Created slice user.slice - User and Session Slice.
Aug 13 00:33:43.109669 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Aug 13 00:33:43.109676 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Aug 13 00:33:43.109682 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Aug 13 00:33:43.109688 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Aug 13 00:33:43.109695 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Aug 13 00:33:43.109703 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Aug 13 00:33:43.109709 systemd[1]: Expecting device dev-ttyS1.device - /dev/ttyS1...
Aug 13 00:33:43.109716 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Aug 13 00:33:43.109722 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Aug 13 00:33:43.109730 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Aug 13 00:33:43.109737 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Aug 13 00:33:43.109744 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Aug 13 00:33:43.109751 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Aug 13 00:33:43.109758 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Aug 13 00:33:43.109764 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Aug 13 00:33:43.109771 systemd[1]: Reached target slices.target - Slice Units.
Aug 13 00:33:43.109777 systemd[1]: Reached target swap.target - Swaps.
Aug 13 00:33:43.109787 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Aug 13 00:33:43.109794 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Aug 13 00:33:43.109800 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Aug 13 00:33:43.109808 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Aug 13 00:33:43.109816 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Aug 13 00:33:43.109823 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Aug 13 00:33:43.109830 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Aug 13 00:33:43.109837 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Aug 13 00:33:43.109844 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Aug 13 00:33:43.109851 systemd[1]: Mounting media.mount - External Media Directory...
Aug 13 00:33:43.109858 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 13 00:33:43.109865 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Aug 13 00:33:43.109871 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Aug 13 00:33:43.109878 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Aug 13 00:33:43.109885 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Aug 13 00:33:43.109891 systemd[1]: Reached target machines.target - Containers.
Aug 13 00:33:43.109899 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Aug 13 00:33:43.109906 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Aug 13 00:33:43.109913 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Aug 13 00:33:43.109920 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Aug 13 00:33:43.109927 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Aug 13 00:33:43.109933 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Aug 13 00:33:43.109940 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Aug 13 00:33:43.109946 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Aug 13 00:33:43.109953 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Aug 13 00:33:43.109961 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Aug 13 00:33:43.109968 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Aug 13 00:33:43.109974 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Aug 13 00:33:43.109981 kernel: ACPI: bus type drm_connector registered
Aug 13 00:33:43.109987 kernel: fuse: init (API version 7.41)
Aug 13 00:33:43.109993 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Aug 13 00:33:43.109999 systemd[1]: Stopped systemd-fsck-usr.service.
Aug 13 00:33:43.110006 kernel: loop: module loaded
Aug 13 00:33:43.110013 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Aug 13 00:33:43.110020 systemd[1]: Starting systemd-journald.service - Journal Service...
Aug 13 00:33:43.110027 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Aug 13 00:33:43.110034 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Aug 13 00:33:43.110050 systemd-journald[1439]: Collecting audit messages is disabled.
Aug 13 00:33:43.110066 systemd-journald[1439]: Journal started
Aug 13 00:33:43.110080 systemd-journald[1439]: Runtime Journal (/run/log/journal/9c0e93736c754b6294dea83ad261339d) is 8M, max 640.1M, 632.1M free.
Aug 13 00:33:41.956094 systemd[1]: Queued start job for default target multi-user.target.
Aug 13 00:33:41.965625 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Aug 13 00:33:41.965870 systemd[1]: systemd-journald.service: Deactivated successfully.
Aug 13 00:33:43.132827 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Aug 13 00:33:43.155859 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Aug 13 00:33:43.174875 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Aug 13 00:33:43.196022 systemd[1]: verity-setup.service: Deactivated successfully.
Aug 13 00:33:43.196045 systemd[1]: Stopped verity-setup.service.
Aug 13 00:33:43.220798 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 13 00:33:43.228827 systemd[1]: Started systemd-journald.service - Journal Service.
Aug 13 00:33:43.237250 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Aug 13 00:33:43.246080 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Aug 13 00:33:43.256080 systemd[1]: Mounted media.mount - External Media Directory.
Aug 13 00:33:43.265049 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Aug 13 00:33:43.274062 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Aug 13 00:33:43.283035 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Aug 13 00:33:43.292138 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Aug 13 00:33:43.302167 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Aug 13 00:33:43.312187 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Aug 13 00:33:43.312359 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Aug 13 00:33:43.322282 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Aug 13 00:33:43.322513 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Aug 13 00:33:43.331647 systemd[1]: modprobe@drm.service: Deactivated successfully.
Aug 13 00:33:43.332024 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Aug 13 00:33:43.341748 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Aug 13 00:33:43.342254 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Aug 13 00:33:43.352736 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Aug 13 00:33:43.353261 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Aug 13 00:33:43.363717 systemd[1]: modprobe@loop.service: Deactivated successfully.
Aug 13 00:33:43.364216 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Aug 13 00:33:43.374846 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Aug 13 00:33:43.385898 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Aug 13 00:33:43.398151 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Aug 13 00:33:43.408852 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Aug 13 00:33:43.420800 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Aug 13 00:33:43.454675 systemd[1]: Reached target network-pre.target - Preparation for Network.
Aug 13 00:33:43.467477 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Aug 13 00:33:43.491557 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Aug 13 00:33:43.501064 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Aug 13 00:33:43.501165 systemd[1]: Reached target local-fs.target - Local File Systems.
Aug 13 00:33:43.513270 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Aug 13 00:33:43.527916 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Aug 13 00:33:43.537270 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Aug 13 00:33:43.556918 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Aug 13 00:33:43.580020 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Aug 13 00:33:43.589916 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Aug 13 00:33:43.590643 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Aug 13 00:33:43.593136 systemd-journald[1439]: Time spent on flushing to /var/log/journal/9c0e93736c754b6294dea83ad261339d is 12.735ms for 1382 entries.
Aug 13 00:33:43.593136 systemd-journald[1439]: System Journal (/var/log/journal/9c0e93736c754b6294dea83ad261339d) is 8M, max 195.6M, 187.6M free.
Aug 13 00:33:43.618000 systemd-journald[1439]: Received client request to flush runtime journal.
Aug 13 00:33:43.606902 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Aug 13 00:33:43.607581 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Aug 13 00:33:43.616630 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Aug 13 00:33:43.634999 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Aug 13 00:33:43.646832 kernel: loop0: detected capacity change from 0 to 146240
Aug 13 00:33:43.650014 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Aug 13 00:33:43.659897 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Aug 13 00:33:43.675234 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Aug 13 00:33:43.675846 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Aug 13 00:33:43.685532 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Aug 13 00:33:43.696035 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Aug 13 00:33:43.706244 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Aug 13 00:33:43.717839 kernel: loop1: detected capacity change from 0 to 8
Aug 13 00:33:43.721206 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Aug 13 00:33:43.731574 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Aug 13 00:33:43.753835 kernel: loop2: detected capacity change from 0 to 229808
Aug 13 00:33:43.756045 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Aug 13 00:33:43.773325 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Aug 13 00:33:43.773716 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Aug 13 00:33:43.779680 systemd-tmpfiles[1490]: ACLs are not supported, ignoring.
Aug 13 00:33:43.779690 systemd-tmpfiles[1490]: ACLs are not supported, ignoring.
Aug 13 00:33:43.785063 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Aug 13 00:33:43.804829 kernel: loop3: detected capacity change from 0 to 113872
Aug 13 00:33:43.858834 kernel: loop4: detected capacity change from 0 to 146240
Aug 13 00:33:43.883826 kernel: loop5: detected capacity change from 0 to 8
Aug 13 00:33:43.890834 kernel: loop6: detected capacity change from 0 to 229808
Aug 13 00:33:43.913826 kernel: loop7: detected capacity change from 0 to 113872
Aug 13 00:33:43.926932 (sd-merge)[1496]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-packet'.
Aug 13 00:33:43.927169 (sd-merge)[1496]: Merged extensions into '/usr'.
Aug 13 00:33:43.929700 systemd[1]: Reload requested from client PID 1474 ('systemd-sysext') (unit systemd-sysext.service)...
Aug 13 00:33:43.929708 systemd[1]: Reloading...
Aug 13 00:33:43.931845 ldconfig[1469]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Aug 13 00:33:43.952849 zram_generator::config[1522]: No configuration found.
Aug 13 00:33:44.011346 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Aug 13 00:33:44.072233 systemd[1]: Reloading finished in 142 ms.
Aug 13 00:33:44.099793 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Aug 13 00:33:44.109313 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Aug 13 00:33:44.119685 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Aug 13 00:33:44.156327 systemd[1]: Starting ensure-sysext.service...
Aug 13 00:33:44.165148 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Aug 13 00:33:44.191655 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Aug 13 00:33:44.200832 systemd-tmpfiles[1580]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Aug 13 00:33:44.200856 systemd-tmpfiles[1580]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Aug 13 00:33:44.201051 systemd-tmpfiles[1580]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Aug 13 00:33:44.201250 systemd-tmpfiles[1580]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Aug 13 00:33:44.201878 systemd-tmpfiles[1580]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Aug 13 00:33:44.202094 systemd-tmpfiles[1580]: ACLs are not supported, ignoring.
Aug 13 00:33:44.202138 systemd-tmpfiles[1580]: ACLs are not supported, ignoring.
Aug 13 00:33:44.204439 systemd-tmpfiles[1580]: Detected autofs mount point /boot during canonicalization of boot.
Aug 13 00:33:44.204444 systemd-tmpfiles[1580]: Skipping /boot
Aug 13 00:33:44.206995 systemd[1]: Reload requested from client PID 1579 ('systemctl') (unit ensure-sysext.service)...
Aug 13 00:33:44.207006 systemd[1]: Reloading...
Aug 13 00:33:44.211547 systemd-tmpfiles[1580]: Detected autofs mount point /boot during canonicalization of boot.
Aug 13 00:33:44.211552 systemd-tmpfiles[1580]: Skipping /boot
Aug 13 00:33:44.222502 systemd-udevd[1581]: Using default interface naming scheme 'v255'.
Aug 13 00:33:44.235794 zram_generator::config[1608]: No configuration found.
Aug 13 00:33:44.283666 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input2
Aug 13 00:33:44.283718 kernel: ACPI: button: Sleep Button [SLPB]
Aug 13 00:33:44.291466 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
Aug 13 00:33:44.292795 kernel: mousedev: PS/2 mouse device common for all mice
Aug 13 00:33:44.304691 kernel: IPMI message handler: version 39.2
Aug 13 00:33:44.304741 kernel: ACPI: button: Power Button [PWRF]
Aug 13 00:33:44.318801 kernel: mei_me 0000:00:16.0: Device doesn't have valid ME Interface
Aug 13 00:33:44.336664 kernel: mei_me 0000:00:16.4: Device doesn't have valid ME Interface
Aug 13 00:33:44.322028 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Aug 13 00:33:44.351554 kernel: ipmi device interface
Aug 13 00:33:44.369138 kernel: i801_smbus 0000:00:1f.4: SPD Write Disable is set
Aug 13 00:33:44.369280 kernel: i801_smbus 0000:00:1f.4: SMBus using PCI interrupt
Aug 13 00:33:44.399802 kernel: iTCO_vendor_support: vendor-support=0
Aug 13 00:33:44.399867 kernel: MACsec IEEE 802.1AE
Aug 13 00:33:44.399893 kernel: ipmi_si: IPMI System Interface driver
Aug 13 00:33:44.399919 kernel: ipmi_si dmi-ipmi-si.0: ipmi_platform: probing via SMBIOS
Aug 13 00:33:44.400156 kernel: ipmi_platform: ipmi_si: SMBIOS: io 0xca2 regsize 1 spacing 1 irq 0
Aug 13 00:33:44.400174 kernel: ipmi_si: Adding SMBIOS-specified kcs state machine
Aug 13 00:33:44.400198 kernel: ipmi_si IPI0001:00: ipmi_platform: probing via ACPI
Aug 13 00:33:44.400417 kernel: ipmi_si IPI0001:00: ipmi_platform: [io 0x0ca2] regsize 1 spacing 1 irq 0
Aug 13 00:33:44.400617 kernel: ipmi_si dmi-ipmi-si.0: Removing SMBIOS-specified kcs state machine in favor of ACPI
Aug 13 00:33:44.400824 kernel: ipmi_si: Adding ACPI-specified kcs state machine
Aug 13 00:33:44.400857 kernel: ipmi_si:
Trying ACPI-specified kcs state machine at i/o address 0xca2, slave address 0x20, irq 0 Aug 13 00:33:44.428711 systemd[1]: Condition check resulted in dev-ttyS1.device - /dev/ttyS1 being skipped. Aug 13 00:33:44.428926 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Micron_5300_MTFDDAK480TDT OEM. Aug 13 00:33:44.447872 kernel: ipmi_si IPI0001:00: The BMC does not support clearing the recv irq bit, compensating, but the BMC needs to be fixed. Aug 13 00:33:44.497417 systemd[1]: Reloading finished in 290 ms. Aug 13 00:33:44.521821 kernel: ipmi_si IPI0001:00: IPMI message handler: Found new BMC (man_id: 0x002a7c, prod_id: 0x1b0f, dev_id: 0x20) Aug 13 00:33:44.543010 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Aug 13 00:33:44.543389 kernel: iTCO_wdt iTCO_wdt: Found a Intel PCH TCO device (Version=6, TCOBASE=0x0400) Aug 13 00:33:44.544257 kernel: iTCO_wdt iTCO_wdt: initialized. heartbeat=30 sec (nowayout=0) Aug 13 00:33:44.559659 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Aug 13 00:33:44.568250 kernel: intel_rapl_common: Found RAPL domain package Aug 13 00:33:44.568286 kernel: intel_rapl_common: Found RAPL domain core Aug 13 00:33:44.568850 kernel: intel_rapl_common: Found RAPL domain dram Aug 13 00:33:44.583791 kernel: ipmi_si IPI0001:00: IPMI kcs interface initialized Aug 13 00:33:44.592789 kernel: ipmi_ssif: IPMI SSIF Interface driver Aug 13 00:33:44.605164 systemd[1]: Finished ensure-sysext.service. Aug 13 00:33:44.630578 systemd[1]: Reached target tpm2.target - Trusted Platform Module. Aug 13 00:33:44.639857 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Aug 13 00:33:44.640525 systemd[1]: Starting audit-rules.service - Load Audit Rules... Aug 13 00:33:44.663312 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... 
Aug 13 00:33:44.672969 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Aug 13 00:33:44.673542 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Aug 13 00:33:44.683676 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Aug 13 00:33:44.684465 augenrules[1806]: No rules Aug 13 00:33:44.693377 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Aug 13 00:33:44.704380 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Aug 13 00:33:44.713899 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Aug 13 00:33:44.714425 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Aug 13 00:33:44.724824 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Aug 13 00:33:44.740993 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Aug 13 00:33:44.759584 systemd[1]: Starting systemd-networkd.service - Network Configuration... Aug 13 00:33:44.768816 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Aug 13 00:33:44.777709 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Aug 13 00:33:44.786451 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Aug 13 00:33:44.802995 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Aug 13 00:33:44.812831 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). 
Aug 13 00:33:44.813526 systemd[1]: audit-rules.service: Deactivated successfully. Aug 13 00:33:44.820916 systemd[1]: Finished audit-rules.service - Load Audit Rules. Aug 13 00:33:44.832662 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Aug 13 00:33:44.832865 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Aug 13 00:33:44.832987 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Aug 13 00:33:44.833169 systemd[1]: modprobe@drm.service: Deactivated successfully. Aug 13 00:33:44.833255 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Aug 13 00:33:44.833396 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Aug 13 00:33:44.833479 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Aug 13 00:33:44.833610 systemd[1]: modprobe@loop.service: Deactivated successfully. Aug 13 00:33:44.833695 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Aug 13 00:33:44.833848 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Aug 13 00:33:44.834007 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Aug 13 00:33:44.837736 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Aug 13 00:33:44.837809 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Aug 13 00:33:44.838501 systemd[1]: Starting systemd-update-done.service - Update is Completed... Aug 13 00:33:44.839351 systemd[1]: Starting systemd-userdbd.service - User Database Manager... 
Aug 13 00:33:44.839375 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Aug 13 00:33:44.839568 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Aug 13 00:33:44.859875 systemd[1]: Finished systemd-update-done.service - Update is Completed. Aug 13 00:33:44.875085 systemd[1]: Started systemd-userdbd.service - User Database Manager. Aug 13 00:33:44.920310 systemd-resolved[1817]: Positive Trust Anchors: Aug 13 00:33:44.920318 systemd-resolved[1817]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Aug 13 00:33:44.920345 systemd-resolved[1817]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Aug 13 00:33:44.922826 systemd-resolved[1817]: Using system hostname 'ci-4372.1.0-a-083aa5303b'. Aug 13 00:33:44.926217 systemd-networkd[1816]: lo: Link UP Aug 13 00:33:44.926220 systemd-networkd[1816]: lo: Gained carrier Aug 13 00:33:44.928612 systemd-networkd[1816]: bond0: netdev ready Aug 13 00:33:44.929596 systemd-networkd[1816]: Enumeration completed Aug 13 00:33:44.932037 systemd-networkd[1816]: enp1s0f0np0: Configuring with /etc/systemd/network/10-0c:42:a1:97:fa:a8.network. Aug 13 00:33:44.937187 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Aug 13 00:33:44.947073 systemd[1]: Started systemd-resolved.service - Network Name Resolution. 
Aug 13 00:33:44.956850 systemd[1]: Started systemd-networkd.service - Network Configuration. Aug 13 00:33:44.966012 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Aug 13 00:33:44.978180 systemd[1]: Reached target network.target - Network. Aug 13 00:33:44.985843 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Aug 13 00:33:44.996888 systemd[1]: Reached target sysinit.target - System Initialization. Aug 13 00:33:45.007016 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Aug 13 00:33:45.018029 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Aug 13 00:33:45.028974 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Aug 13 00:33:45.040046 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Aug 13 00:33:45.049983 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Aug 13 00:33:45.050088 systemd[1]: Reached target paths.target - Path Units. Aug 13 00:33:45.057980 systemd[1]: Reached target time-set.target - System Time Set. Aug 13 00:33:45.068398 systemd[1]: Started logrotate.timer - Daily rotation of log files. Aug 13 00:33:45.077254 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Aug 13 00:33:45.086963 systemd[1]: Reached target timers.target - Timer Units. Aug 13 00:33:45.097730 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Aug 13 00:33:45.109269 systemd[1]: Starting docker.socket - Docker Socket for the API... Aug 13 00:33:45.120391 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Aug 13 00:33:45.146479 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. 
Aug 13 00:33:45.155062 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Aug 13 00:33:45.165575 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Aug 13 00:33:45.176444 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Aug 13 00:33:45.188198 systemd[1]: Listening on docker.socket - Docker Socket for the API. Aug 13 00:33:45.197475 systemd[1]: Reached target sockets.target - Socket Units. Aug 13 00:33:45.206976 systemd[1]: Reached target basic.target - Basic System. Aug 13 00:33:45.214999 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Aug 13 00:33:45.215022 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Aug 13 00:33:45.215764 systemd[1]: Starting containerd.service - containerd container runtime... Aug 13 00:33:45.242012 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Aug 13 00:33:45.269620 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Aug 13 00:33:45.274873 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: Link up Aug 13 00:33:45.288845 kernel: bond0: (slave enp1s0f0np0): Enslaving as a backup interface with an up link Aug 13 00:33:45.289754 systemd-networkd[1816]: enp1s0f1np1: Configuring with /etc/systemd/network/10-0c:42:a1:97:fa:a9.network. Aug 13 00:33:45.290625 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Aug 13 00:33:45.292104 coreos-metadata[1857]: Aug 13 00:33:45.292 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Aug 13 00:33:45.292977 coreos-metadata[1857]: Aug 13 00:33:45.292 INFO Failed to fetch: error sending request for url (https://metadata.packet.net/metadata) Aug 13 00:33:45.314983 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... 
Aug 13 00:33:45.335053 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Aug 13 00:33:45.336719 jq[1863]: false Aug 13 00:33:45.344894 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Aug 13 00:33:45.351059 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Aug 13 00:33:45.355058 extend-filesystems[1864]: Found /dev/sda6 Aug 13 00:33:45.372447 extend-filesystems[1864]: Found /dev/sda9 Aug 13 00:33:45.372447 extend-filesystems[1864]: Checking size of /dev/sda9 Aug 13 00:33:45.372447 extend-filesystems[1864]: Resized partition /dev/sda9 Aug 13 00:33:45.400906 kernel: EXT4-fs (sda9): resizing filesystem from 553472 to 116605649 blocks Aug 13 00:33:45.382240 oslogin_cache_refresh[1865]: Refreshing passwd entry cache Aug 13 00:33:45.377303 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Aug 13 00:33:45.401066 extend-filesystems[1875]: resize2fs 1.47.2 (1-Jan-2025) Aug 13 00:33:45.383345 oslogin_cache_refresh[1865]: Failure getting users, quitting Aug 13 00:33:45.380517 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Aug 13 00:33:45.416999 google_oslogin_nss_cache[1865]: oslogin_cache_refresh[1865]: Refreshing passwd entry cache Aug 13 00:33:45.416999 google_oslogin_nss_cache[1865]: oslogin_cache_refresh[1865]: Failure getting users, quitting Aug 13 00:33:45.416999 google_oslogin_nss_cache[1865]: oslogin_cache_refresh[1865]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. 
Aug 13 00:33:45.416999 google_oslogin_nss_cache[1865]: oslogin_cache_refresh[1865]: Refreshing group entry cache Aug 13 00:33:45.416999 google_oslogin_nss_cache[1865]: oslogin_cache_refresh[1865]: Failure getting groups, quitting Aug 13 00:33:45.416999 google_oslogin_nss_cache[1865]: oslogin_cache_refresh[1865]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Aug 13 00:33:45.383352 oslogin_cache_refresh[1865]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Aug 13 00:33:45.410682 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Aug 13 00:33:45.383372 oslogin_cache_refresh[1865]: Refreshing group entry cache Aug 13 00:33:45.383692 oslogin_cache_refresh[1865]: Failure getting groups, quitting Aug 13 00:33:45.383699 oslogin_cache_refresh[1865]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Aug 13 00:33:45.428791 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Aug 13 00:33:45.432512 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Aug 13 00:33:45.440457 systemd-networkd[1816]: bond0: Configuring with /etc/systemd/network/05-bond0.network. Aug 13 00:33:45.440790 kernel: bond0: (slave enp1s0f1np1): Enslaving as a backup interface with an up link Aug 13 00:33:45.441723 systemd-networkd[1816]: enp1s0f0np0: Link UP Aug 13 00:33:45.441846 systemd-networkd[1816]: enp1s0f0np0: Gained carrier Aug 13 00:33:45.451826 kernel: bond0: Warning: No 802.3ad response from the link partner for any adapters in the bond Aug 13 00:33:45.458116 systemd[1]: Starting systemd-logind.service - User Login Management... Aug 13 00:33:45.461937 systemd-networkd[1816]: enp1s0f1np1: Reconfiguring with /etc/systemd/network/10-0c:42:a1:97:fa:a8.network. 
Aug 13 00:33:45.462070 systemd-networkd[1816]: enp1s0f1np1: Link UP Aug 13 00:33:45.462190 systemd-networkd[1816]: enp1s0f1np1: Gained carrier Aug 13 00:33:45.468988 systemd[1]: Starting tcsd.service - TCG Core Services Daemon... Aug 13 00:33:45.476168 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Aug 13 00:33:45.476559 systemd[1]: Starting update-engine.service - Update Engine... Aug 13 00:33:45.479982 systemd-networkd[1816]: bond0: Link UP Aug 13 00:33:45.480141 systemd-networkd[1816]: bond0: Gained carrier Aug 13 00:33:45.480275 systemd-timesyncd[1818]: Network configuration changed, trying to establish connection. Aug 13 00:33:45.480560 systemd-timesyncd[1818]: Network configuration changed, trying to establish connection. Aug 13 00:33:45.480726 systemd-timesyncd[1818]: Network configuration changed, trying to establish connection. Aug 13 00:33:45.480766 systemd-timesyncd[1818]: Network configuration changed, trying to establish connection. Aug 13 00:33:45.484402 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Aug 13 00:33:45.492261 update_engine[1895]: I20250813 00:33:45.492226 1895 main.cc:92] Flatcar Update Engine starting Aug 13 00:33:45.494659 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Aug 13 00:33:45.495711 jq[1896]: true Aug 13 00:33:45.499620 systemd-logind[1890]: Watching system buttons on /dev/input/event3 (Power Button) Aug 13 00:33:45.499632 systemd-logind[1890]: Watching system buttons on /dev/input/event2 (Sleep Button) Aug 13 00:33:45.499641 systemd-logind[1890]: Watching system buttons on /dev/input/event0 (HID 0557:2419) Aug 13 00:33:45.499775 systemd-logind[1890]: New seat seat0. Aug 13 00:33:45.504981 systemd[1]: Started systemd-logind.service - User Login Management. 
Aug 13 00:33:45.514391 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Aug 13 00:33:45.521804 sshd_keygen[1893]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Aug 13 00:33:45.523954 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Aug 13 00:33:45.524062 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Aug 13 00:33:45.524207 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Aug 13 00:33:45.536897 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Aug 13 00:33:45.553533 systemd[1]: motdgen.service: Deactivated successfully. Aug 13 00:33:45.553644 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Aug 13 00:33:45.557451 kernel: bond0: (slave enp1s0f0np0): link status definitely up, 10000 Mbps full duplex Aug 13 00:33:45.557473 kernel: bond0: active interface up! Aug 13 00:33:45.566409 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Aug 13 00:33:45.566522 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Aug 13 00:33:45.577016 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Aug 13 00:33:45.593398 (ntainerd)[1908]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Aug 13 00:33:45.594703 jq[1906]: true Aug 13 00:33:45.603540 tar[1905]: linux-amd64/LICENSE Aug 13 00:33:45.603670 tar[1905]: linux-amd64/helm Aug 13 00:33:45.610314 systemd[1]: tcsd.service: Skipped due to 'exec-condition'. Aug 13 00:33:45.610437 systemd[1]: Condition check resulted in tcsd.service - TCG Core Services Daemon being skipped. Aug 13 00:33:45.614739 systemd[1]: Starting issuegen.service - Generate /run/issue... 
Aug 13 00:33:45.632039 dbus-daemon[1858]: [system] SELinux support is enabled Aug 13 00:33:45.632127 systemd[1]: Started dbus.service - D-Bus System Message Bus. Aug 13 00:33:45.633782 update_engine[1895]: I20250813 00:33:45.633752 1895 update_check_scheduler.cc:74] Next update check in 4m38s Aug 13 00:33:45.642457 systemd[1]: issuegen.service: Deactivated successfully. Aug 13 00:33:45.642571 systemd[1]: Finished issuegen.service - Generate /run/issue. Aug 13 00:33:45.652278 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Aug 13 00:33:45.652306 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Aug 13 00:33:45.652799 dbus-daemon[1858]: [system] Successfully activated service 'org.freedesktop.systemd1' Aug 13 00:33:45.662521 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Aug 13 00:33:45.672790 kernel: bond0: (slave enp1s0f1np1): link status definitely up, 10000 Mbps full duplex Aug 13 00:33:45.679854 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Aug 13 00:33:45.679874 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Aug 13 00:33:45.680032 systemd[1]: Started update-engine.service - Update Engine. Aug 13 00:33:45.685859 bash[1936]: Updated "/home/core/.ssh/authorized_keys" Aug 13 00:33:45.705902 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Aug 13 00:33:45.716098 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Aug 13 00:33:45.727640 systemd[1]: Started getty@tty1.service - Getty on tty1. 
Aug 13 00:33:45.736553 systemd[1]: Started serial-getty@ttyS1.service - Serial Getty on ttyS1. Aug 13 00:33:45.747932 systemd[1]: Reached target getty.target - Login Prompts. Aug 13 00:33:45.754693 containerd[1908]: time="2025-08-13T00:33:45Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Aug 13 00:33:45.756100 containerd[1908]: time="2025-08-13T00:33:45.756083525Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 Aug 13 00:33:45.757075 systemd[1]: Starting sshkeys.service... Aug 13 00:33:45.761413 containerd[1908]: time="2025-08-13T00:33:45.761397187Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="4.867µs" Aug 13 00:33:45.761413 containerd[1908]: time="2025-08-13T00:33:45.761411548Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Aug 13 00:33:45.761459 containerd[1908]: time="2025-08-13T00:33:45.761421677Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Aug 13 00:33:45.761505 containerd[1908]: time="2025-08-13T00:33:45.761496889Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Aug 13 00:33:45.761521 containerd[1908]: time="2025-08-13T00:33:45.761506437Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Aug 13 00:33:45.761534 containerd[1908]: time="2025-08-13T00:33:45.761519969Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Aug 13 00:33:45.761562 containerd[1908]: time="2025-08-13T00:33:45.761554423Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile 
type=io.containerd.snapshotter.v1 Aug 13 00:33:45.761577 containerd[1908]: time="2025-08-13T00:33:45.761565467Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Aug 13 00:33:45.761691 containerd[1908]: time="2025-08-13T00:33:45.761682421Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Aug 13 00:33:45.761707 containerd[1908]: time="2025-08-13T00:33:45.761690648Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Aug 13 00:33:45.761707 containerd[1908]: time="2025-08-13T00:33:45.761696716Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Aug 13 00:33:45.761707 containerd[1908]: time="2025-08-13T00:33:45.761701274Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Aug 13 00:33:45.761752 containerd[1908]: time="2025-08-13T00:33:45.761744776Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Aug 13 00:33:45.761875 containerd[1908]: time="2025-08-13T00:33:45.761867110Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Aug 13 00:33:45.761896 containerd[1908]: time="2025-08-13T00:33:45.761882846Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Aug 13 00:33:45.761896 containerd[1908]: time="2025-08-13T00:33:45.761888502Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange 
type=io.containerd.event.v1 Aug 13 00:33:45.761922 containerd[1908]: time="2025-08-13T00:33:45.761904771Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Aug 13 00:33:45.762059 containerd[1908]: time="2025-08-13T00:33:45.762051891Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Aug 13 00:33:45.762088 containerd[1908]: time="2025-08-13T00:33:45.762081664Z" level=info msg="metadata content store policy set" policy=shared Aug 13 00:33:45.775045 systemd[1]: Started locksmithd.service - Cluster reboot manager. Aug 13 00:33:45.777462 containerd[1908]: time="2025-08-13T00:33:45.777447712Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Aug 13 00:33:45.777490 containerd[1908]: time="2025-08-13T00:33:45.777475035Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Aug 13 00:33:45.777490 containerd[1908]: time="2025-08-13T00:33:45.777483941Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Aug 13 00:33:45.777517 containerd[1908]: time="2025-08-13T00:33:45.777491124Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Aug 13 00:33:45.777517 containerd[1908]: time="2025-08-13T00:33:45.777498231Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Aug 13 00:33:45.777517 containerd[1908]: time="2025-08-13T00:33:45.777507940Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Aug 13 00:33:45.777554 containerd[1908]: time="2025-08-13T00:33:45.777517790Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Aug 13 00:33:45.777554 containerd[1908]: time="2025-08-13T00:33:45.777525207Z" level=info msg="loading plugin" 
id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Aug 13 00:33:45.777554 containerd[1908]: time="2025-08-13T00:33:45.777530950Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Aug 13 00:33:45.777554 containerd[1908]: time="2025-08-13T00:33:45.777536387Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Aug 13 00:33:45.777554 containerd[1908]: time="2025-08-13T00:33:45.777541372Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Aug 13 00:33:45.777554 containerd[1908]: time="2025-08-13T00:33:45.777549601Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Aug 13 00:33:45.777636 containerd[1908]: time="2025-08-13T00:33:45.777618856Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Aug 13 00:33:45.777636 containerd[1908]: time="2025-08-13T00:33:45.777630972Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Aug 13 00:33:45.777665 containerd[1908]: time="2025-08-13T00:33:45.777642828Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Aug 13 00:33:45.777665 containerd[1908]: time="2025-08-13T00:33:45.777649786Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Aug 13 00:33:45.777665 containerd[1908]: time="2025-08-13T00:33:45.777655499Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Aug 13 00:33:45.777665 containerd[1908]: time="2025-08-13T00:33:45.777660758Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Aug 13 00:33:45.777716 containerd[1908]: time="2025-08-13T00:33:45.777669238Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection 
type=io.containerd.grpc.v1 Aug 13 00:33:45.777716 containerd[1908]: time="2025-08-13T00:33:45.777675096Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Aug 13 00:33:45.777716 containerd[1908]: time="2025-08-13T00:33:45.777681292Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Aug 13 00:33:45.777716 containerd[1908]: time="2025-08-13T00:33:45.777687092Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Aug 13 00:33:45.777716 containerd[1908]: time="2025-08-13T00:33:45.777695131Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Aug 13 00:33:45.777782 containerd[1908]: time="2025-08-13T00:33:45.777736158Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Aug 13 00:33:45.777782 containerd[1908]: time="2025-08-13T00:33:45.777744576Z" level=info msg="Start snapshots syncer" Aug 13 00:33:45.777782 containerd[1908]: time="2025-08-13T00:33:45.777754813Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Aug 13 00:33:45.778065 containerd[1908]: time="2025-08-13T00:33:45.777904780Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Aug 13 00:33:45.778148 containerd[1908]: time="2025-08-13T00:33:45.778083238Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Aug 13 00:33:45.778699 containerd[1908]: time="2025-08-13T00:33:45.778681425Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Aug 13 00:33:45.778795 containerd[1908]: time="2025-08-13T00:33:45.778776917Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Aug 13 00:33:45.778816 containerd[1908]: time="2025-08-13T00:33:45.778805798Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Aug 13 00:33:45.778830 containerd[1908]: time="2025-08-13T00:33:45.778815479Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Aug 13 00:33:45.778830 containerd[1908]: time="2025-08-13T00:33:45.778824313Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Aug 13 00:33:45.778855 containerd[1908]: time="2025-08-13T00:33:45.778833694Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Aug 13 00:33:45.778855 containerd[1908]: time="2025-08-13T00:33:45.778842219Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Aug 13 00:33:45.778855 containerd[1908]: time="2025-08-13T00:33:45.778849444Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Aug 13 00:33:45.778898 containerd[1908]: time="2025-08-13T00:33:45.778876780Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Aug 13 00:33:45.778898 containerd[1908]: time="2025-08-13T00:33:45.778886758Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Aug 13 00:33:45.778898 containerd[1908]: time="2025-08-13T00:33:45.778896052Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Aug 13 00:33:45.779764 containerd[1908]: time="2025-08-13T00:33:45.779749003Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Aug 13 00:33:45.779780 containerd[1908]: time="2025-08-13T00:33:45.779770919Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Aug 13 00:33:45.779799 containerd[1908]: time="2025-08-13T00:33:45.779780692Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Aug 13 00:33:45.779818 containerd[1908]: time="2025-08-13T00:33:45.779795484Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Aug 13 00:33:45.779818 containerd[1908]: time="2025-08-13T00:33:45.779803520Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Aug 13 00:33:45.779818 containerd[1908]: time="2025-08-13T00:33:45.779812437Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Aug 13 00:33:45.779856 containerd[1908]: time="2025-08-13T00:33:45.779821856Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Aug 13 00:33:45.779856 containerd[1908]: time="2025-08-13T00:33:45.779842961Z" level=info msg="runtime interface created" Aug 13 00:33:45.779856 containerd[1908]: time="2025-08-13T00:33:45.779848141Z" level=info msg="created NRI interface" Aug 13 00:33:45.779895 containerd[1908]: time="2025-08-13T00:33:45.779854697Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Aug 13 00:33:45.779895 containerd[1908]: time="2025-08-13T00:33:45.779865069Z" level=info msg="Connect containerd service" Aug 13 00:33:45.779895 containerd[1908]: time="2025-08-13T00:33:45.779885439Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Aug 13 00:33:45.780335 
containerd[1908]: time="2025-08-13T00:33:45.780274988Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Aug 13 00:33:45.800138 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Aug 13 00:33:45.810687 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Aug 13 00:33:45.818131 locksmithd[1965]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Aug 13 00:33:45.844611 coreos-metadata[1974]: Aug 13 00:33:45.844 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Aug 13 00:33:45.891547 containerd[1908]: time="2025-08-13T00:33:45.891524553Z" level=info msg="Start subscribing containerd event" Aug 13 00:33:45.891623 containerd[1908]: time="2025-08-13T00:33:45.891555097Z" level=info msg="Start recovering state" Aug 13 00:33:45.891623 containerd[1908]: time="2025-08-13T00:33:45.891595678Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Aug 13 00:33:45.891665 containerd[1908]: time="2025-08-13T00:33:45.891623632Z" level=info msg=serving... address=/run/containerd/containerd.sock Aug 13 00:33:45.891665 containerd[1908]: time="2025-08-13T00:33:45.891629587Z" level=info msg="Start event monitor" Aug 13 00:33:45.891665 containerd[1908]: time="2025-08-13T00:33:45.891638712Z" level=info msg="Start cni network conf syncer for default" Aug 13 00:33:45.891665 containerd[1908]: time="2025-08-13T00:33:45.891644606Z" level=info msg="Start streaming server" Aug 13 00:33:45.891665 containerd[1908]: time="2025-08-13T00:33:45.891649083Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Aug 13 00:33:45.891665 containerd[1908]: time="2025-08-13T00:33:45.891652797Z" level=info msg="runtime interface starting up..." 
Aug 13 00:33:45.891665 containerd[1908]: time="2025-08-13T00:33:45.891657600Z" level=info msg="starting plugins..." Aug 13 00:33:45.891665 containerd[1908]: time="2025-08-13T00:33:45.891664602Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Aug 13 00:33:45.891807 containerd[1908]: time="2025-08-13T00:33:45.891747460Z" level=info msg="containerd successfully booted in 0.137253s" Aug 13 00:33:45.891811 systemd[1]: Started containerd.service - containerd container runtime. Aug 13 00:33:45.924810 tar[1905]: linux-amd64/README.md Aug 13 00:33:45.942752 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Aug 13 00:33:46.293052 coreos-metadata[1857]: Aug 13 00:33:46.293 INFO Fetching https://metadata.packet.net/metadata: Attempt #2 Aug 13 00:33:46.485814 kernel: EXT4-fs (sda9): resized filesystem to 116605649 Aug 13 00:33:46.516393 extend-filesystems[1875]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Aug 13 00:33:46.516393 extend-filesystems[1875]: old_desc_blocks = 1, new_desc_blocks = 56 Aug 13 00:33:46.516393 extend-filesystems[1875]: The filesystem on /dev/sda9 is now 116605649 (4k) blocks long. Aug 13 00:33:46.543867 extend-filesystems[1864]: Resized filesystem in /dev/sda9 Aug 13 00:33:46.516906 systemd[1]: extend-filesystems.service: Deactivated successfully. Aug 13 00:33:46.517028 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Aug 13 00:33:46.983184 systemd-timesyncd[1818]: Network configuration changed, trying to establish connection. Aug 13 00:33:47.302860 systemd-networkd[1816]: bond0: Gained IPv6LL Aug 13 00:33:47.303151 systemd-timesyncd[1818]: Network configuration changed, trying to establish connection. Aug 13 00:33:47.304437 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Aug 13 00:33:47.315264 systemd[1]: Reached target network-online.target - Network is Online. 
Aug 13 00:33:47.324990 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:33:47.346111 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Aug 13 00:33:47.365008 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Aug 13 00:33:48.109060 kernel: mlx5_core 0000:01:00.0: lag map: port 1:1 port 2:2 Aug 13 00:33:48.109197 kernel: mlx5_core 0000:01:00.0: shared_fdb:0 mode:queue_affinity Aug 13 00:33:48.147495 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:33:48.157430 (kubelet)[2016]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 13 00:33:48.184805 kernel: mlx5_core 0000:01:00.0: lag map: port 1:2 port 2:2 Aug 13 00:33:48.269789 kernel: mlx5_core 0000:01:00.0: lag map: port 1:1 port 2:2 Aug 13 00:33:48.648758 kubelet[2016]: E0813 00:33:48.648736 2016 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 13 00:33:48.649917 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 13 00:33:48.649991 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 13 00:33:48.650158 systemd[1]: kubelet.service: Consumed 641ms CPU time, 269.5M memory peak. Aug 13 00:33:49.196102 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Aug 13 00:33:49.206743 systemd[1]: Started sshd@0-147.75.71.77:22-139.178.89.65:51208.service - OpenSSH per-connection server daemon (139.178.89.65:51208). 
Aug 13 00:33:49.322589 sshd[2035]: Accepted publickey for core from 139.178.89.65 port 51208 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI Aug 13 00:33:49.323372 sshd-session[2035]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:33:49.330690 systemd-logind[1890]: New session 1 of user core. Aug 13 00:33:49.331519 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Aug 13 00:33:49.340773 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Aug 13 00:33:49.371622 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Aug 13 00:33:49.383374 systemd[1]: Starting user@500.service - User Manager for UID 500... Aug 13 00:33:49.413470 (systemd)[2039]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Aug 13 00:33:49.416497 systemd-logind[1890]: New session c1 of user core. Aug 13 00:33:49.536541 systemd[2039]: Queued start job for default target default.target. Aug 13 00:33:49.544453 systemd[2039]: Created slice app.slice - User Application Slice. Aug 13 00:33:49.544485 systemd[2039]: Reached target paths.target - Paths. Aug 13 00:33:49.544505 systemd[2039]: Reached target timers.target - Timers. Aug 13 00:33:49.545122 systemd[2039]: Starting dbus.socket - D-Bus User Message Bus Socket... Aug 13 00:33:49.550489 systemd[2039]: Listening on dbus.socket - D-Bus User Message Bus Socket. Aug 13 00:33:49.550516 systemd[2039]: Reached target sockets.target - Sockets. Aug 13 00:33:49.550554 systemd[2039]: Reached target basic.target - Basic System. Aug 13 00:33:49.550573 systemd[2039]: Reached target default.target - Main User Target. Aug 13 00:33:49.550587 systemd[2039]: Startup finished in 125ms. Aug 13 00:33:49.550643 systemd[1]: Started user@500.service - User Manager for UID 500. Aug 13 00:33:49.560984 systemd[1]: Started session-1.scope - Session 1 of User core. 
Aug 13 00:33:49.632576 systemd[1]: Started sshd@1-147.75.71.77:22-139.178.89.65:51222.service - OpenSSH per-connection server daemon (139.178.89.65:51222). Aug 13 00:33:49.683368 sshd[2050]: Accepted publickey for core from 139.178.89.65 port 51222 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI Aug 13 00:33:49.684638 sshd-session[2050]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:33:49.689888 systemd-logind[1890]: New session 2 of user core. Aug 13 00:33:49.703137 systemd[1]: Started session-2.scope - Session 2 of User core. Aug 13 00:33:49.764623 sshd[2052]: Connection closed by 139.178.89.65 port 51222 Aug 13 00:33:49.764910 sshd-session[2050]: pam_unix(sshd:session): session closed for user core Aug 13 00:33:49.788216 systemd[1]: sshd@1-147.75.71.77:22-139.178.89.65:51222.service: Deactivated successfully. Aug 13 00:33:49.789947 systemd[1]: session-2.scope: Deactivated successfully. Aug 13 00:33:49.790435 systemd-logind[1890]: Session 2 logged out. Waiting for processes to exit. Aug 13 00:33:49.791441 systemd[1]: Started sshd@2-147.75.71.77:22-139.178.89.65:51236.service - OpenSSH per-connection server daemon (139.178.89.65:51236). Aug 13 00:33:49.802959 systemd-logind[1890]: Removed session 2. Aug 13 00:33:49.846479 sshd[2058]: Accepted publickey for core from 139.178.89.65 port 51236 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI Aug 13 00:33:49.847744 sshd-session[2058]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:33:49.852878 systemd-logind[1890]: New session 3 of user core. Aug 13 00:33:49.865232 systemd[1]: Started session-3.scope - Session 3 of User core. 
Aug 13 00:33:49.891642 coreos-metadata[1974]: Aug 13 00:33:49.891 INFO Fetch successful Aug 13 00:33:49.933307 sshd[2061]: Connection closed by 139.178.89.65 port 51236 Aug 13 00:33:49.933427 sshd-session[2058]: pam_unix(sshd:session): session closed for user core Aug 13 00:33:49.935183 systemd[1]: sshd@2-147.75.71.77:22-139.178.89.65:51236.service: Deactivated successfully. Aug 13 00:33:49.936043 systemd[1]: session-3.scope: Deactivated successfully. Aug 13 00:33:49.936523 systemd-logind[1890]: Session 3 logged out. Waiting for processes to exit. Aug 13 00:33:49.937215 systemd-logind[1890]: Removed session 3. Aug 13 00:33:49.962647 unknown[1974]: wrote ssh authorized keys file for user: core Aug 13 00:33:49.988759 update-ssh-keys[2066]: Updated "/home/core/.ssh/authorized_keys" Aug 13 00:33:49.989074 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Aug 13 00:33:50.000597 systemd[1]: Finished sshkeys.service. Aug 13 00:33:50.039861 coreos-metadata[1857]: Aug 13 00:33:50.039 INFO Fetch successful Aug 13 00:33:50.139335 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Aug 13 00:33:50.149183 systemd[1]: Starting packet-phone-home.service - Report Success to Packet... Aug 13 00:33:50.772184 login[1957]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Aug 13 00:33:50.775320 systemd-logind[1890]: New session 4 of user core. Aug 13 00:33:50.775979 systemd[1]: Started session-4.scope - Session 4 of User core. Aug 13 00:33:50.777556 login[1956]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Aug 13 00:33:50.779731 systemd-logind[1890]: New session 5 of user core. Aug 13 00:33:50.780230 systemd[1]: Started session-5.scope - Session 5 of User core. Aug 13 00:33:51.714472 systemd[1]: Finished packet-phone-home.service - Report Success to Packet. Aug 13 00:33:51.716761 systemd[1]: Reached target multi-user.target - Multi-User System. 
Aug 13 00:33:51.717315 systemd[1]: Startup finished in 4.356s (kernel) + 24.233s (initrd) + 10.272s (userspace) = 38.862s. Aug 13 00:33:52.534732 systemd-timesyncd[1818]: Network configuration changed, trying to establish connection. Aug 13 00:33:58.845853 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Aug 13 00:33:58.849112 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:33:59.150642 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:33:59.153019 (kubelet)[2111]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 13 00:33:59.172638 kubelet[2111]: E0813 00:33:59.172584 2111 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 13 00:33:59.174694 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 13 00:33:59.174773 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 13 00:33:59.174939 systemd[1]: kubelet.service: Consumed 149ms CPU time, 112.2M memory peak. Aug 13 00:33:59.953278 systemd[1]: Started sshd@3-147.75.71.77:22-139.178.89.65:55740.service - OpenSSH per-connection server daemon (139.178.89.65:55740). Aug 13 00:33:59.985697 sshd[2128]: Accepted publickey for core from 139.178.89.65 port 55740 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI Aug 13 00:33:59.986284 sshd-session[2128]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:33:59.988703 systemd-logind[1890]: New session 6 of user core. Aug 13 00:34:00.006053 systemd[1]: Started session-6.scope - Session 6 of User core. 
Aug 13 00:34:00.057139 sshd[2130]: Connection closed by 139.178.89.65 port 55740 Aug 13 00:34:00.057279 sshd-session[2128]: pam_unix(sshd:session): session closed for user core Aug 13 00:34:00.067911 systemd[1]: sshd@3-147.75.71.77:22-139.178.89.65:55740.service: Deactivated successfully. Aug 13 00:34:00.068721 systemd[1]: session-6.scope: Deactivated successfully. Aug 13 00:34:00.069263 systemd-logind[1890]: Session 6 logged out. Waiting for processes to exit. Aug 13 00:34:00.070359 systemd[1]: Started sshd@4-147.75.71.77:22-139.178.89.65:55754.service - OpenSSH per-connection server daemon (139.178.89.65:55754). Aug 13 00:34:00.070926 systemd-logind[1890]: Removed session 6. Aug 13 00:34:00.115236 sshd[2136]: Accepted publickey for core from 139.178.89.65 port 55754 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI Aug 13 00:34:00.115810 sshd-session[2136]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:34:00.118352 systemd-logind[1890]: New session 7 of user core. Aug 13 00:34:00.130049 systemd[1]: Started session-7.scope - Session 7 of User core. Aug 13 00:34:00.176402 sshd[2139]: Connection closed by 139.178.89.65 port 55754 Aug 13 00:34:00.176681 sshd-session[2136]: pam_unix(sshd:session): session closed for user core Aug 13 00:34:00.206303 systemd[1]: sshd@4-147.75.71.77:22-139.178.89.65:55754.service: Deactivated successfully. Aug 13 00:34:00.210144 systemd[1]: session-7.scope: Deactivated successfully. Aug 13 00:34:00.212398 systemd-logind[1890]: Session 7 logged out. Waiting for processes to exit. Aug 13 00:34:00.218069 systemd[1]: Started sshd@5-147.75.71.77:22-139.178.89.65:55768.service - OpenSSH per-connection server daemon (139.178.89.65:55768). Aug 13 00:34:00.219921 systemd-logind[1890]: Removed session 7. 
Aug 13 00:34:00.304216 sshd[2145]: Accepted publickey for core from 139.178.89.65 port 55768 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI Aug 13 00:34:00.304952 sshd-session[2145]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:34:00.308028 systemd-logind[1890]: New session 8 of user core. Aug 13 00:34:00.321034 systemd[1]: Started session-8.scope - Session 8 of User core. Aug 13 00:34:00.374939 sshd[2147]: Connection closed by 139.178.89.65 port 55768 Aug 13 00:34:00.375422 sshd-session[2145]: pam_unix(sshd:session): session closed for user core Aug 13 00:34:00.400052 systemd[1]: sshd@5-147.75.71.77:22-139.178.89.65:55768.service: Deactivated successfully. Aug 13 00:34:00.403943 systemd[1]: session-8.scope: Deactivated successfully. Aug 13 00:34:00.406145 systemd-logind[1890]: Session 8 logged out. Waiting for processes to exit. Aug 13 00:34:00.412613 systemd[1]: Started sshd@6-147.75.71.77:22-139.178.89.65:58984.service - OpenSSH per-connection server daemon (139.178.89.65:58984). Aug 13 00:34:00.414450 systemd-logind[1890]: Removed session 8. Aug 13 00:34:00.497761 sshd[2153]: Accepted publickey for core from 139.178.89.65 port 58984 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI Aug 13 00:34:00.500032 sshd-session[2153]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:34:00.508946 systemd-logind[1890]: New session 9 of user core. Aug 13 00:34:00.521139 systemd[1]: Started session-9.scope - Session 9 of User core. 
Aug 13 00:34:00.590770 sudo[2156]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Aug 13 00:34:00.590915 sudo[2156]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 13 00:34:00.608380 sudo[2156]: pam_unix(sudo:session): session closed for user root Aug 13 00:34:00.609185 sshd[2155]: Connection closed by 139.178.89.65 port 58984 Aug 13 00:34:00.609369 sshd-session[2153]: pam_unix(sshd:session): session closed for user core Aug 13 00:34:00.620323 systemd[1]: sshd@6-147.75.71.77:22-139.178.89.65:58984.service: Deactivated successfully. Aug 13 00:34:00.621258 systemd[1]: session-9.scope: Deactivated successfully. Aug 13 00:34:00.621797 systemd-logind[1890]: Session 9 logged out. Waiting for processes to exit. Aug 13 00:34:00.623279 systemd[1]: Started sshd@7-147.75.71.77:22-139.178.89.65:58992.service - OpenSSH per-connection server daemon (139.178.89.65:58992). Aug 13 00:34:00.624082 systemd-logind[1890]: Removed session 9. Aug 13 00:34:00.682170 sshd[2162]: Accepted publickey for core from 139.178.89.65 port 58992 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI Aug 13 00:34:00.683294 sshd-session[2162]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:34:00.687846 systemd-logind[1890]: New session 10 of user core. Aug 13 00:34:00.699045 systemd[1]: Started session-10.scope - Session 10 of User core. 
Aug 13 00:34:00.756633 sudo[2167]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Aug 13 00:34:00.756769 sudo[2167]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 13 00:34:00.759328 sudo[2167]: pam_unix(sudo:session): session closed for user root Aug 13 00:34:00.761906 sudo[2166]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Aug 13 00:34:00.762042 sudo[2166]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 13 00:34:00.767542 systemd[1]: Starting audit-rules.service - Load Audit Rules... Aug 13 00:34:00.810860 augenrules[2189]: No rules Aug 13 00:34:00.811794 systemd[1]: audit-rules.service: Deactivated successfully. Aug 13 00:34:00.812119 systemd[1]: Finished audit-rules.service - Load Audit Rules. Aug 13 00:34:00.813663 sudo[2166]: pam_unix(sudo:session): session closed for user root Aug 13 00:34:00.815630 sshd[2165]: Connection closed by 139.178.89.65 port 58992 Aug 13 00:34:00.816133 sshd-session[2162]: pam_unix(sshd:session): session closed for user core Aug 13 00:34:00.835607 systemd[1]: sshd@7-147.75.71.77:22-139.178.89.65:58992.service: Deactivated successfully. Aug 13 00:34:00.839298 systemd[1]: session-10.scope: Deactivated successfully. Aug 13 00:34:00.841637 systemd-logind[1890]: Session 10 logged out. Waiting for processes to exit. Aug 13 00:34:00.847295 systemd[1]: Started sshd@8-147.75.71.77:22-139.178.89.65:59006.service - OpenSSH per-connection server daemon (139.178.89.65:59006). Aug 13 00:34:00.849146 systemd-logind[1890]: Removed session 10. 
Aug 13 00:34:00.942064 sshd[2198]: Accepted publickey for core from 139.178.89.65 port 59006 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI Aug 13 00:34:00.942664 sshd-session[2198]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:34:00.945489 systemd-logind[1890]: New session 11 of user core. Aug 13 00:34:00.958052 systemd[1]: Started session-11.scope - Session 11 of User core. Aug 13 00:34:01.006232 sudo[2201]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Aug 13 00:34:01.006457 sudo[2201]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 13 00:34:01.344723 systemd[1]: Starting docker.service - Docker Application Container Engine... Aug 13 00:34:01.361666 (dockerd)[2226]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Aug 13 00:34:01.595758 dockerd[2226]: time="2025-08-13T00:34:01.595657147Z" level=info msg="Starting up" Aug 13 00:34:01.596462 dockerd[2226]: time="2025-08-13T00:34:01.596453842Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Aug 13 00:34:01.623624 dockerd[2226]: time="2025-08-13T00:34:01.623572617Z" level=info msg="Loading containers: start." Aug 13 00:34:01.634870 kernel: Initializing XFRM netlink socket Aug 13 00:34:01.745741 systemd-timesyncd[1818]: Network configuration changed, trying to establish connection. Aug 13 00:34:01.766254 systemd-networkd[1816]: docker0: Link UP Aug 13 00:34:01.767664 dockerd[2226]: time="2025-08-13T00:34:01.767620228Z" level=info msg="Loading containers: done." 
Aug 13 00:34:01.773987 dockerd[2226]: time="2025-08-13T00:34:01.773965635Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Aug 13 00:34:01.774056 dockerd[2226]: time="2025-08-13T00:34:01.774009393Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 Aug 13 00:34:01.774076 dockerd[2226]: time="2025-08-13T00:34:01.774056739Z" level=info msg="Initializing buildkit" Aug 13 00:34:01.784747 dockerd[2226]: time="2025-08-13T00:34:01.784708117Z" level=info msg="Completed buildkit initialization" Aug 13 00:34:01.788087 dockerd[2226]: time="2025-08-13T00:34:01.788049524Z" level=info msg="Daemon has completed initialization" Aug 13 00:34:01.788087 dockerd[2226]: time="2025-08-13T00:34:01.788075563Z" level=info msg="API listen on /run/docker.sock" Aug 13 00:34:01.788217 systemd[1]: Started docker.service - Docker Application Container Engine. Aug 13 00:34:00.602040 systemd-resolved[1817]: Clock change detected. Flushing caches. Aug 13 00:34:00.613094 systemd-journald[1439]: Time jumped backwards, rotating. Aug 13 00:34:00.602161 systemd-timesyncd[1818]: Contacted time server [2604:4300:a:299::164]:123 (2.flatcar.pool.ntp.org). Aug 13 00:34:00.602208 systemd-timesyncd[1818]: Initial clock synchronization to Wed 2025-08-13 00:34:00.602018 UTC. Aug 13 00:34:01.027348 containerd[1908]: time="2025-08-13T00:34:01.027214834Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.3\"" Aug 13 00:34:01.618743 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount216044892.mount: Deactivated successfully. 
Aug 13 00:34:02.388683 containerd[1908]: time="2025-08-13T00:34:02.388629018Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:02.388874 containerd[1908]: time="2025-08-13T00:34:02.388772486Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.3: active requests=0, bytes read=30078237" Aug 13 00:34:02.389134 containerd[1908]: time="2025-08-13T00:34:02.389098583Z" level=info msg="ImageCreate event name:\"sha256:a92b4b92a991677d355596cc4aa9b0b12cbc38e8cbdc1e476548518ae045bc4a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:02.390406 containerd[1908]: time="2025-08-13T00:34:02.390363113Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:125a8b488def5ea24e2de5682ab1abf063163aae4d89ce21811a45f3ecf23816\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:02.390950 containerd[1908]: time="2025-08-13T00:34:02.390910392Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.3\" with image id \"sha256:a92b4b92a991677d355596cc4aa9b0b12cbc38e8cbdc1e476548518ae045bc4a\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.3\", repo digest \"registry.k8s.io/kube-apiserver@sha256:125a8b488def5ea24e2de5682ab1abf063163aae4d89ce21811a45f3ecf23816\", size \"30075037\" in 1.363620253s" Aug 13 00:34:02.390950 containerd[1908]: time="2025-08-13T00:34:02.390927265Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.3\" returns image reference \"sha256:a92b4b92a991677d355596cc4aa9b0b12cbc38e8cbdc1e476548518ae045bc4a\"" Aug 13 00:34:02.391246 containerd[1908]: time="2025-08-13T00:34:02.391214918Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.3\"" Aug 13 00:34:03.399974 containerd[1908]: time="2025-08-13T00:34:03.399916113Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.3\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:03.400168 containerd[1908]: time="2025-08-13T00:34:03.400141044Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.3: active requests=0, bytes read=26019361" Aug 13 00:34:03.400494 containerd[1908]: time="2025-08-13T00:34:03.400455335Z" level=info msg="ImageCreate event name:\"sha256:bf97fadcef43049604abcf0caf4f35229fbee25bd0cdb6fdc1d2bbb4f03d9660\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:03.416401 containerd[1908]: time="2025-08-13T00:34:03.416326954Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:96091626e37c5d5920ee6c3203b783cc01a08f287ec0713aeb7809bb62ccea90\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:03.417477 containerd[1908]: time="2025-08-13T00:34:03.417413327Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.3\" with image id \"sha256:bf97fadcef43049604abcf0caf4f35229fbee25bd0cdb6fdc1d2bbb4f03d9660\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.3\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:96091626e37c5d5920ee6c3203b783cc01a08f287ec0713aeb7809bb62ccea90\", size \"27646922\" in 1.026178617s" Aug 13 00:34:03.417477 containerd[1908]: time="2025-08-13T00:34:03.417448522Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.3\" returns image reference \"sha256:bf97fadcef43049604abcf0caf4f35229fbee25bd0cdb6fdc1d2bbb4f03d9660\"" Aug 13 00:34:03.417938 containerd[1908]: time="2025-08-13T00:34:03.417884719Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.3\"" Aug 13 00:34:04.372512 containerd[1908]: time="2025-08-13T00:34:04.372486117Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:04.372682 containerd[1908]: time="2025-08-13T00:34:04.372670736Z" 
level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.3: active requests=0, bytes read=20155013" Aug 13 00:34:04.373151 containerd[1908]: time="2025-08-13T00:34:04.373111053Z" level=info msg="ImageCreate event name:\"sha256:41376797d5122e388663ab6d0ad583e58cff63e1a0f1eebfb31d615d8f1c1c87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:04.374388 containerd[1908]: time="2025-08-13T00:34:04.374348916Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:f3a2ffdd7483168205236f7762e9a1933f17dd733bc0188b52bddab9c0762868\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:04.375233 containerd[1908]: time="2025-08-13T00:34:04.375217935Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.3\" with image id \"sha256:41376797d5122e388663ab6d0ad583e58cff63e1a0f1eebfb31d615d8f1c1c87\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.3\", repo digest \"registry.k8s.io/kube-scheduler@sha256:f3a2ffdd7483168205236f7762e9a1933f17dd733bc0188b52bddab9c0762868\", size \"21782592\" in 957.304249ms" Aug 13 00:34:04.375258 containerd[1908]: time="2025-08-13T00:34:04.375234692Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.3\" returns image reference \"sha256:41376797d5122e388663ab6d0ad583e58cff63e1a0f1eebfb31d615d8f1c1c87\"" Aug 13 00:34:04.375520 containerd[1908]: time="2025-08-13T00:34:04.375479937Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.3\"" Aug 13 00:34:05.146978 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1196039024.mount: Deactivated successfully. 
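The `var-lib-containerd-tmpmounts-containerd\x2dmount1196039024.mount` units above use systemd's unit-name escaping, where `-` separates path components and a literal `-` in the path is encoded as `\x2d`. A simplified decoder sketch (real systemd handles more cases, e.g. instance names; the function name is my own):

```python
import re

def systemd_unescape(unit: str) -> str:
    """Decode a systemd-escaped unit name back to a filesystem path.

    Simplified: '-' separates path components and '\\xNN' encodes the raw
    byte 0xNN (so '\\x2d' is a literal '-'). Real systemd handles more
    cases (instance names, template units), which this sketch ignores.
    """
    name = unit.rsplit(".", 1)[0]        # drop the unit suffix (".mount", ".slice", ...)
    path = "/" + name.replace("-", "/")  # component separators back to '/'
    return re.sub(r"\\x([0-9a-fA-F]{2})",
                  lambda m: chr(int(m.group(1), 16)), path)
```

Applied to the unit above, this recovers `/var/lib/containerd/tmpmounts/containerd-mount1196039024`; the same scheme explains the `kubepods-burstable-pod<uid>.slice` names later in the log.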
Aug 13 00:34:05.348309 containerd[1908]: time="2025-08-13T00:34:05.348257239Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:05.348494 containerd[1908]: time="2025-08-13T00:34:05.348423565Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.3: active requests=0, bytes read=31892666" Aug 13 00:34:05.348824 containerd[1908]: time="2025-08-13T00:34:05.348783986Z" level=info msg="ImageCreate event name:\"sha256:af855adae796077ff822e22c0102f686b2ca7b7c51948889b1825388eaac9234\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:05.349590 containerd[1908]: time="2025-08-13T00:34:05.349549473Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:c69929cfba9e38305eb1e20ca859aeb90e0d2a7326eab9bb1e8298882fe626cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:05.349957 containerd[1908]: time="2025-08-13T00:34:05.349914592Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.3\" with image id \"sha256:af855adae796077ff822e22c0102f686b2ca7b7c51948889b1825388eaac9234\", repo tag \"registry.k8s.io/kube-proxy:v1.33.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:c69929cfba9e38305eb1e20ca859aeb90e0d2a7326eab9bb1e8298882fe626cd\", size \"31891685\" in 974.420811ms" Aug 13 00:34:05.349957 containerd[1908]: time="2025-08-13T00:34:05.349930496Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.3\" returns image reference \"sha256:af855adae796077ff822e22c0102f686b2ca7b7c51948889b1825388eaac9234\"" Aug 13 00:34:05.350171 containerd[1908]: time="2025-08-13T00:34:05.350161837Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Aug 13 00:34:05.842834 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2255833157.mount: Deactivated successfully. 
Aug 13 00:34:06.424743 containerd[1908]: time="2025-08-13T00:34:06.424717195Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:06.424941 containerd[1908]: time="2025-08-13T00:34:06.424913785Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942238" Aug 13 00:34:06.425298 containerd[1908]: time="2025-08-13T00:34:06.425271329Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:06.426966 containerd[1908]: time="2025-08-13T00:34:06.426954103Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:06.427415 containerd[1908]: time="2025-08-13T00:34:06.427399980Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 1.077224665s" Aug 13 00:34:06.427445 containerd[1908]: time="2025-08-13T00:34:06.427417003Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\"" Aug 13 00:34:06.427648 containerd[1908]: time="2025-08-13T00:34:06.427632009Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Aug 13 00:34:07.003226 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3068179279.mount: Deactivated successfully. 
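The containerd entries above report each completed pull with a Go-style duration (`1.363620253s`, `957.304249ms`, `1.077224665s`, ...). A small sketch for extracting those figures from a captured journal, e.g. to compare pull times across images; the message shape and sample values come from the log above, the function names are my own:

```python
import re

# containerd logs completed pulls as: msg="Pulled image \"<ref>\" ... in <go-duration>"
PULLED_RE = re.compile(
    r'Pulled image \\"(?P<image>[^\\]+?)\\".*? in (?P<dur>[0-9.]+(?:ms|s))'
)

def parse_go_duration(dur: str) -> float:
    """Convert a Go duration such as '957.304249ms' or '1.363620253s' to seconds."""
    if dur.endswith("ms"):
        return float(dur[:-2]) / 1000.0
    return float(dur[:-1])

def pull_times(lines):
    """Yield (image, seconds) for every completed pull found in the journal lines."""
    for line in lines:
        m = PULLED_RE.search(line)
        if m:
            yield m.group("image"), parse_go_duration(m.group("dur"))
```

Note the regex matches the backslash-escaped quotes (`\"`) exactly as they appear in the raw journal text.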
Aug 13 00:34:07.004362 containerd[1908]: time="2025-08-13T00:34:07.004343525Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 13 00:34:07.004563 containerd[1908]: time="2025-08-13T00:34:07.004550583Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Aug 13 00:34:07.004911 containerd[1908]: time="2025-08-13T00:34:07.004900270Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 13 00:34:07.005753 containerd[1908]: time="2025-08-13T00:34:07.005719046Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 13 00:34:07.006158 containerd[1908]: time="2025-08-13T00:34:07.006118724Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 578.456514ms" Aug 13 00:34:07.006158 containerd[1908]: time="2025-08-13T00:34:07.006132945Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Aug 13 00:34:07.006563 containerd[1908]: time="2025-08-13T00:34:07.006508056Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Aug 13 00:34:07.558982 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3766231938.mount: 
Deactivated successfully. Aug 13 00:34:07.930644 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Aug 13 00:34:07.932619 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:34:08.297859 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:34:08.299936 (kubelet)[2642]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 13 00:34:08.323277 kubelet[2642]: E0813 00:34:08.323250 2642 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 13 00:34:08.324706 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 13 00:34:08.324803 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 13 00:34:08.325034 systemd[1]: kubelet.service: Consumed 126ms CPU time, 114.3M memory peak. 
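The kubelet failure above (exit status 1 because `/var/lib/kubelet/config.yaml` does not exist yet) is the expected state on a node that has not completed `kubeadm init`/`kubeadm join`, which writes that file. The message itself uses the klog header format: severity letter, MMDD, wall time, PID, `file:line`, then the message. A sketch for splitting that header out when post-processing a journal (field names are my own):

```python
import re

# klog header: [IWEF]MMDD hh:mm:ss.uuuuuu  <pid> <file>:<line>] <message>
KLOG_RE = re.compile(
    r'(?P<sev>[IWEF])(?P<mmdd>\d{4}) (?P<time>\d{2}:\d{2}:\d{2}\.\d+)\s+'
    r'(?P<pid>\d+) (?P<src>[\w.]+:\d+)\] (?P<msg>.*)'
)

SEVERITIES = {"I": "INFO", "W": "WARNING", "E": "ERROR", "F": "FATAL"}

def parse_klog(line):
    """Return the parsed klog header fields for a journal line, or None."""
    m = KLOG_RE.search(line)
    if not m:
        return None
    return {
        "severity": SEVERITIES[m.group("sev")],
        "month": int(m.group("mmdd")[:2]),
        "day": int(m.group("mmdd")[2:]),
        "time": m.group("time"),
        "pid": int(m.group("pid")),
        "source": m.group("src"),
        "message": m.group("msg"),
    }
```

Applied to the `E0813 00:34:08.323250 2642 run.go:72]` line above, this yields an ERROR record pointing at `run.go:72` in PID 2642.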
Aug 13 00:34:08.721135 containerd[1908]: time="2025-08-13T00:34:08.721109288Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:08.721338 containerd[1908]: time="2025-08-13T00:34:08.721326356Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=58247175" Aug 13 00:34:08.721685 containerd[1908]: time="2025-08-13T00:34:08.721674677Z" level=info msg="ImageCreate event name:\"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:08.723368 containerd[1908]: time="2025-08-13T00:34:08.723324617Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:08.723836 containerd[1908]: time="2025-08-13T00:34:08.723796006Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"58938593\" in 1.717259175s" Aug 13 00:34:08.723836 containerd[1908]: time="2025-08-13T00:34:08.723811808Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\"" Aug 13 00:34:11.583099 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:34:11.583292 systemd[1]: kubelet.service: Consumed 126ms CPU time, 114.3M memory peak. Aug 13 00:34:11.584710 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:34:11.599060 systemd[1]: Reload requested from client PID 2712 ('systemctl') (unit session-11.scope)... 
Aug 13 00:34:11.599067 systemd[1]: Reloading... Aug 13 00:34:11.642231 zram_generator::config[2756]: No configuration found. Aug 13 00:34:11.697996 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Aug 13 00:34:11.785789 systemd[1]: Reloading finished in 186 ms. Aug 13 00:34:11.829043 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Aug 13 00:34:11.829100 systemd[1]: kubelet.service: Failed with result 'signal'. Aug 13 00:34:11.829249 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:34:11.830405 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:34:12.093649 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:34:12.095702 (kubelet)[2822]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Aug 13 00:34:12.115716 kubelet[2822]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 13 00:34:12.115716 kubelet[2822]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Aug 13 00:34:12.115716 kubelet[2822]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
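The `docker.socket:6: ListenStream= references a path below legacy directory /var/run/` message above is systemd rewriting the legacy socket path at load time; since Flatcar ships the unit read-only under `/usr`, the usual way to silence it is an `/etc` drop-in that resets `ListenStream`. A sketch, assuming the stock docker.socket layout (the drop-in filename is my own):

```ini
# /etc/systemd/system/docker.socket.d/10-run-path.conf (illustrative path)
[Socket]
# An empty assignment clears the inherited ListenStream= before re-setting it.
ListenStream=
ListenStream=/run/docker.sock
```

After `systemctl daemon-reload`, the warning should no longer appear on subsequent reloads.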
Aug 13 00:34:12.115940 kubelet[2822]: I0813 00:34:12.115769 2822 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Aug 13 00:34:12.542650 kubelet[2822]: I0813 00:34:12.542639 2822 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Aug 13 00:34:12.542650 kubelet[2822]: I0813 00:34:12.542649 2822 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Aug 13 00:34:12.542763 kubelet[2822]: I0813 00:34:12.542757 2822 server.go:956] "Client rotation is on, will bootstrap in background" Aug 13 00:34:12.568968 kubelet[2822]: I0813 00:34:12.568909 2822 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Aug 13 00:34:12.569623 kubelet[2822]: E0813 00:34:12.569583 2822 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://147.75.71.77:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 147.75.71.77:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Aug 13 00:34:12.573302 kubelet[2822]: I0813 00:34:12.573269 2822 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Aug 13 00:34:12.582027 kubelet[2822]: I0813 00:34:12.582014 2822 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Aug 13 00:34:12.582143 kubelet[2822]: I0813 00:34:12.582127 2822 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Aug 13 00:34:12.582299 kubelet[2822]: I0813 00:34:12.582142 2822 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4372.1.0-a-083aa5303b","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Aug 13 00:34:12.582374 kubelet[2822]: I0813 00:34:12.582302 2822 topology_manager.go:138] "Creating topology manager with none policy" Aug 13 
00:34:12.582374 kubelet[2822]: I0813 00:34:12.582311 2822 container_manager_linux.go:303] "Creating device plugin manager" Aug 13 00:34:12.582424 kubelet[2822]: I0813 00:34:12.582396 2822 state_mem.go:36] "Initialized new in-memory state store" Aug 13 00:34:12.584504 kubelet[2822]: I0813 00:34:12.584480 2822 kubelet.go:480] "Attempting to sync node with API server" Aug 13 00:34:12.584504 kubelet[2822]: I0813 00:34:12.584504 2822 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Aug 13 00:34:12.584558 kubelet[2822]: I0813 00:34:12.584520 2822 kubelet.go:386] "Adding apiserver pod source" Aug 13 00:34:12.584558 kubelet[2822]: I0813 00:34:12.584532 2822 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Aug 13 00:34:12.588149 kubelet[2822]: I0813 00:34:12.588139 2822 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Aug 13 00:34:12.588431 kubelet[2822]: I0813 00:34:12.588424 2822 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Aug 13 00:34:12.589648 kubelet[2822]: W0813 00:34:12.589641 2822 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Aug 13 00:34:12.590859 kubelet[2822]: E0813 00:34:12.590809 2822 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://147.75.71.77:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4372.1.0-a-083aa5303b&limit=500&resourceVersion=0\": dial tcp 147.75.71.77:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Aug 13 00:34:12.591049 kubelet[2822]: I0813 00:34:12.591041 2822 watchdog_linux.go:99] "Systemd watchdog is not enabled" Aug 13 00:34:12.591079 kubelet[2822]: I0813 00:34:12.591066 2822 server.go:1289] "Started kubelet" Aug 13 00:34:12.591135 kubelet[2822]: I0813 00:34:12.591110 2822 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Aug 13 00:34:12.591610 kubelet[2822]: E0813 00:34:12.591591 2822 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://147.75.71.77:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 147.75.71.77:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Aug 13 00:34:12.591983 kubelet[2822]: I0813 00:34:12.591975 2822 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Aug 13 00:34:12.592025 kubelet[2822]: I0813 00:34:12.592007 2822 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Aug 13 00:34:12.592059 kubelet[2822]: E0813 00:34:12.592039 2822 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4372.1.0-a-083aa5303b\" not found" Aug 13 00:34:12.592090 kubelet[2822]: I0813 00:34:12.592071 2822 volume_manager.go:297] "Starting Kubelet Volume Manager" Aug 13 00:34:12.592122 kubelet[2822]: I0813 00:34:12.592114 2822 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Aug 13 00:34:12.592611 
kubelet[2822]: I0813 00:34:12.592320 2822 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Aug 13 00:34:12.592611 kubelet[2822]: I0813 00:34:12.592598 2822 reconciler.go:26] "Reconciler: start to sync state" Aug 13 00:34:12.592611 kubelet[2822]: I0813 00:34:12.592617 2822 server.go:317] "Adding debug handlers to kubelet server" Aug 13 00:34:12.592713 kubelet[2822]: E0813 00:34:12.592632 2822 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Aug 13 00:34:12.595280 kubelet[2822]: E0813 00:34:12.595259 2822 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://147.75.71.77:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 147.75.71.77:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Aug 13 00:34:12.595346 kubelet[2822]: E0813 00:34:12.595275 2822 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://147.75.71.77:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372.1.0-a-083aa5303b?timeout=10s\": dial tcp 147.75.71.77:6443: connect: connection refused" interval="200ms" Aug 13 00:34:12.595346 kubelet[2822]: I0813 00:34:12.595326 2822 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Aug 13 00:34:12.595951 kubelet[2822]: I0813 00:34:12.595934 2822 factory.go:223] Registration of the containerd container factory successfully Aug 13 00:34:12.595951 kubelet[2822]: I0813 00:34:12.595951 2822 factory.go:223] Registration of the systemd container factory successfully Aug 13 00:34:12.596201 kubelet[2822]: I0813 00:34:12.596187 2822 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix 
/var/run/crio/crio.sock: connect: no such file or directory Aug 13 00:34:12.597117 kubelet[2822]: E0813 00:34:12.596180 2822 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://147.75.71.77:6443/api/v1/namespaces/default/events\": dial tcp 147.75.71.77:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4372.1.0-a-083aa5303b.185b2c5bcd6cdef0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4372.1.0-a-083aa5303b,UID:ci-4372.1.0-a-083aa5303b,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4372.1.0-a-083aa5303b,},FirstTimestamp:2025-08-13 00:34:12.59105048 +0000 UTC m=+0.493292266,LastTimestamp:2025-08-13 00:34:12.59105048 +0000 UTC m=+0.493292266,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4372.1.0-a-083aa5303b,}" Aug 13 00:34:12.602055 kubelet[2822]: I0813 00:34:12.602028 2822 cpu_manager.go:221] "Starting CPU manager" policy="none" Aug 13 00:34:12.602055 kubelet[2822]: I0813 00:34:12.602056 2822 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Aug 13 00:34:12.602102 kubelet[2822]: I0813 00:34:12.602066 2822 state_mem.go:36] "Initialized new in-memory state store" Aug 13 00:34:12.603073 kubelet[2822]: I0813 00:34:12.603067 2822 policy_none.go:49] "None policy: Start" Aug 13 00:34:12.603097 kubelet[2822]: I0813 00:34:12.603075 2822 memory_manager.go:186] "Starting memorymanager" policy="None" Aug 13 00:34:12.603097 kubelet[2822]: I0813 00:34:12.603081 2822 state_mem.go:35] "Initializing new in-memory state store" Aug 13 00:34:12.604181 kubelet[2822]: I0813 00:34:12.604166 2822 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Aug 13 00:34:12.604704 kubelet[2822]: I0813 00:34:12.604695 2822 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Aug 13 00:34:12.604729 kubelet[2822]: I0813 00:34:12.604705 2822 status_manager.go:230] "Starting to sync pod status with apiserver" Aug 13 00:34:12.604729 kubelet[2822]: I0813 00:34:12.604719 2822 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Aug 13 00:34:12.604729 kubelet[2822]: I0813 00:34:12.604724 2822 kubelet.go:2436] "Starting kubelet main sync loop" Aug 13 00:34:12.604768 kubelet[2822]: E0813 00:34:12.604746 2822 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Aug 13 00:34:12.605561 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Aug 13 00:34:12.606470 kubelet[2822]: E0813 00:34:12.606458 2822 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://147.75.71.77:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 147.75.71.77:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Aug 13 00:34:12.621935 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Aug 13 00:34:12.623686 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Aug 13 00:34:12.644120 kubelet[2822]: E0813 00:34:12.644059 2822 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Aug 13 00:34:12.644227 kubelet[2822]: I0813 00:34:12.644215 2822 eviction_manager.go:189] "Eviction manager: starting control loop" Aug 13 00:34:12.644266 kubelet[2822]: I0813 00:34:12.644224 2822 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Aug 13 00:34:12.644414 kubelet[2822]: I0813 00:34:12.644403 2822 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Aug 13 00:34:12.644752 kubelet[2822]: E0813 00:34:12.644709 2822 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Aug 13 00:34:12.644752 kubelet[2822]: E0813 00:34:12.644732 2822 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4372.1.0-a-083aa5303b\" not found" Aug 13 00:34:12.714596 systemd[1]: Created slice kubepods-burstable-pod54e4fd1f6bad5f4de475981db5c39435.slice - libcontainer container kubepods-burstable-pod54e4fd1f6bad5f4de475981db5c39435.slice. Aug 13 00:34:12.735840 kubelet[2822]: E0813 00:34:12.735800 2822 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372.1.0-a-083aa5303b\" not found" node="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:12.738937 systemd[1]: Created slice kubepods-burstable-pod4437afbba4064448e269d9c196183bd5.slice - libcontainer container kubepods-burstable-pod4437afbba4064448e269d9c196183bd5.slice. 
Aug 13 00:34:12.740333 kubelet[2822]: E0813 00:34:12.740309 2822 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372.1.0-a-083aa5303b\" not found" node="ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:12.742108 systemd[1]: Created slice kubepods-burstable-pod54ba87ba354053e283e8aa896fc562be.slice - libcontainer container kubepods-burstable-pod54ba87ba354053e283e8aa896fc562be.slice.
Aug 13 00:34:12.743318 kubelet[2822]: E0813 00:34:12.743286 2822 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372.1.0-a-083aa5303b\" not found" node="ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:12.747125 kubelet[2822]: I0813 00:34:12.747088 2822 kubelet_node_status.go:75] "Attempting to register node" node="ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:12.747837 kubelet[2822]: E0813 00:34:12.747776 2822 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://147.75.71.77:6443/api/v1/nodes\": dial tcp 147.75.71.77:6443: connect: connection refused" node="ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:12.796647 kubelet[2822]: E0813 00:34:12.796451 2822 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://147.75.71.77:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372.1.0-a-083aa5303b?timeout=10s\": dial tcp 147.75.71.77:6443: connect: connection refused" interval="400ms"
Aug 13 00:34:12.894598 kubelet[2822]: I0813 00:34:12.894484 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4437afbba4064448e269d9c196183bd5-k8s-certs\") pod \"kube-apiserver-ci-4372.1.0-a-083aa5303b\" (UID: \"4437afbba4064448e269d9c196183bd5\") " pod="kube-system/kube-apiserver-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:12.894598 kubelet[2822]: I0813 00:34:12.894571 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4437afbba4064448e269d9c196183bd5-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4372.1.0-a-083aa5303b\" (UID: \"4437afbba4064448e269d9c196183bd5\") " pod="kube-system/kube-apiserver-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:12.894932 kubelet[2822]: I0813 00:34:12.894662 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/54ba87ba354053e283e8aa896fc562be-flexvolume-dir\") pod \"kube-controller-manager-ci-4372.1.0-a-083aa5303b\" (UID: \"54ba87ba354053e283e8aa896fc562be\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:12.894932 kubelet[2822]: I0813 00:34:12.894708 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/54ba87ba354053e283e8aa896fc562be-k8s-certs\") pod \"kube-controller-manager-ci-4372.1.0-a-083aa5303b\" (UID: \"54ba87ba354053e283e8aa896fc562be\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:12.894932 kubelet[2822]: I0813 00:34:12.894756 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/54ba87ba354053e283e8aa896fc562be-kubeconfig\") pod \"kube-controller-manager-ci-4372.1.0-a-083aa5303b\" (UID: \"54ba87ba354053e283e8aa896fc562be\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:12.894932 kubelet[2822]: I0813 00:34:12.894797 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/54e4fd1f6bad5f4de475981db5c39435-kubeconfig\") pod \"kube-scheduler-ci-4372.1.0-a-083aa5303b\" (UID: \"54e4fd1f6bad5f4de475981db5c39435\") " pod="kube-system/kube-scheduler-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:12.894932 kubelet[2822]: I0813 00:34:12.894836 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4437afbba4064448e269d9c196183bd5-ca-certs\") pod \"kube-apiserver-ci-4372.1.0-a-083aa5303b\" (UID: \"4437afbba4064448e269d9c196183bd5\") " pod="kube-system/kube-apiserver-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:12.895483 kubelet[2822]: I0813 00:34:12.894876 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/54ba87ba354053e283e8aa896fc562be-ca-certs\") pod \"kube-controller-manager-ci-4372.1.0-a-083aa5303b\" (UID: \"54ba87ba354053e283e8aa896fc562be\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:12.895483 kubelet[2822]: I0813 00:34:12.894921 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/54ba87ba354053e283e8aa896fc562be-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4372.1.0-a-083aa5303b\" (UID: \"54ba87ba354053e283e8aa896fc562be\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:12.952206 kubelet[2822]: I0813 00:34:12.952129 2822 kubelet_node_status.go:75] "Attempting to register node" node="ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:12.953058 kubelet[2822]: E0813 00:34:12.952949 2822 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://147.75.71.77:6443/api/v1/nodes\": dial tcp 147.75.71.77:6443: connect: connection refused" node="ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:13.037931 containerd[1908]: time="2025-08-13T00:34:13.037797911Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4372.1.0-a-083aa5303b,Uid:54e4fd1f6bad5f4de475981db5c39435,Namespace:kube-system,Attempt:0,}"
Aug 13 00:34:13.041200 containerd[1908]: time="2025-08-13T00:34:13.041186544Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4372.1.0-a-083aa5303b,Uid:4437afbba4064448e269d9c196183bd5,Namespace:kube-system,Attempt:0,}"
Aug 13 00:34:13.044687 containerd[1908]: time="2025-08-13T00:34:13.044608513Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4372.1.0-a-083aa5303b,Uid:54ba87ba354053e283e8aa896fc562be,Namespace:kube-system,Attempt:0,}"
Aug 13 00:34:13.049488 containerd[1908]: time="2025-08-13T00:34:13.049428338Z" level=info msg="connecting to shim a1f919168dcb0036512a15ee7180107dcb9c85c19b56a04b2947204576491a3a" address="unix:///run/containerd/s/30d59a95974e1f31960c05b91b8fd665460abaca595161043785dd176c930a8f" namespace=k8s.io protocol=ttrpc version=3
Aug 13 00:34:13.051526 containerd[1908]: time="2025-08-13T00:34:13.051498628Z" level=info msg="connecting to shim 9ca1b79a13c886f792eb4cbd45d4eec140d04feb72f87a16faa4b48280e4a8ed" address="unix:///run/containerd/s/cf6a2923c05880318e92b615468b6af80a2cd8ee665eccb4c88b3402811caf50" namespace=k8s.io protocol=ttrpc version=3
Aug 13 00:34:13.053990 containerd[1908]: time="2025-08-13T00:34:13.053956108Z" level=info msg="connecting to shim ec760eed6c55f2d4e675297a91eed591d4a3cb1aa39892c350e3a6e3b459598e" address="unix:///run/containerd/s/fd7c138ea5de536a3dd8ed05fc7c291893fbae6b430e9c8c76b2920fbb017d7b" namespace=k8s.io protocol=ttrpc version=3
Aug 13 00:34:13.069696 systemd[1]: Started cri-containerd-a1f919168dcb0036512a15ee7180107dcb9c85c19b56a04b2947204576491a3a.scope - libcontainer container a1f919168dcb0036512a15ee7180107dcb9c85c19b56a04b2947204576491a3a.
Aug 13 00:34:13.083161 systemd[1]: Started cri-containerd-9ca1b79a13c886f792eb4cbd45d4eec140d04feb72f87a16faa4b48280e4a8ed.scope - libcontainer container 9ca1b79a13c886f792eb4cbd45d4eec140d04feb72f87a16faa4b48280e4a8ed.
Aug 13 00:34:13.086632 systemd[1]: Started cri-containerd-ec760eed6c55f2d4e675297a91eed591d4a3cb1aa39892c350e3a6e3b459598e.scope - libcontainer container ec760eed6c55f2d4e675297a91eed591d4a3cb1aa39892c350e3a6e3b459598e.
Aug 13 00:34:13.145046 containerd[1908]: time="2025-08-13T00:34:13.145025719Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4372.1.0-a-083aa5303b,Uid:54e4fd1f6bad5f4de475981db5c39435,Namespace:kube-system,Attempt:0,} returns sandbox id \"a1f919168dcb0036512a15ee7180107dcb9c85c19b56a04b2947204576491a3a\""
Aug 13 00:34:13.147362 containerd[1908]: time="2025-08-13T00:34:13.147337466Z" level=info msg="CreateContainer within sandbox \"a1f919168dcb0036512a15ee7180107dcb9c85c19b56a04b2947204576491a3a\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Aug 13 00:34:13.148026 containerd[1908]: time="2025-08-13T00:34:13.148009556Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4372.1.0-a-083aa5303b,Uid:4437afbba4064448e269d9c196183bd5,Namespace:kube-system,Attempt:0,} returns sandbox id \"9ca1b79a13c886f792eb4cbd45d4eec140d04feb72f87a16faa4b48280e4a8ed\""
Aug 13 00:34:13.148256 containerd[1908]: time="2025-08-13T00:34:13.148245458Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4372.1.0-a-083aa5303b,Uid:54ba87ba354053e283e8aa896fc562be,Namespace:kube-system,Attempt:0,} returns sandbox id \"ec760eed6c55f2d4e675297a91eed591d4a3cb1aa39892c350e3a6e3b459598e\""
Aug 13 00:34:13.149835 containerd[1908]: time="2025-08-13T00:34:13.149818721Z" level=info msg="CreateContainer within sandbox \"9ca1b79a13c886f792eb4cbd45d4eec140d04feb72f87a16faa4b48280e4a8ed\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Aug 13 00:34:13.150282 containerd[1908]: time="2025-08-13T00:34:13.150267327Z" level=info msg="CreateContainer within sandbox \"ec760eed6c55f2d4e675297a91eed591d4a3cb1aa39892c350e3a6e3b459598e\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Aug 13 00:34:13.151524 containerd[1908]: time="2025-08-13T00:34:13.151513479Z" level=info msg="Container 7ee79acb85ce47824a4d787924f4e76b4df8968a53dce12a0732dcf1424da19d: CDI devices from CRI Config.CDIDevices: []"
Aug 13 00:34:13.153376 containerd[1908]: time="2025-08-13T00:34:13.153364345Z" level=info msg="Container 1b7ce64e8d512483d94ad8bcbfdd8737891b0ae54cda931a08bf14834451eae4: CDI devices from CRI Config.CDIDevices: []"
Aug 13 00:34:13.154602 containerd[1908]: time="2025-08-13T00:34:13.154590763Z" level=info msg="Container 6f7ac4966d318b42daf19e83eebffd97d06fec5bb94d0c21f94b03f47b4663df: CDI devices from CRI Config.CDIDevices: []"
Aug 13 00:34:13.154632 containerd[1908]: time="2025-08-13T00:34:13.154621481Z" level=info msg="CreateContainer within sandbox \"a1f919168dcb0036512a15ee7180107dcb9c85c19b56a04b2947204576491a3a\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"7ee79acb85ce47824a4d787924f4e76b4df8968a53dce12a0732dcf1424da19d\""
Aug 13 00:34:13.154854 containerd[1908]: time="2025-08-13T00:34:13.154841489Z" level=info msg="StartContainer for \"7ee79acb85ce47824a4d787924f4e76b4df8968a53dce12a0732dcf1424da19d\""
Aug 13 00:34:13.155391 containerd[1908]: time="2025-08-13T00:34:13.155378254Z" level=info msg="connecting to shim 7ee79acb85ce47824a4d787924f4e76b4df8968a53dce12a0732dcf1424da19d" address="unix:///run/containerd/s/30d59a95974e1f31960c05b91b8fd665460abaca595161043785dd176c930a8f" protocol=ttrpc version=3
Aug 13 00:34:13.155595 containerd[1908]: time="2025-08-13T00:34:13.155583728Z" level=info msg="CreateContainer within sandbox \"9ca1b79a13c886f792eb4cbd45d4eec140d04feb72f87a16faa4b48280e4a8ed\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"1b7ce64e8d512483d94ad8bcbfdd8737891b0ae54cda931a08bf14834451eae4\""
Aug 13 00:34:13.155739 containerd[1908]: time="2025-08-13T00:34:13.155728763Z" level=info msg="StartContainer for \"1b7ce64e8d512483d94ad8bcbfdd8737891b0ae54cda931a08bf14834451eae4\""
Aug 13 00:34:13.156335 containerd[1908]: time="2025-08-13T00:34:13.156322515Z" level=info msg="connecting to shim 1b7ce64e8d512483d94ad8bcbfdd8737891b0ae54cda931a08bf14834451eae4" address="unix:///run/containerd/s/cf6a2923c05880318e92b615468b6af80a2cd8ee665eccb4c88b3402811caf50" protocol=ttrpc version=3
Aug 13 00:34:13.157354 containerd[1908]: time="2025-08-13T00:34:13.157340219Z" level=info msg="CreateContainer within sandbox \"ec760eed6c55f2d4e675297a91eed591d4a3cb1aa39892c350e3a6e3b459598e\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"6f7ac4966d318b42daf19e83eebffd97d06fec5bb94d0c21f94b03f47b4663df\""
Aug 13 00:34:13.157528 containerd[1908]: time="2025-08-13T00:34:13.157518360Z" level=info msg="StartContainer for \"6f7ac4966d318b42daf19e83eebffd97d06fec5bb94d0c21f94b03f47b4663df\""
Aug 13 00:34:13.158020 containerd[1908]: time="2025-08-13T00:34:13.158005721Z" level=info msg="connecting to shim 6f7ac4966d318b42daf19e83eebffd97d06fec5bb94d0c21f94b03f47b4663df" address="unix:///run/containerd/s/fd7c138ea5de536a3dd8ed05fc7c291893fbae6b430e9c8c76b2920fbb017d7b" protocol=ttrpc version=3
Aug 13 00:34:13.181430 systemd[1]: Started cri-containerd-1b7ce64e8d512483d94ad8bcbfdd8737891b0ae54cda931a08bf14834451eae4.scope - libcontainer container 1b7ce64e8d512483d94ad8bcbfdd8737891b0ae54cda931a08bf14834451eae4.
Aug 13 00:34:13.181982 systemd[1]: Started cri-containerd-7ee79acb85ce47824a4d787924f4e76b4df8968a53dce12a0732dcf1424da19d.scope - libcontainer container 7ee79acb85ce47824a4d787924f4e76b4df8968a53dce12a0732dcf1424da19d.
Aug 13 00:34:13.183764 systemd[1]: Started cri-containerd-6f7ac4966d318b42daf19e83eebffd97d06fec5bb94d0c21f94b03f47b4663df.scope - libcontainer container 6f7ac4966d318b42daf19e83eebffd97d06fec5bb94d0c21f94b03f47b4663df.
Aug 13 00:34:13.196929 kubelet[2822]: E0813 00:34:13.196906 2822 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://147.75.71.77:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372.1.0-a-083aa5303b?timeout=10s\": dial tcp 147.75.71.77:6443: connect: connection refused" interval="800ms"
Aug 13 00:34:13.221513 containerd[1908]: time="2025-08-13T00:34:13.221482052Z" level=info msg="StartContainer for \"7ee79acb85ce47824a4d787924f4e76b4df8968a53dce12a0732dcf1424da19d\" returns successfully"
Aug 13 00:34:13.221611 containerd[1908]: time="2025-08-13T00:34:13.221536030Z" level=info msg="StartContainer for \"6f7ac4966d318b42daf19e83eebffd97d06fec5bb94d0c21f94b03f47b4663df\" returns successfully"
Aug 13 00:34:13.221611 containerd[1908]: time="2025-08-13T00:34:13.221586637Z" level=info msg="StartContainer for \"1b7ce64e8d512483d94ad8bcbfdd8737891b0ae54cda931a08bf14834451eae4\" returns successfully"
Aug 13 00:34:13.354380 kubelet[2822]: I0813 00:34:13.354297 2822 kubelet_node_status.go:75] "Attempting to register node" node="ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:13.607799 kubelet[2822]: E0813 00:34:13.607760 2822 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372.1.0-a-083aa5303b\" not found" node="ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:13.608343 kubelet[2822]: E0813 00:34:13.608333 2822 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372.1.0-a-083aa5303b\" not found" node="ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:13.609082 kubelet[2822]: E0813 00:34:13.609076 2822 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372.1.0-a-083aa5303b\" not found" node="ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:13.999551 kubelet[2822]: E0813 00:34:13.999507 2822 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4372.1.0-a-083aa5303b\" not found" node="ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:14.036780 kubelet[2822]: I0813 00:34:14.036759 2822 kubelet_node_status.go:78] "Successfully registered node" node="ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:14.036870 kubelet[2822]: E0813 00:34:14.036785 2822 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4372.1.0-a-083aa5303b\": node \"ci-4372.1.0-a-083aa5303b\" not found"
Aug 13 00:34:14.042853 kubelet[2822]: E0813 00:34:14.042833 2822 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4372.1.0-a-083aa5303b\" not found"
Aug 13 00:34:14.192746 kubelet[2822]: I0813 00:34:14.192699 2822 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:14.196077 kubelet[2822]: E0813 00:34:14.196032 2822 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4372.1.0-a-083aa5303b\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:14.196077 kubelet[2822]: I0813 00:34:14.196046 2822 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:14.196725 kubelet[2822]: E0813 00:34:14.196689 2822 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4372.1.0-a-083aa5303b\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:14.196725 kubelet[2822]: I0813 00:34:14.196699 2822 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:14.197609 kubelet[2822]: E0813 00:34:14.197599 2822 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4372.1.0-a-083aa5303b\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:14.590163 kubelet[2822]: I0813 00:34:14.590055 2822 apiserver.go:52] "Watching apiserver"
Aug 13 00:34:14.611043 kubelet[2822]: I0813 00:34:14.610981 2822 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:14.611385 kubelet[2822]: I0813 00:34:14.611230 2822 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:14.615386 kubelet[2822]: E0813 00:34:14.615295 2822 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4372.1.0-a-083aa5303b\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:14.615545 kubelet[2822]: E0813 00:34:14.615309 2822 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4372.1.0-a-083aa5303b\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:14.693032 kubelet[2822]: I0813 00:34:14.692961 2822 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Aug 13 00:34:16.543915 systemd[1]: Reload requested from client PID 3145 ('systemctl') (unit session-11.scope)...
Aug 13 00:34:16.543922 systemd[1]: Reloading...
Aug 13 00:34:16.587232 zram_generator::config[3190]: No configuration found.
Aug 13 00:34:16.648144 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Aug 13 00:34:16.744713 systemd[1]: Reloading finished in 200 ms.
Aug 13 00:34:16.765442 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 13 00:34:16.775366 systemd[1]: kubelet.service: Deactivated successfully.
Aug 13 00:34:16.775522 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 13 00:34:16.775550 systemd[1]: kubelet.service: Consumed 894ms CPU time, 140.3M memory peak.
Aug 13 00:34:16.776599 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 13 00:34:17.067695 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 13 00:34:17.069705 (kubelet)[3254]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Aug 13 00:34:17.090846 kubelet[3254]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Aug 13 00:34:17.090846 kubelet[3254]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Aug 13 00:34:17.090846 kubelet[3254]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Aug 13 00:34:17.091065 kubelet[3254]: I0813 00:34:17.090873 3254 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Aug 13 00:34:17.094485 kubelet[3254]: I0813 00:34:17.094444 3254 server.go:530] "Kubelet version" kubeletVersion="v1.33.0"
Aug 13 00:34:17.094485 kubelet[3254]: I0813 00:34:17.094454 3254 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Aug 13 00:34:17.094585 kubelet[3254]: I0813 00:34:17.094547 3254 server.go:956] "Client rotation is on, will bootstrap in background"
Aug 13 00:34:17.095784 kubelet[3254]: I0813 00:34:17.095749 3254 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Aug 13 00:34:17.097640 kubelet[3254]: I0813 00:34:17.097603 3254 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Aug 13 00:34:17.099420 kubelet[3254]: I0813 00:34:17.099407 3254 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Aug 13 00:34:17.106103 kubelet[3254]: I0813 00:34:17.106058 3254 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Aug 13 00:34:17.106248 kubelet[3254]: I0813 00:34:17.106172 3254 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Aug 13 00:34:17.106280 kubelet[3254]: I0813 00:34:17.106190 3254 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4372.1.0-a-083aa5303b","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Aug 13 00:34:17.106280 kubelet[3254]: I0813 00:34:17.106269 3254 topology_manager.go:138] "Creating topology manager with none policy"
Aug 13 00:34:17.106280 kubelet[3254]: I0813 00:34:17.106274 3254 container_manager_linux.go:303] "Creating device plugin manager"
Aug 13 00:34:17.106359 kubelet[3254]: I0813 00:34:17.106301 3254 state_mem.go:36] "Initialized new in-memory state store"
Aug 13 00:34:17.106450 kubelet[3254]: I0813 00:34:17.106413 3254 kubelet.go:480] "Attempting to sync node with API server"
Aug 13 00:34:17.106450 kubelet[3254]: I0813 00:34:17.106420 3254 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Aug 13 00:34:17.106450 kubelet[3254]: I0813 00:34:17.106431 3254 kubelet.go:386] "Adding apiserver pod source"
Aug 13 00:34:17.106450 kubelet[3254]: I0813 00:34:17.106438 3254 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Aug 13 00:34:17.107768 kubelet[3254]: I0813 00:34:17.107748 3254 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1"
Aug 13 00:34:17.108039 kubelet[3254]: I0813 00:34:17.108031 3254 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Aug 13 00:34:17.109216 kubelet[3254]: I0813 00:34:17.109208 3254 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Aug 13 00:34:17.109253 kubelet[3254]: I0813 00:34:17.109230 3254 server.go:1289] "Started kubelet"
Aug 13 00:34:17.109298 kubelet[3254]: I0813 00:34:17.109272 3254 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Aug 13 00:34:17.109329 kubelet[3254]: I0813 00:34:17.109296 3254 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Aug 13 00:34:17.109492 kubelet[3254]: I0813 00:34:17.109481 3254 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Aug 13 00:34:17.110029 kubelet[3254]: I0813 00:34:17.110019 3254 server.go:317] "Adding debug handlers to kubelet server"
Aug 13 00:34:17.110070 kubelet[3254]: I0813 00:34:17.110059 3254 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Aug 13 00:34:17.110070 kubelet[3254]: I0813 00:34:17.110063 3254 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Aug 13 00:34:17.110124 kubelet[3254]: E0813 00:34:17.110107 3254 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4372.1.0-a-083aa5303b\" not found"
Aug 13 00:34:17.110297 kubelet[3254]: I0813 00:34:17.110143 3254 volume_manager.go:297] "Starting Kubelet Volume Manager"
Aug 13 00:34:17.110297 kubelet[3254]: I0813 00:34:17.110245 3254 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Aug 13 00:34:17.111130 kubelet[3254]: E0813 00:34:17.111063 3254 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Aug 13 00:34:17.111130 kubelet[3254]: I0813 00:34:17.111104 3254 reconciler.go:26] "Reconciler: start to sync state"
Aug 13 00:34:17.111477 kubelet[3254]: I0813 00:34:17.111464 3254 factory.go:223] Registration of the systemd container factory successfully
Aug 13 00:34:17.111584 kubelet[3254]: I0813 00:34:17.111562 3254 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Aug 13 00:34:17.112436 kubelet[3254]: I0813 00:34:17.112425 3254 factory.go:223] Registration of the containerd container factory successfully
Aug 13 00:34:17.117021 kubelet[3254]: I0813 00:34:17.116999 3254 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Aug 13 00:34:17.117517 kubelet[3254]: I0813 00:34:17.117507 3254 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Aug 13 00:34:17.117517 kubelet[3254]: I0813 00:34:17.117517 3254 status_manager.go:230] "Starting to sync pod status with apiserver"
Aug 13 00:34:17.117579 kubelet[3254]: I0813 00:34:17.117528 3254 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Aug 13 00:34:17.117579 kubelet[3254]: I0813 00:34:17.117533 3254 kubelet.go:2436] "Starting kubelet main sync loop"
Aug 13 00:34:17.117579 kubelet[3254]: E0813 00:34:17.117554 3254 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Aug 13 00:34:17.126078 kubelet[3254]: I0813 00:34:17.126063 3254 cpu_manager.go:221] "Starting CPU manager" policy="none"
Aug 13 00:34:17.126078 kubelet[3254]: I0813 00:34:17.126074 3254 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Aug 13 00:34:17.126162 kubelet[3254]: I0813 00:34:17.126086 3254 state_mem.go:36] "Initialized new in-memory state store"
Aug 13 00:34:17.126200 kubelet[3254]: I0813 00:34:17.126165 3254 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Aug 13 00:34:17.126200 kubelet[3254]: I0813 00:34:17.126171 3254 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Aug 13 00:34:17.126200 kubelet[3254]: I0813 00:34:17.126186 3254 policy_none.go:49] "None policy: Start"
Aug 13 00:34:17.126200 kubelet[3254]: I0813 00:34:17.126192 3254 memory_manager.go:186] "Starting memorymanager" policy="None"
Aug 13 00:34:17.126200 kubelet[3254]: I0813 00:34:17.126197 3254 state_mem.go:35] "Initializing new in-memory state store"
Aug 13 00:34:17.126273 kubelet[3254]: I0813 00:34:17.126248 3254 state_mem.go:75] "Updated machine memory state"
Aug 13 00:34:17.128187 kubelet[3254]: E0813 00:34:17.128146 3254 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Aug 13 00:34:17.128287 kubelet[3254]: I0813 00:34:17.128251 3254 eviction_manager.go:189] "Eviction manager: starting control loop"
Aug 13 00:34:17.128287 kubelet[3254]: I0813 00:34:17.128258 3254 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Aug 13 00:34:17.128344 kubelet[3254]: I0813 00:34:17.128333 3254 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Aug 13 00:34:17.128576 kubelet[3254]: E0813 00:34:17.128568 3254 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Aug 13 00:34:17.219582 kubelet[3254]: I0813 00:34:17.219448 3254 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:17.219582 kubelet[3254]: I0813 00:34:17.219592 3254 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:17.219954 kubelet[3254]: I0813 00:34:17.219751 3254 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:17.227163 kubelet[3254]: I0813 00:34:17.227086 3254 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Aug 13 00:34:17.227422 kubelet[3254]: I0813 00:34:17.227174 3254 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Aug 13 00:34:17.227667 kubelet[3254]: I0813 00:34:17.227608 3254 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Aug 13 00:34:17.233420 kubelet[3254]: I0813 00:34:17.233363 3254 kubelet_node_status.go:75] "Attempting to register node" node="ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:17.242635 kubelet[3254]: I0813 00:34:17.242546 3254 kubelet_node_status.go:124] "Node was previously registered" node="ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:17.242845 kubelet[3254]: I0813 00:34:17.242703 3254 kubelet_node_status.go:78] "Successfully registered node" node="ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:17.312298 kubelet[3254]: I0813 00:34:17.312083 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4437afbba4064448e269d9c196183bd5-ca-certs\") pod \"kube-apiserver-ci-4372.1.0-a-083aa5303b\" (UID: \"4437afbba4064448e269d9c196183bd5\") " pod="kube-system/kube-apiserver-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:17.312298 kubelet[3254]: I0813 00:34:17.312273 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4437afbba4064448e269d9c196183bd5-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4372.1.0-a-083aa5303b\" (UID: \"4437afbba4064448e269d9c196183bd5\") " pod="kube-system/kube-apiserver-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:17.312649 kubelet[3254]: I0813 00:34:17.312365 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/54ba87ba354053e283e8aa896fc562be-ca-certs\") pod \"kube-controller-manager-ci-4372.1.0-a-083aa5303b\" (UID: \"54ba87ba354053e283e8aa896fc562be\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:17.312649 kubelet[3254]: I0813 00:34:17.312430 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/54ba87ba354053e283e8aa896fc562be-flexvolume-dir\") pod \"kube-controller-manager-ci-4372.1.0-a-083aa5303b\" (UID: \"54ba87ba354053e283e8aa896fc562be\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:17.312649 kubelet[3254]: I0813 00:34:17.312530 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/54ba87ba354053e283e8aa896fc562be-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4372.1.0-a-083aa5303b\" (UID: \"54ba87ba354053e283e8aa896fc562be\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:17.312649 kubelet[3254]: I0813 00:34:17.312588 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/54e4fd1f6bad5f4de475981db5c39435-kubeconfig\") pod \"kube-scheduler-ci-4372.1.0-a-083aa5303b\" (UID: \"54e4fd1f6bad5f4de475981db5c39435\") " pod="kube-system/kube-scheduler-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:17.313015 kubelet[3254]: I0813 00:34:17.312649 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4437afbba4064448e269d9c196183bd5-k8s-certs\") pod \"kube-apiserver-ci-4372.1.0-a-083aa5303b\" (UID: \"4437afbba4064448e269d9c196183bd5\") " pod="kube-system/kube-apiserver-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:17.313015 kubelet[3254]: I0813 00:34:17.312713 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/54ba87ba354053e283e8aa896fc562be-k8s-certs\") pod \"kube-controller-manager-ci-4372.1.0-a-083aa5303b\" (UID: \"54ba87ba354053e283e8aa896fc562be\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:17.313015 kubelet[3254]: I0813 00:34:17.312771 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/54ba87ba354053e283e8aa896fc562be-kubeconfig\") pod \"kube-controller-manager-ci-4372.1.0-a-083aa5303b\" (UID: \"54ba87ba354053e283e8aa896fc562be\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:18.107704 kubelet[3254]: I0813 00:34:18.107675 3254 apiserver.go:52] "Watching apiserver"
Aug 13 00:34:18.121702 kubelet[3254]: I0813 00:34:18.121678 3254 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:18.121803 kubelet[3254]: I0813 00:34:18.121753 3254 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:18.125433 kubelet[3254]: I0813 00:34:18.125378 3254 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Aug 13 00:34:18.125433 kubelet[3254]: E0813 00:34:18.125429 3254 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4372.1.0-a-083aa5303b\" already exists" pod="kube-system/kube-apiserver-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:18.125737 kubelet[3254]: I0813 00:34:18.125692 3254 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Aug 13 00:34:18.125737 kubelet[3254]: E0813 00:34:18.125719 3254 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4372.1.0-a-083aa5303b\" already exists" pod="kube-system/kube-controller-manager-ci-4372.1.0-a-083aa5303b"
Aug 13 00:34:18.148516 kubelet[3254]: I0813 00:34:18.148462 3254 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4372.1.0-a-083aa5303b" podStartSLOduration=1.148447109 podStartE2EDuration="1.148447109s" podCreationTimestamp="2025-08-13 00:34:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:34:18.143499828 +0000 UTC m=+1.071840015" watchObservedRunningTime="2025-08-13 00:34:18.148447109 +0000 UTC m=+1.076787292"
Aug 13 00:34:18.153168 kubelet[3254]: I0813 00:34:18.153100 3254 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4372.1.0-a-083aa5303b" podStartSLOduration=1.153084508 podStartE2EDuration="1.153084508s" podCreationTimestamp="2025-08-13 00:34:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:34:18.148413533 +0000 UTC m=+1.076753718" watchObservedRunningTime="2025-08-13 00:34:18.153084508 +0000 UTC m=+1.081424690"
Aug 13 00:34:18.153168 kubelet[3254]: I0813 00:34:18.153159 3254 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4372.1.0-a-083aa5303b" podStartSLOduration=1.153154858 podStartE2EDuration="1.153154858s" podCreationTimestamp="2025-08-13 00:34:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:34:18.152978416 +0000 UTC m=+1.081318603" watchObservedRunningTime="2025-08-13 00:34:18.153154858 +0000 UTC m=+1.081495040"
Aug 13 00:34:18.210721 kubelet[3254]: I0813 00:34:18.210698 3254 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Aug 13 00:34:22.283626 kubelet[3254]: I0813 00:34:22.283556 3254 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Aug 13 00:34:22.284664 containerd[1908]: time="2025-08-13T00:34:22.284338125Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Aug 13 00:34:22.285327 kubelet[3254]: I0813 00:34:22.284750 3254 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Aug 13 00:34:23.333203 systemd[1]: Created slice kubepods-besteffort-pod60bf4bd3_399a_486f_9946_5b48fb051422.slice - libcontainer container kubepods-besteffort-pod60bf4bd3_399a_486f_9946_5b48fb051422.slice. Aug 13 00:34:23.348917 kubelet[3254]: I0813 00:34:23.348869 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/60bf4bd3-399a-486f-9946-5b48fb051422-kube-proxy\") pod \"kube-proxy-hvjb5\" (UID: \"60bf4bd3-399a-486f-9946-5b48fb051422\") " pod="kube-system/kube-proxy-hvjb5" Aug 13 00:34:23.348917 kubelet[3254]: I0813 00:34:23.348897 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/60bf4bd3-399a-486f-9946-5b48fb051422-lib-modules\") pod \"kube-proxy-hvjb5\" (UID: \"60bf4bd3-399a-486f-9946-5b48fb051422\") " pod="kube-system/kube-proxy-hvjb5" Aug 13 00:34:23.349170 kubelet[3254]: I0813 00:34:23.348915 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/60bf4bd3-399a-486f-9946-5b48fb051422-xtables-lock\") pod \"kube-proxy-hvjb5\" (UID: \"60bf4bd3-399a-486f-9946-5b48fb051422\") " pod="kube-system/kube-proxy-hvjb5" Aug 13 00:34:23.349170 kubelet[3254]: I0813 00:34:23.348946 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnk6m\" (UniqueName: \"kubernetes.io/projected/60bf4bd3-399a-486f-9946-5b48fb051422-kube-api-access-fnk6m\") pod \"kube-proxy-hvjb5\" (UID: \"60bf4bd3-399a-486f-9946-5b48fb051422\") " pod="kube-system/kube-proxy-hvjb5" Aug 13 00:34:23.492299 systemd[1]: Created slice 
kubepods-besteffort-podfbad5c01_2100_49e4_9d5e_479a5c1e9692.slice - libcontainer container kubepods-besteffort-podfbad5c01_2100_49e4_9d5e_479a5c1e9692.slice. Aug 13 00:34:23.551319 kubelet[3254]: I0813 00:34:23.551198 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/fbad5c01-2100-49e4-9d5e-479a5c1e9692-var-lib-calico\") pod \"tigera-operator-747864d56d-vkgv7\" (UID: \"fbad5c01-2100-49e4-9d5e-479a5c1e9692\") " pod="tigera-operator/tigera-operator-747864d56d-vkgv7" Aug 13 00:34:23.551579 kubelet[3254]: I0813 00:34:23.551331 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp4mb\" (UniqueName: \"kubernetes.io/projected/fbad5c01-2100-49e4-9d5e-479a5c1e9692-kube-api-access-zp4mb\") pod \"tigera-operator-747864d56d-vkgv7\" (UID: \"fbad5c01-2100-49e4-9d5e-479a5c1e9692\") " pod="tigera-operator/tigera-operator-747864d56d-vkgv7" Aug 13 00:34:23.650409 containerd[1908]: time="2025-08-13T00:34:23.650221351Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-hvjb5,Uid:60bf4bd3-399a-486f-9946-5b48fb051422,Namespace:kube-system,Attempt:0,}" Aug 13 00:34:23.657755 containerd[1908]: time="2025-08-13T00:34:23.657730617Z" level=info msg="connecting to shim e6cc8541df7282a3fe7ed20ed5b6b37fd212e82a649d36dd23e203b7d4d391bf" address="unix:///run/containerd/s/51fb0a03e17bf5e58394725c7605a752c88f51e4db7c4ae58ee15c3136d017a9" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:34:23.680407 systemd[1]: Started cri-containerd-e6cc8541df7282a3fe7ed20ed5b6b37fd212e82a649d36dd23e203b7d4d391bf.scope - libcontainer container e6cc8541df7282a3fe7ed20ed5b6b37fd212e82a649d36dd23e203b7d4d391bf. 
Aug 13 00:34:23.694439 containerd[1908]: time="2025-08-13T00:34:23.694404770Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-hvjb5,Uid:60bf4bd3-399a-486f-9946-5b48fb051422,Namespace:kube-system,Attempt:0,} returns sandbox id \"e6cc8541df7282a3fe7ed20ed5b6b37fd212e82a649d36dd23e203b7d4d391bf\"" Aug 13 00:34:23.696526 containerd[1908]: time="2025-08-13T00:34:23.696512907Z" level=info msg="CreateContainer within sandbox \"e6cc8541df7282a3fe7ed20ed5b6b37fd212e82a649d36dd23e203b7d4d391bf\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Aug 13 00:34:23.700931 containerd[1908]: time="2025-08-13T00:34:23.700887647Z" level=info msg="Container e4a5da0ec5c1136e3e773685660216e61492d9bc5372006e4b982ec9e9aa23d9: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:23.704065 containerd[1908]: time="2025-08-13T00:34:23.704052909Z" level=info msg="CreateContainer within sandbox \"e6cc8541df7282a3fe7ed20ed5b6b37fd212e82a649d36dd23e203b7d4d391bf\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"e4a5da0ec5c1136e3e773685660216e61492d9bc5372006e4b982ec9e9aa23d9\"" Aug 13 00:34:23.704456 containerd[1908]: time="2025-08-13T00:34:23.704429209Z" level=info msg="StartContainer for \"e4a5da0ec5c1136e3e773685660216e61492d9bc5372006e4b982ec9e9aa23d9\"" Aug 13 00:34:23.705327 containerd[1908]: time="2025-08-13T00:34:23.705315189Z" level=info msg="connecting to shim e4a5da0ec5c1136e3e773685660216e61492d9bc5372006e4b982ec9e9aa23d9" address="unix:///run/containerd/s/51fb0a03e17bf5e58394725c7605a752c88f51e4db7c4ae58ee15c3136d017a9" protocol=ttrpc version=3 Aug 13 00:34:23.725459 systemd[1]: Started cri-containerd-e4a5da0ec5c1136e3e773685660216e61492d9bc5372006e4b982ec9e9aa23d9.scope - libcontainer container e4a5da0ec5c1136e3e773685660216e61492d9bc5372006e4b982ec9e9aa23d9. 
Aug 13 00:34:23.749505 containerd[1908]: time="2025-08-13T00:34:23.749457088Z" level=info msg="StartContainer for \"e4a5da0ec5c1136e3e773685660216e61492d9bc5372006e4b982ec9e9aa23d9\" returns successfully" Aug 13 00:34:23.796189 containerd[1908]: time="2025-08-13T00:34:23.796147992Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-vkgv7,Uid:fbad5c01-2100-49e4-9d5e-479a5c1e9692,Namespace:tigera-operator,Attempt:0,}" Aug 13 00:34:23.802992 containerd[1908]: time="2025-08-13T00:34:23.802965667Z" level=info msg="connecting to shim c5230833e5845e09193076f84713db1f9394b5de4c0e434094f9b6c968842a9f" address="unix:///run/containerd/s/33711717a4dac9c34084813815813be625b7fe1cd722ab06a641b10fa7a4c98d" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:34:23.827302 systemd[1]: Started cri-containerd-c5230833e5845e09193076f84713db1f9394b5de4c0e434094f9b6c968842a9f.scope - libcontainer container c5230833e5845e09193076f84713db1f9394b5de4c0e434094f9b6c968842a9f. Aug 13 00:34:23.868410 containerd[1908]: time="2025-08-13T00:34:23.868388299Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-vkgv7,Uid:fbad5c01-2100-49e4-9d5e-479a5c1e9692,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"c5230833e5845e09193076f84713db1f9394b5de4c0e434094f9b6c968842a9f\"" Aug 13 00:34:23.869222 containerd[1908]: time="2025-08-13T00:34:23.869206301Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\"" Aug 13 00:34:24.872772 kubelet[3254]: I0813 00:34:24.872686 3254 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-hvjb5" podStartSLOduration=1.872659047 podStartE2EDuration="1.872659047s" podCreationTimestamp="2025-08-13 00:34:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:34:24.152522409 +0000 UTC m=+7.080862631" watchObservedRunningTime="2025-08-13 
00:34:24.872659047 +0000 UTC m=+7.800999225" Aug 13 00:34:25.176007 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1663474210.mount: Deactivated successfully. Aug 13 00:34:25.488088 containerd[1908]: time="2025-08-13T00:34:25.488041912Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:25.488294 containerd[1908]: time="2025-08-13T00:34:25.488160458Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.3: active requests=0, bytes read=25056543" Aug 13 00:34:25.488569 containerd[1908]: time="2025-08-13T00:34:25.488534515Z" level=info msg="ImageCreate event name:\"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:25.489435 containerd[1908]: time="2025-08-13T00:34:25.489400147Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:25.489830 containerd[1908]: time="2025-08-13T00:34:25.489792358Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.3\" with image id \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\", repo tag \"quay.io/tigera/operator:v1.38.3\", repo digest \"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\", size \"25052538\" in 1.620569372s" Aug 13 00:34:25.489830 containerd[1908]: time="2025-08-13T00:34:25.489807156Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\" returns image reference \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\"" Aug 13 00:34:25.491189 containerd[1908]: time="2025-08-13T00:34:25.491174275Z" level=info msg="CreateContainer within sandbox \"c5230833e5845e09193076f84713db1f9394b5de4c0e434094f9b6c968842a9f\" for container 
&ContainerMetadata{Name:tigera-operator,Attempt:0,}" Aug 13 00:34:25.493891 containerd[1908]: time="2025-08-13T00:34:25.493879509Z" level=info msg="Container 0febd0938320f2e0e74e36b40e0f23008e301336e8d8ea59126582e21f129154: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:25.495968 containerd[1908]: time="2025-08-13T00:34:25.495928128Z" level=info msg="CreateContainer within sandbox \"c5230833e5845e09193076f84713db1f9394b5de4c0e434094f9b6c968842a9f\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"0febd0938320f2e0e74e36b40e0f23008e301336e8d8ea59126582e21f129154\"" Aug 13 00:34:25.496151 containerd[1908]: time="2025-08-13T00:34:25.496138856Z" level=info msg="StartContainer for \"0febd0938320f2e0e74e36b40e0f23008e301336e8d8ea59126582e21f129154\"" Aug 13 00:34:25.496546 containerd[1908]: time="2025-08-13T00:34:25.496510138Z" level=info msg="connecting to shim 0febd0938320f2e0e74e36b40e0f23008e301336e8d8ea59126582e21f129154" address="unix:///run/containerd/s/33711717a4dac9c34084813815813be625b7fe1cd722ab06a641b10fa7a4c98d" protocol=ttrpc version=3 Aug 13 00:34:25.520454 systemd[1]: Started cri-containerd-0febd0938320f2e0e74e36b40e0f23008e301336e8d8ea59126582e21f129154.scope - libcontainer container 0febd0938320f2e0e74e36b40e0f23008e301336e8d8ea59126582e21f129154. 
Aug 13 00:34:25.546747 containerd[1908]: time="2025-08-13T00:34:25.546696382Z" level=info msg="StartContainer for \"0febd0938320f2e0e74e36b40e0f23008e301336e8d8ea59126582e21f129154\" returns successfully" Aug 13 00:34:26.166196 kubelet[3254]: I0813 00:34:26.166158 3254 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-747864d56d-vkgv7" podStartSLOduration=1.544955152 podStartE2EDuration="3.166149008s" podCreationTimestamp="2025-08-13 00:34:23 +0000 UTC" firstStartedPulling="2025-08-13 00:34:23.868975835 +0000 UTC m=+6.797316016" lastFinishedPulling="2025-08-13 00:34:25.490169695 +0000 UTC m=+8.418509872" observedRunningTime="2025-08-13 00:34:26.166093665 +0000 UTC m=+9.094433842" watchObservedRunningTime="2025-08-13 00:34:26.166149008 +0000 UTC m=+9.094489182" Aug 13 00:34:29.444416 update_engine[1895]: I20250813 00:34:29.444316 1895 update_attempter.cc:509] Updating boot flags... Aug 13 00:34:30.150522 sudo[2201]: pam_unix(sudo:session): session closed for user root Aug 13 00:34:30.151197 sshd[2200]: Connection closed by 139.178.89.65 port 59006 Aug 13 00:34:30.151339 sshd-session[2198]: pam_unix(sshd:session): session closed for user core Aug 13 00:34:30.153057 systemd[1]: sshd@8-147.75.71.77:22-139.178.89.65:59006.service: Deactivated successfully. Aug 13 00:34:30.154081 systemd[1]: session-11.scope: Deactivated successfully. Aug 13 00:34:30.154192 systemd[1]: session-11.scope: Consumed 4.661s CPU time, 231.8M memory peak. Aug 13 00:34:30.155510 systemd-logind[1890]: Session 11 logged out. Waiting for processes to exit. Aug 13 00:34:30.156126 systemd-logind[1890]: Removed session 11. Aug 13 00:34:32.402262 systemd[1]: Created slice kubepods-besteffort-pod755b7172_6564_4a24_b349_ad57cb2e496f.slice - libcontainer container kubepods-besteffort-pod755b7172_6564_4a24_b349_ad57cb2e496f.slice. 
Aug 13 00:34:32.508766 kubelet[3254]: I0813 00:34:32.508684 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/755b7172-6564-4a24-b349-ad57cb2e496f-typha-certs\") pod \"calico-typha-7c748c5975-tcg75\" (UID: \"755b7172-6564-4a24-b349-ad57cb2e496f\") " pod="calico-system/calico-typha-7c748c5975-tcg75" Aug 13 00:34:32.509698 kubelet[3254]: I0813 00:34:32.508787 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g56t\" (UniqueName: \"kubernetes.io/projected/755b7172-6564-4a24-b349-ad57cb2e496f-kube-api-access-8g56t\") pod \"calico-typha-7c748c5975-tcg75\" (UID: \"755b7172-6564-4a24-b349-ad57cb2e496f\") " pod="calico-system/calico-typha-7c748c5975-tcg75" Aug 13 00:34:32.509698 kubelet[3254]: I0813 00:34:32.508969 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/755b7172-6564-4a24-b349-ad57cb2e496f-tigera-ca-bundle\") pod \"calico-typha-7c748c5975-tcg75\" (UID: \"755b7172-6564-4a24-b349-ad57cb2e496f\") " pod="calico-system/calico-typha-7c748c5975-tcg75" Aug 13 00:34:32.705439 containerd[1908]: time="2025-08-13T00:34:32.705354539Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7c748c5975-tcg75,Uid:755b7172-6564-4a24-b349-ad57cb2e496f,Namespace:calico-system,Attempt:0,}" Aug 13 00:34:32.712495 containerd[1908]: time="2025-08-13T00:34:32.712468604Z" level=info msg="connecting to shim 541e8042e0abd845bad097dc51b68cf1cbcc37925ead7fe84f1a0b0248bf88cf" address="unix:///run/containerd/s/bda86fdf4fc1a9364666abd8bc50947021f61b494e6bd9149e2a6664f384ed1e" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:34:32.731349 systemd[1]: Started cri-containerd-541e8042e0abd845bad097dc51b68cf1cbcc37925ead7fe84f1a0b0248bf88cf.scope - libcontainer container 
541e8042e0abd845bad097dc51b68cf1cbcc37925ead7fe84f1a0b0248bf88cf. Aug 13 00:34:32.735631 systemd[1]: Created slice kubepods-besteffort-pode3531c6c_9269_43ed_95d0_d796902d97b8.slice - libcontainer container kubepods-besteffort-pode3531c6c_9269_43ed_95d0_d796902d97b8.slice. Aug 13 00:34:32.771548 containerd[1908]: time="2025-08-13T00:34:32.771493165Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7c748c5975-tcg75,Uid:755b7172-6564-4a24-b349-ad57cb2e496f,Namespace:calico-system,Attempt:0,} returns sandbox id \"541e8042e0abd845bad097dc51b68cf1cbcc37925ead7fe84f1a0b0248bf88cf\"" Aug 13 00:34:32.772225 containerd[1908]: time="2025-08-13T00:34:32.772211986Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\"" Aug 13 00:34:32.811356 kubelet[3254]: I0813 00:34:32.811329 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/e3531c6c-9269-43ed-95d0-d796902d97b8-policysync\") pod \"calico-node-4nwh7\" (UID: \"e3531c6c-9269-43ed-95d0-d796902d97b8\") " pod="calico-system/calico-node-4nwh7" Aug 13 00:34:32.811356 kubelet[3254]: I0813 00:34:32.811359 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/e3531c6c-9269-43ed-95d0-d796902d97b8-cni-net-dir\") pod \"calico-node-4nwh7\" (UID: \"e3531c6c-9269-43ed-95d0-d796902d97b8\") " pod="calico-system/calico-node-4nwh7" Aug 13 00:34:32.811474 kubelet[3254]: I0813 00:34:32.811372 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3531c6c-9269-43ed-95d0-d796902d97b8-tigera-ca-bundle\") pod \"calico-node-4nwh7\" (UID: \"e3531c6c-9269-43ed-95d0-d796902d97b8\") " pod="calico-system/calico-node-4nwh7" Aug 13 00:34:32.811474 kubelet[3254]: I0813 00:34:32.811397 3254 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/e3531c6c-9269-43ed-95d0-d796902d97b8-cni-log-dir\") pod \"calico-node-4nwh7\" (UID: \"e3531c6c-9269-43ed-95d0-d796902d97b8\") " pod="calico-system/calico-node-4nwh7" Aug 13 00:34:32.811474 kubelet[3254]: I0813 00:34:32.811408 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/e3531c6c-9269-43ed-95d0-d796902d97b8-node-certs\") pod \"calico-node-4nwh7\" (UID: \"e3531c6c-9269-43ed-95d0-d796902d97b8\") " pod="calico-system/calico-node-4nwh7" Aug 13 00:34:32.811474 kubelet[3254]: I0813 00:34:32.811418 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/e3531c6c-9269-43ed-95d0-d796902d97b8-cni-bin-dir\") pod \"calico-node-4nwh7\" (UID: \"e3531c6c-9269-43ed-95d0-d796902d97b8\") " pod="calico-system/calico-node-4nwh7" Aug 13 00:34:32.811474 kubelet[3254]: I0813 00:34:32.811428 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/e3531c6c-9269-43ed-95d0-d796902d97b8-var-run-calico\") pod \"calico-node-4nwh7\" (UID: \"e3531c6c-9269-43ed-95d0-d796902d97b8\") " pod="calico-system/calico-node-4nwh7" Aug 13 00:34:32.811587 kubelet[3254]: I0813 00:34:32.811440 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/e3531c6c-9269-43ed-95d0-d796902d97b8-var-lib-calico\") pod \"calico-node-4nwh7\" (UID: \"e3531c6c-9269-43ed-95d0-d796902d97b8\") " pod="calico-system/calico-node-4nwh7" Aug 13 00:34:32.811587 kubelet[3254]: I0813 00:34:32.811449 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" 
(UniqueName: \"kubernetes.io/host-path/e3531c6c-9269-43ed-95d0-d796902d97b8-xtables-lock\") pod \"calico-node-4nwh7\" (UID: \"e3531c6c-9269-43ed-95d0-d796902d97b8\") " pod="calico-system/calico-node-4nwh7" Aug 13 00:34:32.811587 kubelet[3254]: I0813 00:34:32.811460 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld4tt\" (UniqueName: \"kubernetes.io/projected/e3531c6c-9269-43ed-95d0-d796902d97b8-kube-api-access-ld4tt\") pod \"calico-node-4nwh7\" (UID: \"e3531c6c-9269-43ed-95d0-d796902d97b8\") " pod="calico-system/calico-node-4nwh7" Aug 13 00:34:32.811587 kubelet[3254]: I0813 00:34:32.811474 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/e3531c6c-9269-43ed-95d0-d796902d97b8-flexvol-driver-host\") pod \"calico-node-4nwh7\" (UID: \"e3531c6c-9269-43ed-95d0-d796902d97b8\") " pod="calico-system/calico-node-4nwh7" Aug 13 00:34:32.811587 kubelet[3254]: I0813 00:34:32.811483 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e3531c6c-9269-43ed-95d0-d796902d97b8-lib-modules\") pod \"calico-node-4nwh7\" (UID: \"e3531c6c-9269-43ed-95d0-d796902d97b8\") " pod="calico-system/calico-node-4nwh7" Aug 13 00:34:32.913239 kubelet[3254]: E0813 00:34:32.913172 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:32.913239 kubelet[3254]: W0813 00:34:32.913205 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:32.913239 kubelet[3254]: E0813 00:34:32.913223 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from 
directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:32.914641 kubelet[3254]: E0813 00:34:32.914597 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:32.914641 kubelet[3254]: W0813 00:34:32.914610 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:32.914641 kubelet[3254]: E0813 00:34:32.914621 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:32.920291 kubelet[3254]: E0813 00:34:32.920243 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:32.920291 kubelet[3254]: W0813 00:34:32.920255 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:32.920291 kubelet[3254]: E0813 00:34:32.920268 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.038876 containerd[1908]: time="2025-08-13T00:34:33.038653490Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-4nwh7,Uid:e3531c6c-9269-43ed-95d0-d796902d97b8,Namespace:calico-system,Attempt:0,}" Aug 13 00:34:33.045254 kubelet[3254]: E0813 00:34:33.045231 3254 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-768vv" podUID="baea1b6a-90df-43be-a718-a692f968b764" Aug 13 00:34:33.047192 containerd[1908]: time="2025-08-13T00:34:33.047159150Z" level=info msg="connecting to shim 44c300e3a755b2cb010880f3b8cc0c13c9785fe18af25777a084a781483ca21d" address="unix:///run/containerd/s/5d9ad55e387a9e77b9ce868a5b5a66ab15d420cc0434d9a36df80418171ea236" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:34:33.065377 systemd[1]: Started cri-containerd-44c300e3a755b2cb010880f3b8cc0c13c9785fe18af25777a084a781483ca21d.scope - libcontainer container 44c300e3a755b2cb010880f3b8cc0c13c9785fe18af25777a084a781483ca21d. 
Aug 13 00:34:33.076549 containerd[1908]: time="2025-08-13T00:34:33.076526650Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-4nwh7,Uid:e3531c6c-9269-43ed-95d0-d796902d97b8,Namespace:calico-system,Attempt:0,} returns sandbox id \"44c300e3a755b2cb010880f3b8cc0c13c9785fe18af25777a084a781483ca21d\"" Aug 13 00:34:33.096704 kubelet[3254]: E0813 00:34:33.096661 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.096704 kubelet[3254]: W0813 00:34:33.096673 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.096704 kubelet[3254]: E0813 00:34:33.096685 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.096853 kubelet[3254]: E0813 00:34:33.096805 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.096853 kubelet[3254]: W0813 00:34:33.096811 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.096853 kubelet[3254]: E0813 00:34:33.096818 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.096995 kubelet[3254]: E0813 00:34:33.096937 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.096995 kubelet[3254]: W0813 00:34:33.096943 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.096995 kubelet[3254]: E0813 00:34:33.096964 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.097125 kubelet[3254]: E0813 00:34:33.097084 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.097125 kubelet[3254]: W0813 00:34:33.097090 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.097125 kubelet[3254]: E0813 00:34:33.097096 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.097214 kubelet[3254]: E0813 00:34:33.097190 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.097214 kubelet[3254]: W0813 00:34:33.097195 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.097214 kubelet[3254]: E0813 00:34:33.097201 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.097328 kubelet[3254]: E0813 00:34:33.097293 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.097328 kubelet[3254]: W0813 00:34:33.097299 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.097328 kubelet[3254]: E0813 00:34:33.097304 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.097400 kubelet[3254]: E0813 00:34:33.097382 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.097400 kubelet[3254]: W0813 00:34:33.097388 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.097400 kubelet[3254]: E0813 00:34:33.097394 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.097505 kubelet[3254]: E0813 00:34:33.097473 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.097505 kubelet[3254]: W0813 00:34:33.097478 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.097505 kubelet[3254]: E0813 00:34:33.097485 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.097582 kubelet[3254]: E0813 00:34:33.097570 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.097582 kubelet[3254]: W0813 00:34:33.097576 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.097582 kubelet[3254]: E0813 00:34:33.097581 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.097664 kubelet[3254]: E0813 00:34:33.097658 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.097664 kubelet[3254]: W0813 00:34:33.097664 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.097704 kubelet[3254]: E0813 00:34:33.097669 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.097757 kubelet[3254]: E0813 00:34:33.097751 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.097781 kubelet[3254]: W0813 00:34:33.097757 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.097781 kubelet[3254]: E0813 00:34:33.097762 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.097847 kubelet[3254]: E0813 00:34:33.097840 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.097867 kubelet[3254]: W0813 00:34:33.097847 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.097867 kubelet[3254]: E0813 00:34:33.097852 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.097941 kubelet[3254]: E0813 00:34:33.097935 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.097966 kubelet[3254]: W0813 00:34:33.097940 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.097966 kubelet[3254]: E0813 00:34:33.097948 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.098035 kubelet[3254]: E0813 00:34:33.098028 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.098035 kubelet[3254]: W0813 00:34:33.098034 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.098074 kubelet[3254]: E0813 00:34:33.098039 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.098121 kubelet[3254]: E0813 00:34:33.098115 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.098147 kubelet[3254]: W0813 00:34:33.098121 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.098147 kubelet[3254]: E0813 00:34:33.098126 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.098214 kubelet[3254]: E0813 00:34:33.098208 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.098214 kubelet[3254]: W0813 00:34:33.098214 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.098261 kubelet[3254]: E0813 00:34:33.098220 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.098317 kubelet[3254]: E0813 00:34:33.098310 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.098343 kubelet[3254]: W0813 00:34:33.098316 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.098343 kubelet[3254]: E0813 00:34:33.098322 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.098407 kubelet[3254]: E0813 00:34:33.098400 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.098430 kubelet[3254]: W0813 00:34:33.098408 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.098430 kubelet[3254]: E0813 00:34:33.098415 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.098498 kubelet[3254]: E0813 00:34:33.098492 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.098525 kubelet[3254]: W0813 00:34:33.098498 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.098525 kubelet[3254]: E0813 00:34:33.098503 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.098588 kubelet[3254]: E0813 00:34:33.098582 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.098609 kubelet[3254]: W0813 00:34:33.098588 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.098609 kubelet[3254]: E0813 00:34:33.098593 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.115001 kubelet[3254]: E0813 00:34:33.114916 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.115001 kubelet[3254]: W0813 00:34:33.114955 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.115001 kubelet[3254]: E0813 00:34:33.114990 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.115399 kubelet[3254]: I0813 00:34:33.115051 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/baea1b6a-90df-43be-a718-a692f968b764-registration-dir\") pod \"csi-node-driver-768vv\" (UID: \"baea1b6a-90df-43be-a718-a692f968b764\") " pod="calico-system/csi-node-driver-768vv" Aug 13 00:34:33.115697 kubelet[3254]: E0813 00:34:33.115606 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.115697 kubelet[3254]: W0813 00:34:33.115648 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.115697 kubelet[3254]: E0813 00:34:33.115683 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.116055 kubelet[3254]: I0813 00:34:33.115744 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/baea1b6a-90df-43be-a718-a692f968b764-varrun\") pod \"csi-node-driver-768vv\" (UID: \"baea1b6a-90df-43be-a718-a692f968b764\") " pod="calico-system/csi-node-driver-768vv" Aug 13 00:34:33.116391 kubelet[3254]: E0813 00:34:33.116304 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.116391 kubelet[3254]: W0813 00:34:33.116339 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.116391 kubelet[3254]: E0813 00:34:33.116371 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.116861 kubelet[3254]: E0813 00:34:33.116779 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.116861 kubelet[3254]: W0813 00:34:33.116807 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.116861 kubelet[3254]: E0813 00:34:33.116834 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.117312 kubelet[3254]: E0813 00:34:33.117253 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.117312 kubelet[3254]: W0813 00:34:33.117280 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.117312 kubelet[3254]: E0813 00:34:33.117306 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.117592 kubelet[3254]: I0813 00:34:33.117363 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtsjj\" (UniqueName: \"kubernetes.io/projected/baea1b6a-90df-43be-a718-a692f968b764-kube-api-access-xtsjj\") pod \"csi-node-driver-768vv\" (UID: \"baea1b6a-90df-43be-a718-a692f968b764\") " pod="calico-system/csi-node-driver-768vv" Aug 13 00:34:33.118038 kubelet[3254]: E0813 00:34:33.117946 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.118038 kubelet[3254]: W0813 00:34:33.117985 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.118038 kubelet[3254]: E0813 00:34:33.118019 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.118651 kubelet[3254]: E0813 00:34:33.118589 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.118651 kubelet[3254]: W0813 00:34:33.118621 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.118856 kubelet[3254]: E0813 00:34:33.118658 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.119160 kubelet[3254]: E0813 00:34:33.119130 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.119160 kubelet[3254]: W0813 00:34:33.119157 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.119383 kubelet[3254]: E0813 00:34:33.119225 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.119383 kubelet[3254]: I0813 00:34:33.119300 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/baea1b6a-90df-43be-a718-a692f968b764-kubelet-dir\") pod \"csi-node-driver-768vv\" (UID: \"baea1b6a-90df-43be-a718-a692f968b764\") " pod="calico-system/csi-node-driver-768vv" Aug 13 00:34:33.119871 kubelet[3254]: E0813 00:34:33.119830 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.119963 kubelet[3254]: W0813 00:34:33.119873 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.119963 kubelet[3254]: E0813 00:34:33.119911 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.120465 kubelet[3254]: E0813 00:34:33.120425 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.120465 kubelet[3254]: W0813 00:34:33.120456 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.120803 kubelet[3254]: E0813 00:34:33.120486 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.120962 kubelet[3254]: E0813 00:34:33.120922 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.120962 kubelet[3254]: W0813 00:34:33.120948 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.121272 kubelet[3254]: E0813 00:34:33.120975 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.121272 kubelet[3254]: I0813 00:34:33.121040 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/baea1b6a-90df-43be-a718-a692f968b764-socket-dir\") pod \"csi-node-driver-768vv\" (UID: \"baea1b6a-90df-43be-a718-a692f968b764\") " pod="calico-system/csi-node-driver-768vv" Aug 13 00:34:33.121658 kubelet[3254]: E0813 00:34:33.121592 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.121658 kubelet[3254]: W0813 00:34:33.121653 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.121979 kubelet[3254]: E0813 00:34:33.121694 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.122215 kubelet[3254]: E0813 00:34:33.122143 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.122413 kubelet[3254]: W0813 00:34:33.122228 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.122413 kubelet[3254]: E0813 00:34:33.122280 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.122924 kubelet[3254]: E0813 00:34:33.122874 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.122924 kubelet[3254]: W0813 00:34:33.122910 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.123310 kubelet[3254]: E0813 00:34:33.122949 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.123472 kubelet[3254]: E0813 00:34:33.123389 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.123472 kubelet[3254]: W0813 00:34:33.123414 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.123820 kubelet[3254]: E0813 00:34:33.123444 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.222207 kubelet[3254]: E0813 00:34:33.222188 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.222207 kubelet[3254]: W0813 00:34:33.222201 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.222345 kubelet[3254]: E0813 00:34:33.222219 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.222391 kubelet[3254]: E0813 00:34:33.222354 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.222391 kubelet[3254]: W0813 00:34:33.222363 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.222391 kubelet[3254]: E0813 00:34:33.222375 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.222518 kubelet[3254]: E0813 00:34:33.222507 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.222518 kubelet[3254]: W0813 00:34:33.222515 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.222584 kubelet[3254]: E0813 00:34:33.222526 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.222721 kubelet[3254]: E0813 00:34:33.222687 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.222721 kubelet[3254]: W0813 00:34:33.222698 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.222721 kubelet[3254]: E0813 00:34:33.222707 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.222842 kubelet[3254]: E0813 00:34:33.222803 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.222842 kubelet[3254]: W0813 00:34:33.222809 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.222842 kubelet[3254]: E0813 00:34:33.222815 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.222957 kubelet[3254]: E0813 00:34:33.222917 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.222957 kubelet[3254]: W0813 00:34:33.222926 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.222957 kubelet[3254]: E0813 00:34:33.222935 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.223074 kubelet[3254]: E0813 00:34:33.223067 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.223101 kubelet[3254]: W0813 00:34:33.223073 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.223101 kubelet[3254]: E0813 00:34:33.223081 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.223199 kubelet[3254]: E0813 00:34:33.223192 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.223199 kubelet[3254]: W0813 00:34:33.223198 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.223247 kubelet[3254]: E0813 00:34:33.223204 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.223297 kubelet[3254]: E0813 00:34:33.223291 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.223322 kubelet[3254]: W0813 00:34:33.223297 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.223322 kubelet[3254]: E0813 00:34:33.223303 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.223390 kubelet[3254]: E0813 00:34:33.223383 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.223390 kubelet[3254]: W0813 00:34:33.223389 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.223430 kubelet[3254]: E0813 00:34:33.223395 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.223496 kubelet[3254]: E0813 00:34:33.223490 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.223518 kubelet[3254]: W0813 00:34:33.223497 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.223518 kubelet[3254]: E0813 00:34:33.223503 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.223592 kubelet[3254]: E0813 00:34:33.223586 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.223622 kubelet[3254]: W0813 00:34:33.223592 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.223622 kubelet[3254]: E0813 00:34:33.223598 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.223723 kubelet[3254]: E0813 00:34:33.223713 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.223745 kubelet[3254]: W0813 00:34:33.223723 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.223745 kubelet[3254]: E0813 00:34:33.223733 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.223836 kubelet[3254]: E0813 00:34:33.223829 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.223836 kubelet[3254]: W0813 00:34:33.223835 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.223884 kubelet[3254]: E0813 00:34:33.223842 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.223934 kubelet[3254]: E0813 00:34:33.223927 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.223958 kubelet[3254]: W0813 00:34:33.223933 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.223958 kubelet[3254]: E0813 00:34:33.223939 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.224052 kubelet[3254]: E0813 00:34:33.224045 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.224075 kubelet[3254]: W0813 00:34:33.224053 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.224075 kubelet[3254]: E0813 00:34:33.224061 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.224242 kubelet[3254]: E0813 00:34:33.224231 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.224274 kubelet[3254]: W0813 00:34:33.224241 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.224274 kubelet[3254]: E0813 00:34:33.224250 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.224367 kubelet[3254]: E0813 00:34:33.224360 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.224367 kubelet[3254]: W0813 00:34:33.224367 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.224416 kubelet[3254]: E0813 00:34:33.224374 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.224475 kubelet[3254]: E0813 00:34:33.224468 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.224475 kubelet[3254]: W0813 00:34:33.224474 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.224526 kubelet[3254]: E0813 00:34:33.224480 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.224571 kubelet[3254]: E0813 00:34:33.224565 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.224600 kubelet[3254]: W0813 00:34:33.224573 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.224600 kubelet[3254]: E0813 00:34:33.224582 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.224715 kubelet[3254]: E0813 00:34:33.224705 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.224715 kubelet[3254]: W0813 00:34:33.224714 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.224789 kubelet[3254]: E0813 00:34:33.224725 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.224853 kubelet[3254]: E0813 00:34:33.224844 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.224853 kubelet[3254]: W0813 00:34:33.224852 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.224929 kubelet[3254]: E0813 00:34:33.224862 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.224985 kubelet[3254]: E0813 00:34:33.224976 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.225014 kubelet[3254]: W0813 00:34:33.224985 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.225014 kubelet[3254]: E0813 00:34:33.224994 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.225116 kubelet[3254]: E0813 00:34:33.225109 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.225143 kubelet[3254]: W0813 00:34:33.225116 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.225143 kubelet[3254]: E0813 00:34:33.225123 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:33.225242 kubelet[3254]: E0813 00:34:33.225235 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.225242 kubelet[3254]: W0813 00:34:33.225241 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.225284 kubelet[3254]: E0813 00:34:33.225248 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:33.229400 kubelet[3254]: E0813 00:34:33.229376 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:33.229400 kubelet[3254]: W0813 00:34:33.229386 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:33.229400 kubelet[3254]: E0813 00:34:33.229396 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:34.211295 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount944930268.mount: Deactivated successfully. Aug 13 00:34:34.521425 containerd[1908]: time="2025-08-13T00:34:34.521367696Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:34.521656 containerd[1908]: time="2025-08-13T00:34:34.521561315Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.2: active requests=0, bytes read=35233364" Aug 13 00:34:34.521957 containerd[1908]: time="2025-08-13T00:34:34.521946333Z" level=info msg="ImageCreate event name:\"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:34.522787 containerd[1908]: time="2025-08-13T00:34:34.522775911Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:34.523119 containerd[1908]: time="2025-08-13T00:34:34.523105333Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.2\" with image id 
\"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\", size \"35233218\" in 1.750876534s" Aug 13 00:34:34.523159 containerd[1908]: time="2025-08-13T00:34:34.523122453Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\" returns image reference \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\"" Aug 13 00:34:34.523569 containerd[1908]: time="2025-08-13T00:34:34.523558812Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\"" Aug 13 00:34:34.527154 containerd[1908]: time="2025-08-13T00:34:34.527133216Z" level=info msg="CreateContainer within sandbox \"541e8042e0abd845bad097dc51b68cf1cbcc37925ead7fe84f1a0b0248bf88cf\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Aug 13 00:34:34.529910 containerd[1908]: time="2025-08-13T00:34:34.529895905Z" level=info msg="Container 02d1fc216b606c6d4a36af771f120ee2dcd6ca9337a3fc18c0cd5d3e21c9f073: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:34.532472 containerd[1908]: time="2025-08-13T00:34:34.532437405Z" level=info msg="CreateContainer within sandbox \"541e8042e0abd845bad097dc51b68cf1cbcc37925ead7fe84f1a0b0248bf88cf\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"02d1fc216b606c6d4a36af771f120ee2dcd6ca9337a3fc18c0cd5d3e21c9f073\"" Aug 13 00:34:34.532698 containerd[1908]: time="2025-08-13T00:34:34.532660354Z" level=info msg="StartContainer for \"02d1fc216b606c6d4a36af771f120ee2dcd6ca9337a3fc18c0cd5d3e21c9f073\"" Aug 13 00:34:34.533159 containerd[1908]: time="2025-08-13T00:34:34.533148343Z" level=info msg="connecting to shim 02d1fc216b606c6d4a36af771f120ee2dcd6ca9337a3fc18c0cd5d3e21c9f073" address="unix:///run/containerd/s/bda86fdf4fc1a9364666abd8bc50947021f61b494e6bd9149e2a6664f384ed1e" protocol=ttrpc version=3 Aug 13 
00:34:34.557477 systemd[1]: Started cri-containerd-02d1fc216b606c6d4a36af771f120ee2dcd6ca9337a3fc18c0cd5d3e21c9f073.scope - libcontainer container 02d1fc216b606c6d4a36af771f120ee2dcd6ca9337a3fc18c0cd5d3e21c9f073. Aug 13 00:34:34.589433 containerd[1908]: time="2025-08-13T00:34:34.589410927Z" level=info msg="StartContainer for \"02d1fc216b606c6d4a36af771f120ee2dcd6ca9337a3fc18c0cd5d3e21c9f073\" returns successfully" Aug 13 00:34:35.118689 kubelet[3254]: E0813 00:34:35.118554 3254 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-768vv" podUID="baea1b6a-90df-43be-a718-a692f968b764" Aug 13 00:34:35.193015 kubelet[3254]: I0813 00:34:35.192902 3254 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7c748c5975-tcg75" podStartSLOduration=1.441427383 podStartE2EDuration="3.192864627s" podCreationTimestamp="2025-08-13 00:34:32 +0000 UTC" firstStartedPulling="2025-08-13 00:34:32.772072131 +0000 UTC m=+15.700412306" lastFinishedPulling="2025-08-13 00:34:34.523509376 +0000 UTC m=+17.451849550" observedRunningTime="2025-08-13 00:34:35.192369956 +0000 UTC m=+18.120710202" watchObservedRunningTime="2025-08-13 00:34:35.192864627 +0000 UTC m=+18.121204859" Aug 13 00:34:35.213367 kubelet[3254]: E0813 00:34:35.213295 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.213367 kubelet[3254]: W0813 00:34:35.213342 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.213367 kubelet[3254]: E0813 00:34:35.213382 3254 plugins.go:703] "Error dynamically probing plugins" err="error 
creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:35.213863 kubelet[3254]: E0813 00:34:35.213820 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.213863 kubelet[3254]: W0813 00:34:35.213858 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.214101 kubelet[3254]: E0813 00:34:35.213892 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:35.214500 kubelet[3254]: E0813 00:34:35.214434 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.214500 kubelet[3254]: W0813 00:34:35.214471 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.214767 kubelet[3254]: E0813 00:34:35.214525 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:35.214992 kubelet[3254]: E0813 00:34:35.214956 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.214992 kubelet[3254]: W0813 00:34:35.214985 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.215296 kubelet[3254]: E0813 00:34:35.215014 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:35.215522 kubelet[3254]: E0813 00:34:35.215486 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.215522 kubelet[3254]: W0813 00:34:35.215513 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.215787 kubelet[3254]: E0813 00:34:35.215543 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:35.215992 kubelet[3254]: E0813 00:34:35.215959 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.215992 kubelet[3254]: W0813 00:34:35.215985 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.216302 kubelet[3254]: E0813 00:34:35.216008 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:35.216491 kubelet[3254]: E0813 00:34:35.216448 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.216491 kubelet[3254]: W0813 00:34:35.216482 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.216728 kubelet[3254]: E0813 00:34:35.216520 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:35.216989 kubelet[3254]: E0813 00:34:35.216953 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.216989 kubelet[3254]: W0813 00:34:35.216982 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.217292 kubelet[3254]: E0813 00:34:35.217006 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:35.217750 kubelet[3254]: E0813 00:34:35.217702 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.217957 kubelet[3254]: W0813 00:34:35.217746 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.217957 kubelet[3254]: E0813 00:34:35.217795 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:35.218327 kubelet[3254]: E0813 00:34:35.218283 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.218327 kubelet[3254]: W0813 00:34:35.218318 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.218665 kubelet[3254]: E0813 00:34:35.218356 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:35.218913 kubelet[3254]: E0813 00:34:35.218871 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.218913 kubelet[3254]: W0813 00:34:35.218906 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.219268 kubelet[3254]: E0813 00:34:35.218945 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:35.219476 kubelet[3254]: E0813 00:34:35.219434 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.219652 kubelet[3254]: W0813 00:34:35.219473 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.219652 kubelet[3254]: E0813 00:34:35.219514 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:35.220050 kubelet[3254]: E0813 00:34:35.220011 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.220273 kubelet[3254]: W0813 00:34:35.220048 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.220273 kubelet[3254]: E0813 00:34:35.220087 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:35.220660 kubelet[3254]: E0813 00:34:35.220619 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.220660 kubelet[3254]: W0813 00:34:35.220656 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.221129 kubelet[3254]: E0813 00:34:35.220694 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:35.221363 kubelet[3254]: E0813 00:34:35.221170 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.221363 kubelet[3254]: W0813 00:34:35.221235 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.221363 kubelet[3254]: E0813 00:34:35.221273 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:35.238854 kubelet[3254]: E0813 00:34:35.238798 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.238854 kubelet[3254]: W0813 00:34:35.238836 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.239169 kubelet[3254]: E0813 00:34:35.238873 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:35.239491 kubelet[3254]: E0813 00:34:35.239443 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.239491 kubelet[3254]: W0813 00:34:35.239479 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.239797 kubelet[3254]: E0813 00:34:35.239511 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:35.240039 kubelet[3254]: E0813 00:34:35.239986 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.240039 kubelet[3254]: W0813 00:34:35.240014 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.240039 kubelet[3254]: E0813 00:34:35.240040 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:35.240697 kubelet[3254]: E0813 00:34:35.240648 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.240697 kubelet[3254]: W0813 00:34:35.240689 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.241044 kubelet[3254]: E0813 00:34:35.240735 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:35.241307 kubelet[3254]: E0813 00:34:35.241264 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.241307 kubelet[3254]: W0813 00:34:35.241295 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.241624 kubelet[3254]: E0813 00:34:35.241331 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:35.241883 kubelet[3254]: E0813 00:34:35.241838 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.241883 kubelet[3254]: W0813 00:34:35.241872 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.242237 kubelet[3254]: E0813 00:34:35.241914 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:35.242477 kubelet[3254]: E0813 00:34:35.242435 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.242477 kubelet[3254]: W0813 00:34:35.242468 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.242784 kubelet[3254]: E0813 00:34:35.242504 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:35.242996 kubelet[3254]: E0813 00:34:35.242960 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.243174 kubelet[3254]: W0813 00:34:35.242995 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.243174 kubelet[3254]: E0813 00:34:35.243033 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:35.243592 kubelet[3254]: E0813 00:34:35.243556 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.243592 kubelet[3254]: W0813 00:34:35.243586 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.243897 kubelet[3254]: E0813 00:34:35.243622 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:35.244135 kubelet[3254]: E0813 00:34:35.244098 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.244336 kubelet[3254]: W0813 00:34:35.244133 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.244336 kubelet[3254]: E0813 00:34:35.244172 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:35.244736 kubelet[3254]: E0813 00:34:35.244700 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.244736 kubelet[3254]: W0813 00:34:35.244730 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.245075 kubelet[3254]: E0813 00:34:35.244767 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:35.245275 kubelet[3254]: E0813 00:34:35.245251 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.245420 kubelet[3254]: W0813 00:34:35.245284 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.246348 kubelet[3254]: E0813 00:34:35.245322 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:35.247210 kubelet[3254]: E0813 00:34:35.246993 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.247210 kubelet[3254]: W0813 00:34:35.247071 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.247210 kubelet[3254]: E0813 00:34:35.247127 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:35.248836 kubelet[3254]: E0813 00:34:35.248390 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.248836 kubelet[3254]: W0813 00:34:35.248442 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.248836 kubelet[3254]: E0813 00:34:35.248513 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:35.249775 kubelet[3254]: E0813 00:34:35.249690 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.249775 kubelet[3254]: W0813 00:34:35.249731 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.249775 kubelet[3254]: E0813 00:34:35.249775 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:35.251394 kubelet[3254]: E0813 00:34:35.251299 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.251394 kubelet[3254]: W0813 00:34:35.251347 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.251394 kubelet[3254]: E0813 00:34:35.251385 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:35.252040 kubelet[3254]: E0813 00:34:35.251996 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.252145 kubelet[3254]: W0813 00:34:35.252040 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.252145 kubelet[3254]: E0813 00:34:35.252077 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:34:35.252718 kubelet[3254]: E0813 00:34:35.252639 3254 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:34:35.252718 kubelet[3254]: W0813 00:34:35.252674 3254 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:34:35.252718 kubelet[3254]: E0813 00:34:35.252705 3254 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:34:35.913348 containerd[1908]: time="2025-08-13T00:34:35.913326176Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:35.913596 containerd[1908]: time="2025-08-13T00:34:35.913523426Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2: active requests=0, bytes read=4446956" Aug 13 00:34:35.913920 containerd[1908]: time="2025-08-13T00:34:35.913905926Z" level=info msg="ImageCreate event name:\"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:35.914722 containerd[1908]: time="2025-08-13T00:34:35.914711435Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:35.915377 containerd[1908]: time="2025-08-13T00:34:35.915360386Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" with image id \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\", size \"5939619\" in 1.391787857s" Aug 13 00:34:35.915415 containerd[1908]: time="2025-08-13T00:34:35.915380476Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" returns image reference \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\"" Aug 13 00:34:35.917242 containerd[1908]: time="2025-08-13T00:34:35.917214314Z" level=info msg="CreateContainer within sandbox \"44c300e3a755b2cb010880f3b8cc0c13c9785fe18af25777a084a781483ca21d\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Aug 13 00:34:35.921049 containerd[1908]: time="2025-08-13T00:34:35.921011330Z" level=info msg="Container 08155112aa8e31ba578297fd38deb5317e417b40b2d393a5d05361b4aec3c850: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:35.924424 containerd[1908]: time="2025-08-13T00:34:35.924382544Z" level=info msg="CreateContainer within sandbox \"44c300e3a755b2cb010880f3b8cc0c13c9785fe18af25777a084a781483ca21d\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"08155112aa8e31ba578297fd38deb5317e417b40b2d393a5d05361b4aec3c850\"" Aug 13 00:34:35.924645 containerd[1908]: time="2025-08-13T00:34:35.924600925Z" level=info msg="StartContainer for \"08155112aa8e31ba578297fd38deb5317e417b40b2d393a5d05361b4aec3c850\"" Aug 13 00:34:35.925362 containerd[1908]: time="2025-08-13T00:34:35.925319834Z" level=info msg="connecting to shim 08155112aa8e31ba578297fd38deb5317e417b40b2d393a5d05361b4aec3c850" address="unix:///run/containerd/s/5d9ad55e387a9e77b9ce868a5b5a66ab15d420cc0434d9a36df80418171ea236" protocol=ttrpc version=3 Aug 13 00:34:35.947282 systemd[1]: Started cri-containerd-08155112aa8e31ba578297fd38deb5317e417b40b2d393a5d05361b4aec3c850.scope - libcontainer container 08155112aa8e31ba578297fd38deb5317e417b40b2d393a5d05361b4aec3c850. Aug 13 00:34:35.969408 containerd[1908]: time="2025-08-13T00:34:35.969379387Z" level=info msg="StartContainer for \"08155112aa8e31ba578297fd38deb5317e417b40b2d393a5d05361b4aec3c850\" returns successfully" Aug 13 00:34:35.973792 systemd[1]: cri-containerd-08155112aa8e31ba578297fd38deb5317e417b40b2d393a5d05361b4aec3c850.scope: Deactivated successfully. 
Aug 13 00:34:35.974933 containerd[1908]: time="2025-08-13T00:34:35.974905085Z" level=info msg="TaskExit event in podsandbox handler container_id:\"08155112aa8e31ba578297fd38deb5317e417b40b2d393a5d05361b4aec3c850\" id:\"08155112aa8e31ba578297fd38deb5317e417b40b2d393a5d05361b4aec3c850\" pid:4090 exited_at:{seconds:1755045275 nanos:974610561}" Aug 13 00:34:35.975027 containerd[1908]: time="2025-08-13T00:34:35.974979134Z" level=info msg="received exit event container_id:\"08155112aa8e31ba578297fd38deb5317e417b40b2d393a5d05361b4aec3c850\" id:\"08155112aa8e31ba578297fd38deb5317e417b40b2d393a5d05361b4aec3c850\" pid:4090 exited_at:{seconds:1755045275 nanos:974610561}" Aug 13 00:34:35.986824 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-08155112aa8e31ba578297fd38deb5317e417b40b2d393a5d05361b4aec3c850-rootfs.mount: Deactivated successfully. Aug 13 00:34:36.179897 kubelet[3254]: I0813 00:34:36.179809 3254 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 13 00:34:37.118497 kubelet[3254]: E0813 00:34:37.118456 3254 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-768vv" podUID="baea1b6a-90df-43be-a718-a692f968b764" Aug 13 00:34:37.188811 containerd[1908]: time="2025-08-13T00:34:37.188711416Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\"" Aug 13 00:34:39.118251 kubelet[3254]: E0813 00:34:39.118222 3254 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-768vv" podUID="baea1b6a-90df-43be-a718-a692f968b764" Aug 13 00:34:39.504651 containerd[1908]: time="2025-08-13T00:34:39.504625078Z" level=info 
msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:39.504923 containerd[1908]: time="2025-08-13T00:34:39.504906724Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.2: active requests=0, bytes read=70436221" Aug 13 00:34:39.505804 containerd[1908]: time="2025-08-13T00:34:39.505791950Z" level=info msg="ImageCreate event name:\"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:39.506867 containerd[1908]: time="2025-08-13T00:34:39.506854899Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:39.507148 containerd[1908]: time="2025-08-13T00:34:39.507134315Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.2\" with image id \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\", size \"71928924\" in 2.318335654s" Aug 13 00:34:39.507171 containerd[1908]: time="2025-08-13T00:34:39.507150330Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\" returns image reference \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\"" Aug 13 00:34:39.508471 containerd[1908]: time="2025-08-13T00:34:39.508452836Z" level=info msg="CreateContainer within sandbox \"44c300e3a755b2cb010880f3b8cc0c13c9785fe18af25777a084a781483ca21d\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Aug 13 00:34:39.511756 containerd[1908]: time="2025-08-13T00:34:39.511717409Z" level=info msg="Container 2e07b3e31932fddc3cd079b9f765421211e418ef0a795448d9b3f6cdc348e62e: CDI devices from CRI 
Config.CDIDevices: []" Aug 13 00:34:39.515309 containerd[1908]: time="2025-08-13T00:34:39.515267576Z" level=info msg="CreateContainer within sandbox \"44c300e3a755b2cb010880f3b8cc0c13c9785fe18af25777a084a781483ca21d\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"2e07b3e31932fddc3cd079b9f765421211e418ef0a795448d9b3f6cdc348e62e\"" Aug 13 00:34:39.515471 containerd[1908]: time="2025-08-13T00:34:39.515459785Z" level=info msg="StartContainer for \"2e07b3e31932fddc3cd079b9f765421211e418ef0a795448d9b3f6cdc348e62e\"" Aug 13 00:34:39.516173 containerd[1908]: time="2025-08-13T00:34:39.516160617Z" level=info msg="connecting to shim 2e07b3e31932fddc3cd079b9f765421211e418ef0a795448d9b3f6cdc348e62e" address="unix:///run/containerd/s/5d9ad55e387a9e77b9ce868a5b5a66ab15d420cc0434d9a36df80418171ea236" protocol=ttrpc version=3 Aug 13 00:34:39.531416 systemd[1]: Started cri-containerd-2e07b3e31932fddc3cd079b9f765421211e418ef0a795448d9b3f6cdc348e62e.scope - libcontainer container 2e07b3e31932fddc3cd079b9f765421211e418ef0a795448d9b3f6cdc348e62e. Aug 13 00:34:39.549994 containerd[1908]: time="2025-08-13T00:34:39.549890493Z" level=info msg="StartContainer for \"2e07b3e31932fddc3cd079b9f765421211e418ef0a795448d9b3f6cdc348e62e\" returns successfully" Aug 13 00:34:40.160152 containerd[1908]: time="2025-08-13T00:34:40.160082036Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Aug 13 00:34:40.161266 systemd[1]: cri-containerd-2e07b3e31932fddc3cd079b9f765421211e418ef0a795448d9b3f6cdc348e62e.scope: Deactivated successfully. Aug 13 00:34:40.161444 systemd[1]: cri-containerd-2e07b3e31932fddc3cd079b9f765421211e418ef0a795448d9b3f6cdc348e62e.scope: Consumed 414ms CPU time, 195.7M memory peak, 171.2M written to disk. 
Aug 13 00:34:40.161796 containerd[1908]: time="2025-08-13T00:34:40.161782622Z" level=info msg="received exit event container_id:\"2e07b3e31932fddc3cd079b9f765421211e418ef0a795448d9b3f6cdc348e62e\" id:\"2e07b3e31932fddc3cd079b9f765421211e418ef0a795448d9b3f6cdc348e62e\" pid:4151 exited_at:{seconds:1755045280 nanos:161686366}" Aug 13 00:34:40.161849 containerd[1908]: time="2025-08-13T00:34:40.161836910Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2e07b3e31932fddc3cd079b9f765421211e418ef0a795448d9b3f6cdc348e62e\" id:\"2e07b3e31932fddc3cd079b9f765421211e418ef0a795448d9b3f6cdc348e62e\" pid:4151 exited_at:{seconds:1755045280 nanos:161686366}" Aug 13 00:34:40.174654 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2e07b3e31932fddc3cd079b9f765421211e418ef0a795448d9b3f6cdc348e62e-rootfs.mount: Deactivated successfully. Aug 13 00:34:40.225021 kubelet[3254]: I0813 00:34:40.224925 3254 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Aug 13 00:34:40.312263 systemd[1]: Created slice kubepods-burstable-pod5edf5ee0_2bb3_4fc8_a5b6_81be33ed6600.slice - libcontainer container kubepods-burstable-pod5edf5ee0_2bb3_4fc8_a5b6_81be33ed6600.slice. Aug 13 00:34:40.349241 systemd[1]: Created slice kubepods-burstable-poda319200b_9c2f_4e26_983f_0ba93ef3f84c.slice - libcontainer container kubepods-burstable-poda319200b_9c2f_4e26_983f_0ba93ef3f84c.slice. Aug 13 00:34:40.367974 systemd[1]: Created slice kubepods-besteffort-pod4a08bd94_d4c1_4c85_ac9d_1c68619b4bd3.slice - libcontainer container kubepods-besteffort-pod4a08bd94_d4c1_4c85_ac9d_1c68619b4bd3.slice. 
Aug 13 00:34:40.376232 kubelet[3254]: I0813 00:34:40.376216 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5edf5ee0-2bb3-4fc8-a5b6-81be33ed6600-config-volume\") pod \"coredns-674b8bbfcf-fg4zz\" (UID: \"5edf5ee0-2bb3-4fc8-a5b6-81be33ed6600\") " pod="kube-system/coredns-674b8bbfcf-fg4zz" Aug 13 00:34:40.376284 kubelet[3254]: I0813 00:34:40.376240 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmm5d\" (UniqueName: \"kubernetes.io/projected/5edf5ee0-2bb3-4fc8-a5b6-81be33ed6600-kube-api-access-hmm5d\") pod \"coredns-674b8bbfcf-fg4zz\" (UID: \"5edf5ee0-2bb3-4fc8-a5b6-81be33ed6600\") " pod="kube-system/coredns-674b8bbfcf-fg4zz" Aug 13 00:34:40.376284 kubelet[3254]: I0813 00:34:40.376255 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a319200b-9c2f-4e26-983f-0ba93ef3f84c-config-volume\") pod \"coredns-674b8bbfcf-98gch\" (UID: \"a319200b-9c2f-4e26-983f-0ba93ef3f84c\") " pod="kube-system/coredns-674b8bbfcf-98gch" Aug 13 00:34:40.376284 kubelet[3254]: I0813 00:34:40.376265 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxmvv\" (UniqueName: \"kubernetes.io/projected/a319200b-9c2f-4e26-983f-0ba93ef3f84c-kube-api-access-wxmvv\") pod \"coredns-674b8bbfcf-98gch\" (UID: \"a319200b-9c2f-4e26-983f-0ba93ef3f84c\") " pod="kube-system/coredns-674b8bbfcf-98gch" Aug 13 00:34:40.408603 systemd[1]: Created slice kubepods-besteffort-podf0f2e0f9_90cf_41db_b550_f165e6260ca8.slice - libcontainer container kubepods-besteffort-podf0f2e0f9_90cf_41db_b550_f165e6260ca8.slice. 
Aug 13 00:34:40.448789 systemd[1]: Created slice kubepods-besteffort-pod6ac70178_1af5_4be2_929d_3b8bf392fd61.slice - libcontainer container kubepods-besteffort-pod6ac70178_1af5_4be2_929d_3b8bf392fd61.slice. Aug 13 00:34:40.477237 kubelet[3254]: I0813 00:34:40.477140 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f2e0f9-90cf-41db-b550-f165e6260ca8-tigera-ca-bundle\") pod \"calico-kube-controllers-85df7c9ff6-mpjsz\" (UID: \"f0f2e0f9-90cf-41db-b550-f165e6260ca8\") " pod="calico-system/calico-kube-controllers-85df7c9ff6-mpjsz" Aug 13 00:34:40.481352 kubelet[3254]: I0813 00:34:40.477259 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcxwd\" (UniqueName: \"kubernetes.io/projected/4a08bd94-d4c1-4c85-ac9d-1c68619b4bd3-kube-api-access-wcxwd\") pod \"calico-apiserver-856b9ff84f-llc5l\" (UID: \"4a08bd94-d4c1-4c85-ac9d-1c68619b4bd3\") " pod="calico-apiserver/calico-apiserver-856b9ff84f-llc5l" Aug 13 00:34:40.481352 kubelet[3254]: I0813 00:34:40.477317 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc959\" (UniqueName: \"kubernetes.io/projected/f0f2e0f9-90cf-41db-b550-f165e6260ca8-kube-api-access-lc959\") pod \"calico-kube-controllers-85df7c9ff6-mpjsz\" (UID: \"f0f2e0f9-90cf-41db-b550-f165e6260ca8\") " pod="calico-system/calico-kube-controllers-85df7c9ff6-mpjsz" Aug 13 00:34:40.481352 kubelet[3254]: I0813 00:34:40.477476 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/6ac70178-1af5-4be2-929d-3b8bf392fd61-calico-apiserver-certs\") pod \"calico-apiserver-856b9ff84f-hvjbp\" (UID: \"6ac70178-1af5-4be2-929d-3b8bf392fd61\") " pod="calico-apiserver/calico-apiserver-856b9ff84f-hvjbp" Aug 13 00:34:40.481352 kubelet[3254]: 
I0813 00:34:40.477572 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhr6h\" (UniqueName: \"kubernetes.io/projected/6ac70178-1af5-4be2-929d-3b8bf392fd61-kube-api-access-rhr6h\") pod \"calico-apiserver-856b9ff84f-hvjbp\" (UID: \"6ac70178-1af5-4be2-929d-3b8bf392fd61\") " pod="calico-apiserver/calico-apiserver-856b9ff84f-hvjbp" Aug 13 00:34:40.481352 kubelet[3254]: I0813 00:34:40.477798 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/4a08bd94-d4c1-4c85-ac9d-1c68619b4bd3-calico-apiserver-certs\") pod \"calico-apiserver-856b9ff84f-llc5l\" (UID: \"4a08bd94-d4c1-4c85-ac9d-1c68619b4bd3\") " pod="calico-apiserver/calico-apiserver-856b9ff84f-llc5l" Aug 13 00:34:40.487877 systemd[1]: Created slice kubepods-besteffort-poddda6ea77_d0ff_457f_8498_7faa0278932a.slice - libcontainer container kubepods-besteffort-poddda6ea77_d0ff_457f_8498_7faa0278932a.slice. Aug 13 00:34:40.569969 systemd[1]: Created slice kubepods-besteffort-podeb462351_c546_4721_840a_b37dd9e5027a.slice - libcontainer container kubepods-besteffort-podeb462351_c546_4721_840a_b37dd9e5027a.slice. 
Aug 13 00:34:40.578943 kubelet[3254]: I0813 00:34:40.578898 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p4vn\" (UniqueName: \"kubernetes.io/projected/dda6ea77-d0ff-457f-8498-7faa0278932a-kube-api-access-2p4vn\") pod \"goldmane-768f4c5c69-rd2g6\" (UID: \"dda6ea77-d0ff-457f-8498-7faa0278932a\") " pod="calico-system/goldmane-768f4c5c69-rd2g6" Aug 13 00:34:40.578943 kubelet[3254]: I0813 00:34:40.578925 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dda6ea77-d0ff-457f-8498-7faa0278932a-config\") pod \"goldmane-768f4c5c69-rd2g6\" (UID: \"dda6ea77-d0ff-457f-8498-7faa0278932a\") " pod="calico-system/goldmane-768f4c5c69-rd2g6" Aug 13 00:34:40.578943 kubelet[3254]: I0813 00:34:40.578938 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dda6ea77-d0ff-457f-8498-7faa0278932a-goldmane-ca-bundle\") pod \"goldmane-768f4c5c69-rd2g6\" (UID: \"dda6ea77-d0ff-457f-8498-7faa0278932a\") " pod="calico-system/goldmane-768f4c5c69-rd2g6" Aug 13 00:34:40.579074 kubelet[3254]: I0813 00:34:40.579003 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/dda6ea77-d0ff-457f-8498-7faa0278932a-goldmane-key-pair\") pod \"goldmane-768f4c5c69-rd2g6\" (UID: \"dda6ea77-d0ff-457f-8498-7faa0278932a\") " pod="calico-system/goldmane-768f4c5c69-rd2g6" Aug 13 00:34:40.615539 containerd[1908]: time="2025-08-13T00:34:40.615507783Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-fg4zz,Uid:5edf5ee0-2bb3-4fc8-a5b6-81be33ed6600,Namespace:kube-system,Attempt:0,}" Aug 13 00:34:40.639673 containerd[1908]: time="2025-08-13T00:34:40.639616925Z" level=error msg="Failed to destroy network for sandbox 
\"a83514104413c0548d95d9ea653409c895f503556c47618d5df0a0b092634d59\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:40.640098 containerd[1908]: time="2025-08-13T00:34:40.640054075Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-fg4zz,Uid:5edf5ee0-2bb3-4fc8-a5b6-81be33ed6600,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a83514104413c0548d95d9ea653409c895f503556c47618d5df0a0b092634d59\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:40.640255 kubelet[3254]: E0813 00:34:40.640200 3254 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a83514104413c0548d95d9ea653409c895f503556c47618d5df0a0b092634d59\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:40.640255 kubelet[3254]: E0813 00:34:40.640249 3254 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a83514104413c0548d95d9ea653409c895f503556c47618d5df0a0b092634d59\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-fg4zz" Aug 13 00:34:40.640319 kubelet[3254]: E0813 00:34:40.640262 3254 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"a83514104413c0548d95d9ea653409c895f503556c47618d5df0a0b092634d59\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-fg4zz" Aug 13 00:34:40.640319 kubelet[3254]: E0813 00:34:40.640298 3254 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-fg4zz_kube-system(5edf5ee0-2bb3-4fc8-a5b6-81be33ed6600)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-fg4zz_kube-system(5edf5ee0-2bb3-4fc8-a5b6-81be33ed6600)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a83514104413c0548d95d9ea653409c895f503556c47618d5df0a0b092634d59\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-fg4zz" podUID="5edf5ee0-2bb3-4fc8-a5b6-81be33ed6600" Aug 13 00:34:40.640920 systemd[1]: run-netns-cni\x2dcdeba00b\x2dfa69\x2d166f\x2dd4c8\x2d6765ec8459ad.mount: Deactivated successfully. 
Aug 13 00:34:40.662661 containerd[1908]: time="2025-08-13T00:34:40.662574027Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-98gch,Uid:a319200b-9c2f-4e26-983f-0ba93ef3f84c,Namespace:kube-system,Attempt:0,}" Aug 13 00:34:40.671174 containerd[1908]: time="2025-08-13T00:34:40.671152285Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-856b9ff84f-llc5l,Uid:4a08bd94-d4c1-4c85-ac9d-1c68619b4bd3,Namespace:calico-apiserver,Attempt:0,}" Aug 13 00:34:40.679373 kubelet[3254]: I0813 00:34:40.679349 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/eb462351-c546-4721-840a-b37dd9e5027a-whisker-backend-key-pair\") pod \"whisker-6b58d784d-pbjhv\" (UID: \"eb462351-c546-4721-840a-b37dd9e5027a\") " pod="calico-system/whisker-6b58d784d-pbjhv" Aug 13 00:34:40.679471 kubelet[3254]: I0813 00:34:40.679386 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb462351-c546-4721-840a-b37dd9e5027a-whisker-ca-bundle\") pod \"whisker-6b58d784d-pbjhv\" (UID: \"eb462351-c546-4721-840a-b37dd9e5027a\") " pod="calico-system/whisker-6b58d784d-pbjhv" Aug 13 00:34:40.679471 kubelet[3254]: I0813 00:34:40.679403 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2ngj\" (UniqueName: \"kubernetes.io/projected/eb462351-c546-4721-840a-b37dd9e5027a-kube-api-access-t2ngj\") pod \"whisker-6b58d784d-pbjhv\" (UID: \"eb462351-c546-4721-840a-b37dd9e5027a\") " pod="calico-system/whisker-6b58d784d-pbjhv" Aug 13 00:34:40.686550 containerd[1908]: time="2025-08-13T00:34:40.686521631Z" level=error msg="Failed to destroy network for sandbox \"dfff47b6a125da871dc973d13b67a7b366869d5bf887cfd5706a1f474c280a27\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:40.687023 containerd[1908]: time="2025-08-13T00:34:40.687004933Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-98gch,Uid:a319200b-9c2f-4e26-983f-0ba93ef3f84c,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"dfff47b6a125da871dc973d13b67a7b366869d5bf887cfd5706a1f474c280a27\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:40.687132 kubelet[3254]: E0813 00:34:40.687115 3254 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dfff47b6a125da871dc973d13b67a7b366869d5bf887cfd5706a1f474c280a27\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:40.687162 kubelet[3254]: E0813 00:34:40.687146 3254 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dfff47b6a125da871dc973d13b67a7b366869d5bf887cfd5706a1f474c280a27\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-98gch" Aug 13 00:34:40.687185 kubelet[3254]: E0813 00:34:40.687160 3254 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dfff47b6a125da871dc973d13b67a7b366869d5bf887cfd5706a1f474c280a27\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-98gch" Aug 13 00:34:40.687210 kubelet[3254]: E0813 00:34:40.687196 3254 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-98gch_kube-system(a319200b-9c2f-4e26-983f-0ba93ef3f84c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-98gch_kube-system(a319200b-9c2f-4e26-983f-0ba93ef3f84c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dfff47b6a125da871dc973d13b67a7b366869d5bf887cfd5706a1f474c280a27\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-98gch" podUID="a319200b-9c2f-4e26-983f-0ba93ef3f84c" Aug 13 00:34:40.694007 containerd[1908]: time="2025-08-13T00:34:40.693977993Z" level=error msg="Failed to destroy network for sandbox \"1bdb730ca4302deac5641ca889958e8f2baccfa9d74b7abe1e928d94f0767e51\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:40.694500 containerd[1908]: time="2025-08-13T00:34:40.694481625Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-856b9ff84f-llc5l,Uid:4a08bd94-d4c1-4c85-ac9d-1c68619b4bd3,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1bdb730ca4302deac5641ca889958e8f2baccfa9d74b7abe1e928d94f0767e51\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:40.694604 kubelet[3254]: E0813 00:34:40.694586 3254 log.go:32] 
"RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1bdb730ca4302deac5641ca889958e8f2baccfa9d74b7abe1e928d94f0767e51\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:40.694633 kubelet[3254]: E0813 00:34:40.694617 3254 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1bdb730ca4302deac5641ca889958e8f2baccfa9d74b7abe1e928d94f0767e51\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-856b9ff84f-llc5l" Aug 13 00:34:40.694653 kubelet[3254]: E0813 00:34:40.694631 3254 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1bdb730ca4302deac5641ca889958e8f2baccfa9d74b7abe1e928d94f0767e51\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-856b9ff84f-llc5l" Aug 13 00:34:40.694676 kubelet[3254]: E0813 00:34:40.694662 3254 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-856b9ff84f-llc5l_calico-apiserver(4a08bd94-d4c1-4c85-ac9d-1c68619b4bd3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-856b9ff84f-llc5l_calico-apiserver(4a08bd94-d4c1-4c85-ac9d-1c68619b4bd3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1bdb730ca4302deac5641ca889958e8f2baccfa9d74b7abe1e928d94f0767e51\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-856b9ff84f-llc5l" podUID="4a08bd94-d4c1-4c85-ac9d-1c68619b4bd3" Aug 13 00:34:40.712484 containerd[1908]: time="2025-08-13T00:34:40.712317915Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-85df7c9ff6-mpjsz,Uid:f0f2e0f9-90cf-41db-b550-f165e6260ca8,Namespace:calico-system,Attempt:0,}" Aug 13 00:34:40.736824 containerd[1908]: time="2025-08-13T00:34:40.736797719Z" level=error msg="Failed to destroy network for sandbox \"f5ea4afbfeb02f5c4d5467e93828e28bface93e0c4a0eebb37f47dfe9b409513\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:40.737221 containerd[1908]: time="2025-08-13T00:34:40.737203434Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-85df7c9ff6-mpjsz,Uid:f0f2e0f9-90cf-41db-b550-f165e6260ca8,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5ea4afbfeb02f5c4d5467e93828e28bface93e0c4a0eebb37f47dfe9b409513\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:40.737398 kubelet[3254]: E0813 00:34:40.737377 3254 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5ea4afbfeb02f5c4d5467e93828e28bface93e0c4a0eebb37f47dfe9b409513\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:40.737428 kubelet[3254]: E0813 00:34:40.737413 3254 
kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5ea4afbfeb02f5c4d5467e93828e28bface93e0c4a0eebb37f47dfe9b409513\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-85df7c9ff6-mpjsz" Aug 13 00:34:40.737448 kubelet[3254]: E0813 00:34:40.737432 3254 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5ea4afbfeb02f5c4d5467e93828e28bface93e0c4a0eebb37f47dfe9b409513\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-85df7c9ff6-mpjsz" Aug 13 00:34:40.737475 kubelet[3254]: E0813 00:34:40.737462 3254 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-85df7c9ff6-mpjsz_calico-system(f0f2e0f9-90cf-41db-b550-f165e6260ca8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-85df7c9ff6-mpjsz_calico-system(f0f2e0f9-90cf-41db-b550-f165e6260ca8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f5ea4afbfeb02f5c4d5467e93828e28bface93e0c4a0eebb37f47dfe9b409513\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-85df7c9ff6-mpjsz" podUID="f0f2e0f9-90cf-41db-b550-f165e6260ca8" Aug 13 00:34:40.751806 containerd[1908]: time="2025-08-13T00:34:40.751753272Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-856b9ff84f-hvjbp,Uid:6ac70178-1af5-4be2-929d-3b8bf392fd61,Namespace:calico-apiserver,Attempt:0,}" Aug 13 00:34:40.775198 containerd[1908]: time="2025-08-13T00:34:40.775161952Z" level=error msg="Failed to destroy network for sandbox \"c7d0a6c7259076e9d8abaf1cfad5a5a6d0506d57cc6b914be2bdb824c60c0a8d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:40.775642 containerd[1908]: time="2025-08-13T00:34:40.775626120Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-856b9ff84f-hvjbp,Uid:6ac70178-1af5-4be2-929d-3b8bf392fd61,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c7d0a6c7259076e9d8abaf1cfad5a5a6d0506d57cc6b914be2bdb824c60c0a8d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:40.775847 kubelet[3254]: E0813 00:34:40.775797 3254 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c7d0a6c7259076e9d8abaf1cfad5a5a6d0506d57cc6b914be2bdb824c60c0a8d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:40.775847 kubelet[3254]: E0813 00:34:40.775839 3254 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c7d0a6c7259076e9d8abaf1cfad5a5a6d0506d57cc6b914be2bdb824c60c0a8d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-apiserver/calico-apiserver-856b9ff84f-hvjbp" Aug 13 00:34:40.775896 kubelet[3254]: E0813 00:34:40.775853 3254 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c7d0a6c7259076e9d8abaf1cfad5a5a6d0506d57cc6b914be2bdb824c60c0a8d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-856b9ff84f-hvjbp" Aug 13 00:34:40.775918 kubelet[3254]: E0813 00:34:40.775887 3254 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-856b9ff84f-hvjbp_calico-apiserver(6ac70178-1af5-4be2-929d-3b8bf392fd61)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-856b9ff84f-hvjbp_calico-apiserver(6ac70178-1af5-4be2-929d-3b8bf392fd61)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c7d0a6c7259076e9d8abaf1cfad5a5a6d0506d57cc6b914be2bdb824c60c0a8d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-856b9ff84f-hvjbp" podUID="6ac70178-1af5-4be2-929d-3b8bf392fd61" Aug 13 00:34:40.794875 containerd[1908]: time="2025-08-13T00:34:40.794813573Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-rd2g6,Uid:dda6ea77-d0ff-457f-8498-7faa0278932a,Namespace:calico-system,Attempt:0,}" Aug 13 00:34:40.817498 containerd[1908]: time="2025-08-13T00:34:40.817471144Z" level=error msg="Failed to destroy network for sandbox \"6249f16e6351c2e956736d3f684b5fd77a2c5e6a746994f78f9dfbbcb2fbf2d6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Aug 13 00:34:40.817996 containerd[1908]: time="2025-08-13T00:34:40.817977967Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-rd2g6,Uid:dda6ea77-d0ff-457f-8498-7faa0278932a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6249f16e6351c2e956736d3f684b5fd77a2c5e6a746994f78f9dfbbcb2fbf2d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:40.818131 kubelet[3254]: E0813 00:34:40.818109 3254 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6249f16e6351c2e956736d3f684b5fd77a2c5e6a746994f78f9dfbbcb2fbf2d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:40.818162 kubelet[3254]: E0813 00:34:40.818148 3254 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6249f16e6351c2e956736d3f684b5fd77a2c5e6a746994f78f9dfbbcb2fbf2d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-rd2g6" Aug 13 00:34:40.818189 kubelet[3254]: E0813 00:34:40.818161 3254 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6249f16e6351c2e956736d3f684b5fd77a2c5e6a746994f78f9dfbbcb2fbf2d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-rd2g6" Aug 13 00:34:40.818217 kubelet[3254]: E0813 00:34:40.818196 3254 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-768f4c5c69-rd2g6_calico-system(dda6ea77-d0ff-457f-8498-7faa0278932a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-768f4c5c69-rd2g6_calico-system(dda6ea77-d0ff-457f-8498-7faa0278932a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6249f16e6351c2e956736d3f684b5fd77a2c5e6a746994f78f9dfbbcb2fbf2d6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-768f4c5c69-rd2g6" podUID="dda6ea77-d0ff-457f-8498-7faa0278932a" Aug 13 00:34:40.872960 containerd[1908]: time="2025-08-13T00:34:40.872850447Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6b58d784d-pbjhv,Uid:eb462351-c546-4721-840a-b37dd9e5027a,Namespace:calico-system,Attempt:0,}" Aug 13 00:34:40.897362 containerd[1908]: time="2025-08-13T00:34:40.897306545Z" level=error msg="Failed to destroy network for sandbox \"871ca3afcb19aca3ad256a21c256e81b48006345e5e42d29592576aaeb12ca62\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:40.897862 containerd[1908]: time="2025-08-13T00:34:40.897843633Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6b58d784d-pbjhv,Uid:eb462351-c546-4721-840a-b37dd9e5027a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"871ca3afcb19aca3ad256a21c256e81b48006345e5e42d29592576aaeb12ca62\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:40.898014 kubelet[3254]: E0813 00:34:40.897963 3254 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"871ca3afcb19aca3ad256a21c256e81b48006345e5e42d29592576aaeb12ca62\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:40.898014 kubelet[3254]: E0813 00:34:40.897998 3254 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"871ca3afcb19aca3ad256a21c256e81b48006345e5e42d29592576aaeb12ca62\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6b58d784d-pbjhv" Aug 13 00:34:40.898014 kubelet[3254]: E0813 00:34:40.898011 3254 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"871ca3afcb19aca3ad256a21c256e81b48006345e5e42d29592576aaeb12ca62\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6b58d784d-pbjhv" Aug 13 00:34:40.898081 kubelet[3254]: E0813 00:34:40.898041 3254 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-6b58d784d-pbjhv_calico-system(eb462351-c546-4721-840a-b37dd9e5027a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-6b58d784d-pbjhv_calico-system(eb462351-c546-4721-840a-b37dd9e5027a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"871ca3afcb19aca3ad256a21c256e81b48006345e5e42d29592576aaeb12ca62\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-6b58d784d-pbjhv" podUID="eb462351-c546-4721-840a-b37dd9e5027a" Aug 13 00:34:41.133798 systemd[1]: Created slice kubepods-besteffort-podbaea1b6a_90df_43be_a718_a692f968b764.slice - libcontainer container kubepods-besteffort-podbaea1b6a_90df_43be_a718_a692f968b764.slice. Aug 13 00:34:41.140056 containerd[1908]: time="2025-08-13T00:34:41.139967265Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-768vv,Uid:baea1b6a-90df-43be-a718-a692f968b764,Namespace:calico-system,Attempt:0,}" Aug 13 00:34:41.165551 containerd[1908]: time="2025-08-13T00:34:41.165525533Z" level=error msg="Failed to destroy network for sandbox \"ee5b2d10444e0a9398003cb0dc2d2898137c5bccc47dcabd4090610d050ef334\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:41.166027 containerd[1908]: time="2025-08-13T00:34:41.165969791Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-768vv,Uid:baea1b6a-90df-43be-a718-a692f968b764,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ee5b2d10444e0a9398003cb0dc2d2898137c5bccc47dcabd4090610d050ef334\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:41.166160 kubelet[3254]: E0813 00:34:41.166143 3254 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"ee5b2d10444e0a9398003cb0dc2d2898137c5bccc47dcabd4090610d050ef334\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:34:41.166193 kubelet[3254]: E0813 00:34:41.166174 3254 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ee5b2d10444e0a9398003cb0dc2d2898137c5bccc47dcabd4090610d050ef334\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-768vv" Aug 13 00:34:41.166211 kubelet[3254]: E0813 00:34:41.166199 3254 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ee5b2d10444e0a9398003cb0dc2d2898137c5bccc47dcabd4090610d050ef334\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-768vv" Aug 13 00:34:41.166242 kubelet[3254]: E0813 00:34:41.166229 3254 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-768vv_calico-system(baea1b6a-90df-43be-a718-a692f968b764)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-768vv_calico-system(baea1b6a-90df-43be-a718-a692f968b764)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ee5b2d10444e0a9398003cb0dc2d2898137c5bccc47dcabd4090610d050ef334\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-768vv" 
podUID="baea1b6a-90df-43be-a718-a692f968b764" Aug 13 00:34:41.207220 containerd[1908]: time="2025-08-13T00:34:41.207086784Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\"" Aug 13 00:34:41.587688 systemd[1]: run-netns-cni\x2d5d19b2df\x2dcff2\x2db2e2\x2dd141\x2d6e2834884f60.mount: Deactivated successfully. Aug 13 00:34:41.863526 kubelet[3254]: I0813 00:34:41.863304 3254 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 13 00:34:44.600170 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2547772640.mount: Deactivated successfully. Aug 13 00:34:44.620146 containerd[1908]: time="2025-08-13T00:34:44.620089253Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:44.620311 containerd[1908]: time="2025-08-13T00:34:44.620287118Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.2: active requests=0, bytes read=158500163" Aug 13 00:34:44.620648 containerd[1908]: time="2025-08-13T00:34:44.620608087Z" level=info msg="ImageCreate event name:\"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:44.621370 containerd[1908]: time="2025-08-13T00:34:44.621328782Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:44.621681 containerd[1908]: time="2025-08-13T00:34:44.621642462Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.2\" with image id \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\", size \"158500025\" in 3.414489688s" Aug 13 
00:34:44.621681 containerd[1908]: time="2025-08-13T00:34:44.621655898Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\" returns image reference \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\"" Aug 13 00:34:44.625615 containerd[1908]: time="2025-08-13T00:34:44.625597636Z" level=info msg="CreateContainer within sandbox \"44c300e3a755b2cb010880f3b8cc0c13c9785fe18af25777a084a781483ca21d\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Aug 13 00:34:44.629032 containerd[1908]: time="2025-08-13T00:34:44.628995384Z" level=info msg="Container 5edb60c85d9a8922fea3e568a58e03913f9f1226b814e866ea29c21abc42097c: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:44.632690 containerd[1908]: time="2025-08-13T00:34:44.632640386Z" level=info msg="CreateContainer within sandbox \"44c300e3a755b2cb010880f3b8cc0c13c9785fe18af25777a084a781483ca21d\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"5edb60c85d9a8922fea3e568a58e03913f9f1226b814e866ea29c21abc42097c\"" Aug 13 00:34:44.632896 containerd[1908]: time="2025-08-13T00:34:44.632838084Z" level=info msg="StartContainer for \"5edb60c85d9a8922fea3e568a58e03913f9f1226b814e866ea29c21abc42097c\"" Aug 13 00:34:44.633607 containerd[1908]: time="2025-08-13T00:34:44.633562584Z" level=info msg="connecting to shim 5edb60c85d9a8922fea3e568a58e03913f9f1226b814e866ea29c21abc42097c" address="unix:///run/containerd/s/5d9ad55e387a9e77b9ce868a5b5a66ab15d420cc0434d9a36df80418171ea236" protocol=ttrpc version=3 Aug 13 00:34:44.652497 systemd[1]: Started cri-containerd-5edb60c85d9a8922fea3e568a58e03913f9f1226b814e866ea29c21abc42097c.scope - libcontainer container 5edb60c85d9a8922fea3e568a58e03913f9f1226b814e866ea29c21abc42097c. 
Aug 13 00:34:44.672159 containerd[1908]: time="2025-08-13T00:34:44.672111328Z" level=info msg="StartContainer for \"5edb60c85d9a8922fea3e568a58e03913f9f1226b814e866ea29c21abc42097c\" returns successfully" Aug 13 00:34:44.728724 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Aug 13 00:34:44.728776 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Aug 13 00:34:44.807333 kubelet[3254]: I0813 00:34:44.807309 3254 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/eb462351-c546-4721-840a-b37dd9e5027a-whisker-backend-key-pair\") pod \"eb462351-c546-4721-840a-b37dd9e5027a\" (UID: \"eb462351-c546-4721-840a-b37dd9e5027a\") " Aug 13 00:34:44.807562 kubelet[3254]: I0813 00:34:44.807343 3254 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2ngj\" (UniqueName: \"kubernetes.io/projected/eb462351-c546-4721-840a-b37dd9e5027a-kube-api-access-t2ngj\") pod \"eb462351-c546-4721-840a-b37dd9e5027a\" (UID: \"eb462351-c546-4721-840a-b37dd9e5027a\") " Aug 13 00:34:44.807562 kubelet[3254]: I0813 00:34:44.807364 3254 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb462351-c546-4721-840a-b37dd9e5027a-whisker-ca-bundle\") pod \"eb462351-c546-4721-840a-b37dd9e5027a\" (UID: \"eb462351-c546-4721-840a-b37dd9e5027a\") " Aug 13 00:34:44.807598 kubelet[3254]: I0813 00:34:44.807575 3254 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb462351-c546-4721-840a-b37dd9e5027a-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "eb462351-c546-4721-840a-b37dd9e5027a" (UID: "eb462351-c546-4721-840a-b37dd9e5027a"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Aug 13 00:34:44.808763 kubelet[3254]: I0813 00:34:44.808748 3254 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb462351-c546-4721-840a-b37dd9e5027a-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "eb462351-c546-4721-840a-b37dd9e5027a" (UID: "eb462351-c546-4721-840a-b37dd9e5027a"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Aug 13 00:34:44.808797 kubelet[3254]: I0813 00:34:44.808767 3254 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb462351-c546-4721-840a-b37dd9e5027a-kube-api-access-t2ngj" (OuterVolumeSpecName: "kube-api-access-t2ngj") pod "eb462351-c546-4721-840a-b37dd9e5027a" (UID: "eb462351-c546-4721-840a-b37dd9e5027a"). InnerVolumeSpecName "kube-api-access-t2ngj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Aug 13 00:34:44.908512 kubelet[3254]: I0813 00:34:44.908307 3254 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/eb462351-c546-4721-840a-b37dd9e5027a-whisker-backend-key-pair\") on node \"ci-4372.1.0-a-083aa5303b\" DevicePath \"\"" Aug 13 00:34:44.908512 kubelet[3254]: I0813 00:34:44.908369 3254 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t2ngj\" (UniqueName: \"kubernetes.io/projected/eb462351-c546-4721-840a-b37dd9e5027a-kube-api-access-t2ngj\") on node \"ci-4372.1.0-a-083aa5303b\" DevicePath \"\"" Aug 13 00:34:44.908512 kubelet[3254]: I0813 00:34:44.908397 3254 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb462351-c546-4721-840a-b37dd9e5027a-whisker-ca-bundle\") on node \"ci-4372.1.0-a-083aa5303b\" DevicePath \"\"" Aug 13 00:34:45.134218 systemd[1]: Removed slice kubepods-besteffort-podeb462351_c546_4721_840a_b37dd9e5027a.slice - 
libcontainer container kubepods-besteffort-podeb462351_c546_4721_840a_b37dd9e5027a.slice. Aug 13 00:34:45.233413 kubelet[3254]: I0813 00:34:45.233355 3254 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-4nwh7" podStartSLOduration=1.688359407 podStartE2EDuration="13.233336657s" podCreationTimestamp="2025-08-13 00:34:32 +0000 UTC" firstStartedPulling="2025-08-13 00:34:33.077059453 +0000 UTC m=+16.005399628" lastFinishedPulling="2025-08-13 00:34:44.622036701 +0000 UTC m=+27.550376878" observedRunningTime="2025-08-13 00:34:45.232820889 +0000 UTC m=+28.161161083" watchObservedRunningTime="2025-08-13 00:34:45.233336657 +0000 UTC m=+28.161676839" Aug 13 00:34:45.243087 systemd[1]: Created slice kubepods-besteffort-pod50eb8268_c998_4f07_806d_439bfe8fc2c1.slice - libcontainer container kubepods-besteffort-pod50eb8268_c998_4f07_806d_439bfe8fc2c1.slice. Aug 13 00:34:45.274283 containerd[1908]: time="2025-08-13T00:34:45.274247822Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5edb60c85d9a8922fea3e568a58e03913f9f1226b814e866ea29c21abc42097c\" id:\"fda64f2ebb7ea7c4c563f874101783bae702e4e84c1f507cfb20650606ab8c6d\" pid:4653 exit_status:1 exited_at:{seconds:1755045285 nanos:273779174}" Aug 13 00:34:45.310893 kubelet[3254]: I0813 00:34:45.310852 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50eb8268-c998-4f07-806d-439bfe8fc2c1-whisker-ca-bundle\") pod \"whisker-584cfc84b4-46ld2\" (UID: \"50eb8268-c998-4f07-806d-439bfe8fc2c1\") " pod="calico-system/whisker-584cfc84b4-46ld2" Aug 13 00:34:45.311056 kubelet[3254]: I0813 00:34:45.310928 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/50eb8268-c998-4f07-806d-439bfe8fc2c1-whisker-backend-key-pair\") pod \"whisker-584cfc84b4-46ld2\" (UID: 
\"50eb8268-c998-4f07-806d-439bfe8fc2c1\") " pod="calico-system/whisker-584cfc84b4-46ld2" Aug 13 00:34:45.311056 kubelet[3254]: I0813 00:34:45.310981 3254 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c957v\" (UniqueName: \"kubernetes.io/projected/50eb8268-c998-4f07-806d-439bfe8fc2c1-kube-api-access-c957v\") pod \"whisker-584cfc84b4-46ld2\" (UID: \"50eb8268-c998-4f07-806d-439bfe8fc2c1\") " pod="calico-system/whisker-584cfc84b4-46ld2" Aug 13 00:34:45.546960 containerd[1908]: time="2025-08-13T00:34:45.546768032Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-584cfc84b4-46ld2,Uid:50eb8268-c998-4f07-806d-439bfe8fc2c1,Namespace:calico-system,Attempt:0,}" Aug 13 00:34:45.602613 systemd[1]: var-lib-kubelet-pods-eb462351\x2dc546\x2d4721\x2d840a\x2db37dd9e5027a-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dt2ngj.mount: Deactivated successfully. Aug 13 00:34:45.602703 systemd[1]: var-lib-kubelet-pods-eb462351\x2dc546\x2d4721\x2d840a\x2db37dd9e5027a-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Aug 13 00:34:45.604863 systemd-networkd[1816]: cali1a0a5678557: Link UP Aug 13 00:34:45.604973 systemd-networkd[1816]: cali1a0a5678557: Gained carrier Aug 13 00:34:45.611042 containerd[1908]: 2025-08-13 00:34:45.559 [INFO][4680] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 13 00:34:45.611042 containerd[1908]: 2025-08-13 00:34:45.566 [INFO][4680] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--a--083aa5303b-k8s-whisker--584cfc84b4--46ld2-eth0 whisker-584cfc84b4- calico-system 50eb8268-c998-4f07-806d-439bfe8fc2c1 862 0 2025-08-13 00:34:45 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:584cfc84b4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4372.1.0-a-083aa5303b whisker-584cfc84b4-46ld2 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali1a0a5678557 [] [] }} ContainerID="01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64" Namespace="calico-system" Pod="whisker-584cfc84b4-46ld2" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-whisker--584cfc84b4--46ld2-" Aug 13 00:34:45.611042 containerd[1908]: 2025-08-13 00:34:45.566 [INFO][4680] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64" Namespace="calico-system" Pod="whisker-584cfc84b4-46ld2" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-whisker--584cfc84b4--46ld2-eth0" Aug 13 00:34:45.611042 containerd[1908]: 2025-08-13 00:34:45.578 [INFO][4703] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64" HandleID="k8s-pod-network.01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64" Workload="ci--4372.1.0--a--083aa5303b-k8s-whisker--584cfc84b4--46ld2-eth0" Aug 13 00:34:45.611209 
containerd[1908]: 2025-08-13 00:34:45.578 [INFO][4703] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64" HandleID="k8s-pod-network.01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64" Workload="ci--4372.1.0--a--083aa5303b-k8s-whisker--584cfc84b4--46ld2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f850), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372.1.0-a-083aa5303b", "pod":"whisker-584cfc84b4-46ld2", "timestamp":"2025-08-13 00:34:45.578829455 +0000 UTC"}, Hostname:"ci-4372.1.0-a-083aa5303b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:34:45.611209 containerd[1908]: 2025-08-13 00:34:45.578 [INFO][4703] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:34:45.611209 containerd[1908]: 2025-08-13 00:34:45.578 [INFO][4703] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 00:34:45.611209 containerd[1908]: 2025-08-13 00:34:45.578 [INFO][4703] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-a-083aa5303b' Aug 13 00:34:45.611209 containerd[1908]: 2025-08-13 00:34:45.583 [INFO][4703] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:45.611209 containerd[1908]: 2025-08-13 00:34:45.586 [INFO][4703] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:45.611209 containerd[1908]: 2025-08-13 00:34:45.589 [INFO][4703] ipam/ipam.go 511: Trying affinity for 192.168.35.64/26 host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:45.611209 containerd[1908]: 2025-08-13 00:34:45.591 [INFO][4703] ipam/ipam.go 158: Attempting to load block cidr=192.168.35.64/26 host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:45.611209 containerd[1908]: 2025-08-13 00:34:45.592 [INFO][4703] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.35.64/26 host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:45.611374 containerd[1908]: 2025-08-13 00:34:45.592 [INFO][4703] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.35.64/26 handle="k8s-pod-network.01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:45.611374 containerd[1908]: 2025-08-13 00:34:45.593 [INFO][4703] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64 Aug 13 00:34:45.611374 containerd[1908]: 2025-08-13 00:34:45.596 [INFO][4703] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.35.64/26 handle="k8s-pod-network.01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:45.611374 containerd[1908]: 2025-08-13 00:34:45.598 [INFO][4703] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.35.65/26] block=192.168.35.64/26 handle="k8s-pod-network.01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:45.611374 containerd[1908]: 2025-08-13 00:34:45.598 [INFO][4703] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.35.65/26] handle="k8s-pod-network.01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:45.611374 containerd[1908]: 2025-08-13 00:34:45.598 [INFO][4703] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:34:45.611374 containerd[1908]: 2025-08-13 00:34:45.598 [INFO][4703] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.35.65/26] IPv6=[] ContainerID="01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64" HandleID="k8s-pod-network.01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64" Workload="ci--4372.1.0--a--083aa5303b-k8s-whisker--584cfc84b4--46ld2-eth0" Aug 13 00:34:45.611485 containerd[1908]: 2025-08-13 00:34:45.600 [INFO][4680] cni-plugin/k8s.go 418: Populated endpoint ContainerID="01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64" Namespace="calico-system" Pod="whisker-584cfc84b4-46ld2" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-whisker--584cfc84b4--46ld2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--083aa5303b-k8s-whisker--584cfc84b4--46ld2-eth0", GenerateName:"whisker-584cfc84b4-", Namespace:"calico-system", SelfLink:"", UID:"50eb8268-c998-4f07-806d-439bfe8fc2c1", ResourceVersion:"862", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"584cfc84b4", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-083aa5303b", ContainerID:"", Pod:"whisker-584cfc84b4-46ld2", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.35.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali1a0a5678557", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:45.611485 containerd[1908]: 2025-08-13 00:34:45.600 [INFO][4680] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.65/32] ContainerID="01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64" Namespace="calico-system" Pod="whisker-584cfc84b4-46ld2" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-whisker--584cfc84b4--46ld2-eth0" Aug 13 00:34:45.611545 containerd[1908]: 2025-08-13 00:34:45.600 [INFO][4680] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1a0a5678557 ContainerID="01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64" Namespace="calico-system" Pod="whisker-584cfc84b4-46ld2" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-whisker--584cfc84b4--46ld2-eth0" Aug 13 00:34:45.611545 containerd[1908]: 2025-08-13 00:34:45.605 [INFO][4680] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64" Namespace="calico-system" Pod="whisker-584cfc84b4-46ld2" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-whisker--584cfc84b4--46ld2-eth0" Aug 13 00:34:45.611578 containerd[1908]: 2025-08-13 00:34:45.605 [INFO][4680] cni-plugin/k8s.go 446: 
Added Mac, interface name, and active container ID to endpoint ContainerID="01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64" Namespace="calico-system" Pod="whisker-584cfc84b4-46ld2" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-whisker--584cfc84b4--46ld2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--083aa5303b-k8s-whisker--584cfc84b4--46ld2-eth0", GenerateName:"whisker-584cfc84b4-", Namespace:"calico-system", SelfLink:"", UID:"50eb8268-c998-4f07-806d-439bfe8fc2c1", ResourceVersion:"862", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"584cfc84b4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-083aa5303b", ContainerID:"01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64", Pod:"whisker-584cfc84b4-46ld2", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.35.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali1a0a5678557", MAC:"5a:fe:29:63:db:46", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:45.611620 containerd[1908]: 2025-08-13 00:34:45.609 [INFO][4680] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64" 
Namespace="calico-system" Pod="whisker-584cfc84b4-46ld2" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-whisker--584cfc84b4--46ld2-eth0" Aug 13 00:34:45.619034 containerd[1908]: time="2025-08-13T00:34:45.619011457Z" level=info msg="connecting to shim 01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64" address="unix:///run/containerd/s/3ce2e59b79739358264fed89bfa961b912b91df58d267164d505be7c61f650eb" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:34:45.647635 systemd[1]: Started cri-containerd-01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64.scope - libcontainer container 01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64. Aug 13 00:34:45.733027 containerd[1908]: time="2025-08-13T00:34:45.732979673Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-584cfc84b4-46ld2,Uid:50eb8268-c998-4f07-806d-439bfe8fc2c1,Namespace:calico-system,Attempt:0,} returns sandbox id \"01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64\"" Aug 13 00:34:45.733694 containerd[1908]: time="2025-08-13T00:34:45.733680683Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\"" Aug 13 00:34:46.073297 systemd-networkd[1816]: vxlan.calico: Link UP Aug 13 00:34:46.073300 systemd-networkd[1816]: vxlan.calico: Gained carrier Aug 13 00:34:46.257924 containerd[1908]: time="2025-08-13T00:34:46.257868736Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5edb60c85d9a8922fea3e568a58e03913f9f1226b814e866ea29c21abc42097c\" id:\"a4fc1c8169355058a82cc9daf6f8296d4c463565eaccf33ad86a2633de4c3dfa\" pid:5015 exit_status:1 exited_at:{seconds:1755045286 nanos:257655272}" Aug 13 00:34:46.945538 systemd-networkd[1816]: cali1a0a5678557: Gained IPv6LL Aug 13 00:34:47.099954 containerd[1908]: time="2025-08-13T00:34:47.099900525Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:47.100150 
containerd[1908]: time="2025-08-13T00:34:47.100064911Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.2: active requests=0, bytes read=4661207" Aug 13 00:34:47.100405 containerd[1908]: time="2025-08-13T00:34:47.100362900Z" level=info msg="ImageCreate event name:\"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:47.101320 containerd[1908]: time="2025-08-13T00:34:47.101275694Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:47.102010 containerd[1908]: time="2025-08-13T00:34:47.101971407Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.2\" with image id \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\", size \"6153902\" in 1.368272732s" Aug 13 00:34:47.102010 containerd[1908]: time="2025-08-13T00:34:47.101984613Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\" returns image reference \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\"" Aug 13 00:34:47.103555 containerd[1908]: time="2025-08-13T00:34:47.103542551Z" level=info msg="CreateContainer within sandbox \"01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Aug 13 00:34:47.106126 containerd[1908]: time="2025-08-13T00:34:47.106098653Z" level=info msg="Container 49e737dd843a015bd00f02f9569dcaae2fb6eff10a61be3b6446c667966ad82c: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:47.119372 kubelet[3254]: I0813 00:34:47.119329 3254 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="eb462351-c546-4721-840a-b37dd9e5027a" path="/var/lib/kubelet/pods/eb462351-c546-4721-840a-b37dd9e5027a/volumes" Aug 13 00:34:47.125838 containerd[1908]: time="2025-08-13T00:34:47.125795586Z" level=info msg="CreateContainer within sandbox \"01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"49e737dd843a015bd00f02f9569dcaae2fb6eff10a61be3b6446c667966ad82c\"" Aug 13 00:34:47.126022 containerd[1908]: time="2025-08-13T00:34:47.125973248Z" level=info msg="StartContainer for \"49e737dd843a015bd00f02f9569dcaae2fb6eff10a61be3b6446c667966ad82c\"" Aug 13 00:34:47.126506 containerd[1908]: time="2025-08-13T00:34:47.126471530Z" level=info msg="connecting to shim 49e737dd843a015bd00f02f9569dcaae2fb6eff10a61be3b6446c667966ad82c" address="unix:///run/containerd/s/3ce2e59b79739358264fed89bfa961b912b91df58d267164d505be7c61f650eb" protocol=ttrpc version=3 Aug 13 00:34:47.150465 systemd[1]: Started cri-containerd-49e737dd843a015bd00f02f9569dcaae2fb6eff10a61be3b6446c667966ad82c.scope - libcontainer container 49e737dd843a015bd00f02f9569dcaae2fb6eff10a61be3b6446c667966ad82c. Aug 13 00:34:47.187297 containerd[1908]: time="2025-08-13T00:34:47.187275186Z" level=info msg="StartContainer for \"49e737dd843a015bd00f02f9569dcaae2fb6eff10a61be3b6446c667966ad82c\" returns successfully" Aug 13 00:34:47.187857 containerd[1908]: time="2025-08-13T00:34:47.187840328Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\"" Aug 13 00:34:48.097484 systemd-networkd[1816]: vxlan.calico: Gained IPv6LL Aug 13 00:34:49.149390 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2803688621.mount: Deactivated successfully. 
Aug 13 00:34:49.154279 containerd[1908]: time="2025-08-13T00:34:49.154226495Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:49.154568 containerd[1908]: time="2025-08-13T00:34:49.154518806Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.2: active requests=0, bytes read=33083477" Aug 13 00:34:49.154982 containerd[1908]: time="2025-08-13T00:34:49.154943328Z" level=info msg="ImageCreate event name:\"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:49.155877 containerd[1908]: time="2025-08-13T00:34:49.155837134Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:49.156242 containerd[1908]: time="2025-08-13T00:34:49.156225101Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" with image id \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\", size \"33083307\" in 1.968363748s" Aug 13 00:34:49.156269 containerd[1908]: time="2025-08-13T00:34:49.156240861Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" returns image reference \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\"" Aug 13 00:34:49.157634 containerd[1908]: time="2025-08-13T00:34:49.157593034Z" level=info msg="CreateContainer within sandbox \"01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Aug 13 00:34:49.160188 
containerd[1908]: time="2025-08-13T00:34:49.160146562Z" level=info msg="Container 001d5e5e33c33470603b63af001e4f27f73f037198594c9ab26a48c25f4c8d31: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:49.163016 containerd[1908]: time="2025-08-13T00:34:49.162974210Z" level=info msg="CreateContainer within sandbox \"01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"001d5e5e33c33470603b63af001e4f27f73f037198594c9ab26a48c25f4c8d31\"" Aug 13 00:34:49.163286 containerd[1908]: time="2025-08-13T00:34:49.163231461Z" level=info msg="StartContainer for \"001d5e5e33c33470603b63af001e4f27f73f037198594c9ab26a48c25f4c8d31\"" Aug 13 00:34:49.163768 containerd[1908]: time="2025-08-13T00:34:49.163728233Z" level=info msg="connecting to shim 001d5e5e33c33470603b63af001e4f27f73f037198594c9ab26a48c25f4c8d31" address="unix:///run/containerd/s/3ce2e59b79739358264fed89bfa961b912b91df58d267164d505be7c61f650eb" protocol=ttrpc version=3 Aug 13 00:34:49.181371 systemd[1]: Started cri-containerd-001d5e5e33c33470603b63af001e4f27f73f037198594c9ab26a48c25f4c8d31.scope - libcontainer container 001d5e5e33c33470603b63af001e4f27f73f037198594c9ab26a48c25f4c8d31. 
Aug 13 00:34:49.212668 containerd[1908]: time="2025-08-13T00:34:49.212645948Z" level=info msg="StartContainer for \"001d5e5e33c33470603b63af001e4f27f73f037198594c9ab26a48c25f4c8d31\" returns successfully" Aug 13 00:34:49.227896 kubelet[3254]: I0813 00:34:49.227864 3254 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-584cfc84b4-46ld2" podStartSLOduration=0.804822824 podStartE2EDuration="4.227854921s" podCreationTimestamp="2025-08-13 00:34:45 +0000 UTC" firstStartedPulling="2025-08-13 00:34:45.733560879 +0000 UTC m=+28.661901053" lastFinishedPulling="2025-08-13 00:34:49.156592976 +0000 UTC m=+32.084933150" observedRunningTime="2025-08-13 00:34:49.227490315 +0000 UTC m=+32.155830493" watchObservedRunningTime="2025-08-13 00:34:49.227854921 +0000 UTC m=+32.156195095" Aug 13 00:34:53.120155 containerd[1908]: time="2025-08-13T00:34:53.120047395Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-768vv,Uid:baea1b6a-90df-43be-a718-a692f968b764,Namespace:calico-system,Attempt:0,}" Aug 13 00:34:53.121047 containerd[1908]: time="2025-08-13T00:34:53.120229475Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-rd2g6,Uid:dda6ea77-d0ff-457f-8498-7faa0278932a,Namespace:calico-system,Attempt:0,}" Aug 13 00:34:53.121047 containerd[1908]: time="2025-08-13T00:34:53.120504882Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-fg4zz,Uid:5edf5ee0-2bb3-4fc8-a5b6-81be33ed6600,Namespace:kube-system,Attempt:0,}" Aug 13 00:34:53.169590 systemd-networkd[1816]: calicc562f40c3c: Link UP Aug 13 00:34:53.169726 systemd-networkd[1816]: calicc562f40c3c: Gained carrier Aug 13 00:34:53.175204 containerd[1908]: 2025-08-13 00:34:53.139 [INFO][5189] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--a--083aa5303b-k8s-csi--node--driver--768vv-eth0 csi-node-driver- calico-system 
baea1b6a-90df-43be-a718-a692f968b764 688 0 2025-08-13 00:34:33 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:8967bcb6f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4372.1.0-a-083aa5303b csi-node-driver-768vv eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calicc562f40c3c [] [] }} ContainerID="3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716" Namespace="calico-system" Pod="csi-node-driver-768vv" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-csi--node--driver--768vv-" Aug 13 00:34:53.175204 containerd[1908]: 2025-08-13 00:34:53.139 [INFO][5189] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716" Namespace="calico-system" Pod="csi-node-driver-768vv" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-csi--node--driver--768vv-eth0" Aug 13 00:34:53.175204 containerd[1908]: 2025-08-13 00:34:53.152 [INFO][5253] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716" HandleID="k8s-pod-network.3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716" Workload="ci--4372.1.0--a--083aa5303b-k8s-csi--node--driver--768vv-eth0" Aug 13 00:34:53.175324 containerd[1908]: 2025-08-13 00:34:53.152 [INFO][5253] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716" HandleID="k8s-pod-network.3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716" Workload="ci--4372.1.0--a--083aa5303b-k8s-csi--node--driver--768vv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000799e60), Attrs:map[string]string{"namespace":"calico-system", 
"node":"ci-4372.1.0-a-083aa5303b", "pod":"csi-node-driver-768vv", "timestamp":"2025-08-13 00:34:53.152773767 +0000 UTC"}, Hostname:"ci-4372.1.0-a-083aa5303b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:34:53.175324 containerd[1908]: 2025-08-13 00:34:53.152 [INFO][5253] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:34:53.175324 containerd[1908]: 2025-08-13 00:34:53.152 [INFO][5253] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:34:53.175324 containerd[1908]: 2025-08-13 00:34:53.152 [INFO][5253] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-a-083aa5303b' Aug 13 00:34:53.175324 containerd[1908]: 2025-08-13 00:34:53.156 [INFO][5253] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:53.175324 containerd[1908]: 2025-08-13 00:34:53.159 [INFO][5253] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:53.175324 containerd[1908]: 2025-08-13 00:34:53.160 [INFO][5253] ipam/ipam.go 511: Trying affinity for 192.168.35.64/26 host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:53.175324 containerd[1908]: 2025-08-13 00:34:53.161 [INFO][5253] ipam/ipam.go 158: Attempting to load block cidr=192.168.35.64/26 host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:53.175324 containerd[1908]: 2025-08-13 00:34:53.162 [INFO][5253] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.35.64/26 host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:53.175484 containerd[1908]: 2025-08-13 00:34:53.162 [INFO][5253] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.35.64/26 
handle="k8s-pod-network.3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:53.175484 containerd[1908]: 2025-08-13 00:34:53.163 [INFO][5253] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716 Aug 13 00:34:53.175484 containerd[1908]: 2025-08-13 00:34:53.165 [INFO][5253] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.35.64/26 handle="k8s-pod-network.3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:53.175484 containerd[1908]: 2025-08-13 00:34:53.167 [INFO][5253] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.35.66/26] block=192.168.35.64/26 handle="k8s-pod-network.3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:53.175484 containerd[1908]: 2025-08-13 00:34:53.167 [INFO][5253] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.35.66/26] handle="k8s-pod-network.3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:53.175484 containerd[1908]: 2025-08-13 00:34:53.167 [INFO][5253] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 13 00:34:53.175484 containerd[1908]: 2025-08-13 00:34:53.167 [INFO][5253] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.35.66/26] IPv6=[] ContainerID="3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716" HandleID="k8s-pod-network.3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716" Workload="ci--4372.1.0--a--083aa5303b-k8s-csi--node--driver--768vv-eth0" Aug 13 00:34:53.175627 containerd[1908]: 2025-08-13 00:34:53.168 [INFO][5189] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716" Namespace="calico-system" Pod="csi-node-driver-768vv" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-csi--node--driver--768vv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--083aa5303b-k8s-csi--node--driver--768vv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"baea1b6a-90df-43be-a718-a692f968b764", ResourceVersion:"688", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-083aa5303b", ContainerID:"", Pod:"csi-node-driver-768vv", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.35.66/32"}, IPNATs:[]v3.IPNAT(nil), 
IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calicc562f40c3c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:53.175666 containerd[1908]: 2025-08-13 00:34:53.168 [INFO][5189] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.66/32] ContainerID="3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716" Namespace="calico-system" Pod="csi-node-driver-768vv" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-csi--node--driver--768vv-eth0" Aug 13 00:34:53.175666 containerd[1908]: 2025-08-13 00:34:53.168 [INFO][5189] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicc562f40c3c ContainerID="3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716" Namespace="calico-system" Pod="csi-node-driver-768vv" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-csi--node--driver--768vv-eth0" Aug 13 00:34:53.175666 containerd[1908]: 2025-08-13 00:34:53.169 [INFO][5189] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716" Namespace="calico-system" Pod="csi-node-driver-768vv" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-csi--node--driver--768vv-eth0" Aug 13 00:34:53.175714 containerd[1908]: 2025-08-13 00:34:53.170 [INFO][5189] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716" Namespace="calico-system" Pod="csi-node-driver-768vv" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-csi--node--driver--768vv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--083aa5303b-k8s-csi--node--driver--768vv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", 
SelfLink:"", UID:"baea1b6a-90df-43be-a718-a692f968b764", ResourceVersion:"688", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-083aa5303b", ContainerID:"3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716", Pod:"csi-node-driver-768vv", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.35.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calicc562f40c3c", MAC:"b2:a8:ae:24:0c:41", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:53.175750 containerd[1908]: 2025-08-13 00:34:53.174 [INFO][5189] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716" Namespace="calico-system" Pod="csi-node-driver-768vv" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-csi--node--driver--768vv-eth0" Aug 13 00:34:53.182837 containerd[1908]: time="2025-08-13T00:34:53.182782276Z" level=info msg="connecting to shim 3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716" address="unix:///run/containerd/s/1f8d31810d5749fcaba22db772beadfeb79ecaa2ecd0dc033106108745a8f81f" namespace=k8s.io protocol=ttrpc 
version=3 Aug 13 00:34:53.208418 systemd[1]: Started cri-containerd-3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716.scope - libcontainer container 3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716. Aug 13 00:34:53.219778 containerd[1908]: time="2025-08-13T00:34:53.219760912Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-768vv,Uid:baea1b6a-90df-43be-a718-a692f968b764,Namespace:calico-system,Attempt:0,} returns sandbox id \"3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716\"" Aug 13 00:34:53.220458 containerd[1908]: time="2025-08-13T00:34:53.220420205Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\"" Aug 13 00:34:53.285424 systemd-networkd[1816]: calif9aab2150c3: Link UP Aug 13 00:34:53.286321 systemd-networkd[1816]: calif9aab2150c3: Gained carrier Aug 13 00:34:53.308026 containerd[1908]: 2025-08-13 00:34:53.140 [INFO][5193] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--a--083aa5303b-k8s-goldmane--768f4c5c69--rd2g6-eth0 goldmane-768f4c5c69- calico-system dda6ea77-d0ff-457f-8498-7faa0278932a 798 0 2025-08-13 00:34:32 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:768f4c5c69 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4372.1.0-a-083aa5303b goldmane-768f4c5c69-rd2g6 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calif9aab2150c3 [] [] }} ContainerID="8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320" Namespace="calico-system" Pod="goldmane-768f4c5c69-rd2g6" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-goldmane--768f4c5c69--rd2g6-" Aug 13 00:34:53.308026 containerd[1908]: 2025-08-13 00:34:53.140 [INFO][5193] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320" Namespace="calico-system" Pod="goldmane-768f4c5c69-rd2g6" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-goldmane--768f4c5c69--rd2g6-eth0" Aug 13 00:34:53.308026 containerd[1908]: 2025-08-13 00:34:53.153 [INFO][5259] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320" HandleID="k8s-pod-network.8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320" Workload="ci--4372.1.0--a--083aa5303b-k8s-goldmane--768f4c5c69--rd2g6-eth0" Aug 13 00:34:53.308675 containerd[1908]: 2025-08-13 00:34:53.153 [INFO][5259] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320" HandleID="k8s-pod-network.8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320" Workload="ci--4372.1.0--a--083aa5303b-k8s-goldmane--768f4c5c69--rd2g6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00026f730), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372.1.0-a-083aa5303b", "pod":"goldmane-768f4c5c69-rd2g6", "timestamp":"2025-08-13 00:34:53.153060588 +0000 UTC"}, Hostname:"ci-4372.1.0-a-083aa5303b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:34:53.308675 containerd[1908]: 2025-08-13 00:34:53.153 [INFO][5259] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:34:53.308675 containerd[1908]: 2025-08-13 00:34:53.167 [INFO][5259] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 00:34:53.308675 containerd[1908]: 2025-08-13 00:34:53.167 [INFO][5259] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-a-083aa5303b' Aug 13 00:34:53.308675 containerd[1908]: 2025-08-13 00:34:53.258 [INFO][5259] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:53.308675 containerd[1908]: 2025-08-13 00:34:53.261 [INFO][5259] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:53.308675 containerd[1908]: 2025-08-13 00:34:53.264 [INFO][5259] ipam/ipam.go 511: Trying affinity for 192.168.35.64/26 host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:53.308675 containerd[1908]: 2025-08-13 00:34:53.266 [INFO][5259] ipam/ipam.go 158: Attempting to load block cidr=192.168.35.64/26 host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:53.308675 containerd[1908]: 2025-08-13 00:34:53.268 [INFO][5259] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.35.64/26 host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:53.309650 containerd[1908]: 2025-08-13 00:34:53.268 [INFO][5259] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.35.64/26 handle="k8s-pod-network.8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:53.309650 containerd[1908]: 2025-08-13 00:34:53.269 [INFO][5259] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320 Aug 13 00:34:53.309650 containerd[1908]: 2025-08-13 00:34:53.273 [INFO][5259] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.35.64/26 handle="k8s-pod-network.8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:53.309650 containerd[1908]: 2025-08-13 00:34:53.278 [INFO][5259] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.35.67/26] block=192.168.35.64/26 handle="k8s-pod-network.8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:53.309650 containerd[1908]: 2025-08-13 00:34:53.279 [INFO][5259] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.35.67/26] handle="k8s-pod-network.8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:53.309650 containerd[1908]: 2025-08-13 00:34:53.279 [INFO][5259] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:34:53.309650 containerd[1908]: 2025-08-13 00:34:53.279 [INFO][5259] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.35.67/26] IPv6=[] ContainerID="8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320" HandleID="k8s-pod-network.8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320" Workload="ci--4372.1.0--a--083aa5303b-k8s-goldmane--768f4c5c69--rd2g6-eth0" Aug 13 00:34:53.310105 containerd[1908]: 2025-08-13 00:34:53.281 [INFO][5193] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320" Namespace="calico-system" Pod="goldmane-768f4c5c69-rd2g6" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-goldmane--768f4c5c69--rd2g6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--083aa5303b-k8s-goldmane--768f4c5c69--rd2g6-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"dda6ea77-d0ff-457f-8498-7faa0278932a", ResourceVersion:"798", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-083aa5303b", ContainerID:"", Pod:"goldmane-768f4c5c69-rd2g6", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.35.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calif9aab2150c3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:53.310105 containerd[1908]: 2025-08-13 00:34:53.281 [INFO][5193] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.67/32] ContainerID="8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320" Namespace="calico-system" Pod="goldmane-768f4c5c69-rd2g6" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-goldmane--768f4c5c69--rd2g6-eth0" Aug 13 00:34:53.310412 containerd[1908]: 2025-08-13 00:34:53.282 [INFO][5193] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif9aab2150c3 ContainerID="8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320" Namespace="calico-system" Pod="goldmane-768f4c5c69-rd2g6" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-goldmane--768f4c5c69--rd2g6-eth0" Aug 13 00:34:53.310412 containerd[1908]: 2025-08-13 00:34:53.286 [INFO][5193] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320" Namespace="calico-system" Pod="goldmane-768f4c5c69-rd2g6" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-goldmane--768f4c5c69--rd2g6-eth0" Aug 13 00:34:53.310552 containerd[1908]: 2025-08-13 00:34:53.289 [INFO][5193] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320" Namespace="calico-system" Pod="goldmane-768f4c5c69-rd2g6" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-goldmane--768f4c5c69--rd2g6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--083aa5303b-k8s-goldmane--768f4c5c69--rd2g6-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"dda6ea77-d0ff-457f-8498-7faa0278932a", ResourceVersion:"798", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-083aa5303b", ContainerID:"8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320", Pod:"goldmane-768f4c5c69-rd2g6", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.35.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calif9aab2150c3", MAC:"2a:12:e4:98:07:c5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:53.310717 containerd[1908]: 2025-08-13 00:34:53.304 [INFO][5193] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320" Namespace="calico-system" Pod="goldmane-768f4c5c69-rd2g6" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-goldmane--768f4c5c69--rd2g6-eth0" Aug 13 00:34:53.319169 containerd[1908]: time="2025-08-13T00:34:53.319116090Z" level=info msg="connecting to shim 8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320" address="unix:///run/containerd/s/ce4ca70896ced164726c4c4da1e946e0346e6c6b9b70356db7f6928a2f677ff2" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:34:53.335366 systemd[1]: Started cri-containerd-8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320.scope - libcontainer container 8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320. Aug 13 00:34:53.361027 containerd[1908]: time="2025-08-13T00:34:53.361005415Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-rd2g6,Uid:dda6ea77-d0ff-457f-8498-7faa0278932a,Namespace:calico-system,Attempt:0,} returns sandbox id \"8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320\"" Aug 13 00:34:53.373140 systemd-networkd[1816]: cali5afaad16572: Link UP Aug 13 00:34:53.373330 systemd-networkd[1816]: cali5afaad16572: Gained carrier Aug 13 00:34:53.378843 containerd[1908]: 2025-08-13 00:34:53.140 [INFO][5205] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--fg4zz-eth0 coredns-674b8bbfcf- kube-system 5edf5ee0-2bb3-4fc8-a5b6-81be33ed6600 793 0 2025-08-13 00:34:23 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4372.1.0-a-083aa5303b coredns-674b8bbfcf-fg4zz eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali5afaad16572 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} 
ContainerID="8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4" Namespace="kube-system" Pod="coredns-674b8bbfcf-fg4zz" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--fg4zz-" Aug 13 00:34:53.378843 containerd[1908]: 2025-08-13 00:34:53.140 [INFO][5205] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4" Namespace="kube-system" Pod="coredns-674b8bbfcf-fg4zz" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--fg4zz-eth0" Aug 13 00:34:53.378843 containerd[1908]: 2025-08-13 00:34:53.152 [INFO][5261] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4" HandleID="k8s-pod-network.8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4" Workload="ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--fg4zz-eth0" Aug 13 00:34:53.378967 containerd[1908]: 2025-08-13 00:34:53.152 [INFO][5261] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4" HandleID="k8s-pod-network.8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4" Workload="ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--fg4zz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f730), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4372.1.0-a-083aa5303b", "pod":"coredns-674b8bbfcf-fg4zz", "timestamp":"2025-08-13 00:34:53.152767955 +0000 UTC"}, Hostname:"ci-4372.1.0-a-083aa5303b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:34:53.378967 containerd[1908]: 2025-08-13 00:34:53.153 [INFO][5261] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Aug 13 00:34:53.378967 containerd[1908]: 2025-08-13 00:34:53.279 [INFO][5261] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:34:53.378967 containerd[1908]: 2025-08-13 00:34:53.279 [INFO][5261] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-a-083aa5303b' Aug 13 00:34:53.378967 containerd[1908]: 2025-08-13 00:34:53.358 [INFO][5261] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:53.378967 containerd[1908]: 2025-08-13 00:34:53.361 [INFO][5261] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:53.378967 containerd[1908]: 2025-08-13 00:34:53.364 [INFO][5261] ipam/ipam.go 511: Trying affinity for 192.168.35.64/26 host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:53.378967 containerd[1908]: 2025-08-13 00:34:53.365 [INFO][5261] ipam/ipam.go 158: Attempting to load block cidr=192.168.35.64/26 host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:53.378967 containerd[1908]: 2025-08-13 00:34:53.366 [INFO][5261] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.35.64/26 host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:53.379134 containerd[1908]: 2025-08-13 00:34:53.366 [INFO][5261] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.35.64/26 handle="k8s-pod-network.8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:53.379134 containerd[1908]: 2025-08-13 00:34:53.367 [INFO][5261] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4 Aug 13 00:34:53.379134 containerd[1908]: 2025-08-13 00:34:53.368 [INFO][5261] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.35.64/26 handle="k8s-pod-network.8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4" 
host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:53.379134 containerd[1908]: 2025-08-13 00:34:53.371 [INFO][5261] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.35.68/26] block=192.168.35.64/26 handle="k8s-pod-network.8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:53.379134 containerd[1908]: 2025-08-13 00:34:53.371 [INFO][5261] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.35.68/26] handle="k8s-pod-network.8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:53.379134 containerd[1908]: 2025-08-13 00:34:53.371 [INFO][5261] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:34:53.379134 containerd[1908]: 2025-08-13 00:34:53.371 [INFO][5261] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.35.68/26] IPv6=[] ContainerID="8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4" HandleID="k8s-pod-network.8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4" Workload="ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--fg4zz-eth0" Aug 13 00:34:53.379249 containerd[1908]: 2025-08-13 00:34:53.372 [INFO][5205] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4" Namespace="kube-system" Pod="coredns-674b8bbfcf-fg4zz" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--fg4zz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--fg4zz-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"5edf5ee0-2bb3-4fc8-a5b6-81be33ed6600", ResourceVersion:"793", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-083aa5303b", ContainerID:"", Pod:"coredns-674b8bbfcf-fg4zz", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.35.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5afaad16572", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:53.379249 containerd[1908]: 2025-08-13 00:34:53.372 [INFO][5205] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.68/32] ContainerID="8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4" Namespace="kube-system" Pod="coredns-674b8bbfcf-fg4zz" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--fg4zz-eth0" Aug 13 00:34:53.379249 containerd[1908]: 2025-08-13 00:34:53.372 [INFO][5205] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5afaad16572 ContainerID="8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4" Namespace="kube-system" Pod="coredns-674b8bbfcf-fg4zz" 
WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--fg4zz-eth0" Aug 13 00:34:53.379249 containerd[1908]: 2025-08-13 00:34:53.373 [INFO][5205] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4" Namespace="kube-system" Pod="coredns-674b8bbfcf-fg4zz" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--fg4zz-eth0" Aug 13 00:34:53.379249 containerd[1908]: 2025-08-13 00:34:53.374 [INFO][5205] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4" Namespace="kube-system" Pod="coredns-674b8bbfcf-fg4zz" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--fg4zz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--fg4zz-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"5edf5ee0-2bb3-4fc8-a5b6-81be33ed6600", ResourceVersion:"793", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-083aa5303b", ContainerID:"8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4", Pod:"coredns-674b8bbfcf-fg4zz", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.35.68/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5afaad16572", MAC:"0a:ce:ce:39:8d:06", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:53.379249 containerd[1908]: 2025-08-13 00:34:53.377 [INFO][5205] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4" Namespace="kube-system" Pod="coredns-674b8bbfcf-fg4zz" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--fg4zz-eth0" Aug 13 00:34:53.393957 containerd[1908]: time="2025-08-13T00:34:53.393905690Z" level=info msg="connecting to shim 8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4" address="unix:///run/containerd/s/64ef22b1ab96eeaac718ce56298a45a66d32da4552392d96b9847e9af559a905" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:34:53.418404 systemd[1]: Started cri-containerd-8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4.scope - libcontainer container 8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4. 
Aug 13 00:34:53.460979 containerd[1908]: time="2025-08-13T00:34:53.460958469Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-fg4zz,Uid:5edf5ee0-2bb3-4fc8-a5b6-81be33ed6600,Namespace:kube-system,Attempt:0,} returns sandbox id \"8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4\"" Aug 13 00:34:53.462630 containerd[1908]: time="2025-08-13T00:34:53.462617672Z" level=info msg="CreateContainer within sandbox \"8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 13 00:34:53.465773 containerd[1908]: time="2025-08-13T00:34:53.465737536Z" level=info msg="Container c1554f1be4e2c9638ceae8928fdfe6832e2f75516af2027302e99b6078772015: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:53.467832 containerd[1908]: time="2025-08-13T00:34:53.467792375Z" level=info msg="CreateContainer within sandbox \"8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"c1554f1be4e2c9638ceae8928fdfe6832e2f75516af2027302e99b6078772015\"" Aug 13 00:34:53.468068 containerd[1908]: time="2025-08-13T00:34:53.468022971Z" level=info msg="StartContainer for \"c1554f1be4e2c9638ceae8928fdfe6832e2f75516af2027302e99b6078772015\"" Aug 13 00:34:53.468454 containerd[1908]: time="2025-08-13T00:34:53.468420169Z" level=info msg="connecting to shim c1554f1be4e2c9638ceae8928fdfe6832e2f75516af2027302e99b6078772015" address="unix:///run/containerd/s/64ef22b1ab96eeaac718ce56298a45a66d32da4552392d96b9847e9af559a905" protocol=ttrpc version=3 Aug 13 00:34:53.486358 systemd[1]: Started cri-containerd-c1554f1be4e2c9638ceae8928fdfe6832e2f75516af2027302e99b6078772015.scope - libcontainer container c1554f1be4e2c9638ceae8928fdfe6832e2f75516af2027302e99b6078772015. 
Aug 13 00:34:53.499636 containerd[1908]: time="2025-08-13T00:34:53.499610717Z" level=info msg="StartContainer for \"c1554f1be4e2c9638ceae8928fdfe6832e2f75516af2027302e99b6078772015\" returns successfully" Aug 13 00:34:54.240930 kubelet[3254]: I0813 00:34:54.240895 3254 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-fg4zz" podStartSLOduration=31.240884743 podStartE2EDuration="31.240884743s" podCreationTimestamp="2025-08-13 00:34:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:34:54.240626847 +0000 UTC m=+37.168967029" watchObservedRunningTime="2025-08-13 00:34:54.240884743 +0000 UTC m=+37.169224919" Aug 13 00:34:54.625244 systemd-networkd[1816]: cali5afaad16572: Gained IPv6LL Aug 13 00:34:54.666286 containerd[1908]: time="2025-08-13T00:34:54.666259062Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:54.666544 containerd[1908]: time="2025-08-13T00:34:54.666386120Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.2: active requests=0, bytes read=8759190" Aug 13 00:34:54.666777 containerd[1908]: time="2025-08-13T00:34:54.666764707Z" level=info msg="ImageCreate event name:\"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:54.667580 containerd[1908]: time="2025-08-13T00:34:54.667567531Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:54.667939 containerd[1908]: time="2025-08-13T00:34:54.667927930Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.2\" with image id 
\"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\", size \"10251893\" in 1.44749028s" Aug 13 00:34:54.667961 containerd[1908]: time="2025-08-13T00:34:54.667941798Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\" returns image reference \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\"" Aug 13 00:34:54.668587 containerd[1908]: time="2025-08-13T00:34:54.668574264Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\"" Aug 13 00:34:54.669546 containerd[1908]: time="2025-08-13T00:34:54.669514736Z" level=info msg="CreateContainer within sandbox \"3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Aug 13 00:34:54.673203 containerd[1908]: time="2025-08-13T00:34:54.673167368Z" level=info msg="Container 6bce9b05590ef8329f6a718ff8d607a5de3a94617cdd442213f4d674eb9e049f: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:54.676782 containerd[1908]: time="2025-08-13T00:34:54.676730699Z" level=info msg="CreateContainer within sandbox \"3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"6bce9b05590ef8329f6a718ff8d607a5de3a94617cdd442213f4d674eb9e049f\"" Aug 13 00:34:54.676999 containerd[1908]: time="2025-08-13T00:34:54.676943562Z" level=info msg="StartContainer for \"6bce9b05590ef8329f6a718ff8d607a5de3a94617cdd442213f4d674eb9e049f\"" Aug 13 00:34:54.677814 containerd[1908]: time="2025-08-13T00:34:54.677773066Z" level=info msg="connecting to shim 6bce9b05590ef8329f6a718ff8d607a5de3a94617cdd442213f4d674eb9e049f" address="unix:///run/containerd/s/1f8d31810d5749fcaba22db772beadfeb79ecaa2ecd0dc033106108745a8f81f" protocol=ttrpc version=3 Aug 13 00:34:54.696585 
systemd[1]: Started cri-containerd-6bce9b05590ef8329f6a718ff8d607a5de3a94617cdd442213f4d674eb9e049f.scope - libcontainer container 6bce9b05590ef8329f6a718ff8d607a5de3a94617cdd442213f4d674eb9e049f. Aug 13 00:34:54.756207 containerd[1908]: time="2025-08-13T00:34:54.756157989Z" level=info msg="StartContainer for \"6bce9b05590ef8329f6a718ff8d607a5de3a94617cdd442213f4d674eb9e049f\" returns successfully" Aug 13 00:34:55.073628 systemd-networkd[1816]: calicc562f40c3c: Gained IPv6LL Aug 13 00:34:55.119244 containerd[1908]: time="2025-08-13T00:34:55.119205302Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-856b9ff84f-hvjbp,Uid:6ac70178-1af5-4be2-929d-3b8bf392fd61,Namespace:calico-apiserver,Attempt:0,}" Aug 13 00:34:55.119363 containerd[1908]: time="2025-08-13T00:34:55.119291142Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-85df7c9ff6-mpjsz,Uid:f0f2e0f9-90cf-41db-b550-f165e6260ca8,Namespace:calico-system,Attempt:0,}" Aug 13 00:34:55.119363 containerd[1908]: time="2025-08-13T00:34:55.119220301Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-98gch,Uid:a319200b-9c2f-4e26-983f-0ba93ef3f84c,Namespace:kube-system,Attempt:0,}" Aug 13 00:34:55.177080 systemd-networkd[1816]: cali8a75be5c54a: Link UP Aug 13 00:34:55.177237 systemd-networkd[1816]: cali8a75be5c54a: Gained carrier Aug 13 00:34:55.182706 containerd[1908]: 2025-08-13 00:34:55.141 [INFO][5570] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--98gch-eth0 coredns-674b8bbfcf- kube-system a319200b-9c2f-4e26-983f-0ba93ef3f84c 794 0 2025-08-13 00:34:23 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4372.1.0-a-083aa5303b coredns-674b8bbfcf-98gch eth0 coredns [] [] 
[kns.kube-system ksa.kube-system.coredns] cali8a75be5c54a [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909" Namespace="kube-system" Pod="coredns-674b8bbfcf-98gch" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--98gch-" Aug 13 00:34:55.182706 containerd[1908]: 2025-08-13 00:34:55.141 [INFO][5570] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909" Namespace="kube-system" Pod="coredns-674b8bbfcf-98gch" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--98gch-eth0" Aug 13 00:34:55.182706 containerd[1908]: 2025-08-13 00:34:55.154 [INFO][5627] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909" HandleID="k8s-pod-network.c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909" Workload="ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--98gch-eth0" Aug 13 00:34:55.182706 containerd[1908]: 2025-08-13 00:34:55.154 [INFO][5627] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909" HandleID="k8s-pod-network.c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909" Workload="ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--98gch-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002e7910), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4372.1.0-a-083aa5303b", "pod":"coredns-674b8bbfcf-98gch", "timestamp":"2025-08-13 00:34:55.154334696 +0000 UTC"}, Hostname:"ci-4372.1.0-a-083aa5303b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:34:55.182706 containerd[1908]: 2025-08-13 
00:34:55.154 [INFO][5627] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:34:55.182706 containerd[1908]: 2025-08-13 00:34:55.154 [INFO][5627] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:34:55.182706 containerd[1908]: 2025-08-13 00:34:55.154 [INFO][5627] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-a-083aa5303b' Aug 13 00:34:55.182706 containerd[1908]: 2025-08-13 00:34:55.159 [INFO][5627] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:55.182706 containerd[1908]: 2025-08-13 00:34:55.162 [INFO][5627] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:55.182706 containerd[1908]: 2025-08-13 00:34:55.165 [INFO][5627] ipam/ipam.go 511: Trying affinity for 192.168.35.64/26 host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:55.182706 containerd[1908]: 2025-08-13 00:34:55.167 [INFO][5627] ipam/ipam.go 158: Attempting to load block cidr=192.168.35.64/26 host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:55.182706 containerd[1908]: 2025-08-13 00:34:55.168 [INFO][5627] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.35.64/26 host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:55.182706 containerd[1908]: 2025-08-13 00:34:55.168 [INFO][5627] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.35.64/26 handle="k8s-pod-network.c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:55.182706 containerd[1908]: 2025-08-13 00:34:55.169 [INFO][5627] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909 Aug 13 00:34:55.182706 containerd[1908]: 2025-08-13 00:34:55.172 [INFO][5627] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.35.64/26 
handle="k8s-pod-network.c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:55.182706 containerd[1908]: 2025-08-13 00:34:55.175 [INFO][5627] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.35.69/26] block=192.168.35.64/26 handle="k8s-pod-network.c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:55.182706 containerd[1908]: 2025-08-13 00:34:55.175 [INFO][5627] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.35.69/26] handle="k8s-pod-network.c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:55.182706 containerd[1908]: 2025-08-13 00:34:55.175 [INFO][5627] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:34:55.182706 containerd[1908]: 2025-08-13 00:34:55.175 [INFO][5627] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.35.69/26] IPv6=[] ContainerID="c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909" HandleID="k8s-pod-network.c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909" Workload="ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--98gch-eth0" Aug 13 00:34:55.183118 containerd[1908]: 2025-08-13 00:34:55.176 [INFO][5570] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909" Namespace="kube-system" Pod="coredns-674b8bbfcf-98gch" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--98gch-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--98gch-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"a319200b-9c2f-4e26-983f-0ba93ef3f84c", ResourceVersion:"794", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 23, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-083aa5303b", ContainerID:"", Pod:"coredns-674b8bbfcf-98gch", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.35.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8a75be5c54a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:55.183118 containerd[1908]: 2025-08-13 00:34:55.176 [INFO][5570] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.69/32] ContainerID="c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909" Namespace="kube-system" Pod="coredns-674b8bbfcf-98gch" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--98gch-eth0" Aug 13 00:34:55.183118 containerd[1908]: 2025-08-13 00:34:55.176 [INFO][5570] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8a75be5c54a ContainerID="c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-98gch" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--98gch-eth0" Aug 13 00:34:55.183118 containerd[1908]: 2025-08-13 00:34:55.177 [INFO][5570] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909" Namespace="kube-system" Pod="coredns-674b8bbfcf-98gch" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--98gch-eth0" Aug 13 00:34:55.183118 containerd[1908]: 2025-08-13 00:34:55.177 [INFO][5570] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909" Namespace="kube-system" Pod="coredns-674b8bbfcf-98gch" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--98gch-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--98gch-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"a319200b-9c2f-4e26-983f-0ba93ef3f84c", ResourceVersion:"794", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-083aa5303b", ContainerID:"c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909", Pod:"coredns-674b8bbfcf-98gch", Endpoint:"eth0", ServiceAccountName:"coredns", 
IPNetworks:[]string{"192.168.35.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8a75be5c54a", MAC:"fe:7a:6c:c6:69:85", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:55.183118 containerd[1908]: 2025-08-13 00:34:55.181 [INFO][5570] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909" Namespace="kube-system" Pod="coredns-674b8bbfcf-98gch" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-coredns--674b8bbfcf--98gch-eth0" Aug 13 00:34:55.190799 containerd[1908]: time="2025-08-13T00:34:55.190769476Z" level=info msg="connecting to shim c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909" address="unix:///run/containerd/s/193afd48f23ff130bee99ed990c184dfa7669a9a8ef27268c26af7db656789be" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:34:55.201334 systemd-networkd[1816]: calif9aab2150c3: Gained IPv6LL Aug 13 00:34:55.210323 systemd[1]: Started cri-containerd-c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909.scope - libcontainer container c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909. 
Aug 13 00:34:55.243331 containerd[1908]: time="2025-08-13T00:34:55.243309779Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-98gch,Uid:a319200b-9c2f-4e26-983f-0ba93ef3f84c,Namespace:kube-system,Attempt:0,} returns sandbox id \"c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909\"" Aug 13 00:34:55.245112 containerd[1908]: time="2025-08-13T00:34:55.245100046Z" level=info msg="CreateContainer within sandbox \"c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 13 00:34:55.248003 containerd[1908]: time="2025-08-13T00:34:55.247961943Z" level=info msg="Container 08d0e36d4a22f1960b42bc5f8ee4ff13a7a751d89be499d6508df3916231eef6: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:55.250335 containerd[1908]: time="2025-08-13T00:34:55.250293088Z" level=info msg="CreateContainer within sandbox \"c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"08d0e36d4a22f1960b42bc5f8ee4ff13a7a751d89be499d6508df3916231eef6\"" Aug 13 00:34:55.250487 containerd[1908]: time="2025-08-13T00:34:55.250473386Z" level=info msg="StartContainer for \"08d0e36d4a22f1960b42bc5f8ee4ff13a7a751d89be499d6508df3916231eef6\"" Aug 13 00:34:55.250888 containerd[1908]: time="2025-08-13T00:34:55.250878161Z" level=info msg="connecting to shim 08d0e36d4a22f1960b42bc5f8ee4ff13a7a751d89be499d6508df3916231eef6" address="unix:///run/containerd/s/193afd48f23ff130bee99ed990c184dfa7669a9a8ef27268c26af7db656789be" protocol=ttrpc version=3 Aug 13 00:34:55.266285 systemd[1]: Started cri-containerd-08d0e36d4a22f1960b42bc5f8ee4ff13a7a751d89be499d6508df3916231eef6.scope - libcontainer container 08d0e36d4a22f1960b42bc5f8ee4ff13a7a751d89be499d6508df3916231eef6. 
Aug 13 00:34:55.275183 systemd-networkd[1816]: calid0256a94c3f: Link UP Aug 13 00:34:55.275370 systemd-networkd[1816]: calid0256a94c3f: Gained carrier Aug 13 00:34:55.280730 containerd[1908]: time="2025-08-13T00:34:55.280708768Z" level=info msg="StartContainer for \"08d0e36d4a22f1960b42bc5f8ee4ff13a7a751d89be499d6508df3916231eef6\" returns successfully" Aug 13 00:34:55.280808 containerd[1908]: 2025-08-13 00:34:55.142 [INFO][5553] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--hvjbp-eth0 calico-apiserver-856b9ff84f- calico-apiserver 6ac70178-1af5-4be2-929d-3b8bf392fd61 797 0 2025-08-13 00:34:30 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:856b9ff84f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372.1.0-a-083aa5303b calico-apiserver-856b9ff84f-hvjbp eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calid0256a94c3f [] [] }} ContainerID="d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3" Namespace="calico-apiserver" Pod="calico-apiserver-856b9ff84f-hvjbp" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--hvjbp-" Aug 13 00:34:55.280808 containerd[1908]: 2025-08-13 00:34:55.142 [INFO][5553] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3" Namespace="calico-apiserver" Pod="calico-apiserver-856b9ff84f-hvjbp" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--hvjbp-eth0" Aug 13 00:34:55.280808 containerd[1908]: 2025-08-13 00:34:55.154 [INFO][5629] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3" HandleID="k8s-pod-network.d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3" Workload="ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--hvjbp-eth0" Aug 13 00:34:55.280808 containerd[1908]: 2025-08-13 00:34:55.154 [INFO][5629] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3" HandleID="k8s-pod-network.d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3" Workload="ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--hvjbp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001386b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372.1.0-a-083aa5303b", "pod":"calico-apiserver-856b9ff84f-hvjbp", "timestamp":"2025-08-13 00:34:55.154343182 +0000 UTC"}, Hostname:"ci-4372.1.0-a-083aa5303b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:34:55.280808 containerd[1908]: 2025-08-13 00:34:55.154 [INFO][5629] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:34:55.280808 containerd[1908]: 2025-08-13 00:34:55.175 [INFO][5629] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 00:34:55.280808 containerd[1908]: 2025-08-13 00:34:55.175 [INFO][5629] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-a-083aa5303b' Aug 13 00:34:55.280808 containerd[1908]: 2025-08-13 00:34:55.260 [INFO][5629] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:55.280808 containerd[1908]: 2025-08-13 00:34:55.262 [INFO][5629] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:55.280808 containerd[1908]: 2025-08-13 00:34:55.265 [INFO][5629] ipam/ipam.go 511: Trying affinity for 192.168.35.64/26 host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:55.280808 containerd[1908]: 2025-08-13 00:34:55.266 [INFO][5629] ipam/ipam.go 158: Attempting to load block cidr=192.168.35.64/26 host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:55.280808 containerd[1908]: 2025-08-13 00:34:55.267 [INFO][5629] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.35.64/26 host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:55.280808 containerd[1908]: 2025-08-13 00:34:55.267 [INFO][5629] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.35.64/26 handle="k8s-pod-network.d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:55.280808 containerd[1908]: 2025-08-13 00:34:55.268 [INFO][5629] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3 Aug 13 00:34:55.280808 containerd[1908]: 2025-08-13 00:34:55.270 [INFO][5629] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.35.64/26 handle="k8s-pod-network.d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:55.280808 containerd[1908]: 2025-08-13 00:34:55.273 [INFO][5629] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.35.70/26] block=192.168.35.64/26 handle="k8s-pod-network.d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:55.280808 containerd[1908]: 2025-08-13 00:34:55.273 [INFO][5629] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.35.70/26] handle="k8s-pod-network.d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:55.280808 containerd[1908]: 2025-08-13 00:34:55.273 [INFO][5629] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:34:55.280808 containerd[1908]: 2025-08-13 00:34:55.273 [INFO][5629] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.35.70/26] IPv6=[] ContainerID="d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3" HandleID="k8s-pod-network.d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3" Workload="ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--hvjbp-eth0" Aug 13 00:34:55.281165 containerd[1908]: 2025-08-13 00:34:55.274 [INFO][5553] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3" Namespace="calico-apiserver" Pod="calico-apiserver-856b9ff84f-hvjbp" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--hvjbp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--hvjbp-eth0", GenerateName:"calico-apiserver-856b9ff84f-", Namespace:"calico-apiserver", SelfLink:"", UID:"6ac70178-1af5-4be2-929d-3b8bf392fd61", ResourceVersion:"797", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"856b9ff84f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-083aa5303b", ContainerID:"", Pod:"calico-apiserver-856b9ff84f-hvjbp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.35.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid0256a94c3f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:55.281165 containerd[1908]: 2025-08-13 00:34:55.274 [INFO][5553] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.70/32] ContainerID="d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3" Namespace="calico-apiserver" Pod="calico-apiserver-856b9ff84f-hvjbp" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--hvjbp-eth0" Aug 13 00:34:55.281165 containerd[1908]: 2025-08-13 00:34:55.274 [INFO][5553] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid0256a94c3f ContainerID="d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3" Namespace="calico-apiserver" Pod="calico-apiserver-856b9ff84f-hvjbp" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--hvjbp-eth0" Aug 13 00:34:55.281165 containerd[1908]: 2025-08-13 00:34:55.275 [INFO][5553] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3" Namespace="calico-apiserver" Pod="calico-apiserver-856b9ff84f-hvjbp" 
WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--hvjbp-eth0" Aug 13 00:34:55.281165 containerd[1908]: 2025-08-13 00:34:55.275 [INFO][5553] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3" Namespace="calico-apiserver" Pod="calico-apiserver-856b9ff84f-hvjbp" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--hvjbp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--hvjbp-eth0", GenerateName:"calico-apiserver-856b9ff84f-", Namespace:"calico-apiserver", SelfLink:"", UID:"6ac70178-1af5-4be2-929d-3b8bf392fd61", ResourceVersion:"797", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"856b9ff84f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-083aa5303b", ContainerID:"d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3", Pod:"calico-apiserver-856b9ff84f-hvjbp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.35.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid0256a94c3f", MAC:"06:a1:fc:cd:95:62", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:55.281165 containerd[1908]: 2025-08-13 00:34:55.280 [INFO][5553] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3" Namespace="calico-apiserver" Pod="calico-apiserver-856b9ff84f-hvjbp" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--hvjbp-eth0" Aug 13 00:34:55.289711 containerd[1908]: time="2025-08-13T00:34:55.289655250Z" level=info msg="connecting to shim d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3" address="unix:///run/containerd/s/a736199498545efe4ab611201640e5da4c72bfe3cb37f4bee7135f3bb70da186" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:34:55.309330 systemd[1]: Started cri-containerd-d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3.scope - libcontainer container d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3. 
Aug 13 00:34:55.334696 containerd[1908]: time="2025-08-13T00:34:55.334619298Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-856b9ff84f-hvjbp,Uid:6ac70178-1af5-4be2-929d-3b8bf392fd61,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3\"" Aug 13 00:34:55.381273 systemd-networkd[1816]: calibf2f59f3c92: Link UP Aug 13 00:34:55.381460 systemd-networkd[1816]: calibf2f59f3c92: Gained carrier Aug 13 00:34:55.389732 containerd[1908]: 2025-08-13 00:34:55.141 [INFO][5560] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--a--083aa5303b-k8s-calico--kube--controllers--85df7c9ff6--mpjsz-eth0 calico-kube-controllers-85df7c9ff6- calico-system f0f2e0f9-90cf-41db-b550-f165e6260ca8 796 0 2025-08-13 00:34:33 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:85df7c9ff6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4372.1.0-a-083aa5303b calico-kube-controllers-85df7c9ff6-mpjsz eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calibf2f59f3c92 [] [] }} ContainerID="1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1" Namespace="calico-system" Pod="calico-kube-controllers-85df7c9ff6-mpjsz" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-calico--kube--controllers--85df7c9ff6--mpjsz-" Aug 13 00:34:55.389732 containerd[1908]: 2025-08-13 00:34:55.141 [INFO][5560] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1" Namespace="calico-system" Pod="calico-kube-controllers-85df7c9ff6-mpjsz" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-calico--kube--controllers--85df7c9ff6--mpjsz-eth0" Aug 13 00:34:55.389732 
containerd[1908]: 2025-08-13 00:34:55.154 [INFO][5626] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1" HandleID="k8s-pod-network.1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1" Workload="ci--4372.1.0--a--083aa5303b-k8s-calico--kube--controllers--85df7c9ff6--mpjsz-eth0" Aug 13 00:34:55.389732 containerd[1908]: 2025-08-13 00:34:55.154 [INFO][5626] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1" HandleID="k8s-pod-network.1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1" Workload="ci--4372.1.0--a--083aa5303b-k8s-calico--kube--controllers--85df7c9ff6--mpjsz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002e7aa0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372.1.0-a-083aa5303b", "pod":"calico-kube-controllers-85df7c9ff6-mpjsz", "timestamp":"2025-08-13 00:34:55.154829631 +0000 UTC"}, Hostname:"ci-4372.1.0-a-083aa5303b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:34:55.389732 containerd[1908]: 2025-08-13 00:34:55.154 [INFO][5626] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:34:55.389732 containerd[1908]: 2025-08-13 00:34:55.273 [INFO][5626] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 00:34:55.389732 containerd[1908]: 2025-08-13 00:34:55.273 [INFO][5626] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-a-083aa5303b' Aug 13 00:34:55.389732 containerd[1908]: 2025-08-13 00:34:55.361 [INFO][5626] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:55.389732 containerd[1908]: 2025-08-13 00:34:55.364 [INFO][5626] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:55.389732 containerd[1908]: 2025-08-13 00:34:55.367 [INFO][5626] ipam/ipam.go 511: Trying affinity for 192.168.35.64/26 host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:55.389732 containerd[1908]: 2025-08-13 00:34:55.369 [INFO][5626] ipam/ipam.go 158: Attempting to load block cidr=192.168.35.64/26 host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:55.389732 containerd[1908]: 2025-08-13 00:34:55.370 [INFO][5626] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.35.64/26 host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:55.389732 containerd[1908]: 2025-08-13 00:34:55.370 [INFO][5626] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.35.64/26 handle="k8s-pod-network.1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:55.389732 containerd[1908]: 2025-08-13 00:34:55.372 [INFO][5626] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1 Aug 13 00:34:55.389732 containerd[1908]: 2025-08-13 00:34:55.375 [INFO][5626] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.35.64/26 handle="k8s-pod-network.1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:55.389732 containerd[1908]: 2025-08-13 00:34:55.378 [INFO][5626] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.35.71/26] block=192.168.35.64/26 handle="k8s-pod-network.1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:55.389732 containerd[1908]: 2025-08-13 00:34:55.378 [INFO][5626] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.35.71/26] handle="k8s-pod-network.1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:55.389732 containerd[1908]: 2025-08-13 00:34:55.378 [INFO][5626] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:34:55.389732 containerd[1908]: 2025-08-13 00:34:55.378 [INFO][5626] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.35.71/26] IPv6=[] ContainerID="1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1" HandleID="k8s-pod-network.1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1" Workload="ci--4372.1.0--a--083aa5303b-k8s-calico--kube--controllers--85df7c9ff6--mpjsz-eth0" Aug 13 00:34:55.390469 containerd[1908]: 2025-08-13 00:34:55.380 [INFO][5560] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1" Namespace="calico-system" Pod="calico-kube-controllers-85df7c9ff6-mpjsz" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-calico--kube--controllers--85df7c9ff6--mpjsz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--083aa5303b-k8s-calico--kube--controllers--85df7c9ff6--mpjsz-eth0", GenerateName:"calico-kube-controllers-85df7c9ff6-", Namespace:"calico-system", SelfLink:"", UID:"f0f2e0f9-90cf-41db-b550-f165e6260ca8", ResourceVersion:"796", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"85df7c9ff6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-083aa5303b", ContainerID:"", Pod:"calico-kube-controllers-85df7c9ff6-mpjsz", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.35.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calibf2f59f3c92", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:55.390469 containerd[1908]: 2025-08-13 00:34:55.380 [INFO][5560] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.71/32] ContainerID="1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1" Namespace="calico-system" Pod="calico-kube-controllers-85df7c9ff6-mpjsz" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-calico--kube--controllers--85df7c9ff6--mpjsz-eth0" Aug 13 00:34:55.390469 containerd[1908]: 2025-08-13 00:34:55.380 [INFO][5560] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibf2f59f3c92 ContainerID="1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1" Namespace="calico-system" Pod="calico-kube-controllers-85df7c9ff6-mpjsz" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-calico--kube--controllers--85df7c9ff6--mpjsz-eth0" Aug 13 00:34:55.390469 containerd[1908]: 2025-08-13 00:34:55.381 [INFO][5560] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1" Namespace="calico-system" Pod="calico-kube-controllers-85df7c9ff6-mpjsz" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-calico--kube--controllers--85df7c9ff6--mpjsz-eth0" Aug 13 00:34:55.390469 containerd[1908]: 2025-08-13 00:34:55.381 [INFO][5560] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1" Namespace="calico-system" Pod="calico-kube-controllers-85df7c9ff6-mpjsz" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-calico--kube--controllers--85df7c9ff6--mpjsz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--083aa5303b-k8s-calico--kube--controllers--85df7c9ff6--mpjsz-eth0", GenerateName:"calico-kube-controllers-85df7c9ff6-", Namespace:"calico-system", SelfLink:"", UID:"f0f2e0f9-90cf-41db-b550-f165e6260ca8", ResourceVersion:"796", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"85df7c9ff6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-083aa5303b", ContainerID:"1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1", Pod:"calico-kube-controllers-85df7c9ff6-mpjsz", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.35.71/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calibf2f59f3c92", MAC:"f6:1d:7d:a0:6d:4c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:55.390469 containerd[1908]: 2025-08-13 00:34:55.387 [INFO][5560] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1" Namespace="calico-system" Pod="calico-kube-controllers-85df7c9ff6-mpjsz" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-calico--kube--controllers--85df7c9ff6--mpjsz-eth0" Aug 13 00:34:55.398364 containerd[1908]: time="2025-08-13T00:34:55.398311083Z" level=info msg="connecting to shim 1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1" address="unix:///run/containerd/s/6b3024d58135dc753e4537f0b3ca0e8ad848ed166ab41e65a65dec3528396bb3" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:34:55.419400 systemd[1]: Started cri-containerd-1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1.scope - libcontainer container 1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1. 
Aug 13 00:34:55.452445 containerd[1908]: time="2025-08-13T00:34:55.452396383Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-85df7c9ff6-mpjsz,Uid:f0f2e0f9-90cf-41db-b550-f165e6260ca8,Namespace:calico-system,Attempt:0,} returns sandbox id \"1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1\"" Aug 13 00:34:56.119950 containerd[1908]: time="2025-08-13T00:34:56.119899612Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-856b9ff84f-llc5l,Uid:4a08bd94-d4c1-4c85-ac9d-1c68619b4bd3,Namespace:calico-apiserver,Attempt:0,}" Aug 13 00:34:56.180376 systemd-networkd[1816]: cali1be3cc3e96d: Link UP Aug 13 00:34:56.180543 systemd-networkd[1816]: cali1be3cc3e96d: Gained carrier Aug 13 00:34:56.185503 containerd[1908]: 2025-08-13 00:34:56.141 [INFO][5887] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--llc5l-eth0 calico-apiserver-856b9ff84f- calico-apiserver 4a08bd94-d4c1-4c85-ac9d-1c68619b4bd3 795 0 2025-08-13 00:34:30 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:856b9ff84f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372.1.0-a-083aa5303b calico-apiserver-856b9ff84f-llc5l eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali1be3cc3e96d [] [] }} ContainerID="eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f" Namespace="calico-apiserver" Pod="calico-apiserver-856b9ff84f-llc5l" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--llc5l-" Aug 13 00:34:56.185503 containerd[1908]: 2025-08-13 00:34:56.141 [INFO][5887] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f" Namespace="calico-apiserver" Pod="calico-apiserver-856b9ff84f-llc5l" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--llc5l-eth0" Aug 13 00:34:56.185503 containerd[1908]: 2025-08-13 00:34:56.155 [INFO][5909] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f" HandleID="k8s-pod-network.eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f" Workload="ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--llc5l-eth0" Aug 13 00:34:56.185503 containerd[1908]: 2025-08-13 00:34:56.155 [INFO][5909] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f" HandleID="k8s-pod-network.eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f" Workload="ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--llc5l-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f6a0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372.1.0-a-083aa5303b", "pod":"calico-apiserver-856b9ff84f-llc5l", "timestamp":"2025-08-13 00:34:56.155156913 +0000 UTC"}, Hostname:"ci-4372.1.0-a-083aa5303b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:34:56.185503 containerd[1908]: 2025-08-13 00:34:56.155 [INFO][5909] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:34:56.185503 containerd[1908]: 2025-08-13 00:34:56.155 [INFO][5909] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 00:34:56.185503 containerd[1908]: 2025-08-13 00:34:56.155 [INFO][5909] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-a-083aa5303b' Aug 13 00:34:56.185503 containerd[1908]: 2025-08-13 00:34:56.160 [INFO][5909] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:56.185503 containerd[1908]: 2025-08-13 00:34:56.164 [INFO][5909] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:56.185503 containerd[1908]: 2025-08-13 00:34:56.168 [INFO][5909] ipam/ipam.go 511: Trying affinity for 192.168.35.64/26 host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:56.185503 containerd[1908]: 2025-08-13 00:34:56.169 [INFO][5909] ipam/ipam.go 158: Attempting to load block cidr=192.168.35.64/26 host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:56.185503 containerd[1908]: 2025-08-13 00:34:56.171 [INFO][5909] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.35.64/26 host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:56.185503 containerd[1908]: 2025-08-13 00:34:56.171 [INFO][5909] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.35.64/26 handle="k8s-pod-network.eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:56.185503 containerd[1908]: 2025-08-13 00:34:56.172 [INFO][5909] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f Aug 13 00:34:56.185503 containerd[1908]: 2025-08-13 00:34:56.174 [INFO][5909] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.35.64/26 handle="k8s-pod-network.eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:56.185503 containerd[1908]: 2025-08-13 00:34:56.178 [INFO][5909] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.35.72/26] block=192.168.35.64/26 handle="k8s-pod-network.eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:56.185503 containerd[1908]: 2025-08-13 00:34:56.178 [INFO][5909] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.35.72/26] handle="k8s-pod-network.eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f" host="ci-4372.1.0-a-083aa5303b" Aug 13 00:34:56.185503 containerd[1908]: 2025-08-13 00:34:56.178 [INFO][5909] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:34:56.185503 containerd[1908]: 2025-08-13 00:34:56.178 [INFO][5909] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.35.72/26] IPv6=[] ContainerID="eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f" HandleID="k8s-pod-network.eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f" Workload="ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--llc5l-eth0" Aug 13 00:34:56.185873 containerd[1908]: 2025-08-13 00:34:56.179 [INFO][5887] cni-plugin/k8s.go 418: Populated endpoint ContainerID="eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f" Namespace="calico-apiserver" Pod="calico-apiserver-856b9ff84f-llc5l" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--llc5l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--llc5l-eth0", GenerateName:"calico-apiserver-856b9ff84f-", Namespace:"calico-apiserver", SelfLink:"", UID:"4a08bd94-d4c1-4c85-ac9d-1c68619b4bd3", ResourceVersion:"795", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"856b9ff84f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-083aa5303b", ContainerID:"", Pod:"calico-apiserver-856b9ff84f-llc5l", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.35.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1be3cc3e96d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:56.185873 containerd[1908]: 2025-08-13 00:34:56.179 [INFO][5887] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.72/32] ContainerID="eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f" Namespace="calico-apiserver" Pod="calico-apiserver-856b9ff84f-llc5l" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--llc5l-eth0" Aug 13 00:34:56.185873 containerd[1908]: 2025-08-13 00:34:56.179 [INFO][5887] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1be3cc3e96d ContainerID="eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f" Namespace="calico-apiserver" Pod="calico-apiserver-856b9ff84f-llc5l" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--llc5l-eth0" Aug 13 00:34:56.185873 containerd[1908]: 2025-08-13 00:34:56.180 [INFO][5887] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f" Namespace="calico-apiserver" Pod="calico-apiserver-856b9ff84f-llc5l" 
WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--llc5l-eth0" Aug 13 00:34:56.185873 containerd[1908]: 2025-08-13 00:34:56.180 [INFO][5887] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f" Namespace="calico-apiserver" Pod="calico-apiserver-856b9ff84f-llc5l" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--llc5l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--llc5l-eth0", GenerateName:"calico-apiserver-856b9ff84f-", Namespace:"calico-apiserver", SelfLink:"", UID:"4a08bd94-d4c1-4c85-ac9d-1c68619b4bd3", ResourceVersion:"795", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 34, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"856b9ff84f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-a-083aa5303b", ContainerID:"eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f", Pod:"calico-apiserver-856b9ff84f-llc5l", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.35.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1be3cc3e96d", MAC:"22:69:2b:1c:e5:2b", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:34:56.185873 containerd[1908]: 2025-08-13 00:34:56.184 [INFO][5887] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f" Namespace="calico-apiserver" Pod="calico-apiserver-856b9ff84f-llc5l" WorkloadEndpoint="ci--4372.1.0--a--083aa5303b-k8s-calico--apiserver--856b9ff84f--llc5l-eth0" Aug 13 00:34:56.193536 containerd[1908]: time="2025-08-13T00:34:56.193495085Z" level=info msg="connecting to shim eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f" address="unix:///run/containerd/s/12055d9c93280f3933d2fcf354d7eda6ba120cd3df7d9dfe3b353f6dcb76cd17" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:34:56.211330 systemd[1]: Started cri-containerd-eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f.scope - libcontainer container eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f. 
Aug 13 00:34:56.244421 kubelet[3254]: I0813 00:34:56.244390 3254 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-98gch" podStartSLOduration=33.24437939 podStartE2EDuration="33.24437939s" podCreationTimestamp="2025-08-13 00:34:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:34:56.244072795 +0000 UTC m=+39.172412973" watchObservedRunningTime="2025-08-13 00:34:56.24437939 +0000 UTC m=+39.172719564" Aug 13 00:34:56.351569 containerd[1908]: time="2025-08-13T00:34:56.351519354Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-856b9ff84f-llc5l,Uid:4a08bd94-d4c1-4c85-ac9d-1c68619b4bd3,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f\"" Aug 13 00:34:56.566807 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1959421827.mount: Deactivated successfully. 
Aug 13 00:34:56.610264 systemd-networkd[1816]: calid0256a94c3f: Gained IPv6LL Aug 13 00:34:56.610445 systemd-networkd[1816]: cali8a75be5c54a: Gained IPv6LL Aug 13 00:34:56.784511 containerd[1908]: time="2025-08-13T00:34:56.784459527Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:56.784680 containerd[1908]: time="2025-08-13T00:34:56.784666415Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.2: active requests=0, bytes read=66352308" Aug 13 00:34:56.785031 containerd[1908]: time="2025-08-13T00:34:56.785020076Z" level=info msg="ImageCreate event name:\"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:56.785967 containerd[1908]: time="2025-08-13T00:34:56.785953723Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:56.786383 containerd[1908]: time="2025-08-13T00:34:56.786367167Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" with image id \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\", size \"66352154\" in 2.117774628s" Aug 13 00:34:56.786428 containerd[1908]: time="2025-08-13T00:34:56.786386090Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" returns image reference \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\"" Aug 13 00:34:56.786884 containerd[1908]: time="2025-08-13T00:34:56.786872745Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\"" Aug 13 
00:34:56.787945 containerd[1908]: time="2025-08-13T00:34:56.787932317Z" level=info msg="CreateContainer within sandbox \"8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Aug 13 00:34:56.790535 containerd[1908]: time="2025-08-13T00:34:56.790496493Z" level=info msg="Container fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:56.793157 containerd[1908]: time="2025-08-13T00:34:56.793144551Z" level=info msg="CreateContainer within sandbox \"8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57\"" Aug 13 00:34:56.793456 containerd[1908]: time="2025-08-13T00:34:56.793442500Z" level=info msg="StartContainer for \"fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57\"" Aug 13 00:34:56.794004 containerd[1908]: time="2025-08-13T00:34:56.793966891Z" level=info msg="connecting to shim fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57" address="unix:///run/containerd/s/ce4ca70896ced164726c4c4da1e946e0346e6c6b9b70356db7f6928a2f677ff2" protocol=ttrpc version=3 Aug 13 00:34:56.810349 systemd[1]: Started cri-containerd-fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57.scope - libcontainer container fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57. 
Aug 13 00:34:56.837614 containerd[1908]: time="2025-08-13T00:34:56.837525582Z" level=info msg="StartContainer for \"fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57\" returns successfully" Aug 13 00:34:57.249932 systemd-networkd[1816]: calibf2f59f3c92: Gained IPv6LL Aug 13 00:34:57.250268 kubelet[3254]: I0813 00:34:57.250050 3254 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-768f4c5c69-rd2g6" podStartSLOduration=21.824755067 podStartE2EDuration="25.25002686s" podCreationTimestamp="2025-08-13 00:34:32 +0000 UTC" firstStartedPulling="2025-08-13 00:34:53.361555057 +0000 UTC m=+36.289895235" lastFinishedPulling="2025-08-13 00:34:56.786826854 +0000 UTC m=+39.715167028" observedRunningTime="2025-08-13 00:34:57.24937629 +0000 UTC m=+40.177716479" watchObservedRunningTime="2025-08-13 00:34:57.25002686 +0000 UTC m=+40.178367042" Aug 13 00:34:57.296690 containerd[1908]: time="2025-08-13T00:34:57.296668732Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57\" id:\"f9fea2f5e186d7e10ce242ac29d5743cdef8e986cda3d5de40fbf18bb3659959\" pid:6058 exit_status:1 exited_at:{seconds:1755045297 nanos:296331777}" Aug 13 00:34:57.954415 systemd-networkd[1816]: cali1be3cc3e96d: Gained IPv6LL Aug 13 00:34:58.265362 containerd[1908]: time="2025-08-13T00:34:58.265277581Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:58.265481 containerd[1908]: time="2025-08-13T00:34:58.265436360Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2: active requests=0, bytes read=14703784" Aug 13 00:34:58.265865 containerd[1908]: time="2025-08-13T00:34:58.265827587Z" level=info msg="ImageCreate event name:\"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:58.266658 containerd[1908]: time="2025-08-13T00:34:58.266616199Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:58.267035 containerd[1908]: time="2025-08-13T00:34:58.266995180Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" with image id \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\", size \"16196439\" in 1.480108456s" Aug 13 00:34:58.267035 containerd[1908]: time="2025-08-13T00:34:58.267009837Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" returns image reference \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\"" Aug 13 00:34:58.267519 containerd[1908]: time="2025-08-13T00:34:58.267474554Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 13 00:34:58.268564 containerd[1908]: time="2025-08-13T00:34:58.268523041Z" level=info msg="CreateContainer within sandbox \"3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Aug 13 00:34:58.271837 containerd[1908]: time="2025-08-13T00:34:58.271820806Z" level=info msg="Container d28ad9b7add56e68ce2268239df5bf84d86cfe4435d10ae35383c008614292ca: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:58.275647 containerd[1908]: time="2025-08-13T00:34:58.275606838Z" level=info msg="CreateContainer within sandbox \"3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716\" for 
&ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"d28ad9b7add56e68ce2268239df5bf84d86cfe4435d10ae35383c008614292ca\"" Aug 13 00:34:58.275910 containerd[1908]: time="2025-08-13T00:34:58.275867196Z" level=info msg="StartContainer for \"d28ad9b7add56e68ce2268239df5bf84d86cfe4435d10ae35383c008614292ca\"" Aug 13 00:34:58.276635 containerd[1908]: time="2025-08-13T00:34:58.276623613Z" level=info msg="connecting to shim d28ad9b7add56e68ce2268239df5bf84d86cfe4435d10ae35383c008614292ca" address="unix:///run/containerd/s/1f8d31810d5749fcaba22db772beadfeb79ecaa2ecd0dc033106108745a8f81f" protocol=ttrpc version=3 Aug 13 00:34:58.293302 systemd[1]: Started cri-containerd-d28ad9b7add56e68ce2268239df5bf84d86cfe4435d10ae35383c008614292ca.scope - libcontainer container d28ad9b7add56e68ce2268239df5bf84d86cfe4435d10ae35383c008614292ca. Aug 13 00:34:58.296682 containerd[1908]: time="2025-08-13T00:34:58.296657919Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57\" id:\"0ed864982c008c3a6eac084f40bd2d096aa86ddab5cc15749bf215ccd2f2c8f9\" pid:6101 exit_status:1 exited_at:{seconds:1755045298 nanos:296438895}" Aug 13 00:34:58.312670 containerd[1908]: time="2025-08-13T00:34:58.312645271Z" level=info msg="StartContainer for \"d28ad9b7add56e68ce2268239df5bf84d86cfe4435d10ae35383c008614292ca\" returns successfully" Aug 13 00:34:59.153104 kubelet[3254]: I0813 00:34:59.153015 3254 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Aug 13 00:34:59.153104 kubelet[3254]: I0813 00:34:59.153109 3254 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Aug 13 00:34:59.250054 kubelet[3254]: I0813 00:34:59.250020 3254 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="calico-system/csi-node-driver-768vv" podStartSLOduration=21.202874967 podStartE2EDuration="26.250006811s" podCreationTimestamp="2025-08-13 00:34:33 +0000 UTC" firstStartedPulling="2025-08-13 00:34:53.220293248 +0000 UTC m=+36.148633423" lastFinishedPulling="2025-08-13 00:34:58.267425089 +0000 UTC m=+41.195765267" observedRunningTime="2025-08-13 00:34:59.249767214 +0000 UTC m=+42.178107392" watchObservedRunningTime="2025-08-13 00:34:59.250006811 +0000 UTC m=+42.178346986" Aug 13 00:35:00.104909 containerd[1908]: time="2025-08-13T00:35:00.104886922Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:00.105113 containerd[1908]: time="2025-08-13T00:35:00.105101773Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=47317977" Aug 13 00:35:00.105518 containerd[1908]: time="2025-08-13T00:35:00.105478305Z" level=info msg="ImageCreate event name:\"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:00.106531 containerd[1908]: time="2025-08-13T00:35:00.106491019Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:00.106816 containerd[1908]: time="2025-08-13T00:35:00.106774416Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 1.839280854s" Aug 13 00:35:00.106816 containerd[1908]: time="2025-08-13T00:35:00.106793319Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Aug 13 00:35:00.107202 containerd[1908]: time="2025-08-13T00:35:00.107190451Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\"" Aug 13 00:35:00.108160 containerd[1908]: time="2025-08-13T00:35:00.108148778Z" level=info msg="CreateContainer within sandbox \"d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 13 00:35:00.110754 containerd[1908]: time="2025-08-13T00:35:00.110716628Z" level=info msg="Container c050e1a18502210ce88ece43c0aa532f06798c88cc22b0d95aaa63d8c14bda96: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:35:00.113300 containerd[1908]: time="2025-08-13T00:35:00.113263168Z" level=info msg="CreateContainer within sandbox \"d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"c050e1a18502210ce88ece43c0aa532f06798c88cc22b0d95aaa63d8c14bda96\"" Aug 13 00:35:00.113480 containerd[1908]: time="2025-08-13T00:35:00.113469591Z" level=info msg="StartContainer for \"c050e1a18502210ce88ece43c0aa532f06798c88cc22b0d95aaa63d8c14bda96\"" Aug 13 00:35:00.113991 containerd[1908]: time="2025-08-13T00:35:00.113980231Z" level=info msg="connecting to shim c050e1a18502210ce88ece43c0aa532f06798c88cc22b0d95aaa63d8c14bda96" address="unix:///run/containerd/s/a736199498545efe4ab611201640e5da4c72bfe3cb37f4bee7135f3bb70da186" protocol=ttrpc version=3 Aug 13 00:35:00.133519 systemd[1]: Started cri-containerd-c050e1a18502210ce88ece43c0aa532f06798c88cc22b0d95aaa63d8c14bda96.scope - libcontainer container c050e1a18502210ce88ece43c0aa532f06798c88cc22b0d95aaa63d8c14bda96. 
Aug 13 00:35:00.160419 containerd[1908]: time="2025-08-13T00:35:00.160395479Z" level=info msg="StartContainer for \"c050e1a18502210ce88ece43c0aa532f06798c88cc22b0d95aaa63d8c14bda96\" returns successfully" Aug 13 00:35:00.251353 kubelet[3254]: I0813 00:35:00.251313 3254 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-856b9ff84f-hvjbp" podStartSLOduration=25.479358917 podStartE2EDuration="30.251301009s" podCreationTimestamp="2025-08-13 00:34:30 +0000 UTC" firstStartedPulling="2025-08-13 00:34:55.335170073 +0000 UTC m=+38.263510247" lastFinishedPulling="2025-08-13 00:35:00.107112162 +0000 UTC m=+43.035452339" observedRunningTime="2025-08-13 00:35:00.25122325 +0000 UTC m=+43.179563433" watchObservedRunningTime="2025-08-13 00:35:00.251301009 +0000 UTC m=+43.179641183" Aug 13 00:35:01.247635 kubelet[3254]: I0813 00:35:01.247596 3254 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 13 00:35:02.788564 containerd[1908]: time="2025-08-13T00:35:02.788503090Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:02.788797 containerd[1908]: time="2025-08-13T00:35:02.788731204Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.2: active requests=0, bytes read=51276688" Aug 13 00:35:02.789167 containerd[1908]: time="2025-08-13T00:35:02.789127335Z" level=info msg="ImageCreate event name:\"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:02.789996 containerd[1908]: time="2025-08-13T00:35:02.789954913Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:02.790292 containerd[1908]: 
time="2025-08-13T00:35:02.790251702Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" with image id \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\", size \"52769359\" in 2.683045762s" Aug 13 00:35:02.790292 containerd[1908]: time="2025-08-13T00:35:02.790266525Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" returns image reference \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\"" Aug 13 00:35:02.790752 containerd[1908]: time="2025-08-13T00:35:02.790714480Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 13 00:35:02.794216 containerd[1908]: time="2025-08-13T00:35:02.794195620Z" level=info msg="CreateContainer within sandbox \"1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Aug 13 00:35:02.796927 containerd[1908]: time="2025-08-13T00:35:02.796914931Z" level=info msg="Container bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:35:02.799612 containerd[1908]: time="2025-08-13T00:35:02.799570724Z" level=info msg="CreateContainer within sandbox \"1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574\"" Aug 13 00:35:02.799855 containerd[1908]: time="2025-08-13T00:35:02.799808641Z" level=info msg="StartContainer for \"bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574\"" Aug 13 00:35:02.800343 containerd[1908]: time="2025-08-13T00:35:02.800303419Z" level=info msg="connecting to shim 
bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574" address="unix:///run/containerd/s/6b3024d58135dc753e4537f0b3ca0e8ad848ed166ab41e65a65dec3528396bb3" protocol=ttrpc version=3 Aug 13 00:35:02.816500 systemd[1]: Started cri-containerd-bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574.scope - libcontainer container bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574. Aug 13 00:35:02.844794 containerd[1908]: time="2025-08-13T00:35:02.844742678Z" level=info msg="StartContainer for \"bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574\" returns successfully" Aug 13 00:35:03.230100 containerd[1908]: time="2025-08-13T00:35:03.230080502Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:03.230380 containerd[1908]: time="2025-08-13T00:35:03.230367100Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Aug 13 00:35:03.231355 containerd[1908]: time="2025-08-13T00:35:03.231341362Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 440.613081ms" Aug 13 00:35:03.231387 containerd[1908]: time="2025-08-13T00:35:03.231355943Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Aug 13 00:35:03.232661 containerd[1908]: time="2025-08-13T00:35:03.232650585Z" level=info msg="CreateContainer within sandbox \"eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" 
Aug 13 00:35:03.235491 containerd[1908]: time="2025-08-13T00:35:03.235477894Z" level=info msg="Container ade0c810561ef4366b61596837f2f0d7001c4a84c8622bd36f630e1b49be05a4: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:35:03.238471 containerd[1908]: time="2025-08-13T00:35:03.238430746Z" level=info msg="CreateContainer within sandbox \"eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"ade0c810561ef4366b61596837f2f0d7001c4a84c8622bd36f630e1b49be05a4\"" Aug 13 00:35:03.238846 containerd[1908]: time="2025-08-13T00:35:03.238787368Z" level=info msg="StartContainer for \"ade0c810561ef4366b61596837f2f0d7001c4a84c8622bd36f630e1b49be05a4\"" Aug 13 00:35:03.239430 containerd[1908]: time="2025-08-13T00:35:03.239395304Z" level=info msg="connecting to shim ade0c810561ef4366b61596837f2f0d7001c4a84c8622bd36f630e1b49be05a4" address="unix:///run/containerd/s/12055d9c93280f3933d2fcf354d7eda6ba120cd3df7d9dfe3b353f6dcb76cd17" protocol=ttrpc version=3 Aug 13 00:35:03.257908 kubelet[3254]: I0813 00:35:03.257834 3254 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-85df7c9ff6-mpjsz" podStartSLOduration=22.920086538 podStartE2EDuration="30.257817647s" podCreationTimestamp="2025-08-13 00:34:33 +0000 UTC" firstStartedPulling="2025-08-13 00:34:55.452931888 +0000 UTC m=+38.381272063" lastFinishedPulling="2025-08-13 00:35:02.790662998 +0000 UTC m=+45.719003172" observedRunningTime="2025-08-13 00:35:03.257658329 +0000 UTC m=+46.185998516" watchObservedRunningTime="2025-08-13 00:35:03.257817647 +0000 UTC m=+46.186157828" Aug 13 00:35:03.258362 systemd[1]: Started cri-containerd-ade0c810561ef4366b61596837f2f0d7001c4a84c8622bd36f630e1b49be05a4.scope - libcontainer container ade0c810561ef4366b61596837f2f0d7001c4a84c8622bd36f630e1b49be05a4. 
Aug 13 00:35:03.280909 containerd[1908]: time="2025-08-13T00:35:03.280887720Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574\" id:\"2838f4ed1c15b0bf3737fc3e421fc82900155803ab2243c9c10cdb287f529663\" pid:6298 exited_at:{seconds:1755045303 nanos:280722528}" Aug 13 00:35:03.293904 containerd[1908]: time="2025-08-13T00:35:03.293879291Z" level=info msg="StartContainer for \"ade0c810561ef4366b61596837f2f0d7001c4a84c8622bd36f630e1b49be05a4\" returns successfully" Aug 13 00:35:04.280633 kubelet[3254]: I0813 00:35:04.280505 3254 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-856b9ff84f-llc5l" podStartSLOduration=27.400932773 podStartE2EDuration="34.280459298s" podCreationTimestamp="2025-08-13 00:34:30 +0000 UTC" firstStartedPulling="2025-08-13 00:34:56.35210083 +0000 UTC m=+39.280441005" lastFinishedPulling="2025-08-13 00:35:03.231627356 +0000 UTC m=+46.159967530" observedRunningTime="2025-08-13 00:35:04.280001542 +0000 UTC m=+47.208341807" watchObservedRunningTime="2025-08-13 00:35:04.280459298 +0000 UTC m=+47.208799551" Aug 13 00:35:05.262403 kubelet[3254]: I0813 00:35:05.262299 3254 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 13 00:35:11.788140 kubelet[3254]: I0813 00:35:11.787918 3254 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 13 00:35:16.038657 kubelet[3254]: I0813 00:35:16.038537 3254 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 13 00:35:16.259385 containerd[1908]: time="2025-08-13T00:35:16.259331436Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5edb60c85d9a8922fea3e568a58e03913f9f1226b814e866ea29c21abc42097c\" id:\"1fb37e49a2092d0bcb21379e6d939e3fe6db18e4c88148e80ff78538fa935a78\" pid:6375 exited_at:{seconds:1755045316 nanos:259152388}" Aug 13 00:35:28.356606 containerd[1908]: 
time="2025-08-13T00:35:28.356583868Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57\" id:\"16b82fe6f2f1f6d2c757eab3d2a4cfea383cbb30b25db27ce1de7364ef062032\" pid:6426 exited_at:{seconds:1755045328 nanos:356396893}" Aug 13 00:35:33.332653 containerd[1908]: time="2025-08-13T00:35:33.332619732Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574\" id:\"fccba5b78bc5f43da8b097121d60c191648ffd33e91a14af7ec7071d03e03464\" pid:6467 exited_at:{seconds:1755045333 nanos:332466624}" Aug 13 00:35:33.350930 containerd[1908]: time="2025-08-13T00:35:33.350904746Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57\" id:\"10684f0ea43eb4245da43a5739bbd971473748de6a66a9efbad4af8d80af0402\" pid:6479 exited_at:{seconds:1755045333 nanos:350692292}" Aug 13 00:35:43.708107 containerd[1908]: time="2025-08-13T00:35:43.707485384Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574\" id:\"fa4b5e03c467307aab827436a0bdd111ca28d816409e3d13f523cca963ceca44\" pid:6518 exited_at:{seconds:1755045343 nanos:707089715}" Aug 13 00:35:46.340703 containerd[1908]: time="2025-08-13T00:35:46.340654838Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5edb60c85d9a8922fea3e568a58e03913f9f1226b814e866ea29c21abc42097c\" id:\"d91a91bb2bb41d40ce459deffc154703a95daa2d34d2feee5ac05ba80adf639b\" pid:6539 exited_at:{seconds:1755045346 nanos:340487411}" Aug 13 00:35:58.296572 containerd[1908]: time="2025-08-13T00:35:58.296545483Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57\" id:\"249180b56a2bb96a956298814ea52d06293637653c159f1d2a845699a55d294e\" pid:6577 exited_at:{seconds:1755045358 
nanos:296293522}" Aug 13 00:36:03.305629 containerd[1908]: time="2025-08-13T00:36:03.305567951Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574\" id:\"d9781945517dd966f81448df803da3a9f28d4605fd9cf10ca7496f3c854c81e3\" pid:6614 exited_at:{seconds:1755045363 nanos:305414889}" Aug 13 00:36:16.343049 containerd[1908]: time="2025-08-13T00:36:16.343021684Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5edb60c85d9a8922fea3e568a58e03913f9f1226b814e866ea29c21abc42097c\" id:\"5d9ef39b3cadf35db4fbac698c96142d33cf84cb01aeccdff0dd005df2256148\" pid:6644 exited_at:{seconds:1755045376 nanos:342781309}" Aug 13 00:36:28.341282 containerd[1908]: time="2025-08-13T00:36:28.341255918Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57\" id:\"b18ce19c98b8b357a92edc4ac0b4d419893e7ad978030863a659865d25b41677\" pid:6708 exited_at:{seconds:1755045388 nanos:341039574}" Aug 13 00:36:33.341544 containerd[1908]: time="2025-08-13T00:36:33.341505629Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574\" id:\"19df620eb97729b5ffd6ef8e21a9693d76fa0b0878385157bf37ef7f9a577a1e\" pid:6741 exited_at:{seconds:1755045393 nanos:341312579}" Aug 13 00:36:33.360826 containerd[1908]: time="2025-08-13T00:36:33.360772016Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57\" id:\"ce364296c3ffeb214f0b444b892425225dad36471ca0efc289decd9a2a2e64e0\" pid:6757 exited_at:{seconds:1755045393 nanos:360619847}" Aug 13 00:36:43.741319 containerd[1908]: time="2025-08-13T00:36:43.741263073Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574\" 
id:\"7fc201d68e067a61cb61e1cdcc9aa0f1af7bf0117f25fe941d84249d1cc54519\" pid:6799 exited_at:{seconds:1755045403 nanos:741116570}" Aug 13 00:36:46.321116 containerd[1908]: time="2025-08-13T00:36:46.321085600Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5edb60c85d9a8922fea3e568a58e03913f9f1226b814e866ea29c21abc42097c\" id:\"083ce5fa8fb33f50699d118fe3fb6f83a880fa618bfaa6d0a5df45c78d5b0704\" pid:6821 exited_at:{seconds:1755045406 nanos:320842715}" Aug 13 00:36:58.327329 containerd[1908]: time="2025-08-13T00:36:58.327303877Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57\" id:\"c05f1116b869c76f04bf9227a72243368847b4362bb44742e6322be5277e822d\" pid:6861 exited_at:{seconds:1755045418 nanos:327117658}" Aug 13 00:37:03.348377 containerd[1908]: time="2025-08-13T00:37:03.348347231Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574\" id:\"9082c25f8e5df6ff9839b5b4618f93d3f8c75b4048d6ef1f93855940cb5ae44b\" pid:6891 exited_at:{seconds:1755045423 nanos:348166152}" Aug 13 00:37:16.334542 containerd[1908]: time="2025-08-13T00:37:16.334499858Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5edb60c85d9a8922fea3e568a58e03913f9f1226b814e866ea29c21abc42097c\" id:\"78ca9557f6b3315a7ced87905b734df83ffe96e9f165356d689d769f213c0082\" pid:6912 exited_at:{seconds:1755045436 nanos:334206829}" Aug 13 00:37:28.311962 containerd[1908]: time="2025-08-13T00:37:28.311935829Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57\" id:\"df5521e43103a6bedecc28545428f66a9de316a5bcea1b730eddde0ce4dace79\" pid:6959 exited_at:{seconds:1755045448 nanos:311685097}" Aug 13 00:37:33.296600 containerd[1908]: time="2025-08-13T00:37:33.296576206Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574\" id:\"913da09362eb215fe4b079a3eeeab4ea7cc4600314fd12468db638a9bf0d1a9f\" pid:6996 exited_at:{seconds:1755045453 nanos:296464301}" Aug 13 00:37:33.315418 containerd[1908]: time="2025-08-13T00:37:33.315389198Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57\" id:\"3fe33c49c1830ad94ed8f32aa6afc5ca23303022a1cddbf6c8aaae895651eee2\" pid:7008 exited_at:{seconds:1755045453 nanos:315218789}" Aug 13 00:37:43.691251 containerd[1908]: time="2025-08-13T00:37:43.691212693Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574\" id:\"1bd0ec9ccae69181cdca7d384995abc9f62306267a5f8c2f13c2509a88c34a01\" pid:7049 exited_at:{seconds:1755045463 nanos:690724408}" Aug 13 00:37:46.281250 containerd[1908]: time="2025-08-13T00:37:46.281224045Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5edb60c85d9a8922fea3e568a58e03913f9f1226b814e866ea29c21abc42097c\" id:\"3e9ef1e7359f79a2dea6c26b7846fe8901157a07c83d6af50e12d50c527d4485\" pid:7070 exited_at:{seconds:1755045466 nanos:281006662}" Aug 13 00:37:58.309914 containerd[1908]: time="2025-08-13T00:37:58.309858978Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57\" id:\"730afc6badc4f3d246f3e210e093c2a97906f9ab8237eff515d5c0f192bccfa3\" pid:7130 exited_at:{seconds:1755045478 nanos:309590593}" Aug 13 00:38:03.302172 containerd[1908]: time="2025-08-13T00:38:03.302145917Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574\" id:\"34b9374fb03c19090eaebf522ea24a858f645d9c8b3e42456f25edef56558f4c\" pid:7164 exited_at:{seconds:1755045483 nanos:302048968}" Aug 13 00:38:16.302042 containerd[1908]: 
time="2025-08-13T00:38:16.301965259Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5edb60c85d9a8922fea3e568a58e03913f9f1226b814e866ea29c21abc42097c\" id:\"dc2d374b3ba86ab691a7fdb018f570e4699cfdfb97d4ad46f1463ebf94a1e450\" pid:7186 exited_at:{seconds:1755045496 nanos:301576629}" Aug 13 00:38:22.492467 update_engine[1895]: I20250813 00:38:22.492345 1895 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Aug 13 00:38:22.492467 update_engine[1895]: I20250813 00:38:22.492444 1895 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Aug 13 00:38:22.493630 update_engine[1895]: I20250813 00:38:22.492805 1895 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Aug 13 00:38:22.493899 update_engine[1895]: I20250813 00:38:22.493846 1895 omaha_request_params.cc:62] Current group set to beta Aug 13 00:38:22.494122 update_engine[1895]: I20250813 00:38:22.494081 1895 update_attempter.cc:499] Already updated boot flags. Skipping. Aug 13 00:38:22.494122 update_engine[1895]: I20250813 00:38:22.494109 1895 update_attempter.cc:643] Scheduling an action processor start. 
Aug 13 00:38:22.494433 update_engine[1895]: I20250813 00:38:22.494149 1895 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Aug 13 00:38:22.494433 update_engine[1895]: I20250813 00:38:22.494238 1895 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Aug 13 00:38:22.494433 update_engine[1895]: I20250813 00:38:22.494393 1895 omaha_request_action.cc:271] Posting an Omaha request to disabled Aug 13 00:38:22.494433 update_engine[1895]: I20250813 00:38:22.494422 1895 omaha_request_action.cc:272] Request: Aug 13 00:38:22.494433 update_engine[1895]: Aug 13 00:38:22.494433 update_engine[1895]: Aug 13 00:38:22.494433 update_engine[1895]: Aug 13 00:38:22.494433 update_engine[1895]: Aug 13 00:38:22.494433 update_engine[1895]: Aug 13 00:38:22.494433 update_engine[1895]: Aug 13 00:38:22.494433 update_engine[1895]: Aug 13 00:38:22.494433 update_engine[1895]: Aug 13 00:38:22.495698 update_engine[1895]: I20250813 00:38:22.494441 1895 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Aug 13 00:38:22.495788 locksmithd[1965]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Aug 13 00:38:22.497844 update_engine[1895]: I20250813 00:38:22.497833 1895 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Aug 13 00:38:22.498064 update_engine[1895]: I20250813 00:38:22.498054 1895 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Aug 13 00:38:22.498437 update_engine[1895]: E20250813 00:38:22.498425 1895 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Aug 13 00:38:22.498465 update_engine[1895]: I20250813 00:38:22.498456 1895 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Aug 13 00:38:28.301149 containerd[1908]: time="2025-08-13T00:38:28.301120053Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57\" id:\"524fd5258ac9ae425a9e86dc9231127968f9b6c4308577fb3131fe8294d19ee1\" pid:7226 exited_at:{seconds:1755045508 nanos:300815904}" Aug 13 00:38:32.451536 update_engine[1895]: I20250813 00:38:32.451370 1895 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Aug 13 00:38:32.452370 update_engine[1895]: I20250813 00:38:32.451921 1895 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Aug 13 00:38:32.452711 update_engine[1895]: I20250813 00:38:32.452608 1895 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Aug 13 00:38:32.453045 update_engine[1895]: E20250813 00:38:32.452945 1895 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Aug 13 00:38:32.453247 update_engine[1895]: I20250813 00:38:32.453109 1895 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Aug 13 00:38:33.332876 containerd[1908]: time="2025-08-13T00:38:33.332849764Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574\" id:\"1b7f32f723086a7ba4a41a3bed8833d95c67486549fd71dfed77615d49de4261\" pid:7260 exited_at:{seconds:1755045513 nanos:332709491}" Aug 13 00:38:33.354051 containerd[1908]: time="2025-08-13T00:38:33.354026116Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57\" id:\"70a551fb3123055cc959a94921a42c45c45e6d567aa729fa779592b62f22c460\" pid:7276 exited_at:{seconds:1755045513 nanos:353848331}" Aug 13 00:38:42.450405 update_engine[1895]: I20250813 00:38:42.450313 1895 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Aug 13 00:38:42.451446 update_engine[1895]: I20250813 00:38:42.450559 1895 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Aug 13 00:38:42.451446 update_engine[1895]: I20250813 00:38:42.450860 1895 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Aug 13 00:38:42.451446 update_engine[1895]: E20250813 00:38:42.451215 1895 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Aug 13 00:38:42.451446 update_engine[1895]: I20250813 00:38:42.451267 1895 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Aug 13 00:38:43.709780 containerd[1908]: time="2025-08-13T00:38:43.709755378Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574\" id:\"79717f7afd0f0ce7e304ea501d486c9a62f8e2832789b864a62639706d7c9943\" pid:7315 exited_at:{seconds:1755045523 nanos:709656375}" Aug 13 00:38:46.319134 containerd[1908]: time="2025-08-13T00:38:46.319093476Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5edb60c85d9a8922fea3e568a58e03913f9f1226b814e866ea29c21abc42097c\" id:\"6f048d27345dc41aea0cb04f29e68bc53ae52d388d7cf35529c333ca76881c30\" pid:7337 exited_at:{seconds:1755045526 nanos:318893841}" Aug 13 00:38:52.451560 update_engine[1895]: I20250813 00:38:52.451398 1895 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Aug 13 00:38:52.452464 update_engine[1895]: I20250813 00:38:52.451969 1895 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Aug 13 00:38:52.452877 update_engine[1895]: I20250813 00:38:52.452708 1895 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Aug 13 00:38:52.453090 update_engine[1895]: E20250813 00:38:52.453027 1895 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Aug 13 00:38:52.453232 update_engine[1895]: I20250813 00:38:52.453126 1895 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Aug 13 00:38:52.453232 update_engine[1895]: I20250813 00:38:52.453150 1895 omaha_request_action.cc:617] Omaha request response: Aug 13 00:38:52.453475 update_engine[1895]: E20250813 00:38:52.453342 1895 omaha_request_action.cc:636] Omaha request network transfer failed. 
Aug 13 00:38:52.453475 update_engine[1895]: I20250813 00:38:52.453392 1895 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Aug 13 00:38:52.453475 update_engine[1895]: I20250813 00:38:52.453408 1895 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Aug 13 00:38:52.453475 update_engine[1895]: I20250813 00:38:52.453423 1895 update_attempter.cc:306] Processing Done. Aug 13 00:38:52.453475 update_engine[1895]: E20250813 00:38:52.453454 1895 update_attempter.cc:619] Update failed. Aug 13 00:38:52.453475 update_engine[1895]: I20250813 00:38:52.453469 1895 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Aug 13 00:38:52.453978 update_engine[1895]: I20250813 00:38:52.453483 1895 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Aug 13 00:38:52.453978 update_engine[1895]: I20250813 00:38:52.453500 1895 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. 
Aug 13 00:38:52.453978 update_engine[1895]: I20250813 00:38:52.453694 1895 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Aug 13 00:38:52.453978 update_engine[1895]: I20250813 00:38:52.453763 1895 omaha_request_action.cc:271] Posting an Omaha request to disabled Aug 13 00:38:52.453978 update_engine[1895]: I20250813 00:38:52.453781 1895 omaha_request_action.cc:272] Request: Aug 13 00:38:52.453978 update_engine[1895]: Aug 13 00:38:52.453978 update_engine[1895]: Aug 13 00:38:52.453978 update_engine[1895]: Aug 13 00:38:52.453978 update_engine[1895]: Aug 13 00:38:52.453978 update_engine[1895]: Aug 13 00:38:52.453978 update_engine[1895]: Aug 13 00:38:52.453978 update_engine[1895]: I20250813 00:38:52.453799 1895 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Aug 13 00:38:52.454909 update_engine[1895]: I20250813 00:38:52.454216 1895 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Aug 13 00:38:52.454909 update_engine[1895]: I20250813 00:38:52.454718 1895 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Aug 13 00:38:52.455081 locksmithd[1965]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Aug 13 00:38:52.455734 update_engine[1895]: E20250813 00:38:52.455114 1895 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Aug 13 00:38:52.455734 update_engine[1895]: I20250813 00:38:52.455235 1895 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Aug 13 00:38:52.455734 update_engine[1895]: I20250813 00:38:52.455262 1895 omaha_request_action.cc:617] Omaha request response: Aug 13 00:38:52.455734 update_engine[1895]: I20250813 00:38:52.455279 1895 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Aug 13 00:38:52.455734 update_engine[1895]: I20250813 00:38:52.455293 1895 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Aug 13 00:38:52.455734 update_engine[1895]: I20250813 00:38:52.455307 1895 update_attempter.cc:306] Processing Done. Aug 13 00:38:52.455734 update_engine[1895]: I20250813 00:38:52.455322 1895 update_attempter.cc:310] Error event sent. 
Aug 13 00:38:52.455734 update_engine[1895]: I20250813 00:38:52.455343 1895 update_check_scheduler.cc:74] Next update check in 44m29s Aug 13 00:38:52.456420 locksmithd[1965]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Aug 13 00:38:58.340996 containerd[1908]: time="2025-08-13T00:38:58.340946579Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57\" id:\"1d2a4ff38062a2dc4e9917b9b1f4e306fac3c116e98faaae3c8f878cc14a3ab0\" pid:7375 exited_at:{seconds:1755045538 nanos:340746690}" Aug 13 00:39:03.298800 containerd[1908]: time="2025-08-13T00:39:03.298777065Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574\" id:\"dcf17063f49ad6977936387ff8d7cea4d14bebad73f06b582a9c79f8d6df67dd\" pid:7411 exited_at:{seconds:1755045543 nanos:298667761}" Aug 13 00:39:13.145530 containerd[1908]: time="2025-08-13T00:39:13.145325934Z" level=warning msg="container event discarded" container=a1f919168dcb0036512a15ee7180107dcb9c85c19b56a04b2947204576491a3a type=CONTAINER_CREATED_EVENT Aug 13 00:39:13.145530 containerd[1908]: time="2025-08-13T00:39:13.145464309Z" level=warning msg="container event discarded" container=a1f919168dcb0036512a15ee7180107dcb9c85c19b56a04b2947204576491a3a type=CONTAINER_STARTED_EVENT Aug 13 00:39:13.159062 containerd[1908]: time="2025-08-13T00:39:13.158914724Z" level=warning msg="container event discarded" container=9ca1b79a13c886f792eb4cbd45d4eec140d04feb72f87a16faa4b48280e4a8ed type=CONTAINER_CREATED_EVENT Aug 13 00:39:13.159062 containerd[1908]: time="2025-08-13T00:39:13.159031373Z" level=warning msg="container event discarded" container=9ca1b79a13c886f792eb4cbd45d4eec140d04feb72f87a16faa4b48280e4a8ed type=CONTAINER_STARTED_EVENT Aug 13 00:39:13.159062 containerd[1908]: time="2025-08-13T00:39:13.159068267Z" level=warning msg="container event discarded" 
container=ec760eed6c55f2d4e675297a91eed591d4a3cb1aa39892c350e3a6e3b459598e type=CONTAINER_CREATED_EVENT Aug 13 00:39:13.159533 containerd[1908]: time="2025-08-13T00:39:13.159093542Z" level=warning msg="container event discarded" container=ec760eed6c55f2d4e675297a91eed591d4a3cb1aa39892c350e3a6e3b459598e type=CONTAINER_STARTED_EVENT Aug 13 00:39:13.159533 containerd[1908]: time="2025-08-13T00:39:13.159116152Z" level=warning msg="container event discarded" container=7ee79acb85ce47824a4d787924f4e76b4df8968a53dce12a0732dcf1424da19d type=CONTAINER_CREATED_EVENT Aug 13 00:39:13.159533 containerd[1908]: time="2025-08-13T00:39:13.159137366Z" level=warning msg="container event discarded" container=1b7ce64e8d512483d94ad8bcbfdd8737891b0ae54cda931a08bf14834451eae4 type=CONTAINER_CREATED_EVENT Aug 13 00:39:13.159533 containerd[1908]: time="2025-08-13T00:39:13.159158936Z" level=warning msg="container event discarded" container=6f7ac4966d318b42daf19e83eebffd97d06fec5bb94d0c21f94b03f47b4663df type=CONTAINER_CREATED_EVENT Aug 13 00:39:13.231547 containerd[1908]: time="2025-08-13T00:39:13.231483757Z" level=warning msg="container event discarded" container=1b7ce64e8d512483d94ad8bcbfdd8737891b0ae54cda931a08bf14834451eae4 type=CONTAINER_STARTED_EVENT Aug 13 00:39:13.231547 containerd[1908]: time="2025-08-13T00:39:13.231514738Z" level=warning msg="container event discarded" container=6f7ac4966d318b42daf19e83eebffd97d06fec5bb94d0c21f94b03f47b4663df type=CONTAINER_STARTED_EVENT Aug 13 00:39:13.231547 containerd[1908]: time="2025-08-13T00:39:13.231522712Z" level=warning msg="container event discarded" container=7ee79acb85ce47824a4d787924f4e76b4df8968a53dce12a0732dcf1424da19d type=CONTAINER_STARTED_EVENT Aug 13 00:39:16.329881 containerd[1908]: time="2025-08-13T00:39:16.329850096Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5edb60c85d9a8922fea3e568a58e03913f9f1226b814e866ea29c21abc42097c\" id:\"47e7ca24aa60e04214f2929ef438f1cdd985ab7b759d1d6706ad347be3ffc7f5\" 
pid:7433 exited_at:{seconds:1755045556 nanos:329582473}" Aug 13 00:39:23.705226 containerd[1908]: time="2025-08-13T00:39:23.705037011Z" level=warning msg="container event discarded" container=e6cc8541df7282a3fe7ed20ed5b6b37fd212e82a649d36dd23e203b7d4d391bf type=CONTAINER_CREATED_EVENT Aug 13 00:39:23.705226 containerd[1908]: time="2025-08-13T00:39:23.705192180Z" level=warning msg="container event discarded" container=e6cc8541df7282a3fe7ed20ed5b6b37fd212e82a649d36dd23e203b7d4d391bf type=CONTAINER_STARTED_EVENT Aug 13 00:39:23.705226 containerd[1908]: time="2025-08-13T00:39:23.705224228Z" level=warning msg="container event discarded" container=e4a5da0ec5c1136e3e773685660216e61492d9bc5372006e4b982ec9e9aa23d9 type=CONTAINER_CREATED_EVENT Aug 13 00:39:23.759762 containerd[1908]: time="2025-08-13T00:39:23.759650007Z" level=warning msg="container event discarded" container=e4a5da0ec5c1136e3e773685660216e61492d9bc5372006e4b982ec9e9aa23d9 type=CONTAINER_STARTED_EVENT Aug 13 00:39:23.879360 containerd[1908]: time="2025-08-13T00:39:23.879243763Z" level=warning msg="container event discarded" container=c5230833e5845e09193076f84713db1f9394b5de4c0e434094f9b6c968842a9f type=CONTAINER_CREATED_EVENT Aug 13 00:39:23.879360 containerd[1908]: time="2025-08-13T00:39:23.879339340Z" level=warning msg="container event discarded" container=c5230833e5845e09193076f84713db1f9394b5de4c0e434094f9b6c968842a9f type=CONTAINER_STARTED_EVENT Aug 13 00:39:25.506563 containerd[1908]: time="2025-08-13T00:39:25.506408667Z" level=warning msg="container event discarded" container=0febd0938320f2e0e74e36b40e0f23008e301336e8d8ea59126582e21f129154 type=CONTAINER_CREATED_EVENT Aug 13 00:39:25.557157 containerd[1908]: time="2025-08-13T00:39:25.556992455Z" level=warning msg="container event discarded" container=0febd0938320f2e0e74e36b40e0f23008e301336e8d8ea59126582e21f129154 type=CONTAINER_STARTED_EVENT Aug 13 00:39:28.315692 containerd[1908]: time="2025-08-13T00:39:28.315667148Z" level=info msg="TaskExit event 
in podsandbox handler container_id:\"fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57\" id:\"72445c6bc12e7106bf64d4b0515d44255ceed5b3fb0d56520c23be460ab0c63b\" pid:7490 exited_at:{seconds:1755045568 nanos:315469489}" Aug 13 00:39:32.782333 containerd[1908]: time="2025-08-13T00:39:32.782147344Z" level=warning msg="container event discarded" container=541e8042e0abd845bad097dc51b68cf1cbcc37925ead7fe84f1a0b0248bf88cf type=CONTAINER_CREATED_EVENT Aug 13 00:39:32.782333 containerd[1908]: time="2025-08-13T00:39:32.782314822Z" level=warning msg="container event discarded" container=541e8042e0abd845bad097dc51b68cf1cbcc37925ead7fe84f1a0b0248bf88cf type=CONTAINER_STARTED_EVENT Aug 13 00:39:33.087390 containerd[1908]: time="2025-08-13T00:39:33.087075597Z" level=warning msg="container event discarded" container=44c300e3a755b2cb010880f3b8cc0c13c9785fe18af25777a084a781483ca21d type=CONTAINER_CREATED_EVENT Aug 13 00:39:33.087390 containerd[1908]: time="2025-08-13T00:39:33.087222922Z" level=warning msg="container event discarded" container=44c300e3a755b2cb010880f3b8cc0c13c9785fe18af25777a084a781483ca21d type=CONTAINER_STARTED_EVENT Aug 13 00:39:33.350026 containerd[1908]: time="2025-08-13T00:39:33.349948695Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574\" id:\"987cac696e48832d09126bafba0b5a9c7809cdfee8cc5d7e7bc41792bdf3057a\" pid:7537 exited_at:{seconds:1755045573 nanos:349791875}" Aug 13 00:39:33.368708 containerd[1908]: time="2025-08-13T00:39:33.368630863Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57\" id:\"de1b8f7925c86e4f78352b46e123855ff1dbabf5c0e568fea18156a7108c088a\" pid:7549 exited_at:{seconds:1755045573 nanos:368220766}" Aug 13 00:39:34.542720 containerd[1908]: time="2025-08-13T00:39:34.542576934Z" level=warning msg="container event discarded" 
container=02d1fc216b606c6d4a36af771f120ee2dcd6ca9337a3fc18c0cd5d3e21c9f073 type=CONTAINER_CREATED_EVENT Aug 13 00:39:34.599067 containerd[1908]: time="2025-08-13T00:39:34.598890058Z" level=warning msg="container event discarded" container=02d1fc216b606c6d4a36af771f120ee2dcd6ca9337a3fc18c0cd5d3e21c9f073 type=CONTAINER_STARTED_EVENT Aug 13 00:39:35.934561 containerd[1908]: time="2025-08-13T00:39:35.934403689Z" level=warning msg="container event discarded" container=08155112aa8e31ba578297fd38deb5317e417b40b2d393a5d05361b4aec3c850 type=CONTAINER_CREATED_EVENT Aug 13 00:39:35.978978 containerd[1908]: time="2025-08-13T00:39:35.978835902Z" level=warning msg="container event discarded" container=08155112aa8e31ba578297fd38deb5317e417b40b2d393a5d05361b4aec3c850 type=CONTAINER_STARTED_EVENT Aug 13 00:39:37.055565 containerd[1908]: time="2025-08-13T00:39:37.055415440Z" level=warning msg="container event discarded" container=08155112aa8e31ba578297fd38deb5317e417b40b2d393a5d05361b4aec3c850 type=CONTAINER_STOPPED_EVENT Aug 13 00:39:39.525369 containerd[1908]: time="2025-08-13T00:39:39.525169158Z" level=warning msg="container event discarded" container=2e07b3e31932fddc3cd079b9f765421211e418ef0a795448d9b3f6cdc348e62e type=CONTAINER_CREATED_EVENT Aug 13 00:39:39.559829 containerd[1908]: time="2025-08-13T00:39:39.559705771Z" level=warning msg="container event discarded" container=2e07b3e31932fddc3cd079b9f765421211e418ef0a795448d9b3f6cdc348e62e type=CONTAINER_STARTED_EVENT Aug 13 00:39:40.580104 containerd[1908]: time="2025-08-13T00:39:40.579941284Z" level=warning msg="container event discarded" container=2e07b3e31932fddc3cd079b9f765421211e418ef0a795448d9b3f6cdc348e62e type=CONTAINER_STOPPED_EVENT Aug 13 00:39:43.752189 containerd[1908]: time="2025-08-13T00:39:43.752120345Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574\" id:\"0d7956963d34b402a7a928930334792d115e7befa8d660b760cd321c6897130b\" 
pid:7587 exited_at:{seconds:1755045583 nanos:752011132}" Aug 13 00:39:44.643069 containerd[1908]: time="2025-08-13T00:39:44.642865061Z" level=warning msg="container event discarded" container=5edb60c85d9a8922fea3e568a58e03913f9f1226b814e866ea29c21abc42097c type=CONTAINER_CREATED_EVENT Aug 13 00:39:44.682683 containerd[1908]: time="2025-08-13T00:39:44.682530369Z" level=warning msg="container event discarded" container=5edb60c85d9a8922fea3e568a58e03913f9f1226b814e866ea29c21abc42097c type=CONTAINER_STARTED_EVENT Aug 13 00:39:45.743631 containerd[1908]: time="2025-08-13T00:39:45.743451298Z" level=warning msg="container event discarded" container=01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64 type=CONTAINER_CREATED_EVENT Aug 13 00:39:45.743631 containerd[1908]: time="2025-08-13T00:39:45.743578096Z" level=warning msg="container event discarded" container=01d969f550dc77ec4e3269e0cb4fdf61b1e3edaf0162548d11a6046e77e06d64 type=CONTAINER_STARTED_EVENT Aug 13 00:39:46.282015 containerd[1908]: time="2025-08-13T00:39:46.281951831Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5edb60c85d9a8922fea3e568a58e03913f9f1226b814e866ea29c21abc42097c\" id:\"c28e23d311f6eda4d5bb6bcc0feeb476019cdcc56e8f489105110fd8cca91735\" pid:7608 exited_at:{seconds:1755045586 nanos:281691982}" Aug 13 00:39:47.135820 containerd[1908]: time="2025-08-13T00:39:47.135691212Z" level=warning msg="container event discarded" container=49e737dd843a015bd00f02f9569dcaae2fb6eff10a61be3b6446c667966ad82c type=CONTAINER_CREATED_EVENT Aug 13 00:39:47.196713 containerd[1908]: time="2025-08-13T00:39:47.196623374Z" level=warning msg="container event discarded" container=49e737dd843a015bd00f02f9569dcaae2fb6eff10a61be3b6446c667966ad82c type=CONTAINER_STARTED_EVENT Aug 13 00:39:49.173327 containerd[1908]: time="2025-08-13T00:39:49.173198420Z" level=warning msg="container event discarded" container=001d5e5e33c33470603b63af001e4f27f73f037198594c9ab26a48c25f4c8d31 
type=CONTAINER_CREATED_EVENT
Aug 13 00:39:49.222604 containerd[1908]: time="2025-08-13T00:39:49.222546027Z" level=warning msg="container event discarded" container=001d5e5e33c33470603b63af001e4f27f73f037198594c9ab26a48c25f4c8d31 type=CONTAINER_STARTED_EVENT
Aug 13 00:39:53.230412 containerd[1908]: time="2025-08-13T00:39:53.230257445Z" level=warning msg="container event discarded" container=3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716 type=CONTAINER_CREATED_EVENT
Aug 13 00:39:53.230412 containerd[1908]: time="2025-08-13T00:39:53.230386800Z" level=warning msg="container event discarded" container=3fc3eb175b6825c5098206308317b5ca6ca36c7b8b93b482f48e3f356c2ff716 type=CONTAINER_STARTED_EVENT
Aug 13 00:39:53.371452 containerd[1908]: time="2025-08-13T00:39:53.371344239Z" level=warning msg="container event discarded" container=8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320 type=CONTAINER_CREATED_EVENT
Aug 13 00:39:53.371452 containerd[1908]: time="2025-08-13T00:39:53.371428867Z" level=warning msg="container event discarded" container=8ca9a32b1bee9803cf6c806b7d95dc15aa31784b8dd310ad720aa6fff8681320 type=CONTAINER_STARTED_EVENT
Aug 13 00:39:53.472066 containerd[1908]: time="2025-08-13T00:39:53.471924346Z" level=warning msg="container event discarded" container=8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4 type=CONTAINER_CREATED_EVENT
Aug 13 00:39:53.472066 containerd[1908]: time="2025-08-13T00:39:53.472037076Z" level=warning msg="container event discarded" container=8199ef500064dd4f7bfcfc8390e40c91aa06253f6ba3b3f2a698b23c243f4ec4 type=CONTAINER_STARTED_EVENT
Aug 13 00:39:53.472066 containerd[1908]: time="2025-08-13T00:39:53.472070689Z" level=warning msg="container event discarded" container=c1554f1be4e2c9638ceae8928fdfe6832e2f75516af2027302e99b6078772015 type=CONTAINER_CREATED_EVENT
Aug 13 00:39:53.509583 containerd[1908]: time="2025-08-13T00:39:53.509351001Z" level=warning msg="container event discarded" container=c1554f1be4e2c9638ceae8928fdfe6832e2f75516af2027302e99b6078772015 type=CONTAINER_STARTED_EVENT
Aug 13 00:39:54.686652 containerd[1908]: time="2025-08-13T00:39:54.686503579Z" level=warning msg="container event discarded" container=6bce9b05590ef8329f6a718ff8d607a5de3a94617cdd442213f4d674eb9e049f type=CONTAINER_CREATED_EVENT
Aug 13 00:39:54.765196 containerd[1908]: time="2025-08-13T00:39:54.765020285Z" level=warning msg="container event discarded" container=6bce9b05590ef8329f6a718ff8d607a5de3a94617cdd442213f4d674eb9e049f type=CONTAINER_STARTED_EVENT
Aug 13 00:39:55.254408 containerd[1908]: time="2025-08-13T00:39:55.254256441Z" level=warning msg="container event discarded" container=c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909 type=CONTAINER_CREATED_EVENT
Aug 13 00:39:55.254408 containerd[1908]: time="2025-08-13T00:39:55.254351111Z" level=warning msg="container event discarded" container=c8aef0604c9e587ada9f5285446d77dbc9e3d1b05b51459b50195a488ef8f909 type=CONTAINER_STARTED_EVENT
Aug 13 00:39:55.254408 containerd[1908]: time="2025-08-13T00:39:55.254381985Z" level=warning msg="container event discarded" container=08d0e36d4a22f1960b42bc5f8ee4ff13a7a751d89be499d6508df3916231eef6 type=CONTAINER_CREATED_EVENT
Aug 13 00:39:55.291062 containerd[1908]: time="2025-08-13T00:39:55.290920366Z" level=warning msg="container event discarded" container=08d0e36d4a22f1960b42bc5f8ee4ff13a7a751d89be499d6508df3916231eef6 type=CONTAINER_STARTED_EVENT
Aug 13 00:39:55.345529 containerd[1908]: time="2025-08-13T00:39:55.345378366Z" level=warning msg="container event discarded" container=d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3 type=CONTAINER_CREATED_EVENT
Aug 13 00:39:55.345529 containerd[1908]: time="2025-08-13T00:39:55.345469304Z" level=warning msg="container event discarded" container=d33397af391a50041fdd4cf91edfc0e0f235cd6d0f68cb6e2bd1cd13fa0adbc3 type=CONTAINER_STARTED_EVENT
Aug 13 00:39:55.463166 containerd[1908]: time="2025-08-13T00:39:55.463062538Z" level=warning msg="container event discarded" container=1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1 type=CONTAINER_CREATED_EVENT
Aug 13 00:39:55.463166 containerd[1908]: time="2025-08-13T00:39:55.463150791Z" level=warning msg="container event discarded" container=1cc5c3cca48b3bd9bdb9917a103bac28756171336aa9a58aff661e8226516ec1 type=CONTAINER_STARTED_EVENT
Aug 13 00:39:56.362550 containerd[1908]: time="2025-08-13T00:39:56.362438087Z" level=warning msg="container event discarded" container=eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f type=CONTAINER_CREATED_EVENT
Aug 13 00:39:56.362550 containerd[1908]: time="2025-08-13T00:39:56.362527072Z" level=warning msg="container event discarded" container=eb83ae772aaed984d2eb912c9289c44055ec0956f4f2e9266e10964b4d86d31f type=CONTAINER_STARTED_EVENT
Aug 13 00:39:56.803437 containerd[1908]: time="2025-08-13T00:39:56.803298080Z" level=warning msg="container event discarded" container=fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57 type=CONTAINER_CREATED_EVENT
Aug 13 00:39:56.847834 containerd[1908]: time="2025-08-13T00:39:56.847719714Z" level=warning msg="container event discarded" container=fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57 type=CONTAINER_STARTED_EVENT
Aug 13 00:39:58.285573 containerd[1908]: time="2025-08-13T00:39:58.285450249Z" level=warning msg="container event discarded" container=d28ad9b7add56e68ce2268239df5bf84d86cfe4435d10ae35383c008614292ca type=CONTAINER_CREATED_EVENT
Aug 13 00:39:58.322891 containerd[1908]: time="2025-08-13T00:39:58.322827135Z" level=warning msg="container event discarded" container=d28ad9b7add56e68ce2268239df5bf84d86cfe4435d10ae35383c008614292ca type=CONTAINER_STARTED_EVENT
Aug 13 00:39:58.365493 containerd[1908]: time="2025-08-13T00:39:58.365439500Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57\" id:\"5b14d71099a2b252003a983a50d525224af25479be88fabddc6bd066a008508d\" pid:7645 exited_at:{seconds:1755045598 nanos:365274191}"
Aug 13 00:40:00.123099 containerd[1908]: time="2025-08-13T00:40:00.122946024Z" level=warning msg="container event discarded" container=c050e1a18502210ce88ece43c0aa532f06798c88cc22b0d95aaa63d8c14bda96 type=CONTAINER_CREATED_EVENT
Aug 13 00:40:00.170738 containerd[1908]: time="2025-08-13T00:40:00.170584438Z" level=warning msg="container event discarded" container=c050e1a18502210ce88ece43c0aa532f06798c88cc22b0d95aaa63d8c14bda96 type=CONTAINER_STARTED_EVENT
Aug 13 00:40:02.810496 containerd[1908]: time="2025-08-13T00:40:02.810252792Z" level=warning msg="container event discarded" container=bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574 type=CONTAINER_CREATED_EVENT
Aug 13 00:40:02.855029 containerd[1908]: time="2025-08-13T00:40:02.854866657Z" level=warning msg="container event discarded" container=bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574 type=CONTAINER_STARTED_EVENT
Aug 13 00:40:03.248500 containerd[1908]: time="2025-08-13T00:40:03.248336709Z" level=warning msg="container event discarded" container=ade0c810561ef4366b61596837f2f0d7001c4a84c8622bd36f630e1b49be05a4 type=CONTAINER_CREATED_EVENT
Aug 13 00:40:03.303467 containerd[1908]: time="2025-08-13T00:40:03.303312752Z" level=warning msg="container event discarded" container=ade0c810561ef4366b61596837f2f0d7001c4a84c8622bd36f630e1b49be05a4 type=CONTAINER_STARTED_EVENT
Aug 13 00:40:03.335206 containerd[1908]: time="2025-08-13T00:40:03.335170755Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574\" id:\"33d7ce0b3cbfebe9573ea602e6c280899b006c546187ba11200aca33d4927173\" pid:7678 exited_at:{seconds:1755045603 nanos:335032884}"
Aug 13 00:40:16.279467 containerd[1908]: time="2025-08-13T00:40:16.279425499Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5edb60c85d9a8922fea3e568a58e03913f9f1226b814e866ea29c21abc42097c\" id:\"bc3da772641d76591ce856108d297b11be6f0bb127af175b1656a49158e453fb\" pid:7707 exited_at:{seconds:1755045616 nanos:279139641}"
Aug 13 00:40:28.317485 containerd[1908]: time="2025-08-13T00:40:28.317445537Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57\" id:\"8f27a92f1fcb8fc25e025b42e6f2fd4505c0a00c085762034e6045756268d21f\" pid:7746 exited_at:{seconds:1755045628 nanos:317207894}"
Aug 13 00:40:33.299373 containerd[1908]: time="2025-08-13T00:40:33.299344861Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574\" id:\"7d1852565ccc574524749a8f180265e8ed867575080305ccd39a6d94f17111be\" pid:7786 exited_at:{seconds:1755045633 nanos:299227369}"
Aug 13 00:40:33.315637 containerd[1908]: time="2025-08-13T00:40:33.315615143Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57\" id:\"cc925c75dbc0112f2262416cb0ea28adfda173b38d0b238f2c0b51ebf1cfb3de\" pid:7797 exited_at:{seconds:1755045633 nanos:315365469}"
Aug 13 00:40:43.594890 systemd[1]: Started sshd@9-147.75.71.77:22-139.178.89.65:48454.service - OpenSSH per-connection server daemon (139.178.89.65:48454).
Aug 13 00:40:43.690245 sshd[7827]: Accepted publickey for core from 139.178.89.65 port 48454 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 00:40:43.690958 sshd-session[7827]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:40:43.694081 systemd-logind[1890]: New session 12 of user core.
Aug 13 00:40:43.697500 containerd[1908]: time="2025-08-13T00:40:43.697480281Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574\" id:\"9ded52c4998e9fbf6b577218a3c5b223fb0d382e0c5629f0e6bf1e6bfb23285f\" pid:7840 exited_at:{seconds:1755045643 nanos:697376440}"
Aug 13 00:40:43.707303 systemd[1]: Started session-12.scope - Session 12 of User core.
Aug 13 00:40:43.801359 sshd[7850]: Connection closed by 139.178.89.65 port 48454
Aug 13 00:40:43.801508 sshd-session[7827]: pam_unix(sshd:session): session closed for user core
Aug 13 00:40:43.803646 systemd[1]: sshd@9-147.75.71.77:22-139.178.89.65:48454.service: Deactivated successfully.
Aug 13 00:40:43.804544 systemd[1]: session-12.scope: Deactivated successfully.
Aug 13 00:40:43.805042 systemd-logind[1890]: Session 12 logged out. Waiting for processes to exit.
Aug 13 00:40:43.805868 systemd-logind[1890]: Removed session 12.
Aug 13 00:40:46.280977 containerd[1908]: time="2025-08-13T00:40:46.280934257Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5edb60c85d9a8922fea3e568a58e03913f9f1226b814e866ea29c21abc42097c\" id:\"0345fbd63b7f1c10b4c1f6bc40617fa672a2174be608b4e79da181a608ac0f01\" pid:7895 exited_at:{seconds:1755045646 nanos:280689437}"
Aug 13 00:40:48.818885 systemd[1]: Started sshd@10-147.75.71.77:22-139.178.89.65:48462.service - OpenSSH per-connection server daemon (139.178.89.65:48462).
Aug 13 00:40:48.888639 sshd[7919]: Accepted publickey for core from 139.178.89.65 port 48462 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 00:40:48.889655 sshd-session[7919]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:40:48.893401 systemd-logind[1890]: New session 13 of user core.
Aug 13 00:40:48.907353 systemd[1]: Started session-13.scope - Session 13 of User core.
Aug 13 00:40:49.052885 sshd[7921]: Connection closed by 139.178.89.65 port 48462
Aug 13 00:40:49.053059 sshd-session[7919]: pam_unix(sshd:session): session closed for user core
Aug 13 00:40:49.054776 systemd[1]: sshd@10-147.75.71.77:22-139.178.89.65:48462.service: Deactivated successfully.
Aug 13 00:40:49.055831 systemd[1]: session-13.scope: Deactivated successfully.
Aug 13 00:40:49.056767 systemd-logind[1890]: Session 13 logged out. Waiting for processes to exit.
Aug 13 00:40:49.057421 systemd-logind[1890]: Removed session 13.
Aug 13 00:40:54.069885 systemd[1]: Started sshd@11-147.75.71.77:22-139.178.89.65:36660.service - OpenSSH per-connection server daemon (139.178.89.65:36660).
Aug 13 00:40:54.102248 sshd[7952]: Accepted publickey for core from 139.178.89.65 port 36660 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 00:40:54.102830 sshd-session[7952]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:40:54.105575 systemd-logind[1890]: New session 14 of user core.
Aug 13 00:40:54.121449 systemd[1]: Started session-14.scope - Session 14 of User core.
Aug 13 00:40:54.205773 sshd[7954]: Connection closed by 139.178.89.65 port 36660
Aug 13 00:40:54.205943 sshd-session[7952]: pam_unix(sshd:session): session closed for user core
Aug 13 00:40:54.217560 systemd[1]: sshd@11-147.75.71.77:22-139.178.89.65:36660.service: Deactivated successfully.
Aug 13 00:40:54.218508 systemd[1]: session-14.scope: Deactivated successfully.
Aug 13 00:40:54.219025 systemd-logind[1890]: Session 14 logged out. Waiting for processes to exit.
Aug 13 00:40:54.220624 systemd[1]: Started sshd@12-147.75.71.77:22-139.178.89.65:36676.service - OpenSSH per-connection server daemon (139.178.89.65:36676).
Aug 13 00:40:54.221114 systemd-logind[1890]: Removed session 14.
Aug 13 00:40:54.265318 sshd[7980]: Accepted publickey for core from 139.178.89.65 port 36676 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 00:40:54.266154 sshd-session[7980]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:40:54.269548 systemd-logind[1890]: New session 15 of user core.
Aug 13 00:40:54.282346 systemd[1]: Started session-15.scope - Session 15 of User core.
Aug 13 00:40:54.447126 sshd[7982]: Connection closed by 139.178.89.65 port 36676
Aug 13 00:40:54.447298 sshd-session[7980]: pam_unix(sshd:session): session closed for user core
Aug 13 00:40:54.466424 systemd[1]: sshd@12-147.75.71.77:22-139.178.89.65:36676.service: Deactivated successfully.
Aug 13 00:40:54.467370 systemd[1]: session-15.scope: Deactivated successfully.
Aug 13 00:40:54.467899 systemd-logind[1890]: Session 15 logged out. Waiting for processes to exit.
Aug 13 00:40:54.469082 systemd[1]: Started sshd@13-147.75.71.77:22-139.178.89.65:36678.service - OpenSSH per-connection server daemon (139.178.89.65:36678).
Aug 13 00:40:54.469703 systemd-logind[1890]: Removed session 15.
Aug 13 00:40:54.505441 sshd[8005]: Accepted publickey for core from 139.178.89.65 port 36678 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 00:40:54.506172 sshd-session[8005]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:40:54.509363 systemd-logind[1890]: New session 16 of user core.
Aug 13 00:40:54.524397 systemd[1]: Started session-16.scope - Session 16 of User core.
Aug 13 00:40:54.611592 sshd[8009]: Connection closed by 139.178.89.65 port 36678
Aug 13 00:40:54.611773 sshd-session[8005]: pam_unix(sshd:session): session closed for user core
Aug 13 00:40:54.613566 systemd[1]: sshd@13-147.75.71.77:22-139.178.89.65:36678.service: Deactivated successfully.
Aug 13 00:40:54.614601 systemd[1]: session-16.scope: Deactivated successfully.
Aug 13 00:40:54.615617 systemd-logind[1890]: Session 16 logged out. Waiting for processes to exit.
Aug 13 00:40:54.616201 systemd-logind[1890]: Removed session 16.
Aug 13 00:40:58.304149 containerd[1908]: time="2025-08-13T00:40:58.304126092Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57\" id:\"3b4c13e1b71a40a5a6cc705bdaff167ec82d38b64220b72f57ce5b96185691d4\" pid:8045 exited_at:{seconds:1755045658 nanos:303880428}"
Aug 13 00:40:59.634912 systemd[1]: Started sshd@14-147.75.71.77:22-139.178.89.65:34696.service - OpenSSH per-connection server daemon (139.178.89.65:34696).
Aug 13 00:40:59.677633 sshd[8072]: Accepted publickey for core from 139.178.89.65 port 34696 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 00:40:59.678268 sshd-session[8072]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:40:59.681011 systemd-logind[1890]: New session 17 of user core.
Aug 13 00:40:59.690476 systemd[1]: Started session-17.scope - Session 17 of User core.
Aug 13 00:40:59.777630 sshd[8074]: Connection closed by 139.178.89.65 port 34696
Aug 13 00:40:59.777792 sshd-session[8072]: pam_unix(sshd:session): session closed for user core
Aug 13 00:40:59.779582 systemd[1]: sshd@14-147.75.71.77:22-139.178.89.65:34696.service: Deactivated successfully.
Aug 13 00:40:59.780568 systemd[1]: session-17.scope: Deactivated successfully.
Aug 13 00:40:59.781649 systemd-logind[1890]: Session 17 logged out. Waiting for processes to exit.
Aug 13 00:40:59.782295 systemd-logind[1890]: Removed session 17.
Aug 13 00:41:03.296277 containerd[1908]: time="2025-08-13T00:41:03.296249650Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bbdea5bbb12ba29dadeab2ecc74f499d8cd968abe92fa521b7f386dd8eb37574\" id:\"80dd25424712f2257c54fb58180027c7957eb7802ad892eaca7662737276499b\" pid:8110 exited_at:{seconds:1755045663 nanos:296136129}"
Aug 13 00:41:04.797828 systemd[1]: Started sshd@15-147.75.71.77:22-139.178.89.65:34710.service - OpenSSH per-connection server daemon (139.178.89.65:34710).
Aug 13 00:41:04.835644 sshd[8138]: Accepted publickey for core from 139.178.89.65 port 34710 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 00:41:04.836313 sshd-session[8138]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:41:04.839010 systemd-logind[1890]: New session 18 of user core.
Aug 13 00:41:04.849431 systemd[1]: Started session-18.scope - Session 18 of User core.
Aug 13 00:41:04.933115 sshd[8140]: Connection closed by 139.178.89.65 port 34710
Aug 13 00:41:04.933283 sshd-session[8138]: pam_unix(sshd:session): session closed for user core
Aug 13 00:41:04.935047 systemd[1]: sshd@15-147.75.71.77:22-139.178.89.65:34710.service: Deactivated successfully.
Aug 13 00:41:04.936015 systemd[1]: session-18.scope: Deactivated successfully.
Aug 13 00:41:04.936745 systemd-logind[1890]: Session 18 logged out. Waiting for processes to exit.
Aug 13 00:41:04.937371 systemd-logind[1890]: Removed session 18.
Aug 13 00:41:09.960902 systemd[1]: Started sshd@16-147.75.71.77:22-139.178.89.65:36392.service - OpenSSH per-connection server daemon (139.178.89.65:36392).
Aug 13 00:41:10.009283 sshd[8171]: Accepted publickey for core from 139.178.89.65 port 36392 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 00:41:10.009836 sshd-session[8171]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:41:10.012591 systemd-logind[1890]: New session 19 of user core.
Aug 13 00:41:10.025472 systemd[1]: Started session-19.scope - Session 19 of User core.
Aug 13 00:41:10.112505 sshd[8173]: Connection closed by 139.178.89.65 port 36392
Aug 13 00:41:10.112699 sshd-session[8171]: pam_unix(sshd:session): session closed for user core
Aug 13 00:41:10.114222 systemd[1]: sshd@16-147.75.71.77:22-139.178.89.65:36392.service: Deactivated successfully.
Aug 13 00:41:10.115229 systemd[1]: session-19.scope: Deactivated successfully.
Aug 13 00:41:10.115891 systemd-logind[1890]: Session 19 logged out. Waiting for processes to exit.
Aug 13 00:41:10.116495 systemd-logind[1890]: Removed session 19.
Aug 13 00:41:15.141934 systemd[1]: Started sshd@17-147.75.71.77:22-139.178.89.65:36406.service - OpenSSH per-connection server daemon (139.178.89.65:36406).
Aug 13 00:41:15.195049 sshd[8198]: Accepted publickey for core from 139.178.89.65 port 36406 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 00:41:15.195944 sshd-session[8198]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:41:15.199052 systemd-logind[1890]: New session 20 of user core.
Aug 13 00:41:15.209430 systemd[1]: Started session-20.scope - Session 20 of User core.
Aug 13 00:41:15.354943 sshd[8200]: Connection closed by 139.178.89.65 port 36406
Aug 13 00:41:15.355260 sshd-session[8198]: pam_unix(sshd:session): session closed for user core
Aug 13 00:41:15.369164 systemd[1]: sshd@17-147.75.71.77:22-139.178.89.65:36406.service: Deactivated successfully.
Aug 13 00:41:15.371040 systemd[1]: session-20.scope: Deactivated successfully.
Aug 13 00:41:15.372095 systemd-logind[1890]: Session 20 logged out. Waiting for processes to exit.
Aug 13 00:41:15.374946 systemd[1]: Started sshd@18-147.75.71.77:22-139.178.89.65:36420.service - OpenSSH per-connection server daemon (139.178.89.65:36420).
Aug 13 00:41:15.375986 systemd-logind[1890]: Removed session 20.
Aug 13 00:41:15.461124 sshd[8225]: Accepted publickey for core from 139.178.89.65 port 36420 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 00:41:15.461991 sshd-session[8225]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:41:15.465418 systemd-logind[1890]: New session 21 of user core.
Aug 13 00:41:15.481326 systemd[1]: Started session-21.scope - Session 21 of User core.
Aug 13 00:41:15.617337 sshd[8229]: Connection closed by 139.178.89.65 port 36420
Aug 13 00:41:15.617518 sshd-session[8225]: pam_unix(sshd:session): session closed for user core
Aug 13 00:41:15.640993 systemd[1]: sshd@18-147.75.71.77:22-139.178.89.65:36420.service: Deactivated successfully.
Aug 13 00:41:15.642158 systemd[1]: session-21.scope: Deactivated successfully.
Aug 13 00:41:15.642751 systemd-logind[1890]: Session 21 logged out. Waiting for processes to exit.
Aug 13 00:41:15.644593 systemd[1]: Started sshd@19-147.75.71.77:22-139.178.89.65:36434.service - OpenSSH per-connection server daemon (139.178.89.65:36434).
Aug 13 00:41:15.645192 systemd-logind[1890]: Removed session 21.
Aug 13 00:41:15.705696 sshd[8251]: Accepted publickey for core from 139.178.89.65 port 36434 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 00:41:15.706738 sshd-session[8251]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:41:15.710863 systemd-logind[1890]: New session 22 of user core.
Aug 13 00:41:15.734336 systemd[1]: Started session-22.scope - Session 22 of User core.
Aug 13 00:41:16.272749 containerd[1908]: time="2025-08-13T00:41:16.272720887Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5edb60c85d9a8922fea3e568a58e03913f9f1226b814e866ea29c21abc42097c\" id:\"ee10debb527cbce44330ae7ac1cdc896749b576fb3da2a93ea4b6c0ebc3129ee\" pid:8284 exited_at:{seconds:1755045676 nanos:272517116}"
Aug 13 00:41:16.375060 sshd[8253]: Connection closed by 139.178.89.65 port 36434
Aug 13 00:41:16.375240 sshd-session[8251]: pam_unix(sshd:session): session closed for user core
Aug 13 00:41:16.388284 systemd[1]: sshd@19-147.75.71.77:22-139.178.89.65:36434.service: Deactivated successfully.
Aug 13 00:41:16.389112 systemd[1]: session-22.scope: Deactivated successfully.
Aug 13 00:41:16.389624 systemd-logind[1890]: Session 22 logged out. Waiting for processes to exit.
Aug 13 00:41:16.390818 systemd[1]: Started sshd@20-147.75.71.77:22-139.178.89.65:36448.service - OpenSSH per-connection server daemon (139.178.89.65:36448).
Aug 13 00:41:16.391202 systemd-logind[1890]: Removed session 22.
Aug 13 00:41:16.422687 sshd[8317]: Accepted publickey for core from 139.178.89.65 port 36448 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 00:41:16.423416 sshd-session[8317]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:41:16.426086 systemd-logind[1890]: New session 23 of user core.
Aug 13 00:41:16.437401 systemd[1]: Started session-23.scope - Session 23 of User core.
Aug 13 00:41:16.605207 sshd[8321]: Connection closed by 139.178.89.65 port 36448
Aug 13 00:41:16.605373 sshd-session[8317]: pam_unix(sshd:session): session closed for user core
Aug 13 00:41:16.617574 systemd[1]: sshd@20-147.75.71.77:22-139.178.89.65:36448.service: Deactivated successfully.
Aug 13 00:41:16.618554 systemd[1]: session-23.scope: Deactivated successfully.
Aug 13 00:41:16.619090 systemd-logind[1890]: Session 23 logged out. Waiting for processes to exit.
Aug 13 00:41:16.620618 systemd[1]: Started sshd@21-147.75.71.77:22-139.178.89.65:36452.service - OpenSSH per-connection server daemon (139.178.89.65:36452).
Aug 13 00:41:16.621101 systemd-logind[1890]: Removed session 23.
Aug 13 00:41:16.666260 sshd[8344]: Accepted publickey for core from 139.178.89.65 port 36452 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 00:41:16.667105 sshd-session[8344]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:41:16.670791 systemd-logind[1890]: New session 24 of user core.
Aug 13 00:41:16.691662 systemd[1]: Started session-24.scope - Session 24 of User core.
Aug 13 00:41:16.821798 sshd[8346]: Connection closed by 139.178.89.65 port 36452
Aug 13 00:41:16.821962 sshd-session[8344]: pam_unix(sshd:session): session closed for user core
Aug 13 00:41:16.823827 systemd[1]: sshd@21-147.75.71.77:22-139.178.89.65:36452.service: Deactivated successfully.
Aug 13 00:41:16.824686 systemd[1]: session-24.scope: Deactivated successfully.
Aug 13 00:41:16.825112 systemd-logind[1890]: Session 24 logged out. Waiting for processes to exit.
Aug 13 00:41:16.825693 systemd-logind[1890]: Removed session 24.
Aug 13 00:41:21.831886 systemd[1]: Started sshd@22-147.75.71.77:22-139.178.89.65:41260.service - OpenSSH per-connection server daemon (139.178.89.65:41260).
Aug 13 00:41:21.865235 sshd[8375]: Accepted publickey for core from 139.178.89.65 port 41260 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 00:41:21.866018 sshd-session[8375]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:41:21.869179 systemd-logind[1890]: New session 25 of user core.
Aug 13 00:41:21.877359 systemd[1]: Started session-25.scope - Session 25 of User core.
Aug 13 00:41:21.962574 sshd[8377]: Connection closed by 139.178.89.65 port 41260
Aug 13 00:41:21.962739 sshd-session[8375]: pam_unix(sshd:session): session closed for user core
Aug 13 00:41:21.965105 systemd[1]: sshd@22-147.75.71.77:22-139.178.89.65:41260.service: Deactivated successfully.
Aug 13 00:41:21.966367 systemd[1]: session-25.scope: Deactivated successfully.
Aug 13 00:41:21.966884 systemd-logind[1890]: Session 25 logged out. Waiting for processes to exit.
Aug 13 00:41:21.967474 systemd-logind[1890]: Removed session 25.
Aug 13 00:41:26.974957 systemd[1]: Started sshd@23-147.75.71.77:22-139.178.89.65:41270.service - OpenSSH per-connection server daemon (139.178.89.65:41270).
Aug 13 00:41:27.008532 sshd[8404]: Accepted publickey for core from 139.178.89.65 port 41270 ssh2: RSA SHA256:b1fWuA11n3ra6ahrkuNoMCBqpG1Qz8qLzuGLGhpcBDI
Aug 13 00:41:27.009129 sshd-session[8404]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:41:27.011983 systemd-logind[1890]: New session 26 of user core.
Aug 13 00:41:27.028316 systemd[1]: Started session-26.scope - Session 26 of User core.
Aug 13 00:41:27.177063 sshd[8406]: Connection closed by 139.178.89.65 port 41270
Aug 13 00:41:27.177289 sshd-session[8404]: pam_unix(sshd:session): session closed for user core
Aug 13 00:41:27.179174 systemd[1]: sshd@23-147.75.71.77:22-139.178.89.65:41270.service: Deactivated successfully.
Aug 13 00:41:27.180288 systemd[1]: session-26.scope: Deactivated successfully.
Aug 13 00:41:27.181080 systemd-logind[1890]: Session 26 logged out. Waiting for processes to exit.
Aug 13 00:41:27.181778 systemd-logind[1890]: Removed session 26.
Aug 13 00:41:28.323590 containerd[1908]: time="2025-08-13T00:41:28.323563862Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fe0d8f74e4f7a205a80da2afb6b9a1ddd631ab2a673d5d86a18d26da4a075e57\" id:\"fa8f6f358e654dff8f00bfbc488ed1d32d6a422d1c6ecb45d020a9f208960a1d\" pid:8443 exited_at:{seconds:1755045688 nanos:323308050}"