Sep 16 05:31:08.921306 kernel: Linux version 6.12.47-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue Sep 16 03:05:42 -00 2025 Sep 16 05:31:08.921355 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=0b876f86a632750e9937176808a48c2452d5168964273bcfc3c72f2a26140c06 Sep 16 05:31:08.921363 kernel: BIOS-provided physical RAM map: Sep 16 05:31:08.921368 kernel: BIOS-e820: [mem 0x0000000000000000-0x00000000000997ff] usable Sep 16 05:31:08.921372 kernel: BIOS-e820: [mem 0x0000000000099800-0x000000000009ffff] reserved Sep 16 05:31:08.921376 kernel: BIOS-e820: [mem 0x00000000000e0000-0x00000000000fffff] reserved Sep 16 05:31:08.921397 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000003fffffff] usable Sep 16 05:31:08.921401 kernel: BIOS-e820: [mem 0x0000000040000000-0x00000000403fffff] reserved Sep 16 05:31:08.921405 kernel: BIOS-e820: [mem 0x0000000040400000-0x0000000080fcefff] usable Sep 16 05:31:08.921410 kernel: BIOS-e820: [mem 0x0000000080fcf000-0x0000000080fcffff] ACPI NVS Sep 16 05:31:08.921414 kernel: BIOS-e820: [mem 0x0000000080fd0000-0x0000000080fd0fff] reserved Sep 16 05:31:08.921418 kernel: BIOS-e820: [mem 0x0000000080fd1000-0x000000008afc2fff] usable Sep 16 05:31:08.921422 kernel: BIOS-e820: [mem 0x000000008afc3000-0x000000008c0a7fff] reserved Sep 16 05:31:08.921426 kernel: BIOS-e820: [mem 0x000000008c0a8000-0x000000008c230fff] usable Sep 16 05:31:08.921431 kernel: BIOS-e820: [mem 0x000000008c231000-0x000000008c662fff] ACPI NVS Sep 16 05:31:08.921437 kernel: BIOS-e820: [mem 0x000000008c663000-0x000000008eefefff] reserved Sep 16 05:31:08.921441 kernel: BIOS-e820: [mem 0x000000008eeff000-0x000000008eefffff] usable Sep 16 05:31:08.921446 kernel: BIOS-e820: [mem 0x000000008ef00000-0x000000008fffffff] reserved Sep 16 05:31:08.921450 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved Sep 16 05:31:08.921455 kernel: BIOS-e820: [mem 0x00000000fe000000-0x00000000fe010fff] reserved Sep 16 05:31:08.921459 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec00fff] reserved Sep 16 05:31:08.921464 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved Sep 16 05:31:08.921468 kernel: BIOS-e820: [mem 0x00000000ff000000-0x00000000ffffffff] reserved Sep 16 05:31:08.921473 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000086effffff] usable Sep 16 05:31:08.921477 kernel: NX (Execute Disable) protection: active Sep 16 05:31:08.921482 kernel: APIC: Static calls initialized Sep 16 05:31:08.921487 kernel: SMBIOS 3.2.1 present. 
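The firmware memory map printed above can be cross-checked by summing the ranges the BIOS marks as usable, which comes to roughly 32 GiB and is consistent with the Memory: line reported later during boot. The following minimal Python sketch is illustrative only and is not part of the boot output; it assumes the journal text has been saved to a plain-text file, and the name boot.log is a placeholder.

# Illustrative only: tally the "usable" BIOS-e820 ranges printed above.
# "boot.log" is a placeholder path, not something the boot itself produces.
import re

E820_RE = re.compile(r"BIOS-e820: \[mem 0x([0-9a-f]+)-0x([0-9a-f]+)\] (\w+)")

def usable_bytes(log_text: str) -> int:
    total = 0
    for start, end, kind in E820_RE.findall(log_text):
        if kind == "usable":
            # e820 ranges are inclusive, hence the +1.
            total += int(end, 16) - int(start, 16) + 1
    return total

if __name__ == "__main__":
    with open("boot.log") as fh:
        text = fh.read()
    print(f"usable RAM reported by firmware: {usable_bytes(text) / 2**30:.2f} GiB")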
Sep 16 05:31:08.921492 kernel: DMI: Supermicro SYS-5019C-MR-PH004/X11SCM-F, BIOS 1.9 09/16/2022 Sep 16 05:31:08.921496 kernel: DMI: Memory slots populated: 2/4 Sep 16 05:31:08.921501 kernel: tsc: Detected 3400.000 MHz processor Sep 16 05:31:08.921506 kernel: tsc: Detected 3399.906 MHz TSC Sep 16 05:31:08.921510 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Sep 16 05:31:08.921515 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Sep 16 05:31:08.921520 kernel: last_pfn = 0x86f000 max_arch_pfn = 0x400000000 Sep 16 05:31:08.921525 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 23), built from 10 variable MTRRs Sep 16 05:31:08.921530 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Sep 16 05:31:08.921535 kernel: last_pfn = 0x8ef00 max_arch_pfn = 0x400000000 Sep 16 05:31:08.921540 kernel: Using GB pages for direct mapping Sep 16 05:31:08.921544 kernel: ACPI: Early table checksum verification disabled Sep 16 05:31:08.921549 kernel: ACPI: RSDP 0x00000000000F05B0 000024 (v02 SUPERM) Sep 16 05:31:08.921556 kernel: ACPI: XSDT 0x000000008C5440C8 00010C (v01 SUPERM SUPERM 01072009 AMI 00010013) Sep 16 05:31:08.921561 kernel: ACPI: FACP 0x000000008C580670 000114 (v06 01072009 AMI 00010013) Sep 16 05:31:08.921566 kernel: ACPI: DSDT 0x000000008C544268 03C404 (v02 SUPERM SMCI--MB 01072009 INTL 20160527) Sep 16 05:31:08.921572 kernel: ACPI: FACS 0x000000008C662F80 000040 Sep 16 05:31:08.921577 kernel: ACPI: APIC 0x000000008C580788 00012C (v04 01072009 AMI 00010013) Sep 16 05:31:08.921582 kernel: ACPI: FPDT 0x000000008C5808B8 000044 (v01 01072009 AMI 00010013) Sep 16 05:31:08.921586 kernel: ACPI: FIDT 0x000000008C580900 00009C (v01 SUPERM SMCI--MB 01072009 AMI 00010013) Sep 16 05:31:08.921591 kernel: ACPI: MCFG 0x000000008C5809A0 00003C (v01 SUPERM SMCI--MB 01072009 MSFT 00000097) Sep 16 05:31:08.921596 kernel: ACPI: SPMI 0x000000008C5809E0 000041 (v05 SUPERM SMCI--MB 00000000 AMI. 
00000000) Sep 16 05:31:08.921601 kernel: ACPI: SSDT 0x000000008C580A28 001B1C (v02 CpuRef CpuSsdt 00003000 INTL 20160527) Sep 16 05:31:08.921607 kernel: ACPI: SSDT 0x000000008C582548 0031C6 (v02 SaSsdt SaSsdt 00003000 INTL 20160527) Sep 16 05:31:08.921612 kernel: ACPI: SSDT 0x000000008C585710 00232B (v02 PegSsd PegSsdt 00001000 INTL 20160527) Sep 16 05:31:08.921617 kernel: ACPI: HPET 0x000000008C587A40 000038 (v01 SUPERM SMCI--MB 00000002 01000013) Sep 16 05:31:08.921622 kernel: ACPI: SSDT 0x000000008C587A78 000FAE (v02 SUPERM Ther_Rvp 00001000 INTL 20160527) Sep 16 05:31:08.921626 kernel: ACPI: SSDT 0x000000008C588A28 0008F4 (v02 INTEL xh_mossb 00000000 INTL 20160527) Sep 16 05:31:08.921631 kernel: ACPI: UEFI 0x000000008C589320 000042 (v01 SUPERM SMCI--MB 00000002 01000013) Sep 16 05:31:08.921636 kernel: ACPI: LPIT 0x000000008C589368 000094 (v01 SUPERM SMCI--MB 00000002 01000013) Sep 16 05:31:08.921641 kernel: ACPI: SSDT 0x000000008C589400 0027DE (v02 SUPERM PtidDevc 00001000 INTL 20160527) Sep 16 05:31:08.921647 kernel: ACPI: SSDT 0x000000008C58BBE0 0014E2 (v02 SUPERM TbtTypeC 00000000 INTL 20160527) Sep 16 05:31:08.921652 kernel: ACPI: DBGP 0x000000008C58D0C8 000034 (v01 SUPERM SMCI--MB 00000002 01000013) Sep 16 05:31:08.921657 kernel: ACPI: DBG2 0x000000008C58D100 000054 (v00 SUPERM SMCI--MB 00000002 01000013) Sep 16 05:31:08.921662 kernel: ACPI: SSDT 0x000000008C58D158 001B67 (v02 SUPERM UsbCTabl 00001000 INTL 20160527) Sep 16 05:31:08.921667 kernel: ACPI: DMAR 0x000000008C58ECC0 000070 (v01 INTEL EDK2 00000002 01000013) Sep 16 05:31:08.921671 kernel: ACPI: SSDT 0x000000008C58ED30 000144 (v02 Intel ADebTabl 00001000 INTL 20160527) Sep 16 05:31:08.921676 kernel: ACPI: TPM2 0x000000008C58EE78 000034 (v04 SUPERM SMCI--MB 00000001 AMI 00000000) Sep 16 05:31:08.921681 kernel: ACPI: SSDT 0x000000008C58EEB0 000D8F (v02 INTEL SpsNm 00000002 INTL 20160527) Sep 16 05:31:08.921686 kernel: ACPI: WSMT 0x000000008C58FC40 000028 (v01 SUPERM 01072009 AMI 00010013) Sep 16 05:31:08.921692 kernel: ACPI: EINJ 0x000000008C58FC68 000130 (v01 AMI AMI.EINJ 00000000 AMI. 00000000) Sep 16 05:31:08.921697 kernel: ACPI: ERST 0x000000008C58FD98 000230 (v01 AMIER AMI.ERST 00000000 AMI. 00000000) Sep 16 05:31:08.921702 kernel: ACPI: BERT 0x000000008C58FFC8 000030 (v01 AMI AMI.BERT 00000000 AMI. 00000000) Sep 16 05:31:08.921707 kernel: ACPI: HEST 0x000000008C58FFF8 00027C (v01 AMI AMI.HEST 00000000 AMI. 
00000000) Sep 16 05:31:08.921712 kernel: ACPI: SSDT 0x000000008C590278 000162 (v01 SUPERM SMCCDN 00000000 INTL 20181221) Sep 16 05:31:08.921716 kernel: ACPI: Reserving FACP table memory at [mem 0x8c580670-0x8c580783] Sep 16 05:31:08.921721 kernel: ACPI: Reserving DSDT table memory at [mem 0x8c544268-0x8c58066b] Sep 16 05:31:08.921726 kernel: ACPI: Reserving FACS table memory at [mem 0x8c662f80-0x8c662fbf] Sep 16 05:31:08.921731 kernel: ACPI: Reserving APIC table memory at [mem 0x8c580788-0x8c5808b3] Sep 16 05:31:08.921737 kernel: ACPI: Reserving FPDT table memory at [mem 0x8c5808b8-0x8c5808fb] Sep 16 05:31:08.921742 kernel: ACPI: Reserving FIDT table memory at [mem 0x8c580900-0x8c58099b] Sep 16 05:31:08.921747 kernel: ACPI: Reserving MCFG table memory at [mem 0x8c5809a0-0x8c5809db] Sep 16 05:31:08.921754 kernel: ACPI: Reserving SPMI table memory at [mem 0x8c5809e0-0x8c580a20] Sep 16 05:31:08.921759 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c580a28-0x8c582543] Sep 16 05:31:08.921764 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c582548-0x8c58570d] Sep 16 05:31:08.921768 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c585710-0x8c587a3a] Sep 16 05:31:08.921773 kernel: ACPI: Reserving HPET table memory at [mem 0x8c587a40-0x8c587a77] Sep 16 05:31:08.921778 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c587a78-0x8c588a25] Sep 16 05:31:08.921784 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c588a28-0x8c58931b] Sep 16 05:31:08.921789 kernel: ACPI: Reserving UEFI table memory at [mem 0x8c589320-0x8c589361] Sep 16 05:31:08.921794 kernel: ACPI: Reserving LPIT table memory at [mem 0x8c589368-0x8c5893fb] Sep 16 05:31:08.921799 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c589400-0x8c58bbdd] Sep 16 05:31:08.921803 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58bbe0-0x8c58d0c1] Sep 16 05:31:08.921808 kernel: ACPI: Reserving DBGP table memory at [mem 0x8c58d0c8-0x8c58d0fb] Sep 16 05:31:08.921813 kernel: ACPI: Reserving DBG2 table memory at [mem 0x8c58d100-0x8c58d153] Sep 16 05:31:08.921818 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58d158-0x8c58ecbe] Sep 16 05:31:08.921823 kernel: ACPI: Reserving DMAR table memory at [mem 0x8c58ecc0-0x8c58ed2f] Sep 16 05:31:08.921829 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58ed30-0x8c58ee73] Sep 16 05:31:08.921833 kernel: ACPI: Reserving TPM2 table memory at [mem 0x8c58ee78-0x8c58eeab] Sep 16 05:31:08.921838 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58eeb0-0x8c58fc3e] Sep 16 05:31:08.921843 kernel: ACPI: Reserving WSMT table memory at [mem 0x8c58fc40-0x8c58fc67] Sep 16 05:31:08.921848 kernel: ACPI: Reserving EINJ table memory at [mem 0x8c58fc68-0x8c58fd97] Sep 16 05:31:08.921853 kernel: ACPI: Reserving ERST table memory at [mem 0x8c58fd98-0x8c58ffc7] Sep 16 05:31:08.921858 kernel: ACPI: Reserving BERT table memory at [mem 0x8c58ffc8-0x8c58fff7] Sep 16 05:31:08.921863 kernel: ACPI: Reserving HEST table memory at [mem 0x8c58fff8-0x8c590273] Sep 16 05:31:08.921867 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c590278-0x8c5903d9] Sep 16 05:31:08.921872 kernel: No NUMA configuration found Sep 16 05:31:08.921878 kernel: Faking a node at [mem 0x0000000000000000-0x000000086effffff] Sep 16 05:31:08.921883 kernel: NODE_DATA(0) allocated [mem 0x86eff8dc0-0x86effffff] Sep 16 05:31:08.921888 kernel: Zone ranges: Sep 16 05:31:08.921893 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Sep 16 05:31:08.921897 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] Sep 16 
05:31:08.921902 kernel: Normal [mem 0x0000000100000000-0x000000086effffff] Sep 16 05:31:08.921907 kernel: Device empty Sep 16 05:31:08.921912 kernel: Movable zone start for each node Sep 16 05:31:08.921917 kernel: Early memory node ranges Sep 16 05:31:08.921923 kernel: node 0: [mem 0x0000000000001000-0x0000000000098fff] Sep 16 05:31:08.921928 kernel: node 0: [mem 0x0000000000100000-0x000000003fffffff] Sep 16 05:31:08.921933 kernel: node 0: [mem 0x0000000040400000-0x0000000080fcefff] Sep 16 05:31:08.921937 kernel: node 0: [mem 0x0000000080fd1000-0x000000008afc2fff] Sep 16 05:31:08.921942 kernel: node 0: [mem 0x000000008c0a8000-0x000000008c230fff] Sep 16 05:31:08.921950 kernel: node 0: [mem 0x000000008eeff000-0x000000008eefffff] Sep 16 05:31:08.921957 kernel: node 0: [mem 0x0000000100000000-0x000000086effffff] Sep 16 05:31:08.921962 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000086effffff] Sep 16 05:31:08.921967 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Sep 16 05:31:08.921973 kernel: On node 0, zone DMA: 103 pages in unavailable ranges Sep 16 05:31:08.921978 kernel: On node 0, zone DMA32: 1024 pages in unavailable ranges Sep 16 05:31:08.921983 kernel: On node 0, zone DMA32: 2 pages in unavailable ranges Sep 16 05:31:08.921989 kernel: On node 0, zone DMA32: 4325 pages in unavailable ranges Sep 16 05:31:08.921996 kernel: On node 0, zone DMA32: 11470 pages in unavailable ranges Sep 16 05:31:08.922002 kernel: On node 0, zone Normal: 4352 pages in unavailable ranges Sep 16 05:31:08.922023 kernel: On node 0, zone Normal: 4096 pages in unavailable ranges Sep 16 05:31:08.922045 kernel: ACPI: PM-Timer IO Port: 0x1808 Sep 16 05:31:08.922066 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1]) Sep 16 05:31:08.922071 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1]) Sep 16 05:31:08.922076 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1]) Sep 16 05:31:08.922101 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1]) Sep 16 05:31:08.922109 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1]) Sep 16 05:31:08.922114 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1]) Sep 16 05:31:08.922134 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1]) Sep 16 05:31:08.922140 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1]) Sep 16 05:31:08.922160 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1]) Sep 16 05:31:08.922168 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1]) Sep 16 05:31:08.922174 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1]) Sep 16 05:31:08.922179 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1]) Sep 16 05:31:08.922184 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1]) Sep 16 05:31:08.922189 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1]) Sep 16 05:31:08.922218 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1]) Sep 16 05:31:08.922239 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1]) Sep 16 05:31:08.922244 kernel: IOAPIC[0]: apic_id 2, version 32, address 0xfec00000, GSI 0-119 Sep 16 05:31:08.922250 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Sep 16 05:31:08.922255 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Sep 16 05:31:08.922285 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Sep 16 05:31:08.922305 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000 Sep 16 05:31:08.922328 kernel: TSC deadline timer available Sep 16 05:31:08.922333 kernel: CPU topo: Max. 
logical packages: 1 Sep 16 05:31:08.922362 kernel: CPU topo: Max. logical dies: 1 Sep 16 05:31:08.922367 kernel: CPU topo: Max. dies per package: 1 Sep 16 05:31:08.922372 kernel: CPU topo: Max. threads per core: 2 Sep 16 05:31:08.922380 kernel: CPU topo: Num. cores per package: 8 Sep 16 05:31:08.922385 kernel: CPU topo: Num. threads per package: 16 Sep 16 05:31:08.922409 kernel: CPU topo: Allowing 16 present CPUs plus 0 hotplug CPUs Sep 16 05:31:08.922415 kernel: [mem 0x90000000-0xdfffffff] available for PCI devices Sep 16 05:31:08.922420 kernel: Booting paravirtualized kernel on bare hardware Sep 16 05:31:08.922445 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Sep 16 05:31:08.922450 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1 Sep 16 05:31:08.922472 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144 Sep 16 05:31:08.922494 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152 Sep 16 05:31:08.922499 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15 Sep 16 05:31:08.922505 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=0b876f86a632750e9937176808a48c2452d5168964273bcfc3c72f2a26140c06 Sep 16 05:31:08.922512 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Sep 16 05:31:08.922517 kernel: random: crng init done Sep 16 05:31:08.922522 kernel: Dentry cache hash table entries: 4194304 (order: 13, 33554432 bytes, linear) Sep 16 05:31:08.922528 kernel: Inode-cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear) Sep 16 05:31:08.922533 kernel: Fallback order for Node 0: 0 Sep 16 05:31:08.922538 kernel: Built 1 zonelists, mobility grouping on. Total pages: 8363235 Sep 16 05:31:08.922544 kernel: Policy zone: Normal Sep 16 05:31:08.922549 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Sep 16 05:31:08.922555 kernel: software IO TLB: area num 16. Sep 16 05:31:08.922561 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1 Sep 16 05:31:08.922566 kernel: ftrace: allocating 40125 entries in 157 pages Sep 16 05:31:08.922571 kernel: ftrace: allocated 157 pages with 5 groups Sep 16 05:31:08.922577 kernel: Dynamic Preempt: voluntary Sep 16 05:31:08.922582 kernel: rcu: Preemptible hierarchical RCU implementation. Sep 16 05:31:08.922589 kernel: rcu: RCU event tracing is enabled. Sep 16 05:31:08.922595 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16. Sep 16 05:31:08.922600 kernel: Trampoline variant of Tasks RCU enabled. Sep 16 05:31:08.922606 kernel: Rude variant of Tasks RCU enabled. Sep 16 05:31:08.922612 kernel: Tracing variant of Tasks RCU enabled. Sep 16 05:31:08.922617 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Sep 16 05:31:08.922622 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16 Sep 16 05:31:08.922628 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Sep 16 05:31:08.922633 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. 
Sep 16 05:31:08.922638 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Sep 16 05:31:08.922644 kernel: NR_IRQS: 33024, nr_irqs: 2184, preallocated irqs: 16 Sep 16 05:31:08.922649 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Sep 16 05:31:08.922655 kernel: Console: colour VGA+ 80x25 Sep 16 05:31:08.922661 kernel: printk: legacy console [tty0] enabled Sep 16 05:31:08.922666 kernel: printk: legacy console [ttyS1] enabled Sep 16 05:31:08.922671 kernel: ACPI: Core revision 20240827 Sep 16 05:31:08.922677 kernel: hpet: HPET dysfunctional in PC10. Force disabled. Sep 16 05:31:08.922682 kernel: APIC: Switch to symmetric I/O mode setup Sep 16 05:31:08.922687 kernel: DMAR: Host address width 39 Sep 16 05:31:08.922692 kernel: DMAR: DRHD base: 0x000000fed91000 flags: 0x1 Sep 16 05:31:08.922698 kernel: DMAR: dmar0: reg_base_addr fed91000 ver 1:0 cap d2008c40660462 ecap f050da Sep 16 05:31:08.922704 kernel: DMAR: RMRR base: 0x0000008cf10000 end: 0x0000008d159fff Sep 16 05:31:08.922709 kernel: DMAR-IR: IOAPIC id 2 under DRHD base 0xfed91000 IOMMU 0 Sep 16 05:31:08.922715 kernel: DMAR-IR: HPET id 0 under DRHD base 0xfed91000 Sep 16 05:31:08.922720 kernel: DMAR-IR: Queued invalidation will be enabled to support x2apic and Intr-remapping. Sep 16 05:31:08.922725 kernel: DMAR-IR: Enabled IRQ remapping in x2apic mode Sep 16 05:31:08.922731 kernel: x2apic enabled Sep 16 05:31:08.922736 kernel: APIC: Switched APIC routing to: cluster x2apic Sep 16 05:31:08.922741 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x3101f59f5e6, max_idle_ns: 440795259996 ns Sep 16 05:31:08.922747 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 6799.81 BogoMIPS (lpj=3399906) Sep 16 05:31:08.922759 kernel: CPU0: Thermal monitoring enabled (TM1) Sep 16 05:31:08.922765 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Sep 16 05:31:08.922770 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4 Sep 16 05:31:08.922776 kernel: process: using mwait in idle threads Sep 16 05:31:08.922781 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Sep 16 05:31:08.922786 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit Sep 16 05:31:08.922791 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Sep 16 05:31:08.922797 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Sep 16 05:31:08.922802 kernel: RETBleed: Mitigation: Enhanced IBRS Sep 16 05:31:08.922807 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Sep 16 05:31:08.922812 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Sep 16 05:31:08.922819 kernel: TAA: Mitigation: TSX disabled Sep 16 05:31:08.922824 kernel: MMIO Stale Data: Mitigation: Clear CPU buffers Sep 16 05:31:08.922829 kernel: SRBDS: Mitigation: Microcode Sep 16 05:31:08.922834 kernel: GDS: Vulnerable: No microcode Sep 16 05:31:08.922839 kernel: active return thunk: its_return_thunk Sep 16 05:31:08.922845 kernel: ITS: Mitigation: Aligned branch/return thunks Sep 16 05:31:08.922850 kernel: VMSCAPE: Mitigation: IBPB before exit to userspace Sep 16 05:31:08.922855 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Sep 16 05:31:08.922860 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Sep 16 05:31:08.922866 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' 
Sep 16 05:31:08.922871 kernel: x86/fpu: Supporting XSAVE feature 0x008: 'MPX bounds registers' Sep 16 05:31:08.922877 kernel: x86/fpu: Supporting XSAVE feature 0x010: 'MPX CSR' Sep 16 05:31:08.922882 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Sep 16 05:31:08.922887 kernel: x86/fpu: xstate_offset[3]: 832, xstate_sizes[3]: 64 Sep 16 05:31:08.922892 kernel: x86/fpu: xstate_offset[4]: 896, xstate_sizes[4]: 64 Sep 16 05:31:08.922898 kernel: x86/fpu: Enabled xstate features 0x1f, context size is 960 bytes, using 'compacted' format. Sep 16 05:31:08.922903 kernel: Freeing SMP alternatives memory: 32K Sep 16 05:31:08.922908 kernel: pid_max: default: 32768 minimum: 301 Sep 16 05:31:08.922913 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Sep 16 05:31:08.922919 kernel: landlock: Up and running. Sep 16 05:31:08.922924 kernel: SELinux: Initializing. Sep 16 05:31:08.922929 kernel: Mount-cache hash table entries: 65536 (order: 7, 524288 bytes, linear) Sep 16 05:31:08.922934 kernel: Mountpoint-cache hash table entries: 65536 (order: 7, 524288 bytes, linear) Sep 16 05:31:08.922941 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) Sep 16 05:31:08.922946 kernel: Performance Events: PEBS fmt3+, Skylake events, 32-deep LBR, full-width counters, Intel PMU driver. Sep 16 05:31:08.922951 kernel: ... version: 4 Sep 16 05:31:08.922957 kernel: ... bit width: 48 Sep 16 05:31:08.922962 kernel: ... generic registers: 4 Sep 16 05:31:08.922967 kernel: ... value mask: 0000ffffffffffff Sep 16 05:31:08.922973 kernel: ... max period: 00007fffffffffff Sep 16 05:31:08.922978 kernel: ... fixed-purpose events: 3 Sep 16 05:31:08.922983 kernel: ... event mask: 000000070000000f Sep 16 05:31:08.922989 kernel: signal: max sigframe size: 2032 Sep 16 05:31:08.922995 kernel: Estimated ratio of average max frequency by base frequency (times 1024): 1445 Sep 16 05:31:08.923000 kernel: rcu: Hierarchical SRCU implementation. Sep 16 05:31:08.923006 kernel: rcu: Max phase no-delay instances is 400. Sep 16 05:31:08.923011 kernel: Timer migration: 2 hierarchy levels; 8 children per group; 2 crossnode level Sep 16 05:31:08.923017 kernel: NMI watchdog: Enabled. Permanently consumes one hw-PMU counter. Sep 16 05:31:08.923022 kernel: smp: Bringing up secondary CPUs ... Sep 16 05:31:08.923027 kernel: smpboot: x86: Booting SMP configuration: Sep 16 05:31:08.923033 kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7 #8 #9 #10 #11 #12 #13 #14 #15 Sep 16 05:31:08.923039 kernel: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details. 
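The x86/fpu lines above describe the compacted XSAVE layout, and the reported numbers are internally consistent: each xstate component starts where the previous one ends, the first offset of 576 bytes matching the legacy FXSAVE area plus the 64-byte XSAVE header, and the final offset plus size giving the reported 960-byte context. A small illustrative Python check, with the values copied from the log:

# Cross-check of the compacted xstate layout reported above (values from the log).
# offset[2] = 576 corresponds to the legacy FXSAVE area plus XSAVE header (512 + 64).
features = [
    (576, 256),  # xstate 2: AVX registers
    (832, 64),   # xstate 3: MPX bounds registers
    (896, 64),   # xstate 4: MPX CSR
]
offset = 576
for reported_offset, size in features:
    assert offset == reported_offset  # each component starts where the previous ended
    offset += size
assert offset == 960                  # matches "context size is 960 bytes"
print("compacted XSAVE layout is self-consistent:", offset, "bytes")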
Sep 16 05:31:08.923045 kernel: smp: Brought up 1 node, 16 CPUs Sep 16 05:31:08.923050 kernel: smpboot: Total of 16 processors activated (108796.99 BogoMIPS) Sep 16 05:31:08.923056 kernel: Memory: 32695112K/33452940K available (14336K kernel code, 2432K rwdata, 9992K rodata, 54096K init, 2868K bss, 732528K reserved, 0K cma-reserved) Sep 16 05:31:08.923061 kernel: devtmpfs: initialized Sep 16 05:31:08.923067 kernel: x86/mm: Memory block size: 128MB Sep 16 05:31:08.923072 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x80fcf000-0x80fcffff] (4096 bytes) Sep 16 05:31:08.923077 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x8c231000-0x8c662fff] (4399104 bytes) Sep 16 05:31:08.923083 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Sep 16 05:31:08.923089 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear) Sep 16 05:31:08.923094 kernel: pinctrl core: initialized pinctrl subsystem Sep 16 05:31:08.923100 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Sep 16 05:31:08.923105 kernel: audit: initializing netlink subsys (disabled) Sep 16 05:31:08.923110 kernel: audit: type=2000 audit(1758000660.041:1): state=initialized audit_enabled=0 res=1 Sep 16 05:31:08.923115 kernel: thermal_sys: Registered thermal governor 'step_wise' Sep 16 05:31:08.923121 kernel: thermal_sys: Registered thermal governor 'user_space' Sep 16 05:31:08.923126 kernel: cpuidle: using governor menu Sep 16 05:31:08.923131 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Sep 16 05:31:08.923137 kernel: dca service started, version 1.12.1 Sep 16 05:31:08.923143 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff] Sep 16 05:31:08.923148 kernel: PCI: Using configuration type 1 for base access Sep 16 05:31:08.923153 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
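The BogoMIPS figures are likewise consistent with the detected TSC: the timer-calibrated per-CPU value of 6799.81 is twice the 3399.906 MHz TSC frequency, and sixteen CPUs account for the 108796.99 total printed at SMP bring-up. A tiny illustrative check of that arithmetic:

# Illustrative consistency check of the BogoMIPS numbers in the log above.
tsc_mhz = 3399.906        # "tsc: Detected 3399.906 MHz TSC"
per_cpu = 2 * tsc_mhz     # printed as "6799.81 BogoMIPS (lpj=3399906)"
total = 16 * per_cpu      # "Total of 16 processors activated"
print(f"per-CPU: {per_cpu:.2f}  total: {total:.2f}")  # -> per-CPU: 6799.81  total: 108796.99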
Sep 16 05:31:08.923159 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Sep 16 05:31:08.923164 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Sep 16 05:31:08.923169 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Sep 16 05:31:08.923175 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Sep 16 05:31:08.923181 kernel: ACPI: Added _OSI(Module Device) Sep 16 05:31:08.923186 kernel: ACPI: Added _OSI(Processor Device) Sep 16 05:31:08.923192 kernel: ACPI: Added _OSI(Processor Aggregator Device) Sep 16 05:31:08.923197 kernel: ACPI: 12 ACPI AML tables successfully acquired and loaded Sep 16 05:31:08.923202 kernel: ACPI: Dynamic OEM Table Load: Sep 16 05:31:08.923208 kernel: ACPI: SSDT 0xFFFF8A12C2101C00 000400 (v02 PmRef Cpu0Cst 00003001 INTL 20160527) Sep 16 05:31:08.923213 kernel: ACPI: Dynamic OEM Table Load: Sep 16 05:31:08.923218 kernel: ACPI: SSDT 0xFFFF8A12C20F8000 000683 (v02 PmRef Cpu0Ist 00003000 INTL 20160527) Sep 16 05:31:08.923224 kernel: ACPI: Dynamic OEM Table Load: Sep 16 05:31:08.923229 kernel: ACPI: SSDT 0xFFFF8A12C1698B00 0000F4 (v02 PmRef Cpu0Psd 00003000 INTL 20160527) Sep 16 05:31:08.923235 kernel: ACPI: Dynamic OEM Table Load: Sep 16 05:31:08.923240 kernel: ACPI: SSDT 0xFFFF8A12C0F9D000 0005FC (v02 PmRef ApIst 00003000 INTL 20160527) Sep 16 05:31:08.923245 kernel: ACPI: Dynamic OEM Table Load: Sep 16 05:31:08.923251 kernel: ACPI: SSDT 0xFFFF8A12C0FA1000 000AB0 (v02 PmRef ApPsd 00003000 INTL 20160527) Sep 16 05:31:08.923256 kernel: ACPI: Dynamic OEM Table Load: Sep 16 05:31:08.923261 kernel: ACPI: SSDT 0xFFFF8A12C1802800 00030A (v02 PmRef ApCst 00003000 INTL 20160527) Sep 16 05:31:08.923267 kernel: ACPI: Interpreter enabled Sep 16 05:31:08.923272 kernel: ACPI: PM: (supports S0 S5) Sep 16 05:31:08.923277 kernel: ACPI: Using IOAPIC for interrupt routing Sep 16 05:31:08.923283 kernel: HEST: Enabling Firmware First mode for corrected errors. Sep 16 05:31:08.923289 kernel: mce: [Firmware Bug]: Ignoring request to disable invalid MCA bank 14. Sep 16 05:31:08.923294 kernel: HEST: Table parsing has been initialized. Sep 16 05:31:08.923299 kernel: GHES: APEI firmware first mode is enabled by APEI bit and WHEA _OSC. 
Sep 16 05:31:08.923305 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Sep 16 05:31:08.923310 kernel: PCI: Using E820 reservations for host bridge windows Sep 16 05:31:08.923315 kernel: ACPI: Enabled 9 GPEs in block 00 to 7F Sep 16 05:31:08.923321 kernel: ACPI: \_SB_.PCI0.XDCI.USBC: New power resource Sep 16 05:31:08.923326 kernel: ACPI: \_SB_.PCI0.SAT0.VOL0.V0PR: New power resource Sep 16 05:31:08.923333 kernel: ACPI: \_SB_.PCI0.SAT0.VOL1.V1PR: New power resource Sep 16 05:31:08.923338 kernel: ACPI: \_SB_.PCI0.SAT0.VOL2.V2PR: New power resource Sep 16 05:31:08.923343 kernel: ACPI: \_SB_.PCI0.CNVW.WRST: New power resource Sep 16 05:31:08.923349 kernel: ACPI: \_TZ_.FN00: New power resource Sep 16 05:31:08.923354 kernel: ACPI: \_TZ_.FN01: New power resource Sep 16 05:31:08.923359 kernel: ACPI: \_TZ_.FN02: New power resource Sep 16 05:31:08.923364 kernel: ACPI: \_TZ_.FN03: New power resource Sep 16 05:31:08.923370 kernel: ACPI: \_TZ_.FN04: New power resource Sep 16 05:31:08.923375 kernel: ACPI: \PIN_: New power resource Sep 16 05:31:08.923381 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-fe]) Sep 16 05:31:08.923461 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Sep 16 05:31:08.923513 kernel: acpi PNP0A08:00: _OSC: platform does not support [AER] Sep 16 05:31:08.923561 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability LTR] Sep 16 05:31:08.923569 kernel: PCI host bridge to bus 0000:00 Sep 16 05:31:08.923619 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Sep 16 05:31:08.923664 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Sep 16 05:31:08.923710 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Sep 16 05:31:08.923772 kernel: pci_bus 0000:00: root bus resource [mem 0x90000000-0xdfffffff window] Sep 16 05:31:08.923818 kernel: pci_bus 0000:00: root bus resource [mem 0xfc800000-0xfe7fffff window] Sep 16 05:31:08.923862 kernel: pci_bus 0000:00: root bus resource [bus 00-fe] Sep 16 05:31:08.923922 kernel: pci 0000:00:00.0: [8086:3e31] type 00 class 0x060000 conventional PCI endpoint Sep 16 05:31:08.923983 kernel: pci 0000:00:01.0: [8086:1901] type 01 class 0x060400 PCIe Root Port Sep 16 05:31:08.924038 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 16 05:31:08.924087 kernel: pci 0000:00:01.0: bridge window [mem 0x95100000-0x952fffff] Sep 16 05:31:08.924136 kernel: pci 0000:00:01.0: bridge window [mem 0x90000000-0x93ffffff 64bit pref] Sep 16 05:31:08.924185 kernel: pci 0000:00:01.0: PME# supported from D0 D3hot D3cold Sep 16 05:31:08.924239 kernel: pci 0000:00:08.0: [8086:1911] type 00 class 0x088000 conventional PCI endpoint Sep 16 05:31:08.924288 kernel: pci 0000:00:08.0: BAR 0 [mem 0x95520000-0x95520fff 64bit] Sep 16 05:31:08.924342 kernel: pci 0000:00:12.0: [8086:a379] type 00 class 0x118000 conventional PCI endpoint Sep 16 05:31:08.924394 kernel: pci 0000:00:12.0: BAR 0 [mem 0x9551f000-0x9551ffff 64bit] Sep 16 05:31:08.924446 kernel: pci 0000:00:14.0: [8086:a36d] type 00 class 0x0c0330 conventional PCI endpoint Sep 16 05:31:08.924495 kernel: pci 0000:00:14.0: BAR 0 [mem 0x95500000-0x9550ffff 64bit] Sep 16 05:31:08.924544 kernel: pci 0000:00:14.0: PME# supported from D3hot D3cold Sep 16 05:31:08.924597 kernel: pci 0000:00:14.2: [8086:a36f] type 00 class 0x050000 conventional PCI endpoint Sep 16 05:31:08.924649 kernel: pci 0000:00:14.2: BAR 0 [mem 0x95512000-0x95513fff 64bit] Sep 16 
05:31:08.924705 kernel: pci 0000:00:14.2: BAR 2 [mem 0x9551e000-0x9551efff 64bit] Sep 16 05:31:08.924764 kernel: pci 0000:00:14.5: [8086:a375] type 00 class 0x080501 conventional PCI endpoint Sep 16 05:31:08.924814 kernel: pci 0000:00:14.5: BAR 0 [mem 0x9551d000-0x9551dfff 64bit] Sep 16 05:31:08.924869 kernel: pci 0000:00:15.0: [8086:a368] type 00 class 0x0c8000 conventional PCI endpoint Sep 16 05:31:08.924919 kernel: pci 0000:00:15.0: BAR 0 [mem 0x00000000-0x00000fff 64bit] Sep 16 05:31:08.924971 kernel: pci 0000:00:15.1: [8086:a369] type 00 class 0x0c8000 conventional PCI endpoint Sep 16 05:31:08.925023 kernel: pci 0000:00:15.1: BAR 0 [mem 0x00000000-0x00000fff 64bit] Sep 16 05:31:08.925075 kernel: pci 0000:00:16.0: [8086:a360] type 00 class 0x078000 conventional PCI endpoint Sep 16 05:31:08.925124 kernel: pci 0000:00:16.0: BAR 0 [mem 0x9551a000-0x9551afff 64bit] Sep 16 05:31:08.925172 kernel: pci 0000:00:16.0: PME# supported from D3hot Sep 16 05:31:08.925224 kernel: pci 0000:00:16.1: [8086:a361] type 00 class 0x078000 conventional PCI endpoint Sep 16 05:31:08.925273 kernel: pci 0000:00:16.1: BAR 0 [mem 0x95519000-0x95519fff 64bit] Sep 16 05:31:08.925325 kernel: pci 0000:00:16.1: PME# supported from D3hot Sep 16 05:31:08.925377 kernel: pci 0000:00:16.4: [8086:a364] type 00 class 0x078000 conventional PCI endpoint Sep 16 05:31:08.925426 kernel: pci 0000:00:16.4: BAR 0 [mem 0x95518000-0x95518fff 64bit] Sep 16 05:31:08.925474 kernel: pci 0000:00:16.4: PME# supported from D3hot Sep 16 05:31:08.925528 kernel: pci 0000:00:17.0: [8086:a352] type 00 class 0x010601 conventional PCI endpoint Sep 16 05:31:08.925579 kernel: pci 0000:00:17.0: BAR 0 [mem 0x95510000-0x95511fff] Sep 16 05:31:08.925627 kernel: pci 0000:00:17.0: BAR 1 [mem 0x95517000-0x955170ff] Sep 16 05:31:08.925676 kernel: pci 0000:00:17.0: BAR 2 [io 0x6050-0x6057] Sep 16 05:31:08.925724 kernel: pci 0000:00:17.0: BAR 3 [io 0x6040-0x6043] Sep 16 05:31:08.925778 kernel: pci 0000:00:17.0: BAR 4 [io 0x6020-0x603f] Sep 16 05:31:08.925827 kernel: pci 0000:00:17.0: BAR 5 [mem 0x95516000-0x955167ff] Sep 16 05:31:08.925875 kernel: pci 0000:00:17.0: PME# supported from D3hot Sep 16 05:31:08.925932 kernel: pci 0000:00:1b.0: [8086:a340] type 01 class 0x060400 PCIe Root Port Sep 16 05:31:08.925982 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02] Sep 16 05:31:08.926033 kernel: pci 0000:00:1b.0: PME# supported from D0 D3hot D3cold Sep 16 05:31:08.926089 kernel: pci 0000:00:1b.4: [8086:a32c] type 01 class 0x060400 PCIe Root Port Sep 16 05:31:08.926139 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03] Sep 16 05:31:08.926187 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff] Sep 16 05:31:08.926238 kernel: pci 0000:00:1b.4: bridge window [mem 0x95400000-0x954fffff] Sep 16 05:31:08.926287 kernel: pci 0000:00:1b.4: PME# supported from D0 D3hot D3cold Sep 16 05:31:08.926340 kernel: pci 0000:00:1b.5: [8086:a32d] type 01 class 0x060400 PCIe Root Port Sep 16 05:31:08.926390 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04] Sep 16 05:31:08.926439 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff] Sep 16 05:31:08.926487 kernel: pci 0000:00:1b.5: bridge window [mem 0x95300000-0x953fffff] Sep 16 05:31:08.926536 kernel: pci 0000:00:1b.5: PME# supported from D0 D3hot D3cold Sep 16 05:31:08.926591 kernel: pci 0000:00:1c.0: [8086:a338] type 01 class 0x060400 PCIe Root Port Sep 16 05:31:08.926641 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05] Sep 16 05:31:08.926690 kernel: pci 0000:00:1c.0: PME# supported from D0 D3hot D3cold Sep 16 05:31:08.926744 
kernel: pci 0000:00:1c.3: [8086:a33b] type 01 class 0x060400 PCIe Root Port Sep 16 05:31:08.926799 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07] Sep 16 05:31:08.926849 kernel: pci 0000:00:1c.3: bridge window [io 0x3000-0x3fff] Sep 16 05:31:08.926898 kernel: pci 0000:00:1c.3: bridge window [mem 0x94000000-0x950fffff] Sep 16 05:31:08.926949 kernel: pci 0000:00:1c.3: PME# supported from D0 D3hot D3cold Sep 16 05:31:08.927003 kernel: pci 0000:00:1e.0: [8086:a328] type 00 class 0x078000 conventional PCI endpoint Sep 16 05:31:08.927053 kernel: pci 0000:00:1e.0: BAR 0 [mem 0x00000000-0x00000fff 64bit] Sep 16 05:31:08.927107 kernel: pci 0000:00:1f.0: [8086:a309] type 00 class 0x060100 conventional PCI endpoint Sep 16 05:31:08.927164 kernel: pci 0000:00:1f.4: [8086:a323] type 00 class 0x0c0500 conventional PCI endpoint Sep 16 05:31:08.927213 kernel: pci 0000:00:1f.4: BAR 0 [mem 0x95514000-0x955140ff 64bit] Sep 16 05:31:08.927264 kernel: pci 0000:00:1f.4: BAR 4 [io 0xefa0-0xefbf] Sep 16 05:31:08.927317 kernel: pci 0000:00:1f.5: [8086:a324] type 00 class 0x0c8000 conventional PCI endpoint Sep 16 05:31:08.927366 kernel: pci 0000:00:1f.5: BAR 0 [mem 0xfe010000-0xfe010fff] Sep 16 05:31:08.927420 kernel: pci 0000:01:00.0: [15b3:1015] type 00 class 0x020000 PCIe Endpoint Sep 16 05:31:08.927472 kernel: pci 0000:01:00.0: BAR 0 [mem 0x92000000-0x93ffffff 64bit pref] Sep 16 05:31:08.927522 kernel: pci 0000:01:00.0: ROM [mem 0x95200000-0x952fffff pref] Sep 16 05:31:08.927572 kernel: pci 0000:01:00.0: PME# supported from D3cold Sep 16 05:31:08.927624 kernel: pci 0000:01:00.0: VF BAR 0 [mem 0x00000000-0x000fffff 64bit pref] Sep 16 05:31:08.927676 kernel: pci 0000:01:00.0: VF BAR 0 [mem 0x00000000-0x007fffff 64bit pref]: contains BAR 0 for 8 VFs Sep 16 05:31:08.927730 kernel: pci 0000:01:00.1: [15b3:1015] type 00 class 0x020000 PCIe Endpoint Sep 16 05:31:08.927827 kernel: pci 0000:01:00.1: BAR 0 [mem 0x90000000-0x91ffffff 64bit pref] Sep 16 05:31:08.927878 kernel: pci 0000:01:00.1: ROM [mem 0x95100000-0x951fffff pref] Sep 16 05:31:08.927929 kernel: pci 0000:01:00.1: PME# supported from D3cold Sep 16 05:31:08.927979 kernel: pci 0000:01:00.1: VF BAR 0 [mem 0x00000000-0x000fffff 64bit pref] Sep 16 05:31:08.928031 kernel: pci 0000:01:00.1: VF BAR 0 [mem 0x00000000-0x007fffff 64bit pref]: contains BAR 0 for 8 VFs Sep 16 05:31:08.928082 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 16 05:31:08.928132 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02] Sep 16 05:31:08.928186 kernel: pci 0000:03:00.0: working around ROM BAR overlap defect Sep 16 05:31:08.928238 kernel: pci 0000:03:00.0: [8086:1533] type 00 class 0x020000 PCIe Endpoint Sep 16 05:31:08.928289 kernel: pci 0000:03:00.0: BAR 0 [mem 0x95400000-0x9547ffff] Sep 16 05:31:08.928339 kernel: pci 0000:03:00.0: BAR 2 [io 0x5000-0x501f] Sep 16 05:31:08.928391 kernel: pci 0000:03:00.0: BAR 3 [mem 0x95480000-0x95483fff] Sep 16 05:31:08.928441 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Sep 16 05:31:08.928491 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03] Sep 16 05:31:08.928545 kernel: pci 0000:04:00.0: working around ROM BAR overlap defect Sep 16 05:31:08.928596 kernel: pci 0000:04:00.0: [8086:1533] type 00 class 0x020000 PCIe Endpoint Sep 16 05:31:08.928646 kernel: pci 0000:04:00.0: BAR 0 [mem 0x95300000-0x9537ffff] Sep 16 05:31:08.928697 kernel: pci 0000:04:00.0: BAR 2 [io 0x4000-0x401f] Sep 16 05:31:08.928752 kernel: pci 0000:04:00.0: BAR 3 [mem 0x95380000-0x95383fff] Sep 16 05:31:08.928805 kernel: pci 0000:04:00.0: PME# supported from D0 
D3hot D3cold Sep 16 05:31:08.928856 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04] Sep 16 05:31:08.928906 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05] Sep 16 05:31:08.928960 kernel: pci 0000:06:00.0: [1a03:1150] type 01 class 0x060400 PCIe to PCI/PCI-X bridge Sep 16 05:31:08.929010 kernel: pci 0000:06:00.0: PCI bridge to [bus 07] Sep 16 05:31:08.929061 kernel: pci 0000:06:00.0: bridge window [io 0x3000-0x3fff] Sep 16 05:31:08.929114 kernel: pci 0000:06:00.0: bridge window [mem 0x94000000-0x950fffff] Sep 16 05:31:08.929163 kernel: pci 0000:06:00.0: enabling Extended Tags Sep 16 05:31:08.929213 kernel: pci 0000:06:00.0: supports D1 D2 Sep 16 05:31:08.929264 kernel: pci 0000:06:00.0: PME# supported from D0 D1 D2 D3hot D3cold Sep 16 05:31:08.929313 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07] Sep 16 05:31:08.929371 kernel: pci_bus 0000:07: extended config space not accessible Sep 16 05:31:08.929429 kernel: pci 0000:07:00.0: [1a03:2000] type 00 class 0x030000 conventional PCI endpoint Sep 16 05:31:08.929485 kernel: pci 0000:07:00.0: BAR 0 [mem 0x94000000-0x94ffffff] Sep 16 05:31:08.929536 kernel: pci 0000:07:00.0: BAR 1 [mem 0x95000000-0x9501ffff] Sep 16 05:31:08.929588 kernel: pci 0000:07:00.0: BAR 2 [io 0x3000-0x307f] Sep 16 05:31:08.929640 kernel: pci 0000:07:00.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Sep 16 05:31:08.929691 kernel: pci 0000:07:00.0: supports D1 D2 Sep 16 05:31:08.929743 kernel: pci 0000:07:00.0: PME# supported from D0 D1 D2 D3hot D3cold Sep 16 05:31:08.929797 kernel: pci 0000:06:00.0: PCI bridge to [bus 07] Sep 16 05:31:08.929805 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 0 Sep 16 05:31:08.929813 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 1 Sep 16 05:31:08.929819 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 0 Sep 16 05:31:08.929824 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 0 Sep 16 05:31:08.929830 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 0 Sep 16 05:31:08.929835 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 0 Sep 16 05:31:08.929841 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 0 Sep 16 05:31:08.929847 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 0 Sep 16 05:31:08.929852 kernel: iommu: Default domain type: Translated Sep 16 05:31:08.929858 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Sep 16 05:31:08.929864 kernel: PCI: Using ACPI for IRQ routing Sep 16 05:31:08.929870 kernel: PCI: pci_cache_line_size set to 64 bytes Sep 16 05:31:08.929875 kernel: e820: reserve RAM buffer [mem 0x00099800-0x0009ffff] Sep 16 05:31:08.929881 kernel: e820: reserve RAM buffer [mem 0x80fcf000-0x83ffffff] Sep 16 05:31:08.929886 kernel: e820: reserve RAM buffer [mem 0x8afc3000-0x8bffffff] Sep 16 05:31:08.929892 kernel: e820: reserve RAM buffer [mem 0x8c231000-0x8fffffff] Sep 16 05:31:08.929897 kernel: e820: reserve RAM buffer [mem 0x8ef00000-0x8fffffff] Sep 16 05:31:08.929903 kernel: e820: reserve RAM buffer [mem 0x86f000000-0x86fffffff] Sep 16 05:31:08.929953 kernel: pci 0000:07:00.0: vgaarb: setting as boot VGA device Sep 16 05:31:08.930007 kernel: pci 0000:07:00.0: vgaarb: bridge control possible Sep 16 05:31:08.930058 kernel: pci 0000:07:00.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Sep 16 05:31:08.930066 kernel: vgaarb: loaded Sep 16 05:31:08.930072 kernel: clocksource: Switched to clocksource tsc-early Sep 16 05:31:08.930078 kernel: VFS: Disk quotas dquot_6.6.0 Sep 16 05:31:08.930084 kernel: VFS: Dquot-cache 
hash table entries: 512 (order 0, 4096 bytes) Sep 16 05:31:08.930089 kernel: pnp: PnP ACPI init Sep 16 05:31:08.930140 kernel: system 00:00: [mem 0x40000000-0x403fffff] has been reserved Sep 16 05:31:08.930192 kernel: pnp 00:02: [dma 0 disabled] Sep 16 05:31:08.930242 kernel: pnp 00:03: [dma 0 disabled] Sep 16 05:31:08.930294 kernel: system 00:04: [io 0x0680-0x069f] has been reserved Sep 16 05:31:08.930340 kernel: system 00:04: [io 0x164e-0x164f] has been reserved Sep 16 05:31:08.930389 kernel: system 00:05: [mem 0xfed10000-0xfed17fff] has been reserved Sep 16 05:31:08.930435 kernel: system 00:05: [mem 0xfed18000-0xfed18fff] has been reserved Sep 16 05:31:08.930482 kernel: system 00:05: [mem 0xfed19000-0xfed19fff] has been reserved Sep 16 05:31:08.930526 kernel: system 00:05: [mem 0xe0000000-0xefffffff] has been reserved Sep 16 05:31:08.930571 kernel: system 00:05: [mem 0xfed20000-0xfed3ffff] has been reserved Sep 16 05:31:08.930615 kernel: system 00:05: [mem 0xfed90000-0xfed93fff] could not be reserved Sep 16 05:31:08.930659 kernel: system 00:05: [mem 0xfed45000-0xfed8ffff] has been reserved Sep 16 05:31:08.930706 kernel: system 00:05: [mem 0xfee00000-0xfeefffff] could not be reserved Sep 16 05:31:08.930825 kernel: system 00:06: [io 0x1800-0x18fe] could not be reserved Sep 16 05:31:08.930872 kernel: system 00:06: [mem 0xfd000000-0xfd69ffff] has been reserved Sep 16 05:31:08.930917 kernel: system 00:06: [mem 0xfd6c0000-0xfd6cffff] has been reserved Sep 16 05:31:08.930961 kernel: system 00:06: [mem 0xfd6f0000-0xfdffffff] has been reserved Sep 16 05:31:08.931005 kernel: system 00:06: [mem 0xfe000000-0xfe01ffff] could not be reserved Sep 16 05:31:08.931049 kernel: system 00:06: [mem 0xfe200000-0xfe7fffff] has been reserved Sep 16 05:31:08.931108 kernel: system 00:06: [mem 0xff000000-0xffffffff] has been reserved Sep 16 05:31:08.931204 kernel: system 00:07: [io 0x2000-0x20fe] has been reserved Sep 16 05:31:08.931215 kernel: pnp: PnP ACPI: found 9 devices Sep 16 05:31:08.931221 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Sep 16 05:31:08.931227 kernel: NET: Registered PF_INET protocol family Sep 16 05:31:08.931233 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 16 05:31:08.931239 kernel: tcp_listen_portaddr_hash hash table entries: 16384 (order: 6, 262144 bytes, linear) Sep 16 05:31:08.931244 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 16 05:31:08.931250 kernel: TCP established hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 16 05:31:08.931256 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Sep 16 05:31:08.931262 kernel: TCP: Hash tables configured (established 262144 bind 65536) Sep 16 05:31:08.931268 kernel: UDP hash table entries: 16384 (order: 7, 524288 bytes, linear) Sep 16 05:31:08.931274 kernel: UDP-Lite hash table entries: 16384 (order: 7, 524288 bytes, linear) Sep 16 05:31:08.931279 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 16 05:31:08.931285 kernel: NET: Registered PF_XDP protocol family Sep 16 05:31:08.931335 kernel: pci 0000:00:15.0: BAR 0 [mem 0x95515000-0x95515fff 64bit]: assigned Sep 16 05:31:08.931385 kernel: pci 0000:00:15.1: BAR 0 [mem 0x9551b000-0x9551bfff 64bit]: assigned Sep 16 05:31:08.931435 kernel: pci 0000:00:1e.0: BAR 0 [mem 0x9551c000-0x9551cfff 64bit]: assigned Sep 16 05:31:08.931488 kernel: pci 0000:01:00.0: VF BAR 0 [mem size 0x00800000 64bit pref]: can't assign; no space Sep 16 
05:31:08.931540 kernel: pci 0000:01:00.0: VF BAR 0 [mem size 0x00800000 64bit pref]: failed to assign Sep 16 05:31:08.931591 kernel: pci 0000:01:00.1: VF BAR 0 [mem size 0x00800000 64bit pref]: can't assign; no space Sep 16 05:31:08.931641 kernel: pci 0000:01:00.1: VF BAR 0 [mem size 0x00800000 64bit pref]: failed to assign Sep 16 05:31:08.931764 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 16 05:31:08.931903 kernel: pci 0000:00:01.0: bridge window [mem 0x95100000-0x952fffff] Sep 16 05:31:08.931956 kernel: pci 0000:00:01.0: bridge window [mem 0x90000000-0x93ffffff 64bit pref] Sep 16 05:31:08.932006 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02] Sep 16 05:31:08.932056 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03] Sep 16 05:31:08.932105 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff] Sep 16 05:31:08.932154 kernel: pci 0000:00:1b.4: bridge window [mem 0x95400000-0x954fffff] Sep 16 05:31:08.932203 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04] Sep 16 05:31:08.932252 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff] Sep 16 05:31:08.932301 kernel: pci 0000:00:1b.5: bridge window [mem 0x95300000-0x953fffff] Sep 16 05:31:08.932350 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05] Sep 16 05:31:08.932404 kernel: pci 0000:06:00.0: PCI bridge to [bus 07] Sep 16 05:31:08.932454 kernel: pci 0000:06:00.0: bridge window [io 0x3000-0x3fff] Sep 16 05:31:08.932504 kernel: pci 0000:06:00.0: bridge window [mem 0x94000000-0x950fffff] Sep 16 05:31:08.932553 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07] Sep 16 05:31:08.932602 kernel: pci 0000:00:1c.3: bridge window [io 0x3000-0x3fff] Sep 16 05:31:08.932651 kernel: pci 0000:00:1c.3: bridge window [mem 0x94000000-0x950fffff] Sep 16 05:31:08.932696 kernel: pci_bus 0000:00: Some PCI device resources are unassigned, try booting with pci=realloc Sep 16 05:31:08.932741 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Sep 16 05:31:08.932790 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Sep 16 05:31:08.932834 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Sep 16 05:31:08.932879 kernel: pci_bus 0000:00: resource 7 [mem 0x90000000-0xdfffffff window] Sep 16 05:31:08.932923 kernel: pci_bus 0000:00: resource 8 [mem 0xfc800000-0xfe7fffff window] Sep 16 05:31:08.932973 kernel: pci_bus 0000:01: resource 1 [mem 0x95100000-0x952fffff] Sep 16 05:31:08.933018 kernel: pci_bus 0000:01: resource 2 [mem 0x90000000-0x93ffffff 64bit pref] Sep 16 05:31:08.933067 kernel: pci_bus 0000:03: resource 0 [io 0x5000-0x5fff] Sep 16 05:31:08.933112 kernel: pci_bus 0000:03: resource 1 [mem 0x95400000-0x954fffff] Sep 16 05:31:08.933167 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff] Sep 16 05:31:08.933213 kernel: pci_bus 0000:04: resource 1 [mem 0x95300000-0x953fffff] Sep 16 05:31:08.933262 kernel: pci_bus 0000:06: resource 0 [io 0x3000-0x3fff] Sep 16 05:31:08.933307 kernel: pci_bus 0000:06: resource 1 [mem 0x94000000-0x950fffff] Sep 16 05:31:08.933355 kernel: pci_bus 0000:07: resource 0 [io 0x3000-0x3fff] Sep 16 05:31:08.933402 kernel: pci_bus 0000:07: resource 1 [mem 0x94000000-0x950fffff] Sep 16 05:31:08.933410 kernel: PCI: CLS 64 bytes, default 64 Sep 16 05:31:08.933418 kernel: DMAR: No ATSR found Sep 16 05:31:08.933424 kernel: DMAR: No SATC found Sep 16 05:31:08.933429 kernel: DMAR: dmar0: Using Queued invalidation Sep 16 05:31:08.933478 kernel: pci 0000:00:00.0: Adding to iommu group 0 Sep 16 05:31:08.933528 kernel: pci 0000:00:01.0: Adding to iommu group 1 Sep 16 05:31:08.933578 kernel: pci 
0000:00:08.0: Adding to iommu group 2 Sep 16 05:31:08.933627 kernel: pci 0000:00:12.0: Adding to iommu group 3 Sep 16 05:31:08.933676 kernel: pci 0000:00:14.0: Adding to iommu group 4 Sep 16 05:31:08.933726 kernel: pci 0000:00:14.2: Adding to iommu group 4 Sep 16 05:31:08.933800 kernel: pci 0000:00:14.5: Adding to iommu group 4 Sep 16 05:31:08.933850 kernel: pci 0000:00:15.0: Adding to iommu group 5 Sep 16 05:31:08.933899 kernel: pci 0000:00:15.1: Adding to iommu group 5 Sep 16 05:31:08.933948 kernel: pci 0000:00:16.0: Adding to iommu group 6 Sep 16 05:31:08.933996 kernel: pci 0000:00:16.1: Adding to iommu group 6 Sep 16 05:31:08.934045 kernel: pci 0000:00:16.4: Adding to iommu group 6 Sep 16 05:31:08.934094 kernel: pci 0000:00:17.0: Adding to iommu group 7 Sep 16 05:31:08.934143 kernel: pci 0000:00:1b.0: Adding to iommu group 8 Sep 16 05:31:08.934195 kernel: pci 0000:00:1b.4: Adding to iommu group 9 Sep 16 05:31:08.934244 kernel: pci 0000:00:1b.5: Adding to iommu group 10 Sep 16 05:31:08.934294 kernel: pci 0000:00:1c.0: Adding to iommu group 11 Sep 16 05:31:08.934343 kernel: pci 0000:00:1c.3: Adding to iommu group 12 Sep 16 05:31:08.934391 kernel: pci 0000:00:1e.0: Adding to iommu group 13 Sep 16 05:31:08.934440 kernel: pci 0000:00:1f.0: Adding to iommu group 14 Sep 16 05:31:08.934489 kernel: pci 0000:00:1f.4: Adding to iommu group 14 Sep 16 05:31:08.934540 kernel: pci 0000:00:1f.5: Adding to iommu group 14 Sep 16 05:31:08.934590 kernel: pci 0000:01:00.0: Adding to iommu group 1 Sep 16 05:31:08.934640 kernel: pci 0000:01:00.1: Adding to iommu group 1 Sep 16 05:31:08.934691 kernel: pci 0000:03:00.0: Adding to iommu group 15 Sep 16 05:31:08.934741 kernel: pci 0000:04:00.0: Adding to iommu group 16 Sep 16 05:31:08.934796 kernel: pci 0000:06:00.0: Adding to iommu group 17 Sep 16 05:31:08.934849 kernel: pci 0000:07:00.0: Adding to iommu group 17 Sep 16 05:31:08.934858 kernel: DMAR: Intel(R) Virtualization Technology for Directed I/O Sep 16 05:31:08.934866 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Sep 16 05:31:08.934872 kernel: software IO TLB: mapped [mem 0x0000000086fc3000-0x000000008afc3000] (64MB) Sep 16 05:31:08.934878 kernel: RAPL PMU: API unit is 2^-32 Joules, 3 fixed counters, 655360 ms ovfl timer Sep 16 05:31:08.934884 kernel: RAPL PMU: hw unit of domain pp0-core 2^-14 Joules Sep 16 05:31:08.934889 kernel: RAPL PMU: hw unit of domain package 2^-14 Joules Sep 16 05:31:08.934895 kernel: RAPL PMU: hw unit of domain dram 2^-14 Joules Sep 16 05:31:08.934947 kernel: platform rtc_cmos: registered platform RTC device (no PNP device found) Sep 16 05:31:08.934956 kernel: Initialise system trusted keyrings Sep 16 05:31:08.934964 kernel: workingset: timestamp_bits=39 max_order=23 bucket_order=0 Sep 16 05:31:08.934969 kernel: Key type asymmetric registered Sep 16 05:31:08.934975 kernel: Asymmetric key parser 'x509' registered Sep 16 05:31:08.934980 kernel: tsc: Refined TSC clocksource calibration: 3407.999 MHz Sep 16 05:31:08.934986 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd336761, max_idle_ns: 440795243819 ns Sep 16 05:31:08.934992 kernel: clocksource: Switched to clocksource tsc Sep 16 05:31:08.934998 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Sep 16 05:31:08.935003 kernel: io scheduler mq-deadline registered Sep 16 05:31:08.935009 kernel: io scheduler kyber registered Sep 16 05:31:08.935016 kernel: io scheduler bfq registered Sep 16 05:31:08.935064 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 121 Sep 
16 05:31:08.935113 kernel: pcieport 0000:00:1b.0: PME: Signaling with IRQ 122 Sep 16 05:31:08.935164 kernel: pcieport 0000:00:1b.4: PME: Signaling with IRQ 123 Sep 16 05:31:08.935212 kernel: pcieport 0000:00:1b.5: PME: Signaling with IRQ 124 Sep 16 05:31:08.935262 kernel: pcieport 0000:00:1c.0: PME: Signaling with IRQ 125 Sep 16 05:31:08.935311 kernel: pcieport 0000:00:1c.3: PME: Signaling with IRQ 126 Sep 16 05:31:08.935366 kernel: thermal LNXTHERM:00: registered as thermal_zone0 Sep 16 05:31:08.935376 kernel: ACPI: thermal: Thermal Zone [TZ00] (28 C) Sep 16 05:31:08.935382 kernel: ERST: Error Record Serialization Table (ERST) support is initialized. Sep 16 05:31:08.935388 kernel: pstore: Using crash dump compression: deflate Sep 16 05:31:08.935393 kernel: pstore: Registered erst as persistent store backend Sep 16 05:31:08.935399 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 16 05:31:08.935405 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 16 05:31:08.935410 kernel: 00:02: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 16 05:31:08.935416 kernel: 00:03: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A Sep 16 05:31:08.935422 kernel: hpet_acpi_add: no address or irqs in _CRS Sep 16 05:31:08.935472 kernel: tpm_tis MSFT0101:00: 2.0 TPM (device-id 0x1B, rev-id 16) Sep 16 05:31:08.935481 kernel: i8042: PNP: No PS/2 controller found. Sep 16 05:31:08.935525 kernel: rtc_cmos rtc_cmos: RTC can wake from S4 Sep 16 05:31:08.935571 kernel: rtc_cmos rtc_cmos: registered as rtc0 Sep 16 05:31:08.935615 kernel: rtc_cmos rtc_cmos: setting system clock to 2025-09-16T05:31:07 UTC (1758000667) Sep 16 05:31:08.935660 kernel: rtc_cmos rtc_cmos: alarms up to one month, y3k, 114 bytes nvram Sep 16 05:31:08.935668 kernel: intel_pstate: Intel P-state driver initializing Sep 16 05:31:08.935674 kernel: intel_pstate: Disabling energy efficiency optimization Sep 16 05:31:08.935682 kernel: intel_pstate: HWP enabled Sep 16 05:31:08.935687 kernel: NET: Registered PF_INET6 protocol family Sep 16 05:31:08.935693 kernel: Segment Routing with IPv6 Sep 16 05:31:08.935698 kernel: In-situ OAM (IOAM) with IPv6 Sep 16 05:31:08.935704 kernel: NET: Registered PF_PACKET protocol family Sep 16 05:31:08.935710 kernel: Key type dns_resolver registered Sep 16 05:31:08.935715 kernel: ENERGY_PERF_BIAS: Set to 'normal', was 'performance' Sep 16 05:31:08.935721 kernel: microcode: Current revision: 0x000000f4 Sep 16 05:31:08.935726 kernel: IPI shorthand broadcast: enabled Sep 16 05:31:08.935733 kernel: sched_clock: Marking stable (4620000711, 1494264060)->(6743145616, -628880845) Sep 16 05:31:08.935739 kernel: registered taskstats version 1 Sep 16 05:31:08.935744 kernel: Loading compiled-in X.509 certificates Sep 16 05:31:08.935754 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.47-flatcar: d1d5b0d56b9b23dabf19e645632ff93bf659b3bf' Sep 16 05:31:08.935764 kernel: Demotion targets for Node 0: null Sep 16 05:31:08.935770 kernel: Key type .fscrypt registered Sep 16 05:31:08.935775 kernel: Key type fscrypt-provisioning registered Sep 16 05:31:08.935781 kernel: ima: Allocated hash algorithm: sha1 Sep 16 05:31:08.935787 kernel: ima: No architecture policies found Sep 16 05:31:08.935794 kernel: clk: Disabling unused clocks Sep 16 05:31:08.935800 kernel: Warning: unable to open an initial console. 
Sep 16 05:31:08.935806 kernel: Freeing unused kernel image (initmem) memory: 54096K Sep 16 05:31:08.935811 kernel: Write protecting the kernel read-only data: 24576k Sep 16 05:31:08.935817 kernel: Freeing unused kernel image (rodata/data gap) memory: 248K Sep 16 05:31:08.935823 kernel: Run /init as init process Sep 16 05:31:08.935828 kernel: with arguments: Sep 16 05:31:08.935834 kernel: /init Sep 16 05:31:08.935840 kernel: with environment: Sep 16 05:31:08.935846 kernel: HOME=/ Sep 16 05:31:08.935851 kernel: TERM=linux Sep 16 05:31:08.935857 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 16 05:31:08.935863 systemd[1]: Successfully made /usr/ read-only. Sep 16 05:31:08.935871 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 16 05:31:08.935877 systemd[1]: Detected architecture x86-64. Sep 16 05:31:08.935883 systemd[1]: Running in initrd. Sep 16 05:31:08.935890 systemd[1]: No hostname configured, using default hostname. Sep 16 05:31:08.935896 systemd[1]: Hostname set to . Sep 16 05:31:08.935902 systemd[1]: Initializing machine ID from random generator. Sep 16 05:31:08.935908 systemd[1]: Queued start job for default target initrd.target. Sep 16 05:31:08.935914 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 16 05:31:08.935920 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 16 05:31:08.935927 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 16 05:31:08.935933 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 16 05:31:08.935940 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 16 05:31:08.935946 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 16 05:31:08.935953 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 16 05:31:08.935959 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 16 05:31:08.935965 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 16 05:31:08.935970 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 16 05:31:08.935976 systemd[1]: Reached target paths.target - Path Units. Sep 16 05:31:08.935983 systemd[1]: Reached target slices.target - Slice Units. Sep 16 05:31:08.935989 systemd[1]: Reached target swap.target - Swaps. Sep 16 05:31:08.935995 systemd[1]: Reached target timers.target - Timer Units. Sep 16 05:31:08.936001 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 16 05:31:08.936007 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 16 05:31:08.936013 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 16 05:31:08.936019 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 16 05:31:08.936025 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. 
Sep 16 05:31:08.936032 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 16 05:31:08.936038 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 16 05:31:08.936044 systemd[1]: Reached target sockets.target - Socket Units. Sep 16 05:31:08.936050 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 16 05:31:08.936056 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 16 05:31:08.936062 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 16 05:31:08.936068 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 16 05:31:08.936074 systemd[1]: Starting systemd-fsck-usr.service... Sep 16 05:31:08.936080 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 16 05:31:08.936087 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 16 05:31:08.936105 systemd-journald[297]: Collecting audit messages is disabled. Sep 16 05:31:08.936120 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 16 05:31:08.936128 systemd-journald[297]: Journal started Sep 16 05:31:08.936142 systemd-journald[297]: Runtime Journal (/run/log/journal/8568f2b3c7dc4701bfb95c77cb182aa1) is 8M, max 640.1M, 632.1M free. Sep 16 05:31:08.921365 systemd-modules-load[300]: Inserted module 'overlay' Sep 16 05:31:08.949331 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 16 05:31:09.003983 systemd[1]: Started systemd-journald.service - Journal Service. Sep 16 05:31:09.003997 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 16 05:31:09.004005 kernel: Bridge firewalling registered Sep 16 05:31:08.955355 systemd-modules-load[300]: Inserted module 'br_netfilter' Sep 16 05:31:08.987011 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 16 05:31:09.015042 systemd[1]: Finished systemd-fsck-usr.service. Sep 16 05:31:09.040220 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 16 05:31:09.055225 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 05:31:09.080680 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 16 05:31:09.094430 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 16 05:31:09.128519 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 16 05:31:09.129558 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 16 05:31:09.135012 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 16 05:31:09.136734 systemd-tmpfiles[320]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 16 05:31:09.137056 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 16 05:31:09.137702 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 16 05:31:09.138517 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 16 05:31:09.139397 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Sep 16 05:31:09.141918 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 16 05:31:09.152035 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 16 05:31:09.159008 systemd-resolved[336]: Positive Trust Anchors: Sep 16 05:31:09.159013 systemd-resolved[336]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 16 05:31:09.159035 systemd-resolved[336]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 16 05:31:09.160455 systemd-resolved[336]: Defaulting to hostname 'linux'. Sep 16 05:31:09.162021 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 16 05:31:09.192094 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 16 05:31:09.208508 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 16 05:31:09.343615 dracut-cmdline[342]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=0b876f86a632750e9937176808a48c2452d5168964273bcfc3c72f2a26140c06 Sep 16 05:31:09.548799 kernel: SCSI subsystem initialized Sep 16 05:31:09.561784 kernel: Loading iSCSI transport class v2.0-870. Sep 16 05:31:09.573757 kernel: iscsi: registered transport (tcp) Sep 16 05:31:09.596892 kernel: iscsi: registered transport (qla4xxx) Sep 16 05:31:09.596909 kernel: QLogic iSCSI HBA Driver Sep 16 05:31:09.607190 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 16 05:31:09.644682 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 16 05:31:09.656242 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 16 05:31:09.704288 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 16 05:31:09.714191 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 16 05:31:09.818811 kernel: raid6: avx2x4 gen() 44573 MB/s Sep 16 05:31:09.839782 kernel: raid6: avx2x2 gen() 54833 MB/s Sep 16 05:31:09.865834 kernel: raid6: avx2x1 gen() 45927 MB/s Sep 16 05:31:09.865851 kernel: raid6: using algorithm avx2x2 gen() 54833 MB/s Sep 16 05:31:09.892913 kernel: raid6: .... xor() 32967 MB/s, rmw enabled Sep 16 05:31:09.892930 kernel: raid6: using avx2x2 recovery algorithm Sep 16 05:31:09.913789 kernel: xor: automatically using best checksumming function avx Sep 16 05:31:10.016765 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 16 05:31:10.020442 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 16 05:31:10.030818 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
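The raid6/xor lines above show the kernel benchmarking each available SIMD routine at boot and keeping the fastest one (avx2x2 at 54833 MB/s here, with plain AVX chosen for xor checksumming). A minimal Python sketch of that pick-the-fastest-implementation pattern; the candidate functions are hypothetical stand-ins, not the kernel's actual gen_syndrome code:

    import time

    def pick_fastest(candidates, payload, rounds=3):
        # Time each candidate on the same payload and keep the quickest,
        # mirroring the "raid6: using algorithm avx2x2" selection above.
        best_name, best_rate = None, 0.0
        for name, fn in candidates.items():
            start = time.perf_counter()
            for _ in range(rounds):
                fn(payload)
            rate = rounds * len(payload) / (time.perf_counter() - start) / 1e6
            print(f"{name}: {rate:.0f} MB/s")
            if rate > best_rate:
                best_name, best_rate = name, rate
        return best_name

    # Hypothetical stand-ins for the avx2x1/avx2x2/avx2x4 syndrome generators.
    payload = bytes(1 << 18)
    candidates = {
        "xor_python_loop": lambda buf: bytes(b ^ 0x5A for b in buf),
        "xor_int_bulk": lambda buf: (int.from_bytes(buf, "little") ^ 0x5A)
                                     .to_bytes(len(buf), "little"),
    }
    print("using", pick_fastest(candidates, payload))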
Sep 16 05:31:10.079528 systemd-udevd[553]: Using default interface naming scheme 'v255'. Sep 16 05:31:10.082830 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 16 05:31:10.110681 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 16 05:31:10.166768 dracut-pre-trigger[566]: rd.md=0: removing MD RAID activation Sep 16 05:31:10.185454 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 16 05:31:10.197920 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 16 05:31:10.279514 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 16 05:31:10.295760 kernel: cryptd: max_cpu_qlen set to 1000 Sep 16 05:31:10.308662 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 16 05:31:10.378832 kernel: pps_core: LinuxPPS API ver. 1 registered Sep 16 05:31:10.378850 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Sep 16 05:31:10.378858 kernel: AES CTR mode by8 optimization enabled Sep 16 05:31:10.378865 kernel: sdhci: Secure Digital Host Controller Interface driver Sep 16 05:31:10.378877 kernel: sdhci: Copyright(c) Pierre Ossman Sep 16 05:31:10.378884 kernel: PTP clock support registered Sep 16 05:31:10.378891 kernel: ACPI: bus type USB registered Sep 16 05:31:10.378898 kernel: usbcore: registered new interface driver usbfs Sep 16 05:31:10.378904 kernel: usbcore: registered new interface driver hub Sep 16 05:31:10.378911 kernel: libata version 3.00 loaded. Sep 16 05:31:10.378918 kernel: usbcore: registered new device driver usb Sep 16 05:31:10.378925 kernel: ahci 0000:00:17.0: version 3.0 Sep 16 05:31:10.379021 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:10.379093 kernel: ahci 0000:00:17.0: AHCI vers 0001.0301, 32 command slots, 6 Gbps, SATA mode Sep 16 05:31:10.352447 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 16 05:31:10.406819 kernel: ahci 0000:00:17.0: 7/7 ports implemented (port mask 0x7f) Sep 16 05:31:10.406929 kernel: ahci 0000:00:17.0: flags: 64bit ncq sntf clo only pio slum part ems deso sadm sds apst Sep 16 05:31:10.406998 kernel: scsi host0: ahci Sep 16 05:31:10.407065 kernel: scsi host1: ahci Sep 16 05:31:10.352527 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
Sep 16 05:31:10.507867 kernel: scsi host2: ahci Sep 16 05:31:10.507996 kernel: scsi host3: ahci Sep 16 05:31:10.508083 kernel: scsi host4: ahci Sep 16 05:31:10.508181 kernel: scsi host5: ahci Sep 16 05:31:10.508264 kernel: scsi host6: ahci Sep 16 05:31:10.508349 kernel: ata1: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516100 irq 127 lpm-pol 0 Sep 16 05:31:10.508361 kernel: ata2: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516180 irq 127 lpm-pol 0 Sep 16 05:31:10.508372 kernel: ata3: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516200 irq 127 lpm-pol 0 Sep 16 05:31:10.508382 kernel: ata4: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516280 irq 127 lpm-pol 0 Sep 16 05:31:10.508393 kernel: ata5: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516300 irq 127 lpm-pol 0 Sep 16 05:31:10.508403 kernel: ata6: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516380 irq 127 lpm-pol 0 Sep 16 05:31:10.508413 kernel: ata7: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516400 irq 127 lpm-pol 0 Sep 16 05:31:10.508423 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:10.482920 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 16 05:31:10.674654 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Sep 16 05:31:10.674755 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 1 Sep 16 05:31:10.674826 kernel: igb: Intel(R) Gigabit Ethernet Network Driver Sep 16 05:31:10.674837 kernel: xhci_hcd 0000:00:14.0: hcc params 0x200077c1 hci version 0x110 quirks 0x0000000000009810 Sep 16 05:31:10.674902 kernel: igb: Copyright (c) 2007-2014 Intel Corporation. Sep 16 05:31:10.674910 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Sep 16 05:31:10.674972 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 2 Sep 16 05:31:10.675033 kernel: xhci_hcd 0000:00:14.0: Host supports USB 3.1 Enhanced SuperSpeed Sep 16 05:31:10.675093 kernel: hub 1-0:1.0: USB hub found Sep 16 05:31:10.675163 kernel: hub 1-0:1.0: 16 ports detected Sep 16 05:31:10.675223 kernel: igb 0000:03:00.0: added PHC on eth0 Sep 16 05:31:10.675293 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:10.675356 kernel: hub 2-0:1.0: USB hub found Sep 16 05:31:10.675421 kernel: hub 2-0:1.0: 10 ports detected Sep 16 05:31:10.675479 kernel: igb 0000:03:00.0: Intel(R) Gigabit Ethernet Network Connection Sep 16 05:31:10.675541 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:10.675601 kernel: igb 0000:03:00.0: eth0: (PCIe:2.5Gb/s:Width x1) 3c:ec:ef:70:d8:b0 Sep 16 05:31:10.675663 kernel: igb 0000:03:00.0: eth0: PBA No: 010000-000 Sep 16 05:31:10.675726 kernel: igb 0000:03:00.0: Using MSI-X interrupts. 4 rx queue(s), 4 tx queue(s) Sep 16 05:31:10.675798 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:10.675860 kernel: igb 0000:04:00.0: added PHC on eth1 Sep 16 05:31:10.675925 kernel: igb 0000:04:00.0: Intel(R) Gigabit Ethernet Network Connection Sep 16 05:31:10.675986 kernel: igb 0000:04:00.0: eth1: (PCIe:2.5Gb/s:Width x1) 3c:ec:ef:70:d8:b1 Sep 16 05:31:10.676047 kernel: igb 0000:04:00.0: eth1: PBA No: 010000-000 Sep 16 05:31:10.508437 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 16 05:31:10.713013 kernel: igb 0000:04:00.0: Using MSI-X interrupts. 
4 rx queue(s), 4 tx queue(s) Sep 16 05:31:10.713099 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:10.655038 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 16 05:31:10.749016 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 05:31:10.799765 kernel: ata5: SATA link down (SStatus 0 SControl 300) Sep 16 05:31:10.799809 kernel: ata3: SATA link down (SStatus 0 SControl 300) Sep 16 05:31:10.805797 kernel: ata2: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Sep 16 05:31:10.812757 kernel: ata6: SATA link down (SStatus 0 SControl 300) Sep 16 05:31:10.817782 kernel: ata1: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Sep 16 05:31:10.823752 kernel: usb 1-14: new high-speed USB device number 2 using xhci_hcd Sep 16 05:31:10.823785 kernel: ata7: SATA link down (SStatus 0 SControl 300) Sep 16 05:31:10.836753 kernel: ata4: SATA link down (SStatus 0 SControl 300) Sep 16 05:31:10.842795 kernel: ata2.00: Model 'Micron_5200_MTFDDAK480TDN', rev ' D1MU020', applying quirks: zeroaftertrim Sep 16 05:31:10.859171 kernel: ata2.00: ATA-10: Micron_5200_MTFDDAK480TDN, D1MU020, max UDMA/133 Sep 16 05:31:10.859792 kernel: ata1.00: Model 'Micron_5200_MTFDDAK480TDN', rev ' D1MU020', applying quirks: zeroaftertrim Sep 16 05:31:10.876352 kernel: ata1.00: ATA-10: Micron_5200_MTFDDAK480TDN, D1MU020, max UDMA/133 Sep 16 05:31:10.891448 kernel: ata2.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Sep 16 05:31:10.891567 kernel: ata2.00: Features: NCQ-prio Sep 16 05:31:10.891755 kernel: ata1.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Sep 16 05:31:10.902716 kernel: ata1.00: Features: NCQ-prio Sep 16 05:31:10.912780 kernel: ata2.00: configured for UDMA/133 Sep 16 05:31:10.912796 kernel: ata1.00: configured for UDMA/133 Sep 16 05:31:10.917834 kernel: scsi 0:0:0:0: Direct-Access ATA Micron_5200_MTFD U020 PQ: 0 ANSI: 5 Sep 16 05:31:10.934756 kernel: scsi 1:0:0:0: Direct-Access ATA Micron_5200_MTFD U020 PQ: 0 ANSI: 5 Sep 16 05:31:10.941756 kernel: igb 0000:04:00.0 eno2: renamed from eth1 Sep 16 05:31:10.941899 kernel: igb 0000:03:00.0 eno1: renamed from eth0 Sep 16 05:31:10.942001 kernel: ata1.00: Enabling discard_zeroes_data Sep 16 05:31:10.963933 kernel: sd 0:0:0:0: [sda] 937703088 512-byte logical blocks: (480 GB/447 GiB) Sep 16 05:31:10.964081 kernel: ata2.00: Enabling discard_zeroes_data Sep 16 05:31:10.964095 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks Sep 16 05:31:10.964188 kernel: sd 1:0:0:0: [sdb] 937703088 512-byte logical blocks: (480 GB/447 GiB) Sep 16 05:31:10.966761 kernel: hub 1-14:1.0: USB hub found Sep 16 05:31:10.966905 kernel: hub 1-14:1.0: 4 ports detected Sep 16 05:31:10.967000 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:10.976666 kernel: sd 0:0:0:0: [sda] Write Protect is off Sep 16 05:31:10.976851 kernel: sd 1:0:0:0: [sdb] 4096-byte physical blocks Sep 16 05:31:10.980517 kernel: sd 0:0:0:0: [sda] Mode Sense: 00 3a 00 00 Sep 16 05:31:10.991985 kernel: sd 1:0:0:0: [sdb] Write Protect is off Sep 16 05:31:10.992129 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Sep 16 05:31:10.996763 kernel: sd 1:0:0:0: [sdb] Mode Sense: 00 3a 00 00 Sep 16 05:31:10.996886 kernel: sd 1:0:0:0: [sdb] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Sep 16 05:31:11.002016 kernel: sd 0:0:0:0: [sda] Preferred minimum I/O size 4096 bytes Sep 16 05:31:11.007099 
kernel: sd 1:0:0:0: [sdb] Preferred minimum I/O size 4096 bytes Sep 16 05:31:11.046410 kernel: ata1.00: Enabling discard_zeroes_data Sep 16 05:31:11.051609 kernel: ata2.00: Enabling discard_zeroes_data Sep 16 05:31:11.177755 kernel: sd 1:0:0:0: [sdb] Attached SCSI disk Sep 16 05:31:11.177851 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 16 05:31:11.177863 kernel: mlx5_core 0000:01:00.0: PTM is not supported by PCIe Sep 16 05:31:11.177939 kernel: mlx5_core 0000:01:00.0: firmware version: 14.28.2006 Sep 16 05:31:11.178036 kernel: mlx5_core 0000:01:00.0: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Sep 16 05:31:11.186942 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:11.187027 kernel: GPT:9289727 != 937703087 Sep 16 05:31:11.187036 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 16 05:31:11.187043 kernel: GPT:9289727 != 937703087 Sep 16 05:31:11.187049 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 16 05:31:11.187055 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 16 05:31:11.246230 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Sep 16 05:31:11.253849 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:11.253932 kernel: usb 1-14.1: new low-speed USB device number 3 using xhci_hcd Sep 16 05:31:11.301653 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Micron_5200_MTFDDAK480TDN EFI-SYSTEM. Sep 16 05:31:11.318249 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Micron_5200_MTFDDAK480TDN ROOT. Sep 16 05:31:11.329239 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Micron_5200_MTFDDAK480TDN OEM. Sep 16 05:31:11.398857 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:11.398981 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 16 05:31:11.398995 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:11.399092 kernel: usbcore: registered new interface driver usbhid Sep 16 05:31:11.399105 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:11.399199 kernel: usbhid: USB HID core driver Sep 16 05:31:11.348458 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Micron_5200_MTFDDAK480TDN USR-A. Sep 16 05:31:11.431792 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.0/0003:0557:2419.0001/input/input0 Sep 16 05:31:11.420863 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Micron_5200_MTFDDAK480TDN USR-A. 
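The GPT warnings above ("Primary header thinks Alt. header is not at the end of the disk", "9289727 != 937703087") mean the primary header records its alternate header at LBA 9289727, where the originally written image ended, instead of at the disk's real last LBA 937703087; the disk-uuid step just below rewrites the headers at the true end of the device. A minimal read-only Python sketch of that check, assuming 512-byte logical sectors (as reported for sda/sdb above) and the standard UEFI GPT header layout; the function name and device path are illustrative:

    import struct

    SECTOR = 512  # matches the "512-byte logical blocks" reported above

    def gpt_backup_mismatch(dev_path):
        # Compare the primary GPT header's AlternateLBA field (offset 32,
        # little-endian u64) against the device's real last LBA, which is
        # exactly the mismatch the kernel warns about.
        with open(dev_path, "rb") as dev:
            dev.seek(0, 2)
            last_lba = dev.tell() // SECTOR - 1
            dev.seek(1 * SECTOR)          # primary GPT header lives at LBA 1
            header = dev.read(92)
        if header[:8] != b"EFI PART":
            raise ValueError("no GPT signature at LBA 1")
        alt_lba = struct.unpack_from("<Q", header, 32)[0]
        return alt_lba, last_lba, alt_lba != last_lba

    # On this machine it would return (9289727, 937703087, True) for /dev/sda
    # until the backup header is moved to the true end of the disk.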
Sep 16 05:31:11.442782 kernel: mlx5_core 0000:01:00.0: E-Switch: Total vports 10, per vport: max uc(1024) max mc(16384) Sep 16 05:31:11.455596 kernel: mlx5_core 0000:01:00.0: Port module event: module 0, Cable plugged Sep 16 05:31:11.478792 kernel: hid-generic 0003:0557:2419.0001: input,hidraw0: USB HID v1.00 Keyboard [HID 0557:2419] on usb-0000:00:14.0-14.1/input0 Sep 16 05:31:11.479412 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:11.479943 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.1/0003:0557:2419.0002/input/input1 Sep 16 05:31:11.511495 kernel: hid-generic 0003:0557:2419.0002: input,hidraw1: USB HID v1.00 Mouse [HID 0557:2419] on usb-0000:00:14.0-14.1/input1 Sep 16 05:31:11.512802 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:11.523562 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 16 05:31:11.554126 disk-uuid[762]: Primary Header is updated. Sep 16 05:31:11.554126 disk-uuid[762]: Secondary Entries is updated. Sep 16 05:31:11.554126 disk-uuid[762]: Secondary Header is updated. Sep 16 05:31:11.587829 kernel: ata1.00: Enabling discard_zeroes_data Sep 16 05:31:11.587841 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 16 05:31:11.587848 kernel: ata1.00: Enabling discard_zeroes_data Sep 16 05:31:11.602798 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 16 05:31:11.698762 kernel: mlx5_core 0000:01:00.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic) Sep 16 05:31:11.698882 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:11.712116 kernel: mlx5_core 0000:01:00.1: PTM is not supported by PCIe Sep 16 05:31:11.712209 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:11.712278 kernel: mlx5_core 0000:01:00.1: firmware version: 14.28.2006 Sep 16 05:31:11.735455 kernel: mlx5_core 0000:01:00.1: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Sep 16 05:31:12.020755 kernel: mlx5_core 0000:01:00.1: E-Switch: Total vports 10, per vport: max uc(1024) max mc(16384) Sep 16 05:31:12.033064 kernel: mlx5_core 0000:01:00.1: Port module event: module 1, Cable plugged Sep 16 05:31:12.287802 kernel: mlx5_core 0000:01:00.1: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic) Sep 16 05:31:12.287953 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:12.302549 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:12.311786 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: renamed from eth1 Sep 16 05:31:12.311902 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: renamed from eth0 Sep 16 05:31:12.325335 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 16 05:31:12.334262 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 16 05:31:12.342917 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 16 05:31:12.362992 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 16 05:31:12.393286 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 16 05:31:12.450368 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 16 05:31:12.584170 kernel: ata1.00: Enabling discard_zeroes_data Sep 16 05:31:12.602724 disk-uuid[763]: The operation has completed successfully. 
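The rename messages (eth0/eth1 becoming eno1/eno2 for the onboard igb ports earlier, and enp1s0f0np0/enp1s0f1np1 for the mlx5 ports just above) come from udev's predictable interface naming: eno* encodes a firmware-provided onboard index, while enp*s*f*np* encodes the PCI bus, slot, function and port. A rough Python sketch of the PCI-path form only; the real net_id udev builtin handles many more cases (onboard names, dropping f0 for single-function devices, USB paths, SR-IOV), so this is illustrative rather than the actual rule set:

    def pci_path_name(pci_addr, function_port=None, prefix="en"):
        # 0000:01:00.1 with port 1 becomes enp1s0f1np1, matching the
        # rename messages above.
        _, bus, slotfunc = pci_addr.split(":")
        slot, func = slotfunc.split(".")
        name = f"{prefix}p{int(bus, 16)}s{int(slot, 16)}f{int(func, 16)}"
        if function_port is not None:
            name += f"np{function_port}"
        return name

    print(pci_path_name("0000:01:00.1", function_port=1))  # -> enp1s0f1np1
    print(pci_path_name("0000:01:00.0", function_port=0))  # -> enp1s0f0np0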
Sep 16 05:31:12.609853 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 16 05:31:12.640474 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 16 05:31:12.640521 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 16 05:31:12.681541 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 16 05:31:12.708805 sh[804]: Success Sep 16 05:31:12.735980 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 16 05:31:12.736001 kernel: device-mapper: uevent: version 1.0.3 Sep 16 05:31:12.745226 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 16 05:31:12.757813 kernel: device-mapper: verity: sha256 using shash "sha256-avx2" Sep 16 05:31:12.804471 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 16 05:31:12.814112 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 16 05:31:12.852154 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 16 05:31:12.899879 kernel: BTRFS: device fsid f1b91845-3914-4d21-a370-6d760ee45b2e devid 1 transid 36 /dev/mapper/usr (254:0) scanned by mount (816) Sep 16 05:31:12.899895 kernel: BTRFS info (device dm-0): first mount of filesystem f1b91845-3914-4d21-a370-6d760ee45b2e Sep 16 05:31:12.899902 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 16 05:31:12.918049 kernel: BTRFS info (device dm-0): enabling ssd optimizations Sep 16 05:31:12.918067 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 16 05:31:12.925754 kernel: BTRFS info (device dm-0): enabling free space tree Sep 16 05:31:12.927944 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 16 05:31:12.935050 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 16 05:31:12.952265 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 16 05:31:12.954306 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 16 05:31:12.984520 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 16 05:31:13.057810 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (847) Sep 16 05:31:13.057836 kernel: BTRFS info (device sda6): first mount of filesystem 8b047ef5-4757-404a-b211-2a505a425364 Sep 16 05:31:13.065907 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 16 05:31:13.075587 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 16 05:31:13.117867 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 16 05:31:13.117881 kernel: BTRFS info (device sda6): turning on async discard Sep 16 05:31:13.117888 kernel: BTRFS info (device sda6): enabling free space tree Sep 16 05:31:13.117895 kernel: BTRFS info (device sda6): last unmount of filesystem 8b047ef5-4757-404a-b211-2a505a425364 Sep 16 05:31:13.107961 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 16 05:31:13.127653 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 16 05:31:13.134885 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
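The verity-setup step above builds /dev/mapper/usr from the partition named by verity.usr=PARTUUID=... on the kernel command line and only lets the boot proceed if the device's hash-tree root matches verity.usrhash=0b876f86... ("device-mapper: verity: sha256 using shash sha256-avx2"). A simplified Python sketch of how such a root digest is built from 4 KiB blocks; real dm-verity adds a superblock, salt handling and on-disk hash-block packing, so treat this as the idea only:

    import hashlib

    BLOCK = 4096                   # dm-verity's default data/hash block size
    PER_HASH_BLOCK = BLOCK // 32   # 128 SHA-256 digests fit in one hash block

    def verity_root(data, salt=b""):
        # Hash every 4 KiB data block, then repeatedly hash groups of digests
        # until a single root remains; booting succeeds only if this root
        # equals the verity.usrhash value passed on the kernel command line.
        level = [hashlib.sha256(salt + data[i:i + BLOCK]).digest()
                 for i in range(0, len(data), BLOCK)] or [hashlib.sha256(salt).digest()]
        while len(level) > 1:
            level = [hashlib.sha256(salt + b"".join(level[i:i + PER_HASH_BLOCK])).digest()
                     for i in range(0, len(level), PER_HASH_BLOCK)]
        return level[0].hex()

    print(verity_root(b"\x00" * (8 * BLOCK)))   # deterministic toy example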
Sep 16 05:31:13.192843 systemd-networkd[986]: lo: Link UP Sep 16 05:31:13.192846 systemd-networkd[986]: lo: Gained carrier Sep 16 05:31:13.195163 systemd-networkd[986]: Enumeration completed Sep 16 05:31:13.195236 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 16 05:31:13.195788 systemd-networkd[986]: eno1: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 16 05:31:13.198969 systemd[1]: Reached target network.target - Network. Sep 16 05:31:13.223617 systemd-networkd[986]: eno2: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 16 05:31:13.250266 ignition[985]: Ignition 2.22.0 Sep 16 05:31:13.250743 systemd-networkd[986]: enp1s0f0np0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 16 05:31:13.250270 ignition[985]: Stage: fetch-offline Sep 16 05:31:13.253192 unknown[985]: fetched base config from "system" Sep 16 05:31:13.250290 ignition[985]: no configs at "/usr/lib/ignition/base.d" Sep 16 05:31:13.253196 unknown[985]: fetched user config from "system" Sep 16 05:31:13.250295 ignition[985]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 16 05:31:13.254416 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 16 05:31:13.250343 ignition[985]: parsed url from cmdline: "" Sep 16 05:31:13.274093 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Sep 16 05:31:13.250345 ignition[985]: no config URL provided Sep 16 05:31:13.274673 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 16 05:31:13.250347 ignition[985]: reading system config file "/usr/lib/ignition/user.ign" Sep 16 05:31:13.250369 ignition[985]: parsing config with SHA512: db4fd702f87e5a16f5376c010818ff7d69baedba3426f22827c8a5f78b51e9eef564fb9af5432d1ea1d1b53cb10a8531fb30548770bdfe015c8d0b14dbf9ee62 Sep 16 05:31:13.253384 ignition[985]: fetch-offline: fetch-offline passed Sep 16 05:31:13.253387 ignition[985]: POST message to Packet Timeline Sep 16 05:31:13.253389 ignition[985]: POST Status error: resource requires networking Sep 16 05:31:13.253422 ignition[985]: Ignition finished successfully Sep 16 05:31:13.419003 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: Link up Sep 16 05:31:13.308906 ignition[1001]: Ignition 2.22.0 Sep 16 05:31:13.308910 ignition[1001]: Stage: kargs Sep 16 05:31:13.421297 systemd-networkd[986]: enp1s0f1np1: Configuring with /usr/lib/systemd/network/zz-default.network. 
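Ignition above reads /usr/lib/ignition/user.ign and logs "parsing config with SHA512: db4fd702..."; that digest should simply be the SHA-512 of the raw config bytes, so it can be recomputed off-box to confirm which config a machine actually booted with. A minimal sketch (the path is the one in the log):

    import hashlib

    def config_digest(path="/usr/lib/ignition/user.ign"):
        # Reproduce the "parsing config with SHA512: ..." value Ignition logs.
        with open(path, "rb") as f:
            return hashlib.sha512(f.read()).hexdigest()

    # config_digest() -> "db4fd702f87e5a16..." for the config used on this boot.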
Sep 16 05:31:13.308986 ignition[1001]: no configs at "/usr/lib/ignition/base.d" Sep 16 05:31:13.308991 ignition[1001]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 16 05:31:13.309408 ignition[1001]: kargs: kargs passed Sep 16 05:31:13.309410 ignition[1001]: POST message to Packet Timeline Sep 16 05:31:13.309419 ignition[1001]: GET https://metadata.packet.net/metadata: attempt #1 Sep 16 05:31:13.309769 ignition[1001]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:36820->[::1]:53: read: connection refused Sep 16 05:31:13.510031 ignition[1001]: GET https://metadata.packet.net/metadata: attempt #2 Sep 16 05:31:13.510355 ignition[1001]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:42156->[::1]:53: read: connection refused Sep 16 05:31:13.603791 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Sep 16 05:31:13.607862 systemd-networkd[986]: eno1: Link UP Sep 16 05:31:13.608014 systemd-networkd[986]: eno2: Link UP Sep 16 05:31:13.608155 systemd-networkd[986]: enp1s0f0np0: Link UP Sep 16 05:31:13.608317 systemd-networkd[986]: enp1s0f0np0: Gained carrier Sep 16 05:31:13.622121 systemd-networkd[986]: enp1s0f1np1: Link UP Sep 16 05:31:13.622924 systemd-networkd[986]: enp1s0f1np1: Gained carrier Sep 16 05:31:13.658016 systemd-networkd[986]: enp1s0f0np0: DHCPv4 address 139.178.94.21/31, gateway 139.178.94.20 acquired from 145.40.83.140 Sep 16 05:31:13.910919 ignition[1001]: GET https://metadata.packet.net/metadata: attempt #3 Sep 16 05:31:13.912221 ignition[1001]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:56166->[::1]:53: read: connection refused Sep 16 05:31:14.620391 systemd-networkd[986]: enp1s0f0np0: Gained IPv6LL Sep 16 05:31:14.713408 ignition[1001]: GET https://metadata.packet.net/metadata: attempt #4 Sep 16 05:31:14.714736 ignition[1001]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:50692->[::1]:53: read: connection refused Sep 16 05:31:15.196388 systemd-networkd[986]: enp1s0f1np1: Gained IPv6LL Sep 16 05:31:16.315905 ignition[1001]: GET https://metadata.packet.net/metadata: attempt #5 Sep 16 05:31:16.317089 ignition[1001]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:33638->[::1]:53: read: connection refused Sep 16 05:31:19.519732 ignition[1001]: GET https://metadata.packet.net/metadata: attempt #6 Sep 16 05:31:22.435387 ignition[1001]: GET result: OK Sep 16 05:31:22.499797 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:24.411723 ignition[1001]: Ignition finished successfully Sep 16 05:31:24.417891 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 16 05:31:24.429688 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
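The kargs stage above keeps retrying https://metadata.packet.net/metadata while name resolution still fails against [::1]:53 (the NICs only gain carrier and a DHCPv4 lease partway through), and the timestamps show the gap between attempts roughly doubling until attempt #6 finally returns OK. A minimal Python sketch of that fetch-with-exponential-backoff pattern; the URL is the one in the log, while the timing constants are illustrative rather than Ignition's actual values:

    import time
    import urllib.error
    import urllib.request

    def fetch_with_backoff(url, first_delay=0.2, max_delay=30.0, timeout=10):
        # Retry a GET with exponentially growing delays until it succeeds,
        # mirroring the attempt #1..#6 pattern in the log above.
        delay, attempt = first_delay, 0
        while True:
            attempt += 1
            try:
                with urllib.request.urlopen(url, timeout=timeout) as resp:
                    print(f"GET result: OK (attempt #{attempt})")
                    return resp.read()
            except (urllib.error.URLError, OSError) as err:
                print(f"GET error on attempt #{attempt}: {err}; retrying in {delay:.1f}s")
                time.sleep(delay)
                delay = min(delay * 2, max_delay)

    # fetch_with_backoff("https://metadata.packet.net/metadata")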
Sep 16 05:31:24.475425 ignition[1023]: Ignition 2.22.0 Sep 16 05:31:24.475431 ignition[1023]: Stage: disks Sep 16 05:31:24.475518 ignition[1023]: no configs at "/usr/lib/ignition/base.d" Sep 16 05:31:24.475524 ignition[1023]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 16 05:31:24.475988 ignition[1023]: disks: disks passed Sep 16 05:31:24.475991 ignition[1023]: POST message to Packet Timeline Sep 16 05:31:24.475999 ignition[1023]: GET https://metadata.packet.net/metadata: attempt #1 Sep 16 05:31:25.827166 ignition[1023]: GET result: OK Sep 16 05:31:26.320633 ignition[1023]: Ignition finished successfully Sep 16 05:31:26.325620 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 16 05:31:26.337974 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 16 05:31:26.356008 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 16 05:31:26.375034 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 16 05:31:26.394076 systemd[1]: Reached target sysinit.target - System Initialization. Sep 16 05:31:26.412066 systemd[1]: Reached target basic.target - Basic System. Sep 16 05:31:26.431601 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 16 05:31:26.485332 systemd-fsck[1044]: ROOT: clean, 15/553520 files, 52789/553472 blocks Sep 16 05:31:26.494171 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 16 05:31:26.494919 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 16 05:31:26.615601 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 16 05:31:26.628983 kernel: EXT4-fs (sda9): mounted filesystem fb1cb44f-955b-4cd0-8849-33ce3640d547 r/w with ordered data mode. Quota mode: none. Sep 16 05:31:26.623148 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 16 05:31:26.630087 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 16 05:31:26.671436 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 16 05:31:26.717941 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/sda6 (8:6) scanned by mount (1054) Sep 16 05:31:26.717956 kernel: BTRFS info (device sda6): first mount of filesystem 8b047ef5-4757-404a-b211-2a505a425364 Sep 16 05:31:26.717963 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 16 05:31:26.717970 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 16 05:31:26.679586 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Sep 16 05:31:26.750989 kernel: BTRFS info (device sda6): turning on async discard Sep 16 05:31:26.751001 kernel: BTRFS info (device sda6): enabling free space tree Sep 16 05:31:26.743094 systemd[1]: Starting flatcar-static-network.service - Flatcar Static Network Agent... Sep 16 05:31:26.769798 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 16 05:31:26.769816 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 16 05:31:26.812873 coreos-metadata[1056]: Sep 16 05:31:26.806 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Sep 16 05:31:26.781129 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 16 05:31:26.846994 coreos-metadata[1072]: Sep 16 05:31:26.806 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Sep 16 05:31:26.796948 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 16 05:31:26.821771 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 16 05:31:26.892503 initrd-setup-root[1086]: cut: /sysroot/etc/passwd: No such file or directory Sep 16 05:31:26.900886 initrd-setup-root[1093]: cut: /sysroot/etc/group: No such file or directory Sep 16 05:31:26.910860 initrd-setup-root[1100]: cut: /sysroot/etc/shadow: No such file or directory Sep 16 05:31:26.920868 initrd-setup-root[1107]: cut: /sysroot/etc/gshadow: No such file or directory Sep 16 05:31:26.961025 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 16 05:31:26.970679 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 16 05:31:26.980508 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 16 05:31:27.019873 kernel: BTRFS info (device sda6): last unmount of filesystem 8b047ef5-4757-404a-b211-2a505a425364 Sep 16 05:31:27.006655 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 16 05:31:27.025701 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 16 05:31:27.059013 ignition[1175]: INFO : Ignition 2.22.0 Sep 16 05:31:27.059013 ignition[1175]: INFO : Stage: mount Sep 16 05:31:27.072953 ignition[1175]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 16 05:31:27.072953 ignition[1175]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 16 05:31:27.072953 ignition[1175]: INFO : mount: mount passed Sep 16 05:31:27.072953 ignition[1175]: INFO : POST message to Packet Timeline Sep 16 05:31:27.072953 ignition[1175]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Sep 16 05:31:27.882623 coreos-metadata[1072]: Sep 16 05:31:27.882 INFO Fetch successful Sep 16 05:31:27.924814 systemd[1]: flatcar-static-network.service: Deactivated successfully. Sep 16 05:31:27.924887 systemd[1]: Finished flatcar-static-network.service - Flatcar Static Network Agent. Sep 16 05:31:27.984523 coreos-metadata[1056]: Sep 16 05:31:27.984 INFO Fetch successful Sep 16 05:31:28.043430 coreos-metadata[1056]: Sep 16 05:31:28.043 INFO wrote hostname ci-4459.0.0-n-e7bf2d745b to /sysroot/etc/hostname Sep 16 05:31:28.045125 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 16 05:31:28.209077 ignition[1175]: INFO : GET result: OK Sep 16 05:31:29.897563 ignition[1175]: INFO : Ignition finished successfully Sep 16 05:31:29.902243 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 16 05:31:29.920042 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 16 05:31:29.951506 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 16 05:31:30.001789 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/sda6 (8:6) scanned by mount (1200) Sep 16 05:31:30.019267 kernel: BTRFS info (device sda6): first mount of filesystem 8b047ef5-4757-404a-b211-2a505a425364 Sep 16 05:31:30.019286 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 16 05:31:30.035168 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 16 05:31:30.035184 kernel: BTRFS info (device sda6): turning on async discard Sep 16 05:31:30.041285 kernel: BTRFS info (device sda6): enabling free space tree Sep 16 05:31:30.042783 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
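Above, coreos-metadata fetches https://metadata.packet.net/metadata and flatcar-metadata-hostname writes the instance's name (ci-4459.0.0-n-e7bf2d745b) into /sysroot/etc/hostname before the real root is entered. A rough Python equivalent of that step, assuming the metadata document exposes a top-level hostname field; the function name and error handling are illustrative:

    import json
    import urllib.request

    METADATA_URL = "https://metadata.packet.net/metadata"   # endpoint from the log

    def write_hostname(root="/sysroot"):
        # Fetch instance metadata and persist its hostname, roughly what the
        # flatcar-metadata-hostname step above does.
        with urllib.request.urlopen(METADATA_URL, timeout=10) as resp:
            meta = json.load(resp)
        hostname = meta["hostname"]          # e.g. ci-4459.0.0-n-e7bf2d745b
        with open(f"{root}/etc/hostname", "w") as f:
            f.write(hostname + "\n")
        return hostname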
Sep 16 05:31:30.079097 ignition[1217]: INFO : Ignition 2.22.0 Sep 16 05:31:30.079097 ignition[1217]: INFO : Stage: files Sep 16 05:31:30.092035 ignition[1217]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 16 05:31:30.092035 ignition[1217]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 16 05:31:30.092035 ignition[1217]: DEBUG : files: compiled without relabeling support, skipping Sep 16 05:31:30.092035 ignition[1217]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 16 05:31:30.092035 ignition[1217]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 16 05:31:30.092035 ignition[1217]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 16 05:31:30.092035 ignition[1217]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 16 05:31:30.092035 ignition[1217]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 16 05:31:30.092035 ignition[1217]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Sep 16 05:31:30.092035 ignition[1217]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Sep 16 05:31:30.082538 unknown[1217]: wrote ssh authorized keys file for user: core Sep 16 05:31:30.221046 ignition[1217]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 16 05:31:30.347746 ignition[1217]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Sep 16 05:31:30.364002 ignition[1217]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 16 05:31:30.364002 ignition[1217]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 16 05:31:30.364002 ignition[1217]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 16 05:31:30.364002 ignition[1217]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 16 05:31:30.364002 ignition[1217]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 16 05:31:30.364002 ignition[1217]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 16 05:31:30.364002 ignition[1217]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 16 05:31:30.364002 ignition[1217]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 16 05:31:30.364002 ignition[1217]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 16 05:31:30.364002 ignition[1217]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 16 05:31:30.364002 ignition[1217]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 16 05:31:30.364002 ignition[1217]: INFO : files: createFilesystemsFiles: createFiles: op(9): 
[finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 16 05:31:30.364002 ignition[1217]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 16 05:31:30.364002 ignition[1217]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1 Sep 16 05:31:35.384259 ignition[1217]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 16 05:31:35.782980 ignition[1217]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 16 05:31:35.782980 ignition[1217]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 16 05:31:35.819040 ignition[1217]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 16 05:31:35.819040 ignition[1217]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 16 05:31:35.819040 ignition[1217]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 16 05:31:35.819040 ignition[1217]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Sep 16 05:31:35.819040 ignition[1217]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Sep 16 05:31:35.819040 ignition[1217]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 16 05:31:35.819040 ignition[1217]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 16 05:31:35.819040 ignition[1217]: INFO : files: files passed Sep 16 05:31:35.819040 ignition[1217]: INFO : POST message to Packet Timeline Sep 16 05:31:35.819040 ignition[1217]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Sep 16 05:31:36.637044 ignition[1217]: INFO : GET result: OK Sep 16 05:31:37.399221 ignition[1217]: INFO : Ignition finished successfully Sep 16 05:31:37.404154 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 16 05:31:37.419605 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 16 05:31:37.438971 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 16 05:31:37.462230 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 16 05:31:37.462314 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 16 05:31:37.484913 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 16 05:31:37.498137 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 16 05:31:37.519120 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... 
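The files stage above is driven entirely by the Ignition config delivered for this machine: it adds SSH keys for the core user, downloads the Helm tarball and the kubernetes sysext image, creates the /etc/extensions/kubernetes.raw link, and enables prepare-helm.service. A sketch of the kind of Ignition v3 config that would produce those operations, assembled as a Python dict; the schema version, SSH key and unit contents are assumptions, while the URLs and paths are the ones logged above:

    import json

    config = {
        "ignition": {"version": "3.4.0"},
        "passwd": {"users": [
            {"name": "core",
             "sshAuthorizedKeys": ["ssh-ed25519 AAAA... placeholder"]},
        ]},
        "storage": {
            # install.sh, nginx.yaml, nfs-pod.yaml, nfs-pvc.yaml and
            # /etc/flatcar/update.conf would be further "files" entries.
            "files": [
                {"path": "/opt/helm-v3.17.0-linux-amd64.tar.gz",
                 "contents": {"source": "https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz"}},
                {"path": "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw",
                 "contents": {"source": "https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw"}},
            ],
            "links": [
                {"path": "/etc/extensions/kubernetes.raw",
                 "target": "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"},
            ],
        },
        "systemd": {"units": [
            {"name": "prepare-helm.service", "enabled": True,
             "contents": "[Unit]\nDescription=placeholder unit body\n"},
        ]},
    }
    print(json.dumps(config, indent=2))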
Sep 16 05:31:37.545856 initrd-setup-root-after-ignition[1257]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 16 05:31:37.545856 initrd-setup-root-after-ignition[1257]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 16 05:31:37.558977 initrd-setup-root-after-ignition[1261]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 16 05:31:37.621378 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 16 05:31:37.621444 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 16 05:31:37.639249 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 16 05:31:37.657940 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 16 05:31:37.674157 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 16 05:31:37.676557 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 16 05:31:37.753822 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 16 05:31:37.767496 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 16 05:31:37.837716 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 16 05:31:37.848409 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 16 05:31:37.868490 systemd[1]: Stopped target timers.target - Timer Units. Sep 16 05:31:37.885445 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 16 05:31:37.885879 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 16 05:31:37.922183 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 16 05:31:37.931379 systemd[1]: Stopped target basic.target - Basic System. Sep 16 05:31:37.948397 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 16 05:31:37.967409 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 16 05:31:37.986372 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 16 05:31:38.005381 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 16 05:31:38.025404 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 16 05:31:38.043505 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 16 05:31:38.063564 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 16 05:31:38.084408 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 16 05:31:38.102504 systemd[1]: Stopped target swap.target - Swaps. Sep 16 05:31:38.119403 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 16 05:31:38.119833 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 16 05:31:38.152138 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 16 05:31:38.162418 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 16 05:31:38.182274 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 16 05:31:38.182735 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 16 05:31:38.203272 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 16 05:31:38.203663 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. 
Sep 16 05:31:38.233497 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 16 05:31:38.233982 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 16 05:31:38.251546 systemd[1]: Stopped target paths.target - Path Units. Sep 16 05:31:38.267255 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 16 05:31:38.267696 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 16 05:31:38.286415 systemd[1]: Stopped target slices.target - Slice Units. Sep 16 05:31:38.303400 systemd[1]: Stopped target sockets.target - Socket Units. Sep 16 05:31:38.320378 systemd[1]: iscsid.socket: Deactivated successfully. Sep 16 05:31:38.320672 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 16 05:31:38.338410 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 16 05:31:38.338700 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 16 05:31:38.359528 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 16 05:31:38.359962 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 16 05:31:38.376461 systemd[1]: ignition-files.service: Deactivated successfully. Sep 16 05:31:38.494995 ignition[1282]: INFO : Ignition 2.22.0 Sep 16 05:31:38.494995 ignition[1282]: INFO : Stage: umount Sep 16 05:31:38.494995 ignition[1282]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 16 05:31:38.494995 ignition[1282]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 16 05:31:38.494995 ignition[1282]: INFO : umount: umount passed Sep 16 05:31:38.494995 ignition[1282]: INFO : POST message to Packet Timeline Sep 16 05:31:38.494995 ignition[1282]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Sep 16 05:31:38.376880 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 16 05:31:38.392379 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Sep 16 05:31:38.392738 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 16 05:31:38.412888 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 16 05:31:38.425394 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 16 05:31:38.441098 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 16 05:31:38.441232 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 16 05:31:38.457128 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 16 05:31:38.457246 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 16 05:31:38.496537 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 16 05:31:38.497039 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 16 05:31:38.497089 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 16 05:31:38.512379 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 16 05:31:38.512438 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 16 05:31:39.710827 ignition[1282]: INFO : GET result: OK Sep 16 05:31:40.220988 ignition[1282]: INFO : Ignition finished successfully Sep 16 05:31:40.225156 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 16 05:31:40.225500 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 16 05:31:40.239017 systemd[1]: Stopped target network.target - Network. 
Sep 16 05:31:40.252058 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 16 05:31:40.252248 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 16 05:31:40.270191 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 16 05:31:40.270337 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 16 05:31:40.286181 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 16 05:31:40.286339 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 16 05:31:40.303274 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 16 05:31:40.303442 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 16 05:31:40.322356 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 16 05:31:40.322550 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 16 05:31:40.339572 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 16 05:31:40.358308 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 16 05:31:40.375017 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 16 05:31:40.375415 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 16 05:31:40.396631 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 16 05:31:40.397285 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 16 05:31:40.397589 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 16 05:31:40.412531 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 16 05:31:40.414490 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 16 05:31:40.427078 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 16 05:31:40.427191 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 16 05:31:40.449223 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 16 05:31:40.472913 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 16 05:31:40.472947 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 16 05:31:40.473009 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 16 05:31:40.473036 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 16 05:31:40.501270 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 16 05:31:40.501345 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 16 05:31:40.520115 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 16 05:31:40.520277 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 16 05:31:40.541518 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 16 05:31:40.565302 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 16 05:31:40.565494 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 16 05:31:40.566508 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 16 05:31:40.566885 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 16 05:31:40.584342 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 16 05:31:40.584524 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. 
Sep 16 05:31:40.599113 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 16 05:31:40.599215 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 16 05:31:40.609223 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 16 05:31:40.609372 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 16 05:31:40.652980 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 16 05:31:40.653237 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 16 05:31:40.680334 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 16 05:31:40.680504 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 16 05:31:40.712502 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 16 05:31:40.719971 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 16 05:31:40.720004 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 16 05:31:41.021940 systemd-journald[297]: Received SIGTERM from PID 1 (systemd). Sep 16 05:31:40.762080 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 16 05:31:40.762137 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 16 05:31:40.782303 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Sep 16 05:31:40.782444 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 16 05:31:40.802476 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 16 05:31:40.802619 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 16 05:31:40.822053 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 16 05:31:40.822205 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 05:31:40.847298 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Sep 16 05:31:40.847458 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. Sep 16 05:31:40.847578 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Sep 16 05:31:40.847702 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 16 05:31:40.848923 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 16 05:31:40.849155 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 16 05:31:40.871052 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 16 05:31:40.871105 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 16 05:31:40.881377 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 16 05:31:40.906957 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 16 05:31:40.957397 systemd[1]: Switching root. 
Sep 16 05:31:41.206048 systemd-journald[297]: Journal stopped Sep 16 05:31:42.988522 kernel: SELinux: policy capability network_peer_controls=1 Sep 16 05:31:42.988538 kernel: SELinux: policy capability open_perms=1 Sep 16 05:31:42.988546 kernel: SELinux: policy capability extended_socket_class=1 Sep 16 05:31:42.988552 kernel: SELinux: policy capability always_check_network=0 Sep 16 05:31:42.988558 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 16 05:31:42.988563 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 16 05:31:42.988569 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 16 05:31:42.988575 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 16 05:31:42.988580 kernel: SELinux: policy capability userspace_initial_context=0 Sep 16 05:31:42.988586 kernel: audit: type=1403 audit(1758000701.377:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 16 05:31:42.988593 systemd[1]: Successfully loaded SELinux policy in 95.298ms. Sep 16 05:31:42.988600 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 3.780ms. Sep 16 05:31:42.988607 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 16 05:31:42.988614 systemd[1]: Detected architecture x86-64. Sep 16 05:31:42.988621 systemd[1]: Detected first boot. Sep 16 05:31:42.988628 systemd[1]: Hostname set to . Sep 16 05:31:42.988635 systemd[1]: Initializing machine ID from random generator. Sep 16 05:31:42.988641 zram_generator::config[1335]: No configuration found. Sep 16 05:31:42.988648 systemd[1]: Populated /etc with preset unit settings. Sep 16 05:31:42.988655 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 16 05:31:42.988662 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 16 05:31:42.988669 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 16 05:31:42.988675 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 16 05:31:42.988682 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 16 05:31:42.988688 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 16 05:31:42.988695 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 16 05:31:42.988701 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 16 05:31:42.988709 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 16 05:31:42.988716 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 16 05:31:42.988723 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 16 05:31:42.988730 systemd[1]: Created slice user.slice - User and Session Slice. Sep 16 05:31:42.988737 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 16 05:31:42.988744 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 16 05:31:42.988755 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. 
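With the policy loaded in 95.298 ms, SELinux stays active for the rest of this boot. As a quick way to confirm the mode on a running host, here is a minimal Python sketch assuming the standard selinuxfs mount at /sys/fs/selinux (a path not shown in this log):

# Minimal sketch: report the SELinux enforcing state via selinuxfs.
# Assumes the conventional mount point /sys/fs/selinux; not taken from this log.
from pathlib import Path

ENFORCE = Path("/sys/fs/selinux/enforce")

def selinux_state() -> str:
    if not ENFORCE.exists():
        return "selinuxfs not mounted (SELinux disabled or unavailable)"
    return "enforcing" if ENFORCE.read_text().strip() == "1" else "permissive"

if __name__ == "__main__":
    print(selinux_state())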
Sep 16 05:31:42.988762 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 16 05:31:42.988769 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 16 05:31:42.988778 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 16 05:31:42.988784 systemd[1]: Expecting device dev-ttyS1.device - /dev/ttyS1... Sep 16 05:31:42.988791 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 16 05:31:42.988798 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 16 05:31:42.988806 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 16 05:31:42.988813 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 16 05:31:42.988820 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 16 05:31:42.988828 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 16 05:31:42.988835 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 16 05:31:42.988842 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 16 05:31:42.988848 systemd[1]: Reached target slices.target - Slice Units. Sep 16 05:31:42.988855 systemd[1]: Reached target swap.target - Swaps. Sep 16 05:31:42.988862 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 16 05:31:42.988868 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 16 05:31:42.988875 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 16 05:31:42.988883 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 16 05:31:42.988890 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 16 05:31:42.988897 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 16 05:31:42.988903 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 16 05:31:42.988910 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 16 05:31:42.988918 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 16 05:31:42.988925 systemd[1]: Mounting media.mount - External Media Directory... Sep 16 05:31:42.988932 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 16 05:31:42.988939 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 16 05:31:42.988945 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 16 05:31:42.988952 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 16 05:31:42.988959 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 16 05:31:42.988967 systemd[1]: Reached target machines.target - Containers. Sep 16 05:31:42.988975 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 16 05:31:42.988982 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 16 05:31:42.988989 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 16 05:31:42.988996 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... 
Sep 16 05:31:42.989003 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 16 05:31:42.989009 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 16 05:31:42.989016 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 16 05:31:42.989023 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 16 05:31:42.989030 kernel: ACPI: bus type drm_connector registered Sep 16 05:31:42.989038 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 16 05:31:42.989044 kernel: fuse: init (API version 7.41) Sep 16 05:31:42.989050 kernel: loop: module loaded Sep 16 05:31:42.989057 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 16 05:31:42.989064 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 16 05:31:42.989071 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 16 05:31:42.989077 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 16 05:31:42.989084 systemd[1]: Stopped systemd-fsck-usr.service. Sep 16 05:31:42.989092 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 16 05:31:42.989099 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 16 05:31:42.989107 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 16 05:31:42.989113 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 16 05:31:42.989130 systemd-journald[1440]: Collecting audit messages is disabled. Sep 16 05:31:42.989147 systemd-journald[1440]: Journal started Sep 16 05:31:42.989162 systemd-journald[1440]: Runtime Journal (/run/log/journal/252462594cf5431c8a1d9d48e29b3f34) is 8M, max 640.1M, 632.1M free. Sep 16 05:31:41.842643 systemd[1]: Queued start job for default target multi-user.target. Sep 16 05:31:41.855711 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Sep 16 05:31:41.856050 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 16 05:31:43.008804 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 16 05:31:43.029807 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 16 05:31:43.058438 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 16 05:31:43.058464 systemd[1]: verity-setup.service: Deactivated successfully. Sep 16 05:31:43.058792 systemd[1]: Stopped verity-setup.service. Sep 16 05:31:43.095773 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 16 05:31:43.103789 systemd[1]: Started systemd-journald.service - Journal Service. Sep 16 05:31:43.112210 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 16 05:31:43.120881 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 16 05:31:43.129880 systemd[1]: Mounted media.mount - External Media Directory. Sep 16 05:31:43.138879 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. 
Sep 16 05:31:43.149049 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 16 05:31:43.158024 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 16 05:31:43.167125 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 16 05:31:43.178109 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 16 05:31:43.188099 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 16 05:31:43.188205 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 16 05:31:43.198121 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 16 05:31:43.198235 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 16 05:31:43.208132 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 16 05:31:43.208267 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 16 05:31:43.217065 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 16 05:31:43.217219 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 16 05:31:43.227184 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 16 05:31:43.227372 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 16 05:31:43.236344 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 16 05:31:43.236611 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 16 05:31:43.245741 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 16 05:31:43.255744 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 16 05:31:43.266741 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 16 05:31:43.278800 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 16 05:31:43.289744 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 16 05:31:43.308719 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 16 05:31:43.318636 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 16 05:31:43.343163 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 16 05:31:43.351970 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 16 05:31:43.352013 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 16 05:31:43.363468 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Sep 16 05:31:43.376067 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 16 05:31:43.385022 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 16 05:31:43.393408 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 16 05:31:43.402319 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 16 05:31:43.412913 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 16 05:31:43.413548 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... 
Sep 16 05:31:43.416623 systemd-journald[1440]: Time spent on flushing to /var/log/journal/252462594cf5431c8a1d9d48e29b3f34 is 12.616ms for 1420 entries. Sep 16 05:31:43.416623 systemd-journald[1440]: System Journal (/var/log/journal/252462594cf5431c8a1d9d48e29b3f34) is 8M, max 195.6M, 187.6M free. Sep 16 05:31:43.453565 systemd-journald[1440]: Received client request to flush runtime journal. Sep 16 05:31:43.429901 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 16 05:31:43.430498 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 16 05:31:43.439471 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 16 05:31:43.450636 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 16 05:31:43.465945 kernel: loop0: detected capacity change from 0 to 128016 Sep 16 05:31:43.467060 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 16 05:31:43.476986 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 16 05:31:43.489198 systemd-tmpfiles[1479]: ACLs are not supported, ignoring. Sep 16 05:31:43.489208 systemd-tmpfiles[1479]: ACLs are not supported, ignoring. Sep 16 05:31:43.492747 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 16 05:31:43.492907 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 16 05:31:43.503016 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 16 05:31:43.512996 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 16 05:31:43.520758 kernel: loop1: detected capacity change from 0 to 224512 Sep 16 05:31:43.529029 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 16 05:31:43.540385 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 16 05:31:43.550508 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 16 05:31:43.567988 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 16 05:31:43.572758 kernel: loop2: detected capacity change from 0 to 8 Sep 16 05:31:43.582745 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 16 05:31:43.588912 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 16 05:31:43.607759 kernel: loop3: detected capacity change from 0 to 110984 Sep 16 05:31:43.609141 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 16 05:31:43.618697 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 16 05:31:43.646501 systemd-tmpfiles[1497]: ACLs are not supported, ignoring. Sep 16 05:31:43.646510 systemd-tmpfiles[1497]: ACLs are not supported, ignoring. Sep 16 05:31:43.647951 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 16 05:31:43.657805 kernel: loop4: detected capacity change from 0 to 128016 Sep 16 05:31:43.677806 kernel: loop5: detected capacity change from 0 to 224512 Sep 16 05:31:43.701757 kernel: loop6: detected capacity change from 0 to 8 Sep 16 05:31:43.708788 kernel: loop7: detected capacity change from 0 to 110984 Sep 16 05:31:43.721783 (sd-merge)[1501]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-packet'. 
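The flush statistics above (12.616 ms for 1420 entries) work out to roughly 8.9 µs per entry; a one-off sketch of that arithmetic, using only the numbers quoted in the log:

# Average journal flush cost per entry, from the figures in the log above.
flush_ms = 12.616   # "Time spent on flushing ... is 12.616ms"
entries = 1420      # "... for 1420 entries"
per_entry_us = flush_ms * 1000 / entries
print(f"{per_entry_us:.1f} us per entry")   # ~8.9 us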
Sep 16 05:31:43.722030 (sd-merge)[1501]: Merged extensions into '/usr'. Sep 16 05:31:43.724562 systemd[1]: Reload requested from client PID 1475 ('systemd-sysext') (unit systemd-sysext.service)... Sep 16 05:31:43.724569 systemd[1]: Reloading... Sep 16 05:31:43.754845 zram_generator::config[1528]: No configuration found. Sep 16 05:31:43.791755 ldconfig[1470]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 16 05:31:43.876246 systemd[1]: Reloading finished in 151 ms. Sep 16 05:31:43.893576 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 16 05:31:43.903119 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 16 05:31:43.914113 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 16 05:31:43.936683 systemd[1]: Starting ensure-sysext.service... Sep 16 05:31:43.943718 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 16 05:31:43.955820 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 16 05:31:43.965176 systemd-tmpfiles[1586]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 16 05:31:43.965210 systemd-tmpfiles[1586]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 16 05:31:43.965497 systemd-tmpfiles[1586]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 16 05:31:43.965791 systemd-tmpfiles[1586]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 16 05:31:43.966642 systemd-tmpfiles[1586]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 16 05:31:43.966947 systemd-tmpfiles[1586]: ACLs are not supported, ignoring. Sep 16 05:31:43.967010 systemd-tmpfiles[1586]: ACLs are not supported, ignoring. Sep 16 05:31:43.970141 systemd-tmpfiles[1586]: Detected autofs mount point /boot during canonicalization of boot. Sep 16 05:31:43.970149 systemd-tmpfiles[1586]: Skipping /boot Sep 16 05:31:43.976712 systemd-tmpfiles[1586]: Detected autofs mount point /boot during canonicalization of boot. Sep 16 05:31:43.976719 systemd-tmpfiles[1586]: Skipping /boot Sep 16 05:31:43.982240 systemd[1]: Reload requested from client PID 1585 ('systemctl') (unit ensure-sysext.service)... Sep 16 05:31:43.982253 systemd[1]: Reloading... Sep 16 05:31:44.000890 systemd-udevd[1587]: Using default interface naming scheme 'v255'. Sep 16 05:31:44.017840 zram_generator::config[1614]: No configuration found. 
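The (sd-merge) lines show systemd-sysext overlaying the 'containerd-flatcar', 'docker-flatcar', 'kubernetes' and 'oem-packet' extension images onto /usr; the loop0 through loop7 capacity changes above correspond to those images being attached. A hedged sketch of how the available images could be enumerated, assuming the usual sysext search directories documented for systemd-sysext rather than anything shown in this log:

# Sketch: list system extension images in the usual sysext search paths.
# The directory list is an assumption based on systemd-sysext documentation,
# not something recorded in this log.
from pathlib import Path

SEARCH_PATHS = ["/etc/extensions", "/run/extensions", "/var/lib/extensions"]

def list_sysext_images():
    for base in map(Path, SEARCH_PATHS):
        if not base.is_dir():
            continue
        for entry in sorted(base.iterdir()):
            # Raw disk images (*.raw) and plain directory trees are both accepted.
            if entry.suffix == ".raw" or entry.is_dir():
                yield str(entry)

if __name__ == "__main__":
    for image in list_sysext_images():
        print(image)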
Sep 16 05:31:44.072567 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input2 Sep 16 05:31:44.072607 kernel: ACPI: button: Sleep Button [SLPB] Sep 16 05:31:44.080266 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:44.080414 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Sep 16 05:31:44.092802 kernel: ACPI: button: Power Button [PWRF] Sep 16 05:31:44.093759 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:44.104758 kernel: mousedev: PS/2 mouse device common for all mice Sep 16 05:31:44.104791 kernel: mei_me 0000:00:16.4: Device doesn't have valid ME Interface Sep 16 05:31:44.105866 kernel: IPMI message handler: version 39.2 Sep 16 05:31:44.105882 kernel: mei_me 0000:00:16.0: Device doesn't have valid ME Interface Sep 16 05:31:44.105986 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:44.106801 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:44.159132 kernel: ipmi device interface Sep 16 05:31:44.159187 kernel: i801_smbus 0000:00:1f.4: SPD Write Disable is set Sep 16 05:31:44.159956 kernel: i801_smbus 0000:00:1f.4: SMBus using PCI interrupt Sep 16 05:31:44.176761 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:44.195757 kernel: iTCO_vendor_support: vendor-support=0 Sep 16 05:31:44.195800 kernel: MACsec IEEE 802.1AE Sep 16 05:31:44.195817 kernel: ipmi_si: IPMI System Interface driver Sep 16 05:31:44.195835 kernel: ipmi_si dmi-ipmi-si.0: ipmi_platform: probing via SMBIOS Sep 16 05:31:44.195949 kernel: ipmi_platform: ipmi_si: SMBIOS: io 0xca2 regsize 1 spacing 1 irq 0 Sep 16 05:31:44.195961 kernel: ipmi_si: Adding SMBIOS-specified kcs state machine Sep 16 05:31:44.195974 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:44.196084 kernel: ipmi_si IPI0001:00: ipmi_platform: probing via ACPI Sep 16 05:31:44.196182 kernel: ipmi_si IPI0001:00: ipmi_platform: [io 0x0ca2] regsize 1 spacing 1 irq 0 Sep 16 05:31:44.196282 kernel: ipmi_si dmi-ipmi-si.0: Removing SMBIOS-specified kcs state machine in favor of ACPI Sep 16 05:31:44.196351 kernel: ipmi_si: Adding ACPI-specified kcs state machine Sep 16 05:31:44.196362 kernel: ipmi_si: Trying ACPI-specified kcs state machine at i/o address 0xca2, slave address 0x20, irq 0 Sep 16 05:31:44.196370 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:44.205521 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Micron_5200_MTFDDAK480TDN OEM. Sep 16 05:31:44.215759 kernel: ipmi_si IPI0001:00: The BMC does not support clearing the recv irq bit, compensating, but the BMC needs to be fixed. Sep 16 05:31:44.306955 systemd[1]: Condition check resulted in dev-ttyS1.device - /dev/ttyS1 being skipped. Sep 16 05:31:44.307317 systemd[1]: Reloading finished in 324 ms. Sep 16 05:31:44.315788 kernel: iTCO_wdt iTCO_wdt: Found a Intel PCH TCO device (Version=6, TCOBASE=0x0400) Sep 16 05:31:44.315930 kernel: ipmi_si IPI0001:00: IPMI message handler: Found new BMC (man_id: 0x002a7c, prod_id: 0x1b0f, dev_id: 0x20) Sep 16 05:31:44.316019 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:44.330729 kernel: iTCO_wdt iTCO_wdt: initialized. 
heartbeat=30 sec (nowayout=0) Sep 16 05:31:44.346818 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:44.363504 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 16 05:31:44.379589 kernel: intel_rapl_common: Found RAPL domain package Sep 16 05:31:44.379618 kernel: intel_rapl_common: Found RAPL domain core Sep 16 05:31:44.379628 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:44.379737 kernel: intel_rapl_common: Found RAPL domain dram Sep 16 05:31:44.402155 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:44.402265 kernel: ipmi_si IPI0001:00: IPMI kcs interface initialized Sep 16 05:31:44.402756 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:44.419673 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 16 05:31:44.423756 kernel: ipmi_ssif: IPMI SSIF Interface driver Sep 16 05:31:44.445021 systemd[1]: Finished ensure-sysext.service. Sep 16 05:31:44.470190 systemd[1]: Reached target tpm2.target - Trusted Platform Module. Sep 16 05:31:44.478864 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 16 05:31:44.479503 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 16 05:31:44.494260 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 16 05:31:44.503944 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 16 05:31:44.512823 augenrules[1810]: No rules Sep 16 05:31:44.520318 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 16 05:31:44.539967 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 16 05:31:44.549380 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 16 05:31:44.570010 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 16 05:31:44.578916 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 16 05:31:44.579426 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 16 05:31:44.589845 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 16 05:31:44.590433 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 16 05:31:44.601690 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 16 05:31:44.602629 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 16 05:31:44.603520 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 16 05:31:44.617507 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 16 05:31:44.645040 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 16 05:31:44.653849 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). 
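The ipmi_si probe above ends with a working KCS interface to the BMC (man_id 0x002a7c, prod_id 0x1b0f). A minimal check that the in-kernel IPMI stack exposed its character device, assuming the conventional /dev/ipmi0 node created by the "ipmi device interface" module loaded earlier; nothing beyond that assumption comes from this log:

# Sketch: confirm an IPMI device node is present after the ipmi_si probe
# seen in the log. /dev/ipmi0 and /dev/ipmidev/0 are the conventional names.
from pathlib import Path

candidates = [Path("/dev/ipmi0"), Path("/dev/ipmidev/0")]
found = [str(p) for p in candidates if p.exists()]
print("IPMI device node:", ", ".join(found) if found else "not present")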
Sep 16 05:31:44.654849 systemd[1]: audit-rules.service: Deactivated successfully. Sep 16 05:31:44.667975 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 16 05:31:44.680861 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 16 05:31:44.681440 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 16 05:31:44.681727 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 16 05:31:44.682045 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 16 05:31:44.682307 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 16 05:31:44.682490 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 16 05:31:44.682601 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 16 05:31:44.682798 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 16 05:31:44.682882 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 16 05:31:44.683032 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 16 05:31:44.683186 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 16 05:31:44.686977 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 16 05:31:44.687043 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 16 05:31:44.687699 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 16 05:31:44.688580 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 16 05:31:44.688605 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 16 05:31:44.688797 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 16 05:31:44.707586 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 16 05:31:44.722929 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 16 05:31:44.762379 systemd-networkd[1822]: lo: Link UP Sep 16 05:31:44.762383 systemd-networkd[1822]: lo: Gained carrier Sep 16 05:31:44.764856 systemd-networkd[1822]: bond0: netdev ready Sep 16 05:31:44.765809 systemd-networkd[1822]: Enumeration completed Sep 16 05:31:44.765862 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 16 05:31:44.771573 systemd-networkd[1822]: enp1s0f0np0: Configuring with /etc/systemd/network/10-0c:42:a1:97:fd:2c.network. Sep 16 05:31:44.773419 systemd-resolved[1823]: Positive Trust Anchors: Sep 16 05:31:44.773424 systemd-resolved[1823]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 16 05:31:44.773447 systemd-resolved[1823]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 16 05:31:44.775854 systemd-resolved[1823]: Using system hostname 'ci-4459.0.0-n-e7bf2d745b'. Sep 16 05:31:44.792953 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 16 05:31:44.802970 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 05:31:44.813190 systemd[1]: Reached target time-set.target - System Time Set. Sep 16 05:31:44.822708 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 16 05:31:44.833731 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 16 05:31:45.070871 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: Link up Sep 16 05:31:45.085787 kernel: bond0: (slave enp1s0f0np0): Enslaving as a backup interface with an up link Sep 16 05:31:45.087276 systemd-networkd[1822]: enp1s0f1np1: Configuring with /etc/systemd/network/10-0c:42:a1:97:fd:2d.network. Sep 16 05:31:45.090808 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 16 05:31:45.099873 systemd[1]: Reached target network.target - Network. Sep 16 05:31:45.106828 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 16 05:31:45.116827 systemd[1]: Reached target sysinit.target - System Initialization. Sep 16 05:31:45.125881 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 16 05:31:45.135842 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 16 05:31:45.145823 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Sep 16 05:31:45.156040 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 16 05:31:45.165006 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 16 05:31:45.174919 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 16 05:31:45.184913 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 16 05:31:45.184943 systemd[1]: Reached target paths.target - Path Units. Sep 16 05:31:45.191914 systemd[1]: Reached target timers.target - Timer Units. Sep 16 05:31:45.200513 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 16 05:31:45.210661 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 16 05:31:45.220008 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 16 05:31:45.231811 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. 
Sep 16 05:31:45.241800 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Sep 16 05:31:45.245975 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 16 05:31:45.256205 systemd-networkd[1822]: bond0: Configuring with /etc/systemd/network/05-bond0.network. Sep 16 05:31:45.256804 kernel: bond0: (slave enp1s0f1np1): Enslaving as a backup interface with an up link Sep 16 05:31:45.257304 systemd-networkd[1822]: enp1s0f0np0: Link UP Sep 16 05:31:45.257467 systemd-networkd[1822]: enp1s0f0np0: Gained carrier Sep 16 05:31:45.267817 kernel: bond0: Warning: No 802.3ad response from the link partner for any adapters in the bond Sep 16 05:31:45.273413 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 16 05:31:45.278988 systemd-networkd[1822]: enp1s0f1np1: Reconfiguring with /etc/systemd/network/10-0c:42:a1:97:fd:2c.network. Sep 16 05:31:45.279126 systemd-networkd[1822]: enp1s0f1np1: Link UP Sep 16 05:31:45.279251 systemd-networkd[1822]: enp1s0f1np1: Gained carrier Sep 16 05:31:45.285032 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 16 05:31:45.294588 systemd[1]: Reached target sockets.target - Socket Units. Sep 16 05:31:45.296933 systemd-networkd[1822]: bond0: Link UP Sep 16 05:31:45.297094 systemd-networkd[1822]: bond0: Gained carrier Sep 16 05:31:45.297245 systemd-timesyncd[1824]: Network configuration changed, trying to establish connection. Sep 16 05:31:45.303865 systemd[1]: Reached target basic.target - Basic System. Sep 16 05:31:45.310890 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 16 05:31:45.310906 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 16 05:31:45.311418 systemd[1]: Starting containerd.service - containerd container runtime... Sep 16 05:31:45.337180 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Sep 16 05:31:45.346372 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 16 05:31:45.354403 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 16 05:31:45.359568 coreos-metadata[1863]: Sep 16 05:31:45.359 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Sep 16 05:31:45.377284 kernel: bond0: (slave enp1s0f0np0): link status definitely up, 10000 Mbps full duplex Sep 16 05:31:45.377307 kernel: bond0: active interface up! Sep 16 05:31:45.382971 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 16 05:31:45.392368 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 16 05:31:45.393997 jq[1869]: false Sep 16 05:31:45.400838 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 16 05:31:45.401395 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... 
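systemd-networkd enslaves enp1s0f0np0 and enp1s0f1np1 to bond0 using the per-MAC unit files named in the log (10-0c:42:a1:97:fd:2c.network, 10-0c:42:a1:97:fd:2d.network) plus 05-bond0.network, and the "No 802.3ad response" warning indicates an LACP bond. The actual file contents are not part of this log; the sketch below writes a hypothetical minimal pair of that shape using standard systemd-networkd keys, purely as an illustration:

# Hypothetical minimal bond configuration in the spirit of the unit file names
# in the log. The option names are standard systemd.netdev/systemd.network keys;
# the concrete file names, mode and values are illustrative assumptions.
from pathlib import Path
import textwrap

NETWORK_DIR = Path("/etc/systemd/network")

BOND_NETDEV = textwrap.dedent("""\
    [NetDev]
    Name=bond0
    Kind=bond

    [Bond]
    Mode=802.3ad
""")

SLAVE_NETWORK = textwrap.dedent("""\
    [Match]
    MACAddress=0c:42:a1:97:fd:2c

    [Network]
    Bond=bond0
""")

def write_units(dry_run: bool = True) -> None:
    units = {"05-bond0.netdev": BOND_NETDEV,
             "10-0c:42:a1:97:fd:2c.network": SLAVE_NETWORK}
    for name, body in units.items():
        target = NETWORK_DIR / name
        if dry_run:
            print(f"--- {target} ---\n{body}")
        else:
            target.write_text(body)

if __name__ == "__main__":
    write_units(dry_run=True)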
Sep 16 05:31:45.406039 extend-filesystems[1870]: Found /dev/sda6 Sep 16 05:31:45.415865 extend-filesystems[1870]: Found /dev/sda9 Sep 16 05:31:45.415865 extend-filesystems[1870]: Checking size of /dev/sda9 Sep 16 05:31:45.415865 extend-filesystems[1870]: Resized partition /dev/sda9 Sep 16 05:31:45.460923 kernel: EXT4-fs (sda9): resizing filesystem from 553472 to 116605649 blocks Sep 16 05:31:45.413821 oslogin_cache_refresh[1871]: Refreshing passwd entry cache Sep 16 05:31:45.410452 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 16 05:31:45.461097 extend-filesystems[1882]: resize2fs 1.47.3 (8-Jul-2025) Sep 16 05:31:45.467962 google_oslogin_nss_cache[1871]: oslogin_cache_refresh[1871]: Refreshing passwd entry cache Sep 16 05:31:45.416572 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 16 05:31:45.436578 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 16 05:31:45.461318 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 16 05:31:45.484125 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 16 05:31:45.494809 kernel: bond0: (slave enp1s0f1np1): link status definitely up, 10000 Mbps full duplex Sep 16 05:31:45.501847 systemd[1]: Starting tcsd.service - TCG Core Services Daemon... Sep 16 05:31:45.509041 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 16 05:31:45.509367 systemd[1]: Starting update-engine.service - Update Engine... Sep 16 05:31:45.517316 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 16 05:31:45.523980 update_engine[1901]: I20250916 05:31:45.523916 1901 main.cc:92] Flatcar Update Engine starting Sep 16 05:31:45.524484 systemd-logind[1896]: Watching system buttons on /dev/input/event3 (Power Button) Sep 16 05:31:45.524495 systemd-logind[1896]: Watching system buttons on /dev/input/event2 (Sleep Button) Sep 16 05:31:45.524504 systemd-logind[1896]: Watching system buttons on /dev/input/event0 (HID 0557:2419) Sep 16 05:31:45.524611 systemd-logind[1896]: New seat seat0. Sep 16 05:31:45.527409 systemd[1]: Started systemd-logind.service - User Login Management. Sep 16 05:31:45.528663 jq[1902]: true Sep 16 05:31:45.536372 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 16 05:31:45.545976 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 16 05:31:45.546089 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 16 05:31:45.546241 systemd[1]: motdgen.service: Deactivated successfully. Sep 16 05:31:45.546356 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 16 05:31:45.556292 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 16 05:31:45.556401 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 16 05:31:45.571472 (ntainerd)[1906]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 16 05:31:45.572919 jq[1905]: true Sep 16 05:31:45.582871 sshd_keygen[1899]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 16 05:31:45.583060 tar[1904]: linux-amd64/LICENSE Sep 16 05:31:45.583155 tar[1904]: linux-amd64/helm Sep 16 05:31:45.588799 systemd[1]: tcsd.service: Skipped due to 'exec-condition'. 
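The EXT4 resize announced above grows /dev/sda9 from 553472 to 116605649 blocks; the later resize2fs output confirms 4 KiB blocks. The arithmetic, using only the numbers from the log:

# Size of /dev/sda9 before and after the online resize, from the block counts
# and 4 KiB block size quoted in the log.
BLOCK = 4096
before_blocks = 553_472
after_blocks = 116_605_649

gib = lambda blocks: blocks * BLOCK / 2**30
print(f"before: {gib(before_blocks):.2f} GiB, after: {gib(after_blocks):.2f} GiB")
# before: ~2.11 GiB, after: ~444.8 GiB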
Sep 16 05:31:45.588933 systemd[1]: Condition check resulted in tcsd.service - TCG Core Services Daemon being skipped. Sep 16 05:31:45.596085 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 16 05:31:45.606420 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 16 05:31:45.619355 dbus-daemon[1864]: [system] SELinux support is enabled Sep 16 05:31:45.619475 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 16 05:31:45.623089 update_engine[1901]: I20250916 05:31:45.621170 1901 update_check_scheduler.cc:74] Next update check in 4m33s Sep 16 05:31:45.625259 bash[1938]: Updated "/home/core/.ssh/authorized_keys" Sep 16 05:31:45.630877 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 16 05:31:45.641041 systemd[1]: issuegen.service: Deactivated successfully. Sep 16 05:31:45.641149 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 16 05:31:45.651112 dbus-daemon[1864]: [system] Successfully activated service 'org.freedesktop.systemd1' Sep 16 05:31:45.651427 systemd[1]: Starting sshkeys.service... Sep 16 05:31:45.656819 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 16 05:31:45.656839 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 16 05:31:45.657502 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 16 05:31:45.674830 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 16 05:31:45.674854 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 16 05:31:45.686449 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 16 05:31:45.699454 systemd[1]: Started update-engine.service - Update Engine. Sep 16 05:31:45.708633 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Sep 16 05:31:45.719701 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Sep 16 05:31:45.736058 systemd[1]: Started getty@tty1.service - Getty on tty1. 
Sep 16 05:31:45.739688 containerd[1906]: time="2025-09-16T05:31:45Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 16 05:31:45.740020 containerd[1906]: time="2025-09-16T05:31:45.740007708Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 16 05:31:45.745106 containerd[1906]: time="2025-09-16T05:31:45.745089138Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="6.773µs" Sep 16 05:31:45.745136 containerd[1906]: time="2025-09-16T05:31:45.745105179Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 16 05:31:45.745136 containerd[1906]: time="2025-09-16T05:31:45.745118677Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 16 05:31:45.745219 containerd[1906]: time="2025-09-16T05:31:45.745209264Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 16 05:31:45.745238 containerd[1906]: time="2025-09-16T05:31:45.745222878Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 16 05:31:45.745251 containerd[1906]: time="2025-09-16T05:31:45.745241982Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 16 05:31:45.745296 containerd[1906]: time="2025-09-16T05:31:45.745285320Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 16 05:31:45.745315 containerd[1906]: time="2025-09-16T05:31:45.745296995Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 16 05:31:45.745533 containerd[1906]: time="2025-09-16T05:31:45.745516987Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 16 05:31:45.745551 containerd[1906]: time="2025-09-16T05:31:45.745542318Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 16 05:31:45.745573 containerd[1906]: time="2025-09-16T05:31:45.745553769Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 16 05:31:45.745589 containerd[1906]: time="2025-09-16T05:31:45.745574079Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 16 05:31:45.745653 containerd[1906]: time="2025-09-16T05:31:45.745645003Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 16 05:31:45.745850 containerd[1906]: time="2025-09-16T05:31:45.745838135Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 16 05:31:45.745877 containerd[1906]: time="2025-09-16T05:31:45.745863514Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs 
type=io.containerd.snapshotter.v1 Sep 16 05:31:45.745904 containerd[1906]: time="2025-09-16T05:31:45.745877922Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 16 05:31:45.745897 systemd[1]: Started serial-getty@ttyS1.service - Serial Getty on ttyS1. Sep 16 05:31:45.746048 coreos-metadata[1961]: Sep 16 05:31:45.745 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Sep 16 05:31:45.746160 containerd[1906]: time="2025-09-16T05:31:45.745915212Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 16 05:31:45.746160 containerd[1906]: time="2025-09-16T05:31:45.746110118Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 16 05:31:45.746160 containerd[1906]: time="2025-09-16T05:31:45.746147789Z" level=info msg="metadata content store policy set" policy=shared Sep 16 05:31:45.754631 containerd[1906]: time="2025-09-16T05:31:45.754588597Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 16 05:31:45.754631 containerd[1906]: time="2025-09-16T05:31:45.754610966Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 16 05:31:45.754631 containerd[1906]: time="2025-09-16T05:31:45.754626643Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 16 05:31:45.754694 containerd[1906]: time="2025-09-16T05:31:45.754634335Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 16 05:31:45.754694 containerd[1906]: time="2025-09-16T05:31:45.754641712Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 16 05:31:45.754694 containerd[1906]: time="2025-09-16T05:31:45.754648206Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 16 05:31:45.754694 containerd[1906]: time="2025-09-16T05:31:45.754660461Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 16 05:31:45.754694 containerd[1906]: time="2025-09-16T05:31:45.754666991Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 16 05:31:45.754694 containerd[1906]: time="2025-09-16T05:31:45.754672636Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 16 05:31:45.754694 containerd[1906]: time="2025-09-16T05:31:45.754677986Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 16 05:31:45.754694 containerd[1906]: time="2025-09-16T05:31:45.754688776Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 16 05:31:45.754802 containerd[1906]: time="2025-09-16T05:31:45.754696414Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 16 05:31:45.754802 containerd[1906]: time="2025-09-16T05:31:45.754779053Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 16 05:31:45.754802 containerd[1906]: time="2025-09-16T05:31:45.754796205Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 16 05:31:45.754838 containerd[1906]: time="2025-09-16T05:31:45.754818219Z" level=info 
msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 16 05:31:45.754854 containerd[1906]: time="2025-09-16T05:31:45.754845145Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 16 05:31:45.754870 containerd[1906]: time="2025-09-16T05:31:45.754856137Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 16 05:31:45.754870 containerd[1906]: time="2025-09-16T05:31:45.754866025Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 16 05:31:45.754908 containerd[1906]: time="2025-09-16T05:31:45.754876452Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 16 05:31:45.754908 containerd[1906]: time="2025-09-16T05:31:45.754895923Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 16 05:31:45.754962 containerd[1906]: time="2025-09-16T05:31:45.754909646Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 16 05:31:45.754962 containerd[1906]: time="2025-09-16T05:31:45.754919917Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 16 05:31:45.754962 containerd[1906]: time="2025-09-16T05:31:45.754933698Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 16 05:31:45.755032 containerd[1906]: time="2025-09-16T05:31:45.754996602Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 16 05:31:45.755032 containerd[1906]: time="2025-09-16T05:31:45.755017077Z" level=info msg="Start snapshots syncer" Sep 16 05:31:45.755157 containerd[1906]: time="2025-09-16T05:31:45.755037071Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 16 05:31:45.755035 systemd[1]: Reached target getty.target - Login Prompts. 
Sep 16 05:31:45.755590 containerd[1906]: time="2025-09-16T05:31:45.755514409Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 16 05:31:45.755664 containerd[1906]: time="2025-09-16T05:31:45.755627896Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 16 05:31:45.755683 containerd[1906]: time="2025-09-16T05:31:45.755675453Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 16 05:31:45.755748 containerd[1906]: time="2025-09-16T05:31:45.755738612Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 16 05:31:45.755776 containerd[1906]: time="2025-09-16T05:31:45.755757326Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 16 05:31:45.755776 containerd[1906]: time="2025-09-16T05:31:45.755765605Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 16 05:31:45.755776 containerd[1906]: time="2025-09-16T05:31:45.755771774Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 16 05:31:45.755816 containerd[1906]: time="2025-09-16T05:31:45.755778353Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 16 05:31:45.755816 containerd[1906]: time="2025-09-16T05:31:45.755784851Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 16 05:31:45.755816 containerd[1906]: time="2025-09-16T05:31:45.755790777Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 16 05:31:45.755816 containerd[1906]: time="2025-09-16T05:31:45.755812042Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 16 05:31:45.755872 containerd[1906]: time="2025-09-16T05:31:45.755819660Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 16 05:31:45.755872 containerd[1906]: time="2025-09-16T05:31:45.755826401Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 16 05:31:45.755872 containerd[1906]: time="2025-09-16T05:31:45.755843176Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 16 05:31:45.755872 containerd[1906]: time="2025-09-16T05:31:45.755851835Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 16 05:31:45.755872 containerd[1906]: time="2025-09-16T05:31:45.755857305Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 16 05:31:45.755872 containerd[1906]: time="2025-09-16T05:31:45.755862610Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 16 05:31:45.755872 containerd[1906]: time="2025-09-16T05:31:45.755867990Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 16 05:31:45.756006 containerd[1906]: time="2025-09-16T05:31:45.755875339Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 16 05:31:45.756006 containerd[1906]: time="2025-09-16T05:31:45.755881910Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 16 05:31:45.756006 containerd[1906]: time="2025-09-16T05:31:45.755893370Z" level=info msg="runtime interface created" Sep 16 05:31:45.756006 containerd[1906]: time="2025-09-16T05:31:45.755897505Z" level=info msg="created NRI interface" Sep 16 05:31:45.756006 containerd[1906]: time="2025-09-16T05:31:45.755902052Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 16 05:31:45.756006 containerd[1906]: time="2025-09-16T05:31:45.755908949Z" level=info msg="Connect containerd service" Sep 16 05:31:45.756006 containerd[1906]: time="2025-09-16T05:31:45.755946734Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 16 05:31:45.756424 containerd[1906]: time="2025-09-16T05:31:45.756410758Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 16 05:31:45.764663 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 16 05:31:45.773780 tar[1904]: linux-amd64/README.md Sep 16 05:31:45.785146 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 16 05:31:45.807310 locksmithd[1977]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 16 05:31:45.835040 containerd[1906]: time="2025-09-16T05:31:45.835016352Z" level=info msg="Start subscribing containerd event" Sep 16 05:31:45.835100 containerd[1906]: time="2025-09-16T05:31:45.835054910Z" level=info msg="Start recovering state" Sep 16 05:31:45.835100 containerd[1906]: time="2025-09-16T05:31:45.835078723Z" level=info msg=serving... 
address=/run/containerd/containerd.sock.ttrpc Sep 16 05:31:45.835132 containerd[1906]: time="2025-09-16T05:31:45.835108221Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 16 05:31:45.835132 containerd[1906]: time="2025-09-16T05:31:45.835115492Z" level=info msg="Start event monitor" Sep 16 05:31:45.835132 containerd[1906]: time="2025-09-16T05:31:45.835126083Z" level=info msg="Start cni network conf syncer for default" Sep 16 05:31:45.835132 containerd[1906]: time="2025-09-16T05:31:45.835130102Z" level=info msg="Start streaming server" Sep 16 05:31:45.835203 containerd[1906]: time="2025-09-16T05:31:45.835141473Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 16 05:31:45.835203 containerd[1906]: time="2025-09-16T05:31:45.835146143Z" level=info msg="runtime interface starting up..." Sep 16 05:31:45.835203 containerd[1906]: time="2025-09-16T05:31:45.835149299Z" level=info msg="starting plugins..." Sep 16 05:31:45.835203 containerd[1906]: time="2025-09-16T05:31:45.835157625Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 16 05:31:45.835290 containerd[1906]: time="2025-09-16T05:31:45.835227793Z" level=info msg="containerd successfully booted in 0.095750s" Sep 16 05:31:45.835281 systemd[1]: Started containerd.service - containerd container runtime. Sep 16 05:31:45.954786 kernel: EXT4-fs (sda9): resized filesystem to 116605649 Sep 16 05:31:45.982256 extend-filesystems[1882]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Sep 16 05:31:45.982256 extend-filesystems[1882]: old_desc_blocks = 1, new_desc_blocks = 56 Sep 16 05:31:45.982256 extend-filesystems[1882]: The filesystem on /dev/sda9 is now 116605649 (4k) blocks long. Sep 16 05:31:46.011824 extend-filesystems[1870]: Resized filesystem in /dev/sda9 Sep 16 05:31:45.982727 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 16 05:31:45.982865 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 16 05:31:47.195847 systemd-networkd[1822]: bond0: Gained IPv6LL Sep 16 05:31:47.197495 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 16 05:31:47.208227 systemd[1]: Reached target network-online.target - Network is Online. Sep 16 05:31:47.217864 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 05:31:47.238075 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 16 05:31:47.257286 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 16 05:31:47.990826 kernel: mlx5_core 0000:01:00.0: lag map: port 1:1 port 2:2 Sep 16 05:31:47.990987 kernel: mlx5_core 0000:01:00.0: shared_fdb:0 mode:queue_affinity Sep 16 05:31:47.998289 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 16 05:31:48.008420 (kubelet)[2020]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 16 05:31:48.065755 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 16 05:31:48.434396 kubelet[2020]: E0916 05:31:48.434279 2020 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 16 05:31:48.435405 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 16 05:31:48.435483 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 16 05:31:48.435655 systemd[1]: kubelet.service: Consumed 606ms CPU time, 270.5M memory peak. Sep 16 05:31:47.969156 systemd-resolved[1823]: Clock change detected. Flushing caches. Sep 16 05:31:47.978844 systemd-journald[1440]: Time jumped backwards, rotating. Sep 16 05:31:47.969305 systemd-timesyncd[1824]: Contacted time server 104.167.215.195:123 (0.flatcar.pool.ntp.org). Sep 16 05:31:47.969434 systemd-timesyncd[1824]: Initial clock synchronization to Tue 2025-09-16 05:31:47.968963 UTC. Sep 16 05:31:48.329575 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 16 05:31:48.338869 systemd[1]: Started sshd@0-139.178.94.21:22-139.178.89.65:47328.service - OpenSSH per-connection server daemon (139.178.89.65:47328). Sep 16 05:31:48.409705 sshd[2043]: Accepted publickey for core from 139.178.89.65 port 47328 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:31:48.410368 sshd-session[2043]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:31:48.414327 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 16 05:31:48.424884 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 16 05:31:48.440443 systemd-logind[1896]: New session 1 of user core. Sep 16 05:31:48.457697 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 16 05:31:48.472263 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 16 05:31:48.502794 (systemd)[2048]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 16 05:31:48.504885 systemd-logind[1896]: New session c1 of user core. Sep 16 05:31:48.632212 systemd[2048]: Queued start job for default target default.target. Sep 16 05:31:48.644701 systemd[2048]: Created slice app.slice - User Application Slice. Sep 16 05:31:48.644734 systemd[2048]: Reached target paths.target - Paths. Sep 16 05:31:48.644754 systemd[2048]: Reached target timers.target - Timers. Sep 16 05:31:48.645361 systemd[2048]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 16 05:31:48.650732 systemd[2048]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 16 05:31:48.650759 systemd[2048]: Reached target sockets.target - Sockets. Sep 16 05:31:48.650780 systemd[2048]: Reached target basic.target - Basic System. Sep 16 05:31:48.650800 systemd[2048]: Reached target default.target - Main User Target. Sep 16 05:31:48.650814 systemd[2048]: Startup finished in 141ms. Sep 16 05:31:48.650880 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 16 05:31:48.669305 systemd[1]: Started session-1.scope - Session 1 of User core. 
Sep 16 05:31:48.690110 google_oslogin_nss_cache[1871]: oslogin_cache_refresh[1871]: Failure getting users, quitting Sep 16 05:31:48.690110 google_oslogin_nss_cache[1871]: oslogin_cache_refresh[1871]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 16 05:31:48.690082 oslogin_cache_refresh[1871]: Failure getting users, quitting Sep 16 05:31:48.691291 google_oslogin_nss_cache[1871]: oslogin_cache_refresh[1871]: Refreshing group entry cache Sep 16 05:31:48.690127 oslogin_cache_refresh[1871]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 16 05:31:48.690235 oslogin_cache_refresh[1871]: Refreshing group entry cache Sep 16 05:31:48.691919 google_oslogin_nss_cache[1871]: oslogin_cache_refresh[1871]: Failure getting groups, quitting Sep 16 05:31:48.691919 google_oslogin_nss_cache[1871]: oslogin_cache_refresh[1871]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 16 05:31:48.691861 oslogin_cache_refresh[1871]: Failure getting groups, quitting Sep 16 05:31:48.691889 oslogin_cache_refresh[1871]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 16 05:31:48.695724 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 16 05:31:48.696367 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Sep 16 05:31:48.749458 systemd[1]: Started sshd@1-139.178.94.21:22-139.178.89.65:47340.service - OpenSSH per-connection server daemon (139.178.89.65:47340). Sep 16 05:31:48.802320 coreos-metadata[1863]: Sep 16 05:31:48.802 INFO Fetch successful Sep 16 05:31:48.803153 sshd[2060]: Accepted publickey for core from 139.178.89.65 port 47340 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:31:48.804282 sshd-session[2060]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:31:48.808569 systemd-logind[1896]: New session 2 of user core. Sep 16 05:31:48.824186 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 16 05:31:48.865051 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 16 05:31:48.875210 systemd[1]: Starting packet-phone-home.service - Report Success to Packet... Sep 16 05:31:48.885008 sshd[2063]: Connection closed by 139.178.89.65 port 47340 Sep 16 05:31:48.885123 sshd-session[2060]: pam_unix(sshd:session): session closed for user core Sep 16 05:31:48.892259 systemd[1]: sshd@1-139.178.94.21:22-139.178.89.65:47340.service: Deactivated successfully. Sep 16 05:31:48.893024 systemd[1]: session-2.scope: Deactivated successfully. Sep 16 05:31:48.893533 systemd-logind[1896]: Session 2 logged out. Waiting for processes to exit. Sep 16 05:31:48.894502 systemd[1]: Started sshd@2-139.178.94.21:22-139.178.89.65:47354.service - OpenSSH per-connection server daemon (139.178.89.65:47354). Sep 16 05:31:48.906909 systemd-logind[1896]: Removed session 2. Sep 16 05:31:48.937324 sshd[2076]: Accepted publickey for core from 139.178.89.65 port 47354 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:31:48.938048 sshd-session[2076]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:31:48.941594 systemd-logind[1896]: New session 3 of user core. Sep 16 05:31:48.955286 systemd[1]: Started session-3.scope - Session 3 of User core. 
Sep 16 05:31:49.028183 sshd[2079]: Connection closed by 139.178.89.65 port 47354 Sep 16 05:31:49.028870 sshd-session[2076]: pam_unix(sshd:session): session closed for user core Sep 16 05:31:49.037477 systemd[1]: sshd@2-139.178.94.21:22-139.178.89.65:47354.service: Deactivated successfully. Sep 16 05:31:49.041560 systemd[1]: session-3.scope: Deactivated successfully. Sep 16 05:31:49.043952 systemd-logind[1896]: Session 3 logged out. Waiting for processes to exit. Sep 16 05:31:49.047018 systemd-logind[1896]: Removed session 3. Sep 16 05:31:49.527637 systemd[1]: Finished packet-phone-home.service - Report Success to Packet. Sep 16 05:31:50.055282 login[1972]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 16 05:31:50.058726 systemd-logind[1896]: New session 4 of user core. Sep 16 05:31:50.059568 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 16 05:31:50.066409 login[1962]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 16 05:31:50.069156 systemd-logind[1896]: New session 5 of user core. Sep 16 05:31:50.069875 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 16 05:31:50.907415 coreos-metadata[1961]: Sep 16 05:31:50.907 INFO Fetch successful Sep 16 05:31:50.988951 unknown[1961]: wrote ssh authorized keys file for user: core Sep 16 05:31:51.020885 update-ssh-keys[2111]: Updated "/home/core/.ssh/authorized_keys" Sep 16 05:31:51.021301 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Sep 16 05:31:51.022082 systemd[1]: Finished sshkeys.service. Sep 16 05:31:51.023159 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 16 05:31:51.023319 systemd[1]: Startup finished in 5.258s (kernel) + 33.068s (initrd) + 10.467s (userspace) = 48.794s. Sep 16 05:31:57.800955 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 16 05:31:57.802320 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 05:31:58.110009 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 05:31:58.116828 (kubelet)[2123]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 16 05:31:58.139202 kubelet[2123]: E0916 05:31:58.139179 2123 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 16 05:31:58.141319 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 16 05:31:58.141397 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 16 05:31:58.141598 systemd[1]: kubelet.service: Consumed 170ms CPU time, 114.4M memory peak. Sep 16 05:31:59.048500 systemd[1]: Started sshd@3-139.178.94.21:22-139.178.89.65:33752.service - OpenSSH per-connection server daemon (139.178.89.65:33752). Sep 16 05:31:59.080392 sshd[2143]: Accepted publickey for core from 139.178.89.65 port 33752 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:31:59.081006 sshd-session[2143]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:31:59.083837 systemd-logind[1896]: New session 6 of user core. Sep 16 05:31:59.099426 systemd[1]: Started session-6.scope - Session 6 of User core. 
Sep 16 05:31:59.150690 sshd[2146]: Connection closed by 139.178.89.65 port 33752 Sep 16 05:31:59.150855 sshd-session[2143]: pam_unix(sshd:session): session closed for user core Sep 16 05:31:59.171148 systemd[1]: sshd@3-139.178.94.21:22-139.178.89.65:33752.service: Deactivated successfully. Sep 16 05:31:59.172041 systemd[1]: session-6.scope: Deactivated successfully. Sep 16 05:31:59.172623 systemd-logind[1896]: Session 6 logged out. Waiting for processes to exit. Sep 16 05:31:59.173701 systemd[1]: Started sshd@4-139.178.94.21:22-139.178.89.65:33754.service - OpenSSH per-connection server daemon (139.178.89.65:33754). Sep 16 05:31:59.174369 systemd-logind[1896]: Removed session 6. Sep 16 05:31:59.220577 sshd[2152]: Accepted publickey for core from 139.178.89.65 port 33754 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:31:59.221173 sshd-session[2152]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:31:59.223956 systemd-logind[1896]: New session 7 of user core. Sep 16 05:31:59.232412 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 16 05:31:59.291498 sshd[2155]: Connection closed by 139.178.89.65 port 33754 Sep 16 05:31:59.292259 sshd-session[2152]: pam_unix(sshd:session): session closed for user core Sep 16 05:31:59.315241 systemd[1]: sshd@4-139.178.94.21:22-139.178.89.65:33754.service: Deactivated successfully. Sep 16 05:31:59.318908 systemd[1]: session-7.scope: Deactivated successfully. Sep 16 05:31:59.321179 systemd-logind[1896]: Session 7 logged out. Waiting for processes to exit. Sep 16 05:31:59.326453 systemd[1]: Started sshd@5-139.178.94.21:22-139.178.89.65:33768.service - OpenSSH per-connection server daemon (139.178.89.65:33768). Sep 16 05:31:59.328326 systemd-logind[1896]: Removed session 7. Sep 16 05:31:59.401251 sshd[2161]: Accepted publickey for core from 139.178.89.65 port 33768 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:31:59.401813 sshd-session[2161]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:31:59.404673 systemd-logind[1896]: New session 8 of user core. Sep 16 05:31:59.413304 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 16 05:31:59.461140 sshd[2164]: Connection closed by 139.178.89.65 port 33768 Sep 16 05:31:59.461357 sshd-session[2161]: pam_unix(sshd:session): session closed for user core Sep 16 05:31:59.479591 systemd[1]: sshd@5-139.178.94.21:22-139.178.89.65:33768.service: Deactivated successfully. Sep 16 05:31:59.481078 systemd[1]: session-8.scope: Deactivated successfully. Sep 16 05:31:59.482023 systemd-logind[1896]: Session 8 logged out. Waiting for processes to exit. Sep 16 05:31:59.484551 systemd[1]: Started sshd@6-139.178.94.21:22-139.178.89.65:33778.service - OpenSSH per-connection server daemon (139.178.89.65:33778). Sep 16 05:31:59.485395 systemd-logind[1896]: Removed session 8. Sep 16 05:31:59.580368 sshd[2170]: Accepted publickey for core from 139.178.89.65 port 33778 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:31:59.581166 sshd-session[2170]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:31:59.584602 systemd-logind[1896]: New session 9 of user core. Sep 16 05:31:59.596427 systemd[1]: Started session-9.scope - Session 9 of User core. 
Sep 16 05:31:59.665765 sudo[2174]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 16 05:31:59.665908 sudo[2174]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 16 05:31:59.681490 sudo[2174]: pam_unix(sudo:session): session closed for user root Sep 16 05:31:59.682301 sshd[2173]: Connection closed by 139.178.89.65 port 33778 Sep 16 05:31:59.682502 sshd-session[2170]: pam_unix(sshd:session): session closed for user core Sep 16 05:31:59.699860 systemd[1]: sshd@6-139.178.94.21:22-139.178.89.65:33778.service: Deactivated successfully. Sep 16 05:31:59.701043 systemd[1]: session-9.scope: Deactivated successfully. Sep 16 05:31:59.701779 systemd-logind[1896]: Session 9 logged out. Waiting for processes to exit. Sep 16 05:31:59.703510 systemd[1]: Started sshd@7-139.178.94.21:22-139.178.89.65:33784.service - OpenSSH per-connection server daemon (139.178.89.65:33784). Sep 16 05:31:59.704147 systemd-logind[1896]: Removed session 9. Sep 16 05:31:59.766575 sshd[2180]: Accepted publickey for core from 139.178.89.65 port 33784 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:31:59.767458 sshd-session[2180]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:31:59.771125 systemd-logind[1896]: New session 10 of user core. Sep 16 05:31:59.785233 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 16 05:31:59.839722 sudo[2185]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 16 05:31:59.839861 sudo[2185]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 16 05:31:59.842610 sudo[2185]: pam_unix(sudo:session): session closed for user root Sep 16 05:31:59.845205 sudo[2184]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 16 05:31:59.845355 sudo[2184]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 16 05:31:59.851079 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 16 05:31:59.888341 augenrules[2207]: No rules Sep 16 05:31:59.889172 systemd[1]: audit-rules.service: Deactivated successfully. Sep 16 05:31:59.889440 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 16 05:31:59.890534 sudo[2184]: pam_unix(sudo:session): session closed for user root Sep 16 05:31:59.891972 sshd[2183]: Connection closed by 139.178.89.65 port 33784 Sep 16 05:31:59.892410 sshd-session[2180]: pam_unix(sshd:session): session closed for user core Sep 16 05:31:59.915117 systemd[1]: sshd@7-139.178.94.21:22-139.178.89.65:33784.service: Deactivated successfully. Sep 16 05:31:59.918828 systemd[1]: session-10.scope: Deactivated successfully. Sep 16 05:31:59.921129 systemd-logind[1896]: Session 10 logged out. Waiting for processes to exit. Sep 16 05:31:59.926423 systemd[1]: Started sshd@8-139.178.94.21:22-139.178.89.65:45896.service - OpenSSH per-connection server daemon (139.178.89.65:45896). Sep 16 05:31:59.928252 systemd-logind[1896]: Removed session 10. Sep 16 05:32:00.016767 sshd[2216]: Accepted publickey for core from 139.178.89.65 port 45896 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:32:00.017459 sshd-session[2216]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:32:00.020676 systemd-logind[1896]: New session 11 of user core. Sep 16 05:32:00.043290 systemd[1]: Started session-11.scope - Session 11 of User core. 
Sep 16 05:32:00.098124 sudo[2220]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 16 05:32:00.099056 sudo[2220]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 16 05:32:00.446917 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 16 05:32:00.468330 (dockerd)[2246]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 16 05:32:00.689319 dockerd[2246]: time="2025-09-16T05:32:00.689259240Z" level=info msg="Starting up" Sep 16 05:32:00.689790 dockerd[2246]: time="2025-09-16T05:32:00.689735518Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 16 05:32:00.695623 dockerd[2246]: time="2025-09-16T05:32:00.695558332Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 16 05:32:00.716215 dockerd[2246]: time="2025-09-16T05:32:00.716145398Z" level=info msg="Loading containers: start." Sep 16 05:32:00.745044 kernel: Initializing XFRM netlink socket Sep 16 05:32:00.884592 systemd-networkd[1822]: docker0: Link UP Sep 16 05:32:00.886248 dockerd[2246]: time="2025-09-16T05:32:00.886203765Z" level=info msg="Loading containers: done." Sep 16 05:32:00.892800 dockerd[2246]: time="2025-09-16T05:32:00.892754338Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 16 05:32:00.892800 dockerd[2246]: time="2025-09-16T05:32:00.892800422Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 16 05:32:00.892875 dockerd[2246]: time="2025-09-16T05:32:00.892839637Z" level=info msg="Initializing buildkit" Sep 16 05:32:00.903229 dockerd[2246]: time="2025-09-16T05:32:00.903182886Z" level=info msg="Completed buildkit initialization" Sep 16 05:32:00.906415 dockerd[2246]: time="2025-09-16T05:32:00.906383502Z" level=info msg="Daemon has completed initialization" Sep 16 05:32:00.906448 dockerd[2246]: time="2025-09-16T05:32:00.906416888Z" level=info msg="API listen on /run/docker.sock" Sep 16 05:32:00.906504 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 16 05:32:01.777445 containerd[1906]: time="2025-09-16T05:32:01.777355872Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.9\"" Sep 16 05:32:02.426416 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1203045505.mount: Deactivated successfully. 
Sep 16 05:32:03.470796 containerd[1906]: time="2025-09-16T05:32:03.470742175Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:03.471003 containerd[1906]: time="2025-09-16T05:32:03.470928621Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.9: active requests=0, bytes read=28837916" Sep 16 05:32:03.471318 containerd[1906]: time="2025-09-16T05:32:03.471280033Z" level=info msg="ImageCreate event name:\"sha256:abd2b525baf428ffb8b8b7d1e09761dc5cdb7ed0c7896a9427e29e84f8eafc59\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:03.472663 containerd[1906]: time="2025-09-16T05:32:03.472619976Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6df11cc2ad9679b1117be34d3a0230add88bc0a08fd7a3ebc26b680575e8de97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:03.473236 containerd[1906]: time="2025-09-16T05:32:03.473193241Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.9\" with image id \"sha256:abd2b525baf428ffb8b8b7d1e09761dc5cdb7ed0c7896a9427e29e84f8eafc59\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:6df11cc2ad9679b1117be34d3a0230add88bc0a08fd7a3ebc26b680575e8de97\", size \"28834515\" in 1.695765729s" Sep 16 05:32:03.473236 containerd[1906]: time="2025-09-16T05:32:03.473213627Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.9\" returns image reference \"sha256:abd2b525baf428ffb8b8b7d1e09761dc5cdb7ed0c7896a9427e29e84f8eafc59\"" Sep 16 05:32:03.473578 containerd[1906]: time="2025-09-16T05:32:03.473528315Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.9\"" Sep 16 05:32:04.682510 containerd[1906]: time="2025-09-16T05:32:04.682454902Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:04.682710 containerd[1906]: time="2025-09-16T05:32:04.682568710Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.9: active requests=0, bytes read=24787027" Sep 16 05:32:04.683090 containerd[1906]: time="2025-09-16T05:32:04.683034470Z" level=info msg="ImageCreate event name:\"sha256:0debe32fbb7223500fcf8c312f2a568a5abd3ed9274d8ec6780cfb30b8861e91\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:04.684398 containerd[1906]: time="2025-09-16T05:32:04.684354964Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:243c4b8e3bce271fcb1b78008ab996ab6976b1a20096deac08338fcd17979922\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:04.684997 containerd[1906]: time="2025-09-16T05:32:04.684953297Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.9\" with image id \"sha256:0debe32fbb7223500fcf8c312f2a568a5abd3ed9274d8ec6780cfb30b8861e91\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:243c4b8e3bce271fcb1b78008ab996ab6976b1a20096deac08338fcd17979922\", size \"26421706\" in 1.211409698s" Sep 16 05:32:04.684997 containerd[1906]: time="2025-09-16T05:32:04.684968772Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.9\" returns image reference \"sha256:0debe32fbb7223500fcf8c312f2a568a5abd3ed9274d8ec6780cfb30b8861e91\"" Sep 16 05:32:04.685213 
containerd[1906]: time="2025-09-16T05:32:04.685190825Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.9\"" Sep 16 05:32:05.703350 containerd[1906]: time="2025-09-16T05:32:05.703296647Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:05.703549 containerd[1906]: time="2025-09-16T05:32:05.703534088Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.9: active requests=0, bytes read=19176289" Sep 16 05:32:05.703826 containerd[1906]: time="2025-09-16T05:32:05.703785197Z" level=info msg="ImageCreate event name:\"sha256:6934c23b154fcb9bf54ed5913782de746735a49f4daa4732285915050cd44ad5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:05.705086 containerd[1906]: time="2025-09-16T05:32:05.705051384Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:50c49520dbd0e8b4076b6a5c77d8014df09ea3d59a73e8bafd2678d51ebb92d5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:05.705971 containerd[1906]: time="2025-09-16T05:32:05.705934669Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.9\" with image id \"sha256:6934c23b154fcb9bf54ed5913782de746735a49f4daa4732285915050cd44ad5\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:50c49520dbd0e8b4076b6a5c77d8014df09ea3d59a73e8bafd2678d51ebb92d5\", size \"20810986\" in 1.020725816s" Sep 16 05:32:05.705971 containerd[1906]: time="2025-09-16T05:32:05.705951407Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.9\" returns image reference \"sha256:6934c23b154fcb9bf54ed5913782de746735a49f4daa4732285915050cd44ad5\"" Sep 16 05:32:05.706226 containerd[1906]: time="2025-09-16T05:32:05.706177014Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.9\"" Sep 16 05:32:06.684444 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2553219504.mount: Deactivated successfully. 
Sep 16 05:32:06.880656 containerd[1906]: time="2025-09-16T05:32:06.880629234Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:06.880847 containerd[1906]: time="2025-09-16T05:32:06.880757191Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.9: active requests=0, bytes read=30924206" Sep 16 05:32:06.881136 containerd[1906]: time="2025-09-16T05:32:06.881094719Z" level=info msg="ImageCreate event name:\"sha256:fa3fdca615a501743d8deb39729a96e731312aac8d96accec061d5265360332f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:06.881971 containerd[1906]: time="2025-09-16T05:32:06.881930172Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:886af02535dc34886e4618b902f8c140d89af57233a245621d29642224516064\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:06.882342 containerd[1906]: time="2025-09-16T05:32:06.882301257Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.9\" with image id \"sha256:fa3fdca615a501743d8deb39729a96e731312aac8d96accec061d5265360332f\", repo tag \"registry.k8s.io/kube-proxy:v1.32.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:886af02535dc34886e4618b902f8c140d89af57233a245621d29642224516064\", size \"30923225\" in 1.176106634s" Sep 16 05:32:06.882342 containerd[1906]: time="2025-09-16T05:32:06.882317462Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.9\" returns image reference \"sha256:fa3fdca615a501743d8deb39729a96e731312aac8d96accec061d5265360332f\"" Sep 16 05:32:06.882603 containerd[1906]: time="2025-09-16T05:32:06.882563054Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Sep 16 05:32:07.386265 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2943547267.mount: Deactivated successfully. 
Sep 16 05:32:07.906261 containerd[1906]: time="2025-09-16T05:32:07.906212024Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:07.906456 containerd[1906]: time="2025-09-16T05:32:07.906421824Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241" Sep 16 05:32:07.906753 containerd[1906]: time="2025-09-16T05:32:07.906718140Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:07.908390 containerd[1906]: time="2025-09-16T05:32:07.908350227Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:07.908839 containerd[1906]: time="2025-09-16T05:32:07.908796204Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.026215765s" Sep 16 05:32:07.908839 containerd[1906]: time="2025-09-16T05:32:07.908814136Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Sep 16 05:32:07.909141 containerd[1906]: time="2025-09-16T05:32:07.909098803Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 16 05:32:08.300749 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 16 05:32:08.304058 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 05:32:08.492874 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1358120510.mount: Deactivated successfully. 
Sep 16 05:32:08.600850 containerd[1906]: time="2025-09-16T05:32:08.600813412Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 16 05:32:08.601312 containerd[1906]: time="2025-09-16T05:32:08.601300394Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Sep 16 05:32:08.601799 containerd[1906]: time="2025-09-16T05:32:08.601786729Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 16 05:32:08.602660 containerd[1906]: time="2025-09-16T05:32:08.602647057Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 16 05:32:08.603105 containerd[1906]: time="2025-09-16T05:32:08.603063577Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 693.949076ms" Sep 16 05:32:08.603105 containerd[1906]: time="2025-09-16T05:32:08.603079246Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Sep 16 05:32:08.603370 containerd[1906]: time="2025-09-16T05:32:08.603329826Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Sep 16 05:32:08.607618 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 05:32:08.609716 (kubelet)[2616]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 16 05:32:08.631292 kubelet[2616]: E0916 05:32:08.631226 2616 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 16 05:32:08.632500 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 16 05:32:08.632596 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 16 05:32:08.632792 systemd[1]: kubelet.service: Consumed 110ms CPU time, 116M memory peak. Sep 16 05:32:09.163159 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount126585228.mount: Deactivated successfully. 
Sep 16 05:32:10.276124 containerd[1906]: time="2025-09-16T05:32:10.276098446Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:10.276364 containerd[1906]: time="2025-09-16T05:32:10.276303575Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57682056" Sep 16 05:32:10.276683 containerd[1906]: time="2025-09-16T05:32:10.276668165Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:10.278070 containerd[1906]: time="2025-09-16T05:32:10.278027868Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:10.279043 containerd[1906]: time="2025-09-16T05:32:10.279026168Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 1.675664942s" Sep 16 05:32:10.279067 containerd[1906]: time="2025-09-16T05:32:10.279043878Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Sep 16 05:32:11.919324 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 05:32:11.919492 systemd[1]: kubelet.service: Consumed 110ms CPU time, 116M memory peak. Sep 16 05:32:11.920782 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 05:32:11.934618 systemd[1]: Reload requested from client PID 2736 ('systemctl') (unit session-11.scope)... Sep 16 05:32:11.934625 systemd[1]: Reloading... Sep 16 05:32:11.970050 zram_generator::config[2780]: No configuration found. Sep 16 05:32:12.115754 systemd[1]: Reloading finished in 180 ms. Sep 16 05:32:12.162561 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 16 05:32:12.162607 systemd[1]: kubelet.service: Failed with result 'signal'. Sep 16 05:32:12.162733 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 05:32:12.163849 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 05:32:12.435464 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 05:32:12.457352 (kubelet)[2847]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 16 05:32:12.477584 kubelet[2847]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 16 05:32:12.477584 kubelet[2847]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 16 05:32:12.477584 kubelet[2847]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 16 05:32:12.477785 kubelet[2847]: I0916 05:32:12.477594 2847 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 16 05:32:12.644052 kubelet[2847]: I0916 05:32:12.644000 2847 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 16 05:32:12.644052 kubelet[2847]: I0916 05:32:12.644011 2847 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 16 05:32:12.644179 kubelet[2847]: I0916 05:32:12.644139 2847 server.go:954] "Client rotation is on, will bootstrap in background" Sep 16 05:32:12.672522 kubelet[2847]: E0916 05:32:12.672471 2847 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.94.21:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.94.21:6443: connect: connection refused" logger="UnhandledError" Sep 16 05:32:12.674249 kubelet[2847]: I0916 05:32:12.674212 2847 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 16 05:32:12.680395 kubelet[2847]: I0916 05:32:12.680386 2847 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 16 05:32:12.690205 kubelet[2847]: I0916 05:32:12.690114 2847 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 16 05:32:12.691250 kubelet[2847]: I0916 05:32:12.691209 2847 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 16 05:32:12.691335 kubelet[2847]: I0916 05:32:12.691226 2847 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459.0.0-n-e7bf2d745b","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 16 05:32:12.691335 kubelet[2847]: I0916 05:32:12.691317 2847 topology_manager.go:138] "Creating topology manager with none policy" Sep 16 
05:32:12.691335 kubelet[2847]: I0916 05:32:12.691323 2847 container_manager_linux.go:304] "Creating device plugin manager" Sep 16 05:32:12.691417 kubelet[2847]: I0916 05:32:12.691384 2847 state_mem.go:36] "Initialized new in-memory state store" Sep 16 05:32:12.694804 kubelet[2847]: I0916 05:32:12.694768 2847 kubelet.go:446] "Attempting to sync node with API server" Sep 16 05:32:12.694804 kubelet[2847]: I0916 05:32:12.694782 2847 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 16 05:32:12.694804 kubelet[2847]: I0916 05:32:12.694792 2847 kubelet.go:352] "Adding apiserver pod source" Sep 16 05:32:12.694804 kubelet[2847]: I0916 05:32:12.694798 2847 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 16 05:32:12.696548 kubelet[2847]: W0916 05:32:12.696495 2847 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.94.21:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459.0.0-n-e7bf2d745b&limit=500&resourceVersion=0": dial tcp 139.178.94.21:6443: connect: connection refused Sep 16 05:32:12.696548 kubelet[2847]: E0916 05:32:12.696529 2847 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.94.21:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459.0.0-n-e7bf2d745b&limit=500&resourceVersion=0\": dial tcp 139.178.94.21:6443: connect: connection refused" logger="UnhandledError" Sep 16 05:32:12.696624 kubelet[2847]: W0916 05:32:12.696569 2847 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.94.21:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.94.21:6443: connect: connection refused Sep 16 05:32:12.696624 kubelet[2847]: E0916 05:32:12.696609 2847 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.94.21:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.94.21:6443: connect: connection refused" logger="UnhandledError" Sep 16 05:32:12.697416 kubelet[2847]: I0916 05:32:12.697358 2847 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 16 05:32:12.697697 kubelet[2847]: I0916 05:32:12.697662 2847 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 16 05:32:12.698273 kubelet[2847]: W0916 05:32:12.698229 2847 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Sep 16 05:32:12.699897 kubelet[2847]: I0916 05:32:12.699860 2847 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 16 05:32:12.699897 kubelet[2847]: I0916 05:32:12.699881 2847 server.go:1287] "Started kubelet" Sep 16 05:32:12.700005 kubelet[2847]: I0916 05:32:12.699959 2847 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 16 05:32:12.704165 kubelet[2847]: I0916 05:32:12.704025 2847 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 16 05:32:12.704346 kubelet[2847]: I0916 05:32:12.704334 2847 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 16 05:32:12.704588 kubelet[2847]: I0916 05:32:12.704481 2847 server.go:479] "Adding debug handlers to kubelet server" Sep 16 05:32:12.704627 kubelet[2847]: I0916 05:32:12.704614 2847 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 16 05:32:12.704666 kubelet[2847]: I0916 05:32:12.704648 2847 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 16 05:32:12.704666 kubelet[2847]: I0916 05:32:12.704662 2847 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 16 05:32:12.704721 kubelet[2847]: E0916 05:32:12.704671 2847 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459.0.0-n-e7bf2d745b\" not found" Sep 16 05:32:12.704721 kubelet[2847]: I0916 05:32:12.704707 2847 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 16 05:32:12.704771 kubelet[2847]: I0916 05:32:12.704756 2847 reconciler.go:26] "Reconciler: start to sync state" Sep 16 05:32:12.704827 kubelet[2847]: E0916 05:32:12.704817 2847 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 16 05:32:12.705069 kubelet[2847]: E0916 05:32:12.705040 2847 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.94.21:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.0.0-n-e7bf2d745b?timeout=10s\": dial tcp 139.178.94.21:6443: connect: connection refused" interval="200ms" Sep 16 05:32:12.705172 kubelet[2847]: W0916 05:32:12.705122 2847 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.94.21:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.94.21:6443: connect: connection refused Sep 16 05:32:12.705397 kubelet[2847]: E0916 05:32:12.705167 2847 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.94.21:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.94.21:6443: connect: connection refused" logger="UnhandledError" Sep 16 05:32:12.705473 kubelet[2847]: I0916 05:32:12.705462 2847 factory.go:221] Registration of the systemd container factory successfully Sep 16 05:32:12.705997 kubelet[2847]: I0916 05:32:12.705980 2847 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 16 05:32:12.706434 kubelet[2847]: I0916 05:32:12.706426 2847 factory.go:221] Registration of the containerd container factory successfully Sep 16 05:32:12.706835 kubelet[2847]: E0916 05:32:12.705995 2847 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.94.21:6443/api/v1/namespaces/default/events\": dial tcp 139.178.94.21:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459.0.0-n-e7bf2d745b.1865ac5a2373b838 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459.0.0-n-e7bf2d745b,UID:ci-4459.0.0-n-e7bf2d745b,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459.0.0-n-e7bf2d745b,},FirstTimestamp:2025-09-16 05:32:12.699867192 +0000 UTC m=+0.240748780,LastTimestamp:2025-09-16 05:32:12.699867192 +0000 UTC m=+0.240748780,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459.0.0-n-e7bf2d745b,}" Sep 16 05:32:12.713034 kubelet[2847]: I0916 05:32:12.713011 2847 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 16 05:32:12.713385 kubelet[2847]: I0916 05:32:12.713376 2847 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 16 05:32:12.713411 kubelet[2847]: I0916 05:32:12.713386 2847 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 16 05:32:12.713411 kubelet[2847]: I0916 05:32:12.713395 2847 state_mem.go:36] "Initialized new in-memory state store" Sep 16 05:32:12.713618 kubelet[2847]: I0916 05:32:12.713609 2847 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 16 05:32:12.713618 kubelet[2847]: I0916 05:32:12.713619 2847 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 16 05:32:12.713672 kubelet[2847]: I0916 05:32:12.713630 2847 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 16 05:32:12.713672 kubelet[2847]: I0916 05:32:12.713635 2847 kubelet.go:2382] "Starting kubelet main sync loop" Sep 16 05:32:12.713672 kubelet[2847]: E0916 05:32:12.713658 2847 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 16 05:32:12.713870 kubelet[2847]: W0916 05:32:12.713855 2847 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.94.21:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.94.21:6443: connect: connection refused Sep 16 05:32:12.713892 kubelet[2847]: E0916 05:32:12.713877 2847 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.94.21:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.94.21:6443: connect: connection refused" logger="UnhandledError" Sep 16 05:32:12.714304 kubelet[2847]: I0916 05:32:12.714297 2847 policy_none.go:49] "None policy: Start" Sep 16 05:32:12.714332 kubelet[2847]: I0916 05:32:12.714305 2847 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 16 05:32:12.714332 kubelet[2847]: I0916 05:32:12.714311 2847 state_mem.go:35] "Initializing new in-memory state store" Sep 16 05:32:12.716868 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 16 05:32:12.729596 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 16 05:32:12.731611 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 16 05:32:12.744690 kubelet[2847]: I0916 05:32:12.744648 2847 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 16 05:32:12.744811 kubelet[2847]: I0916 05:32:12.744772 2847 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 16 05:32:12.744811 kubelet[2847]: I0916 05:32:12.744781 2847 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 16 05:32:12.744924 kubelet[2847]: I0916 05:32:12.744888 2847 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 16 05:32:12.745310 kubelet[2847]: E0916 05:32:12.745270 2847 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 16 05:32:12.745310 kubelet[2847]: E0916 05:32:12.745293 2847 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459.0.0-n-e7bf2d745b\" not found" Sep 16 05:32:12.835929 systemd[1]: Created slice kubepods-burstable-pod08d1eacba28f1d840a07cec5058150c7.slice - libcontainer container kubepods-burstable-pod08d1eacba28f1d840a07cec5058150c7.slice. 
Sep 16 05:32:12.847879 kubelet[2847]: I0916 05:32:12.847757 2847 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:12.848591 kubelet[2847]: E0916 05:32:12.848487 2847 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.94.21:6443/api/v1/nodes\": dial tcp 139.178.94.21:6443: connect: connection refused" node="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:12.873761 kubelet[2847]: E0916 05:32:12.873702 2847 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.0.0-n-e7bf2d745b\" not found" node="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:12.882716 systemd[1]: Created slice kubepods-burstable-pod01e958eb7220d85e7aa731023a4bda2a.slice - libcontainer container kubepods-burstable-pod01e958eb7220d85e7aa731023a4bda2a.slice. Sep 16 05:32:12.887446 kubelet[2847]: E0916 05:32:12.887356 2847 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.0.0-n-e7bf2d745b\" not found" node="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:12.904444 systemd[1]: Created slice kubepods-burstable-pod4971bd86d3627586b223c577998462fa.slice - libcontainer container kubepods-burstable-pod4971bd86d3627586b223c577998462fa.slice. Sep 16 05:32:12.906261 kubelet[2847]: E0916 05:32:12.906190 2847 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.94.21:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.0.0-n-e7bf2d745b?timeout=10s\": dial tcp 139.178.94.21:6443: connect: connection refused" interval="400ms" Sep 16 05:32:12.908979 kubelet[2847]: E0916 05:32:12.908906 2847 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.0.0-n-e7bf2d745b\" not found" node="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:13.006566 kubelet[2847]: I0916 05:32:13.006326 2847 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4971bd86d3627586b223c577998462fa-ca-certs\") pod \"kube-controller-manager-ci-4459.0.0-n-e7bf2d745b\" (UID: \"4971bd86d3627586b223c577998462fa\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:13.006566 kubelet[2847]: I0916 05:32:13.006428 2847 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4971bd86d3627586b223c577998462fa-kubeconfig\") pod \"kube-controller-manager-ci-4459.0.0-n-e7bf2d745b\" (UID: \"4971bd86d3627586b223c577998462fa\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:13.006566 kubelet[2847]: I0916 05:32:13.006543 2847 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/08d1eacba28f1d840a07cec5058150c7-kubeconfig\") pod \"kube-scheduler-ci-4459.0.0-n-e7bf2d745b\" (UID: \"08d1eacba28f1d840a07cec5058150c7\") " pod="kube-system/kube-scheduler-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:13.007071 kubelet[2847]: I0916 05:32:13.006623 2847 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/01e958eb7220d85e7aa731023a4bda2a-ca-certs\") pod \"kube-apiserver-ci-4459.0.0-n-e7bf2d745b\" (UID: \"01e958eb7220d85e7aa731023a4bda2a\") " 
pod="kube-system/kube-apiserver-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:13.007071 kubelet[2847]: I0916 05:32:13.006686 2847 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/01e958eb7220d85e7aa731023a4bda2a-k8s-certs\") pod \"kube-apiserver-ci-4459.0.0-n-e7bf2d745b\" (UID: \"01e958eb7220d85e7aa731023a4bda2a\") " pod="kube-system/kube-apiserver-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:13.007071 kubelet[2847]: I0916 05:32:13.006737 2847 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/01e958eb7220d85e7aa731023a4bda2a-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459.0.0-n-e7bf2d745b\" (UID: \"01e958eb7220d85e7aa731023a4bda2a\") " pod="kube-system/kube-apiserver-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:13.007071 kubelet[2847]: I0916 05:32:13.006799 2847 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/4971bd86d3627586b223c577998462fa-flexvolume-dir\") pod \"kube-controller-manager-ci-4459.0.0-n-e7bf2d745b\" (UID: \"4971bd86d3627586b223c577998462fa\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:13.007071 kubelet[2847]: I0916 05:32:13.006852 2847 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4971bd86d3627586b223c577998462fa-k8s-certs\") pod \"kube-controller-manager-ci-4459.0.0-n-e7bf2d745b\" (UID: \"4971bd86d3627586b223c577998462fa\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:13.007491 kubelet[2847]: I0916 05:32:13.006899 2847 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4971bd86d3627586b223c577998462fa-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459.0.0-n-e7bf2d745b\" (UID: \"4971bd86d3627586b223c577998462fa\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:13.052586 kubelet[2847]: I0916 05:32:13.052523 2847 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:13.053321 kubelet[2847]: E0916 05:32:13.053252 2847 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.94.21:6443/api/v1/nodes\": dial tcp 139.178.94.21:6443: connect: connection refused" node="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:13.176662 containerd[1906]: time="2025-09-16T05:32:13.176554529Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459.0.0-n-e7bf2d745b,Uid:08d1eacba28f1d840a07cec5058150c7,Namespace:kube-system,Attempt:0,}" Sep 16 05:32:13.185910 containerd[1906]: time="2025-09-16T05:32:13.185869603Z" level=info msg="connecting to shim 5054d6dcfa91d6ff583c97e596012570ba4a056127ab809900cf6aedddddb422" address="unix:///run/containerd/s/0dd37241d7dde9adda5cdd9f853a1141b841b9665bb9728cadeb25b143f0deb5" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:32:13.188999 containerd[1906]: time="2025-09-16T05:32:13.188978076Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459.0.0-n-e7bf2d745b,Uid:01e958eb7220d85e7aa731023a4bda2a,Namespace:kube-system,Attempt:0,}" Sep 16 05:32:13.196714 containerd[1906]: 
time="2025-09-16T05:32:13.196690384Z" level=info msg="connecting to shim 490fbb9bbfeb5df8c88c8454de786005df7efa99565d8730ad4100b18900eb67" address="unix:///run/containerd/s/84c9c5d406213da9158b445e525091dc5805d2b6de4709248fd934fbbbcc7edb" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:32:13.210778 containerd[1906]: time="2025-09-16T05:32:13.210726805Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459.0.0-n-e7bf2d745b,Uid:4971bd86d3627586b223c577998462fa,Namespace:kube-system,Attempt:0,}" Sep 16 05:32:13.213154 systemd[1]: Started cri-containerd-5054d6dcfa91d6ff583c97e596012570ba4a056127ab809900cf6aedddddb422.scope - libcontainer container 5054d6dcfa91d6ff583c97e596012570ba4a056127ab809900cf6aedddddb422. Sep 16 05:32:13.217415 systemd[1]: Started cri-containerd-490fbb9bbfeb5df8c88c8454de786005df7efa99565d8730ad4100b18900eb67.scope - libcontainer container 490fbb9bbfeb5df8c88c8454de786005df7efa99565d8730ad4100b18900eb67. Sep 16 05:32:13.235898 containerd[1906]: time="2025-09-16T05:32:13.235872730Z" level=info msg="connecting to shim 5e1b81eca42b7f80be950885ffe02a98551ba7939fc87936ebe93b51e205699e" address="unix:///run/containerd/s/3345b27be0708dc6b672ef850d71d4b86c44978fa8601e10ffaadaae19ca594e" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:32:13.240332 containerd[1906]: time="2025-09-16T05:32:13.240309170Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459.0.0-n-e7bf2d745b,Uid:08d1eacba28f1d840a07cec5058150c7,Namespace:kube-system,Attempt:0,} returns sandbox id \"5054d6dcfa91d6ff583c97e596012570ba4a056127ab809900cf6aedddddb422\"" Sep 16 05:32:13.241692 containerd[1906]: time="2025-09-16T05:32:13.241677139Z" level=info msg="CreateContainer within sandbox \"5054d6dcfa91d6ff583c97e596012570ba4a056127ab809900cf6aedddddb422\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 16 05:32:13.244794 containerd[1906]: time="2025-09-16T05:32:13.244777432Z" level=info msg="Container 5c30ddb75115a2915b7c31b20a27bd1f306555a2a2906dca0b4f583ae40b5454: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:32:13.244872 containerd[1906]: time="2025-09-16T05:32:13.244861370Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459.0.0-n-e7bf2d745b,Uid:01e958eb7220d85e7aa731023a4bda2a,Namespace:kube-system,Attempt:0,} returns sandbox id \"490fbb9bbfeb5df8c88c8454de786005df7efa99565d8730ad4100b18900eb67\"" Sep 16 05:32:13.245760 containerd[1906]: time="2025-09-16T05:32:13.245746041Z" level=info msg="CreateContainer within sandbox \"490fbb9bbfeb5df8c88c8454de786005df7efa99565d8730ad4100b18900eb67\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 16 05:32:13.247614 containerd[1906]: time="2025-09-16T05:32:13.247603587Z" level=info msg="CreateContainer within sandbox \"5054d6dcfa91d6ff583c97e596012570ba4a056127ab809900cf6aedddddb422\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"5c30ddb75115a2915b7c31b20a27bd1f306555a2a2906dca0b4f583ae40b5454\"" Sep 16 05:32:13.247851 containerd[1906]: time="2025-09-16T05:32:13.247839736Z" level=info msg="StartContainer for \"5c30ddb75115a2915b7c31b20a27bd1f306555a2a2906dca0b4f583ae40b5454\"" Sep 16 05:32:13.248419 containerd[1906]: time="2025-09-16T05:32:13.248403739Z" level=info msg="connecting to shim 5c30ddb75115a2915b7c31b20a27bd1f306555a2a2906dca0b4f583ae40b5454" address="unix:///run/containerd/s/0dd37241d7dde9adda5cdd9f853a1141b841b9665bb9728cadeb25b143f0deb5" protocol=ttrpc version=3 Sep 16 
05:32:13.249317 containerd[1906]: time="2025-09-16T05:32:13.249305969Z" level=info msg="Container 572844a0ff62b781b4dbd452b9866d35ecb733bee993fa5fb1b1d653f036112b: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:32:13.251930 containerd[1906]: time="2025-09-16T05:32:13.251917359Z" level=info msg="CreateContainer within sandbox \"490fbb9bbfeb5df8c88c8454de786005df7efa99565d8730ad4100b18900eb67\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"572844a0ff62b781b4dbd452b9866d35ecb733bee993fa5fb1b1d653f036112b\"" Sep 16 05:32:13.252205 systemd[1]: Started cri-containerd-5e1b81eca42b7f80be950885ffe02a98551ba7939fc87936ebe93b51e205699e.scope - libcontainer container 5e1b81eca42b7f80be950885ffe02a98551ba7939fc87936ebe93b51e205699e. Sep 16 05:32:13.252331 containerd[1906]: time="2025-09-16T05:32:13.252255617Z" level=info msg="StartContainer for \"572844a0ff62b781b4dbd452b9866d35ecb733bee993fa5fb1b1d653f036112b\"" Sep 16 05:32:13.252843 containerd[1906]: time="2025-09-16T05:32:13.252831487Z" level=info msg="connecting to shim 572844a0ff62b781b4dbd452b9866d35ecb733bee993fa5fb1b1d653f036112b" address="unix:///run/containerd/s/84c9c5d406213da9158b445e525091dc5805d2b6de4709248fd934fbbbcc7edb" protocol=ttrpc version=3 Sep 16 05:32:13.255613 systemd[1]: Started cri-containerd-5c30ddb75115a2915b7c31b20a27bd1f306555a2a2906dca0b4f583ae40b5454.scope - libcontainer container 5c30ddb75115a2915b7c31b20a27bd1f306555a2a2906dca0b4f583ae40b5454. Sep 16 05:32:13.259328 systemd[1]: Started cri-containerd-572844a0ff62b781b4dbd452b9866d35ecb733bee993fa5fb1b1d653f036112b.scope - libcontainer container 572844a0ff62b781b4dbd452b9866d35ecb733bee993fa5fb1b1d653f036112b. Sep 16 05:32:13.277856 containerd[1906]: time="2025-09-16T05:32:13.277835994Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459.0.0-n-e7bf2d745b,Uid:4971bd86d3627586b223c577998462fa,Namespace:kube-system,Attempt:0,} returns sandbox id \"5e1b81eca42b7f80be950885ffe02a98551ba7939fc87936ebe93b51e205699e\"" Sep 16 05:32:13.280221 containerd[1906]: time="2025-09-16T05:32:13.280199719Z" level=info msg="CreateContainer within sandbox \"5e1b81eca42b7f80be950885ffe02a98551ba7939fc87936ebe93b51e205699e\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 16 05:32:13.283035 containerd[1906]: time="2025-09-16T05:32:13.283005764Z" level=info msg="StartContainer for \"5c30ddb75115a2915b7c31b20a27bd1f306555a2a2906dca0b4f583ae40b5454\" returns successfully" Sep 16 05:32:13.283815 containerd[1906]: time="2025-09-16T05:32:13.283798841Z" level=info msg="Container 43127a4d78b07ca19e455890751a4d095ef127f7d3af144757390b3435bb58a1: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:32:13.286198 containerd[1906]: time="2025-09-16T05:32:13.286174995Z" level=info msg="CreateContainer within sandbox \"5e1b81eca42b7f80be950885ffe02a98551ba7939fc87936ebe93b51e205699e\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"43127a4d78b07ca19e455890751a4d095ef127f7d3af144757390b3435bb58a1\"" Sep 16 05:32:13.286267 containerd[1906]: time="2025-09-16T05:32:13.286252986Z" level=info msg="StartContainer for \"572844a0ff62b781b4dbd452b9866d35ecb733bee993fa5fb1b1d653f036112b\" returns successfully" Sep 16 05:32:13.286470 containerd[1906]: time="2025-09-16T05:32:13.286457379Z" level=info msg="StartContainer for \"43127a4d78b07ca19e455890751a4d095ef127f7d3af144757390b3435bb58a1\"" Sep 16 05:32:13.287031 containerd[1906]: time="2025-09-16T05:32:13.287019721Z" 
level=info msg="connecting to shim 43127a4d78b07ca19e455890751a4d095ef127f7d3af144757390b3435bb58a1" address="unix:///run/containerd/s/3345b27be0708dc6b672ef850d71d4b86c44978fa8601e10ffaadaae19ca594e" protocol=ttrpc version=3 Sep 16 05:32:13.302213 systemd[1]: Started cri-containerd-43127a4d78b07ca19e455890751a4d095ef127f7d3af144757390b3435bb58a1.scope - libcontainer container 43127a4d78b07ca19e455890751a4d095ef127f7d3af144757390b3435bb58a1. Sep 16 05:32:13.330844 containerd[1906]: time="2025-09-16T05:32:13.330820862Z" level=info msg="StartContainer for \"43127a4d78b07ca19e455890751a4d095ef127f7d3af144757390b3435bb58a1\" returns successfully" Sep 16 05:32:13.455468 kubelet[2847]: I0916 05:32:13.455447 2847 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:13.716524 kubelet[2847]: E0916 05:32:13.716506 2847 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.0.0-n-e7bf2d745b\" not found" node="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:13.716933 kubelet[2847]: E0916 05:32:13.716925 2847 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.0.0-n-e7bf2d745b\" not found" node="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:13.717576 kubelet[2847]: E0916 05:32:13.717551 2847 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.0.0-n-e7bf2d745b\" not found" node="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:13.759746 kubelet[2847]: E0916 05:32:13.759702 2847 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459.0.0-n-e7bf2d745b\" not found" node="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:13.859255 kubelet[2847]: I0916 05:32:13.859236 2847 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:13.859328 kubelet[2847]: E0916 05:32:13.859261 2847 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4459.0.0-n-e7bf2d745b\": node \"ci-4459.0.0-n-e7bf2d745b\" not found" Sep 16 05:32:13.863989 kubelet[2847]: E0916 05:32:13.863971 2847 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459.0.0-n-e7bf2d745b\" not found" Sep 16 05:32:13.905340 kubelet[2847]: I0916 05:32:13.905293 2847 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:13.907568 kubelet[2847]: E0916 05:32:13.907556 2847 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459.0.0-n-e7bf2d745b\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:13.907597 kubelet[2847]: I0916 05:32:13.907570 2847 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:13.908324 kubelet[2847]: E0916 05:32:13.908316 2847 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459.0.0-n-e7bf2d745b\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:13.908352 kubelet[2847]: I0916 05:32:13.908326 2847 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:13.909064 kubelet[2847]: E0916 05:32:13.909023 
2847 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459.0.0-n-e7bf2d745b\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:14.695758 kubelet[2847]: I0916 05:32:14.695671 2847 apiserver.go:52] "Watching apiserver" Sep 16 05:32:14.705644 kubelet[2847]: I0916 05:32:14.705602 2847 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 16 05:32:14.719422 kubelet[2847]: I0916 05:32:14.719338 2847 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:14.720154 kubelet[2847]: I0916 05:32:14.719523 2847 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:14.722984 kubelet[2847]: E0916 05:32:14.722883 2847 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459.0.0-n-e7bf2d745b\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:14.723194 kubelet[2847]: E0916 05:32:14.723147 2847 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459.0.0-n-e7bf2d745b\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:15.914489 systemd[1]: Reload requested from client PID 3163 ('systemctl') (unit session-11.scope)... Sep 16 05:32:15.914496 systemd[1]: Reloading... Sep 16 05:32:15.955074 zram_generator::config[3208]: No configuration found. Sep 16 05:32:16.108276 systemd[1]: Reloading finished in 193 ms. Sep 16 05:32:16.131496 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 05:32:16.141848 systemd[1]: kubelet.service: Deactivated successfully. Sep 16 05:32:16.141975 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 05:32:16.142007 systemd[1]: kubelet.service: Consumed 672ms CPU time, 139.5M memory peak. Sep 16 05:32:16.142902 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 05:32:16.419484 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 05:32:16.424945 (kubelet)[3272]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 16 05:32:16.464621 kubelet[3272]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 16 05:32:16.464621 kubelet[3272]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 16 05:32:16.464621 kubelet[3272]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
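The "no PriorityClass with name system-node-critical was found" failures above are transient: mirror pods for the static control-plane pods cannot be created until the API server has finished bootstrapping its built-in priority classes, after which the kubelet's retries succeed. The sketch below only makes the missing object concrete with the Python kubernetes client; it is not the author's tooling, the kubeconfig path is an assumption, and in a normal bootstrap the class is created by the API server itself rather than by hand.

# priorityclass_check.py - inspect (or, for illustration, create) system-node-critical.
from kubernetes import client, config
from kubernetes.client.rest import ApiException

config.load_kube_config()                       # assumption: an admin kubeconfig is available
sched = client.SchedulingV1Api()

try:
    pc = sched.read_priority_class("system-node-critical")
    print("exists with value", pc.value)
except ApiException as exc:
    if exc.status != 404:
        raise
    body = client.V1PriorityClass(
        metadata=client.V1ObjectMeta(name="system-node-critical"),
        value=2000001000,                       # conventional value for this built-in class
        description="Used for system critical pods that must not be moved from their current node.",
    )
    sched.create_priority_class(body)
    print("created system-node-critical")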
Sep 16 05:32:16.464881 kubelet[3272]: I0916 05:32:16.464682 3272 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 16 05:32:16.470117 kubelet[3272]: I0916 05:32:16.470100 3272 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 16 05:32:16.470117 kubelet[3272]: I0916 05:32:16.470115 3272 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 16 05:32:16.470272 kubelet[3272]: I0916 05:32:16.470265 3272 server.go:954] "Client rotation is on, will bootstrap in background" Sep 16 05:32:16.471054 kubelet[3272]: I0916 05:32:16.471045 3272 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 16 05:32:16.472483 kubelet[3272]: I0916 05:32:16.472473 3272 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 16 05:32:16.474394 kubelet[3272]: I0916 05:32:16.474384 3272 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 16 05:32:16.481939 kubelet[3272]: I0916 05:32:16.481927 3272 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 16 05:32:16.482059 kubelet[3272]: I0916 05:32:16.482044 3272 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 16 05:32:16.482162 kubelet[3272]: I0916 05:32:16.482060 3272 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459.0.0-n-e7bf2d745b","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 16 05:32:16.482220 kubelet[3272]: I0916 05:32:16.482169 3272 topology_manager.go:138] "Creating topology manager with none policy" Sep 16 05:32:16.482220 kubelet[3272]: I0916 05:32:16.482175 3272 container_manager_linux.go:304] "Creating device plugin manager" Sep 16 05:32:16.482220 kubelet[3272]: I0916 05:32:16.482204 3272 state_mem.go:36] "Initialized new in-memory state store" Sep 16 05:32:16.482314 
kubelet[3272]: I0916 05:32:16.482309 3272 kubelet.go:446] "Attempting to sync node with API server" Sep 16 05:32:16.482332 kubelet[3272]: I0916 05:32:16.482321 3272 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 16 05:32:16.482350 kubelet[3272]: I0916 05:32:16.482332 3272 kubelet.go:352] "Adding apiserver pod source" Sep 16 05:32:16.482350 kubelet[3272]: I0916 05:32:16.482338 3272 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 16 05:32:16.482688 kubelet[3272]: I0916 05:32:16.482676 3272 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 16 05:32:16.482918 kubelet[3272]: I0916 05:32:16.482912 3272 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 16 05:32:16.483162 kubelet[3272]: I0916 05:32:16.483156 3272 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 16 05:32:16.483188 kubelet[3272]: I0916 05:32:16.483172 3272 server.go:1287] "Started kubelet" Sep 16 05:32:16.483246 kubelet[3272]: I0916 05:32:16.483221 3272 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 16 05:32:16.483286 kubelet[3272]: I0916 05:32:16.483222 3272 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 16 05:32:16.483410 kubelet[3272]: I0916 05:32:16.483400 3272 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 16 05:32:16.483941 kubelet[3272]: I0916 05:32:16.483933 3272 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 16 05:32:16.484018 kubelet[3272]: I0916 05:32:16.484005 3272 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 16 05:32:16.484057 kubelet[3272]: E0916 05:32:16.484008 3272 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459.0.0-n-e7bf2d745b\" not found" Sep 16 05:32:16.484057 kubelet[3272]: I0916 05:32:16.484030 3272 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 16 05:32:16.484105 kubelet[3272]: I0916 05:32:16.484088 3272 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 16 05:32:16.484268 kubelet[3272]: I0916 05:32:16.484255 3272 reconciler.go:26] "Reconciler: start to sync state" Sep 16 05:32:16.484465 kubelet[3272]: I0916 05:32:16.484290 3272 server.go:479] "Adding debug handlers to kubelet server" Sep 16 05:32:16.484725 kubelet[3272]: E0916 05:32:16.484702 3272 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 16 05:32:16.484876 kubelet[3272]: I0916 05:32:16.484866 3272 factory.go:221] Registration of the systemd container factory successfully Sep 16 05:32:16.485322 kubelet[3272]: I0916 05:32:16.484955 3272 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 16 05:32:16.486953 kubelet[3272]: I0916 05:32:16.486942 3272 factory.go:221] Registration of the containerd container factory successfully Sep 16 05:32:16.491326 kubelet[3272]: I0916 05:32:16.491298 3272 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Sep 16 05:32:16.492031 kubelet[3272]: I0916 05:32:16.492007 3272 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 16 05:32:16.492117 kubelet[3272]: I0916 05:32:16.492107 3272 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 16 05:32:16.492165 kubelet[3272]: I0916 05:32:16.492131 3272 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 16 05:32:16.492165 kubelet[3272]: I0916 05:32:16.492140 3272 kubelet.go:2382] "Starting kubelet main sync loop" Sep 16 05:32:16.492289 kubelet[3272]: E0916 05:32:16.492272 3272 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 16 05:32:16.502364 kubelet[3272]: I0916 05:32:16.502320 3272 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 16 05:32:16.502364 kubelet[3272]: I0916 05:32:16.502330 3272 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 16 05:32:16.502364 kubelet[3272]: I0916 05:32:16.502341 3272 state_mem.go:36] "Initialized new in-memory state store" Sep 16 05:32:16.502455 kubelet[3272]: I0916 05:32:16.502435 3272 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 16 05:32:16.502455 kubelet[3272]: I0916 05:32:16.502442 3272 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 16 05:32:16.502497 kubelet[3272]: I0916 05:32:16.502458 3272 policy_none.go:49] "None policy: Start" Sep 16 05:32:16.502497 kubelet[3272]: I0916 05:32:16.502467 3272 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 16 05:32:16.502497 kubelet[3272]: I0916 05:32:16.502474 3272 state_mem.go:35] "Initializing new in-memory state store" Sep 16 05:32:16.502553 kubelet[3272]: I0916 05:32:16.502545 3272 state_mem.go:75] "Updated machine memory state" Sep 16 05:32:16.504647 kubelet[3272]: I0916 05:32:16.504639 3272 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 16 05:32:16.504728 kubelet[3272]: I0916 05:32:16.504723 3272 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 16 05:32:16.504748 kubelet[3272]: I0916 05:32:16.504730 3272 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 16 05:32:16.504837 kubelet[3272]: I0916 05:32:16.504828 3272 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 16 05:32:16.505314 kubelet[3272]: E0916 05:32:16.505297 3272 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 16 05:32:16.593818 kubelet[3272]: I0916 05:32:16.593711 3272 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:16.594098 kubelet[3272]: I0916 05:32:16.593923 3272 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:16.594098 kubelet[3272]: I0916 05:32:16.594018 3272 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:16.601880 kubelet[3272]: W0916 05:32:16.601795 3272 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 16 05:32:16.602902 kubelet[3272]: W0916 05:32:16.602831 3272 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 16 05:32:16.603150 kubelet[3272]: W0916 05:32:16.602971 3272 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 16 05:32:16.613078 kubelet[3272]: I0916 05:32:16.612961 3272 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:16.622398 kubelet[3272]: I0916 05:32:16.622351 3272 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:16.622569 kubelet[3272]: I0916 05:32:16.622498 3272 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:16.686119 kubelet[3272]: I0916 05:32:16.685857 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/01e958eb7220d85e7aa731023a4bda2a-ca-certs\") pod \"kube-apiserver-ci-4459.0.0-n-e7bf2d745b\" (UID: \"01e958eb7220d85e7aa731023a4bda2a\") " pod="kube-system/kube-apiserver-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:16.686119 kubelet[3272]: I0916 05:32:16.686042 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/01e958eb7220d85e7aa731023a4bda2a-k8s-certs\") pod \"kube-apiserver-ci-4459.0.0-n-e7bf2d745b\" (UID: \"01e958eb7220d85e7aa731023a4bda2a\") " pod="kube-system/kube-apiserver-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:16.686504 kubelet[3272]: I0916 05:32:16.686150 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4971bd86d3627586b223c577998462fa-ca-certs\") pod \"kube-controller-manager-ci-4459.0.0-n-e7bf2d745b\" (UID: \"4971bd86d3627586b223c577998462fa\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:16.686504 kubelet[3272]: I0916 05:32:16.686262 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4971bd86d3627586b223c577998462fa-k8s-certs\") pod \"kube-controller-manager-ci-4459.0.0-n-e7bf2d745b\" (UID: \"4971bd86d3627586b223c577998462fa\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:16.686504 kubelet[3272]: I0916 05:32:16.686363 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4971bd86d3627586b223c577998462fa-kubeconfig\") pod \"kube-controller-manager-ci-4459.0.0-n-e7bf2d745b\" (UID: \"4971bd86d3627586b223c577998462fa\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:16.686504 kubelet[3272]: I0916 05:32:16.686459 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/01e958eb7220d85e7aa731023a4bda2a-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459.0.0-n-e7bf2d745b\" (UID: \"01e958eb7220d85e7aa731023a4bda2a\") " pod="kube-system/kube-apiserver-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:16.686960 kubelet[3272]: I0916 05:32:16.686536 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/4971bd86d3627586b223c577998462fa-flexvolume-dir\") pod \"kube-controller-manager-ci-4459.0.0-n-e7bf2d745b\" (UID: \"4971bd86d3627586b223c577998462fa\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:16.686960 kubelet[3272]: I0916 05:32:16.686621 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4971bd86d3627586b223c577998462fa-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459.0.0-n-e7bf2d745b\" (UID: \"4971bd86d3627586b223c577998462fa\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:16.686960 kubelet[3272]: I0916 05:32:16.686705 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/08d1eacba28f1d840a07cec5058150c7-kubeconfig\") pod \"kube-scheduler-ci-4459.0.0-n-e7bf2d745b\" (UID: \"08d1eacba28f1d840a07cec5058150c7\") " pod="kube-system/kube-scheduler-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:17.482662 kubelet[3272]: I0916 05:32:17.482647 3272 apiserver.go:52] "Watching apiserver" Sep 16 05:32:17.484862 kubelet[3272]: I0916 05:32:17.484850 3272 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 16 05:32:17.498562 kubelet[3272]: I0916 05:32:17.498498 3272 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:17.498796 kubelet[3272]: I0916 05:32:17.498678 3272 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:17.510884 kubelet[3272]: W0916 05:32:17.510871 3272 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 16 05:32:17.510932 kubelet[3272]: E0916 05:32:17.510903 3272 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459.0.0-n-e7bf2d745b\" already exists" pod="kube-system/kube-apiserver-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:17.511010 kubelet[3272]: W0916 05:32:17.511000 3272 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 16 05:32:17.511050 kubelet[3272]: E0916 05:32:17.511024 3272 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459.0.0-n-e7bf2d745b\" already exists" 
pod="kube-system/kube-scheduler-ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:17.518375 kubelet[3272]: I0916 05:32:17.518345 3272 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459.0.0-n-e7bf2d745b" podStartSLOduration=1.518337989 podStartE2EDuration="1.518337989s" podCreationTimestamp="2025-09-16 05:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 05:32:17.514565614 +0000 UTC m=+1.085491650" watchObservedRunningTime="2025-09-16 05:32:17.518337989 +0000 UTC m=+1.089264024" Sep 16 05:32:17.518456 kubelet[3272]: I0916 05:32:17.518402 3272 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459.0.0-n-e7bf2d745b" podStartSLOduration=1.518398833 podStartE2EDuration="1.518398833s" podCreationTimestamp="2025-09-16 05:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 05:32:17.518388755 +0000 UTC m=+1.089314801" watchObservedRunningTime="2025-09-16 05:32:17.518398833 +0000 UTC m=+1.089324869" Sep 16 05:32:17.526170 kubelet[3272]: I0916 05:32:17.526115 3272 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459.0.0-n-e7bf2d745b" podStartSLOduration=1.5261065139999999 podStartE2EDuration="1.526106514s" podCreationTimestamp="2025-09-16 05:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 05:32:17.521650958 +0000 UTC m=+1.092576995" watchObservedRunningTime="2025-09-16 05:32:17.526106514 +0000 UTC m=+1.097032550" Sep 16 05:32:21.479836 kubelet[3272]: I0916 05:32:21.479767 3272 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 16 05:32:21.480899 containerd[1906]: time="2025-09-16T05:32:21.480528725Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 16 05:32:21.481577 kubelet[3272]: I0916 05:32:21.480969 3272 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 16 05:32:22.496974 systemd[1]: Created slice kubepods-besteffort-pod855e6f1e_7832_4e56_ac99_532105f574c8.slice - libcontainer container kubepods-besteffort-pod855e6f1e_7832_4e56_ac99_532105f574c8.slice. 
Sep 16 05:32:22.526547 kubelet[3272]: I0916 05:32:22.526443 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/855e6f1e-7832-4e56-ac99-532105f574c8-xtables-lock\") pod \"kube-proxy-bc4wq\" (UID: \"855e6f1e-7832-4e56-ac99-532105f574c8\") " pod="kube-system/kube-proxy-bc4wq" Sep 16 05:32:22.526547 kubelet[3272]: I0916 05:32:22.526516 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/855e6f1e-7832-4e56-ac99-532105f574c8-lib-modules\") pod \"kube-proxy-bc4wq\" (UID: \"855e6f1e-7832-4e56-ac99-532105f574c8\") " pod="kube-system/kube-proxy-bc4wq" Sep 16 05:32:22.527266 kubelet[3272]: I0916 05:32:22.526564 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/855e6f1e-7832-4e56-ac99-532105f574c8-kube-proxy\") pod \"kube-proxy-bc4wq\" (UID: \"855e6f1e-7832-4e56-ac99-532105f574c8\") " pod="kube-system/kube-proxy-bc4wq" Sep 16 05:32:22.527266 kubelet[3272]: I0916 05:32:22.526612 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sflg8\" (UniqueName: \"kubernetes.io/projected/855e6f1e-7832-4e56-ac99-532105f574c8-kube-api-access-sflg8\") pod \"kube-proxy-bc4wq\" (UID: \"855e6f1e-7832-4e56-ac99-532105f574c8\") " pod="kube-system/kube-proxy-bc4wq" Sep 16 05:32:22.644842 systemd[1]: Created slice kubepods-besteffort-pod304092c3_ca1d_47a3_8863_d9c1f75acbc2.slice - libcontainer container kubepods-besteffort-pod304092c3_ca1d_47a3_8863_d9c1f75acbc2.slice. Sep 16 05:32:22.728904 kubelet[3272]: I0916 05:32:22.728838 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lj62\" (UniqueName: \"kubernetes.io/projected/304092c3-ca1d-47a3-8863-d9c1f75acbc2-kube-api-access-8lj62\") pod \"tigera-operator-755d956888-hrg9c\" (UID: \"304092c3-ca1d-47a3-8863-d9c1f75acbc2\") " pod="tigera-operator/tigera-operator-755d956888-hrg9c" Sep 16 05:32:22.728904 kubelet[3272]: I0916 05:32:22.728879 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/304092c3-ca1d-47a3-8863-d9c1f75acbc2-var-lib-calico\") pod \"tigera-operator-755d956888-hrg9c\" (UID: \"304092c3-ca1d-47a3-8863-d9c1f75acbc2\") " pod="tigera-operator/tigera-operator-755d956888-hrg9c" Sep 16 05:32:22.810591 containerd[1906]: time="2025-09-16T05:32:22.810447527Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-bc4wq,Uid:855e6f1e-7832-4e56-ac99-532105f574c8,Namespace:kube-system,Attempt:0,}" Sep 16 05:32:22.835221 containerd[1906]: time="2025-09-16T05:32:22.835195935Z" level=info msg="connecting to shim fe58eada3eb54b61e30cafa3bc1d6fa236948d38fbc7724503433758a90a088b" address="unix:///run/containerd/s/0f027095cc2b2adbc14a8dfc460d7027def53a739f2095bd3f2b7190112b467c" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:32:22.862224 systemd[1]: Started cri-containerd-fe58eada3eb54b61e30cafa3bc1d6fa236948d38fbc7724503433758a90a088b.scope - libcontainer container fe58eada3eb54b61e30cafa3bc1d6fa236948d38fbc7724503433758a90a088b. 
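The reconciler entries above attach four volumes to kube-proxy-bc4wq: the kube-proxy ConfigMap, two host paths (xtables-lock and lib-modules), and a projected kube-api-access service-account token. The sketch below expresses those volume sources with the Python kubernetes client; the host paths and token lifetime are the usual kubeadm defaults and are assumptions, since the log records only the volume names and UIDs.

# kube_proxy_volumes.py - the volume sources behind the reconciler entries above.
from kubernetes import client

volumes = [
    client.V1Volume(
        name="kube-proxy",
        config_map=client.V1ConfigMapVolumeSource(name="kube-proxy"),
    ),
    client.V1Volume(
        name="xtables-lock",
        host_path=client.V1HostPathVolumeSource(path="/run/xtables.lock", type="FileOrCreate"),
    ),
    client.V1Volume(
        name="lib-modules",
        host_path=client.V1HostPathVolumeSource(path="/lib/modules"),
    ),
    client.V1Volume(
        name="kube-api-access-sflg8",
        projected=client.V1ProjectedVolumeSource(
            sources=[
                client.V1VolumeProjection(
                    service_account_token=client.V1ServiceAccountTokenProjection(
                        path="token", expiration_seconds=3607,
                    )
                )
            ]
        ),
    ),
]
print([v.name for v in volumes])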
Sep 16 05:32:22.906255 containerd[1906]: time="2025-09-16T05:32:22.906231365Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-bc4wq,Uid:855e6f1e-7832-4e56-ac99-532105f574c8,Namespace:kube-system,Attempt:0,} returns sandbox id \"fe58eada3eb54b61e30cafa3bc1d6fa236948d38fbc7724503433758a90a088b\"" Sep 16 05:32:22.907538 containerd[1906]: time="2025-09-16T05:32:22.907526091Z" level=info msg="CreateContainer within sandbox \"fe58eada3eb54b61e30cafa3bc1d6fa236948d38fbc7724503433758a90a088b\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 16 05:32:22.915260 containerd[1906]: time="2025-09-16T05:32:22.915241101Z" level=info msg="Container 7d5e0db4d64d76fbf8e6f0f394ebd9273e291c78a1d572494d190f7459553341: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:32:22.948587 containerd[1906]: time="2025-09-16T05:32:22.948522772Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-hrg9c,Uid:304092c3-ca1d-47a3-8863-d9c1f75acbc2,Namespace:tigera-operator,Attempt:0,}" Sep 16 05:32:23.014367 containerd[1906]: time="2025-09-16T05:32:23.014316287Z" level=info msg="CreateContainer within sandbox \"fe58eada3eb54b61e30cafa3bc1d6fa236948d38fbc7724503433758a90a088b\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"7d5e0db4d64d76fbf8e6f0f394ebd9273e291c78a1d572494d190f7459553341\"" Sep 16 05:32:23.014765 containerd[1906]: time="2025-09-16T05:32:23.014707543Z" level=info msg="StartContainer for \"7d5e0db4d64d76fbf8e6f0f394ebd9273e291c78a1d572494d190f7459553341\"" Sep 16 05:32:23.015784 containerd[1906]: time="2025-09-16T05:32:23.015764975Z" level=info msg="connecting to shim 7d5e0db4d64d76fbf8e6f0f394ebd9273e291c78a1d572494d190f7459553341" address="unix:///run/containerd/s/0f027095cc2b2adbc14a8dfc460d7027def53a739f2095bd3f2b7190112b467c" protocol=ttrpc version=3 Sep 16 05:32:23.038176 systemd[1]: Started cri-containerd-7d5e0db4d64d76fbf8e6f0f394ebd9273e291c78a1d572494d190f7459553341.scope - libcontainer container 7d5e0db4d64d76fbf8e6f0f394ebd9273e291c78a1d572494d190f7459553341. Sep 16 05:32:23.124735 systemd[1]: Started sshd@9-139.178.94.21:22-45.32.113.42:40966.service - OpenSSH per-connection server daemon (45.32.113.42:40966). Sep 16 05:32:23.141091 containerd[1906]: time="2025-09-16T05:32:23.141058555Z" level=info msg="StartContainer for \"7d5e0db4d64d76fbf8e6f0f394ebd9273e291c78a1d572494d190f7459553341\" returns successfully" Sep 16 05:32:23.326691 containerd[1906]: time="2025-09-16T05:32:23.326626364Z" level=info msg="connecting to shim c6d5e0c6cb0ae06398f29b6eec2f6d712f51ee1805b3bfe2ac636579ecea2c3f" address="unix:///run/containerd/s/0c94087bf1bb6863b268a59792119b344f56cc1502020405b183ca9d3c2a0c66" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:32:23.342145 systemd[1]: Started cri-containerd-c6d5e0c6cb0ae06398f29b6eec2f6d712f51ee1805b3bfe2ac636579ecea2c3f.scope - libcontainer container c6d5e0c6cb0ae06398f29b6eec2f6d712f51ee1805b3bfe2ac636579ecea2c3f. 
Sep 16 05:32:23.395284 containerd[1906]: time="2025-09-16T05:32:23.395203145Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-hrg9c,Uid:304092c3-ca1d-47a3-8863-d9c1f75acbc2,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"c6d5e0c6cb0ae06398f29b6eec2f6d712f51ee1805b3bfe2ac636579ecea2c3f\"" Sep 16 05:32:23.396040 containerd[1906]: time="2025-09-16T05:32:23.396028109Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 16 05:32:23.536829 kubelet[3272]: I0916 05:32:23.536681 3272 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-bc4wq" podStartSLOduration=1.5366337159999999 podStartE2EDuration="1.536633716s" podCreationTimestamp="2025-09-16 05:32:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 05:32:23.536621561 +0000 UTC m=+7.107547662" watchObservedRunningTime="2025-09-16 05:32:23.536633716 +0000 UTC m=+7.107559804" Sep 16 05:32:23.651478 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4129323418.mount: Deactivated successfully. Sep 16 05:32:24.597844 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2047661093.mount: Deactivated successfully. Sep 16 05:32:24.883920 containerd[1906]: time="2025-09-16T05:32:24.883837585Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:24.884123 containerd[1906]: time="2025-09-16T05:32:24.884069932Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 16 05:32:24.884596 containerd[1906]: time="2025-09-16T05:32:24.884572136Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:24.886136 containerd[1906]: time="2025-09-16T05:32:24.886123979Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:24.886564 containerd[1906]: time="2025-09-16T05:32:24.886528603Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 1.490485301s" Sep 16 05:32:24.886564 containerd[1906]: time="2025-09-16T05:32:24.886541548Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 16 05:32:24.887412 containerd[1906]: time="2025-09-16T05:32:24.887402049Z" level=info msg="CreateContainer within sandbox \"c6d5e0c6cb0ae06398f29b6eec2f6d712f51ee1805b3bfe2ac636579ecea2c3f\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 16 05:32:24.890261 containerd[1906]: time="2025-09-16T05:32:24.890223877Z" level=info msg="Container 4bfbbf76ed52de3a495a7d0603093bdc770b284ead0a2df0b0cc4d361043808c: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:32:24.892735 containerd[1906]: time="2025-09-16T05:32:24.892691320Z" level=info msg="CreateContainer within sandbox 
\"c6d5e0c6cb0ae06398f29b6eec2f6d712f51ee1805b3bfe2ac636579ecea2c3f\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"4bfbbf76ed52de3a495a7d0603093bdc770b284ead0a2df0b0cc4d361043808c\"" Sep 16 05:32:24.892875 containerd[1906]: time="2025-09-16T05:32:24.892864396Z" level=info msg="StartContainer for \"4bfbbf76ed52de3a495a7d0603093bdc770b284ead0a2df0b0cc4d361043808c\"" Sep 16 05:32:24.893238 containerd[1906]: time="2025-09-16T05:32:24.893224972Z" level=info msg="connecting to shim 4bfbbf76ed52de3a495a7d0603093bdc770b284ead0a2df0b0cc4d361043808c" address="unix:///run/containerd/s/0c94087bf1bb6863b268a59792119b344f56cc1502020405b183ca9d3c2a0c66" protocol=ttrpc version=3 Sep 16 05:32:24.911133 systemd[1]: Started cri-containerd-4bfbbf76ed52de3a495a7d0603093bdc770b284ead0a2df0b0cc4d361043808c.scope - libcontainer container 4bfbbf76ed52de3a495a7d0603093bdc770b284ead0a2df0b0cc4d361043808c. Sep 16 05:32:24.926130 containerd[1906]: time="2025-09-16T05:32:24.926076732Z" level=info msg="StartContainer for \"4bfbbf76ed52de3a495a7d0603093bdc770b284ead0a2df0b0cc4d361043808c\" returns successfully" Sep 16 05:32:25.541690 kubelet[3272]: I0916 05:32:25.541642 3272 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-hrg9c" podStartSLOduration=2.050549856 podStartE2EDuration="3.541632159s" podCreationTimestamp="2025-09-16 05:32:22 +0000 UTC" firstStartedPulling="2025-09-16 05:32:23.395815689 +0000 UTC m=+6.966741724" lastFinishedPulling="2025-09-16 05:32:24.886897993 +0000 UTC m=+8.457824027" observedRunningTime="2025-09-16 05:32:25.541603372 +0000 UTC m=+9.112529412" watchObservedRunningTime="2025-09-16 05:32:25.541632159 +0000 UTC m=+9.112558194" Sep 16 05:32:25.964498 sshd[3451]: banner exchange: Connection from 45.32.113.42 port 40966: invalid format Sep 16 05:32:25.964884 systemd[1]: sshd@9-139.178.94.21:22-45.32.113.42:40966.service: Deactivated successfully. Sep 16 05:32:29.084491 sudo[2220]: pam_unix(sudo:session): session closed for user root Sep 16 05:32:29.085199 sshd[2219]: Connection closed by 139.178.89.65 port 45896 Sep 16 05:32:29.085369 sshd-session[2216]: pam_unix(sshd:session): session closed for user core Sep 16 05:32:29.087465 systemd[1]: sshd@8-139.178.94.21:22-139.178.89.65:45896.service: Deactivated successfully. Sep 16 05:32:29.088388 systemd[1]: session-11.scope: Deactivated successfully. Sep 16 05:32:29.088490 systemd[1]: session-11.scope: Consumed 3.331s CPU time, 229.2M memory peak. Sep 16 05:32:29.089557 systemd-logind[1896]: Session 11 logged out. Waiting for processes to exit. Sep 16 05:32:29.090340 systemd-logind[1896]: Removed session 11. Sep 16 05:32:29.331708 systemd[1]: Started sshd@10-139.178.94.21:22-45.32.113.42:45676.service - OpenSSH per-connection server daemon (45.32.113.42:45676). Sep 16 05:32:29.468764 sshd[3798]: Connection closed by 45.32.113.42 port 45676 Sep 16 05:32:29.469680 systemd[1]: sshd@10-139.178.94.21:22-45.32.113.42:45676.service: Deactivated successfully. Sep 16 05:32:30.230104 update_engine[1901]: I20250916 05:32:30.230042 1901 update_attempter.cc:509] Updating boot flags... Sep 16 05:32:31.385037 systemd[1]: Created slice kubepods-besteffort-podc25fe39f_c613_404d_97be_23ef1539c79d.slice - libcontainer container kubepods-besteffort-podc25fe39f_c613_404d_97be_23ef1539c79d.slice. 
Sep 16 05:32:31.389999 kubelet[3272]: I0916 05:32:31.389956 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/c25fe39f-c613-404d-97be-23ef1539c79d-typha-certs\") pod \"calico-typha-54b4449544-nq42m\" (UID: \"c25fe39f-c613-404d-97be-23ef1539c79d\") " pod="calico-system/calico-typha-54b4449544-nq42m" Sep 16 05:32:31.390372 kubelet[3272]: I0916 05:32:31.390015 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c25fe39f-c613-404d-97be-23ef1539c79d-tigera-ca-bundle\") pod \"calico-typha-54b4449544-nq42m\" (UID: \"c25fe39f-c613-404d-97be-23ef1539c79d\") " pod="calico-system/calico-typha-54b4449544-nq42m" Sep 16 05:32:31.390372 kubelet[3272]: I0916 05:32:31.390049 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk44h\" (UniqueName: \"kubernetes.io/projected/c25fe39f-c613-404d-97be-23ef1539c79d-kube-api-access-vk44h\") pod \"calico-typha-54b4449544-nq42m\" (UID: \"c25fe39f-c613-404d-97be-23ef1539c79d\") " pod="calico-system/calico-typha-54b4449544-nq42m" Sep 16 05:32:31.671425 systemd[1]: Created slice kubepods-besteffort-pod3a237aa8_ab33_4ea2_9b49_0bc132266cc0.slice - libcontainer container kubepods-besteffort-pod3a237aa8_ab33_4ea2_9b49_0bc132266cc0.slice. Sep 16 05:32:31.688597 containerd[1906]: time="2025-09-16T05:32:31.688535499Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-54b4449544-nq42m,Uid:c25fe39f-c613-404d-97be-23ef1539c79d,Namespace:calico-system,Attempt:0,}" Sep 16 05:32:31.692820 kubelet[3272]: I0916 05:32:31.692775 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/3a237aa8-ab33-4ea2-9b49-0bc132266cc0-flexvol-driver-host\") pod \"calico-node-htsjr\" (UID: \"3a237aa8-ab33-4ea2-9b49-0bc132266cc0\") " pod="calico-system/calico-node-htsjr" Sep 16 05:32:31.692820 kubelet[3272]: I0916 05:32:31.692801 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/3a237aa8-ab33-4ea2-9b49-0bc132266cc0-node-certs\") pod \"calico-node-htsjr\" (UID: \"3a237aa8-ab33-4ea2-9b49-0bc132266cc0\") " pod="calico-system/calico-node-htsjr" Sep 16 05:32:31.692820 kubelet[3272]: I0916 05:32:31.692814 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/3a237aa8-ab33-4ea2-9b49-0bc132266cc0-xtables-lock\") pod \"calico-node-htsjr\" (UID: \"3a237aa8-ab33-4ea2-9b49-0bc132266cc0\") " pod="calico-system/calico-node-htsjr" Sep 16 05:32:31.692897 kubelet[3272]: I0916 05:32:31.692824 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/3a237aa8-ab33-4ea2-9b49-0bc132266cc0-var-run-calico\") pod \"calico-node-htsjr\" (UID: \"3a237aa8-ab33-4ea2-9b49-0bc132266cc0\") " pod="calico-system/calico-node-htsjr" Sep 16 05:32:31.692897 kubelet[3272]: I0916 05:32:31.692834 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a237aa8-ab33-4ea2-9b49-0bc132266cc0-tigera-ca-bundle\") pod \"calico-node-htsjr\" (UID: 
\"3a237aa8-ab33-4ea2-9b49-0bc132266cc0\") " pod="calico-system/calico-node-htsjr" Sep 16 05:32:31.692897 kubelet[3272]: I0916 05:32:31.692842 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4mxb\" (UniqueName: \"kubernetes.io/projected/3a237aa8-ab33-4ea2-9b49-0bc132266cc0-kube-api-access-k4mxb\") pod \"calico-node-htsjr\" (UID: \"3a237aa8-ab33-4ea2-9b49-0bc132266cc0\") " pod="calico-system/calico-node-htsjr" Sep 16 05:32:31.692897 kubelet[3272]: I0916 05:32:31.692851 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/3a237aa8-ab33-4ea2-9b49-0bc132266cc0-cni-bin-dir\") pod \"calico-node-htsjr\" (UID: \"3a237aa8-ab33-4ea2-9b49-0bc132266cc0\") " pod="calico-system/calico-node-htsjr" Sep 16 05:32:31.692897 kubelet[3272]: I0916 05:32:31.692866 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3a237aa8-ab33-4ea2-9b49-0bc132266cc0-lib-modules\") pod \"calico-node-htsjr\" (UID: \"3a237aa8-ab33-4ea2-9b49-0bc132266cc0\") " pod="calico-system/calico-node-htsjr" Sep 16 05:32:31.692974 kubelet[3272]: I0916 05:32:31.692897 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/3a237aa8-ab33-4ea2-9b49-0bc132266cc0-var-lib-calico\") pod \"calico-node-htsjr\" (UID: \"3a237aa8-ab33-4ea2-9b49-0bc132266cc0\") " pod="calico-system/calico-node-htsjr" Sep 16 05:32:31.692974 kubelet[3272]: I0916 05:32:31.692928 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/3a237aa8-ab33-4ea2-9b49-0bc132266cc0-cni-log-dir\") pod \"calico-node-htsjr\" (UID: \"3a237aa8-ab33-4ea2-9b49-0bc132266cc0\") " pod="calico-system/calico-node-htsjr" Sep 16 05:32:31.692974 kubelet[3272]: I0916 05:32:31.692939 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/3a237aa8-ab33-4ea2-9b49-0bc132266cc0-cni-net-dir\") pod \"calico-node-htsjr\" (UID: \"3a237aa8-ab33-4ea2-9b49-0bc132266cc0\") " pod="calico-system/calico-node-htsjr" Sep 16 05:32:31.692974 kubelet[3272]: I0916 05:32:31.692955 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/3a237aa8-ab33-4ea2-9b49-0bc132266cc0-policysync\") pod \"calico-node-htsjr\" (UID: \"3a237aa8-ab33-4ea2-9b49-0bc132266cc0\") " pod="calico-system/calico-node-htsjr" Sep 16 05:32:31.696036 containerd[1906]: time="2025-09-16T05:32:31.696015953Z" level=info msg="connecting to shim 83fac48863741f707a45b1d8dcb4dba49e43d3568ef26622b1fa7640ca25dbab" address="unix:///run/containerd/s/73502a3e3ff80172cb4602c37acd0366438bdf1dda79a93b90e5562d6dc6b799" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:32:31.711195 systemd[1]: Started cri-containerd-83fac48863741f707a45b1d8dcb4dba49e43d3568ef26622b1fa7640ca25dbab.scope - libcontainer container 83fac48863741f707a45b1d8dcb4dba49e43d3568ef26622b1fa7640ca25dbab. 
Sep 16 05:32:31.741029 containerd[1906]: time="2025-09-16T05:32:31.741007789Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-54b4449544-nq42m,Uid:c25fe39f-c613-404d-97be-23ef1539c79d,Namespace:calico-system,Attempt:0,} returns sandbox id \"83fac48863741f707a45b1d8dcb4dba49e43d3568ef26622b1fa7640ca25dbab\"" Sep 16 05:32:31.741637 containerd[1906]: time="2025-09-16T05:32:31.741624618Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 16 05:32:31.794975 kubelet[3272]: E0916 05:32:31.794953 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.794975 kubelet[3272]: W0916 05:32:31.794971 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.795121 kubelet[3272]: E0916 05:32:31.795005 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.796440 kubelet[3272]: E0916 05:32:31.796403 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.796440 kubelet[3272]: W0916 05:32:31.796414 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.796440 kubelet[3272]: E0916 05:32:31.796422 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.800791 kubelet[3272]: E0916 05:32:31.800747 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.800791 kubelet[3272]: W0916 05:32:31.800756 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.800791 kubelet[3272]: E0916 05:32:31.800766 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:31.893625 kubelet[3272]: E0916 05:32:31.893524 3272 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-576jh" podUID="0d3912ee-3cd2-402b-ab83-a92684e5a0cd" Sep 16 05:32:31.977125 containerd[1906]: time="2025-09-16T05:32:31.976964822Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-htsjr,Uid:3a237aa8-ab33-4ea2-9b49-0bc132266cc0,Namespace:calico-system,Attempt:0,}" Sep 16 05:32:31.980301 kubelet[3272]: E0916 05:32:31.980288 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.980301 kubelet[3272]: W0916 05:32:31.980299 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.980367 kubelet[3272]: E0916 05:32:31.980310 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.980413 kubelet[3272]: E0916 05:32:31.980405 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.980413 kubelet[3272]: W0916 05:32:31.980410 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.980470 kubelet[3272]: E0916 05:32:31.980416 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.980495 kubelet[3272]: E0916 05:32:31.980479 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.980495 kubelet[3272]: W0916 05:32:31.980483 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.980495 kubelet[3272]: E0916 05:32:31.980488 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.980577 kubelet[3272]: E0916 05:32:31.980571 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.980577 kubelet[3272]: W0916 05:32:31.980575 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.980614 kubelet[3272]: E0916 05:32:31.980580 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:31.980650 kubelet[3272]: E0916 05:32:31.980644 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.980650 kubelet[3272]: W0916 05:32:31.980649 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.980687 kubelet[3272]: E0916 05:32:31.980653 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.980716 kubelet[3272]: E0916 05:32:31.980711 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.980716 kubelet[3272]: W0916 05:32:31.980715 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.980749 kubelet[3272]: E0916 05:32:31.980719 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.980781 kubelet[3272]: E0916 05:32:31.980776 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.980781 kubelet[3272]: W0916 05:32:31.980780 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.980819 kubelet[3272]: E0916 05:32:31.980785 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.980848 kubelet[3272]: E0916 05:32:31.980843 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.980848 kubelet[3272]: W0916 05:32:31.980847 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.980886 kubelet[3272]: E0916 05:32:31.980852 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.980919 kubelet[3272]: E0916 05:32:31.980912 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.980919 kubelet[3272]: W0916 05:32:31.980918 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.980996 kubelet[3272]: E0916 05:32:31.980922 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:31.980996 kubelet[3272]: E0916 05:32:31.980979 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.980996 kubelet[3272]: W0916 05:32:31.980983 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.980996 kubelet[3272]: E0916 05:32:31.980992 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.981065 kubelet[3272]: E0916 05:32:31.981056 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.981065 kubelet[3272]: W0916 05:32:31.981060 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.981065 kubelet[3272]: E0916 05:32:31.981064 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.981126 kubelet[3272]: E0916 05:32:31.981121 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.981126 kubelet[3272]: W0916 05:32:31.981125 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.981161 kubelet[3272]: E0916 05:32:31.981129 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.981192 kubelet[3272]: E0916 05:32:31.981187 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.981192 kubelet[3272]: W0916 05:32:31.981192 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.981225 kubelet[3272]: E0916 05:32:31.981196 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.981261 kubelet[3272]: E0916 05:32:31.981256 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.981261 kubelet[3272]: W0916 05:32:31.981261 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.981293 kubelet[3272]: E0916 05:32:31.981265 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:31.981328 kubelet[3272]: E0916 05:32:31.981322 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.981328 kubelet[3272]: W0916 05:32:31.981327 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.981373 kubelet[3272]: E0916 05:32:31.981331 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.981394 kubelet[3272]: E0916 05:32:31.981391 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.981409 kubelet[3272]: W0916 05:32:31.981395 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.981409 kubelet[3272]: E0916 05:32:31.981399 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.981467 kubelet[3272]: E0916 05:32:31.981461 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.981467 kubelet[3272]: W0916 05:32:31.981465 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.981523 kubelet[3272]: E0916 05:32:31.981470 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.981554 kubelet[3272]: E0916 05:32:31.981524 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.981554 kubelet[3272]: W0916 05:32:31.981528 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.981554 kubelet[3272]: E0916 05:32:31.981533 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.981618 kubelet[3272]: E0916 05:32:31.981586 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.981618 kubelet[3272]: W0916 05:32:31.981590 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.981618 kubelet[3272]: E0916 05:32:31.981594 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:31.981674 kubelet[3272]: E0916 05:32:31.981659 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.981674 kubelet[3272]: W0916 05:32:31.981663 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.981674 kubelet[3272]: E0916 05:32:31.981667 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.985117 containerd[1906]: time="2025-09-16T05:32:31.985092738Z" level=info msg="connecting to shim 960cd73907568744534ec1cd0c0a235be2b6c9addccffd645996c3ea621461c0" address="unix:///run/containerd/s/c3250fb547e6525fe57da265d5b74a8c89b96274841848b1764abf6032bffd80" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:32:31.995474 kubelet[3272]: E0916 05:32:31.995428 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.995474 kubelet[3272]: W0916 05:32:31.995440 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.995474 kubelet[3272]: E0916 05:32:31.995450 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.995474 kubelet[3272]: I0916 05:32:31.995464 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/0d3912ee-3cd2-402b-ab83-a92684e5a0cd-varrun\") pod \"csi-node-driver-576jh\" (UID: \"0d3912ee-3cd2-402b-ab83-a92684e5a0cd\") " pod="calico-system/csi-node-driver-576jh" Sep 16 05:32:31.995582 kubelet[3272]: E0916 05:32:31.995547 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.995582 kubelet[3272]: W0916 05:32:31.995553 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.995582 kubelet[3272]: E0916 05:32:31.995562 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:31.995582 kubelet[3272]: I0916 05:32:31.995576 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74gh7\" (UniqueName: \"kubernetes.io/projected/0d3912ee-3cd2-402b-ab83-a92684e5a0cd-kube-api-access-74gh7\") pod \"csi-node-driver-576jh\" (UID: \"0d3912ee-3cd2-402b-ab83-a92684e5a0cd\") " pod="calico-system/csi-node-driver-576jh" Sep 16 05:32:31.995760 kubelet[3272]: E0916 05:32:31.995724 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.995760 kubelet[3272]: W0916 05:32:31.995731 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.995760 kubelet[3272]: E0916 05:32:31.995738 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.995817 kubelet[3272]: E0916 05:32:31.995805 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.995817 kubelet[3272]: W0916 05:32:31.995809 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.995817 kubelet[3272]: E0916 05:32:31.995815 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.995892 kubelet[3272]: E0916 05:32:31.995887 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.995892 kubelet[3272]: W0916 05:32:31.995891 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.995927 kubelet[3272]: E0916 05:32:31.995897 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.995927 kubelet[3272]: I0916 05:32:31.995906 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0d3912ee-3cd2-402b-ab83-a92684e5a0cd-socket-dir\") pod \"csi-node-driver-576jh\" (UID: \"0d3912ee-3cd2-402b-ab83-a92684e5a0cd\") " pod="calico-system/csi-node-driver-576jh" Sep 16 05:32:31.995978 kubelet[3272]: E0916 05:32:31.995973 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.996004 kubelet[3272]: W0916 05:32:31.995978 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.996004 kubelet[3272]: E0916 05:32:31.995983 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:31.996004 kubelet[3272]: I0916 05:32:31.995996 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0d3912ee-3cd2-402b-ab83-a92684e5a0cd-registration-dir\") pod \"csi-node-driver-576jh\" (UID: \"0d3912ee-3cd2-402b-ab83-a92684e5a0cd\") " pod="calico-system/csi-node-driver-576jh" Sep 16 05:32:31.996150 kubelet[3272]: E0916 05:32:31.996109 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.996150 kubelet[3272]: W0916 05:32:31.996115 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.996150 kubelet[3272]: E0916 05:32:31.996122 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.996210 kubelet[3272]: E0916 05:32:31.996204 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.996210 kubelet[3272]: W0916 05:32:31.996209 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.996243 kubelet[3272]: E0916 05:32:31.996214 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.996285 kubelet[3272]: E0916 05:32:31.996280 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.996301 kubelet[3272]: W0916 05:32:31.996285 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.996301 kubelet[3272]: E0916 05:32:31.996291 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.996352 kubelet[3272]: E0916 05:32:31.996347 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.996372 kubelet[3272]: W0916 05:32:31.996353 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.996372 kubelet[3272]: E0916 05:32:31.996358 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:31.996436 kubelet[3272]: E0916 05:32:31.996430 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.996455 kubelet[3272]: W0916 05:32:31.996436 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.996455 kubelet[3272]: E0916 05:32:31.996443 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.996455 kubelet[3272]: I0916 05:32:31.996451 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d3912ee-3cd2-402b-ab83-a92684e5a0cd-kubelet-dir\") pod \"csi-node-driver-576jh\" (UID: \"0d3912ee-3cd2-402b-ab83-a92684e5a0cd\") " pod="calico-system/csi-node-driver-576jh" Sep 16 05:32:31.996540 kubelet[3272]: E0916 05:32:31.996535 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.996556 kubelet[3272]: W0916 05:32:31.996540 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.996556 kubelet[3272]: E0916 05:32:31.996546 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.996613 kubelet[3272]: E0916 05:32:31.996608 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.996632 kubelet[3272]: W0916 05:32:31.996613 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.996632 kubelet[3272]: E0916 05:32:31.996618 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.996693 kubelet[3272]: E0916 05:32:31.996688 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.996710 kubelet[3272]: W0916 05:32:31.996693 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.996710 kubelet[3272]: E0916 05:32:31.996698 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:31.996763 kubelet[3272]: E0916 05:32:31.996759 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:31.996763 kubelet[3272]: W0916 05:32:31.996763 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:31.996795 kubelet[3272]: E0916 05:32:31.996768 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:31.999135 systemd[1]: Started cri-containerd-960cd73907568744534ec1cd0c0a235be2b6c9addccffd645996c3ea621461c0.scope - libcontainer container 960cd73907568744534ec1cd0c0a235be2b6c9addccffd645996c3ea621461c0. Sep 16 05:32:32.010855 containerd[1906]: time="2025-09-16T05:32:32.010805800Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-htsjr,Uid:3a237aa8-ab33-4ea2-9b49-0bc132266cc0,Namespace:calico-system,Attempt:0,} returns sandbox id \"960cd73907568744534ec1cd0c0a235be2b6c9addccffd645996c3ea621461c0\"" Sep 16 05:32:32.097110 kubelet[3272]: E0916 05:32:32.097046 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:32.097110 kubelet[3272]: W0916 05:32:32.097071 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:32.097110 kubelet[3272]: E0916 05:32:32.097096 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:32.097362 kubelet[3272]: E0916 05:32:32.097342 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:32.097362 kubelet[3272]: W0916 05:32:32.097358 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:32.097447 kubelet[3272]: E0916 05:32:32.097377 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:32.097679 kubelet[3272]: E0916 05:32:32.097632 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:32.097679 kubelet[3272]: W0916 05:32:32.097648 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:32.097679 kubelet[3272]: E0916 05:32:32.097668 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:32.097920 kubelet[3272]: E0916 05:32:32.097874 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:32.097920 kubelet[3272]: W0916 05:32:32.097888 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:32.097920 kubelet[3272]: E0916 05:32:32.097906 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:32.098119 kubelet[3272]: E0916 05:32:32.098112 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:32.098164 kubelet[3272]: W0916 05:32:32.098124 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:32.098164 kubelet[3272]: E0916 05:32:32.098144 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:32.098394 kubelet[3272]: E0916 05:32:32.098346 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:32.098394 kubelet[3272]: W0916 05:32:32.098358 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:32.098394 kubelet[3272]: E0916 05:32:32.098387 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:32.098552 kubelet[3272]: E0916 05:32:32.098525 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:32.098552 kubelet[3272]: W0916 05:32:32.098539 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:32.098624 kubelet[3272]: E0916 05:32:32.098565 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:32.098703 kubelet[3272]: E0916 05:32:32.098689 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:32.098756 kubelet[3272]: W0916 05:32:32.098704 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:32.098756 kubelet[3272]: E0916 05:32:32.098729 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:32.098875 kubelet[3272]: E0916 05:32:32.098860 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:32.098917 kubelet[3272]: W0916 05:32:32.098875 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:32.098917 kubelet[3272]: E0916 05:32:32.098892 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:32.099092 kubelet[3272]: E0916 05:32:32.099078 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:32.099151 kubelet[3272]: W0916 05:32:32.099093 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:32.099151 kubelet[3272]: E0916 05:32:32.099108 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:32.099306 kubelet[3272]: E0916 05:32:32.099294 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:32.099353 kubelet[3272]: W0916 05:32:32.099305 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:32.099353 kubelet[3272]: E0916 05:32:32.099335 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:32.099467 kubelet[3272]: E0916 05:32:32.099455 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:32.099467 kubelet[3272]: W0916 05:32:32.099466 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:32.099554 kubelet[3272]: E0916 05:32:32.099497 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:32.099632 kubelet[3272]: E0916 05:32:32.099620 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:32.099632 kubelet[3272]: W0916 05:32:32.099631 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:32.099706 kubelet[3272]: E0916 05:32:32.099678 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:32.099788 kubelet[3272]: E0916 05:32:32.099777 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:32.099828 kubelet[3272]: W0916 05:32:32.099788 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:32.099873 kubelet[3272]: E0916 05:32:32.099836 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:32.099945 kubelet[3272]: E0916 05:32:32.099934 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:32.100000 kubelet[3272]: W0916 05:32:32.099945 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:32.100000 kubelet[3272]: E0916 05:32:32.099966 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:32.100143 kubelet[3272]: E0916 05:32:32.100131 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:32.100184 kubelet[3272]: W0916 05:32:32.100142 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:32.100184 kubelet[3272]: E0916 05:32:32.100156 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:32.100366 kubelet[3272]: E0916 05:32:32.100352 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:32.100415 kubelet[3272]: W0916 05:32:32.100365 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:32.100415 kubelet[3272]: E0916 05:32:32.100387 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:32.100568 kubelet[3272]: E0916 05:32:32.100555 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:32.100616 kubelet[3272]: W0916 05:32:32.100567 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:32.100616 kubelet[3272]: E0916 05:32:32.100584 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:32.100813 kubelet[3272]: E0916 05:32:32.100799 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:32.100813 kubelet[3272]: W0916 05:32:32.100812 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:32.100901 kubelet[3272]: E0916 05:32:32.100829 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:32.101066 kubelet[3272]: E0916 05:32:32.101020 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:32.101066 kubelet[3272]: W0916 05:32:32.101035 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:32.101066 kubelet[3272]: E0916 05:32:32.101051 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:32.101298 kubelet[3272]: E0916 05:32:32.101280 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:32.101298 kubelet[3272]: W0916 05:32:32.101293 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:32.101397 kubelet[3272]: E0916 05:32:32.101342 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:32.101472 kubelet[3272]: E0916 05:32:32.101455 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:32.101472 kubelet[3272]: W0916 05:32:32.101470 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:32.101578 kubelet[3272]: E0916 05:32:32.101519 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:32.101661 kubelet[3272]: E0916 05:32:32.101644 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:32.101661 kubelet[3272]: W0916 05:32:32.101657 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:32.101751 kubelet[3272]: E0916 05:32:32.101673 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:32.101949 kubelet[3272]: E0916 05:32:32.101933 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:32.101949 kubelet[3272]: W0916 05:32:32.101947 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:32.102076 kubelet[3272]: E0916 05:32:32.101963 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:32.102306 kubelet[3272]: E0916 05:32:32.102274 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:32.102306 kubelet[3272]: W0916 05:32:32.102296 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:32.102476 kubelet[3272]: E0916 05:32:32.102316 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:32.115654 kubelet[3272]: E0916 05:32:32.115605 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:32.115654 kubelet[3272]: W0916 05:32:32.115635 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:32.115654 kubelet[3272]: E0916 05:32:32.115660 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:33.380341 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1464460023.mount: Deactivated successfully. 
Sep 16 05:32:33.493448 kubelet[3272]: E0916 05:32:33.493322 3272 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-576jh" podUID="0d3912ee-3cd2-402b-ab83-a92684e5a0cd" Sep 16 05:32:34.154336 containerd[1906]: time="2025-09-16T05:32:34.154314211Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:34.154552 containerd[1906]: time="2025-09-16T05:32:34.154474321Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389" Sep 16 05:32:34.154909 containerd[1906]: time="2025-09-16T05:32:34.154898390Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:34.155684 containerd[1906]: time="2025-09-16T05:32:34.155671351Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:34.156124 containerd[1906]: time="2025-09-16T05:32:34.156086523Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 2.414446307s" Sep 16 05:32:34.156124 containerd[1906]: time="2025-09-16T05:32:34.156102983Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\"" Sep 16 05:32:34.156528 containerd[1906]: time="2025-09-16T05:32:34.156516389Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 16 05:32:34.159481 containerd[1906]: time="2025-09-16T05:32:34.159466085Z" level=info msg="CreateContainer within sandbox \"83fac48863741f707a45b1d8dcb4dba49e43d3568ef26622b1fa7640ca25dbab\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 16 05:32:34.162660 containerd[1906]: time="2025-09-16T05:32:34.162643881Z" level=info msg="Container 2f7ab839926cc868ee0471368bd98e10f887629a641eab976753079abb904113: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:32:34.165275 containerd[1906]: time="2025-09-16T05:32:34.165262118Z" level=info msg="CreateContainer within sandbox \"83fac48863741f707a45b1d8dcb4dba49e43d3568ef26622b1fa7640ca25dbab\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"2f7ab839926cc868ee0471368bd98e10f887629a641eab976753079abb904113\"" Sep 16 05:32:34.165534 containerd[1906]: time="2025-09-16T05:32:34.165522464Z" level=info msg="StartContainer for \"2f7ab839926cc868ee0471368bd98e10f887629a641eab976753079abb904113\"" Sep 16 05:32:34.166036 containerd[1906]: time="2025-09-16T05:32:34.166024007Z" level=info msg="connecting to shim 2f7ab839926cc868ee0471368bd98e10f887629a641eab976753079abb904113" address="unix:///run/containerd/s/73502a3e3ff80172cb4602c37acd0366438bdf1dda79a93b90e5562d6dc6b799" protocol=ttrpc version=3 Sep 16 05:32:34.189145 systemd[1]: Started 
cri-containerd-2f7ab839926cc868ee0471368bd98e10f887629a641eab976753079abb904113.scope - libcontainer container 2f7ab839926cc868ee0471368bd98e10f887629a641eab976753079abb904113. Sep 16 05:32:34.218273 containerd[1906]: time="2025-09-16T05:32:34.218248583Z" level=info msg="StartContainer for \"2f7ab839926cc868ee0471368bd98e10f887629a641eab976753079abb904113\" returns successfully" Sep 16 05:32:34.587035 kubelet[3272]: I0916 05:32:34.586873 3272 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-54b4449544-nq42m" podStartSLOduration=1.171866967 podStartE2EDuration="3.586826281s" podCreationTimestamp="2025-09-16 05:32:31 +0000 UTC" firstStartedPulling="2025-09-16 05:32:31.741508951 +0000 UTC m=+15.312434986" lastFinishedPulling="2025-09-16 05:32:34.156468264 +0000 UTC m=+17.727394300" observedRunningTime="2025-09-16 05:32:34.586403651 +0000 UTC m=+18.157329755" watchObservedRunningTime="2025-09-16 05:32:34.586826281 +0000 UTC m=+18.157752385" Sep 16 05:32:34.600078 kubelet[3272]: E0916 05:32:34.599972 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.600078 kubelet[3272]: W0916 05:32:34.600056 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.600418 kubelet[3272]: E0916 05:32:34.600096 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:34.600660 kubelet[3272]: E0916 05:32:34.600611 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.600660 kubelet[3272]: W0916 05:32:34.600647 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.600983 kubelet[3272]: E0916 05:32:34.600681 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:34.601259 kubelet[3272]: E0916 05:32:34.601208 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.601259 kubelet[3272]: W0916 05:32:34.601244 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.601582 kubelet[3272]: E0916 05:32:34.601277 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:34.601909 kubelet[3272]: E0916 05:32:34.601864 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.601909 kubelet[3272]: W0916 05:32:34.601893 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.602265 kubelet[3272]: E0916 05:32:34.601923 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:34.602478 kubelet[3272]: E0916 05:32:34.602420 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.602478 kubelet[3272]: W0916 05:32:34.602451 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.602478 kubelet[3272]: E0916 05:32:34.602481 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:34.602935 kubelet[3272]: E0916 05:32:34.602903 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.602935 kubelet[3272]: W0916 05:32:34.602930 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.603199 kubelet[3272]: E0916 05:32:34.602958 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:34.603458 kubelet[3272]: E0916 05:32:34.603407 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.603458 kubelet[3272]: W0916 05:32:34.603442 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.603669 kubelet[3272]: E0916 05:32:34.603472 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:34.603883 kubelet[3272]: E0916 05:32:34.603835 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.603883 kubelet[3272]: W0916 05:32:34.603862 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.604140 kubelet[3272]: E0916 05:32:34.603897 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:34.604490 kubelet[3272]: E0916 05:32:34.604437 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.604490 kubelet[3272]: W0916 05:32:34.604468 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.604712 kubelet[3272]: E0916 05:32:34.604497 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:34.604949 kubelet[3272]: E0916 05:32:34.604918 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.604949 kubelet[3272]: W0916 05:32:34.604947 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.605209 kubelet[3272]: E0916 05:32:34.604980 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:34.605479 kubelet[3272]: E0916 05:32:34.605439 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.605479 kubelet[3272]: W0916 05:32:34.605467 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.605850 kubelet[3272]: E0916 05:32:34.605494 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:34.606056 kubelet[3272]: E0916 05:32:34.605896 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.606056 kubelet[3272]: W0916 05:32:34.605920 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.606056 kubelet[3272]: E0916 05:32:34.605943 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:34.606489 kubelet[3272]: E0916 05:32:34.606397 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.606489 kubelet[3272]: W0916 05:32:34.606420 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.606489 kubelet[3272]: E0916 05:32:34.606444 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:34.606963 kubelet[3272]: E0916 05:32:34.606852 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.606963 kubelet[3272]: W0916 05:32:34.606876 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.606963 kubelet[3272]: E0916 05:32:34.606900 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:34.607444 kubelet[3272]: E0916 05:32:34.607360 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.607444 kubelet[3272]: W0916 05:32:34.607385 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.607444 kubelet[3272]: E0916 05:32:34.607410 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:34.616213 kubelet[3272]: E0916 05:32:34.616125 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.616213 kubelet[3272]: W0916 05:32:34.616176 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.616530 kubelet[3272]: E0916 05:32:34.616227 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:34.616942 kubelet[3272]: E0916 05:32:34.616858 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.616942 kubelet[3272]: W0916 05:32:34.616902 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.617323 kubelet[3272]: E0916 05:32:34.616958 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:34.617664 kubelet[3272]: E0916 05:32:34.617574 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.617664 kubelet[3272]: W0916 05:32:34.617620 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.617664 kubelet[3272]: E0916 05:32:34.617665 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:34.618287 kubelet[3272]: E0916 05:32:34.618195 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.618287 kubelet[3272]: W0916 05:32:34.618234 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.618287 kubelet[3272]: E0916 05:32:34.618277 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:34.618825 kubelet[3272]: E0916 05:32:34.618762 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.618825 kubelet[3272]: W0916 05:32:34.618791 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.619117 kubelet[3272]: E0916 05:32:34.618918 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:34.619224 kubelet[3272]: E0916 05:32:34.619187 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.619224 kubelet[3272]: W0916 05:32:34.619199 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.619340 kubelet[3272]: E0916 05:32:34.619239 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:34.619430 kubelet[3272]: E0916 05:32:34.619406 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.619430 kubelet[3272]: W0916 05:32:34.619412 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.619481 kubelet[3272]: E0916 05:32:34.619429 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:34.619574 kubelet[3272]: E0916 05:32:34.619567 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.619574 kubelet[3272]: W0916 05:32:34.619573 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.619626 kubelet[3272]: E0916 05:32:34.619582 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:34.619789 kubelet[3272]: E0916 05:32:34.619783 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.619808 kubelet[3272]: W0916 05:32:34.619799 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.619826 kubelet[3272]: E0916 05:32:34.619807 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:34.619891 kubelet[3272]: E0916 05:32:34.619887 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.619911 kubelet[3272]: W0916 05:32:34.619892 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.619911 kubelet[3272]: E0916 05:32:34.619898 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:34.619967 kubelet[3272]: E0916 05:32:34.619962 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.619993 kubelet[3272]: W0916 05:32:34.619967 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.619993 kubelet[3272]: E0916 05:32:34.619972 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:34.620065 kubelet[3272]: E0916 05:32:34.620060 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.620086 kubelet[3272]: W0916 05:32:34.620065 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.620086 kubelet[3272]: E0916 05:32:34.620072 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:34.620143 kubelet[3272]: E0916 05:32:34.620138 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.620159 kubelet[3272]: W0916 05:32:34.620143 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.620159 kubelet[3272]: E0916 05:32:34.620149 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:34.620263 kubelet[3272]: E0916 05:32:34.620258 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.620281 kubelet[3272]: W0916 05:32:34.620264 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.620281 kubelet[3272]: E0916 05:32:34.620272 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:34.620430 kubelet[3272]: E0916 05:32:34.620424 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.620455 kubelet[3272]: W0916 05:32:34.620431 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.620455 kubelet[3272]: E0916 05:32:34.620441 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:34.620533 kubelet[3272]: E0916 05:32:34.620527 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.620533 kubelet[3272]: W0916 05:32:34.620532 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.620589 kubelet[3272]: E0916 05:32:34.620538 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:34.620663 kubelet[3272]: E0916 05:32:34.620653 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.620693 kubelet[3272]: W0916 05:32:34.620663 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.620693 kubelet[3272]: E0916 05:32:34.620672 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:34.620893 kubelet[3272]: E0916 05:32:34.620885 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:34.620893 kubelet[3272]: W0916 05:32:34.620890 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:34.620981 kubelet[3272]: E0916 05:32:34.620895 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:35.020401 systemd[1]: Started sshd@11-139.178.94.21:22-45.32.113.42:47226.service - OpenSSH per-connection server daemon (45.32.113.42:47226). Sep 16 05:32:35.492980 kubelet[3272]: E0916 05:32:35.492952 3272 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-576jh" podUID="0d3912ee-3cd2-402b-ab83-a92684e5a0cd" Sep 16 05:32:35.551142 kubelet[3272]: I0916 05:32:35.551044 3272 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 05:32:35.615520 kubelet[3272]: E0916 05:32:35.615414 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.615520 kubelet[3272]: W0916 05:32:35.615457 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.615520 kubelet[3272]: E0916 05:32:35.615497 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:35.616543 kubelet[3272]: E0916 05:32:35.615940 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.616543 kubelet[3272]: W0916 05:32:35.615967 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.616543 kubelet[3272]: E0916 05:32:35.616027 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:35.616543 kubelet[3272]: E0916 05:32:35.616448 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.616543 kubelet[3272]: W0916 05:32:35.616473 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.616543 kubelet[3272]: E0916 05:32:35.616501 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:35.617072 kubelet[3272]: E0916 05:32:35.616931 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.617072 kubelet[3272]: W0916 05:32:35.616956 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.617072 kubelet[3272]: E0916 05:32:35.616983 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:35.617539 kubelet[3272]: E0916 05:32:35.617462 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.617539 kubelet[3272]: W0916 05:32:35.617490 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.617539 kubelet[3272]: E0916 05:32:35.617522 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:35.617945 kubelet[3272]: E0916 05:32:35.617916 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.617945 kubelet[3272]: W0916 05:32:35.617943 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.618191 kubelet[3272]: E0916 05:32:35.617969 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:35.618470 kubelet[3272]: E0916 05:32:35.618380 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.618470 kubelet[3272]: W0916 05:32:35.618404 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.618470 kubelet[3272]: E0916 05:32:35.618427 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:35.618854 kubelet[3272]: E0916 05:32:35.618823 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.618854 kubelet[3272]: W0916 05:32:35.618847 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.619056 kubelet[3272]: E0916 05:32:35.618873 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:35.619354 kubelet[3272]: E0916 05:32:35.619283 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.619354 kubelet[3272]: W0916 05:32:35.619308 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.619354 kubelet[3272]: E0916 05:32:35.619331 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:35.619815 kubelet[3272]: E0916 05:32:35.619764 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.619815 kubelet[3272]: W0916 05:32:35.619790 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.619815 kubelet[3272]: E0916 05:32:35.619816 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:35.620348 kubelet[3272]: E0916 05:32:35.620268 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.620348 kubelet[3272]: W0916 05:32:35.620304 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.620348 kubelet[3272]: E0916 05:32:35.620338 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:35.620900 kubelet[3272]: E0916 05:32:35.620843 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.620900 kubelet[3272]: W0916 05:32:35.620884 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.621151 kubelet[3272]: E0916 05:32:35.620918 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:35.621502 kubelet[3272]: E0916 05:32:35.621465 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.621632 kubelet[3272]: W0916 05:32:35.621501 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.621632 kubelet[3272]: E0916 05:32:35.621537 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:35.622076 kubelet[3272]: E0916 05:32:35.621980 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.622076 kubelet[3272]: W0916 05:32:35.622069 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.622301 kubelet[3272]: E0916 05:32:35.622103 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:35.622635 kubelet[3272]: E0916 05:32:35.622601 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.622746 kubelet[3272]: W0916 05:32:35.622638 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.622746 kubelet[3272]: E0916 05:32:35.622671 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:35.627071 kubelet[3272]: E0916 05:32:35.627015 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.627071 kubelet[3272]: W0916 05:32:35.627053 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.627360 kubelet[3272]: E0916 05:32:35.627087 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:35.627694 kubelet[3272]: E0916 05:32:35.627614 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.627694 kubelet[3272]: W0916 05:32:35.627649 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.627694 kubelet[3272]: E0916 05:32:35.627691 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:35.628356 kubelet[3272]: E0916 05:32:35.628277 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.628356 kubelet[3272]: W0916 05:32:35.628318 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.628356 kubelet[3272]: E0916 05:32:35.628361 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:35.628954 kubelet[3272]: E0916 05:32:35.628862 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.628954 kubelet[3272]: W0916 05:32:35.628892 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.628954 kubelet[3272]: E0916 05:32:35.628933 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:35.629561 kubelet[3272]: E0916 05:32:35.629482 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.629561 kubelet[3272]: W0916 05:32:35.629518 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.629827 kubelet[3272]: E0916 05:32:35.629610 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:35.630129 kubelet[3272]: E0916 05:32:35.630043 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.630129 kubelet[3272]: W0916 05:32:35.630072 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.630434 kubelet[3272]: E0916 05:32:35.630150 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:35.630628 kubelet[3272]: E0916 05:32:35.630576 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.630628 kubelet[3272]: W0916 05:32:35.630605 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.630825 kubelet[3272]: E0916 05:32:35.630718 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:35.631138 kubelet[3272]: E0916 05:32:35.631068 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.631138 kubelet[3272]: W0916 05:32:35.631094 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.631138 kubelet[3272]: E0916 05:32:35.631127 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:35.631813 kubelet[3272]: E0916 05:32:35.631739 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.631813 kubelet[3272]: W0916 05:32:35.631775 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.631813 kubelet[3272]: E0916 05:32:35.631815 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:35.632306 kubelet[3272]: E0916 05:32:35.632256 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.632306 kubelet[3272]: W0916 05:32:35.632282 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.632539 kubelet[3272]: E0916 05:32:35.632365 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:35.632720 kubelet[3272]: E0916 05:32:35.632673 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.632720 kubelet[3272]: W0916 05:32:35.632698 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.632927 kubelet[3272]: E0916 05:32:35.632801 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:35.633173 kubelet[3272]: E0916 05:32:35.633102 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.633173 kubelet[3272]: W0916 05:32:35.633127 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.633395 kubelet[3272]: E0916 05:32:35.633234 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:35.633561 kubelet[3272]: E0916 05:32:35.633533 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.633675 kubelet[3272]: W0916 05:32:35.633561 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.633675 kubelet[3272]: E0916 05:32:35.633595 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:35.634137 kubelet[3272]: E0916 05:32:35.634088 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.634137 kubelet[3272]: W0916 05:32:35.634113 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.634364 kubelet[3272]: E0916 05:32:35.634145 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:35.634760 kubelet[3272]: E0916 05:32:35.634728 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.634881 kubelet[3272]: W0916 05:32:35.634760 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.634881 kubelet[3272]: E0916 05:32:35.634800 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:35.635385 kubelet[3272]: E0916 05:32:35.635331 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.635385 kubelet[3272]: W0916 05:32:35.635361 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.635627 kubelet[3272]: E0916 05:32:35.635396 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:35.636071 kubelet[3272]: E0916 05:32:35.636018 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.636071 kubelet[3272]: W0916 05:32:35.636059 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.636264 kubelet[3272]: E0916 05:32:35.636096 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 05:32:35.636625 kubelet[3272]: E0916 05:32:35.636545 3272 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 05:32:35.636625 kubelet[3272]: W0916 05:32:35.636573 3272 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 05:32:35.636625 kubelet[3272]: E0916 05:32:35.636602 3272 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 05:32:35.776729 containerd[1906]: time="2025-09-16T05:32:35.776644620Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:35.776923 containerd[1906]: time="2025-09-16T05:32:35.776884175Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 16 05:32:35.777312 containerd[1906]: time="2025-09-16T05:32:35.777272748Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:35.778107 containerd[1906]: time="2025-09-16T05:32:35.778063764Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:35.778421 containerd[1906]: time="2025-09-16T05:32:35.778379174Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.621846372s" Sep 16 05:32:35.778421 containerd[1906]: time="2025-09-16T05:32:35.778395584Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 16 05:32:35.779314 containerd[1906]: time="2025-09-16T05:32:35.779302548Z" level=info msg="CreateContainer within sandbox \"960cd73907568744534ec1cd0c0a235be2b6c9addccffd645996c3ea621461c0\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 16 05:32:35.782410 containerd[1906]: time="2025-09-16T05:32:35.782374221Z" level=info msg="Container 411fa45dbe20ade38f441ee53b42d4253c15a893a43b099a57486747dfdb945e: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:32:35.785537 containerd[1906]: time="2025-09-16T05:32:35.785495274Z" level=info msg="CreateContainer within sandbox \"960cd73907568744534ec1cd0c0a235be2b6c9addccffd645996c3ea621461c0\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"411fa45dbe20ade38f441ee53b42d4253c15a893a43b099a57486747dfdb945e\"" Sep 16 05:32:35.785733 containerd[1906]: time="2025-09-16T05:32:35.785718153Z" level=info msg="StartContainer for \"411fa45dbe20ade38f441ee53b42d4253c15a893a43b099a57486747dfdb945e\"" Sep 16 05:32:35.786543 containerd[1906]: time="2025-09-16T05:32:35.786529197Z" level=info msg="connecting to shim 411fa45dbe20ade38f441ee53b42d4253c15a893a43b099a57486747dfdb945e" address="unix:///run/containerd/s/c3250fb547e6525fe57da265d5b74a8c89b96274841848b1764abf6032bffd80" protocol=ttrpc version=3 Sep 16 05:32:35.807306 systemd[1]: Started cri-containerd-411fa45dbe20ade38f441ee53b42d4253c15a893a43b099a57486747dfdb945e.scope - libcontainer container 411fa45dbe20ade38f441ee53b42d4253c15a893a43b099a57486747dfdb945e. 
Sep 16 05:32:35.825542 containerd[1906]: time="2025-09-16T05:32:35.825514549Z" level=info msg="StartContainer for \"411fa45dbe20ade38f441ee53b42d4253c15a893a43b099a57486747dfdb945e\" returns successfully" Sep 16 05:32:35.828941 systemd[1]: cri-containerd-411fa45dbe20ade38f441ee53b42d4253c15a893a43b099a57486747dfdb945e.scope: Deactivated successfully. Sep 16 05:32:35.829733 containerd[1906]: time="2025-09-16T05:32:35.829713955Z" level=info msg="received exit event container_id:\"411fa45dbe20ade38f441ee53b42d4253c15a893a43b099a57486747dfdb945e\" id:\"411fa45dbe20ade38f441ee53b42d4253c15a893a43b099a57486747dfdb945e\" pid:4150 exited_at:{seconds:1758000755 nanos:829550707}" Sep 16 05:32:35.829773 containerd[1906]: time="2025-09-16T05:32:35.829744730Z" level=info msg="TaskExit event in podsandbox handler container_id:\"411fa45dbe20ade38f441ee53b42d4253c15a893a43b099a57486747dfdb945e\" id:\"411fa45dbe20ade38f441ee53b42d4253c15a893a43b099a57486747dfdb945e\" pid:4150 exited_at:{seconds:1758000755 nanos:829550707}" Sep 16 05:32:35.839574 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-411fa45dbe20ade38f441ee53b42d4253c15a893a43b099a57486747dfdb945e-rootfs.mount: Deactivated successfully. Sep 16 05:32:37.493151 kubelet[3272]: E0916 05:32:37.493054 3272 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-576jh" podUID="0d3912ee-3cd2-402b-ab83-a92684e5a0cd" Sep 16 05:32:37.572707 containerd[1906]: time="2025-09-16T05:32:37.572618310Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 16 05:32:38.367970 sshd[4099]: Invalid user NL5xUDpV2xRa from 45.32.113.42 port 47226 Sep 16 05:32:38.373535 sshd[4099]: userauth_pubkey: parse publickey packet: incomplete message [preauth] Sep 16 05:32:38.377451 systemd[1]: sshd@11-139.178.94.21:22-45.32.113.42:47226.service: Deactivated successfully. 
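The TaskExit events above record exited_at as seconds and nanoseconds since the Unix epoch. As a quick sanity check (illustrative sketch only, using the values from the log), rendering that timestamp in UTC reproduces the 05:32:35 journal time on the same line:

```go
// Quick cross-check: the exited_at field in the TaskExit event above is a
// Unix epoch timestamp, and it lines up with the journal timestamp on the
// same line when rendered in UTC.
package main

import (
	"fmt"
	"time"
)

func main() {
	exitedAt := time.Unix(1758000755, 829550707).UTC()
	fmt.Println(exitedAt.Format("2006-01-02 15:04:05.000000000 MST"))
	// 2025-09-16 05:32:35.829550707 UTC
}
```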
Sep 16 05:32:39.065380 kubelet[3272]: I0916 05:32:39.065299 3272 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 05:32:39.492752 kubelet[3272]: E0916 05:32:39.492617 3272 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-576jh" podUID="0d3912ee-3cd2-402b-ab83-a92684e5a0cd" Sep 16 05:32:40.651579 containerd[1906]: time="2025-09-16T05:32:40.651527473Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:40.651773 containerd[1906]: time="2025-09-16T05:32:40.651722005Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 16 05:32:40.652070 containerd[1906]: time="2025-09-16T05:32:40.652020525Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:40.652892 containerd[1906]: time="2025-09-16T05:32:40.652851800Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:40.653550 containerd[1906]: time="2025-09-16T05:32:40.653508289Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 3.080805843s" Sep 16 05:32:40.653550 containerd[1906]: time="2025-09-16T05:32:40.653524149Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 16 05:32:40.654346 containerd[1906]: time="2025-09-16T05:32:40.654331662Z" level=info msg="CreateContainer within sandbox \"960cd73907568744534ec1cd0c0a235be2b6c9addccffd645996c3ea621461c0\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 16 05:32:40.657622 containerd[1906]: time="2025-09-16T05:32:40.657577973Z" level=info msg="Container 29650e239d3fe93346174a32631faa23c48ae0abe66cae655b047de290c12d96: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:32:40.661677 containerd[1906]: time="2025-09-16T05:32:40.661626611Z" level=info msg="CreateContainer within sandbox \"960cd73907568744534ec1cd0c0a235be2b6c9addccffd645996c3ea621461c0\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"29650e239d3fe93346174a32631faa23c48ae0abe66cae655b047de290c12d96\"" Sep 16 05:32:40.661860 containerd[1906]: time="2025-09-16T05:32:40.661847480Z" level=info msg="StartContainer for \"29650e239d3fe93346174a32631faa23c48ae0abe66cae655b047de290c12d96\"" Sep 16 05:32:40.662619 containerd[1906]: time="2025-09-16T05:32:40.662576956Z" level=info msg="connecting to shim 29650e239d3fe93346174a32631faa23c48ae0abe66cae655b047de290c12d96" address="unix:///run/containerd/s/c3250fb547e6525fe57da265d5b74a8c89b96274841848b1764abf6032bffd80" protocol=ttrpc version=3 Sep 16 05:32:40.684240 systemd[1]: Started 
cri-containerd-29650e239d3fe93346174a32631faa23c48ae0abe66cae655b047de290c12d96.scope - libcontainer container 29650e239d3fe93346174a32631faa23c48ae0abe66cae655b047de290c12d96. Sep 16 05:32:40.702296 containerd[1906]: time="2025-09-16T05:32:40.702276530Z" level=info msg="StartContainer for \"29650e239d3fe93346174a32631faa23c48ae0abe66cae655b047de290c12d96\" returns successfully" Sep 16 05:32:41.317800 containerd[1906]: time="2025-09-16T05:32:41.317765150Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 16 05:32:41.319108 systemd[1]: cri-containerd-29650e239d3fe93346174a32631faa23c48ae0abe66cae655b047de290c12d96.scope: Deactivated successfully. Sep 16 05:32:41.319321 systemd[1]: cri-containerd-29650e239d3fe93346174a32631faa23c48ae0abe66cae655b047de290c12d96.scope: Consumed 371ms CPU time, 193.1M memory peak, 171.3M written to disk. Sep 16 05:32:41.320255 containerd[1906]: time="2025-09-16T05:32:41.320196939Z" level=info msg="received exit event container_id:\"29650e239d3fe93346174a32631faa23c48ae0abe66cae655b047de290c12d96\" id:\"29650e239d3fe93346174a32631faa23c48ae0abe66cae655b047de290c12d96\" pid:4218 exited_at:{seconds:1758000761 nanos:320046225}" Sep 16 05:32:41.320313 containerd[1906]: time="2025-09-16T05:32:41.320269622Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29650e239d3fe93346174a32631faa23c48ae0abe66cae655b047de290c12d96\" id:\"29650e239d3fe93346174a32631faa23c48ae0abe66cae655b047de290c12d96\" pid:4218 exited_at:{seconds:1758000761 nanos:320046225}" Sep 16 05:32:41.336518 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-29650e239d3fe93346174a32631faa23c48ae0abe66cae655b047de290c12d96-rootfs.mount: Deactivated successfully. Sep 16 05:32:41.359627 kubelet[3272]: I0916 05:32:41.359610 3272 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 16 05:32:41.381326 systemd[1]: Created slice kubepods-burstable-pod01b2b6da_0bd4_475d_aa4d_8dc67b3100e4.slice - libcontainer container kubepods-burstable-pod01b2b6da_0bd4_475d_aa4d_8dc67b3100e4.slice. Sep 16 05:32:41.387772 systemd[1]: Created slice kubepods-burstable-pod3262e0d4_7fee_4cbc_9baf_bc833d218f45.slice - libcontainer container kubepods-burstable-pod3262e0d4_7fee_4cbc_9baf_bc833d218f45.slice. Sep 16 05:32:41.393710 systemd[1]: Created slice kubepods-besteffort-pod13d60bc1_c4c8_48bc_b2cf_6befd811be20.slice - libcontainer container kubepods-besteffort-pod13d60bc1_c4c8_48bc_b2cf_6befd811be20.slice. Sep 16 05:32:41.400015 systemd[1]: Created slice kubepods-besteffort-pod31261ed2_aa62_4f40_998f_c07af6124687.slice - libcontainer container kubepods-besteffort-pod31261ed2_aa62_4f40_998f_c07af6124687.slice. Sep 16 05:32:41.405913 systemd[1]: Created slice kubepods-besteffort-podcc5427eb_63f9_46a9_913f_a95379cd7953.slice - libcontainer container kubepods-besteffort-podcc5427eb_63f9_46a9_913f_a95379cd7953.slice. Sep 16 05:32:41.412649 systemd[1]: Created slice kubepods-besteffort-pod5307370e_d59b_4a28_b277_586e844c8a66.slice - libcontainer container kubepods-besteffort-pod5307370e_d59b_4a28_b277_586e844c8a66.slice. Sep 16 05:32:41.418051 systemd[1]: Created slice kubepods-besteffort-pod905ec77c_cbec_4843_b250_4baac4cf8a0c.slice - libcontainer container kubepods-besteffort-pod905ec77c_cbec_4843_b250_4baac4cf8a0c.slice. 
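The containerd error above, "no network config found in /etc/cni/net.d: cni plugin not initialized", like the earlier NetworkReady=false messages, persists until the install-cni container started above finishes writing Calico's configuration into /etc/cni/net.d (the fs change event for /etc/cni/net.d/calico-kubeconfig shows that write landing). A minimal sketch of the condition being reported, assuming only that the CRI plugin treats a directory without .conf/.conflist/.json files as having no usable network config; this is not containerd's actual loader:

```go
// Minimal sketch: the CRI plugin keeps the node NotReady ("cni plugin not
// initialized") while /etc/cni/net.d holds no usable network config. The
// install-cni container started above is what eventually drops a Calico
// conflist into that directory.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func hasCNIConfig(dir string) bool {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false // a missing directory counts as "no config"
	}
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true
		}
	}
	return false
}

func main() {
	if !hasCNIConfig("/etc/cni/net.d") {
		fmt.Println("no network config found in /etc/cni/net.d: cni plugin not initialized")
	}
}
```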
Sep 16 05:32:41.473020 kubelet[3272]: I0916 05:32:41.472861 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13d60bc1-c4c8-48bc-b2cf-6befd811be20-tigera-ca-bundle\") pod \"calico-kube-controllers-76c6977657-fx526\" (UID: \"13d60bc1-c4c8-48bc-b2cf-6befd811be20\") " pod="calico-system/calico-kube-controllers-76c6977657-fx526" Sep 16 05:32:41.473287 kubelet[3272]: I0916 05:32:41.473045 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/cc5427eb-63f9-46a9-913f-a95379cd7953-calico-apiserver-certs\") pod \"calico-apiserver-6dfcc9ff54-dmntm\" (UID: \"cc5427eb-63f9-46a9-913f-a95379cd7953\") " pod="calico-apiserver/calico-apiserver-6dfcc9ff54-dmntm" Sep 16 05:32:41.473287 kubelet[3272]: I0916 05:32:41.473231 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/905ec77c-cbec-4843-b250-4baac4cf8a0c-config\") pod \"goldmane-54d579b49d-4mpgc\" (UID: \"905ec77c-cbec-4843-b250-4baac4cf8a0c\") " pod="calico-system/goldmane-54d579b49d-4mpgc" Sep 16 05:32:41.473510 kubelet[3272]: I0916 05:32:41.473338 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5307370e-d59b-4a28-b277-586e844c8a66-whisker-backend-key-pair\") pod \"whisker-755dcdbcc5-zjjhj\" (UID: \"5307370e-d59b-4a28-b277-586e844c8a66\") " pod="calico-system/whisker-755dcdbcc5-zjjhj" Sep 16 05:32:41.473510 kubelet[3272]: I0916 05:32:41.473446 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5307370e-d59b-4a28-b277-586e844c8a66-whisker-ca-bundle\") pod \"whisker-755dcdbcc5-zjjhj\" (UID: \"5307370e-d59b-4a28-b277-586e844c8a66\") " pod="calico-system/whisker-755dcdbcc5-zjjhj" Sep 16 05:32:41.473698 kubelet[3272]: I0916 05:32:41.473546 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24cvk\" (UniqueName: \"kubernetes.io/projected/3262e0d4-7fee-4cbc-9baf-bc833d218f45-kube-api-access-24cvk\") pod \"coredns-668d6bf9bc-sgd4r\" (UID: \"3262e0d4-7fee-4cbc-9baf-bc833d218f45\") " pod="kube-system/coredns-668d6bf9bc-sgd4r" Sep 16 05:32:41.473698 kubelet[3272]: I0916 05:32:41.473650 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01b2b6da-0bd4-475d-aa4d-8dc67b3100e4-config-volume\") pod \"coredns-668d6bf9bc-gl6kf\" (UID: \"01b2b6da-0bd4-475d-aa4d-8dc67b3100e4\") " pod="kube-system/coredns-668d6bf9bc-gl6kf" Sep 16 05:32:41.473939 kubelet[3272]: I0916 05:32:41.473745 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlprw\" (UniqueName: \"kubernetes.io/projected/01b2b6da-0bd4-475d-aa4d-8dc67b3100e4-kube-api-access-wlprw\") pod \"coredns-668d6bf9bc-gl6kf\" (UID: \"01b2b6da-0bd4-475d-aa4d-8dc67b3100e4\") " pod="kube-system/coredns-668d6bf9bc-gl6kf" Sep 16 05:32:41.473939 kubelet[3272]: I0916 05:32:41.473847 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xmw9\" (UniqueName: 
\"kubernetes.io/projected/5307370e-d59b-4a28-b277-586e844c8a66-kube-api-access-2xmw9\") pod \"whisker-755dcdbcc5-zjjhj\" (UID: \"5307370e-d59b-4a28-b277-586e844c8a66\") " pod="calico-system/whisker-755dcdbcc5-zjjhj" Sep 16 05:32:41.473939 kubelet[3272]: I0916 05:32:41.473926 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t865\" (UniqueName: \"kubernetes.io/projected/13d60bc1-c4c8-48bc-b2cf-6befd811be20-kube-api-access-2t865\") pod \"calico-kube-controllers-76c6977657-fx526\" (UID: \"13d60bc1-c4c8-48bc-b2cf-6befd811be20\") " pod="calico-system/calico-kube-controllers-76c6977657-fx526" Sep 16 05:32:41.474375 kubelet[3272]: I0916 05:32:41.474014 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3262e0d4-7fee-4cbc-9baf-bc833d218f45-config-volume\") pod \"coredns-668d6bf9bc-sgd4r\" (UID: \"3262e0d4-7fee-4cbc-9baf-bc833d218f45\") " pod="kube-system/coredns-668d6bf9bc-sgd4r" Sep 16 05:32:41.474375 kubelet[3272]: I0916 05:32:41.474092 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgrz5\" (UniqueName: \"kubernetes.io/projected/cc5427eb-63f9-46a9-913f-a95379cd7953-kube-api-access-fgrz5\") pod \"calico-apiserver-6dfcc9ff54-dmntm\" (UID: \"cc5427eb-63f9-46a9-913f-a95379cd7953\") " pod="calico-apiserver/calico-apiserver-6dfcc9ff54-dmntm" Sep 16 05:32:41.474375 kubelet[3272]: I0916 05:32:41.474145 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/905ec77c-cbec-4843-b250-4baac4cf8a0c-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-4mpgc\" (UID: \"905ec77c-cbec-4843-b250-4baac4cf8a0c\") " pod="calico-system/goldmane-54d579b49d-4mpgc" Sep 16 05:32:41.474375 kubelet[3272]: I0916 05:32:41.474193 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-959hh\" (UniqueName: \"kubernetes.io/projected/31261ed2-aa62-4f40-998f-c07af6124687-kube-api-access-959hh\") pod \"calico-apiserver-6dfcc9ff54-xnr9g\" (UID: \"31261ed2-aa62-4f40-998f-c07af6124687\") " pod="calico-apiserver/calico-apiserver-6dfcc9ff54-xnr9g" Sep 16 05:32:41.474375 kubelet[3272]: I0916 05:32:41.474319 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/905ec77c-cbec-4843-b250-4baac4cf8a0c-goldmane-key-pair\") pod \"goldmane-54d579b49d-4mpgc\" (UID: \"905ec77c-cbec-4843-b250-4baac4cf8a0c\") " pod="calico-system/goldmane-54d579b49d-4mpgc" Sep 16 05:32:41.474911 kubelet[3272]: I0916 05:32:41.474404 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p6gq\" (UniqueName: \"kubernetes.io/projected/905ec77c-cbec-4843-b250-4baac4cf8a0c-kube-api-access-6p6gq\") pod \"goldmane-54d579b49d-4mpgc\" (UID: \"905ec77c-cbec-4843-b250-4baac4cf8a0c\") " pod="calico-system/goldmane-54d579b49d-4mpgc" Sep 16 05:32:41.474911 kubelet[3272]: I0916 05:32:41.474461 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/31261ed2-aa62-4f40-998f-c07af6124687-calico-apiserver-certs\") pod \"calico-apiserver-6dfcc9ff54-xnr9g\" (UID: \"31261ed2-aa62-4f40-998f-c07af6124687\") " 
pod="calico-apiserver/calico-apiserver-6dfcc9ff54-xnr9g" Sep 16 05:32:41.509091 systemd[1]: Created slice kubepods-besteffort-pod0d3912ee_3cd2_402b_ab83_a92684e5a0cd.slice - libcontainer container kubepods-besteffort-pod0d3912ee_3cd2_402b_ab83_a92684e5a0cd.slice. Sep 16 05:32:41.535086 containerd[1906]: time="2025-09-16T05:32:41.534945596Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-576jh,Uid:0d3912ee-3cd2-402b-ab83-a92684e5a0cd,Namespace:calico-system,Attempt:0,}" Sep 16 05:32:41.697103 containerd[1906]: time="2025-09-16T05:32:41.697014352Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-76c6977657-fx526,Uid:13d60bc1-c4c8-48bc-b2cf-6befd811be20,Namespace:calico-system,Attempt:0,}" Sep 16 05:32:41.703477 containerd[1906]: time="2025-09-16T05:32:41.703454826Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6dfcc9ff54-xnr9g,Uid:31261ed2-aa62-4f40-998f-c07af6124687,Namespace:calico-apiserver,Attempt:0,}" Sep 16 05:32:41.708999 containerd[1906]: time="2025-09-16T05:32:41.708966532Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6dfcc9ff54-dmntm,Uid:cc5427eb-63f9-46a9-913f-a95379cd7953,Namespace:calico-apiserver,Attempt:0,}" Sep 16 05:32:41.709457 containerd[1906]: time="2025-09-16T05:32:41.709439037Z" level=error msg="Failed to destroy network for sandbox \"2e2db20d60a87e2e8eada9ac72723c2e38b2356fa26c10fc0ed8914e729a507a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:32:41.710890 systemd[1]: run-netns-cni\x2d35e1057c\x2d4fe3\x2def27\x2d9c45\x2dd0acb5009490.mount: Deactivated successfully. 
Sep 16 05:32:41.715486 containerd[1906]: time="2025-09-16T05:32:41.715461721Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-755dcdbcc5-zjjhj,Uid:5307370e-d59b-4a28-b277-586e844c8a66,Namespace:calico-system,Attempt:0,}" Sep 16 05:32:41.721024 containerd[1906]: time="2025-09-16T05:32:41.720995143Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-4mpgc,Uid:905ec77c-cbec-4843-b250-4baac4cf8a0c,Namespace:calico-system,Attempt:0,}" Sep 16 05:32:41.723371 containerd[1906]: time="2025-09-16T05:32:41.723345144Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-576jh,Uid:0d3912ee-3cd2-402b-ab83-a92684e5a0cd,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e2db20d60a87e2e8eada9ac72723c2e38b2356fa26c10fc0ed8914e729a507a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:32:41.723469 containerd[1906]: time="2025-09-16T05:32:41.723451063Z" level=error msg="Failed to destroy network for sandbox \"268adfd35af513c2633d8b4121440f03b80b24fedbd8b4fc0755fa7cf6222cc1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:32:41.723506 kubelet[3272]: E0916 05:32:41.723460 3272 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e2db20d60a87e2e8eada9ac72723c2e38b2356fa26c10fc0ed8914e729a507a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:32:41.723506 kubelet[3272]: E0916 05:32:41.723502 3272 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e2db20d60a87e2e8eada9ac72723c2e38b2356fa26c10fc0ed8914e729a507a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-576jh" Sep 16 05:32:41.723600 kubelet[3272]: E0916 05:32:41.723516 3272 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e2db20d60a87e2e8eada9ac72723c2e38b2356fa26c10fc0ed8914e729a507a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-576jh" Sep 16 05:32:41.723600 kubelet[3272]: E0916 05:32:41.723546 3272 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-576jh_calico-system(0d3912ee-3cd2-402b-ab83-a92684e5a0cd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-576jh_calico-system(0d3912ee-3cd2-402b-ab83-a92684e5a0cd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2e2db20d60a87e2e8eada9ac72723c2e38b2356fa26c10fc0ed8914e729a507a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-576jh" podUID="0d3912ee-3cd2-402b-ab83-a92684e5a0cd" Sep 16 05:32:41.726874 containerd[1906]: time="2025-09-16T05:32:41.726852967Z" level=error msg="Failed to destroy network for sandbox \"3a0e76a94ca67412dacf011dc3c10f3685c32b4ad03bfada3102bf36c047460e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:32:41.727792 containerd[1906]: time="2025-09-16T05:32:41.727771042Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-76c6977657-fx526,Uid:13d60bc1-c4c8-48bc-b2cf-6befd811be20,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"268adfd35af513c2633d8b4121440f03b80b24fedbd8b4fc0755fa7cf6222cc1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:32:41.727919 kubelet[3272]: E0916 05:32:41.727897 3272 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"268adfd35af513c2633d8b4121440f03b80b24fedbd8b4fc0755fa7cf6222cc1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:32:41.727961 kubelet[3272]: E0916 05:32:41.727938 3272 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"268adfd35af513c2633d8b4121440f03b80b24fedbd8b4fc0755fa7cf6222cc1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-76c6977657-fx526" Sep 16 05:32:41.727961 kubelet[3272]: E0916 05:32:41.727953 3272 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"268adfd35af513c2633d8b4121440f03b80b24fedbd8b4fc0755fa7cf6222cc1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-76c6977657-fx526" Sep 16 05:32:41.728027 kubelet[3272]: E0916 05:32:41.727980 3272 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-76c6977657-fx526_calico-system(13d60bc1-c4c8-48bc-b2cf-6befd811be20)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-76c6977657-fx526_calico-system(13d60bc1-c4c8-48bc-b2cf-6befd811be20)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"268adfd35af513c2633d8b4121440f03b80b24fedbd8b4fc0755fa7cf6222cc1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-76c6977657-fx526" podUID="13d60bc1-c4c8-48bc-b2cf-6befd811be20" Sep 16 05:32:41.728079 containerd[1906]: time="2025-09-16T05:32:41.728021055Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-6dfcc9ff54-xnr9g,Uid:31261ed2-aa62-4f40-998f-c07af6124687,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a0e76a94ca67412dacf011dc3c10f3685c32b4ad03bfada3102bf36c047460e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:32:41.728131 kubelet[3272]: E0916 05:32:41.728096 3272 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a0e76a94ca67412dacf011dc3c10f3685c32b4ad03bfada3102bf36c047460e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:32:41.728131 kubelet[3272]: E0916 05:32:41.728114 3272 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a0e76a94ca67412dacf011dc3c10f3685c32b4ad03bfada3102bf36c047460e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6dfcc9ff54-xnr9g" Sep 16 05:32:41.728131 kubelet[3272]: E0916 05:32:41.728124 3272 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a0e76a94ca67412dacf011dc3c10f3685c32b4ad03bfada3102bf36c047460e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6dfcc9ff54-xnr9g" Sep 16 05:32:41.728214 kubelet[3272]: E0916 05:32:41.728141 3272 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6dfcc9ff54-xnr9g_calico-apiserver(31261ed2-aa62-4f40-998f-c07af6124687)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6dfcc9ff54-xnr9g_calico-apiserver(31261ed2-aa62-4f40-998f-c07af6124687)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3a0e76a94ca67412dacf011dc3c10f3685c32b4ad03bfada3102bf36c047460e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6dfcc9ff54-xnr9g" podUID="31261ed2-aa62-4f40-998f-c07af6124687" Sep 16 05:32:41.750439 containerd[1906]: time="2025-09-16T05:32:41.750408960Z" level=error msg="Failed to destroy network for sandbox \"4fbbc8695a26ae558d04fc3da9e599ef10a7d4ccddd799f718e55e4a745e8aa0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:32:41.750571 containerd[1906]: time="2025-09-16T05:32:41.750554212Z" level=error msg="Failed to destroy network for sandbox \"6a69f4286295241ab0c71965b74e1c546f990f746207f9bcc2da36a64cb0b7b6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 
16 05:32:41.750779 containerd[1906]: time="2025-09-16T05:32:41.750763740Z" level=error msg="Failed to destroy network for sandbox \"ac27ba0d89f6ca7e64dd4b3166d57b70b71f74c94e7132ef59e0700ed4b1eb2e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:32:41.750842 containerd[1906]: time="2025-09-16T05:32:41.750827970Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6dfcc9ff54-dmntm,Uid:cc5427eb-63f9-46a9-913f-a95379cd7953,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4fbbc8695a26ae558d04fc3da9e599ef10a7d4ccddd799f718e55e4a745e8aa0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:32:41.750979 kubelet[3272]: E0916 05:32:41.750956 3272 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4fbbc8695a26ae558d04fc3da9e599ef10a7d4ccddd799f718e55e4a745e8aa0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:32:41.751028 kubelet[3272]: E0916 05:32:41.751007 3272 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4fbbc8695a26ae558d04fc3da9e599ef10a7d4ccddd799f718e55e4a745e8aa0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6dfcc9ff54-dmntm" Sep 16 05:32:41.751061 kubelet[3272]: E0916 05:32:41.751026 3272 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4fbbc8695a26ae558d04fc3da9e599ef10a7d4ccddd799f718e55e4a745e8aa0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6dfcc9ff54-dmntm" Sep 16 05:32:41.751094 kubelet[3272]: E0916 05:32:41.751067 3272 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6dfcc9ff54-dmntm_calico-apiserver(cc5427eb-63f9-46a9-913f-a95379cd7953)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6dfcc9ff54-dmntm_calico-apiserver(cc5427eb-63f9-46a9-913f-a95379cd7953)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4fbbc8695a26ae558d04fc3da9e599ef10a7d4ccddd799f718e55e4a745e8aa0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6dfcc9ff54-dmntm" podUID="cc5427eb-63f9-46a9-913f-a95379cd7953" Sep 16 05:32:41.751142 containerd[1906]: time="2025-09-16T05:32:41.751073367Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-4mpgc,Uid:905ec77c-cbec-4843-b250-4baac4cf8a0c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"ac27ba0d89f6ca7e64dd4b3166d57b70b71f74c94e7132ef59e0700ed4b1eb2e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:32:41.751186 kubelet[3272]: E0916 05:32:41.751152 3272 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac27ba0d89f6ca7e64dd4b3166d57b70b71f74c94e7132ef59e0700ed4b1eb2e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:32:41.751186 kubelet[3272]: E0916 05:32:41.751176 3272 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac27ba0d89f6ca7e64dd4b3166d57b70b71f74c94e7132ef59e0700ed4b1eb2e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-4mpgc" Sep 16 05:32:41.751242 kubelet[3272]: E0916 05:32:41.751186 3272 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac27ba0d89f6ca7e64dd4b3166d57b70b71f74c94e7132ef59e0700ed4b1eb2e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-4mpgc" Sep 16 05:32:41.751242 kubelet[3272]: E0916 05:32:41.751203 3272 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-4mpgc_calico-system(905ec77c-cbec-4843-b250-4baac4cf8a0c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-4mpgc_calico-system(905ec77c-cbec-4843-b250-4baac4cf8a0c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ac27ba0d89f6ca7e64dd4b3166d57b70b71f74c94e7132ef59e0700ed4b1eb2e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-4mpgc" podUID="905ec77c-cbec-4843-b250-4baac4cf8a0c" Sep 16 05:32:41.751349 containerd[1906]: time="2025-09-16T05:32:41.751334290Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-755dcdbcc5-zjjhj,Uid:5307370e-d59b-4a28-b277-586e844c8a66,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a69f4286295241ab0c71965b74e1c546f990f746207f9bcc2da36a64cb0b7b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:32:41.751416 kubelet[3272]: E0916 05:32:41.751402 3272 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a69f4286295241ab0c71965b74e1c546f990f746207f9bcc2da36a64cb0b7b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
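Every RunPodSandbox failure above carries the same root cause: the Calico CNI plugin cannot stat /var/lib/calico/nodename, a file that calico-node writes (via a host mount of /var/lib/calico/) once it is up on the node, so every CNI add and delete fails until the calico-node DaemonSet pod is running. A minimal Go sketch of the check implied by the error message (not Calico's actual plugin code):

package main

import (
	"fmt"
	"os"
)

// nodenameFile is the path the error message refers to; calico-node creates it
// after it starts and has /var/lib/calico/ mounted from the host.
const nodenameFile = "/var/lib/calico/nodename"

func main() {
	data, err := os.ReadFile(nodenameFile)
	if os.IsNotExist(err) {
		// The state shown in the log: calico-node is not (yet) running,
		// so sandbox networking cannot be set up or torn down.
		fmt.Println("nodename file missing: calico-node is not ready on this host")
		return
	}
	if err != nil {
		fmt.Println("unexpected error:", err)
		return
	}
	fmt.Println("calico-node has registered this host as:", string(data))
}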
Sep 16 05:32:41.751442 kubelet[3272]: E0916 05:32:41.751423 3272 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a69f4286295241ab0c71965b74e1c546f990f746207f9bcc2da36a64cb0b7b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-755dcdbcc5-zjjhj" Sep 16 05:32:41.751442 kubelet[3272]: E0916 05:32:41.751435 3272 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a69f4286295241ab0c71965b74e1c546f990f746207f9bcc2da36a64cb0b7b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-755dcdbcc5-zjjhj" Sep 16 05:32:41.751476 kubelet[3272]: E0916 05:32:41.751455 3272 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-755dcdbcc5-zjjhj_calico-system(5307370e-d59b-4a28-b277-586e844c8a66)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-755dcdbcc5-zjjhj_calico-system(5307370e-d59b-4a28-b277-586e844c8a66)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6a69f4286295241ab0c71965b74e1c546f990f746207f9bcc2da36a64cb0b7b6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-755dcdbcc5-zjjhj" podUID="5307370e-d59b-4a28-b277-586e844c8a66" Sep 16 05:32:41.986570 containerd[1906]: time="2025-09-16T05:32:41.986329915Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gl6kf,Uid:01b2b6da-0bd4-475d-aa4d-8dc67b3100e4,Namespace:kube-system,Attempt:0,}" Sep 16 05:32:41.990912 containerd[1906]: time="2025-09-16T05:32:41.990894556Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-sgd4r,Uid:3262e0d4-7fee-4cbc-9baf-bc833d218f45,Namespace:kube-system,Attempt:0,}" Sep 16 05:32:42.013289 containerd[1906]: time="2025-09-16T05:32:42.013248381Z" level=error msg="Failed to destroy network for sandbox \"d9517f1f36963f2cd12bd198c279f03b3016525f2681bc5ed28d837f8af306e4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:32:42.013783 containerd[1906]: time="2025-09-16T05:32:42.013768797Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gl6kf,Uid:01b2b6da-0bd4-475d-aa4d-8dc67b3100e4,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d9517f1f36963f2cd12bd198c279f03b3016525f2681bc5ed28d837f8af306e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:32:42.013940 kubelet[3272]: E0916 05:32:42.013891 3272 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d9517f1f36963f2cd12bd198c279f03b3016525f2681bc5ed28d837f8af306e4\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:32:42.014027 kubelet[3272]: E0916 05:32:42.013951 3272 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d9517f1f36963f2cd12bd198c279f03b3016525f2681bc5ed28d837f8af306e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-gl6kf" Sep 16 05:32:42.014027 kubelet[3272]: E0916 05:32:42.013967 3272 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d9517f1f36963f2cd12bd198c279f03b3016525f2681bc5ed28d837f8af306e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-gl6kf" Sep 16 05:32:42.014103 kubelet[3272]: E0916 05:32:42.014000 3272 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-gl6kf_kube-system(01b2b6da-0bd4-475d-aa4d-8dc67b3100e4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-gl6kf_kube-system(01b2b6da-0bd4-475d-aa4d-8dc67b3100e4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d9517f1f36963f2cd12bd198c279f03b3016525f2681bc5ed28d837f8af306e4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-gl6kf" podUID="01b2b6da-0bd4-475d-aa4d-8dc67b3100e4" Sep 16 05:32:42.015622 containerd[1906]: time="2025-09-16T05:32:42.015576245Z" level=error msg="Failed to destroy network for sandbox \"df1b9be391768859c1e91d022a0735d5c53c7fce983d5eacac71382561b57f9e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:32:42.016098 containerd[1906]: time="2025-09-16T05:32:42.016057407Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-sgd4r,Uid:3262e0d4-7fee-4cbc-9baf-bc833d218f45,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"df1b9be391768859c1e91d022a0735d5c53c7fce983d5eacac71382561b57f9e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:32:42.016197 kubelet[3272]: E0916 05:32:42.016184 3272 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"df1b9be391768859c1e91d022a0735d5c53c7fce983d5eacac71382561b57f9e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 05:32:42.016223 kubelet[3272]: E0916 05:32:42.016209 3272 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"df1b9be391768859c1e91d022a0735d5c53c7fce983d5eacac71382561b57f9e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-sgd4r" Sep 16 05:32:42.016241 kubelet[3272]: E0916 05:32:42.016220 3272 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"df1b9be391768859c1e91d022a0735d5c53c7fce983d5eacac71382561b57f9e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-sgd4r" Sep 16 05:32:42.016259 kubelet[3272]: E0916 05:32:42.016238 3272 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-sgd4r_kube-system(3262e0d4-7fee-4cbc-9baf-bc833d218f45)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-sgd4r_kube-system(3262e0d4-7fee-4cbc-9baf-bc833d218f45)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"df1b9be391768859c1e91d022a0735d5c53c7fce983d5eacac71382561b57f9e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-sgd4r" podUID="3262e0d4-7fee-4cbc-9baf-bc833d218f45" Sep 16 05:32:42.594288 containerd[1906]: time="2025-09-16T05:32:42.594211253Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 16 05:32:42.663257 systemd[1]: run-netns-cni\x2d08d5d3d5\x2d3fd8\x2d2e4f\x2df091\x2d136e4ebf523b.mount: Deactivated successfully. Sep 16 05:32:42.663410 systemd[1]: run-netns-cni\x2d52417ed0\x2de75d\x2d8274\x2d9677\x2d5e80c2145f29.mount: Deactivated successfully. Sep 16 05:32:42.663445 systemd[1]: run-netns-cni\x2d85833585\x2d7896\x2def24\x2de35d\x2db0be9cf9fd9c.mount: Deactivated successfully. Sep 16 05:32:48.045039 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3375590676.mount: Deactivated successfully. 
Sep 16 05:32:48.070484 containerd[1906]: time="2025-09-16T05:32:48.070429150Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:48.070722 containerd[1906]: time="2025-09-16T05:32:48.070652760Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 16 05:32:48.070960 containerd[1906]: time="2025-09-16T05:32:48.070949404Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:48.071777 containerd[1906]: time="2025-09-16T05:32:48.071766336Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:48.072131 containerd[1906]: time="2025-09-16T05:32:48.072116959Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 5.477836453s" Sep 16 05:32:48.072131 containerd[1906]: time="2025-09-16T05:32:48.072132312Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 16 05:32:48.075668 containerd[1906]: time="2025-09-16T05:32:48.075620493Z" level=info msg="CreateContainer within sandbox \"960cd73907568744534ec1cd0c0a235be2b6c9addccffd645996c3ea621461c0\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 16 05:32:48.079387 containerd[1906]: time="2025-09-16T05:32:48.079342925Z" level=info msg="Container b2e5e66fd9163c13cb38673c27958704982bded282944e80efa3e48e8590ab1d: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:32:48.083271 containerd[1906]: time="2025-09-16T05:32:48.083257829Z" level=info msg="CreateContainer within sandbox \"960cd73907568744534ec1cd0c0a235be2b6c9addccffd645996c3ea621461c0\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"b2e5e66fd9163c13cb38673c27958704982bded282944e80efa3e48e8590ab1d\"" Sep 16 05:32:48.083526 containerd[1906]: time="2025-09-16T05:32:48.083474739Z" level=info msg="StartContainer for \"b2e5e66fd9163c13cb38673c27958704982bded282944e80efa3e48e8590ab1d\"" Sep 16 05:32:48.084247 containerd[1906]: time="2025-09-16T05:32:48.084234521Z" level=info msg="connecting to shim b2e5e66fd9163c13cb38673c27958704982bded282944e80efa3e48e8590ab1d" address="unix:///run/containerd/s/c3250fb547e6525fe57da265d5b74a8c89b96274841848b1764abf6032bffd80" protocol=ttrpc version=3 Sep 16 05:32:48.103486 systemd[1]: Started cri-containerd-b2e5e66fd9163c13cb38673c27958704982bded282944e80efa3e48e8590ab1d.scope - libcontainer container b2e5e66fd9163c13cb38673c27958704982bded282944e80efa3e48e8590ab1d. Sep 16 05:32:48.163739 containerd[1906]: time="2025-09-16T05:32:48.163680346Z" level=info msg="StartContainer for \"b2e5e66fd9163c13cb38673c27958704982bded282944e80efa3e48e8590ab1d\" returns successfully" Sep 16 05:32:48.239850 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 16 05:32:48.239904 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . 
All Rights Reserved. Sep 16 05:32:48.323888 kubelet[3272]: I0916 05:32:48.323864 3272 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xmw9\" (UniqueName: \"kubernetes.io/projected/5307370e-d59b-4a28-b277-586e844c8a66-kube-api-access-2xmw9\") pod \"5307370e-d59b-4a28-b277-586e844c8a66\" (UID: \"5307370e-d59b-4a28-b277-586e844c8a66\") " Sep 16 05:32:48.323888 kubelet[3272]: I0916 05:32:48.323892 3272 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5307370e-d59b-4a28-b277-586e844c8a66-whisker-ca-bundle\") pod \"5307370e-d59b-4a28-b277-586e844c8a66\" (UID: \"5307370e-d59b-4a28-b277-586e844c8a66\") " Sep 16 05:32:48.324200 kubelet[3272]: I0916 05:32:48.323907 3272 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5307370e-d59b-4a28-b277-586e844c8a66-whisker-backend-key-pair\") pod \"5307370e-d59b-4a28-b277-586e844c8a66\" (UID: \"5307370e-d59b-4a28-b277-586e844c8a66\") " Sep 16 05:32:48.324200 kubelet[3272]: I0916 05:32:48.324128 3272 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5307370e-d59b-4a28-b277-586e844c8a66-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "5307370e-d59b-4a28-b277-586e844c8a66" (UID: "5307370e-d59b-4a28-b277-586e844c8a66"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 16 05:32:48.325351 kubelet[3272]: I0916 05:32:48.325313 3272 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5307370e-d59b-4a28-b277-586e844c8a66-kube-api-access-2xmw9" (OuterVolumeSpecName: "kube-api-access-2xmw9") pod "5307370e-d59b-4a28-b277-586e844c8a66" (UID: "5307370e-d59b-4a28-b277-586e844c8a66"). InnerVolumeSpecName "kube-api-access-2xmw9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 16 05:32:48.325437 kubelet[3272]: I0916 05:32:48.325399 3272 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5307370e-d59b-4a28-b277-586e844c8a66-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "5307370e-d59b-4a28-b277-586e844c8a66" (UID: "5307370e-d59b-4a28-b277-586e844c8a66"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 16 05:32:48.424771 kubelet[3272]: I0916 05:32:48.424646 3272 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5307370e-d59b-4a28-b277-586e844c8a66-whisker-backend-key-pair\") on node \"ci-4459.0.0-n-e7bf2d745b\" DevicePath \"\"" Sep 16 05:32:48.424771 kubelet[3272]: I0916 05:32:48.424720 3272 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2xmw9\" (UniqueName: \"kubernetes.io/projected/5307370e-d59b-4a28-b277-586e844c8a66-kube-api-access-2xmw9\") on node \"ci-4459.0.0-n-e7bf2d745b\" DevicePath \"\"" Sep 16 05:32:48.424771 kubelet[3272]: I0916 05:32:48.424749 3272 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5307370e-d59b-4a28-b277-586e844c8a66-whisker-ca-bundle\") on node \"ci-4459.0.0-n-e7bf2d745b\" DevicePath \"\"" Sep 16 05:32:48.508367 systemd[1]: Removed slice kubepods-besteffort-pod5307370e_d59b_4a28_b277_586e844c8a66.slice - libcontainer container kubepods-besteffort-pod5307370e_d59b_4a28_b277_586e844c8a66.slice. Sep 16 05:32:48.653778 kubelet[3272]: I0916 05:32:48.653571 3272 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-htsjr" podStartSLOduration=1.593028192 podStartE2EDuration="17.653537869s" podCreationTimestamp="2025-09-16 05:32:31 +0000 UTC" firstStartedPulling="2025-09-16 05:32:32.011922397 +0000 UTC m=+15.582848436" lastFinishedPulling="2025-09-16 05:32:48.072432078 +0000 UTC m=+31.643358113" observedRunningTime="2025-09-16 05:32:48.653079048 +0000 UTC m=+32.224005210" watchObservedRunningTime="2025-09-16 05:32:48.653537869 +0000 UTC m=+32.224463946" Sep 16 05:32:48.718940 systemd[1]: Created slice kubepods-besteffort-pode4728b7b_7449_4e92_a28e_38d252c53d31.slice - libcontainer container kubepods-besteffort-pode4728b7b_7449_4e92_a28e_38d252c53d31.slice. 
Sep 16 05:32:48.727462 kubelet[3272]: I0916 05:32:48.727376 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e4728b7b-7449-4e92-a28e-38d252c53d31-whisker-backend-key-pair\") pod \"whisker-bdc6877b6-c2vnv\" (UID: \"e4728b7b-7449-4e92-a28e-38d252c53d31\") " pod="calico-system/whisker-bdc6877b6-c2vnv" Sep 16 05:32:48.727725 kubelet[3272]: I0916 05:32:48.727485 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4728b7b-7449-4e92-a28e-38d252c53d31-whisker-ca-bundle\") pod \"whisker-bdc6877b6-c2vnv\" (UID: \"e4728b7b-7449-4e92-a28e-38d252c53d31\") " pod="calico-system/whisker-bdc6877b6-c2vnv" Sep 16 05:32:48.727725 kubelet[3272]: I0916 05:32:48.727677 3272 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx2nd\" (UniqueName: \"kubernetes.io/projected/e4728b7b-7449-4e92-a28e-38d252c53d31-kube-api-access-vx2nd\") pod \"whisker-bdc6877b6-c2vnv\" (UID: \"e4728b7b-7449-4e92-a28e-38d252c53d31\") " pod="calico-system/whisker-bdc6877b6-c2vnv" Sep 16 05:32:49.026669 containerd[1906]: time="2025-09-16T05:32:49.026462601Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-bdc6877b6-c2vnv,Uid:e4728b7b-7449-4e92-a28e-38d252c53d31,Namespace:calico-system,Attempt:0,}" Sep 16 05:32:49.048236 systemd[1]: var-lib-kubelet-pods-5307370e\x2dd59b\x2d4a28\x2db277\x2d586e844c8a66-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d2xmw9.mount: Deactivated successfully. Sep 16 05:32:49.048306 systemd[1]: var-lib-kubelet-pods-5307370e\x2dd59b\x2d4a28\x2db277\x2d586e844c8a66-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
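The three VerifyControllerAttachedVolume entries above describe the volumes of the replacement whisker-bdc6877b6-c2vnv pod: a Secret-backed TLS key pair, a ConfigMap CA bundle, and the automatically injected projected service-account token (kube-api-access-vx2nd). A sketch of how the first two are declared using the Kubernetes Go types, assuming the referenced Secret and ConfigMap carry the same names as the volumes (the log only shows the volume names, so the object names here are assumptions):

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	vols := []corev1.Volume{
		{
			// Matches volume "whisker-backend-key-pair" in the reconciler entry.
			Name: "whisker-backend-key-pair",
			VolumeSource: corev1.VolumeSource{
				Secret: &corev1.SecretVolumeSource{SecretName: "whisker-backend-key-pair"},
			},
		},
		{
			// Matches volume "whisker-ca-bundle".
			Name: "whisker-ca-bundle",
			VolumeSource: corev1.VolumeSource{
				ConfigMap: &corev1.ConfigMapVolumeSource{
					LocalObjectReference: corev1.LocalObjectReference{Name: "whisker-ca-bundle"},
				},
			},
		},
		// kube-api-access-vx2nd is the projected token volume kubelet adds on
		// its own; it does not appear in the pod manifest.
	}
	for _, v := range vols {
		fmt.Println("volume:", v.Name)
	}
}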
Sep 16 05:32:49.079629 systemd-networkd[1822]: cali8e2d46629f9: Link UP Sep 16 05:32:49.079757 systemd-networkd[1822]: cali8e2d46629f9: Gained carrier Sep 16 05:32:49.085149 containerd[1906]: 2025-09-16 05:32:49.039 [INFO][4715] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 16 05:32:49.085149 containerd[1906]: 2025-09-16 05:32:49.045 [INFO][4715] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.0.0--n--e7bf2d745b-k8s-whisker--bdc6877b6--c2vnv-eth0 whisker-bdc6877b6- calico-system e4728b7b-7449-4e92-a28e-38d252c53d31 889 0 2025-09-16 05:32:48 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:bdc6877b6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459.0.0-n-e7bf2d745b whisker-bdc6877b6-c2vnv eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali8e2d46629f9 [] [] }} ContainerID="d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8" Namespace="calico-system" Pod="whisker-bdc6877b6-c2vnv" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-whisker--bdc6877b6--c2vnv-" Sep 16 05:32:49.085149 containerd[1906]: 2025-09-16 05:32:49.045 [INFO][4715] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8" Namespace="calico-system" Pod="whisker-bdc6877b6-c2vnv" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-whisker--bdc6877b6--c2vnv-eth0" Sep 16 05:32:49.085149 containerd[1906]: 2025-09-16 05:32:49.058 [INFO][4738] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8" HandleID="k8s-pod-network.d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8" Workload="ci--4459.0.0--n--e7bf2d745b-k8s-whisker--bdc6877b6--c2vnv-eth0" Sep 16 05:32:49.085421 containerd[1906]: 2025-09-16 05:32:49.058 [INFO][4738] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8" HandleID="k8s-pod-network.d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8" Workload="ci--4459.0.0--n--e7bf2d745b-k8s-whisker--bdc6877b6--c2vnv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002e36b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.0.0-n-e7bf2d745b", "pod":"whisker-bdc6877b6-c2vnv", "timestamp":"2025-09-16 05:32:49.05838042 +0000 UTC"}, Hostname:"ci-4459.0.0-n-e7bf2d745b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 05:32:49.085421 containerd[1906]: 2025-09-16 05:32:49.058 [INFO][4738] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 05:32:49.085421 containerd[1906]: 2025-09-16 05:32:49.058 [INFO][4738] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 16 05:32:49.085421 containerd[1906]: 2025-09-16 05:32:49.058 [INFO][4738] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.0.0-n-e7bf2d745b' Sep 16 05:32:49.085421 containerd[1906]: 2025-09-16 05:32:49.062 [INFO][4738] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:49.085421 containerd[1906]: 2025-09-16 05:32:49.065 [INFO][4738] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:49.085421 containerd[1906]: 2025-09-16 05:32:49.067 [INFO][4738] ipam/ipam.go 511: Trying affinity for 192.168.46.64/26 host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:49.085421 containerd[1906]: 2025-09-16 05:32:49.068 [INFO][4738] ipam/ipam.go 158: Attempting to load block cidr=192.168.46.64/26 host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:49.085421 containerd[1906]: 2025-09-16 05:32:49.069 [INFO][4738] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.46.64/26 host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:49.085558 containerd[1906]: 2025-09-16 05:32:49.069 [INFO][4738] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.46.64/26 handle="k8s-pod-network.d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:49.085558 containerd[1906]: 2025-09-16 05:32:49.069 [INFO][4738] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8 Sep 16 05:32:49.085558 containerd[1906]: 2025-09-16 05:32:49.071 [INFO][4738] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.46.64/26 handle="k8s-pod-network.d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:49.085558 containerd[1906]: 2025-09-16 05:32:49.073 [INFO][4738] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.46.65/26] block=192.168.46.64/26 handle="k8s-pod-network.d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:49.085558 containerd[1906]: 2025-09-16 05:32:49.073 [INFO][4738] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.46.65/26] handle="k8s-pod-network.d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:49.085558 containerd[1906]: 2025-09-16 05:32:49.074 [INFO][4738] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
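The ipam entries above show the CNI plugin taking the host-wide IPAM lock, confirming this node's affinity for the 192.168.46.64/26 block, and claiming 192.168.46.65 from it for the whisker pod. A standalone sketch of the block arithmetic with net/netip (just the address math, not Calico's IPAM logic):

package main

import (
	"fmt"
	"net/netip"
)

func main() {
	// Block and claimed address from the log above.
	block := netip.MustParsePrefix("192.168.46.64/26")
	claimed := netip.MustParseAddr("192.168.46.65")

	fmt.Println("claimed address inside block:", block.Contains(claimed)) // true

	// A /26 spans 64 candidate addresses starting at the masked base:
	// 192.168.46.64 .. 192.168.46.127.
	first := block.Masked().Addr()
	last, count := first, 0
	for a := first; block.Contains(a); a = a.Next() {
		last = a
		count++
	}
	fmt.Printf("block %s holds %d addresses: %s .. %s\n", block, count, first, last)
}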
Sep 16 05:32:49.085558 containerd[1906]: 2025-09-16 05:32:49.074 [INFO][4738] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.46.65/26] IPv6=[] ContainerID="d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8" HandleID="k8s-pod-network.d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8" Workload="ci--4459.0.0--n--e7bf2d745b-k8s-whisker--bdc6877b6--c2vnv-eth0" Sep 16 05:32:49.085653 containerd[1906]: 2025-09-16 05:32:49.075 [INFO][4715] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8" Namespace="calico-system" Pod="whisker-bdc6877b6-c2vnv" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-whisker--bdc6877b6--c2vnv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--e7bf2d745b-k8s-whisker--bdc6877b6--c2vnv-eth0", GenerateName:"whisker-bdc6877b6-", Namespace:"calico-system", SelfLink:"", UID:"e4728b7b-7449-4e92-a28e-38d252c53d31", ResourceVersion:"889", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 32, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"bdc6877b6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-e7bf2d745b", ContainerID:"", Pod:"whisker-bdc6877b6-c2vnv", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.46.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali8e2d46629f9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:32:49.085653 containerd[1906]: 2025-09-16 05:32:49.075 [INFO][4715] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.46.65/32] ContainerID="d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8" Namespace="calico-system" Pod="whisker-bdc6877b6-c2vnv" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-whisker--bdc6877b6--c2vnv-eth0" Sep 16 05:32:49.085700 containerd[1906]: 2025-09-16 05:32:49.075 [INFO][4715] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8e2d46629f9 ContainerID="d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8" Namespace="calico-system" Pod="whisker-bdc6877b6-c2vnv" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-whisker--bdc6877b6--c2vnv-eth0" Sep 16 05:32:49.085700 containerd[1906]: 2025-09-16 05:32:49.079 [INFO][4715] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8" Namespace="calico-system" Pod="whisker-bdc6877b6-c2vnv" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-whisker--bdc6877b6--c2vnv-eth0" Sep 16 05:32:49.085733 containerd[1906]: 2025-09-16 05:32:49.079 [INFO][4715] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8" Namespace="calico-system" 
Pod="whisker-bdc6877b6-c2vnv" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-whisker--bdc6877b6--c2vnv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--e7bf2d745b-k8s-whisker--bdc6877b6--c2vnv-eth0", GenerateName:"whisker-bdc6877b6-", Namespace:"calico-system", SelfLink:"", UID:"e4728b7b-7449-4e92-a28e-38d252c53d31", ResourceVersion:"889", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 32, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"bdc6877b6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-e7bf2d745b", ContainerID:"d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8", Pod:"whisker-bdc6877b6-c2vnv", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.46.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali8e2d46629f9", MAC:"86:c8:44:99:9f:75", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:32:49.085769 containerd[1906]: 2025-09-16 05:32:49.083 [INFO][4715] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8" Namespace="calico-system" Pod="whisker-bdc6877b6-c2vnv" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-whisker--bdc6877b6--c2vnv-eth0" Sep 16 05:32:49.093508 containerd[1906]: time="2025-09-16T05:32:49.093480245Z" level=info msg="connecting to shim d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8" address="unix:///run/containerd/s/31ef52aa78ba8cc79b12c88fef8b9aaa2655ac5d84f4ec42fc63bd2c132450c5" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:32:49.118377 systemd[1]: Started cri-containerd-d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8.scope - libcontainer container d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8. 
Sep 16 05:32:49.194092 containerd[1906]: time="2025-09-16T05:32:49.194069831Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-bdc6877b6-c2vnv,Uid:e4728b7b-7449-4e92-a28e-38d252c53d31,Namespace:calico-system,Attempt:0,} returns sandbox id \"d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8\"" Sep 16 05:32:49.194746 containerd[1906]: time="2025-09-16T05:32:49.194736166Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 16 05:32:49.587735 systemd-networkd[1822]: vxlan.calico: Link UP Sep 16 05:32:49.587738 systemd-networkd[1822]: vxlan.calico: Gained carrier Sep 16 05:32:49.621777 kubelet[3272]: I0916 05:32:49.621748 3272 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 05:32:50.494805 kubelet[3272]: I0916 05:32:50.494747 3272 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5307370e-d59b-4a28-b277-586e844c8a66" path="/var/lib/kubelet/pods/5307370e-d59b-4a28-b277-586e844c8a66/volumes" Sep 16 05:32:50.789114 systemd-networkd[1822]: cali8e2d46629f9: Gained IPv6LL Sep 16 05:32:50.831278 containerd[1906]: time="2025-09-16T05:32:50.831222042Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:50.831537 containerd[1906]: time="2025-09-16T05:32:50.831340383Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 16 05:32:50.831738 containerd[1906]: time="2025-09-16T05:32:50.831699620Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:50.832819 containerd[1906]: time="2025-09-16T05:32:50.832781014Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:50.833218 containerd[1906]: time="2025-09-16T05:32:50.833193734Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.638443041s" Sep 16 05:32:50.833218 containerd[1906]: time="2025-09-16T05:32:50.833208852Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 16 05:32:50.834283 containerd[1906]: time="2025-09-16T05:32:50.834267001Z" level=info msg="CreateContainer within sandbox \"d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 16 05:32:50.837318 containerd[1906]: time="2025-09-16T05:32:50.837283152Z" level=info msg="Container 3aac2eed9c49e7f93b8d0c862cdf6b0ccabf35065150124a97585d26fe105b9d: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:32:50.841108 containerd[1906]: time="2025-09-16T05:32:50.841067818Z" level=info msg="CreateContainer within sandbox \"d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"3aac2eed9c49e7f93b8d0c862cdf6b0ccabf35065150124a97585d26fe105b9d\"" Sep 16 
05:32:50.841427 containerd[1906]: time="2025-09-16T05:32:50.841416031Z" level=info msg="StartContainer for \"3aac2eed9c49e7f93b8d0c862cdf6b0ccabf35065150124a97585d26fe105b9d\"" Sep 16 05:32:50.842314 containerd[1906]: time="2025-09-16T05:32:50.842274141Z" level=info msg="connecting to shim 3aac2eed9c49e7f93b8d0c862cdf6b0ccabf35065150124a97585d26fe105b9d" address="unix:///run/containerd/s/31ef52aa78ba8cc79b12c88fef8b9aaa2655ac5d84f4ec42fc63bd2c132450c5" protocol=ttrpc version=3 Sep 16 05:32:50.853131 systemd-networkd[1822]: vxlan.calico: Gained IPv6LL Sep 16 05:32:50.859293 systemd[1]: Started cri-containerd-3aac2eed9c49e7f93b8d0c862cdf6b0ccabf35065150124a97585d26fe105b9d.scope - libcontainer container 3aac2eed9c49e7f93b8d0c862cdf6b0ccabf35065150124a97585d26fe105b9d. Sep 16 05:32:50.885769 containerd[1906]: time="2025-09-16T05:32:50.885722803Z" level=info msg="StartContainer for \"3aac2eed9c49e7f93b8d0c862cdf6b0ccabf35065150124a97585d26fe105b9d\" returns successfully" Sep 16 05:32:50.886238 containerd[1906]: time="2025-09-16T05:32:50.886226293Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 16 05:32:53.432170 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3822925171.mount: Deactivated successfully. Sep 16 05:32:53.436816 containerd[1906]: time="2025-09-16T05:32:53.436769351Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:53.437053 containerd[1906]: time="2025-09-16T05:32:53.436993876Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 16 05:32:53.437413 containerd[1906]: time="2025-09-16T05:32:53.437370739Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:53.438270 containerd[1906]: time="2025-09-16T05:32:53.438231367Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:53.438957 containerd[1906]: time="2025-09-16T05:32:53.438915154Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 2.552672927s" Sep 16 05:32:53.438957 containerd[1906]: time="2025-09-16T05:32:53.438931498Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 16 05:32:53.439782 containerd[1906]: time="2025-09-16T05:32:53.439770752Z" level=info msg="CreateContainer within sandbox \"d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 16 05:32:53.442267 containerd[1906]: time="2025-09-16T05:32:53.442253499Z" level=info msg="Container 0d2ea5b2b24e250a4b2512e6640269828493295304b8ba5c7d3136d2999a394a: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:32:53.464725 containerd[1906]: time="2025-09-16T05:32:53.464668334Z" level=info 
msg="CreateContainer within sandbox \"d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"0d2ea5b2b24e250a4b2512e6640269828493295304b8ba5c7d3136d2999a394a\"" Sep 16 05:32:53.465114 containerd[1906]: time="2025-09-16T05:32:53.465055824Z" level=info msg="StartContainer for \"0d2ea5b2b24e250a4b2512e6640269828493295304b8ba5c7d3136d2999a394a\"" Sep 16 05:32:53.466225 containerd[1906]: time="2025-09-16T05:32:53.466164351Z" level=info msg="connecting to shim 0d2ea5b2b24e250a4b2512e6640269828493295304b8ba5c7d3136d2999a394a" address="unix:///run/containerd/s/31ef52aa78ba8cc79b12c88fef8b9aaa2655ac5d84f4ec42fc63bd2c132450c5" protocol=ttrpc version=3 Sep 16 05:32:53.494739 containerd[1906]: time="2025-09-16T05:32:53.494611631Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6dfcc9ff54-xnr9g,Uid:31261ed2-aa62-4f40-998f-c07af6124687,Namespace:calico-apiserver,Attempt:0,}" Sep 16 05:32:53.494970 containerd[1906]: time="2025-09-16T05:32:53.494780581Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-76c6977657-fx526,Uid:13d60bc1-c4c8-48bc-b2cf-6befd811be20,Namespace:calico-system,Attempt:0,}" Sep 16 05:32:53.495187 containerd[1906]: time="2025-09-16T05:32:53.495111029Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6dfcc9ff54-dmntm,Uid:cc5427eb-63f9-46a9-913f-a95379cd7953,Namespace:calico-apiserver,Attempt:0,}" Sep 16 05:32:53.495417 systemd[1]: Started cri-containerd-0d2ea5b2b24e250a4b2512e6640269828493295304b8ba5c7d3136d2999a394a.scope - libcontainer container 0d2ea5b2b24e250a4b2512e6640269828493295304b8ba5c7d3136d2999a394a. Sep 16 05:32:53.523791 containerd[1906]: time="2025-09-16T05:32:53.523771098Z" level=info msg="StartContainer for \"0d2ea5b2b24e250a4b2512e6640269828493295304b8ba5c7d3136d2999a394a\" returns successfully" Sep 16 05:32:53.547265 systemd-networkd[1822]: cali6c198ccf282: Link UP Sep 16 05:32:53.547421 systemd-networkd[1822]: cali6c198ccf282: Gained carrier Sep 16 05:32:53.552396 containerd[1906]: 2025-09-16 05:32:53.514 [INFO][5165] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--dmntm-eth0 calico-apiserver-6dfcc9ff54- calico-apiserver cc5427eb-63f9-46a9-913f-a95379cd7953 825 0 2025-09-16 05:32:29 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6dfcc9ff54 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459.0.0-n-e7bf2d745b calico-apiserver-6dfcc9ff54-dmntm eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali6c198ccf282 [] [] }} ContainerID="247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a" Namespace="calico-apiserver" Pod="calico-apiserver-6dfcc9ff54-dmntm" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--dmntm-" Sep 16 05:32:53.552396 containerd[1906]: 2025-09-16 05:32:53.514 [INFO][5165] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a" Namespace="calico-apiserver" Pod="calico-apiserver-6dfcc9ff54-dmntm" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--dmntm-eth0" Sep 16 05:32:53.552396 containerd[1906]: 
2025-09-16 05:32:53.528 [INFO][5222] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a" HandleID="k8s-pod-network.247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a" Workload="ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--dmntm-eth0" Sep 16 05:32:53.552560 containerd[1906]: 2025-09-16 05:32:53.528 [INFO][5222] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a" HandleID="k8s-pod-network.247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a" Workload="ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--dmntm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f7b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459.0.0-n-e7bf2d745b", "pod":"calico-apiserver-6dfcc9ff54-dmntm", "timestamp":"2025-09-16 05:32:53.528025153 +0000 UTC"}, Hostname:"ci-4459.0.0-n-e7bf2d745b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 05:32:53.552560 containerd[1906]: 2025-09-16 05:32:53.528 [INFO][5222] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 05:32:53.552560 containerd[1906]: 2025-09-16 05:32:53.528 [INFO][5222] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 05:32:53.552560 containerd[1906]: 2025-09-16 05:32:53.528 [INFO][5222] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.0.0-n-e7bf2d745b' Sep 16 05:32:53.552560 containerd[1906]: 2025-09-16 05:32:53.532 [INFO][5222] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:53.552560 containerd[1906]: 2025-09-16 05:32:53.534 [INFO][5222] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:53.552560 containerd[1906]: 2025-09-16 05:32:53.537 [INFO][5222] ipam/ipam.go 511: Trying affinity for 192.168.46.64/26 host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:53.552560 containerd[1906]: 2025-09-16 05:32:53.538 [INFO][5222] ipam/ipam.go 158: Attempting to load block cidr=192.168.46.64/26 host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:53.552560 containerd[1906]: 2025-09-16 05:32:53.539 [INFO][5222] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.46.64/26 host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:53.552808 containerd[1906]: 2025-09-16 05:32:53.539 [INFO][5222] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.46.64/26 handle="k8s-pod-network.247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:53.552808 containerd[1906]: 2025-09-16 05:32:53.540 [INFO][5222] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a Sep 16 05:32:53.552808 containerd[1906]: 2025-09-16 05:32:53.542 [INFO][5222] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.46.64/26 handle="k8s-pod-network.247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:53.552808 containerd[1906]: 2025-09-16 05:32:53.545 [INFO][5222] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.46.66/26] 
block=192.168.46.64/26 handle="k8s-pod-network.247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:53.552808 containerd[1906]: 2025-09-16 05:32:53.545 [INFO][5222] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.46.66/26] handle="k8s-pod-network.247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:53.552808 containerd[1906]: 2025-09-16 05:32:53.545 [INFO][5222] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 16 05:32:53.552808 containerd[1906]: 2025-09-16 05:32:53.545 [INFO][5222] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.46.66/26] IPv6=[] ContainerID="247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a" HandleID="k8s-pod-network.247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a" Workload="ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--dmntm-eth0" Sep 16 05:32:53.552927 containerd[1906]: 2025-09-16 05:32:53.546 [INFO][5165] cni-plugin/k8s.go 418: Populated endpoint ContainerID="247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a" Namespace="calico-apiserver" Pod="calico-apiserver-6dfcc9ff54-dmntm" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--dmntm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--dmntm-eth0", GenerateName:"calico-apiserver-6dfcc9ff54-", Namespace:"calico-apiserver", SelfLink:"", UID:"cc5427eb-63f9-46a9-913f-a95379cd7953", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 32, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6dfcc9ff54", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-e7bf2d745b", ContainerID:"", Pod:"calico-apiserver-6dfcc9ff54-dmntm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.46.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali6c198ccf282", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:32:53.552965 containerd[1906]: 2025-09-16 05:32:53.546 [INFO][5165] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.46.66/32] ContainerID="247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a" Namespace="calico-apiserver" Pod="calico-apiserver-6dfcc9ff54-dmntm" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--dmntm-eth0" Sep 16 05:32:53.552965 containerd[1906]: 2025-09-16 05:32:53.546 [INFO][5165] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6c198ccf282 ContainerID="247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a" Namespace="calico-apiserver" Pod="calico-apiserver-6dfcc9ff54-dmntm" 
WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--dmntm-eth0" Sep 16 05:32:53.552965 containerd[1906]: 2025-09-16 05:32:53.547 [INFO][5165] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a" Namespace="calico-apiserver" Pod="calico-apiserver-6dfcc9ff54-dmntm" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--dmntm-eth0" Sep 16 05:32:53.553033 containerd[1906]: 2025-09-16 05:32:53.547 [INFO][5165] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a" Namespace="calico-apiserver" Pod="calico-apiserver-6dfcc9ff54-dmntm" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--dmntm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--dmntm-eth0", GenerateName:"calico-apiserver-6dfcc9ff54-", Namespace:"calico-apiserver", SelfLink:"", UID:"cc5427eb-63f9-46a9-913f-a95379cd7953", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 32, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6dfcc9ff54", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-e7bf2d745b", ContainerID:"247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a", Pod:"calico-apiserver-6dfcc9ff54-dmntm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.46.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali6c198ccf282", MAC:"de:87:85:d3:4a:71", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:32:53.553071 containerd[1906]: 2025-09-16 05:32:53.551 [INFO][5165] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a" Namespace="calico-apiserver" Pod="calico-apiserver-6dfcc9ff54-dmntm" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--dmntm-eth0" Sep 16 05:32:53.560075 containerd[1906]: time="2025-09-16T05:32:53.560021186Z" level=info msg="connecting to shim 247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a" address="unix:///run/containerd/s/f92646748b52291b966ec74f18ca9e65cee46e9bec5d16337e2fbb6662f79280" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:32:53.580535 systemd[1]: Started cri-containerd-247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a.scope - libcontainer container 247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a. 
Sep 16 05:32:53.627034 containerd[1906]: time="2025-09-16T05:32:53.627010137Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6dfcc9ff54-dmntm,Uid:cc5427eb-63f9-46a9-913f-a95379cd7953,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a\"" Sep 16 05:32:53.627632 containerd[1906]: time="2025-09-16T05:32:53.627621006Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 16 05:32:53.651958 systemd-networkd[1822]: cali28a419e89cb: Link UP Sep 16 05:32:53.652141 systemd-networkd[1822]: cali28a419e89cb: Gained carrier Sep 16 05:32:53.656498 kubelet[3272]: I0916 05:32:53.656456 3272 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-bdc6877b6-c2vnv" podStartSLOduration=1.41182207 podStartE2EDuration="5.656441021s" podCreationTimestamp="2025-09-16 05:32:48 +0000 UTC" firstStartedPulling="2025-09-16 05:32:49.194621875 +0000 UTC m=+32.765547910" lastFinishedPulling="2025-09-16 05:32:53.439240826 +0000 UTC m=+37.010166861" observedRunningTime="2025-09-16 05:32:53.647221402 +0000 UTC m=+37.218147442" watchObservedRunningTime="2025-09-16 05:32:53.656441021 +0000 UTC m=+37.227367055" Sep 16 05:32:53.657339 containerd[1906]: 2025-09-16 05:32:53.513 [INFO][5152] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--xnr9g-eth0 calico-apiserver-6dfcc9ff54- calico-apiserver 31261ed2-aa62-4f40-998f-c07af6124687 819 0 2025-09-16 05:32:29 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6dfcc9ff54 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459.0.0-n-e7bf2d745b calico-apiserver-6dfcc9ff54-xnr9g eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali28a419e89cb [] [] }} ContainerID="d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b" Namespace="calico-apiserver" Pod="calico-apiserver-6dfcc9ff54-xnr9g" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--xnr9g-" Sep 16 05:32:53.657339 containerd[1906]: 2025-09-16 05:32:53.513 [INFO][5152] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b" Namespace="calico-apiserver" Pod="calico-apiserver-6dfcc9ff54-xnr9g" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--xnr9g-eth0" Sep 16 05:32:53.657339 containerd[1906]: 2025-09-16 05:32:53.528 [INFO][5219] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b" HandleID="k8s-pod-network.d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b" Workload="ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--xnr9g-eth0" Sep 16 05:32:53.657501 containerd[1906]: 2025-09-16 05:32:53.528 [INFO][5219] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b" HandleID="k8s-pod-network.d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b" Workload="ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--xnr9g-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000351110), 
Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459.0.0-n-e7bf2d745b", "pod":"calico-apiserver-6dfcc9ff54-xnr9g", "timestamp":"2025-09-16 05:32:53.528025157 +0000 UTC"}, Hostname:"ci-4459.0.0-n-e7bf2d745b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 05:32:53.657501 containerd[1906]: 2025-09-16 05:32:53.528 [INFO][5219] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 05:32:53.657501 containerd[1906]: 2025-09-16 05:32:53.545 [INFO][5219] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 05:32:53.657501 containerd[1906]: 2025-09-16 05:32:53.545 [INFO][5219] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.0.0-n-e7bf2d745b' Sep 16 05:32:53.657501 containerd[1906]: 2025-09-16 05:32:53.634 [INFO][5219] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:53.657501 containerd[1906]: 2025-09-16 05:32:53.637 [INFO][5219] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:53.657501 containerd[1906]: 2025-09-16 05:32:53.640 [INFO][5219] ipam/ipam.go 511: Trying affinity for 192.168.46.64/26 host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:53.657501 containerd[1906]: 2025-09-16 05:32:53.641 [INFO][5219] ipam/ipam.go 158: Attempting to load block cidr=192.168.46.64/26 host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:53.657501 containerd[1906]: 2025-09-16 05:32:53.643 [INFO][5219] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.46.64/26 host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:53.657694 containerd[1906]: 2025-09-16 05:32:53.643 [INFO][5219] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.46.64/26 handle="k8s-pod-network.d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:53.657694 containerd[1906]: 2025-09-16 05:32:53.644 [INFO][5219] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b Sep 16 05:32:53.657694 containerd[1906]: 2025-09-16 05:32:53.646 [INFO][5219] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.46.64/26 handle="k8s-pod-network.d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:53.657694 containerd[1906]: 2025-09-16 05:32:53.650 [INFO][5219] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.46.67/26] block=192.168.46.64/26 handle="k8s-pod-network.d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:53.657694 containerd[1906]: 2025-09-16 05:32:53.650 [INFO][5219] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.46.67/26] handle="k8s-pod-network.d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:53.657694 containerd[1906]: 2025-09-16 05:32:53.650 [INFO][5219] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
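The kubelet pod_startup_latency_tracker entry above for whisker-bdc6877b6-c2vnv can be reproduced from its own timestamps: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling) from that. A small stdlib-only Go check using the values copied from that log line; mustParse is just a tiny parsing helper added here.

package main

import (
	"fmt"
	"time"
)

// mustParse parses timestamps in the kubelet's printed form,
// e.g. "2025-09-16 05:32:53.647221402 +0000 UTC".
func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-09-16 05:32:48 +0000 UTC")
	firstPull := mustParse("2025-09-16 05:32:49.194621875 +0000 UTC")
	lastPull := mustParse("2025-09-16 05:32:53.439240826 +0000 UTC")
	running := mustParse("2025-09-16 05:32:53.656441021 +0000 UTC")

	e2e := running.Sub(created)          // podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // podStartSLOduration: E2E minus time spent pulling images

	fmt.Println("E2E:", e2e) // 5.656441021s, as logged
	fmt.Println("SLO:", slo) // 1.41182207s, matching podStartSLOduration=1.41182207
}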
Sep 16 05:32:53.657694 containerd[1906]: 2025-09-16 05:32:53.650 [INFO][5219] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.46.67/26] IPv6=[] ContainerID="d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b" HandleID="k8s-pod-network.d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b" Workload="ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--xnr9g-eth0" Sep 16 05:32:53.657800 containerd[1906]: 2025-09-16 05:32:53.651 [INFO][5152] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b" Namespace="calico-apiserver" Pod="calico-apiserver-6dfcc9ff54-xnr9g" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--xnr9g-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--xnr9g-eth0", GenerateName:"calico-apiserver-6dfcc9ff54-", Namespace:"calico-apiserver", SelfLink:"", UID:"31261ed2-aa62-4f40-998f-c07af6124687", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 32, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6dfcc9ff54", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-e7bf2d745b", ContainerID:"", Pod:"calico-apiserver-6dfcc9ff54-xnr9g", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.46.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali28a419e89cb", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:32:53.657839 containerd[1906]: 2025-09-16 05:32:53.651 [INFO][5152] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.46.67/32] ContainerID="d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b" Namespace="calico-apiserver" Pod="calico-apiserver-6dfcc9ff54-xnr9g" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--xnr9g-eth0" Sep 16 05:32:53.657839 containerd[1906]: 2025-09-16 05:32:53.651 [INFO][5152] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali28a419e89cb ContainerID="d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b" Namespace="calico-apiserver" Pod="calico-apiserver-6dfcc9ff54-xnr9g" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--xnr9g-eth0" Sep 16 05:32:53.657839 containerd[1906]: 2025-09-16 05:32:53.652 [INFO][5152] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b" Namespace="calico-apiserver" Pod="calico-apiserver-6dfcc9ff54-xnr9g" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--xnr9g-eth0" Sep 16 05:32:53.657889 containerd[1906]: 2025-09-16 05:32:53.652 
[INFO][5152] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b" Namespace="calico-apiserver" Pod="calico-apiserver-6dfcc9ff54-xnr9g" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--xnr9g-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--xnr9g-eth0", GenerateName:"calico-apiserver-6dfcc9ff54-", Namespace:"calico-apiserver", SelfLink:"", UID:"31261ed2-aa62-4f40-998f-c07af6124687", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 32, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6dfcc9ff54", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-e7bf2d745b", ContainerID:"d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b", Pod:"calico-apiserver-6dfcc9ff54-xnr9g", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.46.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali28a419e89cb", MAC:"be:be:f7:fd:ed:89", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:32:53.657941 containerd[1906]: 2025-09-16 05:32:53.656 [INFO][5152] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b" Namespace="calico-apiserver" Pod="calico-apiserver-6dfcc9ff54-xnr9g" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-calico--apiserver--6dfcc9ff54--xnr9g-eth0" Sep 16 05:32:53.670284 containerd[1906]: time="2025-09-16T05:32:53.670259424Z" level=info msg="connecting to shim d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b" address="unix:///run/containerd/s/1f21acd60fbd0ec56a64c866683105856b0b5de688f11b0d88a52f1f2b78774b" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:32:53.694624 systemd[1]: Started cri-containerd-d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b.scope - libcontainer container d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b. 
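Each "Added Mac, interface name, and active container ID to endpoint" entry above packs the useful result of the CNI ADD (container ID, pod name, pod IP, host-side interface, MAC) into one long struct dump. Below is a hedged, stdlib-only sketch of pulling those fields back out with regular expressions; the entry constant is a shortened fragment of the xnr9g line above, and the patterns and the first helper are written only against this exact text, not as a general containerd log parser.

package main

import (
	"fmt"
	"regexp"
)

// A fragment of the endpoint dump above, shortened to the fields we extract.
const entry = `ContainerID:"d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b", Pod:"calico-apiserver-6dfcc9ff54-xnr9g", ` +
	`IPNetworks:[]string{"192.168.46.67/32"}, InterfaceName:"cali28a419e89cb", MAC:"be:be:f7:fd:ed:89"`

var (
	reContainer = regexp.MustCompile(`ContainerID:"([0-9a-f]{64})"`)
	rePod       = regexp.MustCompile(`Pod:"([^"]+)"`)
	reIP        = regexp.MustCompile(`IPNetworks:\[\]string\{"([^"]+)"\}`)
	reIface     = regexp.MustCompile(`InterfaceName:"(cali[0-9a-f]+)"`)
	reMAC       = regexp.MustCompile(`MAC:"([0-9a-f:]{17})"`)
)

// first returns the first capture group of re in s, or a placeholder.
func first(re *regexp.Regexp, s string) string {
	if m := re.FindStringSubmatch(s); m != nil {
		return m[1]
	}
	return "<missing>"
}

func main() {
	fmt.Println("container:", first(reContainer, entry))
	fmt.Println("pod:      ", first(rePod, entry))
	fmt.Println("ip:       ", first(reIP, entry))
	fmt.Println("iface:    ", first(reIface, entry))
	fmt.Println("mac:      ", first(reMAC, entry))
}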
Sep 16 05:32:53.767626 systemd-networkd[1822]: califb6674eb113: Link UP Sep 16 05:32:53.767850 systemd-networkd[1822]: califb6674eb113: Gained carrier Sep 16 05:32:53.776650 containerd[1906]: 2025-09-16 05:32:53.514 [INFO][5158] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.0.0--n--e7bf2d745b-k8s-calico--kube--controllers--76c6977657--fx526-eth0 calico-kube-controllers-76c6977657- calico-system 13d60bc1-c4c8-48bc-b2cf-6befd811be20 821 0 2025-09-16 05:32:32 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:76c6977657 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459.0.0-n-e7bf2d745b calico-kube-controllers-76c6977657-fx526 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] califb6674eb113 [] [] }} ContainerID="5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0" Namespace="calico-system" Pod="calico-kube-controllers-76c6977657-fx526" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-calico--kube--controllers--76c6977657--fx526-" Sep 16 05:32:53.776650 containerd[1906]: 2025-09-16 05:32:53.514 [INFO][5158] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0" Namespace="calico-system" Pod="calico-kube-controllers-76c6977657-fx526" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-calico--kube--controllers--76c6977657--fx526-eth0" Sep 16 05:32:53.776650 containerd[1906]: 2025-09-16 05:32:53.529 [INFO][5223] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0" HandleID="k8s-pod-network.5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0" Workload="ci--4459.0.0--n--e7bf2d745b-k8s-calico--kube--controllers--76c6977657--fx526-eth0" Sep 16 05:32:53.776823 containerd[1906]: 2025-09-16 05:32:53.529 [INFO][5223] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0" HandleID="k8s-pod-network.5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0" Workload="ci--4459.0.0--n--e7bf2d745b-k8s-calico--kube--controllers--76c6977657--fx526-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f6a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.0.0-n-e7bf2d745b", "pod":"calico-kube-controllers-76c6977657-fx526", "timestamp":"2025-09-16 05:32:53.529110387 +0000 UTC"}, Hostname:"ci-4459.0.0-n-e7bf2d745b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 05:32:53.776823 containerd[1906]: 2025-09-16 05:32:53.529 [INFO][5223] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 05:32:53.776823 containerd[1906]: 2025-09-16 05:32:53.650 [INFO][5223] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 16 05:32:53.776823 containerd[1906]: 2025-09-16 05:32:53.650 [INFO][5223] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.0.0-n-e7bf2d745b' Sep 16 05:32:53.776823 containerd[1906]: 2025-09-16 05:32:53.736 [INFO][5223] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:53.776823 containerd[1906]: 2025-09-16 05:32:53.745 [INFO][5223] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:53.776823 containerd[1906]: 2025-09-16 05:32:53.752 [INFO][5223] ipam/ipam.go 511: Trying affinity for 192.168.46.64/26 host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:53.776823 containerd[1906]: 2025-09-16 05:32:53.754 [INFO][5223] ipam/ipam.go 158: Attempting to load block cidr=192.168.46.64/26 host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:53.776823 containerd[1906]: 2025-09-16 05:32:53.756 [INFO][5223] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.46.64/26 host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:53.777066 containerd[1906]: 2025-09-16 05:32:53.756 [INFO][5223] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.46.64/26 handle="k8s-pod-network.5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:53.777066 containerd[1906]: 2025-09-16 05:32:53.758 [INFO][5223] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0 Sep 16 05:32:53.777066 containerd[1906]: 2025-09-16 05:32:53.760 [INFO][5223] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.46.64/26 handle="k8s-pod-network.5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:53.777066 containerd[1906]: 2025-09-16 05:32:53.764 [INFO][5223] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.46.68/26] block=192.168.46.64/26 handle="k8s-pod-network.5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:53.777066 containerd[1906]: 2025-09-16 05:32:53.765 [INFO][5223] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.46.68/26] handle="k8s-pod-network.5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:53.777066 containerd[1906]: 2025-09-16 05:32:53.765 [INFO][5223] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
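The three sandboxes requested together at 05:32:53.494-.495 do not get addresses in parallel: the timestamps show [5222] holding the host-wide IPAM lock until .545, [5219] acquiring at .545 and releasing at .650, and [5223] above acquiring only at .650 although it asked at .529. A minimal sketch of that serialization with a plain sync.Mutex; the hostWideIPAMLock name, the assign helper, and the hold durations are invented approximations, not Calico code, and the printed wait times will vary from run to run.

package main

import (
	"fmt"
	"sync"
	"time"
)

// hostWideIPAMLock plays the role of the "host-wide IPAM lock" in the log:
// only one CNI ADD on the node may touch IPAM blocks at a time.
var hostWideIPAMLock sync.Mutex

// assign simulates one plugin invocation: wait for the lock, then spend
// `hold` doing block reads/writes before releasing.
func assign(wg *sync.WaitGroup, name string, hold time.Duration) {
	defer wg.Done()
	requested := time.Now()
	hostWideIPAMLock.Lock()
	waited := time.Since(requested)
	fmt.Printf("%s acquired lock after waiting %v\n", name, waited.Round(time.Millisecond))
	time.Sleep(hold) // stand-in for "Writing block in order to claim IPs"
	hostWideIPAMLock.Unlock()
}

func main() {
	var wg sync.WaitGroup
	wg.Add(3)
	// Three concurrent ADDs, like the apiserver pair and the kube-controllers
	// pod above; whoever is queued last waits for both earlier holders.
	go assign(&wg, "[5222]", 17*time.Millisecond)
	go assign(&wg, "[5219]", 105*time.Millisecond)
	go assign(&wg, "[5223]", 115*time.Millisecond)
	wg.Wait()
}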
Sep 16 05:32:53.777066 containerd[1906]: 2025-09-16 05:32:53.765 [INFO][5223] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.46.68/26] IPv6=[] ContainerID="5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0" HandleID="k8s-pod-network.5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0" Workload="ci--4459.0.0--n--e7bf2d745b-k8s-calico--kube--controllers--76c6977657--fx526-eth0" Sep 16 05:32:53.777230 containerd[1906]: 2025-09-16 05:32:53.766 [INFO][5158] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0" Namespace="calico-system" Pod="calico-kube-controllers-76c6977657-fx526" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-calico--kube--controllers--76c6977657--fx526-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--e7bf2d745b-k8s-calico--kube--controllers--76c6977657--fx526-eth0", GenerateName:"calico-kube-controllers-76c6977657-", Namespace:"calico-system", SelfLink:"", UID:"13d60bc1-c4c8-48bc-b2cf-6befd811be20", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 32, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"76c6977657", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-e7bf2d745b", ContainerID:"", Pod:"calico-kube-controllers-76c6977657-fx526", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.46.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"califb6674eb113", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:32:53.777291 containerd[1906]: 2025-09-16 05:32:53.766 [INFO][5158] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.46.68/32] ContainerID="5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0" Namespace="calico-system" Pod="calico-kube-controllers-76c6977657-fx526" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-calico--kube--controllers--76c6977657--fx526-eth0" Sep 16 05:32:53.777291 containerd[1906]: 2025-09-16 05:32:53.766 [INFO][5158] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califb6674eb113 ContainerID="5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0" Namespace="calico-system" Pod="calico-kube-controllers-76c6977657-fx526" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-calico--kube--controllers--76c6977657--fx526-eth0" Sep 16 05:32:53.777291 containerd[1906]: 2025-09-16 05:32:53.768 [INFO][5158] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0" Namespace="calico-system" Pod="calico-kube-controllers-76c6977657-fx526" 
WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-calico--kube--controllers--76c6977657--fx526-eth0" Sep 16 05:32:53.777359 containerd[1906]: 2025-09-16 05:32:53.768 [INFO][5158] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0" Namespace="calico-system" Pod="calico-kube-controllers-76c6977657-fx526" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-calico--kube--controllers--76c6977657--fx526-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--e7bf2d745b-k8s-calico--kube--controllers--76c6977657--fx526-eth0", GenerateName:"calico-kube-controllers-76c6977657-", Namespace:"calico-system", SelfLink:"", UID:"13d60bc1-c4c8-48bc-b2cf-6befd811be20", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 32, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"76c6977657", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-e7bf2d745b", ContainerID:"5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0", Pod:"calico-kube-controllers-76c6977657-fx526", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.46.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"califb6674eb113", MAC:"ce:df:1e:28:00:66", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:32:53.777409 containerd[1906]: 2025-09-16 05:32:53.774 [INFO][5158] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0" Namespace="calico-system" Pod="calico-kube-controllers-76c6977657-fx526" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-calico--kube--controllers--76c6977657--fx526-eth0" Sep 16 05:32:53.777409 containerd[1906]: time="2025-09-16T05:32:53.776998242Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6dfcc9ff54-xnr9g,Uid:31261ed2-aa62-4f40-998f-c07af6124687,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b\"" Sep 16 05:32:53.784470 containerd[1906]: time="2025-09-16T05:32:53.784444822Z" level=info msg="connecting to shim 5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0" address="unix:///run/containerd/s/1cee62eb2662ea536ae6c326976f3b33c1232916723ae708971058fa28b6c1b1" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:32:53.800139 systemd[1]: Started cri-containerd-5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0.scope - libcontainer container 5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0. 
Sep 16 05:32:53.825896 containerd[1906]: time="2025-09-16T05:32:53.825875138Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-76c6977657-fx526,Uid:13d60bc1-c4c8-48bc-b2cf-6befd811be20,Namespace:calico-system,Attempt:0,} returns sandbox id \"5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0\"" Sep 16 05:32:54.500756 containerd[1906]: time="2025-09-16T05:32:54.500625218Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gl6kf,Uid:01b2b6da-0bd4-475d-aa4d-8dc67b3100e4,Namespace:kube-system,Attempt:0,}" Sep 16 05:32:54.501902 kubelet[3272]: I0916 05:32:54.501890 3272 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 05:32:54.551262 containerd[1906]: time="2025-09-16T05:32:54.551240676Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b2e5e66fd9163c13cb38673c27958704982bded282944e80efa3e48e8590ab1d\" id:\"72b02a2bf9730c65e81732c9b1e1c653e0d90fdc7f29d2e2f63e544b3fe5a03c\" pid:5494 exited_at:{seconds:1758000774 nanos:551055944}" Sep 16 05:32:54.556836 systemd-networkd[1822]: cali3be0be8f6f8: Link UP Sep 16 05:32:54.557055 systemd-networkd[1822]: cali3be0be8f6f8: Gained carrier Sep 16 05:32:54.561989 containerd[1906]: 2025-09-16 05:32:54.519 [INFO][5467] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--gl6kf-eth0 coredns-668d6bf9bc- kube-system 01b2b6da-0bd4-475d-aa4d-8dc67b3100e4 815 0 2025-09-16 05:32:22 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459.0.0-n-e7bf2d745b coredns-668d6bf9bc-gl6kf eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali3be0be8f6f8 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d" Namespace="kube-system" Pod="coredns-668d6bf9bc-gl6kf" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--gl6kf-" Sep 16 05:32:54.561989 containerd[1906]: 2025-09-16 05:32:54.519 [INFO][5467] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d" Namespace="kube-system" Pod="coredns-668d6bf9bc-gl6kf" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--gl6kf-eth0" Sep 16 05:32:54.561989 containerd[1906]: 2025-09-16 05:32:54.532 [INFO][5506] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d" HandleID="k8s-pod-network.95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d" Workload="ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--gl6kf-eth0" Sep 16 05:32:54.562138 containerd[1906]: 2025-09-16 05:32:54.532 [INFO][5506] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d" HandleID="k8s-pod-network.95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d" Workload="ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--gl6kf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001397c0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459.0.0-n-e7bf2d745b", "pod":"coredns-668d6bf9bc-gl6kf", "timestamp":"2025-09-16 05:32:54.532910367 +0000 UTC"}, 
Hostname:"ci-4459.0.0-n-e7bf2d745b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 05:32:54.562138 containerd[1906]: 2025-09-16 05:32:54.533 [INFO][5506] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 05:32:54.562138 containerd[1906]: 2025-09-16 05:32:54.533 [INFO][5506] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 05:32:54.562138 containerd[1906]: 2025-09-16 05:32:54.533 [INFO][5506] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.0.0-n-e7bf2d745b' Sep 16 05:32:54.562138 containerd[1906]: 2025-09-16 05:32:54.537 [INFO][5506] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:54.562138 containerd[1906]: 2025-09-16 05:32:54.541 [INFO][5506] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:54.562138 containerd[1906]: 2025-09-16 05:32:54.543 [INFO][5506] ipam/ipam.go 511: Trying affinity for 192.168.46.64/26 host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:54.562138 containerd[1906]: 2025-09-16 05:32:54.545 [INFO][5506] ipam/ipam.go 158: Attempting to load block cidr=192.168.46.64/26 host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:54.562138 containerd[1906]: 2025-09-16 05:32:54.547 [INFO][5506] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.46.64/26 host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:54.562292 containerd[1906]: 2025-09-16 05:32:54.547 [INFO][5506] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.46.64/26 handle="k8s-pod-network.95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:54.562292 containerd[1906]: 2025-09-16 05:32:54.548 [INFO][5506] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d Sep 16 05:32:54.562292 containerd[1906]: 2025-09-16 05:32:54.551 [INFO][5506] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.46.64/26 handle="k8s-pod-network.95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:54.562292 containerd[1906]: 2025-09-16 05:32:54.554 [INFO][5506] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.46.69/26] block=192.168.46.64/26 handle="k8s-pod-network.95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:54.562292 containerd[1906]: 2025-09-16 05:32:54.554 [INFO][5506] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.46.69/26] handle="k8s-pod-network.95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:54.562292 containerd[1906]: 2025-09-16 05:32:54.554 [INFO][5506] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 16 05:32:54.562292 containerd[1906]: 2025-09-16 05:32:54.554 [INFO][5506] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.46.69/26] IPv6=[] ContainerID="95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d" HandleID="k8s-pod-network.95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d" Workload="ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--gl6kf-eth0" Sep 16 05:32:54.562410 containerd[1906]: 2025-09-16 05:32:54.555 [INFO][5467] cni-plugin/k8s.go 418: Populated endpoint ContainerID="95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d" Namespace="kube-system" Pod="coredns-668d6bf9bc-gl6kf" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--gl6kf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--gl6kf-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"01b2b6da-0bd4-475d-aa4d-8dc67b3100e4", ResourceVersion:"815", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 32, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-e7bf2d745b", ContainerID:"", Pod:"coredns-668d6bf9bc-gl6kf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.46.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3be0be8f6f8", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:32:54.562410 containerd[1906]: 2025-09-16 05:32:54.555 [INFO][5467] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.46.69/32] ContainerID="95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d" Namespace="kube-system" Pod="coredns-668d6bf9bc-gl6kf" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--gl6kf-eth0" Sep 16 05:32:54.562410 containerd[1906]: 2025-09-16 05:32:54.555 [INFO][5467] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3be0be8f6f8 ContainerID="95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d" Namespace="kube-system" Pod="coredns-668d6bf9bc-gl6kf" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--gl6kf-eth0" Sep 16 05:32:54.562410 containerd[1906]: 2025-09-16 05:32:54.557 [INFO][5467] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-gl6kf" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--gl6kf-eth0" Sep 16 05:32:54.562410 containerd[1906]: 2025-09-16 05:32:54.557 [INFO][5467] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d" Namespace="kube-system" Pod="coredns-668d6bf9bc-gl6kf" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--gl6kf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--gl6kf-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"01b2b6da-0bd4-475d-aa4d-8dc67b3100e4", ResourceVersion:"815", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 32, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-e7bf2d745b", ContainerID:"95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d", Pod:"coredns-668d6bf9bc-gl6kf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.46.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3be0be8f6f8", MAC:"de:6b:3e:5f:82:7c", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:32:54.562410 containerd[1906]: 2025-09-16 05:32:54.561 [INFO][5467] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d" Namespace="kube-system" Pod="coredns-668d6bf9bc-gl6kf" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--gl6kf-eth0" Sep 16 05:32:54.570956 containerd[1906]: time="2025-09-16T05:32:54.570924134Z" level=info msg="connecting to shim 95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d" address="unix:///run/containerd/s/cee552cdac392a7545891cc71750b66b7079073b0355f654b203d7b61db2e627" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:32:54.579033 systemd[1]: Started cri-containerd-95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d.scope - libcontainer container 95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d. 
Sep 16 05:32:54.595177 containerd[1906]: time="2025-09-16T05:32:54.595153673Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b2e5e66fd9163c13cb38673c27958704982bded282944e80efa3e48e8590ab1d\" id:\"f3ce3583112cd1857e2de0eea57f38d02e3eb40e90199cc67934c464b68234e1\" pid:5553 exited_at:{seconds:1758000774 nanos:594961697}" Sep 16 05:32:54.604140 containerd[1906]: time="2025-09-16T05:32:54.604119771Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gl6kf,Uid:01b2b6da-0bd4-475d-aa4d-8dc67b3100e4,Namespace:kube-system,Attempt:0,} returns sandbox id \"95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d\"" Sep 16 05:32:54.605796 containerd[1906]: time="2025-09-16T05:32:54.605776967Z" level=info msg="CreateContainer within sandbox \"95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 16 05:32:54.609602 containerd[1906]: time="2025-09-16T05:32:54.609583922Z" level=info msg="Container 6c915ddf8c726d78508c6945de860c27a66d1109377139f058facdf393d57115: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:32:54.611715 containerd[1906]: time="2025-09-16T05:32:54.611673895Z" level=info msg="CreateContainer within sandbox \"95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"6c915ddf8c726d78508c6945de860c27a66d1109377139f058facdf393d57115\"" Sep 16 05:32:54.611947 containerd[1906]: time="2025-09-16T05:32:54.611933929Z" level=info msg="StartContainer for \"6c915ddf8c726d78508c6945de860c27a66d1109377139f058facdf393d57115\"" Sep 16 05:32:54.612331 containerd[1906]: time="2025-09-16T05:32:54.612318191Z" level=info msg="connecting to shim 6c915ddf8c726d78508c6945de860c27a66d1109377139f058facdf393d57115" address="unix:///run/containerd/s/cee552cdac392a7545891cc71750b66b7079073b0355f654b203d7b61db2e627" protocol=ttrpc version=3 Sep 16 05:32:54.637169 systemd[1]: Started cri-containerd-6c915ddf8c726d78508c6945de860c27a66d1109377139f058facdf393d57115.scope - libcontainer container 6c915ddf8c726d78508c6945de860c27a66d1109377139f058facdf393d57115. 
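The coredns lines above show the order containerd works through for a pod: RunPodSandbox returns a sandbox ID, CreateContainer within that sandbox returns a container ID, and StartContainer then launches the shim-backed process that systemd reports as a started cri-containerd-<id>.scope unit. The Go sketch below models only that ordering; the runtime interface and fakeRuntime type are deliberately simplified stand-ins invented here, not the real CRI RPCs or the containerd client API.

package main

import "fmt"

// runtime is a simplified stand-in for the container runtime calls visible
// in the log; real CRI requests carry much richer types.
type runtime interface {
	RunPodSandbox(name string) (sandboxID string, err error)
	CreateContainer(sandboxID, name string) (containerID string, err error)
	StartContainer(containerID string) error
}

// fakeRuntime just echoes IDs so the ordering is visible when run.
type fakeRuntime struct{ n int }

func (f *fakeRuntime) RunPodSandbox(name string) (string, error) {
	f.n++
	return fmt.Sprintf("sandbox-%d-%s", f.n, name), nil
}

func (f *fakeRuntime) CreateContainer(sandboxID, name string) (string, error) {
	f.n++
	return fmt.Sprintf("container-%d-%s", f.n, name), nil
}

func (f *fakeRuntime) StartContainer(containerID string) error {
	fmt.Println("started", containerID)
	return nil
}

func main() {
	var r runtime = &fakeRuntime{}
	// Same three steps the log records for coredns-668d6bf9bc-gl6kf.
	sb, _ := r.RunPodSandbox("coredns-668d6bf9bc-gl6kf")
	cid, _ := r.CreateContainer(sb, "coredns")
	_ = r.StartContainer(cid)
}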
Sep 16 05:32:54.663673 containerd[1906]: time="2025-09-16T05:32:54.663615282Z" level=info msg="StartContainer for \"6c915ddf8c726d78508c6945de860c27a66d1109377139f058facdf393d57115\" returns successfully" Sep 16 05:32:55.013285 systemd-networkd[1822]: cali6c198ccf282: Gained IPv6LL Sep 16 05:32:55.268294 systemd-networkd[1822]: califb6674eb113: Gained IPv6LL Sep 16 05:32:55.511301 containerd[1906]: time="2025-09-16T05:32:55.511175621Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-576jh,Uid:0d3912ee-3cd2-402b-ab83-a92684e5a0cd,Namespace:calico-system,Attempt:0,}" Sep 16 05:32:55.511301 containerd[1906]: time="2025-09-16T05:32:55.511232447Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-sgd4r,Uid:3262e0d4-7fee-4cbc-9baf-bc833d218f45,Namespace:kube-system,Attempt:0,}" Sep 16 05:32:55.512097 containerd[1906]: time="2025-09-16T05:32:55.511217675Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-4mpgc,Uid:905ec77c-cbec-4843-b250-4baac4cf8a0c,Namespace:calico-system,Attempt:0,}" Sep 16 05:32:55.567236 systemd-networkd[1822]: calicf09c197577: Link UP Sep 16 05:32:55.567388 systemd-networkd[1822]: calicf09c197577: Gained carrier Sep 16 05:32:55.573919 containerd[1906]: 2025-09-16 05:32:55.530 [INFO][5683] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.0.0--n--e7bf2d745b-k8s-csi--node--driver--576jh-eth0 csi-node-driver- calico-system 0d3912ee-3cd2-402b-ab83-a92684e5a0cd 702 0 2025-09-16 05:32:31 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459.0.0-n-e7bf2d745b csi-node-driver-576jh eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calicf09c197577 [] [] }} ContainerID="8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6" Namespace="calico-system" Pod="csi-node-driver-576jh" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-csi--node--driver--576jh-" Sep 16 05:32:55.573919 containerd[1906]: 2025-09-16 05:32:55.531 [INFO][5683] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6" Namespace="calico-system" Pod="csi-node-driver-576jh" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-csi--node--driver--576jh-eth0" Sep 16 05:32:55.573919 containerd[1906]: 2025-09-16 05:32:55.544 [INFO][5753] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6" HandleID="k8s-pod-network.8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6" Workload="ci--4459.0.0--n--e7bf2d745b-k8s-csi--node--driver--576jh-eth0" Sep 16 05:32:55.573919 containerd[1906]: 2025-09-16 05:32:55.544 [INFO][5753] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6" HandleID="k8s-pod-network.8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6" Workload="ci--4459.0.0--n--e7bf2d745b-k8s-csi--node--driver--576jh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000139f20), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.0.0-n-e7bf2d745b", "pod":"csi-node-driver-576jh", 
"timestamp":"2025-09-16 05:32:55.544420585 +0000 UTC"}, Hostname:"ci-4459.0.0-n-e7bf2d745b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 05:32:55.573919 containerd[1906]: 2025-09-16 05:32:55.544 [INFO][5753] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 05:32:55.573919 containerd[1906]: 2025-09-16 05:32:55.544 [INFO][5753] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 05:32:55.573919 containerd[1906]: 2025-09-16 05:32:55.544 [INFO][5753] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.0.0-n-e7bf2d745b' Sep 16 05:32:55.573919 containerd[1906]: 2025-09-16 05:32:55.549 [INFO][5753] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:55.573919 containerd[1906]: 2025-09-16 05:32:55.552 [INFO][5753] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:55.573919 containerd[1906]: 2025-09-16 05:32:55.555 [INFO][5753] ipam/ipam.go 511: Trying affinity for 192.168.46.64/26 host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:55.573919 containerd[1906]: 2025-09-16 05:32:55.556 [INFO][5753] ipam/ipam.go 158: Attempting to load block cidr=192.168.46.64/26 host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:55.573919 containerd[1906]: 2025-09-16 05:32:55.558 [INFO][5753] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.46.64/26 host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:55.573919 containerd[1906]: 2025-09-16 05:32:55.558 [INFO][5753] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.46.64/26 handle="k8s-pod-network.8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:55.573919 containerd[1906]: 2025-09-16 05:32:55.559 [INFO][5753] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6 Sep 16 05:32:55.573919 containerd[1906]: 2025-09-16 05:32:55.562 [INFO][5753] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.46.64/26 handle="k8s-pod-network.8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:55.573919 containerd[1906]: 2025-09-16 05:32:55.565 [INFO][5753] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.46.70/26] block=192.168.46.64/26 handle="k8s-pod-network.8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:55.573919 containerd[1906]: 2025-09-16 05:32:55.565 [INFO][5753] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.46.70/26] handle="k8s-pod-network.8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:55.573919 containerd[1906]: 2025-09-16 05:32:55.565 [INFO][5753] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 16 05:32:55.573919 containerd[1906]: 2025-09-16 05:32:55.565 [INFO][5753] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.46.70/26] IPv6=[] ContainerID="8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6" HandleID="k8s-pod-network.8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6" Workload="ci--4459.0.0--n--e7bf2d745b-k8s-csi--node--driver--576jh-eth0" Sep 16 05:32:55.574328 containerd[1906]: 2025-09-16 05:32:55.566 [INFO][5683] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6" Namespace="calico-system" Pod="csi-node-driver-576jh" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-csi--node--driver--576jh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--e7bf2d745b-k8s-csi--node--driver--576jh-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"0d3912ee-3cd2-402b-ab83-a92684e5a0cd", ResourceVersion:"702", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 32, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-e7bf2d745b", ContainerID:"", Pod:"csi-node-driver-576jh", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.46.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calicf09c197577", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:32:55.574328 containerd[1906]: 2025-09-16 05:32:55.566 [INFO][5683] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.46.70/32] ContainerID="8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6" Namespace="calico-system" Pod="csi-node-driver-576jh" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-csi--node--driver--576jh-eth0" Sep 16 05:32:55.574328 containerd[1906]: 2025-09-16 05:32:55.566 [INFO][5683] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicf09c197577 ContainerID="8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6" Namespace="calico-system" Pod="csi-node-driver-576jh" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-csi--node--driver--576jh-eth0" Sep 16 05:32:55.574328 containerd[1906]: 2025-09-16 05:32:55.567 [INFO][5683] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6" Namespace="calico-system" Pod="csi-node-driver-576jh" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-csi--node--driver--576jh-eth0" Sep 16 05:32:55.574328 containerd[1906]: 2025-09-16 05:32:55.567 [INFO][5683] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6" Namespace="calico-system" Pod="csi-node-driver-576jh" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-csi--node--driver--576jh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--e7bf2d745b-k8s-csi--node--driver--576jh-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"0d3912ee-3cd2-402b-ab83-a92684e5a0cd", ResourceVersion:"702", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 32, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-e7bf2d745b", ContainerID:"8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6", Pod:"csi-node-driver-576jh", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.46.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calicf09c197577", MAC:"02:d7:01:6f:a0:7c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:32:55.574328 containerd[1906]: 2025-09-16 05:32:55.572 [INFO][5683] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6" Namespace="calico-system" Pod="csi-node-driver-576jh" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-csi--node--driver--576jh-eth0" Sep 16 05:32:55.581321 containerd[1906]: time="2025-09-16T05:32:55.581296675Z" level=info msg="connecting to shim 8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6" address="unix:///run/containerd/s/222cbc33f2e265825d8e73491a91a125999608a3b468c30fc8f42ee50300c751" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:32:55.589096 systemd-networkd[1822]: cali28a419e89cb: Gained IPv6LL Sep 16 05:32:55.601122 systemd[1]: Started cri-containerd-8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6.scope - libcontainer container 8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6. 
Sep 16 05:32:55.612468 containerd[1906]: time="2025-09-16T05:32:55.612447927Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-576jh,Uid:0d3912ee-3cd2-402b-ab83-a92684e5a0cd,Namespace:calico-system,Attempt:0,} returns sandbox id \"8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6\"" Sep 16 05:32:55.680937 kubelet[3272]: I0916 05:32:55.680845 3272 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-gl6kf" podStartSLOduration=33.680815918 podStartE2EDuration="33.680815918s" podCreationTimestamp="2025-09-16 05:32:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 05:32:55.680083816 +0000 UTC m=+39.251009923" watchObservedRunningTime="2025-09-16 05:32:55.680815918 +0000 UTC m=+39.251741983" Sep 16 05:32:55.697426 systemd-networkd[1822]: cali77d9f52e730: Link UP Sep 16 05:32:55.697686 systemd-networkd[1822]: cali77d9f52e730: Gained carrier Sep 16 05:32:55.705222 containerd[1906]: 2025-09-16 05:32:55.531 [INFO][5688] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--sgd4r-eth0 coredns-668d6bf9bc- kube-system 3262e0d4-7fee-4cbc-9baf-bc833d218f45 824 0 2025-09-16 05:32:22 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459.0.0-n-e7bf2d745b coredns-668d6bf9bc-sgd4r eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali77d9f52e730 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1" Namespace="kube-system" Pod="coredns-668d6bf9bc-sgd4r" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--sgd4r-" Sep 16 05:32:55.705222 containerd[1906]: 2025-09-16 05:32:55.531 [INFO][5688] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1" Namespace="kube-system" Pod="coredns-668d6bf9bc-sgd4r" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--sgd4r-eth0" Sep 16 05:32:55.705222 containerd[1906]: 2025-09-16 05:32:55.544 [INFO][5751] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1" HandleID="k8s-pod-network.b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1" Workload="ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--sgd4r-eth0" Sep 16 05:32:55.705222 containerd[1906]: 2025-09-16 05:32:55.544 [INFO][5751] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1" HandleID="k8s-pod-network.b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1" Workload="ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--sgd4r-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002e7860), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459.0.0-n-e7bf2d745b", "pod":"coredns-668d6bf9bc-sgd4r", "timestamp":"2025-09-16 05:32:55.544456158 +0000 UTC"}, Hostname:"ci-4459.0.0-n-e7bf2d745b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 05:32:55.705222 containerd[1906]: 2025-09-16 05:32:55.544 [INFO][5751] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 05:32:55.705222 containerd[1906]: 2025-09-16 05:32:55.565 [INFO][5751] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 05:32:55.705222 containerd[1906]: 2025-09-16 05:32:55.565 [INFO][5751] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.0.0-n-e7bf2d745b' Sep 16 05:32:55.705222 containerd[1906]: 2025-09-16 05:32:55.651 [INFO][5751] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:55.705222 containerd[1906]: 2025-09-16 05:32:55.660 [INFO][5751] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:55.705222 containerd[1906]: 2025-09-16 05:32:55.673 [INFO][5751] ipam/ipam.go 511: Trying affinity for 192.168.46.64/26 host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:55.705222 containerd[1906]: 2025-09-16 05:32:55.678 [INFO][5751] ipam/ipam.go 158: Attempting to load block cidr=192.168.46.64/26 host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:55.705222 containerd[1906]: 2025-09-16 05:32:55.682 [INFO][5751] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.46.64/26 host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:55.705222 containerd[1906]: 2025-09-16 05:32:55.682 [INFO][5751] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.46.64/26 handle="k8s-pod-network.b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:55.705222 containerd[1906]: 2025-09-16 05:32:55.684 [INFO][5751] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1 Sep 16 05:32:55.705222 containerd[1906]: 2025-09-16 05:32:55.688 [INFO][5751] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.46.64/26 handle="k8s-pod-network.b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:55.705222 containerd[1906]: 2025-09-16 05:32:55.693 [INFO][5751] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.46.71/26] block=192.168.46.64/26 handle="k8s-pod-network.b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:55.705222 containerd[1906]: 2025-09-16 05:32:55.694 [INFO][5751] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.46.71/26] handle="k8s-pod-network.b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:55.705222 containerd[1906]: 2025-09-16 05:32:55.694 [INFO][5751] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
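[annotation] A few entries back, kubelet reports podStartSLOduration=33.680815918s for coredns-668d6bf9bc-gl6kf; that figure is simply watchObservedRunningTime minus podCreationTimestamp. A short Go check of the arithmetic, with the timestamps copied from the record and the monotonic "m=+…" suffix dropped (the layout string is an assumption about the journal's formatting, not kubelet code):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05 -0700 MST"

	// Timestamps from the kubelet pod_startup_latency_tracker record above.
	created, _ := time.Parse(layout, "2025-09-16 05:32:22 +0000 UTC")
	running, _ := time.Parse(layout, "2025-09-16 05:32:55.680815918 +0000 UTC")

	// Prints 33.680815918s, matching the reported podStartSLOduration.
	fmt.Println("startup duration:", running.Sub(created))
}
```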
Sep 16 05:32:55.705222 containerd[1906]: 2025-09-16 05:32:55.694 [INFO][5751] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.46.71/26] IPv6=[] ContainerID="b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1" HandleID="k8s-pod-network.b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1" Workload="ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--sgd4r-eth0" Sep 16 05:32:55.705740 containerd[1906]: 2025-09-16 05:32:55.696 [INFO][5688] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1" Namespace="kube-system" Pod="coredns-668d6bf9bc-sgd4r" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--sgd4r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--sgd4r-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"3262e0d4-7fee-4cbc-9baf-bc833d218f45", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 32, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-e7bf2d745b", ContainerID:"", Pod:"coredns-668d6bf9bc-sgd4r", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.46.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali77d9f52e730", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:32:55.705740 containerd[1906]: 2025-09-16 05:32:55.696 [INFO][5688] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.46.71/32] ContainerID="b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1" Namespace="kube-system" Pod="coredns-668d6bf9bc-sgd4r" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--sgd4r-eth0" Sep 16 05:32:55.705740 containerd[1906]: 2025-09-16 05:32:55.696 [INFO][5688] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali77d9f52e730 ContainerID="b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1" Namespace="kube-system" Pod="coredns-668d6bf9bc-sgd4r" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--sgd4r-eth0" Sep 16 05:32:55.705740 containerd[1906]: 2025-09-16 05:32:55.697 [INFO][5688] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-sgd4r" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--sgd4r-eth0" Sep 16 05:32:55.705740 containerd[1906]: 2025-09-16 05:32:55.698 [INFO][5688] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1" Namespace="kube-system" Pod="coredns-668d6bf9bc-sgd4r" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--sgd4r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--sgd4r-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"3262e0d4-7fee-4cbc-9baf-bc833d218f45", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 32, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-e7bf2d745b", ContainerID:"b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1", Pod:"coredns-668d6bf9bc-sgd4r", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.46.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali77d9f52e730", MAC:"32:71:5e:e6:41:d7", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:32:55.705740 containerd[1906]: 2025-09-16 05:32:55.704 [INFO][5688] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1" Namespace="kube-system" Pod="coredns-668d6bf9bc-sgd4r" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-coredns--668d6bf9bc--sgd4r-eth0" Sep 16 05:32:55.713795 containerd[1906]: time="2025-09-16T05:32:55.713743422Z" level=info msg="connecting to shim b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1" address="unix:///run/containerd/s/bc05ed97059034a9d6cd77de2f859f6c9ff619903edcfeb2a8c321800a90bb56" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:32:55.737559 systemd[1]: Started cri-containerd-b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1.scope - libcontainer container b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1. 
Sep 16 05:32:55.783904 systemd-networkd[1822]: cali53261b51fd1: Link UP Sep 16 05:32:55.784087 systemd-networkd[1822]: cali53261b51fd1: Gained carrier Sep 16 05:32:55.790599 containerd[1906]: 2025-09-16 05:32:55.531 [INFO][5704] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.0.0--n--e7bf2d745b-k8s-goldmane--54d579b49d--4mpgc-eth0 goldmane-54d579b49d- calico-system 905ec77c-cbec-4843-b250-4baac4cf8a0c 823 0 2025-09-16 05:32:31 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459.0.0-n-e7bf2d745b goldmane-54d579b49d-4mpgc eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali53261b51fd1 [] [] }} ContainerID="2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3" Namespace="calico-system" Pod="goldmane-54d579b49d-4mpgc" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-goldmane--54d579b49d--4mpgc-" Sep 16 05:32:55.790599 containerd[1906]: 2025-09-16 05:32:55.531 [INFO][5704] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3" Namespace="calico-system" Pod="goldmane-54d579b49d-4mpgc" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-goldmane--54d579b49d--4mpgc-eth0" Sep 16 05:32:55.790599 containerd[1906]: 2025-09-16 05:32:55.545 [INFO][5749] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3" HandleID="k8s-pod-network.2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3" Workload="ci--4459.0.0--n--e7bf2d745b-k8s-goldmane--54d579b49d--4mpgc-eth0" Sep 16 05:32:55.790599 containerd[1906]: 2025-09-16 05:32:55.545 [INFO][5749] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3" HandleID="k8s-pod-network.2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3" Workload="ci--4459.0.0--n--e7bf2d745b-k8s-goldmane--54d579b49d--4mpgc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00018c290), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.0.0-n-e7bf2d745b", "pod":"goldmane-54d579b49d-4mpgc", "timestamp":"2025-09-16 05:32:55.545025846 +0000 UTC"}, Hostname:"ci-4459.0.0-n-e7bf2d745b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 05:32:55.790599 containerd[1906]: 2025-09-16 05:32:55.545 [INFO][5749] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 05:32:55.790599 containerd[1906]: 2025-09-16 05:32:55.694 [INFO][5749] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 16 05:32:55.790599 containerd[1906]: 2025-09-16 05:32:55.694 [INFO][5749] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.0.0-n-e7bf2d745b' Sep 16 05:32:55.790599 containerd[1906]: 2025-09-16 05:32:55.751 [INFO][5749] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:55.790599 containerd[1906]: 2025-09-16 05:32:55.760 [INFO][5749] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:55.790599 containerd[1906]: 2025-09-16 05:32:55.766 [INFO][5749] ipam/ipam.go 511: Trying affinity for 192.168.46.64/26 host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:55.790599 containerd[1906]: 2025-09-16 05:32:55.768 [INFO][5749] ipam/ipam.go 158: Attempting to load block cidr=192.168.46.64/26 host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:55.790599 containerd[1906]: 2025-09-16 05:32:55.770 [INFO][5749] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.46.64/26 host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:55.790599 containerd[1906]: 2025-09-16 05:32:55.770 [INFO][5749] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.46.64/26 handle="k8s-pod-network.2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:55.790599 containerd[1906]: 2025-09-16 05:32:55.772 [INFO][5749] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3 Sep 16 05:32:55.790599 containerd[1906]: 2025-09-16 05:32:55.775 [INFO][5749] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.46.64/26 handle="k8s-pod-network.2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:55.790599 containerd[1906]: 2025-09-16 05:32:55.781 [INFO][5749] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.46.72/26] block=192.168.46.64/26 handle="k8s-pod-network.2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:55.790599 containerd[1906]: 2025-09-16 05:32:55.781 [INFO][5749] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.46.72/26] handle="k8s-pod-network.2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3" host="ci-4459.0.0-n-e7bf2d745b" Sep 16 05:32:55.790599 containerd[1906]: 2025-09-16 05:32:55.781 [INFO][5749] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
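[annotation] Note how the three concurrent IPAM requests (handles [5753], [5751], [5749]) queue on the host-wide lock: each logs "About to acquire" around 05:32:55.544-.545, but only proceeds once the previous handle logs "Released". The pattern is plain mutual exclusion; a generic Go sketch of the same serialization (an illustration of the pattern, not Calico's implementation):

```go
package main

import (
	"fmt"
	"sync"
)

func main() {
	var hostWideLock sync.Mutex // stands in for the per-host IPAM lock
	var wg sync.WaitGroup

	for _, handle := range []string{"5753", "5751", "5749"} {
		wg.Add(1)
		go func(h string) {
			defer wg.Done()
			fmt.Println(h, "about to acquire host-wide IPAM lock")
			hostWideLock.Lock()
			fmt.Println(h, "acquired lock; assigning one address")
			// ... block lookup and address claim would happen here ...
			hostWideLock.Unlock()
			fmt.Println(h, "released lock")
		}(handle)
	}
	wg.Wait()
}
```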
Sep 16 05:32:55.790599 containerd[1906]: 2025-09-16 05:32:55.781 [INFO][5749] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.46.72/26] IPv6=[] ContainerID="2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3" HandleID="k8s-pod-network.2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3" Workload="ci--4459.0.0--n--e7bf2d745b-k8s-goldmane--54d579b49d--4mpgc-eth0" Sep 16 05:32:55.791034 containerd[1906]: 2025-09-16 05:32:55.782 [INFO][5704] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3" Namespace="calico-system" Pod="goldmane-54d579b49d-4mpgc" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-goldmane--54d579b49d--4mpgc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--e7bf2d745b-k8s-goldmane--54d579b49d--4mpgc-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"905ec77c-cbec-4843-b250-4baac4cf8a0c", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 32, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-e7bf2d745b", ContainerID:"", Pod:"goldmane-54d579b49d-4mpgc", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.46.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali53261b51fd1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:32:55.791034 containerd[1906]: 2025-09-16 05:32:55.782 [INFO][5704] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.46.72/32] ContainerID="2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3" Namespace="calico-system" Pod="goldmane-54d579b49d-4mpgc" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-goldmane--54d579b49d--4mpgc-eth0" Sep 16 05:32:55.791034 containerd[1906]: 2025-09-16 05:32:55.782 [INFO][5704] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali53261b51fd1 ContainerID="2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3" Namespace="calico-system" Pod="goldmane-54d579b49d-4mpgc" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-goldmane--54d579b49d--4mpgc-eth0" Sep 16 05:32:55.791034 containerd[1906]: 2025-09-16 05:32:55.784 [INFO][5704] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3" Namespace="calico-system" Pod="goldmane-54d579b49d-4mpgc" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-goldmane--54d579b49d--4mpgc-eth0" Sep 16 05:32:55.791034 containerd[1906]: 2025-09-16 05:32:55.784 [INFO][5704] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3" 
Namespace="calico-system" Pod="goldmane-54d579b49d-4mpgc" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-goldmane--54d579b49d--4mpgc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--n--e7bf2d745b-k8s-goldmane--54d579b49d--4mpgc-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"905ec77c-cbec-4843-b250-4baac4cf8a0c", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 5, 32, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-n-e7bf2d745b", ContainerID:"2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3", Pod:"goldmane-54d579b49d-4mpgc", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.46.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali53261b51fd1", MAC:"1e:0b:ff:35:29:ae", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 05:32:55.791034 containerd[1906]: 2025-09-16 05:32:55.789 [INFO][5704] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3" Namespace="calico-system" Pod="goldmane-54d579b49d-4mpgc" WorkloadEndpoint="ci--4459.0.0--n--e7bf2d745b-k8s-goldmane--54d579b49d--4mpgc-eth0" Sep 16 05:32:55.794861 containerd[1906]: time="2025-09-16T05:32:55.794839637Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-sgd4r,Uid:3262e0d4-7fee-4cbc-9baf-bc833d218f45,Namespace:kube-system,Attempt:0,} returns sandbox id \"b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1\"" Sep 16 05:32:55.795974 containerd[1906]: time="2025-09-16T05:32:55.795955163Z" level=info msg="CreateContainer within sandbox \"b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 16 05:32:55.798466 containerd[1906]: time="2025-09-16T05:32:55.798431666Z" level=info msg="connecting to shim 2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3" address="unix:///run/containerd/s/4f5ca260618fb8212082e9c082eea62f08a47ee27c6d3dcb89128686b1bf1be0" namespace=k8s.io protocol=ttrpc version=3 Sep 16 05:32:55.798807 containerd[1906]: time="2025-09-16T05:32:55.798792827Z" level=info msg="Container c0e29b85aa94cf6bd26aecb92c6dac6accabd9f6e04c7fd0e2df00b04076284b: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:32:55.801367 containerd[1906]: time="2025-09-16T05:32:55.801350010Z" level=info msg="CreateContainer within sandbox \"b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"c0e29b85aa94cf6bd26aecb92c6dac6accabd9f6e04c7fd0e2df00b04076284b\"" Sep 16 05:32:55.801604 containerd[1906]: time="2025-09-16T05:32:55.801594468Z" level=info 
msg="StartContainer for \"c0e29b85aa94cf6bd26aecb92c6dac6accabd9f6e04c7fd0e2df00b04076284b\"" Sep 16 05:32:55.802051 containerd[1906]: time="2025-09-16T05:32:55.802036275Z" level=info msg="connecting to shim c0e29b85aa94cf6bd26aecb92c6dac6accabd9f6e04c7fd0e2df00b04076284b" address="unix:///run/containerd/s/bc05ed97059034a9d6cd77de2f859f6c9ff619903edcfeb2a8c321800a90bb56" protocol=ttrpc version=3 Sep 16 05:32:55.818326 systemd[1]: Started cri-containerd-2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3.scope - libcontainer container 2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3. Sep 16 05:32:55.820088 systemd[1]: Started cri-containerd-c0e29b85aa94cf6bd26aecb92c6dac6accabd9f6e04c7fd0e2df00b04076284b.scope - libcontainer container c0e29b85aa94cf6bd26aecb92c6dac6accabd9f6e04c7fd0e2df00b04076284b. Sep 16 05:32:55.833553 containerd[1906]: time="2025-09-16T05:32:55.833503587Z" level=info msg="StartContainer for \"c0e29b85aa94cf6bd26aecb92c6dac6accabd9f6e04c7fd0e2df00b04076284b\" returns successfully" Sep 16 05:32:55.844302 containerd[1906]: time="2025-09-16T05:32:55.844278193Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-4mpgc,Uid:905ec77c-cbec-4843-b250-4baac4cf8a0c,Namespace:calico-system,Attempt:0,} returns sandbox id \"2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3\"" Sep 16 05:32:56.321826 containerd[1906]: time="2025-09-16T05:32:56.321802002Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:56.322010 containerd[1906]: time="2025-09-16T05:32:56.321996420Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 16 05:32:56.322517 containerd[1906]: time="2025-09-16T05:32:56.322505816Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:56.323413 containerd[1906]: time="2025-09-16T05:32:56.323399348Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:56.323784 containerd[1906]: time="2025-09-16T05:32:56.323771265Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 2.696131104s" Sep 16 05:32:56.323803 containerd[1906]: time="2025-09-16T05:32:56.323790563Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 16 05:32:56.324243 containerd[1906]: time="2025-09-16T05:32:56.324233600Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 16 05:32:56.324766 containerd[1906]: time="2025-09-16T05:32:56.324753999Z" level=info msg="CreateContainer within sandbox \"247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 16 05:32:56.327459 containerd[1906]: time="2025-09-16T05:32:56.327418480Z" level=info 
msg="Container 7c74d6213b47a892210feba7326aa899c7db0c28fc7698e0f04e3657e9f6a4e0: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:32:56.331485 containerd[1906]: time="2025-09-16T05:32:56.331472126Z" level=info msg="CreateContainer within sandbox \"247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"7c74d6213b47a892210feba7326aa899c7db0c28fc7698e0f04e3657e9f6a4e0\"" Sep 16 05:32:56.331716 containerd[1906]: time="2025-09-16T05:32:56.331699541Z" level=info msg="StartContainer for \"7c74d6213b47a892210feba7326aa899c7db0c28fc7698e0f04e3657e9f6a4e0\"" Sep 16 05:32:56.332229 containerd[1906]: time="2025-09-16T05:32:56.332191461Z" level=info msg="connecting to shim 7c74d6213b47a892210feba7326aa899c7db0c28fc7698e0f04e3657e9f6a4e0" address="unix:///run/containerd/s/f92646748b52291b966ec74f18ca9e65cee46e9bec5d16337e2fbb6662f79280" protocol=ttrpc version=3 Sep 16 05:32:56.357183 systemd[1]: Started cri-containerd-7c74d6213b47a892210feba7326aa899c7db0c28fc7698e0f04e3657e9f6a4e0.scope - libcontainer container 7c74d6213b47a892210feba7326aa899c7db0c28fc7698e0f04e3657e9f6a4e0. Sep 16 05:32:56.390913 containerd[1906]: time="2025-09-16T05:32:56.390886117Z" level=info msg="StartContainer for \"7c74d6213b47a892210feba7326aa899c7db0c28fc7698e0f04e3657e9f6a4e0\" returns successfully" Sep 16 05:32:56.612156 systemd-networkd[1822]: cali3be0be8f6f8: Gained IPv6LL Sep 16 05:32:56.665116 kubelet[3272]: I0916 05:32:56.665081 3272 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6dfcc9ff54-dmntm" podStartSLOduration=24.968396799 podStartE2EDuration="27.665070545s" podCreationTimestamp="2025-09-16 05:32:29 +0000 UTC" firstStartedPulling="2025-09-16 05:32:53.627520903 +0000 UTC m=+37.198446938" lastFinishedPulling="2025-09-16 05:32:56.324194648 +0000 UTC m=+39.895120684" observedRunningTime="2025-09-16 05:32:56.664932302 +0000 UTC m=+40.235858342" watchObservedRunningTime="2025-09-16 05:32:56.665070545 +0000 UTC m=+40.235996578" Sep 16 05:32:56.669870 kubelet[3272]: I0916 05:32:56.669839 3272 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-sgd4r" podStartSLOduration=34.669828472 podStartE2EDuration="34.669828472s" podCreationTimestamp="2025-09-16 05:32:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 05:32:56.669731516 +0000 UTC m=+40.240657556" watchObservedRunningTime="2025-09-16 05:32:56.669828472 +0000 UTC m=+40.240754504" Sep 16 05:32:56.717199 containerd[1906]: time="2025-09-16T05:32:56.717177266Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:56.717402 containerd[1906]: time="2025-09-16T05:32:56.717363073Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 16 05:32:56.718457 containerd[1906]: time="2025-09-16T05:32:56.718417542Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 394.169869ms" Sep 16 05:32:56.718457 
containerd[1906]: time="2025-09-16T05:32:56.718432470Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 16 05:32:56.718874 containerd[1906]: time="2025-09-16T05:32:56.718862214Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 16 05:32:56.719349 containerd[1906]: time="2025-09-16T05:32:56.719336375Z" level=info msg="CreateContainer within sandbox \"d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 16 05:32:56.722021 containerd[1906]: time="2025-09-16T05:32:56.721999279Z" level=info msg="Container 70111634f7be182c48524bc34000a8b5349040cf8dc814c18568187f5bb919be: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:32:56.724909 containerd[1906]: time="2025-09-16T05:32:56.724869459Z" level=info msg="CreateContainer within sandbox \"d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"70111634f7be182c48524bc34000a8b5349040cf8dc814c18568187f5bb919be\"" Sep 16 05:32:56.725195 containerd[1906]: time="2025-09-16T05:32:56.725142361Z" level=info msg="StartContainer for \"70111634f7be182c48524bc34000a8b5349040cf8dc814c18568187f5bb919be\"" Sep 16 05:32:56.725697 containerd[1906]: time="2025-09-16T05:32:56.725669928Z" level=info msg="connecting to shim 70111634f7be182c48524bc34000a8b5349040cf8dc814c18568187f5bb919be" address="unix:///run/containerd/s/1f21acd60fbd0ec56a64c866683105856b0b5de688f11b0d88a52f1f2b78774b" protocol=ttrpc version=3 Sep 16 05:32:56.749144 systemd[1]: Started cri-containerd-70111634f7be182c48524bc34000a8b5349040cf8dc814c18568187f5bb919be.scope - libcontainer container 70111634f7be182c48524bc34000a8b5349040cf8dc814c18568187f5bb919be. 
Sep 16 05:32:56.775842 containerd[1906]: time="2025-09-16T05:32:56.775822144Z" level=info msg="StartContainer for \"70111634f7be182c48524bc34000a8b5349040cf8dc814c18568187f5bb919be\" returns successfully" Sep 16 05:32:56.932102 systemd-networkd[1822]: calicf09c197577: Gained IPv6LL Sep 16 05:32:57.060292 systemd-networkd[1822]: cali77d9f52e730: Gained IPv6LL Sep 16 05:32:57.124142 systemd-networkd[1822]: cali53261b51fd1: Gained IPv6LL Sep 16 05:32:57.664906 kubelet[3272]: I0916 05:32:57.664875 3272 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6dfcc9ff54-xnr9g" podStartSLOduration=25.723662971 podStartE2EDuration="28.664864112s" podCreationTimestamp="2025-09-16 05:32:29 +0000 UTC" firstStartedPulling="2025-09-16 05:32:53.777590055 +0000 UTC m=+37.348516090" lastFinishedPulling="2025-09-16 05:32:56.718791196 +0000 UTC m=+40.289717231" observedRunningTime="2025-09-16 05:32:57.664749018 +0000 UTC m=+41.235675055" watchObservedRunningTime="2025-09-16 05:32:57.664864112 +0000 UTC m=+41.235790144" Sep 16 05:32:58.663575 kubelet[3272]: I0916 05:32:58.663493 3272 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 05:32:59.752071 containerd[1906]: time="2025-09-16T05:32:59.752038422Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:59.752287 containerd[1906]: time="2025-09-16T05:32:59.752230356Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 16 05:32:59.752645 containerd[1906]: time="2025-09-16T05:32:59.752615077Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:59.753444 containerd[1906]: time="2025-09-16T05:32:59.753404509Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:32:59.753808 containerd[1906]: time="2025-09-16T05:32:59.753768296Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 3.034889237s" Sep 16 05:32:59.753808 containerd[1906]: time="2025-09-16T05:32:59.753784189Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 16 05:32:59.754303 containerd[1906]: time="2025-09-16T05:32:59.754287685Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 16 05:32:59.757228 containerd[1906]: time="2025-09-16T05:32:59.757209458Z" level=info msg="CreateContainer within sandbox \"5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 16 05:32:59.759884 containerd[1906]: time="2025-09-16T05:32:59.759847782Z" level=info msg="Container 062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714: CDI devices from CRI Config.CDIDevices: []" 
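[annotation] The TaskExit events carry exited_at as raw seconds/nanos since the Unix epoch. Converting the one above (seconds:1758000780 nanos:772508388) back to wall-clock time lands on 2025-09-16 05:33:00.772 UTC, the same instant as the surrounding journal entries:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// exited_at values from the TaskExit event above.
	exitedAt := time.Unix(1758000780, 772508388).UTC()
	// Prints 2025-09-16T05:33:00.772508388Z.
	fmt.Println("container exited at", exitedAt.Format(time.RFC3339Nano))
}
```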
Sep 16 05:32:59.762539 containerd[1906]: time="2025-09-16T05:32:59.762526690Z" level=info msg="CreateContainer within sandbox \"5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714\"" Sep 16 05:32:59.762748 containerd[1906]: time="2025-09-16T05:32:59.762737531Z" level=info msg="StartContainer for \"062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714\"" Sep 16 05:32:59.763262 containerd[1906]: time="2025-09-16T05:32:59.763251468Z" level=info msg="connecting to shim 062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714" address="unix:///run/containerd/s/1cee62eb2662ea536ae6c326976f3b33c1232916723ae708971058fa28b6c1b1" protocol=ttrpc version=3 Sep 16 05:32:59.785202 systemd[1]: Started cri-containerd-062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714.scope - libcontainer container 062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714. Sep 16 05:32:59.817688 containerd[1906]: time="2025-09-16T05:32:59.817661555Z" level=info msg="StartContainer for \"062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714\" returns successfully" Sep 16 05:33:00.695978 kubelet[3272]: I0916 05:33:00.695854 3272 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-76c6977657-fx526" podStartSLOduration=22.768019784 podStartE2EDuration="28.695814839s" podCreationTimestamp="2025-09-16 05:32:32 +0000 UTC" firstStartedPulling="2025-09-16 05:32:53.826448047 +0000 UTC m=+37.397374082" lastFinishedPulling="2025-09-16 05:32:59.754243101 +0000 UTC m=+43.325169137" observedRunningTime="2025-09-16 05:33:00.69523026 +0000 UTC m=+44.266156414" watchObservedRunningTime="2025-09-16 05:33:00.695814839 +0000 UTC m=+44.266740924" Sep 16 05:33:00.772766 containerd[1906]: time="2025-09-16T05:33:00.772733701Z" level=info msg="TaskExit event in podsandbox handler container_id:\"062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714\" id:\"bc4f475681e364cbf2e7f3f6f2718bce21937d76e32157b8c213409976dd0237\" pid:6200 exited_at:{seconds:1758000780 nanos:772508388}" Sep 16 05:33:01.499160 containerd[1906]: time="2025-09-16T05:33:01.499110545Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:33:01.499362 containerd[1906]: time="2025-09-16T05:33:01.499321085Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 16 05:33:01.499819 containerd[1906]: time="2025-09-16T05:33:01.499770277Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:33:01.500629 containerd[1906]: time="2025-09-16T05:33:01.500582801Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:33:01.501213 containerd[1906]: time="2025-09-16T05:33:01.501172235Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest 
\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 1.746870861s" Sep 16 05:33:01.501213 containerd[1906]: time="2025-09-16T05:33:01.501187231Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 16 05:33:01.501644 containerd[1906]: time="2025-09-16T05:33:01.501633056Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 16 05:33:01.502253 containerd[1906]: time="2025-09-16T05:33:01.502242161Z" level=info msg="CreateContainer within sandbox \"8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 16 05:33:01.506082 containerd[1906]: time="2025-09-16T05:33:01.506039979Z" level=info msg="Container ef01868d654efe394384c8458a30ca26ccb4dbec8a05fc7c552b0bfdffa8a584: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:33:01.512034 containerd[1906]: time="2025-09-16T05:33:01.512010348Z" level=info msg="CreateContainer within sandbox \"8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"ef01868d654efe394384c8458a30ca26ccb4dbec8a05fc7c552b0bfdffa8a584\"" Sep 16 05:33:01.512298 containerd[1906]: time="2025-09-16T05:33:01.512281323Z" level=info msg="StartContainer for \"ef01868d654efe394384c8458a30ca26ccb4dbec8a05fc7c552b0bfdffa8a584\"" Sep 16 05:33:01.513134 containerd[1906]: time="2025-09-16T05:33:01.513091672Z" level=info msg="connecting to shim ef01868d654efe394384c8458a30ca26ccb4dbec8a05fc7c552b0bfdffa8a584" address="unix:///run/containerd/s/222cbc33f2e265825d8e73491a91a125999608a3b468c30fc8f42ee50300c751" protocol=ttrpc version=3 Sep 16 05:33:01.536117 systemd[1]: Started cri-containerd-ef01868d654efe394384c8458a30ca26ccb4dbec8a05fc7c552b0bfdffa8a584.scope - libcontainer container ef01868d654efe394384c8458a30ca26ccb4dbec8a05fc7c552b0bfdffa8a584. Sep 16 05:33:01.554732 containerd[1906]: time="2025-09-16T05:33:01.554706783Z" level=info msg="StartContainer for \"ef01868d654efe394384c8458a30ca26ccb4dbec8a05fc7c552b0bfdffa8a584\" returns successfully" Sep 16 05:33:04.491283 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4007757588.mount: Deactivated successfully. 
Sep 16 05:33:04.720419 containerd[1906]: time="2025-09-16T05:33:04.720367863Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:33:04.720626 containerd[1906]: time="2025-09-16T05:33:04.720580938Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 16 05:33:04.720911 containerd[1906]: time="2025-09-16T05:33:04.720871896Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:33:04.722565 containerd[1906]: time="2025-09-16T05:33:04.722552656Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:33:04.723081 containerd[1906]: time="2025-09-16T05:33:04.723062807Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 3.221412801s" Sep 16 05:33:04.723121 containerd[1906]: time="2025-09-16T05:33:04.723084881Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 16 05:33:04.723555 containerd[1906]: time="2025-09-16T05:33:04.723520452Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 16 05:33:04.724066 containerd[1906]: time="2025-09-16T05:33:04.724051487Z" level=info msg="CreateContainer within sandbox \"2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 16 05:33:04.727222 containerd[1906]: time="2025-09-16T05:33:04.727176329Z" level=info msg="Container 91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:33:04.731376 containerd[1906]: time="2025-09-16T05:33:04.731328990Z" level=info msg="CreateContainer within sandbox \"2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5\"" Sep 16 05:33:04.731638 containerd[1906]: time="2025-09-16T05:33:04.731587743Z" level=info msg="StartContainer for \"91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5\"" Sep 16 05:33:04.732144 containerd[1906]: time="2025-09-16T05:33:04.732104490Z" level=info msg="connecting to shim 91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5" address="unix:///run/containerd/s/4f5ca260618fb8212082e9c082eea62f08a47ee27c6d3dcb89128686b1bf1be0" protocol=ttrpc version=3 Sep 16 05:33:04.755203 systemd[1]: Started cri-containerd-91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5.scope - libcontainer container 91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5. 
Sep 16 05:33:04.784103 containerd[1906]: time="2025-09-16T05:33:04.784077067Z" level=info msg="StartContainer for \"91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5\" returns successfully" Sep 16 05:33:05.695276 kubelet[3272]: I0916 05:33:05.695237 3272 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-4mpgc" podStartSLOduration=25.816578575 podStartE2EDuration="34.695225272s" podCreationTimestamp="2025-09-16 05:32:31 +0000 UTC" firstStartedPulling="2025-09-16 05:32:55.844815334 +0000 UTC m=+39.415741369" lastFinishedPulling="2025-09-16 05:33:04.72346203 +0000 UTC m=+48.294388066" observedRunningTime="2025-09-16 05:33:05.695101668 +0000 UTC m=+49.266027706" watchObservedRunningTime="2025-09-16 05:33:05.695225272 +0000 UTC m=+49.266151306" Sep 16 05:33:05.751361 containerd[1906]: time="2025-09-16T05:33:05.751303478Z" level=info msg="TaskExit event in podsandbox handler container_id:\"91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5\" id:\"c951c73c9c78b9e4cbd37edd3476e9feb2c296806e70e8057ed0d604e51e9c3a\" pid:6316 exit_status:1 exited_at:{seconds:1758000785 nanos:751022221}" Sep 16 05:33:06.574822 containerd[1906]: time="2025-09-16T05:33:06.574797337Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:33:06.575061 containerd[1906]: time="2025-09-16T05:33:06.574993544Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 16 05:33:06.575455 containerd[1906]: time="2025-09-16T05:33:06.575416625Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:33:06.576229 containerd[1906]: time="2025-09-16T05:33:06.576185805Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 05:33:06.576578 containerd[1906]: time="2025-09-16T05:33:06.576538524Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 1.853003131s" Sep 16 05:33:06.576578 containerd[1906]: time="2025-09-16T05:33:06.576553800Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 16 05:33:06.577436 containerd[1906]: time="2025-09-16T05:33:06.577424735Z" level=info msg="CreateContainer within sandbox \"8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 16 05:33:06.580243 containerd[1906]: time="2025-09-16T05:33:06.580229027Z" level=info msg="Container d5ef683a2855614b05be77cb56c0877e26828e2e9d50e31bee898e09769b02dc: CDI devices from CRI Config.CDIDevices: []" Sep 16 05:33:06.584872 containerd[1906]: time="2025-09-16T05:33:06.584826333Z" level=info msg="CreateContainer within sandbox 
\"8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"d5ef683a2855614b05be77cb56c0877e26828e2e9d50e31bee898e09769b02dc\"" Sep 16 05:33:06.585111 containerd[1906]: time="2025-09-16T05:33:06.585098074Z" level=info msg="StartContainer for \"d5ef683a2855614b05be77cb56c0877e26828e2e9d50e31bee898e09769b02dc\"" Sep 16 05:33:06.585854 containerd[1906]: time="2025-09-16T05:33:06.585807929Z" level=info msg="connecting to shim d5ef683a2855614b05be77cb56c0877e26828e2e9d50e31bee898e09769b02dc" address="unix:///run/containerd/s/222cbc33f2e265825d8e73491a91a125999608a3b468c30fc8f42ee50300c751" protocol=ttrpc version=3 Sep 16 05:33:06.604175 systemd[1]: Started cri-containerd-d5ef683a2855614b05be77cb56c0877e26828e2e9d50e31bee898e09769b02dc.scope - libcontainer container d5ef683a2855614b05be77cb56c0877e26828e2e9d50e31bee898e09769b02dc. Sep 16 05:33:06.623005 containerd[1906]: time="2025-09-16T05:33:06.622978673Z" level=info msg="StartContainer for \"d5ef683a2855614b05be77cb56c0877e26828e2e9d50e31bee898e09769b02dc\" returns successfully" Sep 16 05:33:06.714895 kubelet[3272]: I0916 05:33:06.714842 3272 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-576jh" podStartSLOduration=24.750899647 podStartE2EDuration="35.7148255s" podCreationTimestamp="2025-09-16 05:32:31 +0000 UTC" firstStartedPulling="2025-09-16 05:32:55.612954271 +0000 UTC m=+39.183880306" lastFinishedPulling="2025-09-16 05:33:06.576880126 +0000 UTC m=+50.147806159" observedRunningTime="2025-09-16 05:33:06.714456243 +0000 UTC m=+50.285382294" watchObservedRunningTime="2025-09-16 05:33:06.7148255 +0000 UTC m=+50.285751545" Sep 16 05:33:06.766598 containerd[1906]: time="2025-09-16T05:33:06.766572261Z" level=info msg="TaskExit event in podsandbox handler container_id:\"91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5\" id:\"7e5df397483f0c4893c95f50548217b207ba41b8d133c1b336b0f786edc07050\" pid:6392 exit_status:1 exited_at:{seconds:1758000786 nanos:766413931}" Sep 16 05:33:07.541073 kubelet[3272]: I0916 05:33:07.540974 3272 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 16 05:33:07.541338 kubelet[3272]: I0916 05:33:07.541100 3272 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 16 05:33:24.599646 containerd[1906]: time="2025-09-16T05:33:24.599590718Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b2e5e66fd9163c13cb38673c27958704982bded282944e80efa3e48e8590ab1d\" id:\"f5551dc9a58f8a2f9c516d10e5c3c3ea1fdccfdd8e57e36841a85828f2f938a2\" pid:6440 exited_at:{seconds:1758000804 nanos:599347230}" Sep 16 05:33:30.710837 containerd[1906]: time="2025-09-16T05:33:30.710813858Z" level=info msg="TaskExit event in podsandbox handler container_id:\"062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714\" id:\"ae432253b881fc6c858be558a5e477a2a94d83c4c8e3805b5a461e28e4a66e4b\" pid:6481 exited_at:{seconds:1758000810 nanos:710657675}" Sep 16 05:33:35.429151 containerd[1906]: time="2025-09-16T05:33:35.429125798Z" level=info msg="TaskExit event in podsandbox handler container_id:\"91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5\" id:\"4c3079f6db856efe6f887efd327f99f19afe97971b6bf566efd7e8bdad048ad6\" pid:6505 
exited_at:{seconds:1758000815 nanos:428926796}" Sep 16 05:33:36.801883 containerd[1906]: time="2025-09-16T05:33:36.801857455Z" level=info msg="TaskExit event in podsandbox handler container_id:\"91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5\" id:\"1c4982110363a050e58c8bb106e3bcda966dfa2963d25c8f114e6c9af06d2e19\" pid:6540 exited_at:{seconds:1758000816 nanos:801605524}" Sep 16 05:33:41.331537 kubelet[3272]: I0916 05:33:41.331422 3272 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 05:33:46.610812 containerd[1906]: time="2025-09-16T05:33:46.610785810Z" level=info msg="TaskExit event in podsandbox handler container_id:\"062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714\" id:\"7adb1021331eeb6f2283ae68b5497307bae979915d46de1c5bfbb84011bf9953\" pid:6579 exited_at:{seconds:1758000826 nanos:610663497}" Sep 16 05:33:47.742520 systemd[1]: Started sshd@12-139.178.94.21:22-120.157.9.205:50462.service - OpenSSH per-connection server daemon (120.157.9.205:50462). Sep 16 05:33:47.759893 sshd[6590]: Connection closed by 120.157.9.205 port 50462 Sep 16 05:33:47.760208 systemd[1]: sshd@12-139.178.94.21:22-120.157.9.205:50462.service: Deactivated successfully. Sep 16 05:33:54.601923 containerd[1906]: time="2025-09-16T05:33:54.601898240Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b2e5e66fd9163c13cb38673c27958704982bded282944e80efa3e48e8590ab1d\" id:\"d54765219efa29ccfc3814923d4be4b295637fa738fe19cc5fde17407c6b3276\" pid:6608 exited_at:{seconds:1758000834 nanos:601603132}" Sep 16 05:34:00.720896 containerd[1906]: time="2025-09-16T05:34:00.720865656Z" level=info msg="TaskExit event in podsandbox handler container_id:\"062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714\" id:\"dd9dab13df4d77c049e5a3246212580578ef366aa65e427844f0b89bdc6f380a\" pid:6642 exited_at:{seconds:1758000840 nanos:720733781}" Sep 16 05:34:06.813601 containerd[1906]: time="2025-09-16T05:34:06.813571843Z" level=info msg="TaskExit event in podsandbox handler container_id:\"91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5\" id:\"91bb7b1a23ff990bf3d2099d7ac0fc39a0ae2ffcef1c474b9fa4acb013fe85a6\" pid:6665 exited_at:{seconds:1758000846 nanos:813392301}" Sep 16 05:34:24.608825 containerd[1906]: time="2025-09-16T05:34:24.608797365Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b2e5e66fd9163c13cb38673c27958704982bded282944e80efa3e48e8590ab1d\" id:\"eae3fff03c0eb32777c03386b7c0779de9587e70134f7ca6116c8940cbc349ce\" pid:6717 exited_at:{seconds:1758000864 nanos:608608440}" Sep 16 05:34:30.724852 containerd[1906]: time="2025-09-16T05:34:30.724825434Z" level=info msg="TaskExit event in podsandbox handler container_id:\"062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714\" id:\"9bc87f0a0d2fa19cac2291a0cee665283d3592e724f34a12e62e6a932e734659\" pid:6769 exited_at:{seconds:1758000870 nanos:724710908}" Sep 16 05:34:35.436964 containerd[1906]: time="2025-09-16T05:34:35.436931443Z" level=info msg="TaskExit event in podsandbox handler container_id:\"91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5\" id:\"116d1ba1655342484b5e6acf56e95cb0845846db4ed46893bffab8e33977e8cf\" pid:6795 exited_at:{seconds:1758000875 nanos:436734156}" Sep 16 05:34:36.827885 containerd[1906]: time="2025-09-16T05:34:36.827861806Z" level=info msg="TaskExit event in podsandbox handler container_id:\"91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5\" 
id:\"0a46b665b45ff1a1616ac4653717b49600856dca810b7bff71eefa5b694fd90a\" pid:6828 exited_at:{seconds:1758000876 nanos:827677484}" Sep 16 05:34:46.548632 containerd[1906]: time="2025-09-16T05:34:46.548582333Z" level=info msg="TaskExit event in podsandbox handler container_id:\"062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714\" id:\"91e5a5e9e23424322093c87812d75691fbaed38337f10141ef91af47da038dbc\" pid:6864 exited_at:{seconds:1758000886 nanos:548465128}" Sep 16 05:34:54.598218 containerd[1906]: time="2025-09-16T05:34:54.598193705Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b2e5e66fd9163c13cb38673c27958704982bded282944e80efa3e48e8590ab1d\" id:\"46f9e91997a9a80cfc011009c6493574d49184a8d89a9cb0b32b97a5a419688a\" pid:6889 exited_at:{seconds:1758000894 nanos:597851031}" Sep 16 05:35:00.729540 containerd[1906]: time="2025-09-16T05:35:00.729503348Z" level=info msg="TaskExit event in podsandbox handler container_id:\"062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714\" id:\"0a0756e85d1e4384d733d96be8c823da25c0100f0c264d5876bdbf713cf71e9a\" pid:6924 exited_at:{seconds:1758000900 nanos:729317994}" Sep 16 05:35:06.767899 containerd[1906]: time="2025-09-16T05:35:06.767868978Z" level=info msg="TaskExit event in podsandbox handler container_id:\"91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5\" id:\"b78982c107338fb4ab4edea0e6e04143968acccc572a213ab7168b9148f02b86\" pid:6947 exited_at:{seconds:1758000906 nanos:767670937}" Sep 16 05:35:24.604318 containerd[1906]: time="2025-09-16T05:35:24.604283944Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b2e5e66fd9163c13cb38673c27958704982bded282944e80efa3e48e8590ab1d\" id:\"5447f7793f1ecd372b1b2a37410213101013ac6b610c6f6136869011a26475d5\" pid:6984 exited_at:{seconds:1758000924 nanos:604044139}" Sep 16 05:35:30.760827 containerd[1906]: time="2025-09-16T05:35:30.760796454Z" level=info msg="TaskExit event in podsandbox handler container_id:\"062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714\" id:\"fe0d2a5b32ec7b2e68a61603f1f344ccc9e0b42a1bdf68b3c129adf967444fd0\" pid:7025 exited_at:{seconds:1758000930 nanos:760636705}" Sep 16 05:35:35.398583 containerd[1906]: time="2025-09-16T05:35:35.398555092Z" level=info msg="TaskExit event in podsandbox handler container_id:\"91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5\" id:\"01b2278c4552ab57763701064fc29dab57f6c5b67d8957b9f35f30417c5e2f21\" pid:7048 exited_at:{seconds:1758000935 nanos:398390887}" Sep 16 05:35:36.819488 containerd[1906]: time="2025-09-16T05:35:36.819461273Z" level=info msg="TaskExit event in podsandbox handler container_id:\"91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5\" id:\"f469fc6c13575005a786ad6621ea6b2a0a6465aab93ab73fda7207ef54024ba8\" pid:7084 exited_at:{seconds:1758000936 nanos:819297778}" Sep 16 05:35:46.601900 containerd[1906]: time="2025-09-16T05:35:46.601871159Z" level=info msg="TaskExit event in podsandbox handler container_id:\"062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714\" id:\"eb5e7a9c47cae7b01f05aaa2cb9f997f08ccb3c6bedde684e3335a3169dea82c\" pid:7117 exited_at:{seconds:1758000946 nanos:601744964}" Sep 16 05:35:54.602264 containerd[1906]: time="2025-09-16T05:35:54.602241336Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b2e5e66fd9163c13cb38673c27958704982bded282944e80efa3e48e8590ab1d\" id:\"368be9403d02b660d2c7a414e1a6d960f15d7c62b25ec3774a6240a5fe0f7591\" pid:7148 exited_at:{seconds:1758000954 
nanos:601997176}" Sep 16 05:36:00.759300 containerd[1906]: time="2025-09-16T05:36:00.759271951Z" level=info msg="TaskExit event in podsandbox handler container_id:\"062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714\" id:\"9359ee1c711a6f3041319cbaa2e027fd1a04399c357416ab499ec898a7ac89ba\" pid:7183 exited_at:{seconds:1758000960 nanos:759074148}" Sep 16 05:36:06.820184 containerd[1906]: time="2025-09-16T05:36:06.820118339Z" level=info msg="TaskExit event in podsandbox handler container_id:\"91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5\" id:\"fad2a497179c3d5c853fc30ceaa8bedfd616ae080227c2f48ee92abba2f1ade9\" pid:7222 exited_at:{seconds:1758000966 nanos:819933843}" Sep 16 05:36:18.236225 update_engine[1901]: I20250916 05:36:18.235961 1901 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Sep 16 05:36:18.236225 update_engine[1901]: I20250916 05:36:18.236099 1901 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Sep 16 05:36:18.237195 update_engine[1901]: I20250916 05:36:18.236467 1901 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Sep 16 05:36:18.237562 update_engine[1901]: I20250916 05:36:18.237468 1901 omaha_request_params.cc:62] Current group set to developer Sep 16 05:36:18.237727 update_engine[1901]: I20250916 05:36:18.237695 1901 update_attempter.cc:499] Already updated boot flags. Skipping. Sep 16 05:36:18.237889 update_engine[1901]: I20250916 05:36:18.237725 1901 update_attempter.cc:643] Scheduling an action processor start. Sep 16 05:36:18.237889 update_engine[1901]: I20250916 05:36:18.237766 1901 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Sep 16 05:36:18.237889 update_engine[1901]: I20250916 05:36:18.237836 1901 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Sep 16 05:36:18.238175 update_engine[1901]: I20250916 05:36:18.238013 1901 omaha_request_action.cc:271] Posting an Omaha request to disabled Sep 16 05:36:18.238175 update_engine[1901]: I20250916 05:36:18.238044 1901 omaha_request_action.cc:272] Request: Sep 16 05:36:18.238175 update_engine[1901]: Sep 16 05:36:18.238175 update_engine[1901]: Sep 16 05:36:18.238175 update_engine[1901]: Sep 16 05:36:18.238175 update_engine[1901]: Sep 16 05:36:18.238175 update_engine[1901]: Sep 16 05:36:18.238175 update_engine[1901]: Sep 16 05:36:18.238175 update_engine[1901]: Sep 16 05:36:18.238175 update_engine[1901]: Sep 16 05:36:18.238175 update_engine[1901]: I20250916 05:36:18.238060 1901 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 16 05:36:18.239003 locksmithd[1977]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Sep 16 05:36:18.240974 update_engine[1901]: I20250916 05:36:18.240962 1901 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 16 05:36:18.241371 update_engine[1901]: I20250916 05:36:18.241331 1901 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Sep 16 05:36:18.241691 update_engine[1901]: E20250916 05:36:18.241650 1901 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 16 05:36:18.241691 update_engine[1901]: I20250916 05:36:18.241683 1901 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Sep 16 05:36:24.602823 containerd[1906]: time="2025-09-16T05:36:24.602794620Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b2e5e66fd9163c13cb38673c27958704982bded282944e80efa3e48e8590ab1d\" id:\"7e9bacb1d0722171c0a4a6f4ec7b081779e8f02583b01cba8e831294c40df2e7\" pid:7259 exited_at:{seconds:1758000984 nanos:602586620}" Sep 16 05:36:28.238226 update_engine[1901]: I20250916 05:36:28.238102 1901 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 16 05:36:28.239127 update_engine[1901]: I20250916 05:36:28.238276 1901 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 16 05:36:28.239248 update_engine[1901]: I20250916 05:36:28.239202 1901 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 16 05:36:28.239636 update_engine[1901]: E20250916 05:36:28.239535 1901 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 16 05:36:28.239809 update_engine[1901]: I20250916 05:36:28.239699 1901 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Sep 16 05:36:30.719264 containerd[1906]: time="2025-09-16T05:36:30.719240991Z" level=info msg="TaskExit event in podsandbox handler container_id:\"062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714\" id:\"9358e1541add016b39907dabde03c4a4bcbfa52dd62fcdbd8de774c8c7b43311\" pid:7291 exited_at:{seconds:1758000990 nanos:719118750}" Sep 16 05:36:35.406750 containerd[1906]: time="2025-09-16T05:36:35.406722129Z" level=info msg="TaskExit event in podsandbox handler container_id:\"91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5\" id:\"2168466bf5fc2d93c4a5d3dfc8c97da618473e0bc8eeb50d10f34662290ce22f\" pid:7311 exited_at:{seconds:1758000995 nanos:406510544}" Sep 16 05:36:36.770327 containerd[1906]: time="2025-09-16T05:36:36.770275993Z" level=info msg="TaskExit event in podsandbox handler container_id:\"91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5\" id:\"6fc9378f9bbbfe7ef123129851f4e8193e08c8576cdc0ea298e7f0347a9786a6\" pid:7346 exited_at:{seconds:1758000996 nanos:770040536}" Sep 16 05:36:38.239243 update_engine[1901]: I20250916 05:36:38.239077 1901 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 16 05:36:38.239243 update_engine[1901]: I20250916 05:36:38.239236 1901 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 16 05:36:38.240444 update_engine[1901]: I20250916 05:36:38.240373 1901 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Sep 16 05:36:38.240615 update_engine[1901]: E20250916 05:36:38.240533 1901 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 16 05:36:38.240745 update_engine[1901]: I20250916 05:36:38.240694 1901 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Sep 16 05:36:46.552812 containerd[1906]: time="2025-09-16T05:36:46.552784824Z" level=info msg="TaskExit event in podsandbox handler container_id:\"062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714\" id:\"37fde40bf2a53e0a5cfcb89dbe19cc818e0c2e1f0799b439396c3d2675e4fa4c\" pid:7381 exited_at:{seconds:1758001006 nanos:552693169}" Sep 16 05:36:48.230039 update_engine[1901]: I20250916 05:36:48.229836 1901 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 16 05:36:48.230039 update_engine[1901]: I20250916 05:36:48.230043 1901 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 16 05:36:48.231076 update_engine[1901]: I20250916 05:36:48.230983 1901 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 16 05:36:48.231602 update_engine[1901]: E20250916 05:36:48.231487 1901 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 16 05:36:48.231773 update_engine[1901]: I20250916 05:36:48.231666 1901 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Sep 16 05:36:48.231773 update_engine[1901]: I20250916 05:36:48.231699 1901 omaha_request_action.cc:617] Omaha request response: Sep 16 05:36:48.232015 update_engine[1901]: E20250916 05:36:48.231861 1901 omaha_request_action.cc:636] Omaha request network transfer failed. Sep 16 05:36:48.232015 update_engine[1901]: I20250916 05:36:48.231910 1901 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Sep 16 05:36:48.232015 update_engine[1901]: I20250916 05:36:48.231926 1901 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 16 05:36:48.232015 update_engine[1901]: I20250916 05:36:48.231940 1901 update_attempter.cc:306] Processing Done. Sep 16 05:36:48.232015 update_engine[1901]: E20250916 05:36:48.231969 1901 update_attempter.cc:619] Update failed. Sep 16 05:36:48.232440 update_engine[1901]: I20250916 05:36:48.232015 1901 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Sep 16 05:36:48.232440 update_engine[1901]: I20250916 05:36:48.232035 1901 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Sep 16 05:36:48.232440 update_engine[1901]: I20250916 05:36:48.232050 1901 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. 
Sep 16 05:36:48.232440 update_engine[1901]: I20250916 05:36:48.232208 1901 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Sep 16 05:36:48.232440 update_engine[1901]: I20250916 05:36:48.232275 1901 omaha_request_action.cc:271] Posting an Omaha request to disabled Sep 16 05:36:48.232440 update_engine[1901]: I20250916 05:36:48.232291 1901 omaha_request_action.cc:272] Request: Sep 16 05:36:48.232440 update_engine[1901]: Sep 16 05:36:48.232440 update_engine[1901]: Sep 16 05:36:48.232440 update_engine[1901]: Sep 16 05:36:48.232440 update_engine[1901]: Sep 16 05:36:48.232440 update_engine[1901]: Sep 16 05:36:48.232440 update_engine[1901]: Sep 16 05:36:48.232440 update_engine[1901]: I20250916 05:36:48.232308 1901 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 16 05:36:48.232440 update_engine[1901]: I20250916 05:36:48.232354 1901 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 16 05:36:48.233522 update_engine[1901]: I20250916 05:36:48.233119 1901 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 16 05:36:48.233522 update_engine[1901]: E20250916 05:36:48.233378 1901 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 16 05:36:48.233686 locksmithd[1977]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Sep 16 05:36:48.234332 update_engine[1901]: I20250916 05:36:48.233507 1901 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Sep 16 05:36:48.234332 update_engine[1901]: I20250916 05:36:48.233536 1901 omaha_request_action.cc:617] Omaha request response: Sep 16 05:36:48.234332 update_engine[1901]: I20250916 05:36:48.233552 1901 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 16 05:36:48.234332 update_engine[1901]: I20250916 05:36:48.233566 1901 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 16 05:36:48.234332 update_engine[1901]: I20250916 05:36:48.233579 1901 update_attempter.cc:306] Processing Done. Sep 16 05:36:48.234332 update_engine[1901]: I20250916 05:36:48.233594 1901 update_attempter.cc:310] Error event sent. 
Sep 16 05:36:48.234332 update_engine[1901]: I20250916 05:36:48.233615 1901 update_check_scheduler.cc:74] Next update check in 44m40s Sep 16 05:36:48.234895 locksmithd[1977]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Sep 16 05:36:54.606724 containerd[1906]: time="2025-09-16T05:36:54.606695365Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b2e5e66fd9163c13cb38673c27958704982bded282944e80efa3e48e8590ab1d\" id:\"120e49033b74d92eeceb7b1bc5ea6b4e45af40a5dff4b2b6b7e7561fbf986470\" pid:7406 exited_at:{seconds:1758001014 nanos:606444937}" Sep 16 05:37:00.734915 containerd[1906]: time="2025-09-16T05:37:00.734872544Z" level=info msg="TaskExit event in podsandbox handler container_id:\"062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714\" id:\"5a9480c90adf6eb5b7c88032f3ac703bc068db843f48dba68320265c8f1e17ac\" pid:7441 exited_at:{seconds:1758001020 nanos:734642640}" Sep 16 05:37:06.769576 containerd[1906]: time="2025-09-16T05:37:06.769540459Z" level=info msg="TaskExit event in podsandbox handler container_id:\"91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5\" id:\"4cc823bd3d5622c66c9555d41c11f1a621aa994c7085d47cb21045332722bd8c\" pid:7464 exited_at:{seconds:1758001026 nanos:769337679}" Sep 16 05:37:13.240926 containerd[1906]: time="2025-09-16T05:37:13.240703973Z" level=warning msg="container event discarded" container=5054d6dcfa91d6ff583c97e596012570ba4a056127ab809900cf6aedddddb422 type=CONTAINER_CREATED_EVENT Sep 16 05:37:13.240926 containerd[1906]: time="2025-09-16T05:37:13.240870707Z" level=warning msg="container event discarded" container=5054d6dcfa91d6ff583c97e596012570ba4a056127ab809900cf6aedddddb422 type=CONTAINER_STARTED_EVENT Sep 16 05:37:13.255365 containerd[1906]: time="2025-09-16T05:37:13.255223399Z" level=warning msg="container event discarded" container=490fbb9bbfeb5df8c88c8454de786005df7efa99565d8730ad4100b18900eb67 type=CONTAINER_CREATED_EVENT Sep 16 05:37:13.255365 containerd[1906]: time="2025-09-16T05:37:13.255310176Z" level=warning msg="container event discarded" container=490fbb9bbfeb5df8c88c8454de786005df7efa99565d8730ad4100b18900eb67 type=CONTAINER_STARTED_EVENT Sep 16 05:37:13.255365 containerd[1906]: time="2025-09-16T05:37:13.255347125Z" level=warning msg="container event discarded" container=5c30ddb75115a2915b7c31b20a27bd1f306555a2a2906dca0b4f583ae40b5454 type=CONTAINER_CREATED_EVENT Sep 16 05:37:13.255747 containerd[1906]: time="2025-09-16T05:37:13.255377180Z" level=warning msg="container event discarded" container=572844a0ff62b781b4dbd452b9866d35ecb733bee993fa5fb1b1d653f036112b type=CONTAINER_CREATED_EVENT Sep 16 05:37:13.289014 containerd[1906]: time="2025-09-16T05:37:13.288845957Z" level=warning msg="container event discarded" container=5e1b81eca42b7f80be950885ffe02a98551ba7939fc87936ebe93b51e205699e type=CONTAINER_CREATED_EVENT Sep 16 05:37:13.289014 containerd[1906]: time="2025-09-16T05:37:13.288930721Z" level=warning msg="container event discarded" container=5e1b81eca42b7f80be950885ffe02a98551ba7939fc87936ebe93b51e205699e type=CONTAINER_STARTED_EVENT Sep 16 05:37:13.289014 containerd[1906]: time="2025-09-16T05:37:13.288959196Z" level=warning msg="container event discarded" container=5c30ddb75115a2915b7c31b20a27bd1f306555a2a2906dca0b4f583ae40b5454 type=CONTAINER_STARTED_EVENT Sep 16 05:37:13.289014 containerd[1906]: time="2025-09-16T05:37:13.288982172Z" level=warning msg="container event discarded" 
container=43127a4d78b07ca19e455890751a4d095ef127f7d3af144757390b3435bb58a1 type=CONTAINER_CREATED_EVENT Sep 16 05:37:13.289014 containerd[1906]: time="2025-09-16T05:37:13.289037208Z" level=warning msg="container event discarded" container=572844a0ff62b781b4dbd452b9866d35ecb733bee993fa5fb1b1d653f036112b type=CONTAINER_STARTED_EVENT Sep 16 05:37:13.340405 containerd[1906]: time="2025-09-16T05:37:13.340347707Z" level=warning msg="container event discarded" container=43127a4d78b07ca19e455890751a4d095ef127f7d3af144757390b3435bb58a1 type=CONTAINER_STARTED_EVENT Sep 16 05:37:22.916960 containerd[1906]: time="2025-09-16T05:37:22.916801405Z" level=warning msg="container event discarded" container=fe58eada3eb54b61e30cafa3bc1d6fa236948d38fbc7724503433758a90a088b type=CONTAINER_CREATED_EVENT Sep 16 05:37:22.916960 containerd[1906]: time="2025-09-16T05:37:22.916902277Z" level=warning msg="container event discarded" container=fe58eada3eb54b61e30cafa3bc1d6fa236948d38fbc7724503433758a90a088b type=CONTAINER_STARTED_EVENT Sep 16 05:37:23.024620 containerd[1906]: time="2025-09-16T05:37:23.024439232Z" level=warning msg="container event discarded" container=7d5e0db4d64d76fbf8e6f0f394ebd9273e291c78a1d572494d190f7459553341 type=CONTAINER_CREATED_EVENT Sep 16 05:37:23.149753 containerd[1906]: time="2025-09-16T05:37:23.149607913Z" level=warning msg="container event discarded" container=7d5e0db4d64d76fbf8e6f0f394ebd9273e291c78a1d572494d190f7459553341 type=CONTAINER_STARTED_EVENT Sep 16 05:37:23.405434 containerd[1906]: time="2025-09-16T05:37:23.405285611Z" level=warning msg="container event discarded" container=c6d5e0c6cb0ae06398f29b6eec2f6d712f51ee1805b3bfe2ac636579ecea2c3f type=CONTAINER_CREATED_EVENT Sep 16 05:37:23.405434 containerd[1906]: time="2025-09-16T05:37:23.405406245Z" level=warning msg="container event discarded" container=c6d5e0c6cb0ae06398f29b6eec2f6d712f51ee1805b3bfe2ac636579ecea2c3f type=CONTAINER_STARTED_EVENT Sep 16 05:37:24.600622 containerd[1906]: time="2025-09-16T05:37:24.600598670Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b2e5e66fd9163c13cb38673c27958704982bded282944e80efa3e48e8590ab1d\" id:\"361bdb18ca2c5ebe57b72aae48c2386b86460c3d22d76637e2c13e120a5ef272\" pid:7502 exited_at:{seconds:1758001044 nanos:600433028}" Sep 16 05:37:24.903328 containerd[1906]: time="2025-09-16T05:37:24.903079592Z" level=warning msg="container event discarded" container=4bfbbf76ed52de3a495a7d0603093bdc770b284ead0a2df0b0cc4d361043808c type=CONTAINER_CREATED_EVENT Sep 16 05:37:24.936626 containerd[1906]: time="2025-09-16T05:37:24.936514208Z" level=warning msg="container event discarded" container=4bfbbf76ed52de3a495a7d0603093bdc770b284ead0a2df0b0cc4d361043808c type=CONTAINER_STARTED_EVENT Sep 16 05:37:30.773821 containerd[1906]: time="2025-09-16T05:37:30.773793684Z" level=info msg="TaskExit event in podsandbox handler container_id:\"062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714\" id:\"579d3336bc970b2d925967b050c36bb965d9aa34597990627700a60e975e4cc2\" pid:7538 exited_at:{seconds:1758001050 nanos:773639633}" Sep 16 05:37:31.751720 containerd[1906]: time="2025-09-16T05:37:31.751535441Z" level=warning msg="container event discarded" container=83fac48863741f707a45b1d8dcb4dba49e43d3568ef26622b1fa7640ca25dbab type=CONTAINER_CREATED_EVENT Sep 16 05:37:31.751720 containerd[1906]: time="2025-09-16T05:37:31.751696839Z" level=warning msg="container event discarded" container=83fac48863741f707a45b1d8dcb4dba49e43d3568ef26622b1fa7640ca25dbab type=CONTAINER_STARTED_EVENT Sep 16 
05:37:32.021633 containerd[1906]: time="2025-09-16T05:37:32.021334185Z" level=warning msg="container event discarded" container=960cd73907568744534ec1cd0c0a235be2b6c9addccffd645996c3ea621461c0 type=CONTAINER_CREATED_EVENT Sep 16 05:37:32.021633 containerd[1906]: time="2025-09-16T05:37:32.021459426Z" level=warning msg="container event discarded" container=960cd73907568744534ec1cd0c0a235be2b6c9addccffd645996c3ea621461c0 type=CONTAINER_STARTED_EVENT Sep 16 05:37:34.175531 containerd[1906]: time="2025-09-16T05:37:34.175391720Z" level=warning msg="container event discarded" container=2f7ab839926cc868ee0471368bd98e10f887629a641eab976753079abb904113 type=CONTAINER_CREATED_EVENT Sep 16 05:37:34.228941 containerd[1906]: time="2025-09-16T05:37:34.228790670Z" level=warning msg="container event discarded" container=2f7ab839926cc868ee0471368bd98e10f887629a641eab976753079abb904113 type=CONTAINER_STARTED_EVENT Sep 16 05:37:35.392730 containerd[1906]: time="2025-09-16T05:37:35.392708250Z" level=info msg="TaskExit event in podsandbox handler container_id:\"91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5\" id:\"577c07944b9767e4a0b89ad9c16a71af155d0c9b93d10e249fae124448b79f3e\" pid:7567 exited_at:{seconds:1758001055 nanos:392424172}" Sep 16 05:37:35.795432 containerd[1906]: time="2025-09-16T05:37:35.795146728Z" level=warning msg="container event discarded" container=411fa45dbe20ade38f441ee53b42d4253c15a893a43b099a57486747dfdb945e type=CONTAINER_CREATED_EVENT Sep 16 05:37:35.835850 containerd[1906]: time="2025-09-16T05:37:35.835699406Z" level=warning msg="container event discarded" container=411fa45dbe20ade38f441ee53b42d4253c15a893a43b099a57486747dfdb945e type=CONTAINER_STARTED_EVENT Sep 16 05:37:36.761342 containerd[1906]: time="2025-09-16T05:37:36.761198334Z" level=warning msg="container event discarded" container=411fa45dbe20ade38f441ee53b42d4253c15a893a43b099a57486747dfdb945e type=CONTAINER_STOPPED_EVENT Sep 16 05:37:36.823762 containerd[1906]: time="2025-09-16T05:37:36.823736293Z" level=info msg="TaskExit event in podsandbox handler container_id:\"91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5\" id:\"b066b547a1aa1216be448a0e124afe68bf79b6ed936dc7816b1c32fe98b80886\" pid:7600 exited_at:{seconds:1758001056 nanos:823520488}" Sep 16 05:37:40.671272 containerd[1906]: time="2025-09-16T05:37:40.671150296Z" level=warning msg="container event discarded" container=29650e239d3fe93346174a32631faa23c48ae0abe66cae655b047de290c12d96 type=CONTAINER_CREATED_EVENT Sep 16 05:37:40.712700 containerd[1906]: time="2025-09-16T05:37:40.712535252Z" level=warning msg="container event discarded" container=29650e239d3fe93346174a32631faa23c48ae0abe66cae655b047de290c12d96 type=CONTAINER_STARTED_EVENT Sep 16 05:37:41.713708 containerd[1906]: time="2025-09-16T05:37:41.713547359Z" level=warning msg="container event discarded" container=29650e239d3fe93346174a32631faa23c48ae0abe66cae655b047de290c12d96 type=CONTAINER_STOPPED_EVENT Sep 16 05:37:46.561174 containerd[1906]: time="2025-09-16T05:37:46.561124150Z" level=info msg="TaskExit event in podsandbox handler container_id:\"062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714\" id:\"b3cf8ddd03ec55143d80308ba2089b28971025163d2fdf6170d0b429a58a9458\" pid:7652 exited_at:{seconds:1758001066 nanos:561028204}" Sep 16 05:37:48.093324 containerd[1906]: time="2025-09-16T05:37:48.093279205Z" level=warning msg="container event discarded" container=b2e5e66fd9163c13cb38673c27958704982bded282944e80efa3e48e8590ab1d type=CONTAINER_CREATED_EVENT Sep 16 
05:37:48.173076 containerd[1906]: time="2025-09-16T05:37:48.172908993Z" level=warning msg="container event discarded" container=b2e5e66fd9163c13cb38673c27958704982bded282944e80efa3e48e8590ab1d type=CONTAINER_STARTED_EVENT Sep 16 05:37:49.204727 containerd[1906]: time="2025-09-16T05:37:49.204577021Z" level=warning msg="container event discarded" container=d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8 type=CONTAINER_CREATED_EVENT Sep 16 05:37:49.204727 containerd[1906]: time="2025-09-16T05:37:49.204665992Z" level=warning msg="container event discarded" container=d00108a16d01e1a2eb7f908eb3bbb76dd0f378c92f0617936e8661e80b56f5c8 type=CONTAINER_STARTED_EVENT Sep 16 05:37:50.851865 containerd[1906]: time="2025-09-16T05:37:50.851697784Z" level=warning msg="container event discarded" container=3aac2eed9c49e7f93b8d0c862cdf6b0ccabf35065150124a97585d26fe105b9d type=CONTAINER_CREATED_EVENT Sep 16 05:37:50.896339 containerd[1906]: time="2025-09-16T05:37:50.896193697Z" level=warning msg="container event discarded" container=3aac2eed9c49e7f93b8d0c862cdf6b0ccabf35065150124a97585d26fe105b9d type=CONTAINER_STARTED_EVENT Sep 16 05:37:53.474638 containerd[1906]: time="2025-09-16T05:37:53.474462514Z" level=warning msg="container event discarded" container=0d2ea5b2b24e250a4b2512e6640269828493295304b8ba5c7d3136d2999a394a type=CONTAINER_CREATED_EVENT Sep 16 05:37:53.534176 containerd[1906]: time="2025-09-16T05:37:53.534024139Z" level=warning msg="container event discarded" container=0d2ea5b2b24e250a4b2512e6640269828493295304b8ba5c7d3136d2999a394a type=CONTAINER_STARTED_EVENT Sep 16 05:37:53.637326 containerd[1906]: time="2025-09-16T05:37:53.637175259Z" level=warning msg="container event discarded" container=247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a type=CONTAINER_CREATED_EVENT Sep 16 05:37:53.637326 containerd[1906]: time="2025-09-16T05:37:53.637271398Z" level=warning msg="container event discarded" container=247838b7b255f6c883c55ce98922fef52d71dd78f555d86090fbc4b79e01bc3a type=CONTAINER_STARTED_EVENT Sep 16 05:37:53.788130 containerd[1906]: time="2025-09-16T05:37:53.787832578Z" level=warning msg="container event discarded" container=d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b type=CONTAINER_CREATED_EVENT Sep 16 05:37:53.788130 containerd[1906]: time="2025-09-16T05:37:53.787921034Z" level=warning msg="container event discarded" container=d4c18ccbe2628fb91da93595abacbd7efa3b062d5bf045729d7a66cef2491c1b type=CONTAINER_STARTED_EVENT Sep 16 05:37:53.836667 containerd[1906]: time="2025-09-16T05:37:53.836503462Z" level=warning msg="container event discarded" container=5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0 type=CONTAINER_CREATED_EVENT Sep 16 05:37:53.836667 containerd[1906]: time="2025-09-16T05:37:53.836610227Z" level=warning msg="container event discarded" container=5af9d075fae43a0d309138c9f6b75531cf5b2cd497961cbbb94c95193dc06af0 type=CONTAINER_STARTED_EVENT Sep 16 05:37:54.606320 containerd[1906]: time="2025-09-16T05:37:54.606271544Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b2e5e66fd9163c13cb38673c27958704982bded282944e80efa3e48e8590ab1d\" id:\"d072cd34f11eb4d8701fa91f30a15624788a5f50eb6830e004357a18f8bafe1b\" pid:7675 exited_at:{seconds:1758001074 nanos:605999625}" Sep 16 05:37:54.614317 containerd[1906]: time="2025-09-16T05:37:54.614158669Z" level=warning msg="container event discarded" container=95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d type=CONTAINER_CREATED_EVENT Sep 16 
05:37:54.614317 containerd[1906]: time="2025-09-16T05:37:54.614269066Z" level=warning msg="container event discarded" container=95034793c5fa51fec56da7617fdbb39aee68b21451931aa6b463a6654c07c15d type=CONTAINER_STARTED_EVENT Sep 16 05:37:54.614317 containerd[1906]: time="2025-09-16T05:37:54.614298084Z" level=warning msg="container event discarded" container=6c915ddf8c726d78508c6945de860c27a66d1109377139f058facdf393d57115 type=CONTAINER_CREATED_EVENT Sep 16 05:37:54.673923 containerd[1906]: time="2025-09-16T05:37:54.673754756Z" level=warning msg="container event discarded" container=6c915ddf8c726d78508c6945de860c27a66d1109377139f058facdf393d57115 type=CONTAINER_STARTED_EVENT Sep 16 05:37:55.622636 containerd[1906]: time="2025-09-16T05:37:55.622511529Z" level=warning msg="container event discarded" container=8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6 type=CONTAINER_CREATED_EVENT Sep 16 05:37:55.622636 containerd[1906]: time="2025-09-16T05:37:55.622611070Z" level=warning msg="container event discarded" container=8d0995f2e611f76eba3889f84c88b0420ec971498121533b68855e64e9a3f9c6 type=CONTAINER_STARTED_EVENT Sep 16 05:37:55.805208 containerd[1906]: time="2025-09-16T05:37:55.805057304Z" level=warning msg="container event discarded" container=b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1 type=CONTAINER_CREATED_EVENT Sep 16 05:37:55.805208 containerd[1906]: time="2025-09-16T05:37:55.805183641Z" level=warning msg="container event discarded" container=b4e6a1f6a3a2e39de6af3fe367b0d1fa5e592b3e111e8b090199d078b2f003e1 type=CONTAINER_STARTED_EVENT Sep 16 05:37:55.805596 containerd[1906]: time="2025-09-16T05:37:55.805226628Z" level=warning msg="container event discarded" container=c0e29b85aa94cf6bd26aecb92c6dac6accabd9f6e04c7fd0e2df00b04076284b type=CONTAINER_CREATED_EVENT Sep 16 05:37:55.843852 containerd[1906]: time="2025-09-16T05:37:55.843669072Z" level=warning msg="container event discarded" container=c0e29b85aa94cf6bd26aecb92c6dac6accabd9f6e04c7fd0e2df00b04076284b type=CONTAINER_STARTED_EVENT Sep 16 05:37:55.855236 containerd[1906]: time="2025-09-16T05:37:55.855137374Z" level=warning msg="container event discarded" container=2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3 type=CONTAINER_CREATED_EVENT Sep 16 05:37:55.855453 containerd[1906]: time="2025-09-16T05:37:55.855234225Z" level=warning msg="container event discarded" container=2997531787b6e3b7b371960bcd3c396047f3e61d44aff60749b03e90d41acff3 type=CONTAINER_STARTED_EVENT Sep 16 05:37:56.341717 containerd[1906]: time="2025-09-16T05:37:56.341539203Z" level=warning msg="container event discarded" container=7c74d6213b47a892210feba7326aa899c7db0c28fc7698e0f04e3657e9f6a4e0 type=CONTAINER_CREATED_EVENT Sep 16 05:37:56.401214 containerd[1906]: time="2025-09-16T05:37:56.401051615Z" level=warning msg="container event discarded" container=7c74d6213b47a892210feba7326aa899c7db0c28fc7698e0f04e3657e9f6a4e0 type=CONTAINER_STARTED_EVENT Sep 16 05:37:56.735298 containerd[1906]: time="2025-09-16T05:37:56.735073747Z" level=warning msg="container event discarded" container=70111634f7be182c48524bc34000a8b5349040cf8dc814c18568187f5bb919be type=CONTAINER_CREATED_EVENT Sep 16 05:37:56.786651 containerd[1906]: time="2025-09-16T05:37:56.786470353Z" level=warning msg="container event discarded" container=70111634f7be182c48524bc34000a8b5349040cf8dc814c18568187f5bb919be type=CONTAINER_STARTED_EVENT Sep 16 05:37:59.773268 containerd[1906]: time="2025-09-16T05:37:59.773136792Z" level=warning msg="container event discarded" 
container=062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714 type=CONTAINER_CREATED_EVENT Sep 16 05:37:59.827839 containerd[1906]: time="2025-09-16T05:37:59.827679563Z" level=warning msg="container event discarded" container=062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714 type=CONTAINER_STARTED_EVENT Sep 16 05:38:00.771283 containerd[1906]: time="2025-09-16T05:38:00.771254888Z" level=info msg="TaskExit event in podsandbox handler container_id:\"062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714\" id:\"3aa20660dd9c9a65bbbff7c2c3ab0ed0c76390bffa2aed193685024a91132643\" pid:7711 exited_at:{seconds:1758001080 nanos:771110686}" Sep 16 05:38:01.521267 containerd[1906]: time="2025-09-16T05:38:01.521098327Z" level=warning msg="container event discarded" container=ef01868d654efe394384c8458a30ca26ccb4dbec8a05fc7c552b0bfdffa8a584 type=CONTAINER_CREATED_EVENT Sep 16 05:38:01.564729 containerd[1906]: time="2025-09-16T05:38:01.564578439Z" level=warning msg="container event discarded" container=ef01868d654efe394384c8458a30ca26ccb4dbec8a05fc7c552b0bfdffa8a584 type=CONTAINER_STARTED_EVENT Sep 16 05:38:04.742098 containerd[1906]: time="2025-09-16T05:38:04.741917614Z" level=warning msg="container event discarded" container=91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5 type=CONTAINER_CREATED_EVENT Sep 16 05:38:04.794728 containerd[1906]: time="2025-09-16T05:38:04.794571115Z" level=warning msg="container event discarded" container=91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5 type=CONTAINER_STARTED_EVENT Sep 16 05:38:06.595212 containerd[1906]: time="2025-09-16T05:38:06.595024718Z" level=warning msg="container event discarded" container=d5ef683a2855614b05be77cb56c0877e26828e2e9d50e31bee898e09769b02dc type=CONTAINER_CREATED_EVENT Sep 16 05:38:06.633155 containerd[1906]: time="2025-09-16T05:38:06.632984630Z" level=warning msg="container event discarded" container=d5ef683a2855614b05be77cb56c0877e26828e2e9d50e31bee898e09769b02dc type=CONTAINER_STARTED_EVENT Sep 16 05:38:06.822371 containerd[1906]: time="2025-09-16T05:38:06.822313981Z" level=info msg="TaskExit event in podsandbox handler container_id:\"91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5\" id:\"b177185bfc320f2ed5250351271438ebd1712af9b051fa035ff73bef0781d55e\" pid:7732 exited_at:{seconds:1758001086 nanos:822137758}" Sep 16 05:38:24.619786 containerd[1906]: time="2025-09-16T05:38:24.619743171Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b2e5e66fd9163c13cb38673c27958704982bded282944e80efa3e48e8590ab1d\" id:\"848bb2017926b798489b4f57641fcc87d6b7ec7da1b04c9ef58bd8ff90930d96\" pid:7774 exited_at:{seconds:1758001104 nanos:619484375}" Sep 16 05:38:30.096109 systemd[1]: Started sshd@13-139.178.94.21:22-139.178.89.65:57852.service - OpenSSH per-connection server daemon (139.178.89.65:57852). Sep 16 05:38:30.129454 sshd[7801]: Accepted publickey for core from 139.178.89.65 port 57852 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:38:30.130122 sshd-session[7801]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:38:30.132792 systemd-logind[1896]: New session 12 of user core. Sep 16 05:38:30.156474 systemd[1]: Started session-12.scope - Session 12 of User core. 
Sep 16 05:38:30.300622 sshd[7804]: Connection closed by 139.178.89.65 port 57852 Sep 16 05:38:30.300781 sshd-session[7801]: pam_unix(sshd:session): session closed for user core Sep 16 05:38:30.302580 systemd[1]: sshd@13-139.178.94.21:22-139.178.89.65:57852.service: Deactivated successfully. Sep 16 05:38:30.303537 systemd[1]: session-12.scope: Deactivated successfully. Sep 16 05:38:30.304484 systemd-logind[1896]: Session 12 logged out. Waiting for processes to exit. Sep 16 05:38:30.304997 systemd-logind[1896]: Removed session 12. Sep 16 05:38:30.770835 containerd[1906]: time="2025-09-16T05:38:30.770804168Z" level=info msg="TaskExit event in podsandbox handler container_id:\"062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714\" id:\"8ca51331c4a43bed7728ab0eed5b333d59d7f1b36508edf1aadbf9a30fb826e0\" pid:7843 exited_at:{seconds:1758001110 nanos:770635797}" Sep 16 05:38:35.329636 systemd[1]: Started sshd@14-139.178.94.21:22-139.178.89.65:57864.service - OpenSSH per-connection server daemon (139.178.89.65:57864). Sep 16 05:38:35.423602 sshd[7854]: Accepted publickey for core from 139.178.89.65 port 57864 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:38:35.424500 sshd-session[7854]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:38:35.427259 systemd-logind[1896]: New session 13 of user core. Sep 16 05:38:35.429290 containerd[1906]: time="2025-09-16T05:38:35.429261542Z" level=info msg="TaskExit event in podsandbox handler container_id:\"91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5\" id:\"6b82599fd6659a5e61cb0fef51fcdc8187d2f898eb4d0051ed5d775f7ce15f39\" pid:7867 exited_at:{seconds:1758001115 nanos:429041482}" Sep 16 05:38:35.437154 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 16 05:38:35.538910 sshd[7890]: Connection closed by 139.178.89.65 port 57864 Sep 16 05:38:35.539107 sshd-session[7854]: pam_unix(sshd:session): session closed for user core Sep 16 05:38:35.541163 systemd[1]: sshd@14-139.178.94.21:22-139.178.89.65:57864.service: Deactivated successfully. Sep 16 05:38:35.542247 systemd[1]: session-13.scope: Deactivated successfully. Sep 16 05:38:35.543007 systemd-logind[1896]: Session 13 logged out. Waiting for processes to exit. Sep 16 05:38:35.543685 systemd-logind[1896]: Removed session 13. Sep 16 05:38:36.822930 containerd[1906]: time="2025-09-16T05:38:36.822905246Z" level=info msg="TaskExit event in podsandbox handler container_id:\"91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5\" id:\"96b1a310d3f2c864b67f73253c0305381b482744bd6e05f55b8b689101ba237f\" pid:7928 exited_at:{seconds:1758001116 nanos:822749613}" Sep 16 05:38:40.555525 systemd[1]: Started sshd@15-139.178.94.21:22-139.178.89.65:50428.service - OpenSSH per-connection server daemon (139.178.89.65:50428). Sep 16 05:38:40.588613 sshd[7953]: Accepted publickey for core from 139.178.89.65 port 50428 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:38:40.589302 sshd-session[7953]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:38:40.592354 systemd-logind[1896]: New session 14 of user core. Sep 16 05:38:40.605209 systemd[1]: Started session-14.scope - Session 14 of User core. 
Sep 16 05:38:40.691618 sshd[7956]: Connection closed by 139.178.89.65 port 50428 Sep 16 05:38:40.691821 sshd-session[7953]: pam_unix(sshd:session): session closed for user core Sep 16 05:38:40.706267 systemd[1]: sshd@15-139.178.94.21:22-139.178.89.65:50428.service: Deactivated successfully. Sep 16 05:38:40.707210 systemd[1]: session-14.scope: Deactivated successfully. Sep 16 05:38:40.707728 systemd-logind[1896]: Session 14 logged out. Waiting for processes to exit. Sep 16 05:38:40.709064 systemd[1]: Started sshd@16-139.178.94.21:22-139.178.89.65:50440.service - OpenSSH per-connection server daemon (139.178.89.65:50440). Sep 16 05:38:40.709437 systemd-logind[1896]: Removed session 14. Sep 16 05:38:40.742665 sshd[7982]: Accepted publickey for core from 139.178.89.65 port 50440 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:38:40.743306 sshd-session[7982]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:38:40.746161 systemd-logind[1896]: New session 15 of user core. Sep 16 05:38:40.757176 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 16 05:38:40.915751 sshd[7985]: Connection closed by 139.178.89.65 port 50440 Sep 16 05:38:40.915931 sshd-session[7982]: pam_unix(sshd:session): session closed for user core Sep 16 05:38:40.929822 systemd[1]: sshd@16-139.178.94.21:22-139.178.89.65:50440.service: Deactivated successfully. Sep 16 05:38:40.931055 systemd[1]: session-15.scope: Deactivated successfully. Sep 16 05:38:40.931667 systemd-logind[1896]: Session 15 logged out. Waiting for processes to exit. Sep 16 05:38:40.933269 systemd[1]: Started sshd@17-139.178.94.21:22-139.178.89.65:50454.service - OpenSSH per-connection server daemon (139.178.89.65:50454). Sep 16 05:38:40.933832 systemd-logind[1896]: Removed session 15. Sep 16 05:38:40.976439 sshd[8009]: Accepted publickey for core from 139.178.89.65 port 50454 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:38:40.977170 sshd-session[8009]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:38:40.980483 systemd-logind[1896]: New session 16 of user core. Sep 16 05:38:40.991169 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 16 05:38:41.123178 sshd[8013]: Connection closed by 139.178.89.65 port 50454 Sep 16 05:38:41.123347 sshd-session[8009]: pam_unix(sshd:session): session closed for user core Sep 16 05:38:41.124996 systemd[1]: sshd@17-139.178.94.21:22-139.178.89.65:50454.service: Deactivated successfully. Sep 16 05:38:41.125963 systemd[1]: session-16.scope: Deactivated successfully. Sep 16 05:38:41.126658 systemd-logind[1896]: Session 16 logged out. Waiting for processes to exit. Sep 16 05:38:41.127228 systemd-logind[1896]: Removed session 16. Sep 16 05:38:46.149242 systemd[1]: Started sshd@18-139.178.94.21:22-139.178.89.65:50458.service - OpenSSH per-connection server daemon (139.178.89.65:50458). Sep 16 05:38:46.209434 sshd[8042]: Accepted publickey for core from 139.178.89.65 port 50458 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:38:46.210063 sshd-session[8042]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:38:46.212759 systemd-logind[1896]: New session 17 of user core. Sep 16 05:38:46.234442 systemd[1]: Started session-17.scope - Session 17 of User core. 
Sep 16 05:38:46.327534 sshd[8045]: Connection closed by 139.178.89.65 port 50458 Sep 16 05:38:46.327748 sshd-session[8042]: pam_unix(sshd:session): session closed for user core Sep 16 05:38:46.329497 systemd[1]: sshd@18-139.178.94.21:22-139.178.89.65:50458.service: Deactivated successfully. Sep 16 05:38:46.330473 systemd[1]: session-17.scope: Deactivated successfully. Sep 16 05:38:46.331140 systemd-logind[1896]: Session 17 logged out. Waiting for processes to exit. Sep 16 05:38:46.331753 systemd-logind[1896]: Removed session 17. Sep 16 05:38:46.555233 containerd[1906]: time="2025-09-16T05:38:46.555203960Z" level=info msg="TaskExit event in podsandbox handler container_id:\"062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714\" id:\"cdec0a8d75c4a4d98451028d8dc5f0c5d3e988938a2d463c22bc4e49885a0360\" pid:8081 exited_at:{seconds:1758001126 nanos:555020537}" Sep 16 05:38:51.346967 systemd[1]: Started sshd@19-139.178.94.21:22-139.178.89.65:50388.service - OpenSSH per-connection server daemon (139.178.89.65:50388). Sep 16 05:38:51.417097 sshd[8094]: Accepted publickey for core from 139.178.89.65 port 50388 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:38:51.418352 sshd-session[8094]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:38:51.423450 systemd-logind[1896]: New session 18 of user core. Sep 16 05:38:51.434252 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 16 05:38:51.525436 sshd[8097]: Connection closed by 139.178.89.65 port 50388 Sep 16 05:38:51.525603 sshd-session[8094]: pam_unix(sshd:session): session closed for user core Sep 16 05:38:51.527440 systemd[1]: sshd@19-139.178.94.21:22-139.178.89.65:50388.service: Deactivated successfully. Sep 16 05:38:51.528410 systemd[1]: session-18.scope: Deactivated successfully. Sep 16 05:38:51.529419 systemd-logind[1896]: Session 18 logged out. Waiting for processes to exit. Sep 16 05:38:51.529999 systemd-logind[1896]: Removed session 18. Sep 16 05:38:54.662719 containerd[1906]: time="2025-09-16T05:38:54.662688477Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b2e5e66fd9163c13cb38673c27958704982bded282944e80efa3e48e8590ab1d\" id:\"b94b08b62844808be36a6e622f446998c077098c3567cbf67eef9ef45a906f9d\" pid:8134 exited_at:{seconds:1758001134 nanos:662350739}" Sep 16 05:38:56.548979 systemd[1]: Started sshd@20-139.178.94.21:22-139.178.89.65:50398.service - OpenSSH per-connection server daemon (139.178.89.65:50398). Sep 16 05:38:56.590428 sshd[8160]: Accepted publickey for core from 139.178.89.65 port 50398 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:38:56.591153 sshd-session[8160]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:38:56.593945 systemd-logind[1896]: New session 19 of user core. Sep 16 05:38:56.612273 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 16 05:38:56.701240 sshd[8163]: Connection closed by 139.178.89.65 port 50398 Sep 16 05:38:56.701433 sshd-session[8160]: pam_unix(sshd:session): session closed for user core Sep 16 05:38:56.703362 systemd[1]: sshd@20-139.178.94.21:22-139.178.89.65:50398.service: Deactivated successfully. Sep 16 05:38:56.704382 systemd[1]: session-19.scope: Deactivated successfully. Sep 16 05:38:56.705387 systemd-logind[1896]: Session 19 logged out. Waiting for processes to exit. Sep 16 05:38:56.705959 systemd-logind[1896]: Removed session 19. 
Sep 16 05:39:00.729363 containerd[1906]: time="2025-09-16T05:39:00.729331134Z" level=info msg="TaskExit event in podsandbox handler container_id:\"062d1cc6533f7228c8e950dd46923827138aea5613bffc3751b31dda41834714\" id:\"8423dff9e295bc4d5b4f377d5bad29a4134a1f4bde6698dbf92878468bbfeef4\" pid:8198 exited_at:{seconds:1758001140 nanos:729179516}" Sep 16 05:39:01.722044 systemd[1]: Started sshd@21-139.178.94.21:22-139.178.89.65:56550.service - OpenSSH per-connection server daemon (139.178.89.65:56550). Sep 16 05:39:01.769297 sshd[8210]: Accepted publickey for core from 139.178.89.65 port 56550 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:39:01.770079 sshd-session[8210]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:39:01.773367 systemd-logind[1896]: New session 20 of user core. Sep 16 05:39:01.793258 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 16 05:39:01.880971 sshd[8213]: Connection closed by 139.178.89.65 port 56550 Sep 16 05:39:01.881196 sshd-session[8210]: pam_unix(sshd:session): session closed for user core Sep 16 05:39:01.893438 systemd[1]: sshd@21-139.178.94.21:22-139.178.89.65:56550.service: Deactivated successfully. Sep 16 05:39:01.894484 systemd[1]: session-20.scope: Deactivated successfully. Sep 16 05:39:01.895016 systemd-logind[1896]: Session 20 logged out. Waiting for processes to exit. Sep 16 05:39:01.896430 systemd[1]: Started sshd@22-139.178.94.21:22-139.178.89.65:56560.service - OpenSSH per-connection server daemon (139.178.89.65:56560). Sep 16 05:39:01.896955 systemd-logind[1896]: Removed session 20. Sep 16 05:39:01.954174 sshd[8238]: Accepted publickey for core from 139.178.89.65 port 56560 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:39:01.955844 sshd-session[8238]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:39:01.962622 systemd-logind[1896]: New session 21 of user core. Sep 16 05:39:01.973497 systemd[1]: Started session-21.scope - Session 21 of User core. Sep 16 05:39:02.121432 sshd[8241]: Connection closed by 139.178.89.65 port 56560 Sep 16 05:39:02.121644 sshd-session[8238]: pam_unix(sshd:session): session closed for user core Sep 16 05:39:02.144555 systemd[1]: sshd@22-139.178.94.21:22-139.178.89.65:56560.service: Deactivated successfully. Sep 16 05:39:02.146124 systemd[1]: session-21.scope: Deactivated successfully. Sep 16 05:39:02.147002 systemd-logind[1896]: Session 21 logged out. Waiting for processes to exit. Sep 16 05:39:02.149395 systemd[1]: Started sshd@23-139.178.94.21:22-139.178.89.65:56562.service - OpenSSH per-connection server daemon (139.178.89.65:56562). Sep 16 05:39:02.150327 systemd-logind[1896]: Removed session 21. Sep 16 05:39:02.249057 sshd[8264]: Accepted publickey for core from 139.178.89.65 port 56562 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:39:02.252317 sshd-session[8264]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:39:02.265158 systemd-logind[1896]: New session 22 of user core. Sep 16 05:39:02.274450 systemd[1]: Started session-22.scope - Session 22 of User core. Sep 16 05:39:03.026395 sshd[8267]: Connection closed by 139.178.89.65 port 56562 Sep 16 05:39:03.026589 sshd-session[8264]: pam_unix(sshd:session): session closed for user core Sep 16 05:39:03.039664 systemd[1]: sshd@23-139.178.94.21:22-139.178.89.65:56562.service: Deactivated successfully. 
Sep 16 05:39:03.041155 systemd[1]: session-22.scope: Deactivated successfully. Sep 16 05:39:03.041795 systemd-logind[1896]: Session 22 logged out. Waiting for processes to exit. Sep 16 05:39:03.043824 systemd[1]: Started sshd@24-139.178.94.21:22-139.178.89.65:56568.service - OpenSSH per-connection server daemon (139.178.89.65:56568). Sep 16 05:39:03.044425 systemd-logind[1896]: Removed session 22. Sep 16 05:39:03.103999 sshd[8296]: Accepted publickey for core from 139.178.89.65 port 56568 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:39:03.105534 sshd-session[8296]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:39:03.110188 systemd-logind[1896]: New session 23 of user core. Sep 16 05:39:03.126197 systemd[1]: Started session-23.scope - Session 23 of User core. Sep 16 05:39:03.272374 sshd[8303]: Connection closed by 139.178.89.65 port 56568 Sep 16 05:39:03.272550 sshd-session[8296]: pam_unix(sshd:session): session closed for user core Sep 16 05:39:03.290090 systemd[1]: sshd@24-139.178.94.21:22-139.178.89.65:56568.service: Deactivated successfully. Sep 16 05:39:03.290981 systemd[1]: session-23.scope: Deactivated successfully. Sep 16 05:39:03.291438 systemd-logind[1896]: Session 23 logged out. Waiting for processes to exit. Sep 16 05:39:03.292690 systemd[1]: Started sshd@25-139.178.94.21:22-139.178.89.65:56584.service - OpenSSH per-connection server daemon (139.178.89.65:56584). Sep 16 05:39:03.293026 systemd-logind[1896]: Removed session 23. Sep 16 05:39:03.324793 sshd[8326]: Accepted publickey for core from 139.178.89.65 port 56584 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:39:03.325554 sshd-session[8326]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:39:03.328473 systemd-logind[1896]: New session 24 of user core. Sep 16 05:39:03.344213 systemd[1]: Started session-24.scope - Session 24 of User core. Sep 16 05:39:03.423033 sshd[8330]: Connection closed by 139.178.89.65 port 56584 Sep 16 05:39:03.423232 sshd-session[8326]: pam_unix(sshd:session): session closed for user core Sep 16 05:39:03.424883 systemd[1]: sshd@25-139.178.94.21:22-139.178.89.65:56584.service: Deactivated successfully. Sep 16 05:39:03.425846 systemd[1]: session-24.scope: Deactivated successfully. Sep 16 05:39:03.426580 systemd-logind[1896]: Session 24 logged out. Waiting for processes to exit. Sep 16 05:39:03.427373 systemd-logind[1896]: Removed session 24. Sep 16 05:39:06.818977 containerd[1906]: time="2025-09-16T05:39:06.818953420Z" level=info msg="TaskExit event in podsandbox handler container_id:\"91b1ba89fa51178e241f28f68aec30a434abe1288c4cbee159b2841df24e49b5\" id:\"f68d3d3b500cb64f44ee658ef88d87b786d522b3216ff292678f25f87acd32c7\" pid:8376 exited_at:{seconds:1758001146 nanos:818782870}" Sep 16 05:39:08.439972 systemd[1]: Started sshd@26-139.178.94.21:22-139.178.89.65:56590.service - OpenSSH per-connection server daemon (139.178.89.65:56590). Sep 16 05:39:08.518517 sshd[8397]: Accepted publickey for core from 139.178.89.65 port 56590 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:39:08.519880 sshd-session[8397]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:39:08.525362 systemd-logind[1896]: New session 25 of user core. Sep 16 05:39:08.542279 systemd[1]: Started session-25.scope - Session 25 of User core. 
Sep 16 05:39:08.630959 sshd[8400]: Connection closed by 139.178.89.65 port 56590 Sep 16 05:39:08.631147 sshd-session[8397]: pam_unix(sshd:session): session closed for user core Sep 16 05:39:08.633105 systemd[1]: sshd@26-139.178.94.21:22-139.178.89.65:56590.service: Deactivated successfully. Sep 16 05:39:08.634232 systemd[1]: session-25.scope: Deactivated successfully. Sep 16 05:39:08.634978 systemd-logind[1896]: Session 25 logged out. Waiting for processes to exit. Sep 16 05:39:08.635763 systemd-logind[1896]: Removed session 25. Sep 16 05:39:13.661085 systemd[1]: Started sshd@27-139.178.94.21:22-139.178.89.65:33060.service - OpenSSH per-connection server daemon (139.178.89.65:33060). Sep 16 05:39:13.732319 sshd[8427]: Accepted publickey for core from 139.178.89.65 port 33060 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:39:13.732908 sshd-session[8427]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:39:13.735701 systemd-logind[1896]: New session 26 of user core. Sep 16 05:39:13.752168 systemd[1]: Started session-26.scope - Session 26 of User core. Sep 16 05:39:13.839061 sshd[8430]: Connection closed by 139.178.89.65 port 33060 Sep 16 05:39:13.839254 sshd-session[8427]: pam_unix(sshd:session): session closed for user core Sep 16 05:39:13.841114 systemd[1]: sshd@27-139.178.94.21:22-139.178.89.65:33060.service: Deactivated successfully. Sep 16 05:39:13.842132 systemd[1]: session-26.scope: Deactivated successfully. Sep 16 05:39:13.842996 systemd-logind[1896]: Session 26 logged out. Waiting for processes to exit. Sep 16 05:39:13.843707 systemd-logind[1896]: Removed session 26. Sep 16 05:39:18.851879 systemd[1]: Started sshd@28-139.178.94.21:22-139.178.89.65:33072.service - OpenSSH per-connection server daemon (139.178.89.65:33072). Sep 16 05:39:18.896994 sshd[8457]: Accepted publickey for core from 139.178.89.65 port 33072 ssh2: RSA SHA256:OpNGT073RtXqTCMRfzQHu7KC88oHgqXnBSLfiBitbzw Sep 16 05:39:18.897921 sshd-session[8457]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 05:39:18.901350 systemd-logind[1896]: New session 27 of user core. Sep 16 05:39:18.927266 systemd[1]: Started session-27.scope - Session 27 of User core. Sep 16 05:39:19.007579 sshd[8460]: Connection closed by 139.178.89.65 port 33072 Sep 16 05:39:19.007748 sshd-session[8457]: pam_unix(sshd:session): session closed for user core Sep 16 05:39:19.009625 systemd[1]: sshd@28-139.178.94.21:22-139.178.89.65:33072.service: Deactivated successfully. Sep 16 05:39:19.010782 systemd[1]: session-27.scope: Deactivated successfully. Sep 16 05:39:19.011673 systemd-logind[1896]: Session 27 logged out. Waiting for processes to exit. Sep 16 05:39:19.012340 systemd-logind[1896]: Removed session 27.