Mar 25 02:36:39.498831 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Mon Mar 24 23:38:35 -00 2025
Mar 25 02:36:39.498846 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=e7a00b7ee8d97e8d255663e9d3fa92277da8316702fb7f6d664fd7b137c307e9
Mar 25 02:36:39.498852 kernel: BIOS-provided physical RAM map:
Mar 25 02:36:39.498857 kernel: BIOS-e820: [mem 0x0000000000000000-0x00000000000997ff] usable
Mar 25 02:36:39.498861 kernel: BIOS-e820: [mem 0x0000000000099800-0x000000000009ffff] reserved
Mar 25 02:36:39.498865 kernel: BIOS-e820: [mem 0x00000000000e0000-0x00000000000fffff] reserved
Mar 25 02:36:39.498870 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000003fffffff] usable
Mar 25 02:36:39.498875 kernel: BIOS-e820: [mem 0x0000000040000000-0x00000000403fffff] reserved
Mar 25 02:36:39.498879 kernel: BIOS-e820: [mem 0x0000000040400000-0x0000000081b2afff] usable
Mar 25 02:36:39.498883 kernel: BIOS-e820: [mem 0x0000000081b2b000-0x0000000081b2bfff] ACPI NVS
Mar 25 02:36:39.498887 kernel: BIOS-e820: [mem 0x0000000081b2c000-0x0000000081b2cfff] reserved
Mar 25 02:36:39.498892 kernel: BIOS-e820: [mem 0x0000000081b2d000-0x000000008afccfff] usable
Mar 25 02:36:39.498897 kernel: BIOS-e820: [mem 0x000000008afcd000-0x000000008c0b1fff] reserved
Mar 25 02:36:39.498901 kernel: BIOS-e820: [mem 0x000000008c0b2000-0x000000008c23afff] usable
Mar 25 02:36:39.498907 kernel: BIOS-e820: [mem 0x000000008c23b000-0x000000008c66cfff] ACPI NVS
Mar 25 02:36:39.498912 kernel: BIOS-e820: [mem 0x000000008c66d000-0x000000008eefefff] reserved
Mar 25 02:36:39.498916 kernel: BIOS-e820: [mem 0x000000008eeff000-0x000000008eefffff] usable
Mar 25 02:36:39.498922 kernel: BIOS-e820: [mem 0x000000008ef00000-0x000000008fffffff] reserved
Mar 25 02:36:39.498927 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Mar 25 02:36:39.498931 kernel: BIOS-e820: [mem 0x00000000fe000000-0x00000000fe010fff] reserved
Mar 25 02:36:39.498936 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec00fff] reserved
Mar 25 02:36:39.498941 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Mar 25 02:36:39.498946 kernel: BIOS-e820: [mem 0x00000000ff000000-0x00000000ffffffff] reserved
Mar 25 02:36:39.498950 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000086effffff] usable
Mar 25 02:36:39.498955 kernel: NX (Execute Disable) protection: active
Mar 25 02:36:39.498960 kernel: APIC: Static calls initialized
Mar 25 02:36:39.498965 kernel: SMBIOS 3.2.1 present.
Mar 25 02:36:39.498969 kernel: DMI: Supermicro X11SCM-F/X11SCM-F, BIOS 1.9 09/16/2022
Mar 25 02:36:39.498975 kernel: tsc: Detected 3400.000 MHz processor
Mar 25 02:36:39.498980 kernel: tsc: Detected 3399.906 MHz TSC
Mar 25 02:36:39.498985 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 25 02:36:39.498990 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 25 02:36:39.498995 kernel: last_pfn = 0x86f000 max_arch_pfn = 0x400000000
Mar 25 02:36:39.499000 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 23), built from 10 variable MTRRs
Mar 25 02:36:39.499005 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 25 02:36:39.499010 kernel: last_pfn = 0x8ef00 max_arch_pfn = 0x400000000
Mar 25 02:36:39.499014 kernel: Using GB pages for direct mapping
Mar 25 02:36:39.499019 kernel: ACPI: Early table checksum verification disabled
Mar 25 02:36:39.499025 kernel: ACPI: RSDP 0x00000000000F05B0 000024 (v02 SUPERM)
Mar 25 02:36:39.499030 kernel: ACPI: XSDT 0x000000008C54E0C8 00010C (v01 SUPERM SUPERM 01072009 AMI 00010013)
Mar 25 02:36:39.499037 kernel: ACPI: FACP 0x000000008C58A670 000114 (v06 01072009 AMI 00010013)
Mar 25 02:36:39.499042 kernel: ACPI: DSDT 0x000000008C54E268 03C404 (v02 SUPERM SMCI--MB 01072009 INTL 20160527)
Mar 25 02:36:39.499047 kernel: ACPI: FACS 0x000000008C66CF80 000040
Mar 25 02:36:39.499053 kernel: ACPI: APIC 0x000000008C58A788 00012C (v04 01072009 AMI 00010013)
Mar 25 02:36:39.499059 kernel: ACPI: FPDT 0x000000008C58A8B8 000044 (v01 01072009 AMI 00010013)
Mar 25 02:36:39.499064 kernel: ACPI: FIDT 0x000000008C58A900 00009C (v01 SUPERM SMCI--MB 01072009 AMI 00010013)
Mar 25 02:36:39.499069 kernel: ACPI: MCFG 0x000000008C58A9A0 00003C (v01 SUPERM SMCI--MB 01072009 MSFT 00000097)
Mar 25 02:36:39.499074 kernel: ACPI: SPMI 0x000000008C58A9E0 000041 (v05 SUPERM SMCI--MB 00000000 AMI. 00000000)
Mar 25 02:36:39.499079 kernel: ACPI: SSDT 0x000000008C58AA28 001B1C (v02 CpuRef CpuSsdt 00003000 INTL 20160527)
Mar 25 02:36:39.499085 kernel: ACPI: SSDT 0x000000008C58C548 0031C6 (v02 SaSsdt SaSsdt 00003000 INTL 20160527)
Mar 25 02:36:39.499090 kernel: ACPI: SSDT 0x000000008C58F710 00232B (v02 PegSsd PegSsdt 00001000 INTL 20160527)
Mar 25 02:36:39.499095 kernel: ACPI: HPET 0x000000008C591A40 000038 (v01 SUPERM SMCI--MB 00000002 01000013)
Mar 25 02:36:39.499101 kernel: ACPI: SSDT 0x000000008C591A78 000FAE (v02 SUPERM Ther_Rvp 00001000 INTL 20160527)
Mar 25 02:36:39.499106 kernel: ACPI: SSDT 0x000000008C592A28 0008F4 (v02 INTEL xh_mossb 00000000 INTL 20160527)
Mar 25 02:36:39.499111 kernel: ACPI: UEFI 0x000000008C593320 000042 (v01 SUPERM SMCI--MB 00000002 01000013)
Mar 25 02:36:39.499116 kernel: ACPI: LPIT 0x000000008C593368 000094 (v01 SUPERM SMCI--MB 00000002 01000013)
Mar 25 02:36:39.499122 kernel: ACPI: SSDT 0x000000008C593400 0027DE (v02 SUPERM PtidDevc 00001000 INTL 20160527)
Mar 25 02:36:39.499127 kernel: ACPI: SSDT 0x000000008C595BE0 0014E2 (v02 SUPERM TbtTypeC 00000000 INTL 20160527)
Mar 25 02:36:39.499132 kernel: ACPI: DBGP 0x000000008C5970C8 000034 (v01 SUPERM SMCI--MB 00000002 01000013)
Mar 25 02:36:39.499137 kernel: ACPI: DBG2 0x000000008C597100 000054 (v00 SUPERM SMCI--MB 00000002 01000013)
Mar 25 02:36:39.499143 kernel: ACPI: SSDT 0x000000008C597158 001B67 (v02 SUPERM UsbCTabl 00001000 INTL 20160527)
Mar 25 02:36:39.499148 kernel: ACPI: DMAR 0x000000008C598CC0 000070 (v01 INTEL EDK2 00000002 01000013)
Mar 25 02:36:39.499154 kernel: ACPI: SSDT 0x000000008C598D30 000144 (v02 Intel ADebTabl 00001000 INTL 20160527)
Mar 25 02:36:39.499159 kernel: ACPI: TPM2 0x000000008C598E78 000034 (v04 SUPERM SMCI--MB 00000001 AMI 00000000)
Mar 25 02:36:39.499164 kernel: ACPI: SSDT 0x000000008C598EB0 000D8F (v02 INTEL SpsNm 00000002 INTL 20160527)
Mar 25 02:36:39.499169 kernel: ACPI: WSMT 0x000000008C599C40 000028 (v01 SUPERM 01072009 AMI 00010013)
Mar 25 02:36:39.499174 kernel: ACPI: EINJ 0x000000008C599C68 000130 (v01 AMI AMI.EINJ 00000000 AMI. 00000000)
Mar 25 02:36:39.499179 kernel: ACPI: ERST 0x000000008C599D98 000230 (v01 AMIER AMI.ERST 00000000 AMI. 00000000)
Mar 25 02:36:39.499184 kernel: ACPI: BERT 0x000000008C599FC8 000030 (v01 AMI AMI.BERT 00000000 AMI. 00000000)
Mar 25 02:36:39.499190 kernel: ACPI: HEST 0x000000008C599FF8 00027C (v01 AMI AMI.HEST 00000000 AMI. 00000000)
Mar 25 02:36:39.499196 kernel: ACPI: SSDT 0x000000008C59A278 000162 (v01 SUPERM SMCCDN 00000000 INTL 20181221)
Mar 25 02:36:39.499201 kernel: ACPI: Reserving FACP table memory at [mem 0x8c58a670-0x8c58a783]
Mar 25 02:36:39.499206 kernel: ACPI: Reserving DSDT table memory at [mem 0x8c54e268-0x8c58a66b]
Mar 25 02:36:39.499211 kernel: ACPI: Reserving FACS table memory at [mem 0x8c66cf80-0x8c66cfbf]
Mar 25 02:36:39.499216 kernel: ACPI: Reserving APIC table memory at [mem 0x8c58a788-0x8c58a8b3]
Mar 25 02:36:39.499221 kernel: ACPI: Reserving FPDT table memory at [mem 0x8c58a8b8-0x8c58a8fb]
Mar 25 02:36:39.499226 kernel: ACPI: Reserving FIDT table memory at [mem 0x8c58a900-0x8c58a99b]
Mar 25 02:36:39.499232 kernel: ACPI: Reserving MCFG table memory at [mem 0x8c58a9a0-0x8c58a9db]
Mar 25 02:36:39.499238 kernel: ACPI: Reserving SPMI table memory at [mem 0x8c58a9e0-0x8c58aa20]
Mar 25 02:36:39.499243 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58aa28-0x8c58c543]
Mar 25 02:36:39.499248 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58c548-0x8c58f70d]
Mar 25 02:36:39.499253 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58f710-0x8c591a3a]
Mar 25 02:36:39.499258 kernel: ACPI: Reserving HPET table memory at [mem 0x8c591a40-0x8c591a77]
Mar 25 02:36:39.499263 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c591a78-0x8c592a25]
Mar 25 02:36:39.499268 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c592a28-0x8c59331b]
Mar 25 02:36:39.499273 kernel: ACPI: Reserving UEFI table memory at [mem 0x8c593320-0x8c593361]
Mar 25 02:36:39.499278 kernel: ACPI: Reserving LPIT table memory at [mem 0x8c593368-0x8c5933fb]
Mar 25 02:36:39.499284 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c593400-0x8c595bdd]
Mar 25 02:36:39.499289 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c595be0-0x8c5970c1]
Mar 25 02:36:39.499295 kernel: ACPI: Reserving DBGP table memory at [mem 0x8c5970c8-0x8c5970fb]
Mar 25 02:36:39.499300 kernel: ACPI: Reserving DBG2 table memory at [mem 0x8c597100-0x8c597153]
Mar 25 02:36:39.499305 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c597158-0x8c598cbe]
Mar 25 02:36:39.499310 kernel: ACPI: Reserving DMAR table memory at [mem 0x8c598cc0-0x8c598d2f]
Mar 25 02:36:39.499315 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c598d30-0x8c598e73]
Mar 25 02:36:39.499320 kernel: ACPI: Reserving TPM2 table memory at [mem 0x8c598e78-0x8c598eab]
Mar 25 02:36:39.499325 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c598eb0-0x8c599c3e]
Mar 25 02:36:39.499331 kernel: ACPI: Reserving WSMT table memory at [mem 0x8c599c40-0x8c599c67]
Mar 25 02:36:39.499336 kernel: ACPI: Reserving EINJ table memory at [mem 0x8c599c68-0x8c599d97]
Mar 25 02:36:39.499344 kernel: ACPI: Reserving ERST table memory at [mem 0x8c599d98-0x8c599fc7]
Mar 25 02:36:39.499350 kernel: ACPI: Reserving BERT table memory at [mem 0x8c599fc8-0x8c599ff7]
Mar 25 02:36:39.499355 kernel: ACPI: Reserving HEST table memory at [mem 0x8c599ff8-0x8c59a273]
Mar 25 02:36:39.499360 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c59a278-0x8c59a3d9]
Mar 25 02:36:39.499383 kernel: No NUMA configuration found
Mar 25 02:36:39.499392 kernel: Faking a node at [mem 0x0000000000000000-0x000000086effffff]
Mar 25 02:36:39.499398 kernel: NODE_DATA(0) allocated [mem 0x86effa000-0x86effffff]
Mar 25 02:36:39.499405 kernel: Zone ranges:
Mar 25 02:36:39.499411 kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Mar 25 02:36:39.499416 kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Mar 25 02:36:39.499421 kernel:   Normal   [mem 0x0000000100000000-0x000000086effffff]
Mar 25 02:36:39.499427 kernel: Movable zone start for each node
Mar 25 02:36:39.499452 kernel: Early memory node ranges
Mar 25 02:36:39.499476 kernel:   node   0: [mem 0x0000000000001000-0x0000000000098fff]
Mar 25 02:36:39.499482 kernel:   node   0: [mem 0x0000000000100000-0x000000003fffffff]
Mar 25 02:36:39.499487 kernel:   node   0: [mem 0x0000000040400000-0x0000000081b2afff]
Mar 25 02:36:39.499514 kernel:   node   0: [mem 0x0000000081b2d000-0x000000008afccfff]
Mar 25 02:36:39.499522 kernel:   node   0: [mem 0x000000008c0b2000-0x000000008c23afff]
Mar 25 02:36:39.499542 kernel:   node   0: [mem 0x000000008eeff000-0x000000008eefffff]
Mar 25 02:36:39.499548 kernel:   node   0: [mem 0x0000000100000000-0x000000086effffff]
Mar 25 02:36:39.499556 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000086effffff]
Mar 25 02:36:39.499578 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 25 02:36:39.499600 kernel: On node 0, zone DMA: 103 pages in unavailable ranges
Mar 25 02:36:39.499624 kernel: On node 0, zone DMA32: 1024 pages in unavailable ranges
Mar 25 02:36:39.499655 kernel: On node 0, zone DMA32: 2 pages in unavailable ranges
Mar 25 02:36:39.499661 kernel: On node 0, zone DMA32: 4325 pages in unavailable ranges
Mar 25 02:36:39.499683 kernel: On node 0, zone DMA32: 11460 pages in unavailable ranges
Mar 25 02:36:39.499688 kernel: On node 0, zone Normal: 4352 pages in unavailable ranges
Mar 25 02:36:39.499694 kernel: On node 0, zone Normal: 4096 pages in unavailable ranges
Mar 25 02:36:39.499699 kernel: ACPI: PM-Timer IO Port: 0x1808
Mar 25 02:36:39.499705 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Mar 25 02:36:39.499710 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Mar 25 02:36:39.499716 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Mar 25 02:36:39.499722 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Mar 25 02:36:39.499728 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Mar 25 02:36:39.499733 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Mar 25 02:36:39.499738 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Mar 25 02:36:39.499744 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Mar 25 02:36:39.499749 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Mar 25 02:36:39.499754 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Mar 25 02:36:39.499760 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Mar 25 02:36:39.499765 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Mar 25 02:36:39.499771 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Mar 25 02:36:39.499777 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Mar 25 02:36:39.499782 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Mar 25 02:36:39.499788 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Mar 25 02:36:39.499793 kernel: IOAPIC[0]: apic_id 2, version 32, address 0xfec00000, GSI 0-119
Mar 25 02:36:39.499799 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Mar 25 02:36:39.499804 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 25 02:36:39.499810 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 25 02:36:39.499815 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Mar 25 02:36:39.499821 kernel: TSC deadline timer available
Mar 25 02:36:39.499827 kernel: smpboot: Allowing 16 CPUs, 0 hotplug CPUs
Mar 25 02:36:39.499833 kernel: [mem 0x90000000-0xdfffffff] available for PCI devices
Mar 25 02:36:39.499838 kernel: Booting paravirtualized kernel on bare hardware
Mar 25 02:36:39.499844 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 25 02:36:39.499850 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1
Mar 25 02:36:39.499855 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144
Mar 25 02:36:39.499861 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152
Mar 25 02:36:39.499866 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15
Mar 25 02:36:39.499872 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=e7a00b7ee8d97e8d255663e9d3fa92277da8316702fb7f6d664fd7b137c307e9
Mar 25 02:36:39.499879 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Mar 25 02:36:39.499885 kernel: random: crng init done
Mar 25 02:36:39.499890 kernel: Dentry cache hash table entries: 4194304 (order: 13, 33554432 bytes, linear)
Mar 25 02:36:39.499896 kernel: Inode-cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Mar 25 02:36:39.499901 kernel: Fallback order for Node 0: 0
Mar 25 02:36:39.499907 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 8232415
Mar 25 02:36:39.499912 kernel: Policy zone: Normal
Mar 25 02:36:39.499917 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 25 02:36:39.499924 kernel: software IO TLB: area num 16.
Mar 25 02:36:39.499930 kernel: Memory: 32716212K/33452980K available (14336K kernel code, 2304K rwdata, 25060K rodata, 43592K init, 1472K bss, 736508K reserved, 0K cma-reserved)
Mar 25 02:36:39.499935 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1
Mar 25 02:36:39.499941 kernel: ftrace: allocating 37985 entries in 149 pages
Mar 25 02:36:39.499946 kernel: ftrace: allocated 149 pages with 4 groups
Mar 25 02:36:39.499952 kernel: Dynamic Preempt: voluntary
Mar 25 02:36:39.499957 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 25 02:36:39.499963 kernel: rcu: RCU event tracing is enabled.
Mar 25 02:36:39.499969 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16.
Mar 25 02:36:39.499975 kernel: Trampoline variant of Tasks RCU enabled.
Mar 25 02:36:39.499981 kernel: Rude variant of Tasks RCU enabled.
Mar 25 02:36:39.499987 kernel: Tracing variant of Tasks RCU enabled.
Mar 25 02:36:39.499992 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 25 02:36:39.499997 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16
Mar 25 02:36:39.500003 kernel: NR_IRQS: 33024, nr_irqs: 2184, preallocated irqs: 16
Mar 25 02:36:39.500008 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 25 02:36:39.500014 kernel: Console: colour VGA+ 80x25
Mar 25 02:36:39.500019 kernel: printk: console [tty0] enabled
Mar 25 02:36:39.500025 kernel: printk: console [ttyS1] enabled
Mar 25 02:36:39.500031 kernel: ACPI: Core revision 20230628
Mar 25 02:36:39.500037 kernel: hpet: HPET dysfunctional in PC10. Force disabled.
Mar 25 02:36:39.500042 kernel: APIC: Switch to symmetric I/O mode setup
Mar 25 02:36:39.500047 kernel: DMAR: Host address width 39
Mar 25 02:36:39.500053 kernel: DMAR: DRHD base: 0x000000fed91000 flags: 0x1
Mar 25 02:36:39.500059 kernel: DMAR: dmar0: reg_base_addr fed91000 ver 1:0 cap d2008c40660462 ecap f050da
Mar 25 02:36:39.500064 kernel: DMAR: RMRR base: 0x0000008cf18000 end: 0x0000008d161fff
Mar 25 02:36:39.500070 kernel: DMAR-IR: IOAPIC id 2 under DRHD base 0xfed91000 IOMMU 0
Mar 25 02:36:39.500075 kernel: DMAR-IR: HPET id 0 under DRHD base 0xfed91000
Mar 25 02:36:39.500082 kernel: DMAR-IR: Queued invalidation will be enabled to support x2apic and Intr-remapping.
Mar 25 02:36:39.500087 kernel: DMAR-IR: Enabled IRQ remapping in x2apic mode
Mar 25 02:36:39.500093 kernel: x2apic enabled
Mar 25 02:36:39.500098 kernel: APIC: Switched APIC routing to: cluster x2apic
Mar 25 02:36:39.500104 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x3101f59f5e6, max_idle_ns: 440795259996 ns
Mar 25 02:36:39.500109 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 6799.81 BogoMIPS (lpj=3399906)
Mar 25 02:36:39.500115 kernel: CPU0: Thermal monitoring enabled (TM1)
Mar 25 02:36:39.500122 kernel: process: using mwait in idle threads
Mar 25 02:36:39.500128 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Mar 25 02:36:39.500134 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4
Mar 25 02:36:39.500159 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 25 02:36:39.500182 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on vm exit
Mar 25 02:36:39.500187 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall
Mar 25 02:36:39.500209 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Mar 25 02:36:39.500214 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Mar 25 02:36:39.500220 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Mar 25 02:36:39.500225 kernel: RETBleed: Mitigation: Enhanced IBRS
Mar 25 02:36:39.500231 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Mar 25 02:36:39.500236 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Mar 25 02:36:39.500241 kernel: TAA: Mitigation: TSX disabled
Mar 25 02:36:39.500248 kernel: MMIO Stale Data: Mitigation: Clear CPU buffers
Mar 25 02:36:39.500253 kernel: SRBDS: Mitigation: Microcode
Mar 25 02:36:39.500259 kernel: GDS: Mitigation: Microcode
Mar 25 02:36:39.500264 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Mar 25 02:36:39.500269 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Mar 25 02:36:39.500275 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Mar 25 02:36:39.500280 kernel: x86/fpu: Supporting XSAVE feature 0x008: 'MPX bounds registers'
Mar 25 02:36:39.500286 kernel: x86/fpu: Supporting XSAVE feature 0x010: 'MPX CSR'
Mar 25 02:36:39.500291 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Mar 25 02:36:39.500296 kernel: x86/fpu: xstate_offset[3]:  832, xstate_sizes[3]:   64
Mar 25 02:36:39.500302 kernel: x86/fpu: xstate_offset[4]:  896, xstate_sizes[4]:   64
Mar 25 02:36:39.500308 kernel: x86/fpu: Enabled xstate features 0x1f, context size is 960 bytes, using 'compacted' format.
Mar 25 02:36:39.500314 kernel: Freeing SMP alternatives memory: 32K
Mar 25 02:36:39.500319 kernel: pid_max: default: 32768 minimum: 301
Mar 25 02:36:39.500324 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 25 02:36:39.500330 kernel: landlock: Up and running.
Mar 25 02:36:39.500335 kernel: SELinux:  Initializing.
Mar 25 02:36:39.500340 kernel: Mount-cache hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 25 02:36:39.500346 kernel: Mountpoint-cache hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 25 02:36:39.500351 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Mar 25 02:36:39.500357 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Mar 25 02:36:39.500362 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Mar 25 02:36:39.500369 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Mar 25 02:36:39.500375 kernel: Performance Events: PEBS fmt3+, Skylake events, 32-deep LBR, full-width counters, Intel PMU driver.
Mar 25 02:36:39.500380 kernel: ... version:                4
Mar 25 02:36:39.500386 kernel: ... bit width:              48
Mar 25 02:36:39.500391 kernel: ... generic registers:      4
Mar 25 02:36:39.500397 kernel: ... value mask:             0000ffffffffffff
Mar 25 02:36:39.500402 kernel: ... max period:             00007fffffffffff
Mar 25 02:36:39.500408 kernel: ... fixed-purpose events:   3
Mar 25 02:36:39.500413 kernel: ... event mask:             000000070000000f
Mar 25 02:36:39.500419 kernel: signal: max sigframe size: 2032
Mar 25 02:36:39.500425 kernel: Estimated ratio of average max frequency by base frequency (times 1024): 1445
Mar 25 02:36:39.500431 kernel: rcu: Hierarchical SRCU implementation.
Mar 25 02:36:39.500436 kernel: rcu: 	Max phase no-delay instances is 400.
Mar 25 02:36:39.500442 kernel: NMI watchdog: Enabled. Permanently consumes one hw-PMU counter.
Mar 25 02:36:39.500447 kernel: smp: Bringing up secondary CPUs ...
Mar 25 02:36:39.500452 kernel: smpboot: x86: Booting SMP configuration:
Mar 25 02:36:39.500458 kernel: .... node  #0, CPUs:        #1  #2  #3  #4  #5  #6  #7  #8  #9 #10 #11 #12 #13 #14 #15
Mar 25 02:36:39.500464 kernel: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details.
Mar 25 02:36:39.500471 kernel: smp: Brought up 1 node, 16 CPUs
Mar 25 02:36:39.500476 kernel: smpboot: Max logical packages: 1
Mar 25 02:36:39.500482 kernel: smpboot: Total of 16 processors activated (108796.99 BogoMIPS)
Mar 25 02:36:39.500487 kernel: devtmpfs: initialized
Mar 25 02:36:39.500492 kernel: x86/mm: Memory block size: 128MB
Mar 25 02:36:39.500498 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x81b2b000-0x81b2bfff] (4096 bytes)
Mar 25 02:36:39.500503 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x8c23b000-0x8c66cfff] (4399104 bytes)
Mar 25 02:36:39.500509 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 25 02:36:39.500515 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear)
Mar 25 02:36:39.500521 kernel: pinctrl core: initialized pinctrl subsystem
Mar 25 02:36:39.500527 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 25 02:36:39.500532 kernel: audit: initializing netlink subsys (disabled)
Mar 25 02:36:39.500538 kernel: audit: type=2000 audit(1742870193.041:1): state=initialized audit_enabled=0 res=1
Mar 25 02:36:39.500543 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 25 02:36:39.500548 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 25 02:36:39.500554 kernel: cpuidle: using governor menu
Mar 25 02:36:39.500559 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 25 02:36:39.500566 kernel: dca service started, version 1.12.1
Mar 25 02:36:39.500572 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xe0000000-0xefffffff] (base 0xe0000000)
Mar 25 02:36:39.500577 kernel: PCI: Using configuration type 1 for base access
Mar 25 02:36:39.500582 kernel: ENERGY_PERF_BIAS: Set to 'normal', was 'performance'
Mar 25 02:36:39.500588 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 25 02:36:39.500593 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 25 02:36:39.500599 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Mar 25 02:36:39.500604 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 25 02:36:39.500610 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 25 02:36:39.500616 kernel: ACPI: Added _OSI(Module Device)
Mar 25 02:36:39.500622 kernel: ACPI: Added _OSI(Processor Device)
Mar 25 02:36:39.500629 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Mar 25 02:36:39.500635 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 25 02:36:39.500641 kernel: ACPI: 12 ACPI AML tables successfully acquired and loaded
Mar 25 02:36:39.500661 kernel: ACPI: Dynamic OEM Table Load:
Mar 25 02:36:39.500667 kernel: ACPI: SSDT 0xFFFF8882C1627C00 000400 (v02 PmRef Cpu0Cst 00003001 INTL 20160527)
Mar 25 02:36:39.500672 kernel: ACPI: Dynamic OEM Table Load:
Mar 25 02:36:39.500692 kernel: ACPI: SSDT 0xFFFF8882C161C000 000683 (v02 PmRef Cpu0Ist 00003000 INTL 20160527)
Mar 25 02:36:39.500698 kernel: ACPI: Dynamic OEM Table Load:
Mar 25 02:36:39.500704 kernel: ACPI: SSDT 0xFFFF8882C1605E00 0000F4 (v02 PmRef Cpu0Psd 00003000 INTL 20160527)
Mar 25 02:36:39.500710 kernel: ACPI: Dynamic OEM Table Load:
Mar 25 02:36:39.500715 kernel: ACPI: SSDT 0xFFFF8882C161B800 0005FC (v02 PmRef ApIst 00003000 INTL 20160527)
Mar 25 02:36:39.500720 kernel: ACPI: Dynamic OEM Table Load:
Mar 25 02:36:39.500726 kernel: ACPI: SSDT 0xFFFF8882C162A000 000AB0 (v02 PmRef ApPsd 00003000 INTL 20160527)
Mar 25 02:36:39.500731 kernel: ACPI: Dynamic OEM Table Load:
Mar 25 02:36:39.500737 kernel: ACPI: SSDT 0xFFFF8882C0F0B800 00030A (v02 PmRef ApCst 00003000 INTL 20160527)
Mar 25 02:36:39.500742 kernel: ACPI: _OSC evaluated successfully for all CPUs
Mar 25 02:36:39.500747 kernel: ACPI: Interpreter enabled
Mar 25 02:36:39.500754 kernel: ACPI: PM: (supports S0 S5)
Mar 25 02:36:39.500759 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 25 02:36:39.500765 kernel: HEST: Enabling Firmware First mode for corrected errors.
Mar 25 02:36:39.500770 kernel: mce: [Firmware Bug]: Ignoring request to disable invalid MCA bank 14.
Mar 25 02:36:39.500776 kernel: HEST: Table parsing has been initialized.
Mar 25 02:36:39.500781 kernel: GHES: APEI firmware first mode is enabled by APEI bit and WHEA _OSC.
Mar 25 02:36:39.500787 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 25 02:36:39.500792 kernel: PCI: Using E820 reservations for host bridge windows
Mar 25 02:36:39.500798 kernel: ACPI: Enabled 9 GPEs in block 00 to 7F
Mar 25 02:36:39.500804 kernel: ACPI: \_SB_.PCI0.XDCI.USBC: New power resource
Mar 25 02:36:39.500810 kernel: ACPI: \_SB_.PCI0.SAT0.VOL0.V0PR: New power resource
Mar 25 02:36:39.500815 kernel: ACPI: \_SB_.PCI0.SAT0.VOL1.V1PR: New power resource
Mar 25 02:36:39.500821 kernel: ACPI: \_SB_.PCI0.SAT0.VOL2.V2PR: New power resource
Mar 25 02:36:39.500826 kernel: ACPI: \_SB_.PCI0.CNVW.WRST: New power resource
Mar 25 02:36:39.500832 kernel: ACPI: \_TZ_.FN00: New power resource
Mar 25 02:36:39.500837 kernel: ACPI: \_TZ_.FN01: New power resource
Mar 25 02:36:39.500843 kernel: ACPI: \_TZ_.FN02: New power resource
Mar 25 02:36:39.500848 kernel: ACPI: \_TZ_.FN03: New power resource
Mar 25 02:36:39.500855 kernel: ACPI: \_TZ_.FN04: New power resource
Mar 25 02:36:39.500860 kernel: ACPI: \PIN_: New power resource
Mar 25 02:36:39.500866 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-fe])
Mar 25 02:36:39.500945 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 25 02:36:39.500998 kernel: acpi PNP0A08:00: _OSC: platform does not support [AER]
Mar 25 02:36:39.501047 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability LTR]
Mar 25 02:36:39.501055 kernel: PCI host bridge to bus 0000:00
Mar 25 02:36:39.501110 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Mar 25 02:36:39.501155 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Mar 25 02:36:39.501199 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Mar 25 02:36:39.501244 kernel: pci_bus 0000:00: root bus resource [mem 0x90000000-0xdfffffff window]
Mar 25 02:36:39.501288 kernel: pci_bus 0000:00: root bus resource [mem 0xfc800000-0xfe7fffff window]
Mar 25 02:36:39.501332 kernel: pci_bus 0000:00: root bus resource [bus 00-fe]
Mar 25 02:36:39.501393 kernel: pci 0000:00:00.0: [8086:3e31] type 00 class 0x060000
Mar 25 02:36:39.501457 kernel: pci 0000:00:01.0: [8086:1901] type 01 class 0x060400
Mar 25 02:36:39.501511 kernel: pci 0000:00:01.0: PME# supported from D0 D3hot D3cold
Mar 25 02:36:39.501565 kernel: pci 0000:00:08.0: [8086:1911] type 00 class 0x088000
Mar 25 02:36:39.501618 kernel: pci 0000:00:08.0: reg 0x10: [mem 0x9551f000-0x9551ffff 64bit]
Mar 25 02:36:39.501711 kernel: pci 0000:00:12.0: [8086:a379] type 00 class 0x118000
Mar 25 02:36:39.501763 kernel: pci 0000:00:12.0: reg 0x10: [mem 0x9551e000-0x9551efff 64bit]
Mar 25 02:36:39.501820 kernel: pci 0000:00:14.0: [8086:a36d] type 00 class 0x0c0330
Mar 25 02:36:39.501870 kernel: pci 0000:00:14.0: reg 0x10: [mem 0x95500000-0x9550ffff 64bit]
Mar 25 02:36:39.501920 kernel: pci 0000:00:14.0: PME# supported from D3hot D3cold
Mar 25 02:36:39.501972 kernel: pci 0000:00:14.2: [8086:a36f] type 00 class 0x050000
Mar 25 02:36:39.502023 kernel: pci 0000:00:14.2: reg 0x10: [mem 0x95512000-0x95513fff 64bit]
Mar 25 02:36:39.502072 kernel: pci 0000:00:14.2: reg 0x18: [mem 0x9551d000-0x9551dfff 64bit]
Mar 25 02:36:39.502129 kernel: pci 0000:00:15.0: [8086:a368] type 00 class 0x0c8000
Mar 25 02:36:39.502179 kernel: pci 0000:00:15.0: reg 0x10: [mem 0x00000000-0x00000fff 64bit]
Mar 25 02:36:39.502235 kernel: pci 0000:00:15.1: [8086:a369] type 00 class 0x0c8000
Mar 25 02:36:39.502286 kernel: pci 0000:00:15.1: reg 0x10: [mem 0x00000000-0x00000fff 64bit]
Mar 25 02:36:39.502339 kernel: pci 0000:00:16.0: [8086:a360] type 00 class 0x078000
Mar 25 02:36:39.502390 kernel: pci 0000:00:16.0: reg 0x10: [mem 0x9551a000-0x9551afff 64bit]
Mar 25 02:36:39.502440 kernel: pci 0000:00:16.0: PME# supported from D3hot
Mar 25 02:36:39.502495 kernel: pci 0000:00:16.1: [8086:a361] type 00 class 0x078000
Mar 25 02:36:39.502552 kernel: pci 0000:00:16.1: reg 0x10: [mem 0x95519000-0x95519fff 64bit]
Mar 25 02:36:39.502605 kernel: pci 0000:00:16.1: PME# supported from D3hot
Mar 25 02:36:39.502663 kernel: pci 0000:00:16.4: [8086:a364] type 00 class 0x078000
Mar 25 02:36:39.502715 kernel: pci 0000:00:16.4: reg 0x10: [mem 0x95518000-0x95518fff 64bit]
Mar 25 02:36:39.502766 kernel: pci 0000:00:16.4: PME# supported from D3hot
Mar 25 02:36:39.502821 kernel: pci 0000:00:17.0: [8086:a352] type 00 class 0x010601
Mar 25 02:36:39.502872 kernel: pci 0000:00:17.0: reg 0x10: [mem 0x95510000-0x95511fff]
Mar 25 02:36:39.502920 kernel: pci 0000:00:17.0: reg 0x14: [mem 0x95517000-0x955170ff]
Mar 25 02:36:39.502971 kernel: pci 0000:00:17.0: reg 0x18: [io  0x6050-0x6057]
Mar 25 02:36:39.503020 kernel: pci 0000:00:17.0: reg 0x1c: [io  0x6040-0x6043]
Mar 25 02:36:39.503072 kernel: pci 0000:00:17.0: reg 0x20: [io  0x6020-0x603f]
Mar 25 02:36:39.503123 kernel: pci 0000:00:17.0: reg 0x24: [mem 0x95516000-0x955167ff]
Mar 25 02:36:39.503174 kernel: pci 0000:00:17.0: PME# supported from D3hot
Mar 25 02:36:39.503230 kernel: pci 0000:00:1b.0: [8086:a340] type 01 class 0x060400
Mar 25 02:36:39.503281 kernel: pci 0000:00:1b.0: PME# supported from D0 D3hot D3cold
Mar 25 02:36:39.503339 kernel: pci 0000:00:1b.4: [8086:a32c] type 01 class 0x060400
Mar 25 02:36:39.503391 kernel: pci 0000:00:1b.4: PME# supported from D0 D3hot D3cold
Mar 25 02:36:39.503446 kernel: pci 0000:00:1b.5: [8086:a32d] type 01 class 0x060400
Mar 25 02:36:39.503498 kernel: pci 0000:00:1b.5: PME# supported from D0 D3hot D3cold
Mar 25 02:36:39.503552 kernel: pci 0000:00:1c.0: [8086:a338] type 01 class 0x060400
Mar 25 02:36:39.503603 kernel: pci 0000:00:1c.0: PME# supported from D0 D3hot D3cold
Mar 25 02:36:39.503670 kernel: pci 0000:00:1c.3: [8086:a33b] type 01 class 0x060400
Mar 25 02:36:39.503724 kernel: pci 0000:00:1c.3: PME# supported from D0 D3hot D3cold
Mar 25 02:36:39.503778 kernel: pci 0000:00:1e.0: [8086:a328] type 00 class 0x078000
Mar 25 02:36:39.503828 kernel: pci 0000:00:1e.0: reg 0x10: [mem 0x00000000-0x00000fff 64bit]
Mar 25 02:36:39.503882 kernel: pci 0000:00:1f.0: [8086:a309] type 00 class 0x060100
Mar 25 02:36:39.503939 kernel: pci 0000:00:1f.4: [8086:a323] type 00 class 0x0c0500
Mar 25 02:36:39.503990 kernel: pci 0000:00:1f.4: reg 0x10: [mem 0x95514000-0x955140ff 64bit]
Mar 25 02:36:39.504042 kernel: pci 0000:00:1f.4: reg 0x20: [io  0xefa0-0xefbf]
Mar 25 02:36:39.504096 kernel: pci 0000:00:1f.5: [8086:a324] type 00 class 0x0c8000
Mar 25 02:36:39.504145 kernel: pci 0000:00:1f.5: reg 0x10: [mem 0xfe010000-0xfe010fff]
Mar 25 02:36:39.504202 kernel: pci 0000:01:00.0: [15b3:1015] type 00 class 0x020000
Mar 25 02:36:39.504254 kernel: pci 0000:01:00.0: reg 0x10: [mem 0x92000000-0x93ffffff 64bit pref]
Mar 25 02:36:39.504306 kernel: pci 0000:01:00.0: reg 0x30: [mem 0x95200000-0x952fffff pref]
Mar 25 02:36:39.504359 kernel: pci 0000:01:00.0: PME# supported from D3cold
Mar 25 02:36:39.504410 kernel: pci 0000:01:00.0: reg 0x1a4: [mem 0x00000000-0x000fffff 64bit pref]
Mar 25 02:36:39.504462 kernel: pci 0000:01:00.0: VF(n) BAR0 space: [mem 0x00000000-0x007fffff 64bit pref] (contains BAR0 for 8 VFs)
Mar 25 02:36:39.504519 kernel: pci 0000:01:00.1: [15b3:1015] type 00 class 0x020000
Mar 25 02:36:39.504571 kernel: pci 0000:01:00.1: reg 0x10: [mem 0x90000000-0x91ffffff 64bit pref]
Mar 25 02:36:39.504624 kernel: pci 0000:01:00.1: reg 0x30: [mem 0x95100000-0x951fffff pref]
Mar 25 02:36:39.504686 kernel: pci 0000:01:00.1: PME# supported from D3cold
Mar 25 02:36:39.504740 kernel: pci 0000:01:00.1: reg 0x1a4: [mem 0x00000000-0x000fffff 64bit pref]
Mar 25 02:36:39.504795 kernel: pci 0000:01:00.1: VF(n) BAR0 space: [mem 0x00000000-0x007fffff 64bit pref] (contains BAR0 for 8 VFs)
Mar 25 02:36:39.504846 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Mar 25 02:36:39.504898 kernel: pci 0000:00:01.0:   bridge window [mem 0x95100000-0x952fffff]
Mar 25 02:36:39.504949 kernel: pci 0000:00:01.0:   bridge window [mem 0x90000000-0x93ffffff 64bit pref]
Mar 25 02:36:39.504999 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02]
Mar 25 02:36:39.505054 kernel: pci 0000:03:00.0: working around ROM BAR overlap defect
Mar 25 02:36:39.505106 kernel: pci 0000:03:00.0: [8086:1533] type 00 class 0x020000
Mar 25 02:36:39.505161 kernel: pci 0000:03:00.0: reg 0x10: [mem 0x95400000-0x9547ffff]
Mar 25 02:36:39.505212 kernel: pci 0000:03:00.0: reg 0x18: [io  0x5000-0x501f]
Mar 25 02:36:39.505263 kernel: pci 0000:03:00.0: reg 0x1c: [mem 0x95480000-0x95483fff]
Mar 25 02:36:39.505314 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold
Mar 25 02:36:39.505367 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03]
Mar 25 02:36:39.505417 kernel: pci 0000:00:1b.4:   bridge window [io  0x5000-0x5fff]
Mar 25 02:36:39.505466 kernel: pci 0000:00:1b.4:   bridge window [mem 0x95400000-0x954fffff]
Mar 25 02:36:39.505524 kernel: pci 0000:04:00.0: working around ROM BAR overlap defect
Mar 25 02:36:39.505576 kernel: pci 0000:04:00.0: [8086:1533] type 00 class 0x020000
Mar 25 02:36:39.505635 kernel: pci 0000:04:00.0: reg 0x10: [mem 0x95300000-0x9537ffff]
Mar 25 02:36:39.505688 kernel: pci 0000:04:00.0: reg 0x18: [io  0x4000-0x401f]
Mar 25 02:36:39.505740 kernel: pci 0000:04:00.0: reg 0x1c: [mem 0x95380000-0x95383fff]
Mar 25 02:36:39.505791 kernel: pci 0000:04:00.0: PME# supported from D0 D3hot D3cold
Mar 25 02:36:39.505842 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04]
Mar 25 02:36:39.505892 kernel: pci 0000:00:1b.5:   bridge window [io  0x4000-0x4fff]
Mar 25 02:36:39.505944 kernel: pci 0000:00:1b.5:   bridge window [mem 0x95300000-0x953fffff]
Mar 25 02:36:39.505996 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05]
Mar 25 02:36:39.506051 kernel: pci 0000:06:00.0: [1a03:1150] type 01 class 0x060400
Mar 25 02:36:39.506105 kernel: pci 0000:06:00.0: enabling Extended Tags
Mar 25 02:36:39.506156 kernel: pci 0000:06:00.0: supports D1 D2
Mar 25 02:36:39.506207 kernel: pci 0000:06:00.0: PME# supported from D0 D1 D2 D3hot D3cold
Mar 25 02:36:39.506258 kernel: pci 0000:00:1c.3: PCI bridge
to [bus 06-07] Mar 25 02:36:39.506311 kernel: pci 0000:00:1c.3: bridge window [io 0x3000-0x3fff] Mar 25 02:36:39.506362 kernel: pci 0000:00:1c.3: bridge window [mem 0x94000000-0x950fffff] Mar 25 02:36:39.506419 kernel: pci_bus 0000:07: extended config space not accessible Mar 25 02:36:39.506479 kernel: pci 0000:07:00.0: [1a03:2000] type 00 class 0x030000 Mar 25 02:36:39.506535 kernel: pci 0000:07:00.0: reg 0x10: [mem 0x94000000-0x94ffffff] Mar 25 02:36:39.506588 kernel: pci 0000:07:00.0: reg 0x14: [mem 0x95000000-0x9501ffff] Mar 25 02:36:39.506655 kernel: pci 0000:07:00.0: reg 0x18: [io 0x3000-0x307f] Mar 25 02:36:39.506711 kernel: pci 0000:07:00.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Mar 25 02:36:39.506764 kernel: pci 0000:07:00.0: supports D1 D2 Mar 25 02:36:39.506817 kernel: pci 0000:07:00.0: PME# supported from D0 D1 D2 D3hot D3cold Mar 25 02:36:39.506869 kernel: pci 0000:06:00.0: PCI bridge to [bus 07] Mar 25 02:36:39.506921 kernel: pci 0000:06:00.0: bridge window [io 0x3000-0x3fff] Mar 25 02:36:39.506973 kernel: pci 0000:06:00.0: bridge window [mem 0x94000000-0x950fffff] Mar 25 02:36:39.506981 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 0 Mar 25 02:36:39.506988 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 1 Mar 25 02:36:39.506995 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 0 Mar 25 02:36:39.507001 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 0 Mar 25 02:36:39.507007 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 0 Mar 25 02:36:39.507013 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 0 Mar 25 02:36:39.507019 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 0 Mar 25 02:36:39.507025 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 0 Mar 25 02:36:39.507030 kernel: iommu: Default domain type: Translated Mar 25 02:36:39.507036 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Mar 25 02:36:39.507042 kernel: PCI: Using ACPI for IRQ 
routing Mar 25 02:36:39.507049 kernel: PCI: pci_cache_line_size set to 64 bytes Mar 25 02:36:39.507055 kernel: e820: reserve RAM buffer [mem 0x00099800-0x0009ffff] Mar 25 02:36:39.507061 kernel: e820: reserve RAM buffer [mem 0x81b2b000-0x83ffffff] Mar 25 02:36:39.507066 kernel: e820: reserve RAM buffer [mem 0x8afcd000-0x8bffffff] Mar 25 02:36:39.507072 kernel: e820: reserve RAM buffer [mem 0x8c23b000-0x8fffffff] Mar 25 02:36:39.507078 kernel: e820: reserve RAM buffer [mem 0x8ef00000-0x8fffffff] Mar 25 02:36:39.507083 kernel: e820: reserve RAM buffer [mem 0x86f000000-0x86fffffff] Mar 25 02:36:39.507135 kernel: pci 0000:07:00.0: vgaarb: setting as boot VGA device Mar 25 02:36:39.507189 kernel: pci 0000:07:00.0: vgaarb: bridge control possible Mar 25 02:36:39.507245 kernel: pci 0000:07:00.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Mar 25 02:36:39.507254 kernel: vgaarb: loaded Mar 25 02:36:39.507260 kernel: clocksource: Switched to clocksource tsc-early Mar 25 02:36:39.507266 kernel: VFS: Disk quotas dquot_6.6.0 Mar 25 02:36:39.507272 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Mar 25 02:36:39.507278 kernel: pnp: PnP ACPI init Mar 25 02:36:39.507327 kernel: system 00:00: [mem 0x40000000-0x403fffff] has been reserved Mar 25 02:36:39.507380 kernel: pnp 00:02: [dma 0 disabled] Mar 25 02:36:39.507431 kernel: pnp 00:03: [dma 0 disabled] Mar 25 02:36:39.507483 kernel: system 00:04: [io 0x0680-0x069f] has been reserved Mar 25 02:36:39.507528 kernel: system 00:04: [io 0x164e-0x164f] has been reserved Mar 25 02:36:39.507578 kernel: system 00:05: [io 0x1854-0x1857] has been reserved Mar 25 02:36:39.507635 kernel: system 00:06: [mem 0xfed10000-0xfed17fff] has been reserved Mar 25 02:36:39.507684 kernel: system 00:06: [mem 0xfed18000-0xfed18fff] has been reserved Mar 25 02:36:39.507734 kernel: system 00:06: [mem 0xfed19000-0xfed19fff] has been reserved Mar 25 02:36:39.507781 kernel: system 00:06: [mem 0xe0000000-0xefffffff] has 
been reserved Mar 25 02:36:39.507826 kernel: system 00:06: [mem 0xfed20000-0xfed3ffff] has been reserved Mar 25 02:36:39.507873 kernel: system 00:06: [mem 0xfed90000-0xfed93fff] could not be reserved Mar 25 02:36:39.507918 kernel: system 00:06: [mem 0xfed45000-0xfed8ffff] has been reserved Mar 25 02:36:39.507965 kernel: system 00:06: [mem 0xfee00000-0xfeefffff] could not be reserved Mar 25 02:36:39.508014 kernel: system 00:07: [io 0x1800-0x18fe] could not be reserved Mar 25 02:36:39.508063 kernel: system 00:07: [mem 0xfd000000-0xfd69ffff] has been reserved Mar 25 02:36:39.508109 kernel: system 00:07: [mem 0xfd6c0000-0xfd6cffff] has been reserved Mar 25 02:36:39.508155 kernel: system 00:07: [mem 0xfd6f0000-0xfdffffff] has been reserved Mar 25 02:36:39.508201 kernel: system 00:07: [mem 0xfe000000-0xfe01ffff] could not be reserved Mar 25 02:36:39.508246 kernel: system 00:07: [mem 0xfe200000-0xfe7fffff] has been reserved Mar 25 02:36:39.508292 kernel: system 00:07: [mem 0xff000000-0xffffffff] has been reserved Mar 25 02:36:39.508341 kernel: system 00:08: [io 0x2000-0x20fe] has been reserved Mar 25 02:36:39.508351 kernel: pnp: PnP ACPI: found 10 devices Mar 25 02:36:39.508361 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Mar 25 02:36:39.508367 kernel: NET: Registered PF_INET protocol family Mar 25 02:36:39.508373 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Mar 25 02:36:39.508418 kernel: tcp_listen_portaddr_hash hash table entries: 16384 (order: 6, 262144 bytes, linear) Mar 25 02:36:39.508443 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Mar 25 02:36:39.508449 kernel: TCP established hash table entries: 262144 (order: 9, 2097152 bytes, linear) Mar 25 02:36:39.508479 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Mar 25 02:36:39.508486 kernel: TCP: Hash tables configured (established 262144 bind 65536) Mar 25 02:36:39.508492 
kernel: UDP hash table entries: 16384 (order: 7, 524288 bytes, linear) Mar 25 02:36:39.508498 kernel: UDP-Lite hash table entries: 16384 (order: 7, 524288 bytes, linear) Mar 25 02:36:39.508504 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Mar 25 02:36:39.508511 kernel: NET: Registered PF_XDP protocol family Mar 25 02:36:39.508563 kernel: pci 0000:00:15.0: BAR 0: assigned [mem 0x95515000-0x95515fff 64bit] Mar 25 02:36:39.508614 kernel: pci 0000:00:15.1: BAR 0: assigned [mem 0x9551b000-0x9551bfff 64bit] Mar 25 02:36:39.508669 kernel: pci 0000:00:1e.0: BAR 0: assigned [mem 0x9551c000-0x9551cfff 64bit] Mar 25 02:36:39.508722 kernel: pci 0000:01:00.0: BAR 7: no space for [mem size 0x00800000 64bit pref] Mar 25 02:36:39.508780 kernel: pci 0000:01:00.0: BAR 7: failed to assign [mem size 0x00800000 64bit pref] Mar 25 02:36:39.508933 kernel: pci 0000:01:00.1: BAR 7: no space for [mem size 0x00800000 64bit pref] Mar 25 02:36:39.509005 kernel: pci 0000:01:00.1: BAR 7: failed to assign [mem size 0x00800000 64bit pref] Mar 25 02:36:39.509057 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Mar 25 02:36:39.509108 kernel: pci 0000:00:01.0: bridge window [mem 0x95100000-0x952fffff] Mar 25 02:36:39.509158 kernel: pci 0000:00:01.0: bridge window [mem 0x90000000-0x93ffffff 64bit pref] Mar 25 02:36:39.509210 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02] Mar 25 02:36:39.509262 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03] Mar 25 02:36:39.509313 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff] Mar 25 02:36:39.509362 kernel: pci 0000:00:1b.4: bridge window [mem 0x95400000-0x954fffff] Mar 25 02:36:39.509414 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04] Mar 25 02:36:39.509464 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff] Mar 25 02:36:39.509516 kernel: pci 0000:00:1b.5: bridge window [mem 0x95300000-0x953fffff] Mar 25 02:36:39.509567 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05] Mar 25 02:36:39.509618 kernel: pci 0000:06:00.0: PCI bridge to [bus 
07] Mar 25 02:36:39.509676 kernel: pci 0000:06:00.0: bridge window [io 0x3000-0x3fff] Mar 25 02:36:39.509727 kernel: pci 0000:06:00.0: bridge window [mem 0x94000000-0x950fffff] Mar 25 02:36:39.509778 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07] Mar 25 02:36:39.509827 kernel: pci 0000:00:1c.3: bridge window [io 0x3000-0x3fff] Mar 25 02:36:39.509877 kernel: pci 0000:00:1c.3: bridge window [mem 0x94000000-0x950fffff] Mar 25 02:36:39.509924 kernel: pci_bus 0000:00: Some PCI device resources are unassigned, try booting with pci=realloc Mar 25 02:36:39.509971 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Mar 25 02:36:39.510017 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Mar 25 02:36:39.510061 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Mar 25 02:36:39.510105 kernel: pci_bus 0000:00: resource 7 [mem 0x90000000-0xdfffffff window] Mar 25 02:36:39.510149 kernel: pci_bus 0000:00: resource 8 [mem 0xfc800000-0xfe7fffff window] Mar 25 02:36:39.510200 kernel: pci_bus 0000:01: resource 1 [mem 0x95100000-0x952fffff] Mar 25 02:36:39.510247 kernel: pci_bus 0000:01: resource 2 [mem 0x90000000-0x93ffffff 64bit pref] Mar 25 02:36:39.510302 kernel: pci_bus 0000:03: resource 0 [io 0x5000-0x5fff] Mar 25 02:36:39.510349 kernel: pci_bus 0000:03: resource 1 [mem 0x95400000-0x954fffff] Mar 25 02:36:39.510399 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff] Mar 25 02:36:39.510446 kernel: pci_bus 0000:04: resource 1 [mem 0x95300000-0x953fffff] Mar 25 02:36:39.510496 kernel: pci_bus 0000:06: resource 0 [io 0x3000-0x3fff] Mar 25 02:36:39.510543 kernel: pci_bus 0000:06: resource 1 [mem 0x94000000-0x950fffff] Mar 25 02:36:39.510593 kernel: pci_bus 0000:07: resource 0 [io 0x3000-0x3fff] Mar 25 02:36:39.510646 kernel: pci_bus 0000:07: resource 1 [mem 0x94000000-0x950fffff] Mar 25 02:36:39.510655 kernel: PCI: CLS 64 bytes, default 64 Mar 25 02:36:39.510661 kernel: DMAR: No ATSR found Mar 25 02:36:39.510667 kernel: DMAR: No SATC 
found Mar 25 02:36:39.510673 kernel: DMAR: dmar0: Using Queued invalidation Mar 25 02:36:39.510723 kernel: pci 0000:00:00.0: Adding to iommu group 0 Mar 25 02:36:39.510775 kernel: pci 0000:00:01.0: Adding to iommu group 1 Mar 25 02:36:39.510827 kernel: pci 0000:00:08.0: Adding to iommu group 2 Mar 25 02:36:39.510880 kernel: pci 0000:00:12.0: Adding to iommu group 3 Mar 25 02:36:39.510931 kernel: pci 0000:00:14.0: Adding to iommu group 4 Mar 25 02:36:39.510980 kernel: pci 0000:00:14.2: Adding to iommu group 4 Mar 25 02:36:39.511031 kernel: pci 0000:00:15.0: Adding to iommu group 5 Mar 25 02:36:39.511080 kernel: pci 0000:00:15.1: Adding to iommu group 5 Mar 25 02:36:39.511130 kernel: pci 0000:00:16.0: Adding to iommu group 6 Mar 25 02:36:39.511179 kernel: pci 0000:00:16.1: Adding to iommu group 6 Mar 25 02:36:39.511230 kernel: pci 0000:00:16.4: Adding to iommu group 6 Mar 25 02:36:39.511282 kernel: pci 0000:00:17.0: Adding to iommu group 7 Mar 25 02:36:39.511332 kernel: pci 0000:00:1b.0: Adding to iommu group 8 Mar 25 02:36:39.511383 kernel: pci 0000:00:1b.4: Adding to iommu group 9 Mar 25 02:36:39.511432 kernel: pci 0000:00:1b.5: Adding to iommu group 10 Mar 25 02:36:39.511484 kernel: pci 0000:00:1c.0: Adding to iommu group 11 Mar 25 02:36:39.511533 kernel: pci 0000:00:1c.3: Adding to iommu group 12 Mar 25 02:36:39.511584 kernel: pci 0000:00:1e.0: Adding to iommu group 13 Mar 25 02:36:39.511638 kernel: pci 0000:00:1f.0: Adding to iommu group 14 Mar 25 02:36:39.511691 kernel: pci 0000:00:1f.4: Adding to iommu group 14 Mar 25 02:36:39.511741 kernel: pci 0000:00:1f.5: Adding to iommu group 14 Mar 25 02:36:39.511793 kernel: pci 0000:01:00.0: Adding to iommu group 1 Mar 25 02:36:39.511845 kernel: pci 0000:01:00.1: Adding to iommu group 1 Mar 25 02:36:39.511896 kernel: pci 0000:03:00.0: Adding to iommu group 15 Mar 25 02:36:39.511948 kernel: pci 0000:04:00.0: Adding to iommu group 16 Mar 25 02:36:39.512000 kernel: pci 0000:06:00.0: Adding to iommu group 17 Mar 25 
02:36:39.512053 kernel: pci 0000:07:00.0: Adding to iommu group 17 Mar 25 02:36:39.512063 kernel: DMAR: Intel(R) Virtualization Technology for Directed I/O Mar 25 02:36:39.512069 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Mar 25 02:36:39.512075 kernel: software IO TLB: mapped [mem 0x0000000086fcd000-0x000000008afcd000] (64MB) Mar 25 02:36:39.512081 kernel: RAPL PMU: API unit is 2^-32 Joules, 3 fixed counters, 655360 ms ovfl timer Mar 25 02:36:39.512087 kernel: RAPL PMU: hw unit of domain pp0-core 2^-14 Joules Mar 25 02:36:39.512093 kernel: RAPL PMU: hw unit of domain package 2^-14 Joules Mar 25 02:36:39.512099 kernel: RAPL PMU: hw unit of domain dram 2^-14 Joules Mar 25 02:36:39.512152 kernel: platform rtc_cmos: registered platform RTC device (no PNP device found) Mar 25 02:36:39.512162 kernel: Initialise system trusted keyrings Mar 25 02:36:39.512168 kernel: workingset: timestamp_bits=39 max_order=23 bucket_order=0 Mar 25 02:36:39.512174 kernel: Key type asymmetric registered Mar 25 02:36:39.512180 kernel: Asymmetric key parser 'x509' registered Mar 25 02:36:39.512186 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Mar 25 02:36:39.512192 kernel: io scheduler mq-deadline registered Mar 25 02:36:39.512198 kernel: io scheduler kyber registered Mar 25 02:36:39.512203 kernel: io scheduler bfq registered Mar 25 02:36:39.512253 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 121 Mar 25 02:36:39.512307 kernel: pcieport 0000:00:1b.0: PME: Signaling with IRQ 122 Mar 25 02:36:39.512357 kernel: pcieport 0000:00:1b.4: PME: Signaling with IRQ 123 Mar 25 02:36:39.512408 kernel: pcieport 0000:00:1b.5: PME: Signaling with IRQ 124 Mar 25 02:36:39.512459 kernel: pcieport 0000:00:1c.0: PME: Signaling with IRQ 125 Mar 25 02:36:39.512510 kernel: pcieport 0000:00:1c.3: PME: Signaling with IRQ 126 Mar 25 02:36:39.512564 kernel: thermal LNXTHERM:00: registered as thermal_zone0 Mar 25 02:36:39.512574 kernel: ACPI: thermal: Thermal 
Zone [TZ00] (28 C) Mar 25 02:36:39.512580 kernel: ERST: Error Record Serialization Table (ERST) support is initialized. Mar 25 02:36:39.512587 kernel: pstore: Using crash dump compression: deflate Mar 25 02:36:39.512593 kernel: pstore: Registered erst as persistent store backend Mar 25 02:36:39.512599 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Mar 25 02:36:39.512605 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 25 02:36:39.512611 kernel: 00:02: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Mar 25 02:36:39.512617 kernel: 00:03: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A Mar 25 02:36:39.512623 kernel: hpet_acpi_add: no address or irqs in _CRS Mar 25 02:36:39.512678 kernel: tpm_tis MSFT0101:00: 2.0 TPM (device-id 0x1B, rev-id 16) Mar 25 02:36:39.512689 kernel: i8042: PNP: No PS/2 controller found. Mar 25 02:36:39.512736 kernel: rtc_cmos rtc_cmos: RTC can wake from S4 Mar 25 02:36:39.512782 kernel: rtc_cmos rtc_cmos: registered as rtc0 Mar 25 02:36:39.512829 kernel: rtc_cmos rtc_cmos: setting system clock to 2025-03-25T02:36:38 UTC (1742870198) Mar 25 02:36:39.512876 kernel: rtc_cmos rtc_cmos: alarms up to one month, y3k, 114 bytes nvram Mar 25 02:36:39.512885 kernel: intel_pstate: Intel P-state driver initializing Mar 25 02:36:39.512891 kernel: intel_pstate: Disabling energy efficiency optimization Mar 25 02:36:39.512896 kernel: intel_pstate: HWP enabled Mar 25 02:36:39.512904 kernel: NET: Registered PF_INET6 protocol family Mar 25 02:36:39.512910 kernel: Segment Routing with IPv6 Mar 25 02:36:39.512916 kernel: In-situ OAM (IOAM) with IPv6 Mar 25 02:36:39.512921 kernel: NET: Registered PF_PACKET protocol family Mar 25 02:36:39.512927 kernel: Key type dns_resolver registered Mar 25 02:36:39.512933 kernel: microcode: Current revision: 0x00000102 Mar 25 02:36:39.512939 kernel: microcode: Updated early from: 0x000000f4 Mar 25 02:36:39.512945 kernel: microcode: Microcode Update Driver: v2.2. 
Mar 25 02:36:39.512951 kernel: IPI shorthand broadcast: enabled Mar 25 02:36:39.512958 kernel: sched_clock: Marking stable (2502189413, 1441291719)->(4506626171, -563145039) Mar 25 02:36:39.512964 kernel: registered taskstats version 1 Mar 25 02:36:39.512970 kernel: Loading compiled-in X.509 certificates Mar 25 02:36:39.512976 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: eff01054e94a599f8e404b9a9482f4e2220f5386' Mar 25 02:36:39.512981 kernel: Key type .fscrypt registered Mar 25 02:36:39.512987 kernel: Key type fscrypt-provisioning registered Mar 25 02:36:39.512993 kernel: ima: Allocated hash algorithm: sha1 Mar 25 02:36:39.512998 kernel: ima: No architecture policies found Mar 25 02:36:39.513004 kernel: clk: Disabling unused clocks Mar 25 02:36:39.513011 kernel: Freeing unused kernel image (initmem) memory: 43592K Mar 25 02:36:39.513017 kernel: Write protecting the kernel read-only data: 40960k Mar 25 02:36:39.513023 kernel: Freeing unused kernel image (rodata/data gap) memory: 1564K Mar 25 02:36:39.513028 kernel: Run /init as init process Mar 25 02:36:39.513034 kernel: with arguments: Mar 25 02:36:39.513040 kernel: /init Mar 25 02:36:39.513046 kernel: with environment: Mar 25 02:36:39.513051 kernel: HOME=/ Mar 25 02:36:39.513057 kernel: TERM=linux Mar 25 02:36:39.513064 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Mar 25 02:36:39.513070 systemd[1]: Successfully made /usr/ read-only. Mar 25 02:36:39.513078 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 25 02:36:39.513084 systemd[1]: Detected architecture x86-64. Mar 25 02:36:39.513090 systemd[1]: Running in initrd. 
Mar 25 02:36:39.513096 systemd[1]: No hostname configured, using default hostname. Mar 25 02:36:39.513102 systemd[1]: Hostname set to . Mar 25 02:36:39.513109 systemd[1]: Initializing machine ID from random generator. Mar 25 02:36:39.513115 systemd[1]: Queued start job for default target initrd.target. Mar 25 02:36:39.513121 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 25 02:36:39.513128 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 25 02:36:39.513134 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Mar 25 02:36:39.513140 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 25 02:36:39.513146 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 25 02:36:39.513152 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 25 02:36:39.513160 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 25 02:36:39.513167 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 25 02:36:39.513173 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 02:36:39.513179 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 25 02:36:39.513185 systemd[1]: Reached target paths.target - Path Units. Mar 25 02:36:39.513191 systemd[1]: Reached target slices.target - Slice Units. Mar 25 02:36:39.513197 systemd[1]: Reached target swap.target - Swaps. Mar 25 02:36:39.513204 systemd[1]: Reached target timers.target - Timer Units. Mar 25 02:36:39.513211 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. 
Mar 25 02:36:39.513217 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 25 02:36:39.513223 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 25 02:36:39.513229 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Mar 25 02:36:39.513235 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 25 02:36:39.513241 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 25 02:36:39.513247 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 25 02:36:39.513253 systemd[1]: Reached target sockets.target - Socket Units. Mar 25 02:36:39.513260 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 25 02:36:39.513267 kernel: tsc: Refined TSC clocksource calibration: 3407.999 MHz Mar 25 02:36:39.513273 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd336761, max_idle_ns: 440795243819 ns Mar 25 02:36:39.513278 kernel: clocksource: Switched to clocksource tsc Mar 25 02:36:39.513284 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 25 02:36:39.513291 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 25 02:36:39.513297 systemd[1]: Starting systemd-fsck-usr.service... Mar 25 02:36:39.513303 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 25 02:36:39.513310 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 25 02:36:39.513327 systemd-journald[269]: Collecting audit messages is disabled. Mar 25 02:36:39.513342 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 25 02:36:39.513349 systemd-journald[269]: Journal started Mar 25 02:36:39.513365 systemd-journald[269]: Runtime Journal (/run/log/journal/f6a5c9cacb934bf8a2540370db4d15cd) is 8M, max 639.8M, 631.8M free. 
Mar 25 02:36:39.498723 systemd-modules-load[273]: Inserted module 'overlay' Mar 25 02:36:39.551075 systemd[1]: Started systemd-journald.service - Journal Service. Mar 25 02:36:39.551087 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 25 02:36:39.551098 kernel: Bridge firewalling registered Mar 25 02:36:39.551403 systemd-modules-load[273]: Inserted module 'br_netfilter' Mar 25 02:36:39.551795 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 25 02:36:39.551892 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 25 02:36:39.551986 systemd[1]: Finished systemd-fsck-usr.service. Mar 25 02:36:39.552068 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 25 02:36:39.553178 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 25 02:36:39.553530 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 25 02:36:39.554001 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 25 02:36:39.586960 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 02:36:39.694418 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 25 02:36:39.714127 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 25 02:36:39.735921 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 25 02:36:39.763504 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 25 02:36:39.785823 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 25 02:36:39.796624 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... 
Mar 25 02:36:39.826558 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 25 02:36:39.859647 systemd-resolved[299]: Positive Trust Anchors: Mar 25 02:36:39.859655 systemd-resolved[299]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 25 02:36:39.859689 systemd-resolved[299]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 25 02:36:39.862064 systemd-resolved[299]: Defaulting to hostname 'linux'. Mar 25 02:36:39.862869 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 25 02:36:39.862941 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 25 02:36:39.866966 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 02:36:39.878381 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... 
Mar 25 02:36:39.923459 dracut-cmdline[313]: dracut-dracut-053 Mar 25 02:36:39.930769 dracut-cmdline[313]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=e7a00b7ee8d97e8d255663e9d3fa92277da8316702fb7f6d664fd7b137c307e9 Mar 25 02:36:40.092659 kernel: SCSI subsystem initialized Mar 25 02:36:40.103675 kernel: Loading iSCSI transport class v2.0-870. Mar 25 02:36:40.116660 kernel: iscsi: registered transport (tcp) Mar 25 02:36:40.137501 kernel: iscsi: registered transport (qla4xxx) Mar 25 02:36:40.137518 kernel: QLogic iSCSI HBA Driver Mar 25 02:36:40.160502 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 25 02:36:40.172832 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Mar 25 02:36:40.246928 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Mar 25 02:36:40.246957 kernel: device-mapper: uevent: version 1.0.3 Mar 25 02:36:40.255743 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Mar 25 02:36:40.291694 kernel: raid6: avx2x4 gen() 46876 MB/s Mar 25 02:36:40.312694 kernel: raid6: avx2x2 gen() 53794 MB/s Mar 25 02:36:40.338771 kernel: raid6: avx2x1 gen() 45184 MB/s Mar 25 02:36:40.338787 kernel: raid6: using algorithm avx2x2 gen() 53794 MB/s Mar 25 02:36:40.365819 kernel: raid6: .... 
xor() 32334 MB/s, rmw enabled Mar 25 02:36:40.365838 kernel: raid6: using avx2x2 recovery algorithm Mar 25 02:36:40.386633 kernel: xor: automatically using best checksumming function avx Mar 25 02:36:40.484661 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 25 02:36:40.490369 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 25 02:36:40.501793 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 25 02:36:40.551385 systemd-udevd[496]: Using default interface naming scheme 'v255'. Mar 25 02:36:40.555787 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 25 02:36:40.573472 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Mar 25 02:36:40.619506 dracut-pre-trigger[507]: rd.md=0: removing MD RAID activation Mar 25 02:36:40.636188 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Mar 25 02:36:40.647917 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 25 02:36:40.738201 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 25 02:36:40.783931 kernel: pps_core: LinuxPPS API ver. 1 registered Mar 25 02:36:40.783947 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Mar 25 02:36:40.783959 kernel: cryptd: max_cpu_qlen set to 1000 Mar 25 02:36:40.783966 kernel: ACPI: bus type USB registered Mar 25 02:36:40.783973 kernel: usbcore: registered new interface driver usbfs Mar 25 02:36:40.764614 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 25 02:36:40.807098 kernel: usbcore: registered new interface driver hub Mar 25 02:36:40.807114 kernel: usbcore: registered new device driver usb Mar 25 02:36:40.807121 kernel: PTP clock support registered Mar 25 02:36:40.807135 kernel: libata version 3.00 loaded. Mar 25 02:36:40.807432 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. 
Mar 25 02:36:40.918875 kernel: AVX2 version of gcm_enc/dec engaged.
Mar 25 02:36:40.918892 kernel: AES CTR mode by8 optimization enabled
Mar 25 02:36:40.918900 kernel: ahci 0000:00:17.0: version 3.0
Mar 25 02:36:40.918995 kernel: igb: Intel(R) Gigabit Ethernet Network Driver
Mar 25 02:36:40.919004 kernel: igb: Copyright (c) 2007-2014 Intel Corporation.
Mar 25 02:36:40.919011 kernel: ahci 0000:00:17.0: AHCI 0001.0301 32 slots 7 ports 6 Gbps 0x7f impl SATA mode
Mar 25 02:36:40.919083 kernel: ahci 0000:00:17.0: flags: 64bit ncq sntf clo only pio slum part ems deso sadm sds apst
Mar 25 02:36:40.919151 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller
Mar 25 02:36:40.993188 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 1
Mar 25 02:36:40.993262 kernel: igb 0000:03:00.0: added PHC on eth0
Mar 25 02:36:40.993337 kernel: xhci_hcd 0000:00:14.0: hcc params 0x200077c1 hci version 0x110 quirks 0x0000000000009810
Mar 25 02:36:40.993401 kernel: igb 0000:03:00.0: Intel(R) Gigabit Ethernet Network Connection
Mar 25 02:36:40.993465 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller
Mar 25 02:36:40.993526 kernel: scsi host0: ahci
Mar 25 02:36:40.993593 kernel: scsi host1: ahci
Mar 25 02:36:40.993658 kernel: scsi host2: ahci
Mar 25 02:36:40.993716 kernel: scsi host3: ahci
Mar 25 02:36:40.993774 kernel: scsi host4: ahci
Mar 25 02:36:40.993834 kernel: scsi host5: ahci
Mar 25 02:36:40.993892 kernel: scsi host6: ahci
Mar 25 02:36:40.993950 kernel: ata1: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516100 irq 127
Mar 25 02:36:40.993958 kernel: ata2: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516180 irq 127
Mar 25 02:36:40.993966 kernel: ata3: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516200 irq 127
Mar 25 02:36:40.993973 kernel: ata4: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516280 irq 127
Mar 25 02:36:40.993980 kernel: ata5: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516300 irq 127
Mar 25 02:36:40.993987 kernel: ata6: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516380 irq 127
Mar 25 02:36:40.993994 kernel: ata7: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516400 irq 127
Mar 25 02:36:40.994001 kernel: igb 0000:03:00.0: eth0: (PCIe:2.5Gb/s:Width x1) 3c:ec:ef:6a:ef:e6
Mar 25 02:36:40.994069 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 2
Mar 25 02:36:40.994132 kernel: igb 0000:03:00.0: eth0: PBA No: 010000-000
Mar 25 02:36:40.994196 kernel: xhci_hcd 0000:00:14.0: Host supports USB 3.1 Enhanced SuperSpeed
Mar 25 02:36:40.994260 kernel: igb 0000:03:00.0: Using MSI-X interrupts. 4 rx queue(s), 4 tx queue(s)
Mar 25 02:36:40.994324 kernel: hub 1-0:1.0: USB hub found
Mar 25 02:36:40.994390 kernel: igb 0000:04:00.0: added PHC on eth1
Mar 25 02:36:41.054152 kernel: hub 1-0:1.0: 16 ports detected
Mar 25 02:36:41.054220 kernel: igb 0000:04:00.0: Intel(R) Gigabit Ethernet Network Connection
Mar 25 02:36:41.054286 kernel: hub 2-0:1.0: USB hub found
Mar 25 02:36:41.054353 kernel: igb 0000:04:00.0: eth1: (PCIe:2.5Gb/s:Width x1) 3c:ec:ef:6a:ef:e7
Mar 25 02:36:41.054418 kernel: hub 2-0:1.0: 10 ports detected
Mar 25 02:36:41.054478 kernel: igb 0000:04:00.0: eth1: PBA No: 010000-000
Mar 25 02:36:41.054541 kernel: igb 0000:04:00.0: Using MSI-X interrupts. 4 rx queue(s), 4 tx queue(s)
Mar 25 02:36:40.807546 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 25 02:36:41.082391 kernel: mlx5_core 0000:01:00.0: firmware version: 14.27.1016
Mar 25 02:36:41.548792 kernel: mlx5_core 0000:01:00.0: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link)
Mar 25 02:36:41.548878 kernel: ata7: SATA link down (SStatus 0 SControl 300)
Mar 25 02:36:41.548888 kernel: ata6: SATA link down (SStatus 0 SControl 300)
Mar 25 02:36:41.548895 kernel: ata3: SATA link down (SStatus 0 SControl 300)
Mar 25 02:36:41.548907 kernel: ata5: SATA link down (SStatus 0 SControl 300)
Mar 25 02:36:41.548914 kernel: usb 1-14: new high-speed USB device number 2 using xhci_hcd
Mar 25 02:36:41.549030 kernel: ata4: SATA link down (SStatus 0 SControl 300)
Mar 25 02:36:41.549039 kernel: ata2: SATA link up 6.0 Gbps (SStatus 133 SControl 300)
Mar 25 02:36:41.549047 kernel: ata1: SATA link up 6.0 Gbps (SStatus 133 SControl 300)
Mar 25 02:36:41.549054 kernel: ata2.00: ATA-11: Micron_5300_MTFDDAK480TDT, D3MU001, max UDMA/133
Mar 25 02:36:41.549061 kernel: ata1.00: ATA-11: Micron_5300_MTFDDAK480TDT, D3MU001, max UDMA/133
Mar 25 02:36:41.549068 kernel: ata1.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA
Mar 25 02:36:41.549077 kernel: ata2.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA
Mar 25 02:36:41.549085 kernel: ata1.00: Features: NCQ-prio
Mar 25 02:36:41.549092 kernel: ata2.00: Features: NCQ-prio
Mar 25 02:36:41.549099 kernel: ata1.00: configured for UDMA/133
Mar 25 02:36:41.549106 kernel: ata2.00: configured for UDMA/133
Mar 25 02:36:41.549113 kernel: scsi 0:0:0:0: Direct-Access ATA Micron_5300_MTFD U001 PQ: 0 ANSI: 5
Mar 25 02:36:41.549188 kernel: scsi 1:0:0:0: Direct-Access ATA Micron_5300_MTFD U001 PQ: 0 ANSI: 5
Mar 25 02:36:41.549256 kernel: igb 0000:03:00.0 eno1: renamed from eth0
Mar 25 02:36:41.549328 kernel: mlx5_core 0000:01:00.0: E-Switch: Total vports 10, per vport: max uc(1024) max mc(16384)
Mar 25 02:36:41.549395 kernel: hub 1-14:1.0: USB hub found
Mar 25 02:36:41.549474 kernel: mlx5_core 0000:01:00.0: Port module event: module 0, Cable plugged
Mar 25 02:36:41.549540 kernel: hub 1-14:1.0: 4 ports detected
Mar 25 02:36:41.549611 kernel: igb 0000:04:00.0 eno2: renamed from eth1
Mar 25 02:36:41.549690 kernel: ata2.00: Enabling discard_zeroes_data
Mar 25 02:36:41.549698 kernel: ata1.00: Enabling discard_zeroes_data
Mar 25 02:36:41.549705 kernel: sd 1:0:0:0: [sdb] 937703088 512-byte logical blocks: (480 GB/447 GiB)
Mar 25 02:36:41.549771 kernel: sd 0:0:0:0: [sda] 937703088 512-byte logical blocks: (480 GB/447 GiB)
Mar 25 02:36:41.549834 kernel: sd 1:0:0:0: [sdb] 4096-byte physical blocks
Mar 25 02:36:41.549893 kernel: sd 1:0:0:0: [sdb] Write Protect is off
Mar 25 02:36:41.549954 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks
Mar 25 02:36:41.550014 kernel: sd 1:0:0:0: [sdb] Mode Sense: 00 3a 00 00
Mar 25 02:36:41.550075 kernel: sd 1:0:0:0: [sdb] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Mar 25 02:36:41.550136 kernel: sd 0:0:0:0: [sda] Write Protect is off
Mar 25 02:36:41.550197 kernel: sd 0:0:0:0: [sda] Mode Sense: 00 3a 00 00
Mar 25 02:36:41.550258 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Mar 25 02:36:41.550316 kernel: sd 1:0:0:0: [sdb] Preferred minimum I/O size 4096 bytes
Mar 25 02:36:41.550376 kernel: sd 0:0:0:0: [sda] Preferred minimum I/O size 4096 bytes
Mar 25 02:36:41.550435 kernel: ata2.00: Enabling discard_zeroes_data
Mar 25 02:36:41.550443 kernel: ata1.00: Enabling discard_zeroes_data
Mar 25 02:36:41.550450 kernel: sd 1:0:0:0: [sdb] Attached SCSI disk
Mar 25 02:36:41.550508 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Mar 25 02:36:41.550518 kernel: GPT:9289727 != 937703087
Mar 25 02:36:41.550525 kernel: GPT:Alternate GPT header not at the end of the disk.
Mar 25 02:36:41.550532 kernel: GPT:9289727 != 937703087
Mar 25 02:36:41.550539 kernel: GPT: Use GNU Parted to correct GPT errors.
Mar 25 02:36:41.550547 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 25 02:36:41.550554 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Mar 25 02:36:41.550614 kernel: mlx5_core 0000:01:00.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic)
Mar 25 02:36:41.550688 kernel: BTRFS: device fsid 6d9424cd-1432-492b-b006-b311869817e2 devid 1 transid 39 /dev/sda3 scanned by (udev-worker) (563)
Mar 25 02:36:41.550698 kernel: mlx5_core 0000:01:00.1: firmware version: 14.27.1016
Mar 25 02:36:42.097537 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/sda6 scanned by (udev-worker) (567)
Mar 25 02:36:42.097600 kernel: mlx5_core 0000:01:00.1: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link)
Mar 25 02:36:42.098034 kernel: usb 1-14.1: new low-speed USB device number 3 using xhci_hcd
Mar 25 02:36:42.098540 kernel: ata1.00: Enabling discard_zeroes_data
Mar 25 02:36:42.098579 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 25 02:36:42.098611 kernel: ata1.00: Enabling discard_zeroes_data
Mar 25 02:36:42.098668 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 25 02:36:42.098706 kernel: hid: raw HID events driver (C) Jiri Kosina
Mar 25 02:36:42.098753 kernel: usbcore: registered new interface driver usbhid
Mar 25 02:36:42.098785 kernel: usbhid: USB HID core driver
Mar 25 02:36:42.098816 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.0/0003:0557:2419.0001/input/input0
Mar 25 02:36:42.098847 kernel: mlx5_core 0000:01:00.1: E-Switch: Total vports 10, per vport: max uc(1024) max mc(16384)
Mar 25 02:36:42.099185 kernel: mlx5_core 0000:01:00.1: Port module event: module 1, Cable plugged
Mar 25 02:36:42.099484 kernel: hid-generic 0003:0557:2419.0001: input,hidraw0: USB HID v1.00 Keyboard [HID 0557:2419] on usb-0000:00:14.0-14.1/input0
Mar 25 02:36:42.099854 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.1/0003:0557:2419.0002/input/input1
Mar 25 02:36:42.099901 kernel: hid-generic 0003:0557:2419.0002: input,hidraw1: USB HID v1.00 Mouse [HID 0557:2419] on usb-0000:00:14.0-14.1/input1
Mar 25 02:36:42.100232 kernel: mlx5_core 0000:01:00.1: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic)
Mar 25 02:36:41.097827 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 25 02:36:42.120879 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: renamed from eth1
Mar 25 02:36:42.121279 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: renamed from eth0
Mar 25 02:36:41.108727 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 25 02:36:41.108822 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 25 02:36:41.119775 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 25 02:36:41.130207 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 25 02:36:41.139910 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 25 02:36:41.152866 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 25 02:36:41.163474 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 25 02:36:42.204867 disk-uuid[708]: Primary Header is updated.
Mar 25 02:36:42.204867 disk-uuid[708]: Secondary Entries is updated.
Mar 25 02:36:42.204867 disk-uuid[708]: Secondary Header is updated.
Mar 25 02:36:41.163514 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 25 02:36:41.163529 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 25 02:36:41.163992 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 25 02:36:41.232132 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 25 02:36:41.238964 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 25 02:36:41.307554 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 25 02:36:41.372895 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 25 02:36:41.562797 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Micron_5300_MTFDDAK480TDT ROOT.
Mar 25 02:36:41.607533 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Micron_5300_MTFDDAK480TDT EFI-SYSTEM.
Mar 25 02:36:41.629863 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Micron_5300_MTFDDAK480TDT USR-A.
Mar 25 02:36:41.640706 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Micron_5300_MTFDDAK480TDT USR-A.
Mar 25 02:36:41.658487 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Micron_5300_MTFDDAK480TDT OEM.
Mar 25 02:36:41.677160 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 25 02:36:42.716464 kernel: ata1.00: Enabling discard_zeroes_data
Mar 25 02:36:42.724550 disk-uuid[709]: The operation has completed successfully.
Mar 25 02:36:42.733711 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 25 02:36:42.765145 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 25 02:36:42.765193 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 25 02:36:42.808646 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 25 02:36:42.836508 sh[739]: Success
Mar 25 02:36:42.851677 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2"
Mar 25 02:36:42.893770 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 25 02:36:42.904848 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 25 02:36:42.937854 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 25 02:36:43.002738 kernel: BTRFS info (device dm-0): first mount of filesystem 6d9424cd-1432-492b-b006-b311869817e2
Mar 25 02:36:43.002755 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Mar 25 02:36:43.002763 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Mar 25 02:36:43.002849 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Mar 25 02:36:43.002856 kernel: BTRFS info (device dm-0): using free space tree
Mar 25 02:36:43.002863 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Mar 25 02:36:43.000206 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 25 02:36:43.011078 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 25 02:36:43.011522 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 25 02:36:43.020155 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 25 02:36:43.085515 kernel: BTRFS info (device sda6): first mount of filesystem 3596bdb1-01cf-4fa7-b56c-116513d97811
Mar 25 02:36:43.085536 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 25 02:36:43.092550 kernel: BTRFS info (device sda6): using free space tree
Mar 25 02:36:43.108069 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 25 02:36:43.108086 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 25 02:36:43.121642 kernel: BTRFS info (device sda6): last unmount of filesystem 3596bdb1-01cf-4fa7-b56c-116513d97811
Mar 25 02:36:43.122311 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 25 02:36:43.132442 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 25 02:36:43.170841 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 25 02:36:43.193057 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 25 02:36:43.236592 ignition[855]: Ignition 2.20.0
Mar 25 02:36:43.236597 ignition[855]: Stage: fetch-offline
Mar 25 02:36:43.237607 systemd-networkd[919]: lo: Link UP
Mar 25 02:36:43.236616 ignition[855]: no configs at "/usr/lib/ignition/base.d"
Mar 25 02:36:43.237610 systemd-networkd[919]: lo: Gained carrier
Mar 25 02:36:43.236621 ignition[855]: no config dir at "/usr/lib/ignition/base.platform.d/packet"
Mar 25 02:36:43.239302 unknown[855]: fetched base config from "system"
Mar 25 02:36:43.236677 ignition[855]: parsed url from cmdline: ""
Mar 25 02:36:43.239306 unknown[855]: fetched user config from "system"
Mar 25 02:36:43.236679 ignition[855]: no config URL provided
Mar 25 02:36:43.240129 systemd-networkd[919]: Enumeration completed
Mar 25 02:36:43.236681 ignition[855]: reading system config file "/usr/lib/ignition/user.ign"
Mar 25 02:36:43.240192 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 25 02:36:43.236703 ignition[855]: parsing config with SHA512: 7ab5ac613122594119411e4f33d2796c6947e1217c4f9fd81a7a36ad15c9a3645e755129512e43cf68d0c8ea9547e5e51071506fa3f1250a847336d676ccf714
Mar 25 02:36:43.240764 systemd-networkd[919]: eno1: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 25 02:36:43.239774 ignition[855]: fetch-offline: fetch-offline passed
Mar 25 02:36:43.257993 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 25 02:36:43.239780 ignition[855]: POST message to Packet Timeline
Mar 25 02:36:43.268620 systemd-networkd[919]: eno2: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 25 02:36:43.239789 ignition[855]: POST Status error: resource requires networking
Mar 25 02:36:43.277098 systemd[1]: Reached target network.target - Network.
Mar 25 02:36:43.239856 ignition[855]: Ignition finished successfully
Mar 25 02:36:43.296818 systemd-networkd[919]: enp1s0f0np0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 25 02:36:43.338035 ignition[933]: Ignition 2.20.0
Mar 25 02:36:43.299798 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
Mar 25 02:36:43.484848 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: Link up
Mar 25 02:36:43.338040 ignition[933]: Stage: kargs
Mar 25 02:36:43.300587 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 25 02:36:43.338153 ignition[933]: no configs at "/usr/lib/ignition/base.d"
Mar 25 02:36:43.474142 systemd-networkd[919]: enp1s0f1np1: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 25 02:36:43.338160 ignition[933]: no config dir at "/usr/lib/ignition/base.platform.d/packet"
Mar 25 02:36:43.338731 ignition[933]: kargs: kargs passed
Mar 25 02:36:43.338734 ignition[933]: POST message to Packet Timeline
Mar 25 02:36:43.338746 ignition[933]: GET https://metadata.packet.net/metadata: attempt #1
Mar 25 02:36:43.339229 ignition[933]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:50699->[::1]:53: read: connection refused
Mar 25 02:36:43.540109 ignition[933]: GET https://metadata.packet.net/metadata: attempt #2
Mar 25 02:36:43.541174 ignition[933]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:33685->[::1]:53: read: connection refused
Mar 25 02:36:43.690741 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up
Mar 25 02:36:43.691357 systemd-networkd[919]: eno1: Link UP
Mar 25 02:36:43.691483 systemd-networkd[919]: eno2: Link UP
Mar 25 02:36:43.691602 systemd-networkd[919]: enp1s0f0np0: Link UP
Mar 25 02:36:43.691766 systemd-networkd[919]: enp1s0f0np0: Gained carrier
Mar 25 02:36:43.704884 systemd-networkd[919]: enp1s0f1np1: Link UP
Mar 25 02:36:43.733832 systemd-networkd[919]: enp1s0f0np0: DHCPv4 address 147.75.90.239/31, gateway 147.75.90.238 acquired from 145.40.83.140
Mar 25 02:36:43.941701 ignition[933]: GET https://metadata.packet.net/metadata: attempt #3
Mar 25 02:36:43.942873 ignition[933]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:51789->[::1]:53: read: connection refused
Mar 25 02:36:44.482415 systemd-networkd[919]: enp1s0f1np1: Gained carrier
Mar 25 02:36:44.743298 ignition[933]: GET https://metadata.packet.net/metadata: attempt #4
Mar 25 02:36:44.744471 ignition[933]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:38848->[::1]:53: read: connection refused
Mar 25 02:36:44.866278 systemd-networkd[919]: enp1s0f0np0: Gained IPv6LL
Mar 25 02:36:45.634237 systemd-networkd[919]: enp1s0f1np1: Gained IPv6LL
Mar 25 02:36:46.346007 ignition[933]: GET https://metadata.packet.net/metadata: attempt #5
Mar 25 02:36:46.347219 ignition[933]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:47616->[::1]:53: read: connection refused
Mar 25 02:36:49.550674 ignition[933]: GET https://metadata.packet.net/metadata: attempt #6
Mar 25 02:36:50.439278 ignition[933]: GET result: OK
Mar 25 02:36:50.782703 ignition[933]: Ignition finished successfully
Mar 25 02:36:50.785860 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 25 02:36:50.803062 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 25 02:36:50.841120 ignition[953]: Ignition 2.20.0
Mar 25 02:36:50.841126 ignition[953]: Stage: disks
Mar 25 02:36:50.841253 ignition[953]: no configs at "/usr/lib/ignition/base.d"
Mar 25 02:36:50.841262 ignition[953]: no config dir at "/usr/lib/ignition/base.platform.d/packet"
Mar 25 02:36:50.841992 ignition[953]: disks: disks passed
Mar 25 02:36:50.841996 ignition[953]: POST message to Packet Timeline
Mar 25 02:36:50.842010 ignition[953]: GET https://metadata.packet.net/metadata: attempt #1
Mar 25 02:36:51.837450 ignition[953]: GET result: OK
Mar 25 02:36:52.215141 ignition[953]: Ignition finished successfully
Mar 25 02:36:52.218667 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 25 02:36:52.233362 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 25 02:36:52.252952 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 25 02:36:52.274047 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 25 02:36:52.294942 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 25 02:36:52.312039 systemd[1]: Reached target basic.target - Basic System.
Mar 25 02:36:52.324899 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 25 02:36:52.380790 systemd-fsck[973]: ROOT: clean, 14/553520 files, 52654/553472 blocks
Mar 25 02:36:52.391004 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 25 02:36:52.407494 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 25 02:36:52.508703 kernel: EXT4-fs (sda9): mounted filesystem 4e6dca82-2e50-453c-be25-61f944b72008 r/w with ordered data mode. Quota mode: none.
Mar 25 02:36:52.508744 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 25 02:36:52.518136 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 25 02:36:52.526832 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 25 02:36:52.544623 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 25 02:36:52.570487 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Mar 25 02:36:52.634864 kernel: BTRFS: device label OEM devid 1 transid 19 /dev/sda6 scanned by mount (982)
Mar 25 02:36:52.634879 kernel: BTRFS info (device sda6): first mount of filesystem 3596bdb1-01cf-4fa7-b56c-116513d97811
Mar 25 02:36:52.634887 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 25 02:36:52.634897 kernel: BTRFS info (device sda6): using free space tree
Mar 25 02:36:52.634905 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 25 02:36:52.634912 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 25 02:36:52.572834 systemd[1]: Starting flatcar-static-network.service - Flatcar Static Network Agent...
Mar 25 02:36:52.645926 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 25 02:36:52.645953 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 25 02:36:52.661016 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 25 02:36:52.716910 coreos-metadata[984]: Mar 25 02:36:52.713 INFO Fetching https://metadata.packet.net/metadata: Attempt #1
Mar 25 02:36:52.691027 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 25 02:36:52.747717 coreos-metadata[985]: Mar 25 02:36:52.713 INFO Fetching https://metadata.packet.net/metadata: Attempt #1
Mar 25 02:36:52.709587 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 25 02:36:52.768783 initrd-setup-root[1014]: cut: /sysroot/etc/passwd: No such file or directory
Mar 25 02:36:52.778712 initrd-setup-root[1021]: cut: /sysroot/etc/group: No such file or directory
Mar 25 02:36:52.788689 initrd-setup-root[1028]: cut: /sysroot/etc/shadow: No such file or directory
Mar 25 02:36:52.798745 initrd-setup-root[1035]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 25 02:36:52.822130 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 25 02:36:52.832529 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 25 02:36:52.858144 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 25 02:36:52.885852 kernel: BTRFS info (device sda6): last unmount of filesystem 3596bdb1-01cf-4fa7-b56c-116513d97811
Mar 25 02:36:52.862355 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 25 02:36:52.898153 ignition[1102]: INFO : Ignition 2.20.0
Mar 25 02:36:52.898153 ignition[1102]: INFO : Stage: mount
Mar 25 02:36:52.912667 ignition[1102]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 25 02:36:52.912667 ignition[1102]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet"
Mar 25 02:36:52.912667 ignition[1102]: INFO : mount: mount passed
Mar 25 02:36:52.912667 ignition[1102]: INFO : POST message to Packet Timeline
Mar 25 02:36:52.912667 ignition[1102]: INFO : GET https://metadata.packet.net/metadata: attempt #1
Mar 25 02:36:52.911188 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 25 02:36:53.323608 coreos-metadata[984]: Mar 25 02:36:53.323 INFO Fetch successful
Mar 25 02:36:53.402109 coreos-metadata[984]: Mar 25 02:36:53.402 INFO wrote hostname ci-4284.0.0-a-3a00d206eb to /sysroot/etc/hostname
Mar 25 02:36:53.403441 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 25 02:36:53.717102 coreos-metadata[985]: Mar 25 02:36:53.716 INFO Fetch successful
Mar 25 02:36:53.796286 systemd[1]: flatcar-static-network.service: Deactivated successfully.
Mar 25 02:36:53.796349 systemd[1]: Finished flatcar-static-network.service - Flatcar Static Network Agent.
Mar 25 02:36:53.961192 ignition[1102]: INFO : GET result: OK
Mar 25 02:36:54.440950 ignition[1102]: INFO : Ignition finished successfully
Mar 25 02:36:54.443979 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 25 02:36:54.465326 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 25 02:36:54.504927 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 25 02:36:54.565380 kernel: BTRFS: device label OEM devid 1 transid 20 /dev/sda6 scanned by mount (1126)
Mar 25 02:36:54.565408 kernel: BTRFS info (device sda6): first mount of filesystem 3596bdb1-01cf-4fa7-b56c-116513d97811
Mar 25 02:36:54.573458 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 25 02:36:54.579359 kernel: BTRFS info (device sda6): using free space tree
Mar 25 02:36:54.594645 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 25 02:36:54.594661 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 25 02:36:54.596623 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 25 02:36:54.630544 ignition[1143]: INFO : Ignition 2.20.0
Mar 25 02:36:54.630544 ignition[1143]: INFO : Stage: files
Mar 25 02:36:54.644868 ignition[1143]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 25 02:36:54.644868 ignition[1143]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet"
Mar 25 02:36:54.644868 ignition[1143]: DEBUG : files: compiled without relabeling support, skipping
Mar 25 02:36:54.644868 ignition[1143]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 25 02:36:54.644868 ignition[1143]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 25 02:36:54.644868 ignition[1143]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 25 02:36:54.644868 ignition[1143]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 25 02:36:54.644868 ignition[1143]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 25 02:36:54.644868 ignition[1143]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Mar 25 02:36:54.644868 ignition[1143]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1
Mar 25 02:36:54.635233 unknown[1143]: wrote ssh authorized keys file for user: core
Mar 25 02:36:54.774746 ignition[1143]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 25 02:36:54.785857 ignition[1143]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Mar 25 02:36:54.785857 ignition[1143]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 25 02:36:54.785857 ignition[1143]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 25 02:36:54.785857 ignition[1143]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 25 02:36:54.785857 ignition[1143]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 25 02:36:54.785857 ignition[1143]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 25 02:36:54.785857 ignition[1143]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 25 02:36:54.785857 ignition[1143]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 25 02:36:54.785857 ignition[1143]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 25 02:36:54.785857 ignition[1143]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 25 02:36:54.785857 ignition[1143]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 25 02:36:54.785857 ignition[1143]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw"
Mar 25 02:36:54.785857 ignition[1143]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw"
Mar 25 02:36:54.785857 ignition[1143]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw"
Mar 25 02:36:54.785857 ignition[1143]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.32.0-x86-64.raw: attempt #1
Mar 25 02:36:55.237382 ignition[1143]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 25 02:36:56.162014 ignition[1143]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw"
Mar 25 02:36:56.162014 ignition[1143]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 25 02:36:56.193860 ignition[1143]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 25 02:36:56.193860 ignition[1143]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 25 02:36:56.193860 ignition[1143]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 25 02:36:56.193860 ignition[1143]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Mar 25 02:36:56.193860 ignition[1143]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Mar 25 02:36:56.193860 ignition[1143]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 25 02:36:56.193860 ignition[1143]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 25 02:36:56.193860 ignition[1143]: INFO : files: files passed
Mar 25 02:36:56.193860 ignition[1143]: INFO : POST message to Packet Timeline
Mar 25 02:36:56.193860 ignition[1143]: INFO : GET https://metadata.packet.net/metadata: attempt #1
Mar 25 02:36:57.679333 ignition[1143]: INFO : GET result: OK
Mar 25 02:36:58.497194 ignition[1143]: INFO : Ignition finished successfully
Mar 25 02:36:58.500504 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 25 02:36:58.519047 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Mar 25 02:36:58.534249 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Mar 25 02:36:58.559101 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 25 02:36:58.559173 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Mar 25 02:36:58.605020 initrd-setup-root-after-ignition[1182]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 25 02:36:58.605020 initrd-setup-root-after-ignition[1182]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 25 02:36:58.618900 initrd-setup-root-after-ignition[1186]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 25 02:36:58.605975 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 25 02:36:58.643406 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 25 02:36:58.669137 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Mar 25 02:36:58.790119 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 25 02:36:58.790182 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 25 02:36:58.810254 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Mar 25 02:36:58.831891 systemd[1]: Reached target initrd.target - Initrd Default Target. Mar 25 02:36:58.852075 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Mar 25 02:36:58.854542 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Mar 25 02:36:58.940037 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 25 02:36:58.954792 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... 
Mar 25 02:36:59.006608 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Mar 25 02:36:59.017874 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 25 02:36:59.040089 systemd[1]: Stopped target timers.target - Timer Units. Mar 25 02:36:59.059181 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 25 02:36:59.059429 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 25 02:36:59.098069 systemd[1]: Stopped target initrd.target - Initrd Default Target. Mar 25 02:36:59.108263 systemd[1]: Stopped target basic.target - Basic System. Mar 25 02:36:59.127268 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Mar 25 02:36:59.146248 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 25 02:36:59.167243 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 25 02:36:59.188279 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 25 02:36:59.208254 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 25 02:36:59.229305 systemd[1]: Stopped target sysinit.target - System Initialization. Mar 25 02:36:59.251286 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 25 02:36:59.271248 systemd[1]: Stopped target swap.target - Swaps. Mar 25 02:36:59.289261 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 25 02:36:59.289715 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Mar 25 02:36:59.316369 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Mar 25 02:36:59.336289 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 02:36:59.357121 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 25 02:36:59.357484 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
Mar 25 02:36:59.379148 systemd[1]: dracut-initqueue.service: Deactivated successfully. Mar 25 02:36:59.379567 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Mar 25 02:36:59.411243 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Mar 25 02:36:59.411736 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Mar 25 02:36:59.431459 systemd[1]: Stopped target paths.target - Path Units. Mar 25 02:36:59.449118 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Mar 25 02:36:59.452883 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 25 02:36:59.470255 systemd[1]: Stopped target slices.target - Slice Units. Mar 25 02:36:59.489241 systemd[1]: Stopped target sockets.target - Socket Units. Mar 25 02:36:59.508250 systemd[1]: iscsid.socket: Deactivated successfully. Mar 25 02:36:59.508560 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Mar 25 02:36:59.529272 systemd[1]: iscsiuio.socket: Deactivated successfully. Mar 25 02:36:59.529569 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 25 02:36:59.553374 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 25 02:36:59.553817 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 25 02:36:59.574347 systemd[1]: ignition-files.service: Deactivated successfully. 
Mar 25 02:36:59.704842 ignition[1206]: INFO : Ignition 2.20.0 Mar 25 02:36:59.704842 ignition[1206]: INFO : Stage: umount Mar 25 02:36:59.704842 ignition[1206]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 25 02:36:59.704842 ignition[1206]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Mar 25 02:36:59.704842 ignition[1206]: INFO : umount: umount passed Mar 25 02:36:59.704842 ignition[1206]: INFO : POST message to Packet Timeline Mar 25 02:36:59.704842 ignition[1206]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Mar 25 02:36:59.574769 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 25 02:36:59.593347 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Mar 25 02:36:59.593786 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Mar 25 02:36:59.616698 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Mar 25 02:36:59.630838 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 25 02:36:59.630920 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Mar 25 02:36:59.658664 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 25 02:36:59.668925 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 25 02:36:59.669134 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 25 02:36:59.695917 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 25 02:36:59.695986 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 25 02:36:59.732345 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 25 02:36:59.733234 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 25 02:36:59.733326 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Mar 25 02:36:59.742411 systemd[1]: initrd-cleanup.service: Deactivated successfully. 
Mar 25 02:36:59.742515 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Mar 25 02:37:00.848000 ignition[1206]: INFO : GET result: OK Mar 25 02:37:01.191140 ignition[1206]: INFO : Ignition finished successfully Mar 25 02:37:01.192702 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 25 02:37:01.192851 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Mar 25 02:37:01.210912 systemd[1]: Stopped target network.target - Network. Mar 25 02:37:01.218350 systemd[1]: ignition-disks.service: Deactivated successfully. Mar 25 02:37:01.218552 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Mar 25 02:37:01.245158 systemd[1]: ignition-kargs.service: Deactivated successfully. Mar 25 02:37:01.245336 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Mar 25 02:37:01.263151 systemd[1]: ignition-setup.service: Deactivated successfully. Mar 25 02:37:01.263322 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Mar 25 02:37:01.271407 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Mar 25 02:37:01.271569 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Mar 25 02:37:01.299116 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 25 02:37:01.299297 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Mar 25 02:37:01.317515 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Mar 25 02:37:01.335240 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Mar 25 02:37:01.354815 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 25 02:37:01.355097 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Mar 25 02:37:01.377881 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Mar 25 02:37:01.378456 systemd[1]: systemd-networkd.service: Deactivated successfully. 
Mar 25 02:37:01.378801 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Mar 25 02:37:01.395504 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Mar 25 02:37:01.397153 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 25 02:37:01.397180 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Mar 25 02:37:01.413352 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Mar 25 02:37:01.422847 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 25 02:37:01.422885 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 25 02:37:01.439997 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 25 02:37:01.440073 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Mar 25 02:37:01.469258 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 25 02:37:01.469412 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Mar 25 02:37:01.479379 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Mar 25 02:37:01.479549 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 25 02:37:01.507340 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 25 02:37:01.531295 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Mar 25 02:37:01.531505 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Mar 25 02:37:01.532564 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 25 02:37:01.532943 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 25 02:37:01.550975 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Mar 25 02:37:01.551015 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. 
Mar 25 02:37:01.557904 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Mar 25 02:37:01.557932 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Mar 25 02:37:01.596087 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 25 02:37:01.596252 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Mar 25 02:37:01.626212 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 25 02:37:01.626397 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Mar 25 02:37:01.666763 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 25 02:37:01.666824 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 02:37:01.705660 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Mar 25 02:37:01.979875 systemd-journald[269]: Received SIGTERM from PID 1 (systemd). Mar 25 02:37:01.714922 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Mar 25 02:37:01.715085 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 25 02:37:01.744290 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 25 02:37:01.744456 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 02:37:01.770305 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Mar 25 02:37:01.770488 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Mar 25 02:37:01.771689 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Mar 25 02:37:01.771934 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Mar 25 02:37:01.830252 systemd[1]: network-cleanup.service: Deactivated successfully. Mar 25 02:37:01.830564 systemd[1]: Stopped network-cleanup.service - Network Cleanup. 
Mar 25 02:37:01.843876 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Mar 25 02:37:01.865223 systemd[1]: Starting initrd-switch-root.service - Switch Root... Mar 25 02:37:01.923609 systemd[1]: Switching root. Mar 25 02:37:02.103741 systemd-journald[269]: Journal stopped Mar 25 02:37:03.798945 kernel: SELinux: policy capability network_peer_controls=1 Mar 25 02:37:03.798961 kernel: SELinux: policy capability open_perms=1 Mar 25 02:37:03.798968 kernel: SELinux: policy capability extended_socket_class=1 Mar 25 02:37:03.798973 kernel: SELinux: policy capability always_check_network=0 Mar 25 02:37:03.798980 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 25 02:37:03.798985 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 25 02:37:03.798991 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Mar 25 02:37:03.798997 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Mar 25 02:37:03.799002 kernel: audit: type=1403 audit(1742870222.197:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Mar 25 02:37:03.799009 systemd[1]: Successfully loaded SELinux policy in 72.359ms. Mar 25 02:37:03.799017 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 7.450ms. Mar 25 02:37:03.799024 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 25 02:37:03.799030 systemd[1]: Detected architecture x86-64. Mar 25 02:37:03.799036 systemd[1]: Detected first boot. Mar 25 02:37:03.799042 systemd[1]: Hostname set to . Mar 25 02:37:03.799049 systemd[1]: Initializing machine ID from random generator. Mar 25 02:37:03.799056 zram_generator::config[1262]: No configuration found. 
Mar 25 02:37:03.799063 systemd[1]: Populated /etc with preset unit settings. Mar 25 02:37:03.799069 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Mar 25 02:37:03.799075 systemd[1]: initrd-switch-root.service: Deactivated successfully. Mar 25 02:37:03.799081 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Mar 25 02:37:03.799088 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Mar 25 02:37:03.799095 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Mar 25 02:37:03.799101 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Mar 25 02:37:03.799108 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Mar 25 02:37:03.799114 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Mar 25 02:37:03.799121 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Mar 25 02:37:03.799127 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Mar 25 02:37:03.799134 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Mar 25 02:37:03.799141 systemd[1]: Created slice user.slice - User and Session Slice. Mar 25 02:37:03.799148 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 25 02:37:03.799154 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 25 02:37:03.799160 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Mar 25 02:37:03.799167 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Mar 25 02:37:03.799173 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. 
Mar 25 02:37:03.799180 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 25 02:37:03.799186 systemd[1]: Expecting device dev-ttyS1.device - /dev/ttyS1... Mar 25 02:37:03.799193 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 02:37:03.799201 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Mar 25 02:37:03.799208 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Mar 25 02:37:03.799216 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Mar 25 02:37:03.799223 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Mar 25 02:37:03.799229 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 25 02:37:03.799236 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 25 02:37:03.799242 systemd[1]: Reached target slices.target - Slice Units. Mar 25 02:37:03.799250 systemd[1]: Reached target swap.target - Swaps. Mar 25 02:37:03.799257 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Mar 25 02:37:03.799263 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Mar 25 02:37:03.799270 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Mar 25 02:37:03.799276 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 25 02:37:03.799283 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 25 02:37:03.799291 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 25 02:37:03.799298 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Mar 25 02:37:03.799304 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Mar 25 02:37:03.799311 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... 
Mar 25 02:37:03.799318 systemd[1]: Mounting media.mount - External Media Directory... Mar 25 02:37:03.799324 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 25 02:37:03.799331 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Mar 25 02:37:03.799339 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Mar 25 02:37:03.799345 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Mar 25 02:37:03.799352 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Mar 25 02:37:03.799359 systemd[1]: Reached target machines.target - Containers. Mar 25 02:37:03.799366 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Mar 25 02:37:03.799372 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 25 02:37:03.799379 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 25 02:37:03.799386 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Mar 25 02:37:03.799393 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 25 02:37:03.799400 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 25 02:37:03.799407 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 25 02:37:03.799414 kernel: ACPI: bus type drm_connector registered Mar 25 02:37:03.799420 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Mar 25 02:37:03.799426 kernel: fuse: init (API version 7.39) Mar 25 02:37:03.799432 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... 
Mar 25 02:37:03.799439 kernel: loop: module loaded Mar 25 02:37:03.799445 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Mar 25 02:37:03.799453 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Mar 25 02:37:03.799460 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Mar 25 02:37:03.799466 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Mar 25 02:37:03.799473 systemd[1]: Stopped systemd-fsck-usr.service. Mar 25 02:37:03.799480 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 25 02:37:03.799488 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 25 02:37:03.799494 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 25 02:37:03.799510 systemd-journald[1366]: Collecting audit messages is disabled. Mar 25 02:37:03.799526 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 25 02:37:03.799533 systemd-journald[1366]: Journal started Mar 25 02:37:03.799549 systemd-journald[1366]: Runtime Journal (/run/log/journal/778c39bdbd8d4ae59a070870faf3e3bf) is 8M, max 639.8M, 631.8M free. Mar 25 02:37:02.649081 systemd[1]: Queued start job for default target multi-user.target. Mar 25 02:37:02.661440 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Mar 25 02:37:02.661710 systemd[1]: systemd-journald.service: Deactivated successfully. Mar 25 02:37:03.819678 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Mar 25 02:37:03.852712 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... 
Mar 25 02:37:03.873694 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 25 02:37:03.894840 systemd[1]: verity-setup.service: Deactivated successfully. Mar 25 02:37:03.894867 systemd[1]: Stopped verity-setup.service. Mar 25 02:37:03.920668 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 25 02:37:03.928659 systemd[1]: Started systemd-journald.service - Journal Service. Mar 25 02:37:03.938139 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Mar 25 02:37:03.947945 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Mar 25 02:37:03.957925 systemd[1]: Mounted media.mount - External Media Directory. Mar 25 02:37:03.967920 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Mar 25 02:37:03.977920 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Mar 25 02:37:03.987781 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Mar 25 02:37:03.998006 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Mar 25 02:37:04.010013 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 25 02:37:04.022132 systemd[1]: modprobe@configfs.service: Deactivated successfully. Mar 25 02:37:04.022335 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Mar 25 02:37:04.035223 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 25 02:37:04.035488 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 25 02:37:04.047503 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 25 02:37:04.047998 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 25 02:37:04.058640 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. 
Mar 25 02:37:04.059126 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 25 02:37:04.070574 systemd[1]: modprobe@fuse.service: Deactivated successfully. Mar 25 02:37:04.071147 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Mar 25 02:37:04.081727 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 25 02:37:04.082255 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 25 02:37:04.092731 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 25 02:37:04.103702 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 25 02:37:04.115665 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Mar 25 02:37:04.127674 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Mar 25 02:37:04.139874 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 25 02:37:04.159587 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 25 02:37:04.170583 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Mar 25 02:37:04.188148 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Mar 25 02:37:04.197809 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Mar 25 02:37:04.197837 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 25 02:37:04.209111 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Mar 25 02:37:04.221465 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Mar 25 02:37:04.244002 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... 
Mar 25 02:37:04.253891 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 25 02:37:04.267746 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Mar 25 02:37:04.285135 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Mar 25 02:37:04.295776 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 25 02:37:04.296418 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Mar 25 02:37:04.301751 systemd-journald[1366]: Time spent on flushing to /var/log/journal/778c39bdbd8d4ae59a070870faf3e3bf is 12.674ms for 1371 entries. Mar 25 02:37:04.301751 systemd-journald[1366]: System Journal (/var/log/journal/778c39bdbd8d4ae59a070870faf3e3bf) is 8M, max 195.6M, 187.6M free. Mar 25 02:37:04.333241 systemd-journald[1366]: Received client request to flush runtime journal. Mar 25 02:37:04.313738 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 25 02:37:04.314536 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 25 02:37:04.324569 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Mar 25 02:37:04.336530 systemd[1]: Starting systemd-sysusers.service - Create System Users... Mar 25 02:37:04.349678 kernel: loop0: detected capacity change from 0 to 109808 Mar 25 02:37:04.352535 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Mar 25 02:37:04.365528 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Mar 25 02:37:04.374630 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Mar 25 02:37:04.383835 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. 
Mar 25 02:37:04.394863 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Mar 25 02:37:04.409632 kernel: loop1: detected capacity change from 0 to 218376 Mar 25 02:37:04.411879 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Mar 25 02:37:04.422820 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Mar 25 02:37:04.433879 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 25 02:37:04.443874 systemd[1]: Finished systemd-sysusers.service - Create System Users. Mar 25 02:37:04.458133 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Mar 25 02:37:04.469633 kernel: loop2: detected capacity change from 0 to 8 Mar 25 02:37:04.475566 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Mar 25 02:37:04.494173 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 25 02:37:04.507676 kernel: loop3: detected capacity change from 0 to 151640 Mar 25 02:37:04.512283 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Mar 25 02:37:04.527968 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Mar 25 02:37:04.541347 udevadm[1408]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Mar 25 02:37:04.542412 systemd-tmpfiles[1422]: ACLs are not supported, ignoring. Mar 25 02:37:04.542421 systemd-tmpfiles[1422]: ACLs are not supported, ignoring. Mar 25 02:37:04.544973 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
Mar 25 02:37:04.569634 kernel: loop4: detected capacity change from 0 to 109808 Mar 25 02:37:04.588635 kernel: loop5: detected capacity change from 0 to 218376 Mar 25 02:37:04.607640 kernel: loop6: detected capacity change from 0 to 8 Mar 25 02:37:04.614633 kernel: loop7: detected capacity change from 0 to 151640 Mar 25 02:37:04.634603 (sd-merge)[1427]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-packet'. Mar 25 02:37:04.634865 (sd-merge)[1427]: Merged extensions into '/usr'. Mar 25 02:37:04.637865 systemd[1]: Reload requested from client PID 1403 ('systemd-sysext') (unit systemd-sysext.service)... Mar 25 02:37:04.637876 systemd[1]: Reloading... Mar 25 02:37:04.640790 ldconfig[1398]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 25 02:37:04.670642 zram_generator::config[1455]: No configuration found. Mar 25 02:37:04.752171 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 02:37:04.804144 systemd[1]: Reloading finished in 165 ms. Mar 25 02:37:04.821456 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Mar 25 02:37:04.832010 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Mar 25 02:37:04.844005 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Mar 25 02:37:04.868508 systemd[1]: Starting ensure-sysext.service... Mar 25 02:37:04.888196 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 25 02:37:04.909258 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 25 02:37:04.917031 systemd-tmpfiles[1513]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. 
Mar 25 02:37:04.917182 systemd-tmpfiles[1513]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 25 02:37:04.917661 systemd-tmpfiles[1513]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 25 02:37:04.917858 systemd-tmpfiles[1513]: ACLs are not supported, ignoring.
Mar 25 02:37:04.917894 systemd-tmpfiles[1513]: ACLs are not supported, ignoring.
Mar 25 02:37:04.919974 systemd-tmpfiles[1513]: Detected autofs mount point /boot during canonicalization of boot.
Mar 25 02:37:04.919978 systemd-tmpfiles[1513]: Skipping /boot
Mar 25 02:37:04.925169 systemd-tmpfiles[1513]: Detected autofs mount point /boot during canonicalization of boot.
Mar 25 02:37:04.925173 systemd-tmpfiles[1513]: Skipping /boot
Mar 25 02:37:04.933681 systemd[1]: Reload requested from client PID 1512 ('systemctl') (unit ensure-sysext.service)...
Mar 25 02:37:04.933688 systemd[1]: Reloading...
Mar 25 02:37:04.944363 systemd-udevd[1514]: Using default interface naming scheme 'v255'.
Mar 25 02:37:04.958664 zram_generator::config[1543]: No configuration found.
Mar 25 02:37:05.006637 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1620)
Mar 25 02:37:05.006676 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input2
Mar 25 02:37:05.020636 kernel: ACPI: button: Sleep Button [SLPB]
Mar 25 02:37:05.029637 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
Mar 25 02:37:05.034634 kernel: IPMI message handler: version 39.2
Mar 25 02:37:05.034669 kernel: mousedev: PS/2 mouse device common for all mice
Mar 25 02:37:05.040637 kernel: ACPI: button: Power Button [PWRF]
Mar 25 02:37:05.046511 kernel: ipmi device interface
Mar 25 02:37:05.053700 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 25 02:37:05.069855 kernel: i801_smbus 0000:00:1f.4: SPD Write Disable is set
Mar 25 02:37:05.099621 kernel: i801_smbus 0000:00:1f.4: SMBus using PCI interrupt
Mar 25 02:37:05.099744 kernel: mei_me 0000:00:16.0: Device doesn't have valid ME Interface
Mar 25 02:37:05.099845 kernel: mei_me 0000:00:16.4: Device doesn't have valid ME Interface
Mar 25 02:37:05.099940 kernel: ipmi_si: IPMI System Interface driver
Mar 25 02:37:05.099952 kernel: i2c i2c-0: 2/4 memory slots populated (from DMI)
Mar 25 02:37:05.100045 kernel: ipmi_si dmi-ipmi-si.0: ipmi_platform: probing via SMBIOS
Mar 25 02:37:05.118842 kernel: ipmi_platform: ipmi_si: SMBIOS: io 0xca2 regsize 1 spacing 1 irq 0
Mar 25 02:37:05.118858 kernel: ipmi_si: Adding SMBIOS-specified kcs state machine
Mar 25 02:37:05.118872 kernel: ipmi_si IPI0001:00: ipmi_platform: probing via ACPI
Mar 25 02:37:05.148970 kernel: ipmi_si IPI0001:00: ipmi_platform: [io 0x0ca2] regsize 1 spacing 1 irq 0
Mar 25 02:37:05.149056 kernel: ipmi_si dmi-ipmi-si.0: Removing SMBIOS-specified kcs state machine in favor of ACPI
Mar 25 02:37:05.149132 kernel: ipmi_si: Adding ACPI-specified kcs state machine
Mar 25 02:37:05.149142 kernel: ipmi_si: Trying ACPI-specified kcs state machine at i/o address 0xca2, slave address 0x20, irq 0
Mar 25 02:37:05.167878 systemd[1]: Condition check resulted in dev-ttyS1.device - /dev/ttyS1 being skipped.
Mar 25 02:37:05.168137 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Micron_5300_MTFDDAK480TDT OEM.
Mar 25 02:37:05.177633 kernel: iTCO_vendor_support: vendor-support=0
Mar 25 02:37:05.191484 systemd[1]: Reloading finished in 257 ms.
Mar 25 02:37:05.191633 kernel: iTCO_wdt iTCO_wdt: Found a Intel PCH TCO device (Version=6, TCOBASE=0x0400)
Mar 25 02:37:05.199173 kernel: iTCO_wdt iTCO_wdt: initialized. heartbeat=30 sec (nowayout=0)
Mar 25 02:37:05.223987 kernel: intel_rapl_common: Found RAPL domain package
Mar 25 02:37:05.224021 kernel: intel_rapl_common: Found RAPL domain core
Mar 25 02:37:05.229313 kernel: intel_rapl_common: Found RAPL domain dram
Mar 25 02:37:05.229337 kernel: ipmi_si IPI0001:00: The BMC does not support clearing the recv irq bit, compensating, but the BMC needs to be fixed.
Mar 25 02:37:05.232398 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 25 02:37:05.274632 kernel: ipmi_si IPI0001:00: IPMI message handler: Found new BMC (man_id: 0x002a7c, prod_id: 0x1b0f, dev_id: 0x20)
Mar 25 02:37:05.276039 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 25 02:37:05.296792 systemd[1]: Finished ensure-sysext.service.
Mar 25 02:37:05.321897 systemd[1]: Reached target tpm2.target - Trusted Platform Module.
Mar 25 02:37:05.331735 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 25 02:37:05.332421 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 25 02:37:05.351632 kernel: ipmi_si IPI0001:00: IPMI kcs interface initialized
Mar 25 02:37:05.353079 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 25 02:37:05.359633 kernel: ipmi_ssif: IPMI SSIF Interface driver
Mar 25 02:37:05.368845 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 25 02:37:05.371278 augenrules[1716]: No rules
Mar 25 02:37:05.383192 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 25 02:37:05.393246 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 25 02:37:05.403276 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 25 02:37:05.414279 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 25 02:37:05.423791 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 25 02:37:05.424460 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 25 02:37:05.435720 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 25 02:37:05.436379 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 25 02:37:05.448616 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 25 02:37:05.449594 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 25 02:37:05.450520 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Mar 25 02:37:05.475271 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 25 02:37:05.491911 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 25 02:37:05.501670 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 25 02:37:05.513020 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Mar 25 02:37:05.525024 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 25 02:37:05.525129 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 25 02:37:05.525382 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 25 02:37:05.525519 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 25 02:37:05.525602 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 25 02:37:05.525758 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 25 02:37:05.525838 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 25 02:37:05.525976 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 25 02:37:05.526056 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 25 02:37:05.526196 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 25 02:37:05.526274 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 25 02:37:05.526489 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 25 02:37:05.526648 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 25 02:37:05.531862 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 25 02:37:05.532881 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Mar 25 02:37:05.532913 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 25 02:37:05.532946 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 25 02:37:05.533638 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 25 02:37:05.534522 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 25 02:37:05.534554 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 25 02:37:05.548268 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 25 02:37:05.551641 lvm[1745]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 25 02:37:05.564283 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 25 02:37:05.587313 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Mar 25 02:37:05.613020 systemd-resolved[1729]: Positive Trust Anchors:
Mar 25 02:37:05.613028 systemd-resolved[1729]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 25 02:37:05.613072 systemd-resolved[1729]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 25 02:37:05.616497 systemd-resolved[1729]: Using system hostname 'ci-4284.0.0-a-3a00d206eb'.
Mar 25 02:37:05.617750 systemd-networkd[1728]: lo: Link UP
Mar 25 02:37:05.617753 systemd-networkd[1728]: lo: Gained carrier
Mar 25 02:37:05.620488 systemd-networkd[1728]: bond0: netdev ready
Mar 25 02:37:05.621534 systemd-networkd[1728]: Enumeration completed
Mar 25 02:37:05.626525 systemd-networkd[1728]: enp1s0f0np0: Configuring with /etc/systemd/network/10-1c:34:da:42:bc:24.network.
Mar 25 02:37:05.655877 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Mar 25 02:37:05.666941 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 25 02:37:05.676738 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 25 02:37:05.686854 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 25 02:37:05.698964 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 25 02:37:05.708704 systemd[1]: Reached target network.target - Network.
Mar 25 02:37:05.716697 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 25 02:37:05.727712 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 25 02:37:05.737750 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 25 02:37:05.748712 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 25 02:37:05.759711 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 25 02:37:05.770694 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 25 02:37:05.770711 systemd[1]: Reached target paths.target - Path Units.
Mar 25 02:37:05.778698 systemd[1]: Reached target time-set.target - System Time Set.
Mar 25 02:37:05.788854 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 25 02:37:05.798756 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 25 02:37:05.809698 systemd[1]: Reached target timers.target - Timer Units.
Mar 25 02:37:05.818236 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 25 02:37:05.828560 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 25 02:37:05.838157 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Mar 25 02:37:05.863345 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 25 02:37:05.873719 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Mar 25 02:37:05.887968 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Mar 25 02:37:05.903288 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Mar 25 02:37:05.903708 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: Link up
Mar 25 02:37:05.917633 kernel: bond0: (slave enp1s0f0np0): Enslaving as a backup interface with an up link
Mar 25 02:37:05.922143 systemd-networkd[1728]: enp1s0f1np1: Configuring with /etc/systemd/network/10-1c:34:da:42:bc:25.network.
Mar 25 02:37:05.923579 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 25 02:37:05.924369 lvm[1770]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 25 02:37:05.945772 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 25 02:37:05.955875 systemd[1]: Reached target sockets.target - Socket Units.
Mar 25 02:37:05.965741 systemd[1]: Reached target basic.target - Basic System.
Mar 25 02:37:05.973741 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 25 02:37:05.973759 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Mar 25 02:37:05.974385 systemd[1]: Starting containerd.service - containerd container runtime...
Mar 25 02:37:06.000469 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Mar 25 02:37:06.019438 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Mar 25 02:37:06.027866 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Mar 25 02:37:06.048653 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Mar 25 02:37:06.052791 jq[1778]: false
Mar 25 02:37:06.054993 coreos-metadata[1774]: Mar 25 02:37:06.054 INFO Fetching https://metadata.packet.net/metadata: Attempt #1
Mar 25 02:37:06.055953 coreos-metadata[1774]: Mar 25 02:37:06.055 INFO Failed to fetch: error sending request for url (https://metadata.packet.net/metadata)
Mar 25 02:37:06.058710 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Mar 25 02:37:06.059407 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Mar 25 02:37:06.061032 dbus-daemon[1775]: [system] SELinux support is enabled
Mar 25 02:37:06.065700 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up
Mar 25 02:37:06.066926 extend-filesystems[1779]: Found loop4
Mar 25 02:37:06.066926 extend-filesystems[1779]: Found loop5
Mar 25 02:37:06.124699 kernel: bond0: (slave enp1s0f1np1): Enslaving as a backup interface with an up link
Mar 25 02:37:06.124736 kernel: EXT4-fs (sda9): resizing filesystem from 553472 to 116605649 blocks
Mar 25 02:37:06.124750 kernel: bond0: Warning: No 802.3ad response from the link partner for any adapters in the bond
Mar 25 02:37:06.124764 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1609)
Mar 25 02:37:06.077455 systemd-networkd[1728]: bond0: Configuring with /etc/systemd/network/05-bond0.network.
Mar 25 02:37:06.124805 extend-filesystems[1779]: Found loop6
Mar 25 02:37:06.124805 extend-filesystems[1779]: Found loop7
Mar 25 02:37:06.124805 extend-filesystems[1779]: Found sda
Mar 25 02:37:06.124805 extend-filesystems[1779]: Found sda1
Mar 25 02:37:06.124805 extend-filesystems[1779]: Found sda2
Mar 25 02:37:06.124805 extend-filesystems[1779]: Found sda3
Mar 25 02:37:06.124805 extend-filesystems[1779]: Found usr
Mar 25 02:37:06.124805 extend-filesystems[1779]: Found sda4
Mar 25 02:37:06.124805 extend-filesystems[1779]: Found sda6
Mar 25 02:37:06.124805 extend-filesystems[1779]: Found sda7
Mar 25 02:37:06.124805 extend-filesystems[1779]: Found sda9
Mar 25 02:37:06.124805 extend-filesystems[1779]: Checking size of /dev/sda9
Mar 25 02:37:06.124805 extend-filesystems[1779]: Resized partition /dev/sda9
Mar 25 02:37:06.250739 kernel: bond0: (slave enp1s0f0np0): link status definitely up, 10000 Mbps full duplex
Mar 25 02:37:06.250765 kernel: bond0: active interface up!
Mar 25 02:37:06.078949 systemd-networkd[1728]: enp1s0f0np0: Link UP
Mar 25 02:37:06.250852 extend-filesystems[1787]: resize2fs 1.47.2 (1-Jan-2025)
Mar 25 02:37:06.079131 systemd-networkd[1728]: enp1s0f0np0: Gained carrier
Mar 25 02:37:06.109504 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Mar 25 02:37:06.125019 systemd-networkd[1728]: enp1s0f1np1: Reconfiguring with /etc/systemd/network/10-1c:34:da:42:bc:24.network.
Mar 25 02:37:06.125184 systemd-networkd[1728]: enp1s0f1np1: Link UP
Mar 25 02:37:06.125312 systemd-networkd[1728]: enp1s0f1np1: Gained carrier
Mar 25 02:37:06.127149 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Mar 25 02:37:06.135854 systemd-networkd[1728]: bond0: Link UP
Mar 25 02:37:06.136012 systemd-networkd[1728]: bond0: Gained carrier
Mar 25 02:37:06.136173 systemd-timesyncd[1730]: Network configuration changed, trying to establish connection.
Mar 25 02:37:06.136546 systemd-timesyncd[1730]: Network configuration changed, trying to establish connection.
Mar 25 02:37:06.136640 systemd-timesyncd[1730]: Network configuration changed, trying to establish connection.
Mar 25 02:37:06.136774 systemd-timesyncd[1730]: Network configuration changed, trying to establish connection.
Mar 25 02:37:06.153336 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 25 02:37:06.182073 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 25 02:37:06.237167 systemd[1]: Starting tcsd.service - TCG Core Services Daemon...
Mar 25 02:37:06.251033 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 25 02:37:06.251405 systemd[1]: Starting update-engine.service - Update Engine...
Mar 25 02:37:06.267821 systemd-logind[1800]: Watching system buttons on /dev/input/event3 (Power Button)
Mar 25 02:37:06.267833 systemd-logind[1800]: Watching system buttons on /dev/input/event2 (Sleep Button)
Mar 25 02:37:06.267844 systemd-logind[1800]: Watching system buttons on /dev/input/event0 (HID 0557:2419)
Mar 25 02:37:06.268077 systemd-logind[1800]: New seat seat0.
Mar 25 02:37:06.293056 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 25 02:37:06.299344 update_engine[1805]: I20250325 02:37:06.299309 1805 main.cc:92] Flatcar Update Engine starting
Mar 25 02:37:06.300026 update_engine[1805]: I20250325 02:37:06.300010 1805 update_check_scheduler.cc:74] Next update check in 4m31s
Mar 25 02:37:06.316631 kernel: bond0: (slave enp1s0f1np1): link status definitely up, 10000 Mbps full duplex
Mar 25 02:37:06.318846 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 25 02:37:06.320166 jq[1806]: true
Mar 25 02:37:06.330181 systemd[1]: Started systemd-logind.service - User Login Management.
Mar 25 02:37:06.339979 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Mar 25 02:37:06.350863 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Mar 25 02:37:06.362239 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 25 02:37:06.362344 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 25 02:37:06.362491 systemd[1]: motdgen.service: Deactivated successfully.
Mar 25 02:37:06.362592 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 25 02:37:06.378979 sshd_keygen[1804]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Mar 25 02:37:06.384821 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 25 02:37:06.384933 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 25 02:37:06.396964 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Mar 25 02:37:06.409615 (ntainerd)[1819]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Mar 25 02:37:06.411069 jq[1818]: true
Mar 25 02:37:06.414447 dbus-daemon[1775]: [system] Successfully activated service 'org.freedesktop.systemd1'
Mar 25 02:37:06.416890 tar[1808]: linux-amd64/LICENSE
Mar 25 02:37:06.417086 tar[1808]: linux-amd64/helm
Mar 25 02:37:06.421247 systemd[1]: tcsd.service: Skipped due to 'exec-condition'.
Mar 25 02:37:06.421359 systemd[1]: Condition check resulted in tcsd.service - TCG Core Services Daemon being skipped.
Mar 25 02:37:06.429796 systemd[1]: Started update-engine.service - Update Engine.
Mar 25 02:37:06.441197 systemd[1]: Starting issuegen.service - Generate /run/issue...
Mar 25 02:37:06.450216 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 25 02:37:06.450318 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 25 02:37:06.461791 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Mar 25 02:37:06.461872 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Mar 25 02:37:06.473521 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Mar 25 02:37:06.476922 bash[1848]: Updated "/home/core/.ssh/authorized_keys"
Mar 25 02:37:06.491991 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Mar 25 02:37:06.503000 systemd[1]: issuegen.service: Deactivated successfully.
Mar 25 02:37:06.503106 systemd[1]: Finished issuegen.service - Generate /run/issue.
Mar 25 02:37:06.514497 systemd[1]: Starting sshkeys.service...
Mar 25 02:37:06.522882 locksmithd[1856]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Mar 25 02:37:06.537164 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Mar 25 02:37:06.556996 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Mar 25 02:37:06.568643 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Mar 25 02:37:06.581015 containerd[1819]: time="2025-03-25T02:37:06Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Mar 25 02:37:06.593782 containerd[1819]: time="2025-03-25T02:37:06.581673772Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1
Mar 25 02:37:06.593782 containerd[1819]: time="2025-03-25T02:37:06.587012227Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="3.935µs"
Mar 25 02:37:06.593782 containerd[1819]: time="2025-03-25T02:37:06.587026188Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Mar 25 02:37:06.593782 containerd[1819]: time="2025-03-25T02:37:06.587037259Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Mar 25 02:37:06.593782 containerd[1819]: time="2025-03-25T02:37:06.587111911Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Mar 25 02:37:06.593782 containerd[1819]: time="2025-03-25T02:37:06.587121961Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Mar 25 02:37:06.593782 containerd[1819]: time="2025-03-25T02:37:06.587135829Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 25 02:37:06.593782 containerd[1819]: time="2025-03-25T02:37:06.587167590Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 25 02:37:06.593782 containerd[1819]: time="2025-03-25T02:37:06.587175651Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 25 02:37:06.593782 containerd[1819]: time="2025-03-25T02:37:06.587291834Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 25 02:37:06.593782 containerd[1819]: time="2025-03-25T02:37:06.587300561Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 25 02:37:06.593782 containerd[1819]: time="2025-03-25T02:37:06.587306665Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 25 02:37:06.593862 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Mar 25 02:37:06.594035 containerd[1819]: time="2025-03-25T02:37:06.587311494Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Mar 25 02:37:06.594035 containerd[1819]: time="2025-03-25T02:37:06.587350813Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Mar 25 02:37:06.594035 containerd[1819]: time="2025-03-25T02:37:06.587459611Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 25 02:37:06.594035 containerd[1819]: time="2025-03-25T02:37:06.587476047Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 25 02:37:06.594035 containerd[1819]: time="2025-03-25T02:37:06.587482754Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Mar 25 02:37:06.594035 containerd[1819]: time="2025-03-25T02:37:06.587497593Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Mar 25 02:37:06.594035 containerd[1819]: time="2025-03-25T02:37:06.587641333Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Mar 25 02:37:06.594035 containerd[1819]: time="2025-03-25T02:37:06.587674123Z" level=info msg="metadata content store policy set" policy=shared
Mar 25 02:37:06.598957 containerd[1819]: time="2025-03-25T02:37:06.598941968Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Mar 25 02:37:06.598981 containerd[1819]: time="2025-03-25T02:37:06.598967209Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Mar 25 02:37:06.599002 containerd[1819]: time="2025-03-25T02:37:06.598980340Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Mar 25 02:37:06.599002 containerd[1819]: time="2025-03-25T02:37:06.598989079Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Mar 25 02:37:06.599002 containerd[1819]: time="2025-03-25T02:37:06.598996722Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Mar 25 02:37:06.599046 containerd[1819]: time="2025-03-25T02:37:06.599002881Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Mar 25 02:37:06.599046 containerd[1819]: time="2025-03-25T02:37:06.599009925Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Mar 25 02:37:06.599046 containerd[1819]: time="2025-03-25T02:37:06.599017643Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Mar 25 02:37:06.599046 containerd[1819]: time="2025-03-25T02:37:06.599023571Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Mar 25 02:37:06.599046 containerd[1819]: time="2025-03-25T02:37:06.599031583Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Mar 25 02:37:06.599046 containerd[1819]: time="2025-03-25T02:37:06.599037185Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Mar 25 02:37:06.599046 containerd[1819]: time="2025-03-25T02:37:06.599044462Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Mar 25 02:37:06.599140 containerd[1819]: time="2025-03-25T02:37:06.599106849Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Mar 25 02:37:06.599140 containerd[1819]: time="2025-03-25T02:37:06.599118772Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Mar 25 02:37:06.599140 containerd[1819]: time="2025-03-25T02:37:06.599125936Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Mar 25 02:37:06.599140 containerd[1819]: time="2025-03-25T02:37:06.599132663Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Mar 25 02:37:06.599140 containerd[1819]: time="2025-03-25T02:37:06.599139193Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Mar 25 02:37:06.599208 containerd[1819]: time="2025-03-25T02:37:06.599145361Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Mar 25 02:37:06.599208 containerd[1819]: time="2025-03-25T02:37:06.599153919Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Mar 25 02:37:06.599208 containerd[1819]: time="2025-03-25T02:37:06.599160298Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Mar 25 02:37:06.599208 containerd[1819]: time="2025-03-25T02:37:06.599167374Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Mar 25 02:37:06.599208 containerd[1819]: time="2025-03-25T02:37:06.599173813Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Mar 25 02:37:06.599208 containerd[1819]: time="2025-03-25T02:37:06.599179774Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Mar 25 02:37:06.599296 containerd[1819]: time="2025-03-25T02:37:06.599214044Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Mar 25 02:37:06.599296 containerd[1819]: time="2025-03-25T02:37:06.599222233Z" level=info msg="Start snapshots syncer"
Mar 25 02:37:06.599296 containerd[1819]: time="2025-03-25T02:37:06.599235009Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Mar 25 02:37:06.599418 containerd[1819]: time="2025-03-25T02:37:06.599397047Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Mar 25 02:37:06.599485 containerd[1819]: time="2025-03-25T02:37:06.599428769Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Mar 25 02:37:06.599485 containerd[1819]: time="2025-03-25T02:37:06.599466355Z" level=info
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Mar 25 02:37:06.599524 containerd[1819]: time="2025-03-25T02:37:06.599516688Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Mar 25 02:37:06.599541 containerd[1819]: time="2025-03-25T02:37:06.599530446Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Mar 25 02:37:06.599541 containerd[1819]: time="2025-03-25T02:37:06.599538063Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Mar 25 02:37:06.599572 containerd[1819]: time="2025-03-25T02:37:06.599544483Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Mar 25 02:37:06.599572 containerd[1819]: time="2025-03-25T02:37:06.599552694Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Mar 25 02:37:06.599572 containerd[1819]: time="2025-03-25T02:37:06.599566136Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Mar 25 02:37:06.599615 containerd[1819]: time="2025-03-25T02:37:06.599573352Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Mar 25 02:37:06.599615 containerd[1819]: time="2025-03-25T02:37:06.599586416Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Mar 25 02:37:06.599615 containerd[1819]: time="2025-03-25T02:37:06.599593870Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Mar 25 02:37:06.599615 containerd[1819]: time="2025-03-25T02:37:06.599599403Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Mar 25 02:37:06.599683 containerd[1819]: time="2025-03-25T02:37:06.599617762Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 25 02:37:06.599683 containerd[1819]: time="2025-03-25T02:37:06.599631243Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 25 02:37:06.599683 containerd[1819]: time="2025-03-25T02:37:06.599639515Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 25 02:37:06.599683 containerd[1819]: time="2025-03-25T02:37:06.599646210Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 25 02:37:06.599683 containerd[1819]: time="2025-03-25T02:37:06.599651480Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Mar 25 02:37:06.599683 containerd[1819]: time="2025-03-25T02:37:06.599657778Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Mar 25 02:37:06.599683 containerd[1819]: time="2025-03-25T02:37:06.599663980Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Mar 25 02:37:06.599683 containerd[1819]: time="2025-03-25T02:37:06.599674500Z" level=info msg="runtime interface created" Mar 25 02:37:06.599683 containerd[1819]: time="2025-03-25T02:37:06.599677880Z" level=info msg="created NRI interface" Mar 25 02:37:06.599683 containerd[1819]: time="2025-03-25T02:37:06.599682824Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Mar 25 02:37:06.599821 containerd[1819]: time="2025-03-25T02:37:06.599689571Z" level=info msg="Connect containerd service" Mar 25 02:37:06.599821 containerd[1819]: time="2025-03-25T02:37:06.599703845Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 25 02:37:06.600373 
containerd[1819]: time="2025-03-25T02:37:06.600361651Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 25 02:37:06.604578 coreos-metadata[1870]: Mar 25 02:37:06.604 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Mar 25 02:37:06.607506 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 25 02:37:06.616574 systemd[1]: Started serial-getty@ttyS1.service - Serial Getty on ttyS1. Mar 25 02:37:06.629636 kernel: EXT4-fs (sda9): resized filesystem to 116605649 Mar 25 02:37:06.632930 systemd[1]: Reached target getty.target - Login Prompts. Mar 25 02:37:06.677386 extend-filesystems[1787]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Mar 25 02:37:06.677386 extend-filesystems[1787]: old_desc_blocks = 1, new_desc_blocks = 56 Mar 25 02:37:06.677386 extend-filesystems[1787]: The filesystem on /dev/sda9 is now 116605649 (4k) blocks long. Mar 25 02:37:06.708782 extend-filesystems[1779]: Resized filesystem in /dev/sda9 Mar 25 02:37:06.708782 extend-filesystems[1779]: Found sdb Mar 25 02:37:06.726761 containerd[1819]: time="2025-03-25T02:37:06.700065038Z" level=info msg="Start subscribing containerd event" Mar 25 02:37:06.726761 containerd[1819]: time="2025-03-25T02:37:06.700096495Z" level=info msg="Start recovering state" Mar 25 02:37:06.726761 containerd[1819]: time="2025-03-25T02:37:06.700130244Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 25 02:37:06.726761 containerd[1819]: time="2025-03-25T02:37:06.700160379Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Mar 25 02:37:06.726761 containerd[1819]: time="2025-03-25T02:37:06.700160639Z" level=info msg="Start event monitor" Mar 25 02:37:06.726761 containerd[1819]: time="2025-03-25T02:37:06.700179433Z" level=info msg="Start cni network conf syncer for default" Mar 25 02:37:06.726761 containerd[1819]: time="2025-03-25T02:37:06.700183896Z" level=info msg="Start streaming server" Mar 25 02:37:06.726761 containerd[1819]: time="2025-03-25T02:37:06.700192506Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Mar 25 02:37:06.726761 containerd[1819]: time="2025-03-25T02:37:06.700196730Z" level=info msg="runtime interface starting up..." Mar 25 02:37:06.726761 containerd[1819]: time="2025-03-25T02:37:06.700199794Z" level=info msg="starting plugins..." Mar 25 02:37:06.726761 containerd[1819]: time="2025-03-25T02:37:06.700209256Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Mar 25 02:37:06.726761 containerd[1819]: time="2025-03-25T02:37:06.700384103Z" level=info msg="containerd successfully booted in 0.119591s" Mar 25 02:37:06.677919 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 25 02:37:06.678033 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 25 02:37:06.709019 systemd[1]: Started containerd.service - containerd container runtime. Mar 25 02:37:06.760726 tar[1808]: linux-amd64/README.md Mar 25 02:37:06.779486 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 25 02:37:07.056115 coreos-metadata[1774]: Mar 25 02:37:07.056 INFO Fetching https://metadata.packet.net/metadata: Attempt #2 Mar 25 02:37:07.457922 systemd-networkd[1728]: bond0: Gained IPv6LL Mar 25 02:37:07.459130 systemd-timesyncd[1730]: Network configuration changed, trying to establish connection. Mar 25 02:37:07.777940 systemd-timesyncd[1730]: Network configuration changed, trying to establish connection. 
Mar 25 02:37:07.778025 systemd-timesyncd[1730]: Network configuration changed, trying to establish connection.
Mar 25 02:37:07.779269 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Mar 25 02:37:07.791092 systemd[1]: Reached target network-online.target - Network is Online.
Mar 25 02:37:07.802761 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 02:37:07.824918 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Mar 25 02:37:07.850301 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Mar 25 02:37:08.567938 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 02:37:08.580091 (kubelet)[1922]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 02:37:08.838954 kernel: mlx5_core 0000:01:00.0: lag map: port 1:1 port 2:2
Mar 25 02:37:08.839097 kernel: mlx5_core 0000:01:00.0: shared_fdb:0 mode:queue_affinity
Mar 25 02:37:09.048031 kubelet[1922]: E0325 02:37:09.048005 1922 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 02:37:09.049375 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 02:37:09.049461 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 02:37:09.049700 systemd[1]: kubelet.service: Consumed 578ms CPU time, 259.1M memory peak.
Mar 25 02:37:09.998585 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Mar 25 02:37:10.009871 systemd[1]: Started sshd@0-147.75.90.239:22-135.125.238.48:32974.service - OpenSSH per-connection server daemon (135.125.238.48:32974).
Mar 25 02:37:10.165442 systemd[1]: Started sshd@1-147.75.90.239:22-139.178.68.195:52162.service - OpenSSH per-connection server daemon (139.178.68.195:52162).
Mar 25 02:37:10.235232 sshd[1942]: Accepted publickey for core from 139.178.68.195 port 52162 ssh2: RSA SHA256:uTfG5sovPUqPrARqt2owfRXVYFyJtX+vlYONPCrvLPw
Mar 25 02:37:10.236309 sshd-session[1942]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:37:10.243513 systemd-logind[1800]: New session 1 of user core.
Mar 25 02:37:10.244386 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Mar 25 02:37:10.254622 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Mar 25 02:37:10.287697 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Mar 25 02:37:10.300282 systemd[1]: Starting user@500.service - User Manager for UID 500...
Mar 25 02:37:10.325298 (systemd)[1946]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Mar 25 02:37:10.327872 systemd-logind[1800]: New session c1 of user core.
Mar 25 02:37:10.446551 systemd[1946]: Queued start job for default target default.target.
Mar 25 02:37:10.464314 systemd[1946]: Created slice app.slice - User Application Slice.
Mar 25 02:37:10.464329 systemd[1946]: Reached target paths.target - Paths.
Mar 25 02:37:10.464350 systemd[1946]: Reached target timers.target - Timers.
Mar 25 02:37:10.465020 systemd[1946]: Starting dbus.socket - D-Bus User Message Bus Socket...
Mar 25 02:37:10.471197 systemd[1946]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Mar 25 02:37:10.471232 systemd[1946]: Reached target sockets.target - Sockets.
Mar 25 02:37:10.471256 systemd[1946]: Reached target basic.target - Basic System.
Mar 25 02:37:10.471309 systemd[1946]: Reached target default.target - Main User Target.
Mar 25 02:37:10.471326 systemd[1946]: Startup finished in 136ms.
Mar 25 02:37:10.471352 systemd[1]: Started user@500.service - User Manager for UID 500.
Mar 25 02:37:10.483790 systemd[1]: Started session-1.scope - Session 1 of User core.
Mar 25 02:37:10.553433 systemd[1]: Started sshd@2-147.75.90.239:22-139.178.68.195:52168.service - OpenSSH per-connection server daemon (139.178.68.195:52168).
Mar 25 02:37:10.598856 sshd[1957]: Accepted publickey for core from 139.178.68.195 port 52168 ssh2: RSA SHA256:uTfG5sovPUqPrARqt2owfRXVYFyJtX+vlYONPCrvLPw
Mar 25 02:37:10.599476 sshd-session[1957]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:37:10.602350 systemd-logind[1800]: New session 2 of user core.
Mar 25 02:37:10.622901 systemd[1]: Started session-2.scope - Session 2 of User core.
Mar 25 02:37:10.687565 sshd[1959]: Connection closed by 139.178.68.195 port 52168
Mar 25 02:37:10.687774 sshd-session[1957]: pam_unix(sshd:session): session closed for user core
Mar 25 02:37:10.690337 coreos-metadata[1870]: Mar 25 02:37:10.690 INFO Fetch successful
Mar 25 02:37:10.698895 systemd[1]: sshd@2-147.75.90.239:22-139.178.68.195:52168.service: Deactivated successfully.
Mar 25 02:37:10.699769 systemd[1]: session-2.scope: Deactivated successfully.
Mar 25 02:37:10.700507 systemd-logind[1800]: Session 2 logged out. Waiting for processes to exit.
Mar 25 02:37:10.701226 systemd[1]: Started sshd@3-147.75.90.239:22-139.178.68.195:52184.service - OpenSSH per-connection server daemon (139.178.68.195:52184).
Mar 25 02:37:10.713460 systemd-logind[1800]: Removed session 2.
Mar 25 02:37:10.728377 unknown[1870]: wrote ssh authorized keys file for user: core
Mar 25 02:37:10.751065 coreos-metadata[1774]: Mar 25 02:37:10.751 INFO Fetch successful
Mar 25 02:37:10.753954 sshd[1964]: Accepted publickey for core from 139.178.68.195 port 52184 ssh2: RSA SHA256:uTfG5sovPUqPrARqt2owfRXVYFyJtX+vlYONPCrvLPw
Mar 25 02:37:10.755554 sshd-session[1964]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:37:10.758445 systemd-logind[1800]: New session 3 of user core.
Mar 25 02:37:10.761537 update-ssh-keys[1966]: Updated "/home/core/.ssh/authorized_keys"
Mar 25 02:37:10.769893 systemd[1]: Started session-3.scope - Session 3 of User core.
Mar 25 02:37:10.781769 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Mar 25 02:37:10.800989 systemd[1]: Finished sshkeys.service.
Mar 25 02:37:10.813173 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Mar 25 02:37:10.825384 systemd[1]: Starting packet-phone-home.service - Report Success to Packet...
Mar 25 02:37:10.853259 sshd[1939]: Invalid user sulthana from 135.125.238.48 port 32974
Mar 25 02:37:10.862294 sshd[1975]: Connection closed by 139.178.68.195 port 52184
Mar 25 02:37:10.862435 sshd-session[1964]: pam_unix(sshd:session): session closed for user core
Mar 25 02:37:10.863761 systemd[1]: sshd@3-147.75.90.239:22-139.178.68.195:52184.service: Deactivated successfully.
Mar 25 02:37:10.864633 systemd[1]: session-3.scope: Deactivated successfully.
Mar 25 02:37:10.865336 systemd-logind[1800]: Session 3 logged out. Waiting for processes to exit.
Mar 25 02:37:10.866079 systemd-logind[1800]: Removed session 3.
Mar 25 02:37:10.897703 systemd[1]: Started sshd@4-147.75.90.239:22-160.30.159.175:37106.service - OpenSSH per-connection server daemon (160.30.159.175:37106).
Mar 25 02:37:11.004912 sshd[1939]: Received disconnect from 135.125.238.48 port 32974:11: Bye Bye [preauth]
Mar 25 02:37:11.004912 sshd[1939]: Disconnected from invalid user sulthana 135.125.238.48 port 32974 [preauth]
Mar 25 02:37:11.007834 systemd[1]: sshd@0-147.75.90.239:22-135.125.238.48:32974.service: Deactivated successfully.
Mar 25 02:37:11.231419 systemd[1]: Finished packet-phone-home.service - Report Success to Packet.
Mar 25 02:37:11.245319 systemd[1]: Reached target multi-user.target - Multi-User System.
Mar 25 02:37:11.255339 systemd[1]: Startup finished in 2.689s (kernel) + 23.351s (initrd) + 9.129s (userspace) = 35.170s.
Mar 25 02:37:11.277342 login[1885]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Mar 25 02:37:11.281558 systemd-logind[1800]: New session 4 of user core.
Mar 25 02:37:11.282136 systemd[1]: Started session-4.scope - Session 4 of User core.
Mar 25 02:37:11.283774 login[1884]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Mar 25 02:37:11.286219 systemd-logind[1800]: New session 5 of user core.
Mar 25 02:37:11.286603 systemd[1]: Started session-5.scope - Session 5 of User core.
Mar 25 02:37:11.991210 sshd[1984]: Invalid user shenyue from 160.30.159.175 port 37106
Mar 25 02:37:12.181096 sshd[1984]: Received disconnect from 160.30.159.175 port 37106:11: Bye Bye [preauth]
Mar 25 02:37:12.181096 sshd[1984]: Disconnected from invalid user shenyue 160.30.159.175 port 37106 [preauth]
Mar 25 02:37:12.184440 systemd[1]: sshd@4-147.75.90.239:22-160.30.159.175:37106.service: Deactivated successfully.
Mar 25 02:37:12.645615 systemd-timesyncd[1730]: Network configuration changed, trying to establish connection.
Mar 25 02:37:15.334228 systemd[1]: Started sshd@5-147.75.90.239:22-189.8.108.39:50582.service - OpenSSH per-connection server daemon (189.8.108.39:50582).
Mar 25 02:37:16.424290 sshd[2016]: Invalid user alex from 189.8.108.39 port 50582
Mar 25 02:37:16.623086 sshd[2016]: Received disconnect from 189.8.108.39 port 50582:11: Bye Bye [preauth]
Mar 25 02:37:16.623086 sshd[2016]: Disconnected from invalid user alex 189.8.108.39 port 50582 [preauth]
Mar 25 02:37:16.626270 systemd[1]: sshd@5-147.75.90.239:22-189.8.108.39:50582.service: Deactivated successfully.
Mar 25 02:37:17.930169 systemd[1]: Started sshd@6-147.75.90.239:22-103.31.39.72:47504.service - OpenSSH per-connection server daemon (103.31.39.72:47504).
Mar 25 02:37:19.301864 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Mar 25 02:37:19.304428 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 02:37:19.459653 sshd[2021]: Invalid user super from 103.31.39.72 port 47504
Mar 25 02:37:19.597069 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 02:37:19.599349 (kubelet)[2031]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 02:37:19.624913 kubelet[2031]: E0325 02:37:19.624858 2031 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 02:37:19.627559 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 02:37:19.627679 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 02:37:19.627908 systemd[1]: kubelet.service: Consumed 154ms CPU time, 110.1M memory peak.
Mar 25 02:37:19.747153 sshd[2021]: Received disconnect from 103.31.39.72 port 47504:11: Bye Bye [preauth]
Mar 25 02:37:19.747153 sshd[2021]: Disconnected from invalid user super 103.31.39.72 port 47504 [preauth]
Mar 25 02:37:19.750435 systemd[1]: sshd@6-147.75.90.239:22-103.31.39.72:47504.service: Deactivated successfully.
Mar 25 02:37:20.879936 systemd[1]: Started sshd@7-147.75.90.239:22-139.178.68.195:51102.service - OpenSSH per-connection server daemon (139.178.68.195:51102).
Mar 25 02:37:20.916581 sshd[2051]: Accepted publickey for core from 139.178.68.195 port 51102 ssh2: RSA SHA256:uTfG5sovPUqPrARqt2owfRXVYFyJtX+vlYONPCrvLPw
Mar 25 02:37:20.917238 sshd-session[2051]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:37:20.920260 systemd-logind[1800]: New session 6 of user core.
Mar 25 02:37:20.928891 systemd[1]: Started session-6.scope - Session 6 of User core.
Mar 25 02:37:20.981308 sshd[2053]: Connection closed by 139.178.68.195 port 51102
Mar 25 02:37:20.981467 sshd-session[2051]: pam_unix(sshd:session): session closed for user core
Mar 25 02:37:21.002460 systemd[1]: sshd@7-147.75.90.239:22-139.178.68.195:51102.service: Deactivated successfully.
Mar 25 02:37:21.006409 systemd[1]: session-6.scope: Deactivated successfully.
Mar 25 02:37:21.009813 systemd-logind[1800]: Session 6 logged out. Waiting for processes to exit.
Mar 25 02:37:21.013294 systemd[1]: Started sshd@8-147.75.90.239:22-139.178.68.195:51112.service - OpenSSH per-connection server daemon (139.178.68.195:51112).
Mar 25 02:37:21.016185 systemd-logind[1800]: Removed session 6.
Mar 25 02:37:21.053947 sshd[2058]: Accepted publickey for core from 139.178.68.195 port 51112 ssh2: RSA SHA256:uTfG5sovPUqPrARqt2owfRXVYFyJtX+vlYONPCrvLPw
Mar 25 02:37:21.054575 sshd-session[2058]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:37:21.057319 systemd-logind[1800]: New session 7 of user core.
Mar 25 02:37:21.066073 systemd[1]: Started session-7.scope - Session 7 of User core.
Mar 25 02:37:21.125389 sshd[2061]: Connection closed by 139.178.68.195 port 51112
Mar 25 02:37:21.126096 sshd-session[2058]: pam_unix(sshd:session): session closed for user core
Mar 25 02:37:21.146860 systemd[1]: sshd@8-147.75.90.239:22-139.178.68.195:51112.service: Deactivated successfully.
Mar 25 02:37:21.150711 systemd[1]: session-7.scope: Deactivated successfully.
Mar 25 02:37:21.152915 systemd-logind[1800]: Session 7 logged out. Waiting for processes to exit.
Mar 25 02:37:21.157477 systemd[1]: Started sshd@9-147.75.90.239:22-139.178.68.195:51118.service - OpenSSH per-connection server daemon (139.178.68.195:51118).
Mar 25 02:37:21.160408 systemd-logind[1800]: Removed session 7.
Mar 25 02:37:21.251042 sshd[2066]: Accepted publickey for core from 139.178.68.195 port 51118 ssh2: RSA SHA256:uTfG5sovPUqPrARqt2owfRXVYFyJtX+vlYONPCrvLPw
Mar 25 02:37:21.252173 sshd-session[2066]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:37:21.256619 systemd-logind[1800]: New session 8 of user core.
Mar 25 02:37:21.267857 systemd[1]: Started session-8.scope - Session 8 of User core.
Mar 25 02:37:21.330719 sshd[2069]: Connection closed by 139.178.68.195 port 51118
Mar 25 02:37:21.331398 sshd-session[2066]: pam_unix(sshd:session): session closed for user core
Mar 25 02:37:21.349342 systemd[1]: sshd@9-147.75.90.239:22-139.178.68.195:51118.service: Deactivated successfully.
Mar 25 02:37:21.353271 systemd[1]: session-8.scope: Deactivated successfully.
Mar 25 02:37:21.355536 systemd-logind[1800]: Session 8 logged out. Waiting for processes to exit.
Mar 25 02:37:21.359876 systemd[1]: Started sshd@10-147.75.90.239:22-139.178.68.195:51134.service - OpenSSH per-connection server daemon (139.178.68.195:51134).
Mar 25 02:37:21.362667 systemd-logind[1800]: Removed session 8.
Mar 25 02:37:21.406705 sshd[2074]: Accepted publickey for core from 139.178.68.195 port 51134 ssh2: RSA SHA256:uTfG5sovPUqPrARqt2owfRXVYFyJtX+vlYONPCrvLPw
Mar 25 02:37:21.407551 sshd-session[2074]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:37:21.410542 systemd-logind[1800]: New session 9 of user core.
Mar 25 02:37:21.419090 systemd[1]: Started session-9.scope - Session 9 of User core.
Mar 25 02:37:21.491269 sudo[2079]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Mar 25 02:37:21.491410 sudo[2079]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 25 02:37:21.513428 sudo[2079]: pam_unix(sudo:session): session closed for user root
Mar 25 02:37:21.514269 sshd[2078]: Connection closed by 139.178.68.195 port 51134
Mar 25 02:37:21.514459 sshd-session[2074]: pam_unix(sshd:session): session closed for user core
Mar 25 02:37:21.533035 systemd[1]: sshd@10-147.75.90.239:22-139.178.68.195:51134.service: Deactivated successfully.
Mar 25 02:37:21.534329 systemd[1]: session-9.scope: Deactivated successfully.
Mar 25 02:37:21.535203 systemd-logind[1800]: Session 9 logged out. Waiting for processes to exit.
Mar 25 02:37:21.536763 systemd[1]: Started sshd@11-147.75.90.239:22-139.178.68.195:51150.service - OpenSSH per-connection server daemon (139.178.68.195:51150).
Mar 25 02:37:21.537721 systemd-logind[1800]: Removed session 9.
Mar 25 02:37:21.613962 sshd[2084]: Accepted publickey for core from 139.178.68.195 port 51150 ssh2: RSA SHA256:uTfG5sovPUqPrARqt2owfRXVYFyJtX+vlYONPCrvLPw
Mar 25 02:37:21.614548 sshd-session[2084]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:37:21.617376 systemd-logind[1800]: New session 10 of user core.
Mar 25 02:37:21.630068 systemd[1]: Started session-10.scope - Session 10 of User core.
Mar 25 02:37:21.693967 sudo[2089]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 25 02:37:21.694107 sudo[2089]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 25 02:37:21.696195 sudo[2089]: pam_unix(sudo:session): session closed for user root
Mar 25 02:37:21.698775 sudo[2088]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Mar 25 02:37:21.698915 sudo[2088]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 25 02:37:21.704505 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 25 02:37:21.739522 augenrules[2111]: No rules
Mar 25 02:37:21.740218 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 25 02:37:21.740508 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 25 02:37:21.741608 sudo[2088]: pam_unix(sudo:session): session closed for user root
Mar 25 02:37:21.742882 sshd[2087]: Connection closed by 139.178.68.195 port 51150
Mar 25 02:37:21.743273 sshd-session[2084]: pam_unix(sshd:session): session closed for user core
Mar 25 02:37:21.767284 systemd[1]: sshd@11-147.75.90.239:22-139.178.68.195:51150.service: Deactivated successfully.
Mar 25 02:37:21.771246 systemd[1]: session-10.scope: Deactivated successfully.
Mar 25 02:37:21.774683 systemd-logind[1800]: Session 10 logged out. Waiting for processes to exit.
Mar 25 02:37:21.777773 systemd[1]: Started sshd@12-147.75.90.239:22-139.178.68.195:51164.service - OpenSSH per-connection server daemon (139.178.68.195:51164).
Mar 25 02:37:21.780676 systemd-logind[1800]: Removed session 10.
Mar 25 02:37:21.818997 sshd[2119]: Accepted publickey for core from 139.178.68.195 port 51164 ssh2: RSA SHA256:uTfG5sovPUqPrARqt2owfRXVYFyJtX+vlYONPCrvLPw
Mar 25 02:37:21.819593 sshd-session[2119]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:37:21.822415 systemd-logind[1800]: New session 11 of user core.
Mar 25 02:37:21.831871 systemd[1]: Started session-11.scope - Session 11 of User core.
Mar 25 02:37:21.892262 sudo[2123]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 25 02:37:21.893068 sudo[2123]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 25 02:37:22.197318 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 25 02:37:22.216004 (dockerd)[2150]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 25 02:37:22.511823 dockerd[2150]: time="2025-03-25T02:37:22.511738036Z" level=info msg="Starting up"
Mar 25 02:37:22.513034 dockerd[2150]: time="2025-03-25T02:37:22.512990905Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Mar 25 02:37:22.542316 dockerd[2150]: time="2025-03-25T02:37:22.542256035Z" level=info msg="Loading containers: start."
Mar 25 02:37:22.666641 kernel: Initializing XFRM netlink socket
Mar 25 02:37:22.667005 systemd-timesyncd[1730]: Network configuration changed, trying to establish connection.
Mar 25 02:37:22.736842 systemd-networkd[1728]: docker0: Link UP
Mar 25 02:37:22.799613 dockerd[2150]: time="2025-03-25T02:37:22.799485704Z" level=info msg="Loading containers: done."
Mar 25 02:37:22.818578 dockerd[2150]: time="2025-03-25T02:37:22.818529246Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 25 02:37:22.818578 dockerd[2150]: time="2025-03-25T02:37:22.818575589Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1
Mar 25 02:37:22.818673 dockerd[2150]: time="2025-03-25T02:37:22.818636145Z" level=info msg="Daemon has completed initialization"
Mar 25 02:37:22.834319 dockerd[2150]: time="2025-03-25T02:37:22.834290175Z" level=info msg="API listen on /run/docker.sock"
Mar 25 02:37:22.834380 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 25 02:37:23.040468 systemd-timesyncd[1730]: Contacted time server [2604:2dc0:101:200::151]:123 (2.flatcar.pool.ntp.org).
Mar 25 02:37:23.040531 systemd-timesyncd[1730]: Initial clock synchronization to Tue 2025-03-25 02:37:22.784615 UTC.
Mar 25 02:37:23.424868 containerd[1819]: time="2025-03-25T02:37:23.424743566Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.3\""
Mar 25 02:37:23.999232 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1918201097.mount: Deactivated successfully.
Mar 25 02:37:24.671971 systemd[1]: Started sshd@13-147.75.90.239:22-176.113.115.137:64001.service - OpenSSH per-connection server daemon (176.113.115.137:64001).
Mar 25 02:37:24.865087 sshd[2431]: Connection closed by 176.113.115.137 port 64001
Mar 25 02:37:24.865842 systemd[1]: sshd@13-147.75.90.239:22-176.113.115.137:64001.service: Deactivated successfully.
Mar 25 02:37:25.160294 containerd[1819]: time="2025-03-25T02:37:25.160269038Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:37:25.160546 containerd[1819]: time="2025-03-25T02:37:25.160450926Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.3: active requests=0, bytes read=28682430"
Mar 25 02:37:25.160823 containerd[1819]: time="2025-03-25T02:37:25.160810995Z" level=info msg="ImageCreate event name:\"sha256:f8bdc4cfa0651e2d7edb4678d2b90129aef82a19249b37dc8d4705e8bd604295\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:37:25.162459 containerd[1819]: time="2025-03-25T02:37:25.162421259Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:279e45cf07e4f56925c3c5237179eb63616788426a96e94df5fedf728b18926e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:37:25.162915 containerd[1819]: time="2025-03-25T02:37:25.162876876Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.3\" with image id \"sha256:f8bdc4cfa0651e2d7edb4678d2b90129aef82a19249b37dc8d4705e8bd604295\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.3\", repo digest \"registry.k8s.io/kube-apiserver@sha256:279e45cf07e4f56925c3c5237179eb63616788426a96e94df5fedf728b18926e\", size \"28679230\" in 1.738062297s"
Mar 25 02:37:25.162915 containerd[1819]: time="2025-03-25T02:37:25.162893266Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.3\" returns image reference \"sha256:f8bdc4cfa0651e2d7edb4678d2b90129aef82a19249b37dc8d4705e8bd604295\""
Mar 25 02:37:25.163413 containerd[1819]: time="2025-03-25T02:37:25.163368302Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.3\""
Mar 25 02:37:26.738300 containerd[1819]: time="2025-03-25T02:37:26.738273501Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:37:26.738566 containerd[1819]: time="2025-03-25T02:37:26.738487558Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.3: active requests=0, bytes read=24779684"
Mar 25 02:37:26.738876 containerd[1819]: time="2025-03-25T02:37:26.738861929Z" level=info msg="ImageCreate event name:\"sha256:085818208a5213f37ef6d103caaf8e1e243816a614eb5b87a98bfffe79c687b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:37:26.740096 containerd[1819]: time="2025-03-25T02:37:26.740083279Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:54456a96a1bbdc35dcc2e70fcc1355bf655af67694e40b650ac12e83521f6411\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:37:26.740645 containerd[1819]: time="2025-03-25T02:37:26.740631097Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.3\" with image id \"sha256:085818208a5213f37ef6d103caaf8e1e243816a614eb5b87a98bfffe79c687b5\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.3\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:54456a96a1bbdc35dcc2e70fcc1355bf655af67694e40b650ac12e83521f6411\", size \"26267292\" in 1.577229296s"
Mar 25 02:37:26.740678 containerd[1819]: time="2025-03-25T02:37:26.740648197Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.3\" returns image reference \"sha256:085818208a5213f37ef6d103caaf8e1e243816a614eb5b87a98bfffe79c687b5\""
Mar 25 02:37:26.740884 containerd[1819]: time="2025-03-25T02:37:26.740871175Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.3\""
Mar 25 02:37:27.908929 containerd[1819]: time="2025-03-25T02:37:27.908903063Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:37:27.909188 containerd[1819]: time="2025-03-25T02:37:27.909150849Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.3: active requests=0, bytes read=19171419"
Mar 25 02:37:27.909593 containerd[1819]: time="2025-03-25T02:37:27.909580572Z" level=info msg="ImageCreate event name:\"sha256:b4260bf5078ab1b01dd05fb05015fc436b7100b7b9b5ea738e247a86008b16b8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:37:27.910828 containerd[1819]: time="2025-03-25T02:37:27.910789392Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:aafae2e3a8d65bc6dc3a0c6095c24bc72b1ff608e1417f0f5e860ce4a61c27df\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:37:27.911316 containerd[1819]: time="2025-03-25T02:37:27.911301350Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.3\" with image id \"sha256:b4260bf5078ab1b01dd05fb05015fc436b7100b7b9b5ea738e247a86008b16b8\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.3\", repo digest \"registry.k8s.io/kube-scheduler@sha256:aafae2e3a8d65bc6dc3a0c6095c24bc72b1ff608e1417f0f5e860ce4a61c27df\", size \"20659045\" in 1.170412054s"
Mar 25 02:37:27.911316 containerd[1819]: time="2025-03-25T02:37:27.911315268Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.3\" returns image reference \"sha256:b4260bf5078ab1b01dd05fb05015fc436b7100b7b9b5ea738e247a86008b16b8\""
Mar 25 02:37:27.911562 containerd[1819]: time="2025-03-25T02:37:27.911549996Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.3\""
Mar 25 02:37:28.909638 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3505231984.mount: Deactivated successfully.
Mar 25 02:37:29.141800 containerd[1819]: time="2025-03-25T02:37:29.141774096Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:37:29.142020 containerd[1819]: time="2025-03-25T02:37:29.141994619Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.3: active requests=0, bytes read=30918185"
Mar 25 02:37:29.142289 containerd[1819]: time="2025-03-25T02:37:29.142277663Z" level=info msg="ImageCreate event name:\"sha256:a1ae78fd2f9d8fc345928378dc947c7f1e95f01c1a552781827071867a95d09c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:37:29.143209 containerd[1819]: time="2025-03-25T02:37:29.143171700Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:5015269547a0b7dd2c062758e9a64467b58978ff2502cad4c3f5cdf4aa554ad3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:37:29.143417 containerd[1819]: time="2025-03-25T02:37:29.143376193Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.3\" with image id \"sha256:a1ae78fd2f9d8fc345928378dc947c7f1e95f01c1a552781827071867a95d09c\", repo tag \"registry.k8s.io/kube-proxy:v1.32.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:5015269547a0b7dd2c062758e9a64467b58978ff2502cad4c3f5cdf4aa554ad3\", size \"30917204\" in 1.231808389s"
Mar 25 02:37:29.143417 containerd[1819]: time="2025-03-25T02:37:29.143393384Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.3\" returns image reference \"sha256:a1ae78fd2f9d8fc345928378dc947c7f1e95f01c1a552781827071867a95d09c\""
Mar 25 02:37:29.143712 containerd[1819]: time="2025-03-25T02:37:29.143674323Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Mar 25 02:37:29.674514 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Mar 25 02:37:29.675497 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 02:37:29.679086 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1895345763.mount: Deactivated successfully.
Mar 25 02:37:29.914776 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 02:37:29.916779 (kubelet)[2475]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 02:37:29.937653 kubelet[2475]: E0325 02:37:29.937595 2475 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 02:37:29.939042 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 02:37:29.939126 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 02:37:29.939289 systemd[1]: kubelet.service: Consumed 99ms CPU time, 111.2M memory peak.
Mar 25 02:37:30.354598 containerd[1819]: time="2025-03-25T02:37:30.354572627Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:37:30.354858 containerd[1819]: time="2025-03-25T02:37:30.354784744Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241"
Mar 25 02:37:30.355210 containerd[1819]: time="2025-03-25T02:37:30.355195758Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:37:30.356533 containerd[1819]: time="2025-03-25T02:37:30.356518675Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:37:30.357117 containerd[1819]: time="2025-03-25T02:37:30.357102120Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.213410822s"
Mar 25 02:37:30.357164 containerd[1819]: time="2025-03-25T02:37:30.357119016Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\""
Mar 25 02:37:30.357358 containerd[1819]: time="2025-03-25T02:37:30.357344795Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Mar 25 02:37:30.845672 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1451752018.mount: Deactivated successfully.
Mar 25 02:37:30.847137 containerd[1819]: time="2025-03-25T02:37:30.847087055Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 25 02:37:30.847321 containerd[1819]: time="2025-03-25T02:37:30.847276379Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138"
Mar 25 02:37:30.847739 containerd[1819]: time="2025-03-25T02:37:30.847691128Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 25 02:37:30.848531 containerd[1819]: time="2025-03-25T02:37:30.848499449Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 25 02:37:30.848973 containerd[1819]: time="2025-03-25T02:37:30.848940050Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 491.579987ms"
Mar 25 02:37:30.848973 containerd[1819]: time="2025-03-25T02:37:30.848969653Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Mar 25 02:37:30.849395 containerd[1819]: time="2025-03-25T02:37:30.849333422Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\""
Mar 25 02:37:31.335924 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3566032329.mount: Deactivated successfully.
Mar 25 02:37:32.370839 containerd[1819]: time="2025-03-25T02:37:32.370787342Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:37:32.371047 containerd[1819]: time="2025-03-25T02:37:32.370965418Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57551320"
Mar 25 02:37:32.371347 containerd[1819]: time="2025-03-25T02:37:32.371312147Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:37:32.373013 containerd[1819]: time="2025-03-25T02:37:32.372975983Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:37:32.373454 containerd[1819]: time="2025-03-25T02:37:32.373414108Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 1.524050158s"
Mar 25 02:37:32.373454 containerd[1819]: time="2025-03-25T02:37:32.373431001Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\""
Mar 25 02:37:34.121371 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 02:37:34.121487 systemd[1]: kubelet.service: Consumed 99ms CPU time, 111.2M memory peak.
Mar 25 02:37:34.122769 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 02:37:34.144061 systemd[1]: Reload requested from client PID 2644 ('systemctl') (unit session-11.scope)...
Mar 25 02:37:34.144084 systemd[1]: Reloading...
Mar 25 02:37:34.184739 zram_generator::config[2690]: No configuration found.
Mar 25 02:37:34.252670 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 25 02:37:34.334200 systemd[1]: Reloading finished in 189 ms.
Mar 25 02:37:34.370413 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 02:37:34.372013 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 02:37:34.372502 systemd[1]: kubelet.service: Deactivated successfully.
Mar 25 02:37:34.372614 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 02:37:34.372640 systemd[1]: kubelet.service: Consumed 55ms CPU time, 91.8M memory peak.
Mar 25 02:37:34.373430 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 02:37:34.627770 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 02:37:34.630687 (kubelet)[2759]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 25 02:37:34.652232 kubelet[2759]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 25 02:37:34.652232 kubelet[2759]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 25 02:37:34.652232 kubelet[2759]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 25 02:37:34.652451 kubelet[2759]: I0325 02:37:34.652268 2759 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 25 02:37:34.775385 kubelet[2759]: I0325 02:37:34.775341 2759 server.go:520] "Kubelet version" kubeletVersion="v1.32.0"
Mar 25 02:37:34.775385 kubelet[2759]: I0325 02:37:34.775352 2759 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 25 02:37:34.775487 kubelet[2759]: I0325 02:37:34.775480 2759 server.go:954] "Client rotation is on, will bootstrap in background"
Mar 25 02:37:34.792807 kubelet[2759]: E0325 02:37:34.792792 2759 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://147.75.90.239:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 147.75.90.239:6443: connect: connection refused" logger="UnhandledError"
Mar 25 02:37:34.793363 kubelet[2759]: I0325 02:37:34.793349 2759 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 25 02:37:34.800636 kubelet[2759]: I0325 02:37:34.800621 2759 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 25 02:37:34.809986 kubelet[2759]: I0325 02:37:34.809973 2759 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 25 02:37:34.810141 kubelet[2759]: I0325 02:37:34.810094 2759 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 25 02:37:34.810235 kubelet[2759]: I0325 02:37:34.810110 2759 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284.0.0-a-3a00d206eb","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 25 02:37:34.810235 kubelet[2759]: I0325 02:37:34.810207 2759 topology_manager.go:138] "Creating topology manager with none policy"
Mar 25 02:37:34.810235 kubelet[2759]: I0325 02:37:34.810214 2759 container_manager_linux.go:304] "Creating device plugin manager"
Mar 25 02:37:34.810340 kubelet[2759]: I0325 02:37:34.810279 2759 state_mem.go:36] "Initialized new in-memory state store"
Mar 25 02:37:34.812976 kubelet[2759]: I0325 02:37:34.812939 2759 kubelet.go:446] "Attempting to sync node with API server"
Mar 25 02:37:34.812976 kubelet[2759]: I0325 02:37:34.812948 2759 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 25 02:37:34.812976 kubelet[2759]: I0325 02:37:34.812959 2759 kubelet.go:352] "Adding apiserver pod source"
Mar 25 02:37:34.812976 kubelet[2759]: I0325 02:37:34.812965 2759 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 25 02:37:34.815295 kubelet[2759]: I0325 02:37:34.815281 2759 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1"
Mar 25 02:37:34.815578 kubelet[2759]: I0325 02:37:34.815547 2759 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 25 02:37:34.815610 kubelet[2759]: W0325 02:37:34.815594 2759 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Mar 25 02:37:34.816472 kubelet[2759]: W0325 02:37:34.816429 2759 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://147.75.90.239:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 147.75.90.239:6443: connect: connection refused
Mar 25 02:37:34.816520 kubelet[2759]: E0325 02:37:34.816477 2759 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://147.75.90.239:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 147.75.90.239:6443: connect: connection refused" logger="UnhandledError"
Mar 25 02:37:34.817387 kubelet[2759]: W0325 02:37:34.817341 2759 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://147.75.90.239:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284.0.0-a-3a00d206eb&limit=500&resourceVersion=0": dial tcp 147.75.90.239:6443: connect: connection refused
Mar 25 02:37:34.817387 kubelet[2759]: E0325 02:37:34.817364 2759 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://147.75.90.239:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284.0.0-a-3a00d206eb&limit=500&resourceVersion=0\": dial tcp 147.75.90.239:6443: connect: connection refused" logger="UnhandledError"
Mar 25 02:37:34.818614 kubelet[2759]: I0325 02:37:34.818602 2759 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Mar 25 02:37:34.818656 kubelet[2759]: I0325 02:37:34.818624 2759 server.go:1287] "Started kubelet"
Mar 25 02:37:34.819025 kubelet[2759]: I0325 02:37:34.818951 2759 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 25 02:37:34.819292 kubelet[2759]: I0325 02:37:34.819263 2759 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 25 02:37:34.819386 kubelet[2759]: I0325 02:37:34.819314 2759 server.go:169] "Starting to listen" address="0.0.0.0" port=10250
Mar 25 02:37:34.819826 kubelet[2759]: I0325 02:37:34.819817 2759 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 25 02:37:34.819885 kubelet[2759]: E0325 02:37:34.819871 2759 kubelet.go:1561] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 25 02:37:34.819918 kubelet[2759]: E0325 02:37:34.819881 2759 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"ci-4284.0.0-a-3a00d206eb\" not found"
Mar 25 02:37:34.819918 kubelet[2759]: I0325 02:37:34.819892 2759 desired_state_of_world_populator.go:149] "Desired state populator starts to run"
Mar 25 02:37:34.819918 kubelet[2759]: I0325 02:37:34.819887 2759 volume_manager.go:297] "Starting Kubelet Volume Manager"
Mar 25 02:37:34.820002 kubelet[2759]: I0325 02:37:34.819889 2759 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 25 02:37:34.822583 kubelet[2759]: I0325 02:37:34.822574 2759 reconciler.go:26] "Reconciler: start to sync state"
Mar 25 02:37:34.822726 kubelet[2759]: W0325 02:37:34.822706 2759 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://147.75.90.239:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 147.75.90.239:6443: connect: connection refused
Mar 25 02:37:34.822757 kubelet[2759]: E0325 02:37:34.822734 2759 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://147.75.90.239:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 147.75.90.239:6443: connect: connection refused" logger="UnhandledError"
Mar 25 02:37:34.822757 kubelet[2759]: E0325 02:37:34.822735 2759 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://147.75.90.239:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284.0.0-a-3a00d206eb?timeout=10s\": dial tcp 147.75.90.239:6443: connect: connection refused" interval="200ms"
Mar 25 02:37:34.822964 kubelet[2759]: I0325 02:37:34.822957 2759 server.go:490] "Adding debug handlers to kubelet server"
Mar 25 02:37:34.823118 kubelet[2759]: I0325 02:37:34.823111 2759 factory.go:221] Registration of the containerd container factory successfully
Mar 25 02:37:34.823118 kubelet[2759]: I0325 02:37:34.823118 2759 factory.go:221] Registration of the systemd container factory successfully
Mar 25 02:37:34.823159 kubelet[2759]: I0325 02:37:34.823151 2759 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 25 02:37:34.823590 kubelet[2759]: E0325 02:37:34.822676 2759 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://147.75.90.239:6443/api/v1/namespaces/default/events\": dial tcp 147.75.90.239:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4284.0.0-a-3a00d206eb.182feb433c0ed949 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4284.0.0-a-3a00d206eb,UID:ci-4284.0.0-a-3a00d206eb,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4284.0.0-a-3a00d206eb,},FirstTimestamp:2025-03-25 02:37:34.818613577 +0000 UTC m=+0.185839604,LastTimestamp:2025-03-25 02:37:34.818613577 +0000 UTC m=+0.185839604,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4284.0.0-a-3a00d206eb,}"
Mar 25 02:37:34.829448 kubelet[2759]: I0325 02:37:34.829440 2759 cpu_manager.go:221] "Starting CPU manager" policy="none"
Mar 25 02:37:34.829448 kubelet[2759]: I0325 02:37:34.829446 2759 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Mar 25 02:37:34.829515 kubelet[2759]: I0325 02:37:34.829455 2759 state_mem.go:36] "Initialized new in-memory state store"
Mar 25 02:37:34.830295 kubelet[2759]: I0325 02:37:34.830286 2759 policy_none.go:49] "None policy: Start"
Mar 25 02:37:34.830295 kubelet[2759]: I0325 02:37:34.830294 2759 memory_manager.go:186] "Starting memorymanager" policy="None"
Mar 25 02:37:34.830338 kubelet[2759]: I0325 02:37:34.830300 2759 state_mem.go:35] "Initializing new in-memory state store"
Mar 25 02:37:34.830386 kubelet[2759]: I0325 02:37:34.830375 2759 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 25 02:37:34.830907 kubelet[2759]: I0325 02:37:34.830899 2759 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 25 02:37:34.830935 kubelet[2759]: I0325 02:37:34.830910 2759 status_manager.go:227] "Starting to sync pod status with apiserver"
Mar 25 02:37:34.830935 kubelet[2759]: I0325 02:37:34.830922 2759 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 25 02:37:34.830935 kubelet[2759]: I0325 02:37:34.830926 2759 kubelet.go:2388] "Starting kubelet main sync loop"
Mar 25 02:37:34.830981 kubelet[2759]: E0325 02:37:34.830961 2759 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 25 02:37:34.831924 kubelet[2759]: W0325 02:37:34.831904 2759 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://147.75.90.239:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 147.75.90.239:6443: connect: connection refused
Mar 25 02:37:34.831972 kubelet[2759]: E0325 02:37:34.831934 2759 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://147.75.90.239:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 147.75.90.239:6443: connect: connection refused" logger="UnhandledError"
Mar 25 02:37:34.834078 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Mar 25 02:37:34.855612 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Mar 25 02:37:34.857551 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Mar 25 02:37:34.872412 kubelet[2759]: I0325 02:37:34.872370 2759 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 25 02:37:34.872555 kubelet[2759]: I0325 02:37:34.872519 2759 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 25 02:37:34.872555 kubelet[2759]: I0325 02:37:34.872529 2759 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 25 02:37:34.872704 kubelet[2759]: I0325 02:37:34.872686 2759 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 25 02:37:34.873241 kubelet[2759]: E0325 02:37:34.873202 2759 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Mar 25 02:37:34.873241 kubelet[2759]: E0325 02:37:34.873231 2759 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4284.0.0-a-3a00d206eb\" not found"
Mar 25 02:37:34.954548 systemd[1]: Created slice kubepods-burstable-pod784b346fb55ab4cac989400e8e663aaa.slice - libcontainer container kubepods-burstable-pod784b346fb55ab4cac989400e8e663aaa.slice.
Mar 25 02:37:34.975947 kubelet[2759]: I0325 02:37:34.975881 2759 kubelet_node_status.go:76] "Attempting to register node" node="ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:34.976641 kubelet[2759]: E0325 02:37:34.976574 2759 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://147.75.90.239:6443/api/v1/nodes\": dial tcp 147.75.90.239:6443: connect: connection refused" node="ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:34.984061 kubelet[2759]: E0325 02:37:34.983970 2759 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284.0.0-a-3a00d206eb\" not found" node="ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:34.991960 systemd[1]: Created slice kubepods-burstable-pod50714d2baeb29c90bb43d3dafa0eec23.slice - libcontainer container kubepods-burstable-pod50714d2baeb29c90bb43d3dafa0eec23.slice.
Mar 25 02:37:35.008596 kubelet[2759]: E0325 02:37:35.008550 2759 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284.0.0-a-3a00d206eb\" not found" node="ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:35.015928 systemd[1]: Created slice kubepods-burstable-pod4a586da502b920b0969b3dec7e3d7f2b.slice - libcontainer container kubepods-burstable-pod4a586da502b920b0969b3dec7e3d7f2b.slice.
Mar 25 02:37:35.020338 kubelet[2759]: E0325 02:37:35.020252 2759 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284.0.0-a-3a00d206eb\" not found" node="ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:35.023857 kubelet[2759]: I0325 02:37:35.023770 2759 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/50714d2baeb29c90bb43d3dafa0eec23-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284.0.0-a-3a00d206eb\" (UID: \"50714d2baeb29c90bb43d3dafa0eec23\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:35.024041 kubelet[2759]: I0325 02:37:35.023915 2759 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/784b346fb55ab4cac989400e8e663aaa-k8s-certs\") pod \"kube-apiserver-ci-4284.0.0-a-3a00d206eb\" (UID: \"784b346fb55ab4cac989400e8e663aaa\") " pod="kube-system/kube-apiserver-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:35.024041 kubelet[2759]: I0325 02:37:35.023990 2759 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/50714d2baeb29c90bb43d3dafa0eec23-ca-certs\") pod \"kube-controller-manager-ci-4284.0.0-a-3a00d206eb\" (UID: \"50714d2baeb29c90bb43d3dafa0eec23\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:35.024223 kubelet[2759]: I0325 02:37:35.024041 2759 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/50714d2baeb29c90bb43d3dafa0eec23-kubeconfig\") pod \"kube-controller-manager-ci-4284.0.0-a-3a00d206eb\" (UID: \"50714d2baeb29c90bb43d3dafa0eec23\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:35.024223 kubelet[2759]: E0325 02:37:35.024029 2759 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://147.75.90.239:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284.0.0-a-3a00d206eb?timeout=10s\": dial tcp 147.75.90.239:6443: connect: connection refused" interval="400ms"
Mar 25 02:37:35.024223 kubelet[2759]: I0325 02:37:35.024087 2759 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4a586da502b920b0969b3dec7e3d7f2b-kubeconfig\") pod \"kube-scheduler-ci-4284.0.0-a-3a00d206eb\" (UID: \"4a586da502b920b0969b3dec7e3d7f2b\") " pod="kube-system/kube-scheduler-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:35.024223 kubelet[2759]: I0325 02:37:35.024154 2759 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/784b346fb55ab4cac989400e8e663aaa-ca-certs\") pod \"kube-apiserver-ci-4284.0.0-a-3a00d206eb\" (UID: \"784b346fb55ab4cac989400e8e663aaa\") " pod="kube-system/kube-apiserver-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:35.024223 kubelet[2759]: I0325 02:37:35.024199 2759 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/784b346fb55ab4cac989400e8e663aaa-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284.0.0-a-3a00d206eb\" (UID: \"784b346fb55ab4cac989400e8e663aaa\") " pod="kube-system/kube-apiserver-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:35.024646 kubelet[2759]: I0325 02:37:35.024246 2759 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/50714d2baeb29c90bb43d3dafa0eec23-flexvolume-dir\") pod \"kube-controller-manager-ci-4284.0.0-a-3a00d206eb\" (UID: \"50714d2baeb29c90bb43d3dafa0eec23\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:35.024646 kubelet[2759]: I0325 02:37:35.024287 2759 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/50714d2baeb29c90bb43d3dafa0eec23-k8s-certs\") pod \"kube-controller-manager-ci-4284.0.0-a-3a00d206eb\" (UID: \"50714d2baeb29c90bb43d3dafa0eec23\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:35.180687 kubelet[2759]: I0325 02:37:35.180600 2759 kubelet_node_status.go:76] "Attempting to register node" node="ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:35.181455 kubelet[2759]: E0325 02:37:35.181345 2759 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://147.75.90.239:6443/api/v1/nodes\": dial tcp 147.75.90.239:6443: connect: connection refused" node="ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:35.286099 containerd[1819]: time="2025-03-25T02:37:35.286004614Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4284.0.0-a-3a00d206eb,Uid:784b346fb55ab4cac989400e8e663aaa,Namespace:kube-system,Attempt:0,}"
Mar 25 02:37:35.295249 containerd[1819]: time="2025-03-25T02:37:35.295229833Z" level=info msg="connecting to shim 9c8d2a8228c0ef71c92ba7a19366940edb94b33a24a6a8170eb6f024e6780338" address="unix:///run/containerd/s/db25ea1694e514f28d2abc0700bf51cd404e599b1eba0cc2e780528a0f069648" namespace=k8s.io protocol=ttrpc version=3
Mar 25 02:37:35.310153 containerd[1819]: time="2025-03-25T02:37:35.310129607Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284.0.0-a-3a00d206eb,Uid:50714d2baeb29c90bb43d3dafa0eec23,Namespace:kube-system,Attempt:0,}"
Mar 25 02:37:35.317008 systemd[1]: Started cri-containerd-9c8d2a8228c0ef71c92ba7a19366940edb94b33a24a6a8170eb6f024e6780338.scope - libcontainer container 9c8d2a8228c0ef71c92ba7a19366940edb94b33a24a6a8170eb6f024e6780338.
Mar 25 02:37:35.322129 containerd[1819]: time="2025-03-25T02:37:35.322093554Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284.0.0-a-3a00d206eb,Uid:4a586da502b920b0969b3dec7e3d7f2b,Namespace:kube-system,Attempt:0,}"
Mar 25 02:37:35.337350 containerd[1819]: time="2025-03-25T02:37:35.337327732Z" level=info msg="connecting to shim 0ae21a2aabd6e8d479126c947482647ba187ee17c9e390d21fae21cc300c9389" address="unix:///run/containerd/s/5c4cf24c9785cef5875a0b8a2cfa4e768329abcc165aba56acf0fbd9b18b3d9e" namespace=k8s.io protocol=ttrpc version=3
Mar 25 02:37:35.340661 containerd[1819]: time="2025-03-25T02:37:35.340632604Z" level=info msg="connecting to shim 451fb8aa2b071234e7ff3017b9cde36ca59eae8974baed7c5b7c939149b60423" address="unix:///run/containerd/s/3bc439d9874aa5450ace97ee2338a5e2cb1904b282a9a210012ab9d5b65dfc25" namespace=k8s.io protocol=ttrpc version=3
Mar 25 02:37:35.344776 containerd[1819]: time="2025-03-25T02:37:35.344753761Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4284.0.0-a-3a00d206eb,Uid:784b346fb55ab4cac989400e8e663aaa,Namespace:kube-system,Attempt:0,} returns sandbox id \"9c8d2a8228c0ef71c92ba7a19366940edb94b33a24a6a8170eb6f024e6780338\""
Mar 25 02:37:35.355069 systemd[1]: Started cri-containerd-0ae21a2aabd6e8d479126c947482647ba187ee17c9e390d21fae21cc300c9389.scope - libcontainer container 0ae21a2aabd6e8d479126c947482647ba187ee17c9e390d21fae21cc300c9389.
Mar 25 02:37:35.355478 containerd[1819]: time="2025-03-25T02:37:35.355430059Z" level=info msg="CreateContainer within sandbox \"9c8d2a8228c0ef71c92ba7a19366940edb94b33a24a6a8170eb6f024e6780338\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Mar 25 02:37:35.357439 systemd[1]: Started cri-containerd-451fb8aa2b071234e7ff3017b9cde36ca59eae8974baed7c5b7c939149b60423.scope - libcontainer container 451fb8aa2b071234e7ff3017b9cde36ca59eae8974baed7c5b7c939149b60423.
Mar 25 02:37:35.358907 containerd[1819]: time="2025-03-25T02:37:35.358890908Z" level=info msg="Container 5c8051c2621d5c81e979a3b79f81d7eb8f010a96a44194e2fa899d08308e55e3: CDI devices from CRI Config.CDIDevices: []"
Mar 25 02:37:35.363460 containerd[1819]: time="2025-03-25T02:37:35.363432775Z" level=info msg="CreateContainer within sandbox \"9c8d2a8228c0ef71c92ba7a19366940edb94b33a24a6a8170eb6f024e6780338\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"5c8051c2621d5c81e979a3b79f81d7eb8f010a96a44194e2fa899d08308e55e3\""
Mar 25 02:37:35.363877 containerd[1819]: time="2025-03-25T02:37:35.363863328Z" level=info msg="StartContainer for \"5c8051c2621d5c81e979a3b79f81d7eb8f010a96a44194e2fa899d08308e55e3\""
Mar 25 02:37:35.364455 containerd[1819]: time="2025-03-25T02:37:35.364442646Z" level=info msg="connecting to shim 5c8051c2621d5c81e979a3b79f81d7eb8f010a96a44194e2fa899d08308e55e3" address="unix:///run/containerd/s/db25ea1694e514f28d2abc0700bf51cd404e599b1eba0cc2e780528a0f069648" protocol=ttrpc version=3
Mar 25 02:37:35.372000 systemd[1]: Started cri-containerd-5c8051c2621d5c81e979a3b79f81d7eb8f010a96a44194e2fa899d08308e55e3.scope - libcontainer container 5c8051c2621d5c81e979a3b79f81d7eb8f010a96a44194e2fa899d08308e55e3.
Mar 25 02:37:35.384089 containerd[1819]: time="2025-03-25T02:37:35.384066803Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284.0.0-a-3a00d206eb,Uid:50714d2baeb29c90bb43d3dafa0eec23,Namespace:kube-system,Attempt:0,} returns sandbox id \"0ae21a2aabd6e8d479126c947482647ba187ee17c9e390d21fae21cc300c9389\""
Mar 25 02:37:35.385069 containerd[1819]: time="2025-03-25T02:37:35.385053799Z" level=info msg="CreateContainer within sandbox \"0ae21a2aabd6e8d479126c947482647ba187ee17c9e390d21fae21cc300c9389\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Mar 25 02:37:35.387158 containerd[1819]: time="2025-03-25T02:37:35.387144082Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284.0.0-a-3a00d206eb,Uid:4a586da502b920b0969b3dec7e3d7f2b,Namespace:kube-system,Attempt:0,} returns sandbox id \"451fb8aa2b071234e7ff3017b9cde36ca59eae8974baed7c5b7c939149b60423\""
Mar 25 02:37:35.388018 containerd[1819]: time="2025-03-25T02:37:35.388006494Z" level=info msg="CreateContainer within sandbox \"451fb8aa2b071234e7ff3017b9cde36ca59eae8974baed7c5b7c939149b60423\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Mar 25 02:37:35.388450 containerd[1819]: time="2025-03-25T02:37:35.388439415Z" level=info msg="Container 40d5cd5a48d7baaae7fa7fd264bd02e6dfe7e60f61dd1d188cf2b17d30dcf7bf: CDI devices from CRI Config.CDIDevices: []"
Mar 25 02:37:35.391274 containerd[1819]: time="2025-03-25T02:37:35.391258333Z" level=info msg="CreateContainer within sandbox \"0ae21a2aabd6e8d479126c947482647ba187ee17c9e390d21fae21cc300c9389\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"40d5cd5a48d7baaae7fa7fd264bd02e6dfe7e60f61dd1d188cf2b17d30dcf7bf\""
Mar 25 02:37:35.391322 containerd[1819]: time="2025-03-25T02:37:35.391311443Z" level=info msg="Container c382d59abf21517243f158a110991dd5851e124f63ddeecc09ecd2d7a5cea1ec: CDI devices from CRI Config.CDIDevices: []"
Mar 25 02:37:35.391460 containerd[1819]: time="2025-03-25T02:37:35.391450288Z" level=info msg="StartContainer for \"40d5cd5a48d7baaae7fa7fd264bd02e6dfe7e60f61dd1d188cf2b17d30dcf7bf\""
Mar 25 02:37:35.391987 containerd[1819]: time="2025-03-25T02:37:35.391976805Z" level=info msg="connecting to shim 40d5cd5a48d7baaae7fa7fd264bd02e6dfe7e60f61dd1d188cf2b17d30dcf7bf" address="unix:///run/containerd/s/5c4cf24c9785cef5875a0b8a2cfa4e768329abcc165aba56acf0fbd9b18b3d9e" protocol=ttrpc version=3
Mar 25 02:37:35.394025 containerd[1819]: time="2025-03-25T02:37:35.394004559Z" level=info msg="CreateContainer within sandbox \"451fb8aa2b071234e7ff3017b9cde36ca59eae8974baed7c5b7c939149b60423\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"c382d59abf21517243f158a110991dd5851e124f63ddeecc09ecd2d7a5cea1ec\""
Mar 25 02:37:35.394239 containerd[1819]: time="2025-03-25T02:37:35.394225613Z" level=info msg="StartContainer for \"c382d59abf21517243f158a110991dd5851e124f63ddeecc09ecd2d7a5cea1ec\""
Mar 25 02:37:35.394758 containerd[1819]: time="2025-03-25T02:37:35.394746285Z" level=info msg="connecting to shim c382d59abf21517243f158a110991dd5851e124f63ddeecc09ecd2d7a5cea1ec" address="unix:///run/containerd/s/3bc439d9874aa5450ace97ee2338a5e2cb1904b282a9a210012ab9d5b65dfc25" protocol=ttrpc version=3
Mar 25 02:37:35.408838 systemd[1]: Started cri-containerd-40d5cd5a48d7baaae7fa7fd264bd02e6dfe7e60f61dd1d188cf2b17d30dcf7bf.scope - libcontainer container 40d5cd5a48d7baaae7fa7fd264bd02e6dfe7e60f61dd1d188cf2b17d30dcf7bf.
Mar 25 02:37:35.410527 systemd[1]: Started cri-containerd-c382d59abf21517243f158a110991dd5851e124f63ddeecc09ecd2d7a5cea1ec.scope - libcontainer container c382d59abf21517243f158a110991dd5851e124f63ddeecc09ecd2d7a5cea1ec.
Mar 25 02:37:35.410635 containerd[1819]: time="2025-03-25T02:37:35.410587636Z" level=info msg="StartContainer for \"5c8051c2621d5c81e979a3b79f81d7eb8f010a96a44194e2fa899d08308e55e3\" returns successfully"
Mar 25 02:37:35.437139 containerd[1819]: time="2025-03-25T02:37:35.437112010Z" level=info msg="StartContainer for \"c382d59abf21517243f158a110991dd5851e124f63ddeecc09ecd2d7a5cea1ec\" returns successfully"
Mar 25 02:37:35.437222 containerd[1819]: time="2025-03-25T02:37:35.437145099Z" level=info msg="StartContainer for \"40d5cd5a48d7baaae7fa7fd264bd02e6dfe7e60f61dd1d188cf2b17d30dcf7bf\" returns successfully"
Mar 25 02:37:35.582898 kubelet[2759]: I0325 02:37:35.582850 2759 kubelet_node_status.go:76] "Attempting to register node" node="ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:35.835256 kubelet[2759]: E0325 02:37:35.835196 2759 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284.0.0-a-3a00d206eb\" not found" node="ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:35.835800 kubelet[2759]: E0325 02:37:35.835791 2759 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284.0.0-a-3a00d206eb\" not found" node="ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:35.836582 kubelet[2759]: E0325 02:37:35.836572 2759 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284.0.0-a-3a00d206eb\" not found" node="ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:36.171377 kubelet[2759]: E0325 02:37:36.171319 2759 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4284.0.0-a-3a00d206eb\" not found" node="ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:36.289872 kubelet[2759]: I0325 02:37:36.289800 2759 kubelet_node_status.go:79] "Successfully registered node" node="ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:36.320022 kubelet[2759]: I0325 02:37:36.319964 2759 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:36.323547 kubelet[2759]: E0325 02:37:36.323524 2759 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4284.0.0-a-3a00d206eb\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:36.323547 kubelet[2759]: I0325 02:37:36.323546 2759 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:36.324930 kubelet[2759]: E0325 02:37:36.324888 2759 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4284.0.0-a-3a00d206eb\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:36.324930 kubelet[2759]: I0325 02:37:36.324905 2759 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:36.325991 kubelet[2759]: E0325 02:37:36.325949 2759 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4284.0.0-a-3a00d206eb\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:36.814515 kubelet[2759]: I0325 02:37:36.814406 2759 apiserver.go:52] "Watching apiserver"
Mar 25 02:37:36.820511 kubelet[2759]: I0325 02:37:36.820423 2759 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world"
Mar 25 02:37:36.838030 kubelet[2759]: I0325 02:37:36.837976 2759 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:36.838840 kubelet[2759]: I0325 02:37:36.838197 2759 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:36.842297 kubelet[2759]: E0325 02:37:36.842227 2759 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4284.0.0-a-3a00d206eb\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:36.842863 kubelet[2759]: E0325 02:37:36.842778 2759 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4284.0.0-a-3a00d206eb\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:37.728669 kubelet[2759]: I0325 02:37:37.728569 2759 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:37.735791 kubelet[2759]: W0325 02:37:37.735737 2759 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Mar 25 02:37:38.590121 systemd[1]: Reload requested from client PID 3077 ('systemctl') (unit session-11.scope)...
Mar 25 02:37:38.590130 systemd[1]: Reloading...
Mar 25 02:37:38.635702 zram_generator::config[3124]: No configuration found.
Mar 25 02:37:38.712393 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 25 02:37:38.802452 systemd[1]: Reloading finished in 212 ms.
Mar 25 02:37:38.819914 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 02:37:38.825106 systemd[1]: kubelet.service: Deactivated successfully.
Mar 25 02:37:38.825221 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 02:37:38.825243 systemd[1]: kubelet.service: Consumed 724ms CPU time, 136M memory peak.
Mar 25 02:37:38.826498 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 02:37:39.096409 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 02:37:39.098696 (kubelet)[3188]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 25 02:37:39.121981 kubelet[3188]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 25 02:37:39.121981 kubelet[3188]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 25 02:37:39.121981 kubelet[3188]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 25 02:37:39.122210 kubelet[3188]: I0325 02:37:39.122014 3188 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 25 02:37:39.126456 kubelet[3188]: I0325 02:37:39.126435 3188 server.go:520] "Kubelet version" kubeletVersion="v1.32.0"
Mar 25 02:37:39.126456 kubelet[3188]: I0325 02:37:39.126453 3188 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 25 02:37:39.126684 kubelet[3188]: I0325 02:37:39.126674 3188 server.go:954] "Client rotation is on, will bootstrap in background"
Mar 25 02:37:39.127449 kubelet[3188]: I0325 02:37:39.127437 3188 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 25 02:37:39.128762 kubelet[3188]: I0325 02:37:39.128751 3188 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 25 02:37:39.130714 kubelet[3188]: I0325 02:37:39.130674 3188 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 25 02:37:39.138540 kubelet[3188]: I0325 02:37:39.138497 3188 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 25 02:37:39.138630 kubelet[3188]: I0325 02:37:39.138606 3188 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 25 02:37:39.138765 kubelet[3188]: I0325 02:37:39.138628 3188 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284.0.0-a-3a00d206eb","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 25 02:37:39.138765 kubelet[3188]: I0325 02:37:39.138741 3188 topology_manager.go:138] "Creating topology manager with none policy"
Mar 25 02:37:39.138765 kubelet[3188]: I0325 02:37:39.138747 3188 container_manager_linux.go:304] "Creating device plugin manager"
Mar 25 02:37:39.138889 kubelet[3188]: I0325 02:37:39.138775 3188 state_mem.go:36] "Initialized new in-memory state store"
Mar 25 02:37:39.138908 kubelet[3188]: I0325 02:37:39.138900 3188 kubelet.go:446] "Attempting to sync node with API server"
Mar 25 02:37:39.138928 kubelet[3188]: I0325 02:37:39.138908 3188 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 25 02:37:39.138928 kubelet[3188]: I0325 02:37:39.138924 3188 kubelet.go:352] "Adding apiserver pod source"
Mar 25 02:37:39.138962 kubelet[3188]: I0325 02:37:39.138934 3188 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 25 02:37:39.139357 kubelet[3188]: I0325 02:37:39.139344 3188 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1"
Mar 25 02:37:39.139643 kubelet[3188]: I0325 02:37:39.139608 3188 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 25 02:37:39.139900 kubelet[3188]: I0325 02:37:39.139864 3188 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Mar 25 02:37:39.139900 kubelet[3188]: I0325 02:37:39.139883 3188 server.go:1287] "Started kubelet"
Mar 25 02:37:39.140035 kubelet[3188]: I0325 02:37:39.139987 3188 server.go:169] "Starting to listen" address="0.0.0.0" port=10250
Mar 25 02:37:39.140073 kubelet[3188]: I0325 02:37:39.140042 3188 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 25 02:37:39.140191 kubelet[3188]: I0325 02:37:39.140180 3188 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 25 02:37:39.140894 kubelet[3188]: I0325 02:37:39.140879 3188 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 25 02:37:39.141187 kubelet[3188]: I0325 02:37:39.141173 3188 server.go:490] "Adding debug handlers to kubelet server"
Mar 25 02:37:39.141566 kubelet[3188]: I0325 02:37:39.141213 3188 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 25 02:37:39.141853 kubelet[3188]: E0325 02:37:39.141236 3188 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"ci-4284.0.0-a-3a00d206eb\" not found"
Mar 25 02:37:39.141959 kubelet[3188]: I0325 02:37:39.141244 3188 volume_manager.go:297] "Starting Kubelet Volume Manager"
Mar 25 02:37:39.142002 kubelet[3188]: I0325 02:37:39.141258 3188 desired_state_of_world_populator.go:149] "Desired state populator starts to run"
Mar 25 02:37:39.142223 kubelet[3188]: I0325 02:37:39.142201 3188 reconciler.go:26] "Reconciler: start to sync state"
Mar 25 02:37:39.142484 kubelet[3188]: I0325 02:37:39.142473 3188 factory.go:221] Registration of the systemd container factory successfully
Mar 25 02:37:39.142779 kubelet[3188]: I0325 02:37:39.142762 3188 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 25 02:37:39.143023 kubelet[3188]: E0325 02:37:39.143006 3188 kubelet.go:1561] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 25 02:37:39.144236 kubelet[3188]: I0325 02:37:39.144223 3188 factory.go:221] Registration of the containerd container factory successfully
Mar 25 02:37:39.147264 kubelet[3188]: I0325 02:37:39.147240 3188 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 25 02:37:39.147858 kubelet[3188]: I0325 02:37:39.147844 3188 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 25 02:37:39.147904 kubelet[3188]: I0325 02:37:39.147863 3188 status_manager.go:227] "Starting to sync pod status with apiserver"
Mar 25 02:37:39.147904 kubelet[3188]: I0325 02:37:39.147878 3188 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 25 02:37:39.147904 kubelet[3188]: I0325 02:37:39.147894 3188 kubelet.go:2388] "Starting kubelet main sync loop"
Mar 25 02:37:39.148032 kubelet[3188]: E0325 02:37:39.147937 3188 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 25 02:37:39.159465 kubelet[3188]: I0325 02:37:39.159393 3188 cpu_manager.go:221] "Starting CPU manager" policy="none"
Mar 25 02:37:39.159465 kubelet[3188]: I0325 02:37:39.159402 3188 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Mar 25 02:37:39.159465 kubelet[3188]: I0325 02:37:39.159412 3188 state_mem.go:36] "Initialized new in-memory state store"
Mar 25 02:37:39.159572 kubelet[3188]: I0325 02:37:39.159520 3188 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 25 02:37:39.159572 kubelet[3188]: I0325 02:37:39.159527 3188 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 25 02:37:39.159572 kubelet[3188]: I0325 02:37:39.159538 3188 policy_none.go:49] "None policy: Start"
Mar 25 02:37:39.159572 kubelet[3188]: I0325 02:37:39.159543 3188 memory_manager.go:186] "Starting memorymanager" policy="None"
Mar 25 02:37:39.159572 kubelet[3188]: I0325 02:37:39.159548 3188 state_mem.go:35] "Initializing new in-memory state store"
Mar 25 02:37:39.159699 kubelet[3188]: I0325 02:37:39.159604 3188 state_mem.go:75] "Updated machine memory state"
Mar 25 02:37:39.161491 kubelet[3188]: I0325 02:37:39.161453 3188 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 25 02:37:39.161551 kubelet[3188]: I0325 02:37:39.161545 3188 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 25 02:37:39.161572 kubelet[3188]: I0325 02:37:39.161552 3188 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 25 02:37:39.161620 kubelet[3188]: I0325 02:37:39.161614 3188 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 25 02:37:39.161976 kubelet[3188]: E0325 02:37:39.161933 3188 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Mar 25 02:37:39.248786 kubelet[3188]: I0325 02:37:39.248716 3188 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:39.248786 kubelet[3188]: I0325 02:37:39.248733 3188 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:39.249003 kubelet[3188]: I0325 02:37:39.248733 3188 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:39.254677 kubelet[3188]: W0325 02:37:39.254623 3188 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Mar 25 02:37:39.254847 kubelet[3188]: W0325 02:37:39.254757 3188 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Mar 25 02:37:39.256245 kubelet[3188]: W0325 02:37:39.256179 3188 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Mar 25 02:37:39.257804 kubelet[3188]: E0325 02:37:39.256679 3188 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4284.0.0-a-3a00d206eb\" already exists" pod="kube-system/kube-controller-manager-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:39.268570 kubelet[3188]: I0325 02:37:39.268508 3188 kubelet_node_status.go:76] "Attempting to register node" node="ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:39.280268 kubelet[3188]: I0325 02:37:39.280165 3188 kubelet_node_status.go:125] "Node was previously registered" node="ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:39.281145 kubelet[3188]: I0325 02:37:39.281101 3188 kubelet_node_status.go:79] "Successfully registered node" node="ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:39.343218 kubelet[3188]: I0325 02:37:39.343123 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/784b346fb55ab4cac989400e8e663aaa-ca-certs\") pod \"kube-apiserver-ci-4284.0.0-a-3a00d206eb\" (UID: \"784b346fb55ab4cac989400e8e663aaa\") " pod="kube-system/kube-apiserver-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:39.343218 kubelet[3188]: I0325 02:37:39.343211 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/50714d2baeb29c90bb43d3dafa0eec23-flexvolume-dir\") pod \"kube-controller-manager-ci-4284.0.0-a-3a00d206eb\" (UID: \"50714d2baeb29c90bb43d3dafa0eec23\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:39.343522 kubelet[3188]: I0325 02:37:39.343261 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/50714d2baeb29c90bb43d3dafa0eec23-k8s-certs\") pod \"kube-controller-manager-ci-4284.0.0-a-3a00d206eb\" (UID: \"50714d2baeb29c90bb43d3dafa0eec23\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:39.343522 kubelet[3188]: I0325 02:37:39.343300 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4a586da502b920b0969b3dec7e3d7f2b-kubeconfig\") pod \"kube-scheduler-ci-4284.0.0-a-3a00d206eb\" (UID: \"4a586da502b920b0969b3dec7e3d7f2b\") " pod="kube-system/kube-scheduler-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:39.343522 kubelet[3188]: I0325 02:37:39.343345 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/50714d2baeb29c90bb43d3dafa0eec23-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284.0.0-a-3a00d206eb\" (UID: \"50714d2baeb29c90bb43d3dafa0eec23\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:39.343522 kubelet[3188]: I0325 02:37:39.343384 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/784b346fb55ab4cac989400e8e663aaa-k8s-certs\") pod \"kube-apiserver-ci-4284.0.0-a-3a00d206eb\" (UID: \"784b346fb55ab4cac989400e8e663aaa\") " pod="kube-system/kube-apiserver-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:39.343522 kubelet[3188]: I0325 02:37:39.343418 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/784b346fb55ab4cac989400e8e663aaa-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284.0.0-a-3a00d206eb\" (UID: \"784b346fb55ab4cac989400e8e663aaa\") " pod="kube-system/kube-apiserver-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:39.343957 kubelet[3188]: I0325 02:37:39.343453 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/50714d2baeb29c90bb43d3dafa0eec23-ca-certs\") pod \"kube-controller-manager-ci-4284.0.0-a-3a00d206eb\" (UID: \"50714d2baeb29c90bb43d3dafa0eec23\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:39.343957 kubelet[3188]: I0325 02:37:39.343487 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/50714d2baeb29c90bb43d3dafa0eec23-kubeconfig\") pod \"kube-controller-manager-ci-4284.0.0-a-3a00d206eb\" (UID: \"50714d2baeb29c90bb43d3dafa0eec23\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:40.139605 kubelet[3188]: I0325 02:37:40.139491 3188 apiserver.go:52] "Watching apiserver"
Mar 25 02:37:40.154773 kubelet[3188]: I0325 02:37:40.154694 3188 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:40.154943 kubelet[3188]: I0325 02:37:40.154918 3188 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:40.162619 kubelet[3188]: W0325 02:37:40.162563 3188 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Mar 25 02:37:40.162885 kubelet[3188]: E0325 02:37:40.162729 3188 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4284.0.0-a-3a00d206eb\" already exists" pod="kube-system/kube-apiserver-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:40.163000 kubelet[3188]: W0325 02:37:40.162977 3188 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Mar 25 02:37:40.163107 kubelet[3188]: E0325 02:37:40.163076 3188 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4284.0.0-a-3a00d206eb\" already exists" pod="kube-system/kube-scheduler-ci-4284.0.0-a-3a00d206eb"
Mar 25 02:37:40.171892 kubelet[3188]: I0325 02:37:40.171865 3188 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4284.0.0-a-3a00d206eb" podStartSLOduration=1.1718450360000001 podStartE2EDuration="1.171845036s" podCreationTimestamp="2025-03-25 02:37:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 02:37:40.171821665 +0000 UTC m=+1.071132348" watchObservedRunningTime="2025-03-25 02:37:40.171845036 +0000 UTC m=+1.071155718"
Mar 25 02:37:40.175787 kubelet[3188]: I0325 02:37:40.175331 3188 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="kube-system/kube-scheduler-ci-4284.0.0-a-3a00d206eb" podStartSLOduration=1.175319629 podStartE2EDuration="1.175319629s" podCreationTimestamp="2025-03-25 02:37:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 02:37:40.175271407 +0000 UTC m=+1.074582093" watchObservedRunningTime="2025-03-25 02:37:40.175319629 +0000 UTC m=+1.074630309" Mar 25 02:37:40.182765 kubelet[3188]: I0325 02:37:40.182714 3188 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4284.0.0-a-3a00d206eb" podStartSLOduration=3.182704675 podStartE2EDuration="3.182704675s" podCreationTimestamp="2025-03-25 02:37:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 02:37:40.179134328 +0000 UTC m=+1.078445011" watchObservedRunningTime="2025-03-25 02:37:40.182704675 +0000 UTC m=+1.082015357" Mar 25 02:37:40.242488 kubelet[3188]: I0325 02:37:40.242413 3188 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 25 02:37:43.169520 kubelet[3188]: I0325 02:37:43.169417 3188 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 25 02:37:43.170363 containerd[1819]: time="2025-03-25T02:37:43.170043370Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Mar 25 02:37:43.171211 kubelet[3188]: I0325 02:37:43.170602 3188 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 25 02:37:43.511344 sudo[2123]: pam_unix(sudo:session): session closed for user root Mar 25 02:37:43.511962 sshd[2122]: Connection closed by 139.178.68.195 port 51164 Mar 25 02:37:43.512123 sshd-session[2119]: pam_unix(sshd:session): session closed for user core Mar 25 02:37:43.513756 systemd[1]: sshd@12-147.75.90.239:22-139.178.68.195:51164.service: Deactivated successfully. Mar 25 02:37:43.514711 systemd[1]: session-11.scope: Deactivated successfully. Mar 25 02:37:43.514804 systemd[1]: session-11.scope: Consumed 3.496s CPU time, 231.5M memory peak. Mar 25 02:37:43.515841 systemd-logind[1800]: Session 11 logged out. Waiting for processes to exit. Mar 25 02:37:43.516488 systemd-logind[1800]: Removed session 11. Mar 25 02:37:44.274193 systemd[1]: Created slice kubepods-besteffort-pod3fcc69a8_04f0_41be_be8b_66b5e7240ddd.slice - libcontainer container kubepods-besteffort-pod3fcc69a8_04f0_41be_be8b_66b5e7240ddd.slice. 
Mar 25 02:37:44.277541 kubelet[3188]: I0325 02:37:44.277456 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/3fcc69a8-04f0-41be-be8b-66b5e7240ddd-xtables-lock\") pod \"kube-proxy-jl8tq\" (UID: \"3fcc69a8-04f0-41be-be8b-66b5e7240ddd\") " pod="kube-system/kube-proxy-jl8tq" Mar 25 02:37:44.278343 kubelet[3188]: I0325 02:37:44.277565 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/3fcc69a8-04f0-41be-be8b-66b5e7240ddd-kube-proxy\") pod \"kube-proxy-jl8tq\" (UID: \"3fcc69a8-04f0-41be-be8b-66b5e7240ddd\") " pod="kube-system/kube-proxy-jl8tq" Mar 25 02:37:44.278343 kubelet[3188]: I0325 02:37:44.277622 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3fcc69a8-04f0-41be-be8b-66b5e7240ddd-lib-modules\") pod \"kube-proxy-jl8tq\" (UID: \"3fcc69a8-04f0-41be-be8b-66b5e7240ddd\") " pod="kube-system/kube-proxy-jl8tq" Mar 25 02:37:44.278343 kubelet[3188]: I0325 02:37:44.277712 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpcnc\" (UniqueName: \"kubernetes.io/projected/3fcc69a8-04f0-41be-be8b-66b5e7240ddd-kube-api-access-gpcnc\") pod \"kube-proxy-jl8tq\" (UID: \"3fcc69a8-04f0-41be-be8b-66b5e7240ddd\") " pod="kube-system/kube-proxy-jl8tq" Mar 25 02:37:44.346090 systemd[1]: Created slice kubepods-besteffort-pod44cbf0a0_e424_474f_9204_a141c3db63f4.slice - libcontainer container kubepods-besteffort-pod44cbf0a0_e424_474f_9204_a141c3db63f4.slice. 
Mar 25 02:37:44.379103 kubelet[3188]: I0325 02:37:44.379009 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/44cbf0a0-e424-474f-9204-a141c3db63f4-var-lib-calico\") pod \"tigera-operator-ccfc44587-rfhr8\" (UID: \"44cbf0a0-e424-474f-9204-a141c3db63f4\") " pod="tigera-operator/tigera-operator-ccfc44587-rfhr8" Mar 25 02:37:44.379103 kubelet[3188]: I0325 02:37:44.379105 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4xcf\" (UniqueName: \"kubernetes.io/projected/44cbf0a0-e424-474f-9204-a141c3db63f4-kube-api-access-k4xcf\") pod \"tigera-operator-ccfc44587-rfhr8\" (UID: \"44cbf0a0-e424-474f-9204-a141c3db63f4\") " pod="tigera-operator/tigera-operator-ccfc44587-rfhr8" Mar 25 02:37:44.598255 containerd[1819]: time="2025-03-25T02:37:44.598013094Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-jl8tq,Uid:3fcc69a8-04f0-41be-be8b-66b5e7240ddd,Namespace:kube-system,Attempt:0,}" Mar 25 02:37:44.606161 containerd[1819]: time="2025-03-25T02:37:44.606114129Z" level=info msg="connecting to shim 517847e68fc0456f8362f33cdfe4f58def6f81de7f980d5f3d01b3837592875d" address="unix:///run/containerd/s/eb3b03546336538094dc1e81e18da5f37348e9d0aeb21c3f56d8e7202be65e66" namespace=k8s.io protocol=ttrpc version=3 Mar 25 02:37:44.632909 systemd[1]: Started cri-containerd-517847e68fc0456f8362f33cdfe4f58def6f81de7f980d5f3d01b3837592875d.scope - libcontainer container 517847e68fc0456f8362f33cdfe4f58def6f81de7f980d5f3d01b3837592875d. 
Mar 25 02:37:44.649958 containerd[1819]: time="2025-03-25T02:37:44.649886656Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-ccfc44587-rfhr8,Uid:44cbf0a0-e424-474f-9204-a141c3db63f4,Namespace:tigera-operator,Attempt:0,}" Mar 25 02:37:44.650027 containerd[1819]: time="2025-03-25T02:37:44.649965995Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-jl8tq,Uid:3fcc69a8-04f0-41be-be8b-66b5e7240ddd,Namespace:kube-system,Attempt:0,} returns sandbox id \"517847e68fc0456f8362f33cdfe4f58def6f81de7f980d5f3d01b3837592875d\"" Mar 25 02:37:44.651054 containerd[1819]: time="2025-03-25T02:37:44.651041393Z" level=info msg="CreateContainer within sandbox \"517847e68fc0456f8362f33cdfe4f58def6f81de7f980d5f3d01b3837592875d\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 25 02:37:44.655942 containerd[1819]: time="2025-03-25T02:37:44.655898149Z" level=info msg="Container ea3fecab24943c844276aa10a1037a07e68c60d82d993d38c56e81134710bd33: CDI devices from CRI Config.CDIDevices: []" Mar 25 02:37:44.657645 containerd[1819]: time="2025-03-25T02:37:44.657608999Z" level=info msg="connecting to shim db66b328053df6ad6552874937d39a13eded766c5e16c6e5e5fe95d43910d0b7" address="unix:///run/containerd/s/5025f3bfb5a3766db0f250415c010d5609d5f069962aec1081183f52fac3e483" namespace=k8s.io protocol=ttrpc version=3 Mar 25 02:37:44.659549 containerd[1819]: time="2025-03-25T02:37:44.659533247Z" level=info msg="CreateContainer within sandbox \"517847e68fc0456f8362f33cdfe4f58def6f81de7f980d5f3d01b3837592875d\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"ea3fecab24943c844276aa10a1037a07e68c60d82d993d38c56e81134710bd33\"" Mar 25 02:37:44.659857 containerd[1819]: time="2025-03-25T02:37:44.659846125Z" level=info msg="StartContainer for \"ea3fecab24943c844276aa10a1037a07e68c60d82d993d38c56e81134710bd33\"" Mar 25 02:37:44.660599 containerd[1819]: time="2025-03-25T02:37:44.660584700Z" level=info msg="connecting to shim 
ea3fecab24943c844276aa10a1037a07e68c60d82d993d38c56e81134710bd33" address="unix:///run/containerd/s/eb3b03546336538094dc1e81e18da5f37348e9d0aeb21c3f56d8e7202be65e66" protocol=ttrpc version=3 Mar 25 02:37:44.681123 systemd[1]: Started cri-containerd-db66b328053df6ad6552874937d39a13eded766c5e16c6e5e5fe95d43910d0b7.scope - libcontainer container db66b328053df6ad6552874937d39a13eded766c5e16c6e5e5fe95d43910d0b7. Mar 25 02:37:44.690621 systemd[1]: Started cri-containerd-ea3fecab24943c844276aa10a1037a07e68c60d82d993d38c56e81134710bd33.scope - libcontainer container ea3fecab24943c844276aa10a1037a07e68c60d82d993d38c56e81134710bd33. Mar 25 02:37:44.748298 containerd[1819]: time="2025-03-25T02:37:44.748269376Z" level=info msg="StartContainer for \"ea3fecab24943c844276aa10a1037a07e68c60d82d993d38c56e81134710bd33\" returns successfully" Mar 25 02:37:44.755468 containerd[1819]: time="2025-03-25T02:37:44.755445943Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-ccfc44587-rfhr8,Uid:44cbf0a0-e424-474f-9204-a141c3db63f4,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"db66b328053df6ad6552874937d39a13eded766c5e16c6e5e5fe95d43910d0b7\"" Mar 25 02:37:44.756207 containerd[1819]: time="2025-03-25T02:37:44.756192835Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\"" Mar 25 02:37:45.121138 systemd[1]: Started sshd@14-147.75.90.239:22-83.235.16.111:57248.service - OpenSSH per-connection server daemon (83.235.16.111:57248). 
Mar 25 02:37:45.181597 kubelet[3188]: I0325 02:37:45.181546 3188 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-jl8tq" podStartSLOduration=1.181527553 podStartE2EDuration="1.181527553s" podCreationTimestamp="2025-03-25 02:37:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 02:37:45.181420546 +0000 UTC m=+6.080731247" watchObservedRunningTime="2025-03-25 02:37:45.181527553 +0000 UTC m=+6.080838243" Mar 25 02:37:46.178866 sshd[3544]: Invalid user pa from 83.235.16.111 port 57248 Mar 25 02:37:46.369373 sshd[3544]: Received disconnect from 83.235.16.111 port 57248:11: Bye Bye [preauth] Mar 25 02:37:46.369373 sshd[3544]: Disconnected from invalid user pa 83.235.16.111 port 57248 [preauth] Mar 25 02:37:46.372056 systemd[1]: sshd@14-147.75.90.239:22-83.235.16.111:57248.service: Deactivated successfully. Mar 25 02:37:47.442175 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1407151242.mount: Deactivated successfully. 
Mar 25 02:37:47.678372 containerd[1819]: time="2025-03-25T02:37:47.678347612Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:37:47.678593 containerd[1819]: time="2025-03-25T02:37:47.678564712Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.5: active requests=0, bytes read=21945008" Mar 25 02:37:47.678860 containerd[1819]: time="2025-03-25T02:37:47.678845902Z" level=info msg="ImageCreate event name:\"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:37:47.679901 containerd[1819]: time="2025-03-25T02:37:47.679886083Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:37:47.680294 containerd[1819]: time="2025-03-25T02:37:47.680283910Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.5\" with image id \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\", repo tag \"quay.io/tigera/operator:v1.36.5\", repo digest \"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\", size \"21941003\" in 2.924073189s" Mar 25 02:37:47.680314 containerd[1819]: time="2025-03-25T02:37:47.680298185Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\" returns image reference \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\"" Mar 25 02:37:47.681152 containerd[1819]: time="2025-03-25T02:37:47.681138948Z" level=info msg="CreateContainer within sandbox \"db66b328053df6ad6552874937d39a13eded766c5e16c6e5e5fe95d43910d0b7\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 25 02:37:47.683830 containerd[1819]: time="2025-03-25T02:37:47.683791579Z" level=info msg="Container 
1f35fa9decb7831284427ca16bd6f48f2fe46d10a2d4e41f9d6870ff656f3ce4: CDI devices from CRI Config.CDIDevices: []" Mar 25 02:37:47.685910 containerd[1819]: time="2025-03-25T02:37:47.685869311Z" level=info msg="CreateContainer within sandbox \"db66b328053df6ad6552874937d39a13eded766c5e16c6e5e5fe95d43910d0b7\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"1f35fa9decb7831284427ca16bd6f48f2fe46d10a2d4e41f9d6870ff656f3ce4\"" Mar 25 02:37:47.686089 containerd[1819]: time="2025-03-25T02:37:47.686076048Z" level=info msg="StartContainer for \"1f35fa9decb7831284427ca16bd6f48f2fe46d10a2d4e41f9d6870ff656f3ce4\"" Mar 25 02:37:47.686477 containerd[1819]: time="2025-03-25T02:37:47.686466043Z" level=info msg="connecting to shim 1f35fa9decb7831284427ca16bd6f48f2fe46d10a2d4e41f9d6870ff656f3ce4" address="unix:///run/containerd/s/5025f3bfb5a3766db0f250415c010d5609d5f069962aec1081183f52fac3e483" protocol=ttrpc version=3 Mar 25 02:37:47.708911 systemd[1]: Started cri-containerd-1f35fa9decb7831284427ca16bd6f48f2fe46d10a2d4e41f9d6870ff656f3ce4.scope - libcontainer container 1f35fa9decb7831284427ca16bd6f48f2fe46d10a2d4e41f9d6870ff656f3ce4. 
Mar 25 02:37:47.723812 containerd[1819]: time="2025-03-25T02:37:47.723757509Z" level=info msg="StartContainer for \"1f35fa9decb7831284427ca16bd6f48f2fe46d10a2d4e41f9d6870ff656f3ce4\" returns successfully" Mar 25 02:37:48.196325 kubelet[3188]: I0325 02:37:48.196245 3188 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-ccfc44587-rfhr8" podStartSLOduration=1.2715725629999999 podStartE2EDuration="4.196235929s" podCreationTimestamp="2025-03-25 02:37:44 +0000 UTC" firstStartedPulling="2025-03-25 02:37:44.755977851 +0000 UTC m=+5.655288533" lastFinishedPulling="2025-03-25 02:37:47.680641216 +0000 UTC m=+8.579951899" observedRunningTime="2025-03-25 02:37:48.196198889 +0000 UTC m=+9.095509573" watchObservedRunningTime="2025-03-25 02:37:48.196235929 +0000 UTC m=+9.095546610" Mar 25 02:37:50.497078 systemd[1]: Created slice kubepods-besteffort-pod102d1d72_b9c7_405d_8552_dd6c0c177f05.slice - libcontainer container kubepods-besteffort-pod102d1d72_b9c7_405d_8552_dd6c0c177f05.slice. Mar 25 02:37:50.517659 systemd[1]: Created slice kubepods-besteffort-pod16110074_b9b9_48ea_a866_5d7872e44c83.slice - libcontainer container kubepods-besteffort-pod16110074_b9b9_48ea_a866_5d7872e44c83.slice. 
Mar 25 02:37:50.525531 kubelet[3188]: I0325 02:37:50.525481 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/102d1d72-b9c7-405d-8552-dd6c0c177f05-tigera-ca-bundle\") pod \"calico-typha-5bd4cf546f-7svfs\" (UID: \"102d1d72-b9c7-405d-8552-dd6c0c177f05\") " pod="calico-system/calico-typha-5bd4cf546f-7svfs" Mar 25 02:37:50.525531 kubelet[3188]: I0325 02:37:50.525507 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xskt6\" (UniqueName: \"kubernetes.io/projected/102d1d72-b9c7-405d-8552-dd6c0c177f05-kube-api-access-xskt6\") pod \"calico-typha-5bd4cf546f-7svfs\" (UID: \"102d1d72-b9c7-405d-8552-dd6c0c177f05\") " pod="calico-system/calico-typha-5bd4cf546f-7svfs" Mar 25 02:37:50.525531 kubelet[3188]: I0325 02:37:50.525520 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/102d1d72-b9c7-405d-8552-dd6c0c177f05-typha-certs\") pod \"calico-typha-5bd4cf546f-7svfs\" (UID: \"102d1d72-b9c7-405d-8552-dd6c0c177f05\") " pod="calico-system/calico-typha-5bd4cf546f-7svfs" Mar 25 02:37:50.625998 kubelet[3188]: I0325 02:37:50.625935 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/16110074-b9b9-48ea-a866-5d7872e44c83-var-lib-calico\") pod \"calico-node-v7ltq\" (UID: \"16110074-b9b9-48ea-a866-5d7872e44c83\") " pod="calico-system/calico-node-v7ltq" Mar 25 02:37:50.626248 kubelet[3188]: I0325 02:37:50.626087 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/16110074-b9b9-48ea-a866-5d7872e44c83-node-certs\") pod \"calico-node-v7ltq\" (UID: \"16110074-b9b9-48ea-a866-5d7872e44c83\") " 
pod="calico-system/calico-node-v7ltq" Mar 25 02:37:50.626248 kubelet[3188]: I0325 02:37:50.626173 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/16110074-b9b9-48ea-a866-5d7872e44c83-var-run-calico\") pod \"calico-node-v7ltq\" (UID: \"16110074-b9b9-48ea-a866-5d7872e44c83\") " pod="calico-system/calico-node-v7ltq" Mar 25 02:37:50.626248 kubelet[3188]: I0325 02:37:50.626227 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/16110074-b9b9-48ea-a866-5d7872e44c83-cni-net-dir\") pod \"calico-node-v7ltq\" (UID: \"16110074-b9b9-48ea-a866-5d7872e44c83\") " pod="calico-system/calico-node-v7ltq" Mar 25 02:37:50.626588 kubelet[3188]: I0325 02:37:50.626306 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/16110074-b9b9-48ea-a866-5d7872e44c83-flexvol-driver-host\") pod \"calico-node-v7ltq\" (UID: \"16110074-b9b9-48ea-a866-5d7872e44c83\") " pod="calico-system/calico-node-v7ltq" Mar 25 02:37:50.626588 kubelet[3188]: I0325 02:37:50.626427 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/16110074-b9b9-48ea-a866-5d7872e44c83-policysync\") pod \"calico-node-v7ltq\" (UID: \"16110074-b9b9-48ea-a866-5d7872e44c83\") " pod="calico-system/calico-node-v7ltq" Mar 25 02:37:50.626867 kubelet[3188]: I0325 02:37:50.626656 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/16110074-b9b9-48ea-a866-5d7872e44c83-cni-log-dir\") pod \"calico-node-v7ltq\" (UID: \"16110074-b9b9-48ea-a866-5d7872e44c83\") " pod="calico-system/calico-node-v7ltq" Mar 25 02:37:50.627033 
kubelet[3188]: I0325 02:37:50.626845 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/16110074-b9b9-48ea-a866-5d7872e44c83-xtables-lock\") pod \"calico-node-v7ltq\" (UID: \"16110074-b9b9-48ea-a866-5d7872e44c83\") " pod="calico-system/calico-node-v7ltq" Mar 25 02:37:50.627033 kubelet[3188]: I0325 02:37:50.626966 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16110074-b9b9-48ea-a866-5d7872e44c83-tigera-ca-bundle\") pod \"calico-node-v7ltq\" (UID: \"16110074-b9b9-48ea-a866-5d7872e44c83\") " pod="calico-system/calico-node-v7ltq" Mar 25 02:37:50.627374 kubelet[3188]: I0325 02:37:50.627173 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/16110074-b9b9-48ea-a866-5d7872e44c83-lib-modules\") pod \"calico-node-v7ltq\" (UID: \"16110074-b9b9-48ea-a866-5d7872e44c83\") " pod="calico-system/calico-node-v7ltq" Mar 25 02:37:50.627374 kubelet[3188]: I0325 02:37:50.627308 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/16110074-b9b9-48ea-a866-5d7872e44c83-cni-bin-dir\") pod \"calico-node-v7ltq\" (UID: \"16110074-b9b9-48ea-a866-5d7872e44c83\") " pod="calico-system/calico-node-v7ltq" Mar 25 02:37:50.627712 kubelet[3188]: I0325 02:37:50.627416 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqng8\" (UniqueName: \"kubernetes.io/projected/16110074-b9b9-48ea-a866-5d7872e44c83-kube-api-access-sqng8\") pod \"calico-node-v7ltq\" (UID: \"16110074-b9b9-48ea-a866-5d7872e44c83\") " pod="calico-system/calico-node-v7ltq" Mar 25 02:37:50.658577 kubelet[3188]: E0325 02:37:50.658533 3188 pod_workers.go:1301] 
"Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-84cjd" podUID="8d6881ea-948c-42f2-b0a9-c85a672155d1" Mar 25 02:37:50.728248 kubelet[3188]: I0325 02:37:50.728228 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fw7m\" (UniqueName: \"kubernetes.io/projected/8d6881ea-948c-42f2-b0a9-c85a672155d1-kube-api-access-7fw7m\") pod \"csi-node-driver-84cjd\" (UID: \"8d6881ea-948c-42f2-b0a9-c85a672155d1\") " pod="calico-system/csi-node-driver-84cjd" Mar 25 02:37:50.728327 kubelet[3188]: I0325 02:37:50.728261 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8d6881ea-948c-42f2-b0a9-c85a672155d1-registration-dir\") pod \"csi-node-driver-84cjd\" (UID: \"8d6881ea-948c-42f2-b0a9-c85a672155d1\") " pod="calico-system/csi-node-driver-84cjd" Mar 25 02:37:50.728327 kubelet[3188]: I0325 02:37:50.728307 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8d6881ea-948c-42f2-b0a9-c85a672155d1-socket-dir\") pod \"csi-node-driver-84cjd\" (UID: \"8d6881ea-948c-42f2-b0a9-c85a672155d1\") " pod="calico-system/csi-node-driver-84cjd" Mar 25 02:37:50.728382 kubelet[3188]: I0325 02:37:50.728328 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/8d6881ea-948c-42f2-b0a9-c85a672155d1-varrun\") pod \"csi-node-driver-84cjd\" (UID: \"8d6881ea-948c-42f2-b0a9-c85a672155d1\") " pod="calico-system/csi-node-driver-84cjd" Mar 25 02:37:50.728427 kubelet[3188]: I0325 02:37:50.728418 3188 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d6881ea-948c-42f2-b0a9-c85a672155d1-kubelet-dir\") pod \"csi-node-driver-84cjd\" (UID: \"8d6881ea-948c-42f2-b0a9-c85a672155d1\") " pod="calico-system/csi-node-driver-84cjd" Mar 25 02:37:50.728558 kubelet[3188]: E0325 02:37:50.728551 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:50.728582 kubelet[3188]: W0325 02:37:50.728559 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:50.728582 kubelet[3188]: E0325 02:37:50.728567 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:37:50.728733 kubelet[3188]: E0325 02:37:50.728723 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:50.728733 kubelet[3188]: W0325 02:37:50.728732 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:50.728783 kubelet[3188]: E0325 02:37:50.728743 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:37:50.728877 kubelet[3188]: E0325 02:37:50.728871 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:50.728877 kubelet[3188]: W0325 02:37:50.728876 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:50.728920 kubelet[3188]: E0325 02:37:50.728882 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:37:50.729035 kubelet[3188]: E0325 02:37:50.729028 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:50.729059 kubelet[3188]: W0325 02:37:50.729035 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:50.729059 kubelet[3188]: E0325 02:37:50.729043 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 25 02:37:50.729131 kubelet[3188]: E0325 02:37:50.729125 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:37:50.729131 kubelet[3188]: W0325 02:37:50.729130 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:37:50.729177 kubelet[3188]: E0325 02:37:50.729136 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:37:50.801551 containerd[1819]: time="2025-03-25T02:37:50.801382518Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5bd4cf546f-7svfs,Uid:102d1d72-b9c7-405d-8552-dd6c0c177f05,Namespace:calico-system,Attempt:0,}"
Mar 25 02:37:50.809224 containerd[1819]: time="2025-03-25T02:37:50.809200317Z" level=info msg="connecting to shim 9a81a81544f352ef4445be84906d49d058763020bb81a0714dc3d9e6de316589" address="unix:///run/containerd/s/4aca9fc2d1afba8d8a9066e3823072e91f3d2b5c4307683e03b9c6ed7ceed460" namespace=k8s.io protocol=ttrpc version=3
Mar 25 02:37:50.820071 containerd[1819]: time="2025-03-25T02:37:50.820010901Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-v7ltq,Uid:16110074-b9b9-48ea-a866-5d7872e44c83,Namespace:calico-system,Attempt:0,}"
Mar 25 02:37:50.829771 systemd[1]: Started cri-containerd-9a81a81544f352ef4445be84906d49d058763020bb81a0714dc3d9e6de316589.scope - libcontainer container 9a81a81544f352ef4445be84906d49d058763020bb81a0714dc3d9e6de316589.
Mar 25 02:37:50.832819 containerd[1819]: time="2025-03-25T02:37:50.832796263Z" level=info msg="connecting to shim a7a9a9e61787f4ea212d42f7bcff30b7eb5b80c1af314a353b0a4b80d9ca4938" address="unix:///run/containerd/s/c9488742c006459f61853947d7314575ad34e18f2378c84f69f5c058b4e4c143" namespace=k8s.io protocol=ttrpc version=3
Mar 25 02:37:50.840862 systemd[1]: Started cri-containerd-a7a9a9e61787f4ea212d42f7bcff30b7eb5b80c1af314a353b0a4b80d9ca4938.scope - libcontainer container a7a9a9e61787f4ea212d42f7bcff30b7eb5b80c1af314a353b0a4b80d9ca4938.
Mar 25 02:37:50.851345 containerd[1819]: time="2025-03-25T02:37:50.851325047Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-v7ltq,Uid:16110074-b9b9-48ea-a866-5d7872e44c83,Namespace:calico-system,Attempt:0,} returns sandbox id \"a7a9a9e61787f4ea212d42f7bcff30b7eb5b80c1af314a353b0a4b80d9ca4938\""
Mar 25 02:37:50.851977 containerd[1819]: time="2025-03-25T02:37:50.851959619Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\""
Mar 25 02:37:50.856538 containerd[1819]: time="2025-03-25T02:37:50.856487534Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5bd4cf546f-7svfs,Uid:102d1d72-b9c7-405d-8552-dd6c0c177f05,Namespace:calico-system,Attempt:0,} returns sandbox id \"9a81a81544f352ef4445be84906d49d058763020bb81a0714dc3d9e6de316589\""
Mar 25 02:37:50.915124 kubelet[3188]: E0325 02:37:50.915010 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:37:50.915779 kubelet[3188]: E0325 02:37:50.915729 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:50.915779 kubelet[3188]: W0325 02:37:50.915767 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:50.916132 kubelet[3188]: E0325 02:37:50.915802 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:37:50.916476 kubelet[3188]: E0325 02:37:50.916430 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:50.916476 kubelet[3188]: W0325 02:37:50.916467 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:50.916827 kubelet[3188]: E0325 02:37:50.916503 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:37:50.917179 kubelet[3188]: E0325 02:37:50.917132 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:50.917179 kubelet[3188]: W0325 02:37:50.917170 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:50.917504 kubelet[3188]: E0325 02:37:50.917205 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:37:50.917838 kubelet[3188]: E0325 02:37:50.917794 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:50.917838 kubelet[3188]: W0325 02:37:50.917824 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:50.918159 kubelet[3188]: E0325 02:37:50.917854 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:37:50.918480 kubelet[3188]: E0325 02:37:50.918423 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:50.918480 kubelet[3188]: W0325 02:37:50.918461 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:50.918765 kubelet[3188]: E0325 02:37:50.918495 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:37:50.919204 kubelet[3188]: E0325 02:37:50.919155 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:50.919204 kubelet[3188]: W0325 02:37:50.919193 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:50.919493 kubelet[3188]: E0325 02:37:50.919226 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:37:50.919905 kubelet[3188]: E0325 02:37:50.919856 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:50.919905 kubelet[3188]: W0325 02:37:50.919895 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:50.920243 kubelet[3188]: E0325 02:37:50.919929 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:37:50.920570 kubelet[3188]: E0325 02:37:50.920528 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:50.920715 kubelet[3188]: W0325 02:37:50.920567 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:50.920715 kubelet[3188]: E0325 02:37:50.920602 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:37:50.921260 kubelet[3188]: E0325 02:37:50.921210 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:50.921260 kubelet[3188]: W0325 02:37:50.921249 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:50.921588 kubelet[3188]: E0325 02:37:50.921283 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:37:50.921988 kubelet[3188]: E0325 02:37:50.921940 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:50.921988 kubelet[3188]: W0325 02:37:50.921978 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:50.922294 kubelet[3188]: E0325 02:37:50.922013 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:37:50.922607 kubelet[3188]: E0325 02:37:50.922577 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:50.922734 kubelet[3188]: W0325 02:37:50.922606 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:50.922734 kubelet[3188]: E0325 02:37:50.922658 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:37:50.923276 kubelet[3188]: E0325 02:37:50.923228 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:50.923276 kubelet[3188]: W0325 02:37:50.923265 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:50.923587 kubelet[3188]: E0325 02:37:50.923300 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:37:50.923987 kubelet[3188]: E0325 02:37:50.923938 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:50.923987 kubelet[3188]: W0325 02:37:50.923976 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:50.924304 kubelet[3188]: E0325 02:37:50.924009 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:37:50.924622 kubelet[3188]: E0325 02:37:50.924592 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:50.924766 kubelet[3188]: W0325 02:37:50.924621 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:50.924766 kubelet[3188]: E0325 02:37:50.924692 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:37:50.925299 kubelet[3188]: E0325 02:37:50.925250 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:50.925299 kubelet[3188]: W0325 02:37:50.925288 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:50.925717 kubelet[3188]: E0325 02:37:50.925322 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:37:50.926034 kubelet[3188]: E0325 02:37:50.925976 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:50.926034 kubelet[3188]: W0325 02:37:50.926006 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:50.926034 kubelet[3188]: E0325 02:37:50.926036 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:37:50.926608 kubelet[3188]: E0325 02:37:50.926559 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:50.926608 kubelet[3188]: W0325 02:37:50.926588 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:50.926608 kubelet[3188]: E0325 02:37:50.926615 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:37:50.927290 kubelet[3188]: E0325 02:37:50.927246 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:50.927290 kubelet[3188]: W0325 02:37:50.927285 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:50.927573 kubelet[3188]: E0325 02:37:50.927314 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:37:50.927975 kubelet[3188]: E0325 02:37:50.927922 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:50.927975 kubelet[3188]: W0325 02:37:50.927961 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:50.928312 kubelet[3188]: E0325 02:37:50.927995 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:37:50.928538 kubelet[3188]: E0325 02:37:50.928503 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:50.928538 kubelet[3188]: W0325 02:37:50.928535 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:50.928882 kubelet[3188]: E0325 02:37:50.928566 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:37:50.929229 kubelet[3188]: E0325 02:37:50.929175 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:50.929229 kubelet[3188]: W0325 02:37:50.929218 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:50.929575 kubelet[3188]: E0325 02:37:50.929253 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:37:50.929956 kubelet[3188]: E0325 02:37:50.929904 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:50.929956 kubelet[3188]: W0325 02:37:50.929944 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:50.930221 kubelet[3188]: E0325 02:37:50.929977 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:37:51.084960 update_engine[1805]: I20250325 02:37:51.084699 1805 update_attempter.cc:509] Updating boot flags... 
Mar 25 02:37:51.124663 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (3862) Mar 25 02:37:51.156644 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (3863) Mar 25 02:37:51.231967 kubelet[3188]: E0325 02:37:51.231872 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:51.231967 kubelet[3188]: W0325 02:37:51.231924 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:51.231967 kubelet[3188]: E0325 02:37:51.231968 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:37:51.232575 kubelet[3188]: E0325 02:37:51.232528 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:51.232575 kubelet[3188]: W0325 02:37:51.232568 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:51.232960 kubelet[3188]: E0325 02:37:51.232603 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:37:51.233283 kubelet[3188]: E0325 02:37:51.233199 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:51.233283 kubelet[3188]: W0325 02:37:51.233238 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:51.233283 kubelet[3188]: E0325 02:37:51.233272 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:37:51.233950 kubelet[3188]: E0325 02:37:51.233865 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:51.233950 kubelet[3188]: W0325 02:37:51.233904 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:51.233950 kubelet[3188]: E0325 02:37:51.233939 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:37:51.234584 kubelet[3188]: E0325 02:37:51.234521 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:51.234584 kubelet[3188]: W0325 02:37:51.234560 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:51.234926 kubelet[3188]: E0325 02:37:51.234596 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:37:51.235253 kubelet[3188]: E0325 02:37:51.235204 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:51.235253 kubelet[3188]: W0325 02:37:51.235243 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:51.235655 kubelet[3188]: E0325 02:37:51.235279 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:37:51.235876 kubelet[3188]: E0325 02:37:51.235818 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:51.235876 kubelet[3188]: W0325 02:37:51.235846 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:51.235876 kubelet[3188]: E0325 02:37:51.235875 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:37:51.236501 kubelet[3188]: E0325 02:37:51.236446 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:51.236501 kubelet[3188]: W0325 02:37:51.236496 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:51.236818 kubelet[3188]: E0325 02:37:51.236548 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:37:51.237323 kubelet[3188]: E0325 02:37:51.237246 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:51.237323 kubelet[3188]: W0325 02:37:51.237283 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:51.237323 kubelet[3188]: E0325 02:37:51.237321 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:37:51.237940 kubelet[3188]: E0325 02:37:51.237852 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:51.237940 kubelet[3188]: W0325 02:37:51.237891 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:51.237940 kubelet[3188]: E0325 02:37:51.237925 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:37:51.238505 kubelet[3188]: E0325 02:37:51.238424 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:51.238505 kubelet[3188]: W0325 02:37:51.238453 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:51.238505 kubelet[3188]: E0325 02:37:51.238481 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:37:51.239097 kubelet[3188]: E0325 02:37:51.239009 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:51.239097 kubelet[3188]: W0325 02:37:51.239048 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:51.239097 kubelet[3188]: E0325 02:37:51.239083 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:37:51.239608 kubelet[3188]: E0325 02:37:51.239577 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:51.239608 kubelet[3188]: W0325 02:37:51.239606 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:51.239926 kubelet[3188]: E0325 02:37:51.239654 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:37:51.240288 kubelet[3188]: E0325 02:37:51.240251 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:51.240401 kubelet[3188]: W0325 02:37:51.240289 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:51.240401 kubelet[3188]: E0325 02:37:51.240325 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:37:51.240984 kubelet[3188]: E0325 02:37:51.240948 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:51.241101 kubelet[3188]: W0325 02:37:51.240986 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:51.241101 kubelet[3188]: E0325 02:37:51.241021 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:37:51.241680 kubelet[3188]: E0325 02:37:51.241584 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:51.241680 kubelet[3188]: W0325 02:37:51.241611 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:51.241680 kubelet[3188]: E0325 02:37:51.241680 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:37:51.242316 kubelet[3188]: E0325 02:37:51.242227 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:51.242316 kubelet[3188]: W0325 02:37:51.242266 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:51.242316 kubelet[3188]: E0325 02:37:51.242300 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:37:51.242938 kubelet[3188]: E0325 02:37:51.242885 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:51.242938 kubelet[3188]: W0325 02:37:51.242934 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:51.243326 kubelet[3188]: E0325 02:37:51.242969 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:37:51.243570 kubelet[3188]: E0325 02:37:51.243532 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:51.243570 kubelet[3188]: W0325 02:37:51.243563 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:51.243801 kubelet[3188]: E0325 02:37:51.243597 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:37:51.244317 kubelet[3188]: E0325 02:37:51.244229 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:37:51.244317 kubelet[3188]: W0325 02:37:51.244267 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:37:51.244658 kubelet[3188]: E0325 02:37:51.244300 3188 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:37:52.149670 kubelet[3188]: E0325 02:37:52.149510 3188 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-84cjd" podUID="8d6881ea-948c-42f2-b0a9-c85a672155d1" Mar 25 02:37:52.456507 containerd[1819]: time="2025-03-25T02:37:52.456451533Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:37:52.456717 containerd[1819]: time="2025-03-25T02:37:52.456693009Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=5364011" Mar 25 02:37:52.457099 containerd[1819]: time="2025-03-25T02:37:52.457089146Z" level=info msg="ImageCreate event name:\"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:37:52.457890 containerd[1819]: time="2025-03-25T02:37:52.457848861Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:37:52.458244 containerd[1819]: time="2025-03-25T02:37:52.458204634Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size \"6857075\" in 1.606219052s" Mar 25 02:37:52.458244 containerd[1819]: time="2025-03-25T02:37:52.458219567Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\"" Mar 25 02:37:52.458664 containerd[1819]: time="2025-03-25T02:37:52.458646156Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\"" Mar 25 02:37:52.459180 containerd[1819]: time="2025-03-25T02:37:52.459165341Z" level=info msg="CreateContainer within sandbox \"a7a9a9e61787f4ea212d42f7bcff30b7eb5b80c1af314a353b0a4b80d9ca4938\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 25 02:37:52.462436 containerd[1819]: time="2025-03-25T02:37:52.462424804Z" level=info msg="Container 0d0f48863e5af7875492191d5a8482e4ea7fc3974ac1364c9d301ddcd64ce0a0: CDI devices from CRI Config.CDIDevices: []" Mar 25 02:37:52.465881 containerd[1819]: time="2025-03-25T02:37:52.465841480Z" level=info msg="CreateContainer within sandbox \"a7a9a9e61787f4ea212d42f7bcff30b7eb5b80c1af314a353b0a4b80d9ca4938\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"0d0f48863e5af7875492191d5a8482e4ea7fc3974ac1364c9d301ddcd64ce0a0\"" Mar 25 02:37:52.466077 containerd[1819]: time="2025-03-25T02:37:52.466064780Z" level=info msg="StartContainer for \"0d0f48863e5af7875492191d5a8482e4ea7fc3974ac1364c9d301ddcd64ce0a0\"" Mar 25 02:37:52.466797 containerd[1819]: time="2025-03-25T02:37:52.466785592Z" level=info msg="connecting to shim 0d0f48863e5af7875492191d5a8482e4ea7fc3974ac1364c9d301ddcd64ce0a0" address="unix:///run/containerd/s/c9488742c006459f61853947d7314575ad34e18f2378c84f69f5c058b4e4c143" protocol=ttrpc version=3 Mar 25 02:37:52.486982 systemd[1]: Started cri-containerd-0d0f48863e5af7875492191d5a8482e4ea7fc3974ac1364c9d301ddcd64ce0a0.scope - libcontainer container 0d0f48863e5af7875492191d5a8482e4ea7fc3974ac1364c9d301ddcd64ce0a0. 
Mar 25 02:37:52.505547 containerd[1819]: time="2025-03-25T02:37:52.505525154Z" level=info msg="StartContainer for \"0d0f48863e5af7875492191d5a8482e4ea7fc3974ac1364c9d301ddcd64ce0a0\" returns successfully" Mar 25 02:37:52.509691 systemd[1]: cri-containerd-0d0f48863e5af7875492191d5a8482e4ea7fc3974ac1364c9d301ddcd64ce0a0.scope: Deactivated successfully. Mar 25 02:37:52.510992 containerd[1819]: time="2025-03-25T02:37:52.510947179Z" level=info msg="received exit event container_id:\"0d0f48863e5af7875492191d5a8482e4ea7fc3974ac1364c9d301ddcd64ce0a0\" id:\"0d0f48863e5af7875492191d5a8482e4ea7fc3974ac1364c9d301ddcd64ce0a0\" pid:3914 exited_at:{seconds:1742870272 nanos:510760710}" Mar 25 02:37:52.511040 containerd[1819]: time="2025-03-25T02:37:52.511022963Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0d0f48863e5af7875492191d5a8482e4ea7fc3974ac1364c9d301ddcd64ce0a0\" id:\"0d0f48863e5af7875492191d5a8482e4ea7fc3974ac1364c9d301ddcd64ce0a0\" pid:3914 exited_at:{seconds:1742870272 nanos:510760710}" Mar 25 02:37:52.520875 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0d0f48863e5af7875492191d5a8482e4ea7fc3974ac1364c9d301ddcd64ce0a0-rootfs.mount: Deactivated successfully. 
Mar 25 02:37:54.149279 kubelet[3188]: E0325 02:37:54.149142 3188 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-84cjd" podUID="8d6881ea-948c-42f2-b0a9-c85a672155d1" Mar 25 02:37:55.164173 containerd[1819]: time="2025-03-25T02:37:55.164139053Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:37:55.164391 containerd[1819]: time="2025-03-25T02:37:55.164366701Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.2: active requests=0, bytes read=30414075" Mar 25 02:37:55.164751 containerd[1819]: time="2025-03-25T02:37:55.164707112Z" level=info msg="ImageCreate event name:\"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:37:55.165631 containerd[1819]: time="2025-03-25T02:37:55.165585501Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:37:55.165988 containerd[1819]: time="2025-03-25T02:37:55.165976029Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.2\" with image id \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\", size \"31907171\" in 2.707312654s" Mar 25 02:37:55.166024 containerd[1819]: time="2025-03-25T02:37:55.165991113Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\" returns image reference 
\"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\"" Mar 25 02:37:55.166477 containerd[1819]: time="2025-03-25T02:37:55.166463929Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\"" Mar 25 02:37:55.169463 containerd[1819]: time="2025-03-25T02:37:55.169444546Z" level=info msg="CreateContainer within sandbox \"9a81a81544f352ef4445be84906d49d058763020bb81a0714dc3d9e6de316589\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 25 02:37:55.172254 containerd[1819]: time="2025-03-25T02:37:55.172239808Z" level=info msg="Container d27a06727ed437b45fdeb63da62f856f535df7f7b68bf63afa7defd6713686e6: CDI devices from CRI Config.CDIDevices: []" Mar 25 02:37:55.174826 containerd[1819]: time="2025-03-25T02:37:55.174782841Z" level=info msg="CreateContainer within sandbox \"9a81a81544f352ef4445be84906d49d058763020bb81a0714dc3d9e6de316589\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"d27a06727ed437b45fdeb63da62f856f535df7f7b68bf63afa7defd6713686e6\"" Mar 25 02:37:55.175024 containerd[1819]: time="2025-03-25T02:37:55.175012682Z" level=info msg="StartContainer for \"d27a06727ed437b45fdeb63da62f856f535df7f7b68bf63afa7defd6713686e6\"" Mar 25 02:37:55.175532 containerd[1819]: time="2025-03-25T02:37:55.175521234Z" level=info msg="connecting to shim d27a06727ed437b45fdeb63da62f856f535df7f7b68bf63afa7defd6713686e6" address="unix:///run/containerd/s/4aca9fc2d1afba8d8a9066e3823072e91f3d2b5c4307683e03b9c6ed7ceed460" protocol=ttrpc version=3 Mar 25 02:37:55.199931 systemd[1]: Started cri-containerd-d27a06727ed437b45fdeb63da62f856f535df7f7b68bf63afa7defd6713686e6.scope - libcontainer container d27a06727ed437b45fdeb63da62f856f535df7f7b68bf63afa7defd6713686e6. 
Mar 25 02:37:55.232432 containerd[1819]: time="2025-03-25T02:37:55.232407331Z" level=info msg="StartContainer for \"d27a06727ed437b45fdeb63da62f856f535df7f7b68bf63afa7defd6713686e6\" returns successfully" Mar 25 02:37:56.149194 kubelet[3188]: E0325 02:37:56.149090 3188 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-84cjd" podUID="8d6881ea-948c-42f2-b0a9-c85a672155d1" Mar 25 02:37:56.218032 kubelet[3188]: I0325 02:37:56.217931 3188 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5bd4cf546f-7svfs" podStartSLOduration=1.908516949 podStartE2EDuration="6.217876906s" podCreationTimestamp="2025-03-25 02:37:50 +0000 UTC" firstStartedPulling="2025-03-25 02:37:50.857006318 +0000 UTC m=+11.756317001" lastFinishedPulling="2025-03-25 02:37:55.166366275 +0000 UTC m=+16.065676958" observedRunningTime="2025-03-25 02:37:56.21783866 +0000 UTC m=+17.117149397" watchObservedRunningTime="2025-03-25 02:37:56.217876906 +0000 UTC m=+17.117187619" Mar 25 02:37:58.148838 kubelet[3188]: E0325 02:37:58.148805 3188 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-84cjd" podUID="8d6881ea-948c-42f2-b0a9-c85a672155d1" Mar 25 02:37:58.848362 containerd[1819]: time="2025-03-25T02:37:58.848332347Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:37:58.848571 containerd[1819]: time="2025-03-25T02:37:58.848522844Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=97781477" Mar 
25 02:37:58.848797 containerd[1819]: time="2025-03-25T02:37:58.848784758Z" level=info msg="ImageCreate event name:\"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:37:58.849701 containerd[1819]: time="2025-03-25T02:37:58.849689982Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:37:58.850063 containerd[1819]: time="2025-03-25T02:37:58.850050346Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"99274581\" in 3.683568614s" Mar 25 02:37:58.850088 containerd[1819]: time="2025-03-25T02:37:58.850065932Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\"" Mar 25 02:37:58.850948 containerd[1819]: time="2025-03-25T02:37:58.850937324Z" level=info msg="CreateContainer within sandbox \"a7a9a9e61787f4ea212d42f7bcff30b7eb5b80c1af314a353b0a4b80d9ca4938\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 25 02:37:58.854000 containerd[1819]: time="2025-03-25T02:37:58.853984843Z" level=info msg="Container 033ee3654c7bea15d2de09c8da89c2ada1902398c2d6c13763147c64e1a42666: CDI devices from CRI Config.CDIDevices: []" Mar 25 02:37:58.857852 containerd[1819]: time="2025-03-25T02:37:58.857837027Z" level=info msg="CreateContainer within sandbox \"a7a9a9e61787f4ea212d42f7bcff30b7eb5b80c1af314a353b0a4b80d9ca4938\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id 
\"033ee3654c7bea15d2de09c8da89c2ada1902398c2d6c13763147c64e1a42666\"" Mar 25 02:37:58.858114 containerd[1819]: time="2025-03-25T02:37:58.858101899Z" level=info msg="StartContainer for \"033ee3654c7bea15d2de09c8da89c2ada1902398c2d6c13763147c64e1a42666\"" Mar 25 02:37:58.858838 containerd[1819]: time="2025-03-25T02:37:58.858827034Z" level=info msg="connecting to shim 033ee3654c7bea15d2de09c8da89c2ada1902398c2d6c13763147c64e1a42666" address="unix:///run/containerd/s/c9488742c006459f61853947d7314575ad34e18f2378c84f69f5c058b4e4c143" protocol=ttrpc version=3 Mar 25 02:37:58.880936 systemd[1]: Started cri-containerd-033ee3654c7bea15d2de09c8da89c2ada1902398c2d6c13763147c64e1a42666.scope - libcontainer container 033ee3654c7bea15d2de09c8da89c2ada1902398c2d6c13763147c64e1a42666. Mar 25 02:37:58.904281 containerd[1819]: time="2025-03-25T02:37:58.904227035Z" level=info msg="StartContainer for \"033ee3654c7bea15d2de09c8da89c2ada1902398c2d6c13763147c64e1a42666\" returns successfully" Mar 25 02:37:59.487360 systemd[1]: cri-containerd-033ee3654c7bea15d2de09c8da89c2ada1902398c2d6c13763147c64e1a42666.scope: Deactivated successfully. Mar 25 02:37:59.487507 systemd[1]: cri-containerd-033ee3654c7bea15d2de09c8da89c2ada1902398c2d6c13763147c64e1a42666.scope: Consumed 497ms CPU time, 176.3M memory peak, 154M written to disk. 
Mar 25 02:37:59.487888 containerd[1819]: time="2025-03-25T02:37:59.487873740Z" level=info msg="received exit event container_id:\"033ee3654c7bea15d2de09c8da89c2ada1902398c2d6c13763147c64e1a42666\" id:\"033ee3654c7bea15d2de09c8da89c2ada1902398c2d6c13763147c64e1a42666\" pid:4025 exited_at:{seconds:1742870279 nanos:487763109}" Mar 25 02:37:59.487924 containerd[1819]: time="2025-03-25T02:37:59.487905819Z" level=info msg="TaskExit event in podsandbox handler container_id:\"033ee3654c7bea15d2de09c8da89c2ada1902398c2d6c13763147c64e1a42666\" id:\"033ee3654c7bea15d2de09c8da89c2ada1902398c2d6c13763147c64e1a42666\" pid:4025 exited_at:{seconds:1742870279 nanos:487763109}" Mar 25 02:37:59.497927 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-033ee3654c7bea15d2de09c8da89c2ada1902398c2d6c13763147c64e1a42666-rootfs.mount: Deactivated successfully. Mar 25 02:37:59.589172 kubelet[3188]: I0325 02:37:59.589077 3188 kubelet_node_status.go:502] "Fast updating node status as it just became ready" Mar 25 02:37:59.643369 systemd[1]: Created slice kubepods-burstable-pode23e1cc9_f062_44e4_9ac6_0c4070ca0b2b.slice - libcontainer container kubepods-burstable-pode23e1cc9_f062_44e4_9ac6_0c4070ca0b2b.slice. Mar 25 02:37:59.654513 systemd[1]: Created slice kubepods-besteffort-pode5e02bef_e0da_44bf_b287_659021a0e138.slice - libcontainer container kubepods-besteffort-pode5e02bef_e0da_44bf_b287_659021a0e138.slice. Mar 25 02:37:59.661242 systemd[1]: Created slice kubepods-burstable-pod868470d1_1d68_4476_8397_6e99241d4307.slice - libcontainer container kubepods-burstable-pod868470d1_1d68_4476_8397_6e99241d4307.slice. Mar 25 02:37:59.665465 systemd[1]: Created slice kubepods-besteffort-pod77e2f28d_fe85_4b5f_acde_24f4c51f84a7.slice - libcontainer container kubepods-besteffort-pod77e2f28d_fe85_4b5f_acde_24f4c51f84a7.slice. 
Mar 25 02:37:59.668743 systemd[1]: Created slice kubepods-besteffort-pod67549bf9_1d7c_4c56_b80d_a3bdf8cbbe58.slice - libcontainer container kubepods-besteffort-pod67549bf9_1d7c_4c56_b80d_a3bdf8cbbe58.slice. Mar 25 02:37:59.808083 kubelet[3188]: I0325 02:37:59.807953 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e23e1cc9-f062-44e4-9ac6-0c4070ca0b2b-config-volume\") pod \"coredns-668d6bf9bc-56n2d\" (UID: \"e23e1cc9-f062-44e4-9ac6-0c4070ca0b2b\") " pod="kube-system/coredns-668d6bf9bc-56n2d" Mar 25 02:37:59.808083 kubelet[3188]: I0325 02:37:59.808070 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc4sm\" (UniqueName: \"kubernetes.io/projected/67549bf9-1d7c-4c56-b80d-a3bdf8cbbe58-kube-api-access-sc4sm\") pod \"calico-apiserver-6785d4dd64-xght8\" (UID: \"67549bf9-1d7c-4c56-b80d-a3bdf8cbbe58\") " pod="calico-apiserver/calico-apiserver-6785d4dd64-xght8" Mar 25 02:37:59.808579 kubelet[3188]: I0325 02:37:59.808132 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk8xf\" (UniqueName: \"kubernetes.io/projected/e23e1cc9-f062-44e4-9ac6-0c4070ca0b2b-kube-api-access-qk8xf\") pod \"coredns-668d6bf9bc-56n2d\" (UID: \"e23e1cc9-f062-44e4-9ac6-0c4070ca0b2b\") " pod="kube-system/coredns-668d6bf9bc-56n2d" Mar 25 02:37:59.808579 kubelet[3188]: I0325 02:37:59.808195 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqxq6\" (UniqueName: \"kubernetes.io/projected/868470d1-1d68-4476-8397-6e99241d4307-kube-api-access-sqxq6\") pod \"coredns-668d6bf9bc-4pvm2\" (UID: \"868470d1-1d68-4476-8397-6e99241d4307\") " pod="kube-system/coredns-668d6bf9bc-4pvm2" Mar 25 02:37:59.808579 kubelet[3188]: I0325 02:37:59.808249 3188 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/67549bf9-1d7c-4c56-b80d-a3bdf8cbbe58-calico-apiserver-certs\") pod \"calico-apiserver-6785d4dd64-xght8\" (UID: \"67549bf9-1d7c-4c56-b80d-a3bdf8cbbe58\") " pod="calico-apiserver/calico-apiserver-6785d4dd64-xght8" Mar 25 02:37:59.808579 kubelet[3188]: I0325 02:37:59.808299 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5e02bef-e0da-44bf-b287-659021a0e138-tigera-ca-bundle\") pod \"calico-kube-controllers-5fcdbb9f57-psbcp\" (UID: \"e5e02bef-e0da-44bf-b287-659021a0e138\") " pod="calico-system/calico-kube-controllers-5fcdbb9f57-psbcp" Mar 25 02:37:59.808579 kubelet[3188]: I0325 02:37:59.808346 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/868470d1-1d68-4476-8397-6e99241d4307-config-volume\") pod \"coredns-668d6bf9bc-4pvm2\" (UID: \"868470d1-1d68-4476-8397-6e99241d4307\") " pod="kube-system/coredns-668d6bf9bc-4pvm2" Mar 25 02:37:59.809189 kubelet[3188]: I0325 02:37:59.808402 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2d9t\" (UniqueName: \"kubernetes.io/projected/e5e02bef-e0da-44bf-b287-659021a0e138-kube-api-access-l2d9t\") pod \"calico-kube-controllers-5fcdbb9f57-psbcp\" (UID: \"e5e02bef-e0da-44bf-b287-659021a0e138\") " pod="calico-system/calico-kube-controllers-5fcdbb9f57-psbcp" Mar 25 02:37:59.809189 kubelet[3188]: I0325 02:37:59.808452 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/77e2f28d-fe85-4b5f-acde-24f4c51f84a7-calico-apiserver-certs\") pod \"calico-apiserver-6785d4dd64-6p69w\" (UID: 
\"77e2f28d-fe85-4b5f-acde-24f4c51f84a7\") " pod="calico-apiserver/calico-apiserver-6785d4dd64-6p69w" Mar 25 02:37:59.809189 kubelet[3188]: I0325 02:37:59.808507 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtzd4\" (UniqueName: \"kubernetes.io/projected/77e2f28d-fe85-4b5f-acde-24f4c51f84a7-kube-api-access-jtzd4\") pod \"calico-apiserver-6785d4dd64-6p69w\" (UID: \"77e2f28d-fe85-4b5f-acde-24f4c51f84a7\") " pod="calico-apiserver/calico-apiserver-6785d4dd64-6p69w" Mar 25 02:37:59.961871 containerd[1819]: time="2025-03-25T02:37:59.961838020Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-56n2d,Uid:e23e1cc9-f062-44e4-9ac6-0c4070ca0b2b,Namespace:kube-system,Attempt:0,}" Mar 25 02:37:59.962265 containerd[1819]: time="2025-03-25T02:37:59.961837941Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5fcdbb9f57-psbcp,Uid:e5e02bef-e0da-44bf-b287-659021a0e138,Namespace:calico-system,Attempt:0,}" Mar 25 02:37:59.964291 containerd[1819]: time="2025-03-25T02:37:59.964264798Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-4pvm2,Uid:868470d1-1d68-4476-8397-6e99241d4307,Namespace:kube-system,Attempt:0,}" Mar 25 02:37:59.967855 containerd[1819]: time="2025-03-25T02:37:59.967804353Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6785d4dd64-6p69w,Uid:77e2f28d-fe85-4b5f-acde-24f4c51f84a7,Namespace:calico-apiserver,Attempt:0,}" Mar 25 02:37:59.971353 containerd[1819]: time="2025-03-25T02:37:59.971332028Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6785d4dd64-xght8,Uid:67549bf9-1d7c-4c56-b80d-a3bdf8cbbe58,Namespace:calico-apiserver,Attempt:0,}" Mar 25 02:38:00.152707 systemd[1]: Created slice kubepods-besteffort-pod8d6881ea_948c_42f2_b0a9_c85a672155d1.slice - libcontainer container kubepods-besteffort-pod8d6881ea_948c_42f2_b0a9_c85a672155d1.slice. 
Mar 25 02:38:00.154440 containerd[1819]: time="2025-03-25T02:38:00.154401818Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-84cjd,Uid:8d6881ea-948c-42f2-b0a9-c85a672155d1,Namespace:calico-system,Attempt:0,}" Mar 25 02:38:00.157516 containerd[1819]: time="2025-03-25T02:38:00.157477895Z" level=error msg="Failed to destroy network for sandbox \"85093f30e1a0320a4b2a1cae27be3caf03fc7c8ab53ad88999e4c50d1efc3a80\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:38:00.157956 containerd[1819]: time="2025-03-25T02:38:00.157936456Z" level=error msg="Failed to destroy network for sandbox \"551e9a84977bad385963287cdedc1d81e674c0ecbbcb63aea61452f98867d6a9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:38:00.158095 containerd[1819]: time="2025-03-25T02:38:00.157965886Z" level=error msg="Failed to destroy network for sandbox \"294225ff5223340dfbdacc166d417f4b18b5381163e3ac73c6a9779bb17bdc37\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:38:00.159180 containerd[1819]: time="2025-03-25T02:38:00.158949793Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-4pvm2,Uid:868470d1-1d68-4476-8397-6e99241d4307,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"85093f30e1a0320a4b2a1cae27be3caf03fc7c8ab53ad88999e4c50d1efc3a80\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 
02:38:00.159309 containerd[1819]: time="2025-03-25T02:38:00.159286900Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-56n2d,Uid:e23e1cc9-f062-44e4-9ac6-0c4070ca0b2b,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"551e9a84977bad385963287cdedc1d81e674c0ecbbcb63aea61452f98867d6a9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:38:00.159354 kubelet[3188]: E0325 02:38:00.159299 3188 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"85093f30e1a0320a4b2a1cae27be3caf03fc7c8ab53ad88999e4c50d1efc3a80\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:38:00.159354 kubelet[3188]: E0325 02:38:00.159347 3188 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"85093f30e1a0320a4b2a1cae27be3caf03fc7c8ab53ad88999e4c50d1efc3a80\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-4pvm2" Mar 25 02:38:00.159403 kubelet[3188]: E0325 02:38:00.159361 3188 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"85093f30e1a0320a4b2a1cae27be3caf03fc7c8ab53ad88999e4c50d1efc3a80\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-4pvm2" Mar 
25 02:38:00.159403 kubelet[3188]: E0325 02:38:00.159372 3188 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"551e9a84977bad385963287cdedc1d81e674c0ecbbcb63aea61452f98867d6a9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:38:00.159403 kubelet[3188]: E0325 02:38:00.159389 3188 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-4pvm2_kube-system(868470d1-1d68-4476-8397-6e99241d4307)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-4pvm2_kube-system(868470d1-1d68-4476-8397-6e99241d4307)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"85093f30e1a0320a4b2a1cae27be3caf03fc7c8ab53ad88999e4c50d1efc3a80\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-4pvm2" podUID="868470d1-1d68-4476-8397-6e99241d4307" Mar 25 02:38:00.159477 kubelet[3188]: E0325 02:38:00.159405 3188 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"551e9a84977bad385963287cdedc1d81e674c0ecbbcb63aea61452f98867d6a9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-56n2d" Mar 25 02:38:00.159477 kubelet[3188]: E0325 02:38:00.159417 3188 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"551e9a84977bad385963287cdedc1d81e674c0ecbbcb63aea61452f98867d6a9\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-56n2d" Mar 25 02:38:00.159477 kubelet[3188]: E0325 02:38:00.159444 3188 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-56n2d_kube-system(e23e1cc9-f062-44e4-9ac6-0c4070ca0b2b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-56n2d_kube-system(e23e1cc9-f062-44e4-9ac6-0c4070ca0b2b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"551e9a84977bad385963287cdedc1d81e674c0ecbbcb63aea61452f98867d6a9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-56n2d" podUID="e23e1cc9-f062-44e4-9ac6-0c4070ca0b2b" Mar 25 02:38:00.159708 containerd[1819]: time="2025-03-25T02:38:00.159690653Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5fcdbb9f57-psbcp,Uid:e5e02bef-e0da-44bf-b287-659021a0e138,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"294225ff5223340dfbdacc166d417f4b18b5381163e3ac73c6a9779bb17bdc37\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:38:00.159762 kubelet[3188]: E0325 02:38:00.159753 3188 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"294225ff5223340dfbdacc166d417f4b18b5381163e3ac73c6a9779bb17bdc37\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Mar 25 02:38:00.159787 kubelet[3188]: E0325 02:38:00.159768 3188 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"294225ff5223340dfbdacc166d417f4b18b5381163e3ac73c6a9779bb17bdc37\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5fcdbb9f57-psbcp" Mar 25 02:38:00.159787 kubelet[3188]: E0325 02:38:00.159778 3188 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"294225ff5223340dfbdacc166d417f4b18b5381163e3ac73c6a9779bb17bdc37\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5fcdbb9f57-psbcp" Mar 25 02:38:00.159823 kubelet[3188]: E0325 02:38:00.159795 3188 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5fcdbb9f57-psbcp_calico-system(e5e02bef-e0da-44bf-b287-659021a0e138)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5fcdbb9f57-psbcp_calico-system(e5e02bef-e0da-44bf-b287-659021a0e138)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"294225ff5223340dfbdacc166d417f4b18b5381163e3ac73c6a9779bb17bdc37\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5fcdbb9f57-psbcp" podUID="e5e02bef-e0da-44bf-b287-659021a0e138" Mar 25 02:38:00.161130 containerd[1819]: 
time="2025-03-25T02:38:00.161108716Z" level=error msg="Failed to destroy network for sandbox \"15e6505112097643c92c85fcf0a4f41025abe04568961326c4b2b8a8e4a06da9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:38:00.161273 containerd[1819]: time="2025-03-25T02:38:00.161258862Z" level=error msg="Failed to destroy network for sandbox \"5ef1ea39ab7caf63da693f6b3f4eb699ee089c6d96bd6c060745d80de65fffbd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:38:00.161476 containerd[1819]: time="2025-03-25T02:38:00.161461445Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6785d4dd64-xght8,Uid:67549bf9-1d7c-4c56-b80d-a3bdf8cbbe58,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"15e6505112097643c92c85fcf0a4f41025abe04568961326c4b2b8a8e4a06da9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:38:00.161580 kubelet[3188]: E0325 02:38:00.161563 3188 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"15e6505112097643c92c85fcf0a4f41025abe04568961326c4b2b8a8e4a06da9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:38:00.161608 kubelet[3188]: E0325 02:38:00.161594 3188 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"15e6505112097643c92c85fcf0a4f41025abe04568961326c4b2b8a8e4a06da9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6785d4dd64-xght8" Mar 25 02:38:00.161639 kubelet[3188]: E0325 02:38:00.161608 3188 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"15e6505112097643c92c85fcf0a4f41025abe04568961326c4b2b8a8e4a06da9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6785d4dd64-xght8" Mar 25 02:38:00.161662 kubelet[3188]: E0325 02:38:00.161637 3188 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6785d4dd64-xght8_calico-apiserver(67549bf9-1d7c-4c56-b80d-a3bdf8cbbe58)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6785d4dd64-xght8_calico-apiserver(67549bf9-1d7c-4c56-b80d-a3bdf8cbbe58)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"15e6505112097643c92c85fcf0a4f41025abe04568961326c4b2b8a8e4a06da9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6785d4dd64-xght8" podUID="67549bf9-1d7c-4c56-b80d-a3bdf8cbbe58" Mar 25 02:38:00.161752 containerd[1819]: time="2025-03-25T02:38:00.161709074Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6785d4dd64-6p69w,Uid:77e2f28d-fe85-4b5f-acde-24f4c51f84a7,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"5ef1ea39ab7caf63da693f6b3f4eb699ee089c6d96bd6c060745d80de65fffbd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:38:00.161797 kubelet[3188]: E0325 02:38:00.161778 3188 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5ef1ea39ab7caf63da693f6b3f4eb699ee089c6d96bd6c060745d80de65fffbd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:38:00.161826 kubelet[3188]: E0325 02:38:00.161796 3188 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5ef1ea39ab7caf63da693f6b3f4eb699ee089c6d96bd6c060745d80de65fffbd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6785d4dd64-6p69w" Mar 25 02:38:00.161826 kubelet[3188]: E0325 02:38:00.161808 3188 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5ef1ea39ab7caf63da693f6b3f4eb699ee089c6d96bd6c060745d80de65fffbd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6785d4dd64-6p69w" Mar 25 02:38:00.161866 kubelet[3188]: E0325 02:38:00.161826 3188 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6785d4dd64-6p69w_calico-apiserver(77e2f28d-fe85-4b5f-acde-24f4c51f84a7)\" with CreatePodSandboxError: \"Failed to create 
sandbox for pod \\\"calico-apiserver-6785d4dd64-6p69w_calico-apiserver(77e2f28d-fe85-4b5f-acde-24f4c51f84a7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5ef1ea39ab7caf63da693f6b3f4eb699ee089c6d96bd6c060745d80de65fffbd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6785d4dd64-6p69w" podUID="77e2f28d-fe85-4b5f-acde-24f4c51f84a7" Mar 25 02:38:00.181344 containerd[1819]: time="2025-03-25T02:38:00.181317021Z" level=error msg="Failed to destroy network for sandbox \"b085e73f0534b8eea024626b5f9e63d0b9e68c57b962ea9a1e0822db77cb2c9d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:38:00.181786 containerd[1819]: time="2025-03-25T02:38:00.181742228Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-84cjd,Uid:8d6881ea-948c-42f2-b0a9-c85a672155d1,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b085e73f0534b8eea024626b5f9e63d0b9e68c57b962ea9a1e0822db77cb2c9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:38:00.181903 kubelet[3188]: E0325 02:38:00.181879 3188 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b085e73f0534b8eea024626b5f9e63d0b9e68c57b962ea9a1e0822db77cb2c9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:38:00.181939 kubelet[3188]: E0325 
02:38:00.181920 3188 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b085e73f0534b8eea024626b5f9e63d0b9e68c57b962ea9a1e0822db77cb2c9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-84cjd" Mar 25 02:38:00.181939 kubelet[3188]: E0325 02:38:00.181933 3188 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b085e73f0534b8eea024626b5f9e63d0b9e68c57b962ea9a1e0822db77cb2c9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-84cjd" Mar 25 02:38:00.181980 kubelet[3188]: E0325 02:38:00.181958 3188 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-84cjd_calico-system(8d6881ea-948c-42f2-b0a9-c85a672155d1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-84cjd_calico-system(8d6881ea-948c-42f2-b0a9-c85a672155d1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b085e73f0534b8eea024626b5f9e63d0b9e68c57b962ea9a1e0822db77cb2c9d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-84cjd" podUID="8d6881ea-948c-42f2-b0a9-c85a672155d1" Mar 25 02:38:00.214926 containerd[1819]: time="2025-03-25T02:38:00.214905784Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\"" Mar 25 02:38:00.921511 systemd[1]: run-netns-cni\x2da24cd546\x2db766\x2dfaa6\x2d12f6\x2d1400c170d0a5.mount: Deactivated 
successfully. Mar 25 02:38:00.921564 systemd[1]: run-netns-cni\x2d38d81dbf\x2d8002\x2db80c\x2d2f4a\x2d3f76a30ce338.mount: Deactivated successfully. Mar 25 02:38:00.921600 systemd[1]: run-netns-cni\x2dcd694c90\x2d8d72\x2d0b75\x2db75d\x2d067d79e1bd29.mount: Deactivated successfully. Mar 25 02:38:05.425699 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3449571899.mount: Deactivated successfully. Mar 25 02:38:05.444196 containerd[1819]: time="2025-03-25T02:38:05.444146103Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:38:05.444423 containerd[1819]: time="2025-03-25T02:38:05.444369006Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes read=142241445" Mar 25 02:38:05.444699 containerd[1819]: time="2025-03-25T02:38:05.444650611Z" level=info msg="ImageCreate event name:\"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:38:05.445396 containerd[1819]: time="2025-03-25T02:38:05.445386130Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:38:05.445738 containerd[1819]: time="2025-03-25T02:38:05.445726925Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"142241307\" in 5.230797252s" Mar 25 02:38:05.445768 containerd[1819]: time="2025-03-25T02:38:05.445743049Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference 
\"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\"" Mar 25 02:38:05.448995 containerd[1819]: time="2025-03-25T02:38:05.448980979Z" level=info msg="CreateContainer within sandbox \"a7a9a9e61787f4ea212d42f7bcff30b7eb5b80c1af314a353b0a4b80d9ca4938\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 25 02:38:05.452861 containerd[1819]: time="2025-03-25T02:38:05.452819381Z" level=info msg="Container 6091df3589d8c62afe7769ac0707ff9c581a0d6baa929d76ed2da64aa5fb00ca: CDI devices from CRI Config.CDIDevices: []" Mar 25 02:38:05.456705 containerd[1819]: time="2025-03-25T02:38:05.456668549Z" level=info msg="CreateContainer within sandbox \"a7a9a9e61787f4ea212d42f7bcff30b7eb5b80c1af314a353b0a4b80d9ca4938\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"6091df3589d8c62afe7769ac0707ff9c581a0d6baa929d76ed2da64aa5fb00ca\"" Mar 25 02:38:05.456936 containerd[1819]: time="2025-03-25T02:38:05.456877284Z" level=info msg="StartContainer for \"6091df3589d8c62afe7769ac0707ff9c581a0d6baa929d76ed2da64aa5fb00ca\"" Mar 25 02:38:05.457670 containerd[1819]: time="2025-03-25T02:38:05.457614438Z" level=info msg="connecting to shim 6091df3589d8c62afe7769ac0707ff9c581a0d6baa929d76ed2da64aa5fb00ca" address="unix:///run/containerd/s/c9488742c006459f61853947d7314575ad34e18f2378c84f69f5c058b4e4c143" protocol=ttrpc version=3 Mar 25 02:38:05.485936 systemd[1]: Started cri-containerd-6091df3589d8c62afe7769ac0707ff9c581a0d6baa929d76ed2da64aa5fb00ca.scope - libcontainer container 6091df3589d8c62afe7769ac0707ff9c581a0d6baa929d76ed2da64aa5fb00ca. Mar 25 02:38:05.511611 containerd[1819]: time="2025-03-25T02:38:05.511555687Z" level=info msg="StartContainer for \"6091df3589d8c62afe7769ac0707ff9c581a0d6baa929d76ed2da64aa5fb00ca\" returns successfully" Mar 25 02:38:05.572322 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Mar 25 02:38:05.572379 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . 
All Rights Reserved. Mar 25 02:38:06.276429 kubelet[3188]: I0325 02:38:06.276273 3188 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-v7ltq" podStartSLOduration=1.68200039 podStartE2EDuration="16.276236037s" podCreationTimestamp="2025-03-25 02:37:50 +0000 UTC" firstStartedPulling="2025-03-25 02:37:50.851833358 +0000 UTC m=+11.751144041" lastFinishedPulling="2025-03-25 02:38:05.446069006 +0000 UTC m=+26.345379688" observedRunningTime="2025-03-25 02:38:06.276077088 +0000 UTC m=+27.175387902" watchObservedRunningTime="2025-03-25 02:38:06.276236037 +0000 UTC m=+27.175546787" Mar 25 02:38:06.991681 kernel: bpftool[4630]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 25 02:38:07.148484 systemd-networkd[1728]: vxlan.calico: Link UP Mar 25 02:38:07.148489 systemd-networkd[1728]: vxlan.calico: Gained carrier Mar 25 02:38:07.245111 kubelet[3188]: I0325 02:38:07.245049 3188 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 02:38:08.705881 systemd-networkd[1728]: vxlan.calico: Gained IPv6LL Mar 25 02:38:09.583308 kubelet[3188]: I0325 02:38:09.583193 3188 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 02:38:09.628180 containerd[1819]: time="2025-03-25T02:38:09.628122148Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6091df3589d8c62afe7769ac0707ff9c581a0d6baa929d76ed2da64aa5fb00ca\" id:\"c72813a4e74cf70a24586b36e643cc5e540d57cae5565a6ceb4fd77a9b0329a7\" pid:4751 exited_at:{seconds:1742870289 nanos:627786711}" Mar 25 02:38:09.671092 containerd[1819]: time="2025-03-25T02:38:09.671045119Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6091df3589d8c62afe7769ac0707ff9c581a0d6baa929d76ed2da64aa5fb00ca\" id:\"23f93d085283c4c9a76ad2c39225c1d954e3f1b6de0a30b381a2c4713f1aa5b2\" pid:4781 exited_at:{seconds:1742870289 nanos:670824955}" Mar 25 02:38:11.149234 containerd[1819]: time="2025-03-25T02:38:11.149149056Z" 
level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6785d4dd64-6p69w,Uid:77e2f28d-fe85-4b5f-acde-24f4c51f84a7,Namespace:calico-apiserver,Attempt:0,}" Mar 25 02:38:11.149805 containerd[1819]: time="2025-03-25T02:38:11.149150164Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6785d4dd64-xght8,Uid:67549bf9-1d7c-4c56-b80d-a3bdf8cbbe58,Namespace:calico-apiserver,Attempt:0,}" Mar 25 02:38:11.207486 systemd-networkd[1728]: cali85870de692e: Link UP Mar 25 02:38:11.207589 systemd-networkd[1728]: cali85870de692e: Gained carrier Mar 25 02:38:11.212134 containerd[1819]: 2025-03-25 02:38:11.168 [INFO][4798] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--6p69w-eth0 calico-apiserver-6785d4dd64- calico-apiserver 77e2f28d-fe85-4b5f-acde-24f4c51f84a7 695 0 2025-03-25 02:37:50 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6785d4dd64 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284.0.0-a-3a00d206eb calico-apiserver-6785d4dd64-6p69w eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali85870de692e [] []}} ContainerID="6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725" Namespace="calico-apiserver" Pod="calico-apiserver-6785d4dd64-6p69w" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--6p69w-" Mar 25 02:38:11.212134 containerd[1819]: 2025-03-25 02:38:11.168 [INFO][4798] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725" Namespace="calico-apiserver" Pod="calico-apiserver-6785d4dd64-6p69w" 
WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--6p69w-eth0" Mar 25 02:38:11.212134 containerd[1819]: 2025-03-25 02:38:11.184 [INFO][4844] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725" HandleID="k8s-pod-network.6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725" Workload="ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--6p69w-eth0" Mar 25 02:38:11.212329 containerd[1819]: 2025-03-25 02:38:11.189 [INFO][4844] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725" HandleID="k8s-pod-network.6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725" Workload="ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--6p69w-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00029ab60), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284.0.0-a-3a00d206eb", "pod":"calico-apiserver-6785d4dd64-6p69w", "timestamp":"2025-03-25 02:38:11.18402182 +0000 UTC"}, Hostname:"ci-4284.0.0-a-3a00d206eb", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 02:38:11.212329 containerd[1819]: 2025-03-25 02:38:11.189 [INFO][4844] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 02:38:11.212329 containerd[1819]: 2025-03-25 02:38:11.189 [INFO][4844] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 02:38:11.212329 containerd[1819]: 2025-03-25 02:38:11.189 [INFO][4844] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284.0.0-a-3a00d206eb' Mar 25 02:38:11.212329 containerd[1819]: 2025-03-25 02:38:11.190 [INFO][4844] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:11.212329 containerd[1819]: 2025-03-25 02:38:11.192 [INFO][4844] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:11.212329 containerd[1819]: 2025-03-25 02:38:11.195 [INFO][4844] ipam/ipam.go 489: Trying affinity for 192.168.14.64/26 host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:11.212329 containerd[1819]: 2025-03-25 02:38:11.197 [INFO][4844] ipam/ipam.go 155: Attempting to load block cidr=192.168.14.64/26 host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:11.212329 containerd[1819]: 2025-03-25 02:38:11.198 [INFO][4844] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.14.64/26 host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:11.212581 containerd[1819]: 2025-03-25 02:38:11.198 [INFO][4844] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.14.64/26 handle="k8s-pod-network.6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:11.212581 containerd[1819]: 2025-03-25 02:38:11.200 [INFO][4844] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725 Mar 25 02:38:11.212581 containerd[1819]: 2025-03-25 02:38:11.202 [INFO][4844] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.14.64/26 handle="k8s-pod-network.6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:11.212581 containerd[1819]: 2025-03-25 02:38:11.205 [INFO][4844] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.14.65/26] block=192.168.14.64/26 handle="k8s-pod-network.6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:11.212581 containerd[1819]: 2025-03-25 02:38:11.205 [INFO][4844] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.14.65/26] handle="k8s-pod-network.6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:11.212581 containerd[1819]: 2025-03-25 02:38:11.205 [INFO][4844] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 02:38:11.212581 containerd[1819]: 2025-03-25 02:38:11.205 [INFO][4844] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.14.65/26] IPv6=[] ContainerID="6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725" HandleID="k8s-pod-network.6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725" Workload="ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--6p69w-eth0" Mar 25 02:38:11.212726 containerd[1819]: 2025-03-25 02:38:11.206 [INFO][4798] cni-plugin/k8s.go 386: Populated endpoint ContainerID="6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725" Namespace="calico-apiserver" Pod="calico-apiserver-6785d4dd64-6p69w" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--6p69w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--6p69w-eth0", GenerateName:"calico-apiserver-6785d4dd64-", Namespace:"calico-apiserver", SelfLink:"", UID:"77e2f28d-fe85-4b5f-acde-24f4c51f84a7", ResourceVersion:"695", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 2, 37, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"6785d4dd64", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-a-3a00d206eb", ContainerID:"", Pod:"calico-apiserver-6785d4dd64-6p69w", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.14.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali85870de692e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 02:38:11.212769 containerd[1819]: 2025-03-25 02:38:11.206 [INFO][4798] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.14.65/32] ContainerID="6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725" Namespace="calico-apiserver" Pod="calico-apiserver-6785d4dd64-6p69w" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--6p69w-eth0" Mar 25 02:38:11.212769 containerd[1819]: 2025-03-25 02:38:11.206 [INFO][4798] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali85870de692e ContainerID="6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725" Namespace="calico-apiserver" Pod="calico-apiserver-6785d4dd64-6p69w" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--6p69w-eth0" Mar 25 02:38:11.212769 containerd[1819]: 2025-03-25 02:38:11.207 [INFO][4798] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725" Namespace="calico-apiserver" Pod="calico-apiserver-6785d4dd64-6p69w" 
WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--6p69w-eth0" Mar 25 02:38:11.212819 containerd[1819]: 2025-03-25 02:38:11.207 [INFO][4798] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725" Namespace="calico-apiserver" Pod="calico-apiserver-6785d4dd64-6p69w" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--6p69w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--6p69w-eth0", GenerateName:"calico-apiserver-6785d4dd64-", Namespace:"calico-apiserver", SelfLink:"", UID:"77e2f28d-fe85-4b5f-acde-24f4c51f84a7", ResourceVersion:"695", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 2, 37, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6785d4dd64", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-a-3a00d206eb", ContainerID:"6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725", Pod:"calico-apiserver-6785d4dd64-6p69w", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.14.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali85870de692e", MAC:"0a:e5:76:ac:9c:3d", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 02:38:11.212864 containerd[1819]: 2025-03-25 02:38:11.211 [INFO][4798] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725" Namespace="calico-apiserver" Pod="calico-apiserver-6785d4dd64-6p69w" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--6p69w-eth0" Mar 25 02:38:11.221275 containerd[1819]: time="2025-03-25T02:38:11.221245911Z" level=info msg="connecting to shim 6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725" address="unix:///run/containerd/s/205ccdb3f75d87f4d49d3cbebd8e1bd7c88f357202516b9f1c0aaf7a628f93e0" namespace=k8s.io protocol=ttrpc version=3 Mar 25 02:38:11.242821 systemd[1]: Started cri-containerd-6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725.scope - libcontainer container 6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725. Mar 25 02:38:11.267734 containerd[1819]: time="2025-03-25T02:38:11.267712388Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6785d4dd64-6p69w,Uid:77e2f28d-fe85-4b5f-acde-24f4c51f84a7,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725\"" Mar 25 02:38:11.268390 containerd[1819]: time="2025-03-25T02:38:11.268378527Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 25 02:38:11.353142 systemd-networkd[1728]: cali65acaf4d619: Link UP Mar 25 02:38:11.353914 systemd-networkd[1728]: cali65acaf4d619: Gained carrier Mar 25 02:38:11.376984 containerd[1819]: 2025-03-25 02:38:11.168 [INFO][4800] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--xght8-eth0 calico-apiserver-6785d4dd64- calico-apiserver 67549bf9-1d7c-4c56-b80d-a3bdf8cbbe58 698 0 
2025-03-25 02:37:50 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6785d4dd64 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284.0.0-a-3a00d206eb calico-apiserver-6785d4dd64-xght8 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali65acaf4d619 [] []}} ContainerID="c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699" Namespace="calico-apiserver" Pod="calico-apiserver-6785d4dd64-xght8" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--xght8-" Mar 25 02:38:11.376984 containerd[1819]: 2025-03-25 02:38:11.168 [INFO][4800] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699" Namespace="calico-apiserver" Pod="calico-apiserver-6785d4dd64-xght8" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--xght8-eth0" Mar 25 02:38:11.376984 containerd[1819]: 2025-03-25 02:38:11.184 [INFO][4842] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699" HandleID="k8s-pod-network.c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699" Workload="ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--xght8-eth0" Mar 25 02:38:11.378086 containerd[1819]: 2025-03-25 02:38:11.190 [INFO][4842] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699" HandleID="k8s-pod-network.c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699" Workload="ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--xght8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00021a8f0), Attrs:map[string]string{"namespace":"calico-apiserver", 
"node":"ci-4284.0.0-a-3a00d206eb", "pod":"calico-apiserver-6785d4dd64-xght8", "timestamp":"2025-03-25 02:38:11.184022206 +0000 UTC"}, Hostname:"ci-4284.0.0-a-3a00d206eb", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 02:38:11.378086 containerd[1819]: 2025-03-25 02:38:11.190 [INFO][4842] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 02:38:11.378086 containerd[1819]: 2025-03-25 02:38:11.205 [INFO][4842] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 02:38:11.378086 containerd[1819]: 2025-03-25 02:38:11.205 [INFO][4842] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284.0.0-a-3a00d206eb' Mar 25 02:38:11.378086 containerd[1819]: 2025-03-25 02:38:11.292 [INFO][4842] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:11.378086 containerd[1819]: 2025-03-25 02:38:11.301 [INFO][4842] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:11.378086 containerd[1819]: 2025-03-25 02:38:11.312 [INFO][4842] ipam/ipam.go 489: Trying affinity for 192.168.14.64/26 host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:11.378086 containerd[1819]: 2025-03-25 02:38:11.316 [INFO][4842] ipam/ipam.go 155: Attempting to load block cidr=192.168.14.64/26 host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:11.378086 containerd[1819]: 2025-03-25 02:38:11.322 [INFO][4842] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.14.64/26 host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:11.378992 containerd[1819]: 2025-03-25 02:38:11.322 [INFO][4842] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.14.64/26 
handle="k8s-pod-network.c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:11.378992 containerd[1819]: 2025-03-25 02:38:11.325 [INFO][4842] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699 Mar 25 02:38:11.378992 containerd[1819]: 2025-03-25 02:38:11.333 [INFO][4842] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.14.64/26 handle="k8s-pod-network.c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:11.378992 containerd[1819]: 2025-03-25 02:38:11.344 [INFO][4842] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.14.66/26] block=192.168.14.64/26 handle="k8s-pod-network.c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:11.378992 containerd[1819]: 2025-03-25 02:38:11.344 [INFO][4842] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.14.66/26] handle="k8s-pod-network.c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:11.378992 containerd[1819]: 2025-03-25 02:38:11.344 [INFO][4842] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 02:38:11.378992 containerd[1819]: 2025-03-25 02:38:11.344 [INFO][4842] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.14.66/26] IPv6=[] ContainerID="c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699" HandleID="k8s-pod-network.c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699" Workload="ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--xght8-eth0" Mar 25 02:38:11.379588 containerd[1819]: 2025-03-25 02:38:11.349 [INFO][4800] cni-plugin/k8s.go 386: Populated endpoint ContainerID="c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699" Namespace="calico-apiserver" Pod="calico-apiserver-6785d4dd64-xght8" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--xght8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--xght8-eth0", GenerateName:"calico-apiserver-6785d4dd64-", Namespace:"calico-apiserver", SelfLink:"", UID:"67549bf9-1d7c-4c56-b80d-a3bdf8cbbe58", ResourceVersion:"698", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 2, 37, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6785d4dd64", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-a-3a00d206eb", ContainerID:"", Pod:"calico-apiserver-6785d4dd64-xght8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.14.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali65acaf4d619", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 02:38:11.379881 containerd[1819]: 2025-03-25 02:38:11.349 [INFO][4800] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.14.66/32] ContainerID="c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699" Namespace="calico-apiserver" Pod="calico-apiserver-6785d4dd64-xght8" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--xght8-eth0" Mar 25 02:38:11.379881 containerd[1819]: 2025-03-25 02:38:11.349 [INFO][4800] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali65acaf4d619 ContainerID="c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699" Namespace="calico-apiserver" Pod="calico-apiserver-6785d4dd64-xght8" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--xght8-eth0" Mar 25 02:38:11.379881 containerd[1819]: 2025-03-25 02:38:11.353 [INFO][4800] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699" Namespace="calico-apiserver" Pod="calico-apiserver-6785d4dd64-xght8" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--xght8-eth0" Mar 25 02:38:11.380163 containerd[1819]: 2025-03-25 02:38:11.354 [INFO][4800] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699" Namespace="calico-apiserver" Pod="calico-apiserver-6785d4dd64-xght8" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--xght8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--xght8-eth0", GenerateName:"calico-apiserver-6785d4dd64-", Namespace:"calico-apiserver", SelfLink:"", UID:"67549bf9-1d7c-4c56-b80d-a3bdf8cbbe58", ResourceVersion:"698", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 2, 37, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6785d4dd64", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-a-3a00d206eb", ContainerID:"c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699", Pod:"calico-apiserver-6785d4dd64-xght8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.14.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali65acaf4d619", MAC:"0a:11:43:f8:8f:c1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 02:38:11.380384 containerd[1819]: 2025-03-25 02:38:11.373 [INFO][4800] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699" Namespace="calico-apiserver" Pod="calico-apiserver-6785d4dd64-xght8" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-calico--apiserver--6785d4dd64--xght8-eth0" Mar 25 02:38:11.394388 containerd[1819]: time="2025-03-25T02:38:11.394361813Z" level=info msg="connecting to shim 
c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699" address="unix:///run/containerd/s/b68a10e98fee5887305cfb1dee573d34907fbcd0e8a6263121c89196be4768e3" namespace=k8s.io protocol=ttrpc version=3 Mar 25 02:38:11.415028 systemd[1]: Started cri-containerd-c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699.scope - libcontainer container c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699. Mar 25 02:38:11.490336 containerd[1819]: time="2025-03-25T02:38:11.490315100Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6785d4dd64-xght8,Uid:67549bf9-1d7c-4c56-b80d-a3bdf8cbbe58,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699\"" Mar 25 02:38:12.149386 containerd[1819]: time="2025-03-25T02:38:12.149324097Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-4pvm2,Uid:868470d1-1d68-4476-8397-6e99241d4307,Namespace:kube-system,Attempt:0,}" Mar 25 02:38:12.201137 systemd-networkd[1728]: cali85f624e8d8a: Link UP Mar 25 02:38:12.201238 systemd-networkd[1728]: cali85f624e8d8a: Gained carrier Mar 25 02:38:12.205970 containerd[1819]: 2025-03-25 02:38:12.167 [INFO][4995] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--4pvm2-eth0 coredns-668d6bf9bc- kube-system 868470d1-1d68-4476-8397-6e99241d4307 696 0 2025-03-25 02:37:44 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284.0.0-a-3a00d206eb coredns-668d6bf9bc-4pvm2 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali85f624e8d8a [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-4pvm2" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--4pvm2-" Mar 25 02:38:12.205970 containerd[1819]: 2025-03-25 02:38:12.167 [INFO][4995] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a" Namespace="kube-system" Pod="coredns-668d6bf9bc-4pvm2" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--4pvm2-eth0" Mar 25 02:38:12.205970 containerd[1819]: 2025-03-25 02:38:12.181 [INFO][5015] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a" HandleID="k8s-pod-network.418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a" Workload="ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--4pvm2-eth0" Mar 25 02:38:12.206135 containerd[1819]: 2025-03-25 02:38:12.187 [INFO][5015] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a" HandleID="k8s-pod-network.418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a" Workload="ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--4pvm2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003055e0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284.0.0-a-3a00d206eb", "pod":"coredns-668d6bf9bc-4pvm2", "timestamp":"2025-03-25 02:38:12.181868283 +0000 UTC"}, Hostname:"ci-4284.0.0-a-3a00d206eb", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 02:38:12.206135 containerd[1819]: 2025-03-25 02:38:12.187 [INFO][5015] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 02:38:12.206135 containerd[1819]: 2025-03-25 02:38:12.187 [INFO][5015] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 02:38:12.206135 containerd[1819]: 2025-03-25 02:38:12.187 [INFO][5015] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284.0.0-a-3a00d206eb' Mar 25 02:38:12.206135 containerd[1819]: 2025-03-25 02:38:12.188 [INFO][5015] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:12.206135 containerd[1819]: 2025-03-25 02:38:12.190 [INFO][5015] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:12.206135 containerd[1819]: 2025-03-25 02:38:12.192 [INFO][5015] ipam/ipam.go 489: Trying affinity for 192.168.14.64/26 host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:12.206135 containerd[1819]: 2025-03-25 02:38:12.193 [INFO][5015] ipam/ipam.go 155: Attempting to load block cidr=192.168.14.64/26 host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:12.206135 containerd[1819]: 2025-03-25 02:38:12.194 [INFO][5015] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.14.64/26 host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:12.206294 containerd[1819]: 2025-03-25 02:38:12.194 [INFO][5015] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.14.64/26 handle="k8s-pod-network.418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:12.206294 containerd[1819]: 2025-03-25 02:38:12.195 [INFO][5015] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a Mar 25 02:38:12.206294 containerd[1819]: 2025-03-25 02:38:12.197 [INFO][5015] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.14.64/26 handle="k8s-pod-network.418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:12.206294 containerd[1819]: 2025-03-25 02:38:12.199 [INFO][5015] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.14.67/26] block=192.168.14.64/26 handle="k8s-pod-network.418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:12.206294 containerd[1819]: 2025-03-25 02:38:12.199 [INFO][5015] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.14.67/26] handle="k8s-pod-network.418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:12.206294 containerd[1819]: 2025-03-25 02:38:12.199 [INFO][5015] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 02:38:12.206294 containerd[1819]: 2025-03-25 02:38:12.199 [INFO][5015] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.14.67/26] IPv6=[] ContainerID="418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a" HandleID="k8s-pod-network.418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a" Workload="ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--4pvm2-eth0" Mar 25 02:38:12.206424 containerd[1819]: 2025-03-25 02:38:12.200 [INFO][4995] cni-plugin/k8s.go 386: Populated endpoint ContainerID="418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a" Namespace="kube-system" Pod="coredns-668d6bf9bc-4pvm2" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--4pvm2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--4pvm2-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"868470d1-1d68-4476-8397-6e99241d4307", ResourceVersion:"696", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 2, 37, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-a-3a00d206eb", ContainerID:"", Pod:"coredns-668d6bf9bc-4pvm2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.14.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali85f624e8d8a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 02:38:12.206424 containerd[1819]: 2025-03-25 02:38:12.200 [INFO][4995] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.14.67/32] ContainerID="418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a" Namespace="kube-system" Pod="coredns-668d6bf9bc-4pvm2" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--4pvm2-eth0" Mar 25 02:38:12.206424 containerd[1819]: 2025-03-25 02:38:12.200 [INFO][4995] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali85f624e8d8a ContainerID="418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a" Namespace="kube-system" Pod="coredns-668d6bf9bc-4pvm2" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--4pvm2-eth0" Mar 25 02:38:12.206424 containerd[1819]: 2025-03-25 02:38:12.201 [INFO][4995] cni-plugin/dataplane_linux.go 508: Disabling IPv4 
forwarding ContainerID="418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a" Namespace="kube-system" Pod="coredns-668d6bf9bc-4pvm2" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--4pvm2-eth0" Mar 25 02:38:12.206424 containerd[1819]: 2025-03-25 02:38:12.201 [INFO][4995] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a" Namespace="kube-system" Pod="coredns-668d6bf9bc-4pvm2" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--4pvm2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--4pvm2-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"868470d1-1d68-4476-8397-6e99241d4307", ResourceVersion:"696", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 2, 37, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-a-3a00d206eb", ContainerID:"418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a", Pod:"coredns-668d6bf9bc-4pvm2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.14.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali85f624e8d8a", MAC:"0a:3d:38:6a:07:ab", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 02:38:12.206424 containerd[1819]: 2025-03-25 02:38:12.205 [INFO][4995] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a" Namespace="kube-system" Pod="coredns-668d6bf9bc-4pvm2" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--4pvm2-eth0" Mar 25 02:38:12.215387 containerd[1819]: time="2025-03-25T02:38:12.215353662Z" level=info msg="connecting to shim 418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a" address="unix:///run/containerd/s/fa5e25f5d97dcde9fc53e9f335679e83138c8e585906a4adf1be930fa7b514d9" namespace=k8s.io protocol=ttrpc version=3 Mar 25 02:38:12.237803 systemd[1]: Started cri-containerd-418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a.scope - libcontainer container 418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a. 
Mar 25 02:38:12.266007 containerd[1819]: time="2025-03-25T02:38:12.265986807Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-4pvm2,Uid:868470d1-1d68-4476-8397-6e99241d4307,Namespace:kube-system,Attempt:0,} returns sandbox id \"418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a\"" Mar 25 02:38:12.267136 containerd[1819]: time="2025-03-25T02:38:12.267118226Z" level=info msg="CreateContainer within sandbox \"418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 25 02:38:12.270297 containerd[1819]: time="2025-03-25T02:38:12.270282353Z" level=info msg="Container 8eb15cbca15a4f76470ed982c40f545813d625ceccf9fc7f20c4e1f4af529551: CDI devices from CRI Config.CDIDevices: []" Mar 25 02:38:12.272518 containerd[1819]: time="2025-03-25T02:38:12.272482181Z" level=info msg="CreateContainer within sandbox \"418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"8eb15cbca15a4f76470ed982c40f545813d625ceccf9fc7f20c4e1f4af529551\"" Mar 25 02:38:12.272735 containerd[1819]: time="2025-03-25T02:38:12.272699943Z" level=info msg="StartContainer for \"8eb15cbca15a4f76470ed982c40f545813d625ceccf9fc7f20c4e1f4af529551\"" Mar 25 02:38:12.273136 containerd[1819]: time="2025-03-25T02:38:12.273095415Z" level=info msg="connecting to shim 8eb15cbca15a4f76470ed982c40f545813d625ceccf9fc7f20c4e1f4af529551" address="unix:///run/containerd/s/fa5e25f5d97dcde9fc53e9f335679e83138c8e585906a4adf1be930fa7b514d9" protocol=ttrpc version=3 Mar 25 02:38:12.307932 systemd[1]: Started cri-containerd-8eb15cbca15a4f76470ed982c40f545813d625ceccf9fc7f20c4e1f4af529551.scope - libcontainer container 8eb15cbca15a4f76470ed982c40f545813d625ceccf9fc7f20c4e1f4af529551. Mar 25 02:38:12.311743 systemd[1]: Started sshd@15-147.75.90.239:22-135.125.238.48:55520.service - OpenSSH per-connection server daemon (135.125.238.48:55520). 
Mar 25 02:38:12.342148 containerd[1819]: time="2025-03-25T02:38:12.342122585Z" level=info msg="StartContainer for \"8eb15cbca15a4f76470ed982c40f545813d625ceccf9fc7f20c4e1f4af529551\" returns successfully" Mar 25 02:38:12.801764 systemd-networkd[1728]: cali65acaf4d619: Gained IPv6LL Mar 25 02:38:12.993857 systemd-networkd[1728]: cali85870de692e: Gained IPv6LL Mar 25 02:38:13.215863 sshd[5099]: Invalid user intranet from 135.125.238.48 port 55520 Mar 25 02:38:13.273571 kubelet[3188]: I0325 02:38:13.273518 3188 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-4pvm2" podStartSLOduration=29.273498017 podStartE2EDuration="29.273498017s" podCreationTimestamp="2025-03-25 02:37:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 02:38:13.273043953 +0000 UTC m=+34.172354667" watchObservedRunningTime="2025-03-25 02:38:13.273498017 +0000 UTC m=+34.172808713" Mar 25 02:38:13.379805 sshd[5099]: Received disconnect from 135.125.238.48 port 55520:11: Bye Bye [preauth] Mar 25 02:38:13.379805 sshd[5099]: Disconnected from invalid user intranet 135.125.238.48 port 55520 [preauth] Mar 25 02:38:13.383778 systemd[1]: sshd@15-147.75.90.239:22-135.125.238.48:55520.service: Deactivated successfully. 
Mar 25 02:38:13.787531 containerd[1819]: time="2025-03-25T02:38:13.787479275Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:38:13.787742 containerd[1819]: time="2025-03-25T02:38:13.787633781Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=42993204" Mar 25 02:38:13.788038 containerd[1819]: time="2025-03-25T02:38:13.787997173Z" level=info msg="ImageCreate event name:\"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:38:13.788863 containerd[1819]: time="2025-03-25T02:38:13.788823443Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:38:13.789241 containerd[1819]: time="2025-03-25T02:38:13.789200313Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 2.520806913s" Mar 25 02:38:13.789241 containerd[1819]: time="2025-03-25T02:38:13.789216898Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\"" Mar 25 02:38:13.789734 containerd[1819]: time="2025-03-25T02:38:13.789670500Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 25 02:38:13.790179 containerd[1819]: time="2025-03-25T02:38:13.790167281Z" level=info msg="CreateContainer within sandbox 
\"6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 25 02:38:13.792720 containerd[1819]: time="2025-03-25T02:38:13.792675922Z" level=info msg="Container 17ee943974f228681dde969514030cde45aead08b532417f2c6afcff19ccc809: CDI devices from CRI Config.CDIDevices: []" Mar 25 02:38:13.795145 containerd[1819]: time="2025-03-25T02:38:13.795104563Z" level=info msg="CreateContainer within sandbox \"6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"17ee943974f228681dde969514030cde45aead08b532417f2c6afcff19ccc809\"" Mar 25 02:38:13.795346 containerd[1819]: time="2025-03-25T02:38:13.795335715Z" level=info msg="StartContainer for \"17ee943974f228681dde969514030cde45aead08b532417f2c6afcff19ccc809\"" Mar 25 02:38:13.795880 containerd[1819]: time="2025-03-25T02:38:13.795841733Z" level=info msg="connecting to shim 17ee943974f228681dde969514030cde45aead08b532417f2c6afcff19ccc809" address="unix:///run/containerd/s/205ccdb3f75d87f4d49d3cbebd8e1bd7c88f357202516b9f1c0aaf7a628f93e0" protocol=ttrpc version=3 Mar 25 02:38:13.816779 systemd[1]: Started cri-containerd-17ee943974f228681dde969514030cde45aead08b532417f2c6afcff19ccc809.scope - libcontainer container 17ee943974f228681dde969514030cde45aead08b532417f2c6afcff19ccc809. 
Mar 25 02:38:13.845872 containerd[1819]: time="2025-03-25T02:38:13.845848717Z" level=info msg="StartContainer for \"17ee943974f228681dde969514030cde45aead08b532417f2c6afcff19ccc809\" returns successfully" Mar 25 02:38:13.953762 systemd-networkd[1728]: cali85f624e8d8a: Gained IPv6LL Mar 25 02:38:14.149972 containerd[1819]: time="2025-03-25T02:38:14.149719149Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5fcdbb9f57-psbcp,Uid:e5e02bef-e0da-44bf-b287-659021a0e138,Namespace:calico-system,Attempt:0,}" Mar 25 02:38:14.150229 containerd[1819]: time="2025-03-25T02:38:14.149948651Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-84cjd,Uid:8d6881ea-948c-42f2-b0a9-c85a672155d1,Namespace:calico-system,Attempt:0,}" Mar 25 02:38:14.202954 containerd[1819]: time="2025-03-25T02:38:14.202921353Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:38:14.203155 containerd[1819]: time="2025-03-25T02:38:14.203104616Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=77" Mar 25 02:38:14.205044 containerd[1819]: time="2025-03-25T02:38:14.204840678Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 415.151882ms" Mar 25 02:38:14.205044 containerd[1819]: time="2025-03-25T02:38:14.204878213Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\"" Mar 25 02:38:14.205888 systemd-networkd[1728]: calia15e56e26b8: Link UP Mar 25 02:38:14.206021 
systemd-networkd[1728]: calia15e56e26b8: Gained carrier Mar 25 02:38:14.206124 containerd[1819]: time="2025-03-25T02:38:14.206109263Z" level=info msg="CreateContainer within sandbox \"c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 25 02:38:14.209693 containerd[1819]: time="2025-03-25T02:38:14.209671604Z" level=info msg="Container a743079ca52daa4cf2bbc9a3ce100bc9594ee125a8129c12350614d98259668b: CDI devices from CRI Config.CDIDevices: []" Mar 25 02:38:14.211396 containerd[1819]: 2025-03-25 02:38:14.172 [INFO][5203] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284.0.0--a--3a00d206eb-k8s-csi--node--driver--84cjd-eth0 csi-node-driver- calico-system 8d6881ea-948c-42f2-b0a9-c85a672155d1 619 0 2025-03-25 02:37:50 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:54877d75d5 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4284.0.0-a-3a00d206eb csi-node-driver-84cjd eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calia15e56e26b8 [] []}} ContainerID="4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b" Namespace="calico-system" Pod="csi-node-driver-84cjd" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-csi--node--driver--84cjd-" Mar 25 02:38:14.211396 containerd[1819]: 2025-03-25 02:38:14.172 [INFO][5203] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b" Namespace="calico-system" Pod="csi-node-driver-84cjd" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-csi--node--driver--84cjd-eth0" Mar 25 02:38:14.211396 containerd[1819]: 2025-03-25 02:38:14.187 [INFO][5242] ipam/ipam_plugin.go 225: 
Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b" HandleID="k8s-pod-network.4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b" Workload="ci--4284.0.0--a--3a00d206eb-k8s-csi--node--driver--84cjd-eth0" Mar 25 02:38:14.211396 containerd[1819]: 2025-03-25 02:38:14.191 [INFO][5242] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b" HandleID="k8s-pod-network.4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b" Workload="ci--4284.0.0--a--3a00d206eb-k8s-csi--node--driver--84cjd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002f96b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284.0.0-a-3a00d206eb", "pod":"csi-node-driver-84cjd", "timestamp":"2025-03-25 02:38:14.187083589 +0000 UTC"}, Hostname:"ci-4284.0.0-a-3a00d206eb", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 02:38:14.211396 containerd[1819]: 2025-03-25 02:38:14.191 [INFO][5242] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 02:38:14.211396 containerd[1819]: 2025-03-25 02:38:14.191 [INFO][5242] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 02:38:14.211396 containerd[1819]: 2025-03-25 02:38:14.191 [INFO][5242] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284.0.0-a-3a00d206eb' Mar 25 02:38:14.211396 containerd[1819]: 2025-03-25 02:38:14.192 [INFO][5242] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:14.211396 containerd[1819]: 2025-03-25 02:38:14.194 [INFO][5242] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:14.211396 containerd[1819]: 2025-03-25 02:38:14.195 [INFO][5242] ipam/ipam.go 489: Trying affinity for 192.168.14.64/26 host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:14.211396 containerd[1819]: 2025-03-25 02:38:14.196 [INFO][5242] ipam/ipam.go 155: Attempting to load block cidr=192.168.14.64/26 host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:14.211396 containerd[1819]: 2025-03-25 02:38:14.197 [INFO][5242] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.14.64/26 host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:14.211396 containerd[1819]: 2025-03-25 02:38:14.197 [INFO][5242] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.14.64/26 handle="k8s-pod-network.4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:14.211396 containerd[1819]: 2025-03-25 02:38:14.198 [INFO][5242] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b Mar 25 02:38:14.211396 containerd[1819]: 2025-03-25 02:38:14.200 [INFO][5242] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.14.64/26 handle="k8s-pod-network.4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:14.211396 containerd[1819]: 2025-03-25 02:38:14.203 [INFO][5242] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.14.68/26] block=192.168.14.64/26 handle="k8s-pod-network.4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:14.211396 containerd[1819]: 2025-03-25 02:38:14.203 [INFO][5242] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.14.68/26] handle="k8s-pod-network.4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:14.211396 containerd[1819]: 2025-03-25 02:38:14.203 [INFO][5242] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 02:38:14.211396 containerd[1819]: 2025-03-25 02:38:14.203 [INFO][5242] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.14.68/26] IPv6=[] ContainerID="4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b" HandleID="k8s-pod-network.4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b" Workload="ci--4284.0.0--a--3a00d206eb-k8s-csi--node--driver--84cjd-eth0" Mar 25 02:38:14.211815 containerd[1819]: 2025-03-25 02:38:14.204 [INFO][5203] cni-plugin/k8s.go 386: Populated endpoint ContainerID="4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b" Namespace="calico-system" Pod="csi-node-driver-84cjd" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-csi--node--driver--84cjd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--a--3a00d206eb-k8s-csi--node--driver--84cjd-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"8d6881ea-948c-42f2-b0a9-c85a672155d1", ResourceVersion:"619", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 2, 37, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"54877d75d5", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-a-3a00d206eb", ContainerID:"", Pod:"csi-node-driver-84cjd", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.14.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calia15e56e26b8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 02:38:14.211815 containerd[1819]: 2025-03-25 02:38:14.205 [INFO][5203] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.14.68/32] ContainerID="4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b" Namespace="calico-system" Pod="csi-node-driver-84cjd" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-csi--node--driver--84cjd-eth0" Mar 25 02:38:14.211815 containerd[1819]: 2025-03-25 02:38:14.205 [INFO][5203] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia15e56e26b8 ContainerID="4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b" Namespace="calico-system" Pod="csi-node-driver-84cjd" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-csi--node--driver--84cjd-eth0" Mar 25 02:38:14.211815 containerd[1819]: 2025-03-25 02:38:14.206 [INFO][5203] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b" Namespace="calico-system" Pod="csi-node-driver-84cjd" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-csi--node--driver--84cjd-eth0" Mar 25 02:38:14.211815 containerd[1819]: 2025-03-25 02:38:14.206 
[INFO][5203] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b" Namespace="calico-system" Pod="csi-node-driver-84cjd" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-csi--node--driver--84cjd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--a--3a00d206eb-k8s-csi--node--driver--84cjd-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"8d6881ea-948c-42f2-b0a9-c85a672155d1", ResourceVersion:"619", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 2, 37, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"54877d75d5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-a-3a00d206eb", ContainerID:"4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b", Pod:"csi-node-driver-84cjd", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.14.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calia15e56e26b8", MAC:"c2:b4:39:df:31:cb", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 02:38:14.211815 containerd[1819]: 2025-03-25 02:38:14.210 [INFO][5203] cni-plugin/k8s.go 500: Wrote updated endpoint to 
datastore ContainerID="4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b" Namespace="calico-system" Pod="csi-node-driver-84cjd" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-csi--node--driver--84cjd-eth0" Mar 25 02:38:14.213370 containerd[1819]: time="2025-03-25T02:38:14.213349049Z" level=info msg="CreateContainer within sandbox \"c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"a743079ca52daa4cf2bbc9a3ce100bc9594ee125a8129c12350614d98259668b\"" Mar 25 02:38:14.213661 containerd[1819]: time="2025-03-25T02:38:14.213646653Z" level=info msg="StartContainer for \"a743079ca52daa4cf2bbc9a3ce100bc9594ee125a8129c12350614d98259668b\"" Mar 25 02:38:14.214229 containerd[1819]: time="2025-03-25T02:38:14.214215843Z" level=info msg="connecting to shim a743079ca52daa4cf2bbc9a3ce100bc9594ee125a8129c12350614d98259668b" address="unix:///run/containerd/s/b68a10e98fee5887305cfb1dee573d34907fbcd0e8a6263121c89196be4768e3" protocol=ttrpc version=3 Mar 25 02:38:14.220198 containerd[1819]: time="2025-03-25T02:38:14.220166965Z" level=info msg="connecting to shim 4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b" address="unix:///run/containerd/s/621a8b74e2081283d7ede1d52f85c1645a4b0bcc9787de1bfb11f1bfe34fbf03" namespace=k8s.io protocol=ttrpc version=3 Mar 25 02:38:14.239967 systemd[1]: Started cri-containerd-a743079ca52daa4cf2bbc9a3ce100bc9594ee125a8129c12350614d98259668b.scope - libcontainer container a743079ca52daa4cf2bbc9a3ce100bc9594ee125a8129c12350614d98259668b. Mar 25 02:38:14.241585 systemd[1]: Started cri-containerd-4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b.scope - libcontainer container 4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b. 
Mar 25 02:38:14.253699 containerd[1819]: time="2025-03-25T02:38:14.253641438Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-84cjd,Uid:8d6881ea-948c-42f2-b0a9-c85a672155d1,Namespace:calico-system,Attempt:0,} returns sandbox id \"4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b\"" Mar 25 02:38:14.254280 containerd[1819]: time="2025-03-25T02:38:14.254268699Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\"" Mar 25 02:38:14.267849 containerd[1819]: time="2025-03-25T02:38:14.267828806Z" level=info msg="StartContainer for \"a743079ca52daa4cf2bbc9a3ce100bc9594ee125a8129c12350614d98259668b\" returns successfully" Mar 25 02:38:14.269156 kubelet[3188]: I0325 02:38:14.269110 3188 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6785d4dd64-6p69w" podStartSLOduration=21.747725757 podStartE2EDuration="24.26909273s" podCreationTimestamp="2025-03-25 02:37:50 +0000 UTC" firstStartedPulling="2025-03-25 02:38:11.268252205 +0000 UTC m=+32.167562887" lastFinishedPulling="2025-03-25 02:38:13.789619178 +0000 UTC m=+34.688929860" observedRunningTime="2025-03-25 02:38:14.268863575 +0000 UTC m=+35.168174258" watchObservedRunningTime="2025-03-25 02:38:14.26909273 +0000 UTC m=+35.168403410" Mar 25 02:38:14.307827 systemd-networkd[1728]: calidb544a9d7f3: Link UP Mar 25 02:38:14.307926 systemd-networkd[1728]: calidb544a9d7f3: Gained carrier Mar 25 02:38:14.314440 containerd[1819]: 2025-03-25 02:38:14.172 [INFO][5198] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284.0.0--a--3a00d206eb-k8s-calico--kube--controllers--5fcdbb9f57--psbcp-eth0 calico-kube-controllers-5fcdbb9f57- calico-system e5e02bef-e0da-44bf-b287-659021a0e138 697 0 2025-03-25 02:37:50 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5fcdbb9f57 projectcalico.org/namespace:calico-system 
projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4284.0.0-a-3a00d206eb calico-kube-controllers-5fcdbb9f57-psbcp eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calidb544a9d7f3 [] []}} ContainerID="849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68" Namespace="calico-system" Pod="calico-kube-controllers-5fcdbb9f57-psbcp" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-calico--kube--controllers--5fcdbb9f57--psbcp-" Mar 25 02:38:14.314440 containerd[1819]: 2025-03-25 02:38:14.172 [INFO][5198] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68" Namespace="calico-system" Pod="calico-kube-controllers-5fcdbb9f57-psbcp" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-calico--kube--controllers--5fcdbb9f57--psbcp-eth0" Mar 25 02:38:14.314440 containerd[1819]: 2025-03-25 02:38:14.187 [INFO][5240] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68" HandleID="k8s-pod-network.849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68" Workload="ci--4284.0.0--a--3a00d206eb-k8s-calico--kube--controllers--5fcdbb9f57--psbcp-eth0" Mar 25 02:38:14.314440 containerd[1819]: 2025-03-25 02:38:14.192 [INFO][5240] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68" HandleID="k8s-pod-network.849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68" Workload="ci--4284.0.0--a--3a00d206eb-k8s-calico--kube--controllers--5fcdbb9f57--psbcp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000050180), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284.0.0-a-3a00d206eb", "pod":"calico-kube-controllers-5fcdbb9f57-psbcp", "timestamp":"2025-03-25 02:38:14.187093018 
+0000 UTC"}, Hostname:"ci-4284.0.0-a-3a00d206eb", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 02:38:14.314440 containerd[1819]: 2025-03-25 02:38:14.192 [INFO][5240] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 02:38:14.314440 containerd[1819]: 2025-03-25 02:38:14.203 [INFO][5240] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 02:38:14.314440 containerd[1819]: 2025-03-25 02:38:14.203 [INFO][5240] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284.0.0-a-3a00d206eb' Mar 25 02:38:14.314440 containerd[1819]: 2025-03-25 02:38:14.293 [INFO][5240] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:14.314440 containerd[1819]: 2025-03-25 02:38:14.296 [INFO][5240] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:14.314440 containerd[1819]: 2025-03-25 02:38:14.298 [INFO][5240] ipam/ipam.go 489: Trying affinity for 192.168.14.64/26 host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:14.314440 containerd[1819]: 2025-03-25 02:38:14.299 [INFO][5240] ipam/ipam.go 155: Attempting to load block cidr=192.168.14.64/26 host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:14.314440 containerd[1819]: 2025-03-25 02:38:14.300 [INFO][5240] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.14.64/26 host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:14.314440 containerd[1819]: 2025-03-25 02:38:14.300 [INFO][5240] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.14.64/26 handle="k8s-pod-network.849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:14.314440 containerd[1819]: 2025-03-25 
02:38:14.301 [INFO][5240] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68 Mar 25 02:38:14.314440 containerd[1819]: 2025-03-25 02:38:14.303 [INFO][5240] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.14.64/26 handle="k8s-pod-network.849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:14.314440 containerd[1819]: 2025-03-25 02:38:14.305 [INFO][5240] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.14.69/26] block=192.168.14.64/26 handle="k8s-pod-network.849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:14.314440 containerd[1819]: 2025-03-25 02:38:14.305 [INFO][5240] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.14.69/26] handle="k8s-pod-network.849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:14.314440 containerd[1819]: 2025-03-25 02:38:14.305 [INFO][5240] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 02:38:14.314440 containerd[1819]: 2025-03-25 02:38:14.305 [INFO][5240] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.14.69/26] IPv6=[] ContainerID="849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68" HandleID="k8s-pod-network.849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68" Workload="ci--4284.0.0--a--3a00d206eb-k8s-calico--kube--controllers--5fcdbb9f57--psbcp-eth0" Mar 25 02:38:14.315096 containerd[1819]: 2025-03-25 02:38:14.307 [INFO][5198] cni-plugin/k8s.go 386: Populated endpoint ContainerID="849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68" Namespace="calico-system" Pod="calico-kube-controllers-5fcdbb9f57-psbcp" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-calico--kube--controllers--5fcdbb9f57--psbcp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--a--3a00d206eb-k8s-calico--kube--controllers--5fcdbb9f57--psbcp-eth0", GenerateName:"calico-kube-controllers-5fcdbb9f57-", Namespace:"calico-system", SelfLink:"", UID:"e5e02bef-e0da-44bf-b287-659021a0e138", ResourceVersion:"697", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 2, 37, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5fcdbb9f57", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-a-3a00d206eb", ContainerID:"", Pod:"calico-kube-controllers-5fcdbb9f57-psbcp", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.14.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calidb544a9d7f3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 02:38:14.315096 containerd[1819]: 2025-03-25 02:38:14.307 [INFO][5198] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.14.69/32] ContainerID="849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68" Namespace="calico-system" Pod="calico-kube-controllers-5fcdbb9f57-psbcp" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-calico--kube--controllers--5fcdbb9f57--psbcp-eth0" Mar 25 02:38:14.315096 containerd[1819]: 2025-03-25 02:38:14.307 [INFO][5198] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidb544a9d7f3 ContainerID="849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68" Namespace="calico-system" Pod="calico-kube-controllers-5fcdbb9f57-psbcp" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-calico--kube--controllers--5fcdbb9f57--psbcp-eth0" Mar 25 02:38:14.315096 containerd[1819]: 2025-03-25 02:38:14.307 [INFO][5198] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68" Namespace="calico-system" Pod="calico-kube-controllers-5fcdbb9f57-psbcp" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-calico--kube--controllers--5fcdbb9f57--psbcp-eth0" Mar 25 02:38:14.315096 containerd[1819]: 2025-03-25 02:38:14.308 [INFO][5198] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68" Namespace="calico-system" Pod="calico-kube-controllers-5fcdbb9f57-psbcp" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-calico--kube--controllers--5fcdbb9f57--psbcp-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--a--3a00d206eb-k8s-calico--kube--controllers--5fcdbb9f57--psbcp-eth0", GenerateName:"calico-kube-controllers-5fcdbb9f57-", Namespace:"calico-system", SelfLink:"", UID:"e5e02bef-e0da-44bf-b287-659021a0e138", ResourceVersion:"697", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 2, 37, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5fcdbb9f57", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-a-3a00d206eb", ContainerID:"849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68", Pod:"calico-kube-controllers-5fcdbb9f57-psbcp", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.14.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calidb544a9d7f3", MAC:"fe:bb:fb:95:28:37", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 02:38:14.315096 containerd[1819]: 2025-03-25 02:38:14.313 [INFO][5198] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68" Namespace="calico-system" Pod="calico-kube-controllers-5fcdbb9f57-psbcp" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-calico--kube--controllers--5fcdbb9f57--psbcp-eth0" Mar 25 
02:38:14.338261 containerd[1819]: time="2025-03-25T02:38:14.338209979Z" level=info msg="connecting to shim 849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68" address="unix:///run/containerd/s/78a9a8031276648f6af1f24324455b5bdc8872605d8826c428f023e54871362e" namespace=k8s.io protocol=ttrpc version=3 Mar 25 02:38:14.367815 systemd[1]: Started cri-containerd-849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68.scope - libcontainer container 849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68. Mar 25 02:38:14.393256 containerd[1819]: time="2025-03-25T02:38:14.393211648Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5fcdbb9f57-psbcp,Uid:e5e02bef-e0da-44bf-b287-659021a0e138,Namespace:calico-system,Attempt:0,} returns sandbox id \"849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68\"" Mar 25 02:38:15.150665 containerd[1819]: time="2025-03-25T02:38:15.150545479Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-56n2d,Uid:e23e1cc9-f062-44e4-9ac6-0c4070ca0b2b,Namespace:kube-system,Attempt:0,}" Mar 25 02:38:15.216439 systemd-networkd[1728]: cali33c5b70dbae: Link UP Mar 25 02:38:15.216585 systemd-networkd[1728]: cali33c5b70dbae: Gained carrier Mar 25 02:38:15.222092 containerd[1819]: 2025-03-25 02:38:15.172 [INFO][5446] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--56n2d-eth0 coredns-668d6bf9bc- kube-system e23e1cc9-f062-44e4-9ac6-0c4070ca0b2b 693 0 2025-03-25 02:37:44 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284.0.0-a-3a00d206eb coredns-668d6bf9bc-56n2d eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali33c5b70dbae [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] 
[]}} ContainerID="86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435" Namespace="kube-system" Pod="coredns-668d6bf9bc-56n2d" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--56n2d-" Mar 25 02:38:15.222092 containerd[1819]: 2025-03-25 02:38:15.172 [INFO][5446] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435" Namespace="kube-system" Pod="coredns-668d6bf9bc-56n2d" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--56n2d-eth0" Mar 25 02:38:15.222092 containerd[1819]: 2025-03-25 02:38:15.188 [INFO][5468] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435" HandleID="k8s-pod-network.86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435" Workload="ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--56n2d-eth0" Mar 25 02:38:15.222092 containerd[1819]: 2025-03-25 02:38:15.194 [INFO][5468] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435" HandleID="k8s-pod-network.86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435" Workload="ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--56n2d-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000299df0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284.0.0-a-3a00d206eb", "pod":"coredns-668d6bf9bc-56n2d", "timestamp":"2025-03-25 02:38:15.188112884 +0000 UTC"}, Hostname:"ci-4284.0.0-a-3a00d206eb", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 02:38:15.222092 containerd[1819]: 2025-03-25 02:38:15.194 [INFO][5468] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Mar 25 02:38:15.222092 containerd[1819]: 2025-03-25 02:38:15.194 [INFO][5468] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 02:38:15.222092 containerd[1819]: 2025-03-25 02:38:15.194 [INFO][5468] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284.0.0-a-3a00d206eb' Mar 25 02:38:15.222092 containerd[1819]: 2025-03-25 02:38:15.195 [INFO][5468] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:15.222092 containerd[1819]: 2025-03-25 02:38:15.198 [INFO][5468] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:15.222092 containerd[1819]: 2025-03-25 02:38:15.201 [INFO][5468] ipam/ipam.go 489: Trying affinity for 192.168.14.64/26 host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:15.222092 containerd[1819]: 2025-03-25 02:38:15.203 [INFO][5468] ipam/ipam.go 155: Attempting to load block cidr=192.168.14.64/26 host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:15.222092 containerd[1819]: 2025-03-25 02:38:15.204 [INFO][5468] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.14.64/26 host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:15.222092 containerd[1819]: 2025-03-25 02:38:15.204 [INFO][5468] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.14.64/26 handle="k8s-pod-network.86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:15.222092 containerd[1819]: 2025-03-25 02:38:15.206 [INFO][5468] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435 Mar 25 02:38:15.222092 containerd[1819]: 2025-03-25 02:38:15.211 [INFO][5468] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.14.64/26 handle="k8s-pod-network.86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435" 
host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:15.222092 containerd[1819]: 2025-03-25 02:38:15.214 [INFO][5468] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.14.70/26] block=192.168.14.64/26 handle="k8s-pod-network.86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:15.222092 containerd[1819]: 2025-03-25 02:38:15.214 [INFO][5468] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.14.70/26] handle="k8s-pod-network.86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435" host="ci-4284.0.0-a-3a00d206eb" Mar 25 02:38:15.222092 containerd[1819]: 2025-03-25 02:38:15.214 [INFO][5468] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 02:38:15.222092 containerd[1819]: 2025-03-25 02:38:15.214 [INFO][5468] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.14.70/26] IPv6=[] ContainerID="86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435" HandleID="k8s-pod-network.86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435" Workload="ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--56n2d-eth0" Mar 25 02:38:15.222508 containerd[1819]: 2025-03-25 02:38:15.215 [INFO][5446] cni-plugin/k8s.go 386: Populated endpoint ContainerID="86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435" Namespace="kube-system" Pod="coredns-668d6bf9bc-56n2d" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--56n2d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--56n2d-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"e23e1cc9-f062-44e4-9ac6-0c4070ca0b2b", ResourceVersion:"693", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 2, 37, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-a-3a00d206eb", ContainerID:"", Pod:"coredns-668d6bf9bc-56n2d", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.14.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali33c5b70dbae", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 02:38:15.222508 containerd[1819]: 2025-03-25 02:38:15.215 [INFO][5446] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.14.70/32] ContainerID="86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435" Namespace="kube-system" Pod="coredns-668d6bf9bc-56n2d" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--56n2d-eth0" Mar 25 02:38:15.222508 containerd[1819]: 2025-03-25 02:38:15.215 [INFO][5446] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali33c5b70dbae ContainerID="86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435" Namespace="kube-system" Pod="coredns-668d6bf9bc-56n2d" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--56n2d-eth0" Mar 
25 02:38:15.222508 containerd[1819]: 2025-03-25 02:38:15.216 [INFO][5446] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435" Namespace="kube-system" Pod="coredns-668d6bf9bc-56n2d" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--56n2d-eth0" Mar 25 02:38:15.222508 containerd[1819]: 2025-03-25 02:38:15.216 [INFO][5446] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435" Namespace="kube-system" Pod="coredns-668d6bf9bc-56n2d" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--56n2d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--56n2d-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"e23e1cc9-f062-44e4-9ac6-0c4070ca0b2b", ResourceVersion:"693", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 2, 37, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-a-3a00d206eb", ContainerID:"86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435", Pod:"coredns-668d6bf9bc-56n2d", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.14.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", 
"ksa.kube-system.coredns"}, InterfaceName:"cali33c5b70dbae", MAC:"76:6a:78:42:e3:59", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 02:38:15.222508 containerd[1819]: 2025-03-25 02:38:15.220 [INFO][5446] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435" Namespace="kube-system" Pod="coredns-668d6bf9bc-56n2d" WorkloadEndpoint="ci--4284.0.0--a--3a00d206eb-k8s-coredns--668d6bf9bc--56n2d-eth0" Mar 25 02:38:15.230972 containerd[1819]: time="2025-03-25T02:38:15.230946383Z" level=info msg="connecting to shim 86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435" address="unix:///run/containerd/s/395618252ba9cee0d7dfd468ee3558cd8a622c84e280889dc528bc7bad3063a5" namespace=k8s.io protocol=ttrpc version=3 Mar 25 02:38:15.247943 systemd[1]: Started cri-containerd-86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435.scope - libcontainer container 86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435. 
Mar 25 02:38:15.267180 kubelet[3188]: I0325 02:38:15.267166 3188 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 02:38:15.272325 kubelet[3188]: I0325 02:38:15.272277 3188 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6785d4dd64-xght8" podStartSLOduration=22.557728908 podStartE2EDuration="25.272256741s" podCreationTimestamp="2025-03-25 02:37:50 +0000 UTC" firstStartedPulling="2025-03-25 02:38:11.490830822 +0000 UTC m=+32.390141504" lastFinishedPulling="2025-03-25 02:38:14.205358655 +0000 UTC m=+35.104669337" observedRunningTime="2025-03-25 02:38:15.272168567 +0000 UTC m=+36.171479249" watchObservedRunningTime="2025-03-25 02:38:15.272256741 +0000 UTC m=+36.171567425" Mar 25 02:38:15.273457 containerd[1819]: time="2025-03-25T02:38:15.273437913Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-56n2d,Uid:e23e1cc9-f062-44e4-9ac6-0c4070ca0b2b,Namespace:kube-system,Attempt:0,} returns sandbox id \"86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435\"" Mar 25 02:38:15.274700 containerd[1819]: time="2025-03-25T02:38:15.274657125Z" level=info msg="CreateContainer within sandbox \"86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 25 02:38:15.277836 containerd[1819]: time="2025-03-25T02:38:15.277791515Z" level=info msg="Container 09ac4cf426061b249584c47064f3bfbab0b2f490c9848812d37bed735ea7eb1a: CDI devices from CRI Config.CDIDevices: []" Mar 25 02:38:15.279924 containerd[1819]: time="2025-03-25T02:38:15.279877891Z" level=info msg="CreateContainer within sandbox \"86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"09ac4cf426061b249584c47064f3bfbab0b2f490c9848812d37bed735ea7eb1a\"" Mar 25 02:38:15.280117 containerd[1819]: time="2025-03-25T02:38:15.280078317Z" level=info 
msg="StartContainer for \"09ac4cf426061b249584c47064f3bfbab0b2f490c9848812d37bed735ea7eb1a\"" Mar 25 02:38:15.280504 containerd[1819]: time="2025-03-25T02:38:15.280490476Z" level=info msg="connecting to shim 09ac4cf426061b249584c47064f3bfbab0b2f490c9848812d37bed735ea7eb1a" address="unix:///run/containerd/s/395618252ba9cee0d7dfd468ee3558cd8a622c84e280889dc528bc7bad3063a5" protocol=ttrpc version=3 Mar 25 02:38:15.301937 systemd[1]: Started cri-containerd-09ac4cf426061b249584c47064f3bfbab0b2f490c9848812d37bed735ea7eb1a.scope - libcontainer container 09ac4cf426061b249584c47064f3bfbab0b2f490c9848812d37bed735ea7eb1a. Mar 25 02:38:15.314936 containerd[1819]: time="2025-03-25T02:38:15.314915877Z" level=info msg="StartContainer for \"09ac4cf426061b249584c47064f3bfbab0b2f490c9848812d37bed735ea7eb1a\" returns successfully" Mar 25 02:38:15.748487 systemd[1]: Started sshd@16-147.75.90.239:22-103.63.108.25:56894.service - OpenSSH per-connection server daemon (103.63.108.25:56894). Mar 25 02:38:15.809699 systemd-networkd[1728]: calia15e56e26b8: Gained IPv6LL Mar 25 02:38:15.809902 systemd-networkd[1728]: calidb544a9d7f3: Gained IPv6LL Mar 25 02:38:15.846999 containerd[1819]: time="2025-03-25T02:38:15.846948975Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:38:15.847233 containerd[1819]: time="2025-03-25T02:38:15.847179895Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active requests=0, bytes read=7909887" Mar 25 02:38:15.847569 containerd[1819]: time="2025-03-25T02:38:15.847527627Z" level=info msg="ImageCreate event name:\"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:38:15.848477 containerd[1819]: time="2025-03-25T02:38:15.848430518Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:38:15.848850 containerd[1819]: time="2025-03-25T02:38:15.848809597Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"9402991\" in 1.594525368s" Mar 25 02:38:15.848850 containerd[1819]: time="2025-03-25T02:38:15.848823405Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\"" Mar 25 02:38:15.849282 containerd[1819]: time="2025-03-25T02:38:15.849243690Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\"" Mar 25 02:38:15.849774 containerd[1819]: time="2025-03-25T02:38:15.849759240Z" level=info msg="CreateContainer within sandbox \"4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 25 02:38:15.853532 containerd[1819]: time="2025-03-25T02:38:15.853492947Z" level=info msg="Container 44c0f7cfe8d2c80b950a98a6eba466298d81dc1f8c03a69cb28e2666f6fb51d5: CDI devices from CRI Config.CDIDevices: []" Mar 25 02:38:15.856613 containerd[1819]: time="2025-03-25T02:38:15.856574172Z" level=info msg="CreateContainer within sandbox \"4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"44c0f7cfe8d2c80b950a98a6eba466298d81dc1f8c03a69cb28e2666f6fb51d5\"" Mar 25 02:38:15.856814 containerd[1819]: time="2025-03-25T02:38:15.856801778Z" level=info msg="StartContainer for 
\"44c0f7cfe8d2c80b950a98a6eba466298d81dc1f8c03a69cb28e2666f6fb51d5\"" Mar 25 02:38:15.857527 containerd[1819]: time="2025-03-25T02:38:15.857514650Z" level=info msg="connecting to shim 44c0f7cfe8d2c80b950a98a6eba466298d81dc1f8c03a69cb28e2666f6fb51d5" address="unix:///run/containerd/s/621a8b74e2081283d7ede1d52f85c1645a4b0bcc9787de1bfb11f1bfe34fbf03" protocol=ttrpc version=3 Mar 25 02:38:15.869933 systemd[1]: Started cri-containerd-44c0f7cfe8d2c80b950a98a6eba466298d81dc1f8c03a69cb28e2666f6fb51d5.scope - libcontainer container 44c0f7cfe8d2c80b950a98a6eba466298d81dc1f8c03a69cb28e2666f6fb51d5. Mar 25 02:38:15.888112 containerd[1819]: time="2025-03-25T02:38:15.888085268Z" level=info msg="StartContainer for \"44c0f7cfe8d2c80b950a98a6eba466298d81dc1f8c03a69cb28e2666f6fb51d5\" returns successfully" Mar 25 02:38:16.270565 kubelet[3188]: I0325 02:38:16.270518 3188 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 02:38:16.277415 kubelet[3188]: I0325 02:38:16.277376 3188 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-56n2d" podStartSLOduration=32.277362136 podStartE2EDuration="32.277362136s" podCreationTimestamp="2025-03-25 02:37:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 02:38:16.277333299 +0000 UTC m=+37.176643990" watchObservedRunningTime="2025-03-25 02:38:16.277362136 +0000 UTC m=+37.176672828" Mar 25 02:38:16.513913 systemd-networkd[1728]: cali33c5b70dbae: Gained IPv6LL Mar 25 02:38:16.865077 sshd[5594]: Invalid user sunpei from 103.63.108.25 port 56894 Mar 25 02:38:17.064532 systemd[1]: Started sshd@17-147.75.90.239:22-41.73.244.116:41552.service - OpenSSH per-connection server daemon (41.73.244.116:41552). 
Mar 25 02:38:17.075630 sshd[5594]: Received disconnect from 103.63.108.25 port 56894:11: Bye Bye [preauth] Mar 25 02:38:17.075630 sshd[5594]: Disconnected from invalid user sunpei 103.63.108.25 port 56894 [preauth] Mar 25 02:38:17.083987 systemd[1]: sshd@16-147.75.90.239:22-103.63.108.25:56894.service: Deactivated successfully. Mar 25 02:38:18.132095 containerd[1819]: time="2025-03-25T02:38:18.132070070Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:38:18.132368 containerd[1819]: time="2025-03-25T02:38:18.132341636Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.2: active requests=0, bytes read=34792912" Mar 25 02:38:18.132661 containerd[1819]: time="2025-03-25T02:38:18.132651259Z" level=info msg="ImageCreate event name:\"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:38:18.133456 containerd[1819]: time="2025-03-25T02:38:18.133444934Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:38:18.133831 containerd[1819]: time="2025-03-25T02:38:18.133818397Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" with image id \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\", size \"36285984\" in 2.284560172s" Mar 25 02:38:18.133855 containerd[1819]: time="2025-03-25T02:38:18.133836267Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" returns image reference 
\"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\"" Mar 25 02:38:18.134334 containerd[1819]: time="2025-03-25T02:38:18.134325011Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\"" Mar 25 02:38:18.137139 containerd[1819]: time="2025-03-25T02:38:18.137123923Z" level=info msg="CreateContainer within sandbox \"849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 25 02:38:18.139723 containerd[1819]: time="2025-03-25T02:38:18.139683788Z" level=info msg="Container 424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d: CDI devices from CRI Config.CDIDevices: []" Mar 25 02:38:18.142516 containerd[1819]: time="2025-03-25T02:38:18.142472859Z" level=info msg="CreateContainer within sandbox \"849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d\"" Mar 25 02:38:18.142699 containerd[1819]: time="2025-03-25T02:38:18.142687321Z" level=info msg="StartContainer for \"424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d\"" Mar 25 02:38:18.143243 containerd[1819]: time="2025-03-25T02:38:18.143232598Z" level=info msg="connecting to shim 424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d" address="unix:///run/containerd/s/78a9a8031276648f6af1f24324455b5bdc8872605d8826c428f023e54871362e" protocol=ttrpc version=3 Mar 25 02:38:18.171875 systemd[1]: Started cri-containerd-424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d.scope - libcontainer container 424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d. 
Mar 25 02:38:18.220884 containerd[1819]: time="2025-03-25T02:38:18.220825058Z" level=info msg="StartContainer for \"424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d\" returns successfully" Mar 25 02:38:18.286035 kubelet[3188]: I0325 02:38:18.285987 3188 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5fcdbb9f57-psbcp" podStartSLOduration=24.545476171 podStartE2EDuration="28.28596983s" podCreationTimestamp="2025-03-25 02:37:50 +0000 UTC" firstStartedPulling="2025-03-25 02:38:14.393779563 +0000 UTC m=+35.293090242" lastFinishedPulling="2025-03-25 02:38:18.134273218 +0000 UTC m=+39.033583901" observedRunningTime="2025-03-25 02:38:18.285942154 +0000 UTC m=+39.185252841" watchObservedRunningTime="2025-03-25 02:38:18.28596983 +0000 UTC m=+39.185280514" Mar 25 02:38:19.005869 sshd[5635]: Invalid user team from 41.73.244.116 port 41552 Mar 25 02:38:19.329801 containerd[1819]: time="2025-03-25T02:38:19.329714585Z" level=info msg="TaskExit event in podsandbox handler container_id:\"424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d\" id:\"c052aa97eeae023b46faa6fd1bfe7e071ae370f0aaf5271eb8f9dfe94f53ac9c\" pid:5707 exited_at:{seconds:1742870299 nanos:329471312}" Mar 25 02:38:19.379967 sshd[5635]: Received disconnect from 41.73.244.116 port 41552:11: Bye Bye [preauth] Mar 25 02:38:19.379967 sshd[5635]: Disconnected from invalid user team 41.73.244.116 port 41552 [preauth] Mar 25 02:38:19.383332 systemd[1]: sshd@17-147.75.90.239:22-41.73.244.116:41552.service: Deactivated successfully. 
Mar 25 02:38:19.914842 containerd[1819]: time="2025-03-25T02:38:19.914817721Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:38:19.915191 containerd[1819]: time="2025-03-25T02:38:19.915145402Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13986843" Mar 25 02:38:19.915635 containerd[1819]: time="2025-03-25T02:38:19.915618890Z" level=info msg="ImageCreate event name:\"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:38:19.916613 containerd[1819]: time="2025-03-25T02:38:19.916598325Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:38:19.916973 containerd[1819]: time="2025-03-25T02:38:19.916959444Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" with image id \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"15479899\" in 1.782619517s" Mar 25 02:38:19.917018 containerd[1819]: time="2025-03-25T02:38:19.916974974Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\"" Mar 25 02:38:19.918050 containerd[1819]: time="2025-03-25T02:38:19.918032046Z" level=info msg="CreateContainer within sandbox \"4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b\" for container 
&ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 25 02:38:19.921672 containerd[1819]: time="2025-03-25T02:38:19.921652488Z" level=info msg="Container d52270326e36ae019d1944685fa44a7c013865dfb668d3d982460a92ad187a21: CDI devices from CRI Config.CDIDevices: []" Mar 25 02:38:19.926040 containerd[1819]: time="2025-03-25T02:38:19.925996479Z" level=info msg="CreateContainer within sandbox \"4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"d52270326e36ae019d1944685fa44a7c013865dfb668d3d982460a92ad187a21\"" Mar 25 02:38:19.926388 containerd[1819]: time="2025-03-25T02:38:19.926373279Z" level=info msg="StartContainer for \"d52270326e36ae019d1944685fa44a7c013865dfb668d3d982460a92ad187a21\"" Mar 25 02:38:19.927385 containerd[1819]: time="2025-03-25T02:38:19.927367798Z" level=info msg="connecting to shim d52270326e36ae019d1944685fa44a7c013865dfb668d3d982460a92ad187a21" address="unix:///run/containerd/s/621a8b74e2081283d7ede1d52f85c1645a4b0bcc9787de1bfb11f1bfe34fbf03" protocol=ttrpc version=3 Mar 25 02:38:19.944799 systemd[1]: Started cri-containerd-d52270326e36ae019d1944685fa44a7c013865dfb668d3d982460a92ad187a21.scope - libcontainer container d52270326e36ae019d1944685fa44a7c013865dfb668d3d982460a92ad187a21. 
Mar 25 02:38:19.966440 containerd[1819]: time="2025-03-25T02:38:19.966417122Z" level=info msg="StartContainer for \"d52270326e36ae019d1944685fa44a7c013865dfb668d3d982460a92ad187a21\" returns successfully" Mar 25 02:38:20.191272 kubelet[3188]: I0325 02:38:20.191058 3188 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 25 02:38:20.191272 kubelet[3188]: I0325 02:38:20.191126 3188 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 25 02:38:20.316506 kubelet[3188]: I0325 02:38:20.316392 3188 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-84cjd" podStartSLOduration=24.653170009 podStartE2EDuration="30.316355379s" podCreationTimestamp="2025-03-25 02:37:50 +0000 UTC" firstStartedPulling="2025-03-25 02:38:14.254172636 +0000 UTC m=+35.153483320" lastFinishedPulling="2025-03-25 02:38:19.917358007 +0000 UTC m=+40.816668690" observedRunningTime="2025-03-25 02:38:20.315848611 +0000 UTC m=+41.215159386" watchObservedRunningTime="2025-03-25 02:38:20.316355379 +0000 UTC m=+41.215666111" Mar 25 02:38:30.887438 systemd[1]: Started sshd@18-147.75.90.239:22-189.8.108.39:56174.service - OpenSSH per-connection server daemon (189.8.108.39:56174). Mar 25 02:38:31.945184 sshd[5774]: Invalid user seekcy from 189.8.108.39 port 56174 Mar 25 02:38:32.147522 sshd[5774]: Received disconnect from 189.8.108.39 port 56174:11: Bye Bye [preauth] Mar 25 02:38:32.147522 sshd[5774]: Disconnected from invalid user seekcy 189.8.108.39 port 56174 [preauth] Mar 25 02:38:32.150978 systemd[1]: sshd@18-147.75.90.239:22-189.8.108.39:56174.service: Deactivated successfully. Mar 25 02:38:38.956906 systemd[1]: Started sshd@19-147.75.90.239:22-103.31.39.72:45058.service - OpenSSH per-connection server daemon (103.31.39.72:45058). 
Mar 25 02:38:39.676101 containerd[1819]: time="2025-03-25T02:38:39.676042848Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6091df3589d8c62afe7769ac0707ff9c581a0d6baa929d76ed2da64aa5fb00ca\" id:\"109340b2c23aec030b0c5c1219a19d6a3be8bd78f87ee3eeebd6c0b39b8994e0\" pid:5796 exited_at:{seconds:1742870319 nanos:675838499}" Mar 25 02:38:39.761237 kubelet[3188]: I0325 02:38:39.761103 3188 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 02:38:40.080204 sshd[5781]: Invalid user bhartman from 103.31.39.72 port 45058 Mar 25 02:38:40.287434 sshd[5781]: Received disconnect from 103.31.39.72 port 45058:11: Bye Bye [preauth] Mar 25 02:38:40.287434 sshd[5781]: Disconnected from invalid user bhartman 103.31.39.72 port 45058 [preauth] Mar 25 02:38:40.290805 systemd[1]: sshd@19-147.75.90.239:22-103.31.39.72:45058.service: Deactivated successfully. Mar 25 02:38:44.697871 kubelet[3188]: I0325 02:38:44.697793 3188 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 02:38:47.091323 containerd[1819]: time="2025-03-25T02:38:47.091262085Z" level=info msg="TaskExit event in podsandbox handler container_id:\"424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d\" id:\"737eb00cd62d5f41f4a28f9e68656598a99ab9e7f35678772909c3b8b66d0c8f\" pid:5834 exited_at:{seconds:1742870327 nanos:91102870}" Mar 25 02:38:49.382059 containerd[1819]: time="2025-03-25T02:38:49.382022855Z" level=info msg="TaskExit event in podsandbox handler container_id:\"424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d\" id:\"216d4eda408a45cfce9a96ab466fcca62275797795684bd6391e483a0f6b1957\" pid:5863 exited_at:{seconds:1742870329 nanos:381808145}" Mar 25 02:38:50.733554 systemd[1]: Started sshd@20-147.75.90.239:22-160.30.159.175:45458.service - OpenSSH per-connection server daemon (160.30.159.175:45458). 
Mar 25 02:38:51.821538 sshd[5875]: Invalid user zi from 160.30.159.175 port 45458 Mar 25 02:38:52.008611 sshd[5875]: Received disconnect from 160.30.159.175 port 45458:11: Bye Bye [preauth] Mar 25 02:38:52.008611 sshd[5875]: Disconnected from invalid user zi 160.30.159.175 port 45458 [preauth] Mar 25 02:38:52.011937 systemd[1]: sshd@20-147.75.90.239:22-160.30.159.175:45458.service: Deactivated successfully. Mar 25 02:39:09.680541 containerd[1819]: time="2025-03-25T02:39:09.680490305Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6091df3589d8c62afe7769ac0707ff9c581a0d6baa929d76ed2da64aa5fb00ca\" id:\"0623032ab64d6def08348099aa2ee442dbd5bbdc3da6b9f384c3bf4db5eac6d1\" pid:5892 exited_at:{seconds:1742870349 nanos:680165620}" Mar 25 02:39:19.358189 containerd[1819]: time="2025-03-25T02:39:19.358157691Z" level=info msg="TaskExit event in podsandbox handler container_id:\"424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d\" id:\"cfd9162b0e14b9ea4d4a48835a641e137cf1945d577acb0748cbabb3aa119c13\" pid:5924 exited_at:{seconds:1742870359 nanos:357965326}" Mar 25 02:39:35.482937 systemd[1]: Started sshd@21-147.75.90.239:22-103.63.108.25:54974.service - OpenSSH per-connection server daemon (103.63.108.25:54974). Mar 25 02:39:36.593782 sshd[5943]: Invalid user perforce from 103.63.108.25 port 54974 Mar 25 02:39:36.806791 sshd[5943]: Received disconnect from 103.63.108.25 port 54974:11: Bye Bye [preauth] Mar 25 02:39:36.806791 sshd[5943]: Disconnected from invalid user perforce 103.63.108.25 port 54974 [preauth] Mar 25 02:39:36.810126 systemd[1]: sshd@21-147.75.90.239:22-103.63.108.25:54974.service: Deactivated successfully. 
Mar 25 02:39:39.716622 containerd[1819]: time="2025-03-25T02:39:39.716586233Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6091df3589d8c62afe7769ac0707ff9c581a0d6baa929d76ed2da64aa5fb00ca\" id:\"ce41df53c83748908011ad7ef8fb6e989ada70fbd602eb1387cf9aba88ed6023\" pid:5966 exited_at:{seconds:1742870379 nanos:716358232}" Mar 25 02:39:39.779420 systemd[1]: Started sshd@22-147.75.90.239:22-41.73.244.116:49804.service - OpenSSH per-connection server daemon (41.73.244.116:49804). Mar 25 02:39:41.759138 sshd[5985]: Invalid user jaewon from 41.73.244.116 port 49804 Mar 25 02:39:42.137831 sshd[5985]: Received disconnect from 41.73.244.116 port 49804:11: Bye Bye [preauth] Mar 25 02:39:42.137831 sshd[5985]: Disconnected from invalid user jaewon 41.73.244.116 port 49804 [preauth] Mar 25 02:39:42.140972 systemd[1]: sshd@22-147.75.90.239:22-41.73.244.116:49804.service: Deactivated successfully. Mar 25 02:39:45.005088 systemd[1]: Started sshd@23-147.75.90.239:22-189.8.108.39:55624.service - OpenSSH per-connection server daemon (189.8.108.39:55624). Mar 25 02:39:46.061200 sshd[5993]: Invalid user ple from 189.8.108.39 port 55624 Mar 25 02:39:46.262427 sshd[5993]: Received disconnect from 189.8.108.39 port 55624:11: Bye Bye [preauth] Mar 25 02:39:46.262427 sshd[5993]: Disconnected from invalid user ple 189.8.108.39 port 55624 [preauth] Mar 25 02:39:46.265742 systemd[1]: sshd@23-147.75.90.239:22-189.8.108.39:55624.service: Deactivated successfully. 
Mar 25 02:39:47.036411 containerd[1819]: time="2025-03-25T02:39:47.036386085Z" level=info msg="TaskExit event in podsandbox handler container_id:\"424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d\" id:\"eea6fc014b97b558017562655f84ac57277cffc02b66bc5d712d44d599efeefe\" pid:6024 exited_at:{seconds:1742870387 nanos:36214861}" Mar 25 02:39:49.380426 containerd[1819]: time="2025-03-25T02:39:49.380396719Z" level=info msg="TaskExit event in podsandbox handler container_id:\"424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d\" id:\"6b63258a7a41a4a05d6de8edc2a77efd8498094ba5147fb04ec8f180ccd13c43\" pid:6046 exited_at:{seconds:1742870389 nanos:380227544}" Mar 25 02:39:59.419881 systemd[1]: Started sshd@24-147.75.90.239:22-103.31.39.72:38582.service - OpenSSH per-connection server daemon (103.31.39.72:38582). Mar 25 02:40:00.476962 sshd[6057]: Invalid user public from 103.31.39.72 port 38582 Mar 25 02:40:00.673824 sshd[6057]: Received disconnect from 103.31.39.72 port 38582:11: Bye Bye [preauth] Mar 25 02:40:00.673824 sshd[6057]: Disconnected from invalid user public 103.31.39.72 port 38582 [preauth] Mar 25 02:40:00.674710 systemd[1]: sshd@24-147.75.90.239:22-103.31.39.72:38582.service: Deactivated successfully. 
Mar 25 02:40:09.675203 containerd[1819]: time="2025-03-25T02:40:09.675111765Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6091df3589d8c62afe7769ac0707ff9c581a0d6baa929d76ed2da64aa5fb00ca\" id:\"e48285bfb84fd46fb150d4a851a259423748fc5d21833ff3f345e6cb5b16d29d\" pid:6077 exited_at:{seconds:1742870409 nanos:674895294}" Mar 25 02:40:19.329283 containerd[1819]: time="2025-03-25T02:40:19.329260989Z" level=info msg="TaskExit event in podsandbox handler container_id:\"424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d\" id:\"628cd18d0f2bd9c4ae2ecd17a6e0bd61acd4ec74e0c8d956947cc4a9f3ac66cf\" pid:6109 exited_at:{seconds:1742870419 nanos:329160442}" Mar 25 02:40:32.347360 systemd[1]: Started sshd@25-147.75.90.239:22-160.30.159.175:36330.service - OpenSSH per-connection server daemon (160.30.159.175:36330). Mar 25 02:40:33.428238 sshd[6120]: Invalid user hfsr from 160.30.159.175 port 36330 Mar 25 02:40:33.616462 sshd[6120]: Received disconnect from 160.30.159.175 port 36330:11: Bye Bye [preauth] Mar 25 02:40:33.616462 sshd[6120]: Disconnected from invalid user hfsr 160.30.159.175 port 36330 [preauth] Mar 25 02:40:33.619780 systemd[1]: sshd@25-147.75.90.239:22-160.30.159.175:36330.service: Deactivated successfully. 
Mar 25 02:40:39.690435 containerd[1819]: time="2025-03-25T02:40:39.690399414Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6091df3589d8c62afe7769ac0707ff9c581a0d6baa929d76ed2da64aa5fb00ca\" id:\"23f43425375f5e325366637bc82ac8cb5002bcc727d229cf24a41078b5b34711\" pid:6137 exited_at:{seconds:1742870439 nanos:690123350}" Mar 25 02:40:47.043904 containerd[1819]: time="2025-03-25T02:40:47.043877725Z" level=info msg="TaskExit event in podsandbox handler container_id:\"424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d\" id:\"676e3b7b3d0d707cabf26330817989b93a971c59dfffd816538cb608efea1966\" pid:6167 exited_at:{seconds:1742870447 nanos:43769488}" Mar 25 02:40:49.394893 containerd[1819]: time="2025-03-25T02:40:49.394855950Z" level=info msg="TaskExit event in podsandbox handler container_id:\"424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d\" id:\"ae4958305dc58b18142665e42b40a1f134327eb1d6391187132ae5df0ffd151c\" pid:6194 exited_at:{seconds:1742870449 nanos:394585826}" Mar 25 02:40:55.006989 systemd[1]: Started sshd@26-147.75.90.239:22-103.63.108.25:53066.service - OpenSSH per-connection server daemon (103.63.108.25:53066). Mar 25 02:40:56.056078 sshd[6206]: Invalid user admin from 103.63.108.25 port 53066 Mar 25 02:40:56.251996 sshd[6206]: Received disconnect from 103.63.108.25 port 53066:11: Bye Bye [preauth] Mar 25 02:40:56.251996 sshd[6206]: Disconnected from invalid user admin 103.63.108.25 port 53066 [preauth] Mar 25 02:40:56.255278 systemd[1]: sshd@26-147.75.90.239:22-103.63.108.25:53066.service: Deactivated successfully. Mar 25 02:41:00.174988 systemd[1]: Started sshd@27-147.75.90.239:22-41.73.244.116:56590.service - OpenSSH per-connection server daemon (41.73.244.116:56590). Mar 25 02:41:02.111743 systemd[1]: Started sshd@28-147.75.90.239:22-189.8.108.39:57746.service - OpenSSH per-connection server daemon (189.8.108.39:57746). 
Mar 25 02:41:02.139025 sshd[6213]: Invalid user ple from 41.73.244.116 port 56590 Mar 25 02:41:02.515815 sshd[6213]: Received disconnect from 41.73.244.116 port 56590:11: Bye Bye [preauth] Mar 25 02:41:02.515815 sshd[6213]: Disconnected from invalid user ple 41.73.244.116 port 56590 [preauth] Mar 25 02:41:02.519070 systemd[1]: sshd@27-147.75.90.239:22-41.73.244.116:56590.service: Deactivated successfully. Mar 25 02:41:03.169282 sshd[6216]: Invalid user bhartman from 189.8.108.39 port 57746 Mar 25 02:41:03.367717 sshd[6216]: Received disconnect from 189.8.108.39 port 57746:11: Bye Bye [preauth] Mar 25 02:41:03.367717 sshd[6216]: Disconnected from invalid user bhartman 189.8.108.39 port 57746 [preauth] Mar 25 02:41:03.371071 systemd[1]: sshd@28-147.75.90.239:22-189.8.108.39:57746.service: Deactivated successfully. Mar 25 02:41:09.702650 containerd[1819]: time="2025-03-25T02:41:09.702618540Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6091df3589d8c62afe7769ac0707ff9c581a0d6baa929d76ed2da64aa5fb00ca\" id:\"588b490f93794cea33192c152a13def2cd41359f88246ab60a01036cc2236e3c\" pid:6233 exited_at:{seconds:1742870469 nanos:702283675}" Mar 25 02:41:19.382908 containerd[1819]: time="2025-03-25T02:41:19.382870075Z" level=info msg="TaskExit event in podsandbox handler container_id:\"424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d\" id:\"59e77856812e6c72f3c6c786d14abd29e5b1e96f5a3c32f5ff995c2b0943d697\" pid:6272 exited_at:{seconds:1742870479 nanos:382644892}" Mar 25 02:41:21.647020 systemd[1]: Started sshd@29-147.75.90.239:22-103.31.39.72:39830.service - OpenSSH per-connection server daemon (103.31.39.72:39830). 
Mar 25 02:41:23.180715 sshd[6283]: Invalid user ple from 103.31.39.72 port 39830 Mar 25 02:41:23.472808 sshd[6283]: Received disconnect from 103.31.39.72 port 39830:11: Bye Bye [preauth] Mar 25 02:41:23.472808 sshd[6283]: Disconnected from invalid user ple 103.31.39.72 port 39830 [preauth] Mar 25 02:41:23.476076 systemd[1]: sshd@29-147.75.90.239:22-103.31.39.72:39830.service: Deactivated successfully. Mar 25 02:41:37.177614 update_engine[1805]: I20250325 02:41:37.177532 1805 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Mar 25 02:41:37.177614 update_engine[1805]: I20250325 02:41:37.177557 1805 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Mar 25 02:41:37.177849 update_engine[1805]: I20250325 02:41:37.177656 1805 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Mar 25 02:41:37.177872 update_engine[1805]: I20250325 02:41:37.177865 1805 omaha_request_params.cc:62] Current group set to alpha Mar 25 02:41:37.177961 update_engine[1805]: I20250325 02:41:37.177922 1805 update_attempter.cc:499] Already updated boot flags. Skipping. Mar 25 02:41:37.177961 update_engine[1805]: I20250325 02:41:37.177928 1805 update_attempter.cc:643] Scheduling an action processor start. 
Mar 25 02:41:37.177961 update_engine[1805]: I20250325 02:41:37.177937 1805 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Mar 25 02:41:37.177961 update_engine[1805]: I20250325 02:41:37.177953 1805 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Mar 25 02:41:37.178045 update_engine[1805]: I20250325 02:41:37.177982 1805 omaha_request_action.cc:271] Posting an Omaha request to disabled Mar 25 02:41:37.178045 update_engine[1805]: I20250325 02:41:37.177987 1805 omaha_request_action.cc:272] Request: Mar 25 02:41:37.178045 update_engine[1805]: Mar 25 02:41:37.178045 update_engine[1805]: Mar 25 02:41:37.178045 update_engine[1805]: Mar 25 02:41:37.178045 update_engine[1805]: Mar 25 02:41:37.178045 update_engine[1805]: Mar 25 02:41:37.178045 update_engine[1805]: Mar 25 02:41:37.178045 update_engine[1805]: Mar 25 02:41:37.178045 update_engine[1805]: Mar 25 02:41:37.178045 update_engine[1805]: I20250325 02:41:37.177991 1805 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 25 02:41:37.178205 locksmithd[1856]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Mar 25 02:41:37.178768 update_engine[1805]: I20250325 02:41:37.178730 1805 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 25 02:41:37.178938 update_engine[1805]: I20250325 02:41:37.178898 1805 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Mar 25 02:41:37.179403 update_engine[1805]: E20250325 02:41:37.179353 1805 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 25 02:41:37.179403 update_engine[1805]: I20250325 02:41:37.179387 1805 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Mar 25 02:41:39.678432 containerd[1819]: time="2025-03-25T02:41:39.678407009Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6091df3589d8c62afe7769ac0707ff9c581a0d6baa929d76ed2da64aa5fb00ca\" id:\"2e9103b5410898c0808c1ca13f0843fcac231b99ef1da2f6909da6422be465fd\" pid:6315 exited_at:{seconds:1742870499 nanos:678121758}" Mar 25 02:41:47.078795 containerd[1819]: time="2025-03-25T02:41:47.078737635Z" level=info msg="TaskExit event in podsandbox handler container_id:\"424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d\" id:\"ba066ab673037abc8ad0f067d3255beda691014c5655d91f4324531b6e216585\" pid:6346 exited_at:{seconds:1742870507 nanos:78592159}" Mar 25 02:41:47.093692 update_engine[1805]: I20250325 02:41:47.093647 1805 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 25 02:41:47.093851 update_engine[1805]: I20250325 02:41:47.093786 1805 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 25 02:41:47.094001 update_engine[1805]: I20250325 02:41:47.093958 1805 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Mar 25 02:41:47.094553 update_engine[1805]: E20250325 02:41:47.094506 1805 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 25 02:41:47.094553 update_engine[1805]: I20250325 02:41:47.094544 1805 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Mar 25 02:41:49.322174 containerd[1819]: time="2025-03-25T02:41:49.322150399Z" level=info msg="TaskExit event in podsandbox handler container_id:\"424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d\" id:\"2e70b91bd48ed080de556be3722ff52d6ea2ae47e0608a7f9a32bd7447fc6f6c\" pid:6369 exited_at:{seconds:1742870509 nanos:322038870}" Mar 25 02:41:56.520598 systemd[1]: Started sshd@30-147.75.90.239:22-92.255.85.189:30354.service - OpenSSH per-connection server daemon (92.255.85.189:30354). Mar 25 02:41:57.086289 update_engine[1805]: I20250325 02:41:57.086125 1805 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 25 02:41:57.087196 update_engine[1805]: I20250325 02:41:57.086740 1805 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 25 02:41:57.087485 update_engine[1805]: I20250325 02:41:57.087388 1805 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 25 02:41:57.087940 update_engine[1805]: E20250325 02:41:57.087843 1805 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 25 02:41:57.088148 update_engine[1805]: I20250325 02:41:57.087966 1805 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Mar 25 02:41:58.640340 sshd[6380]: Connection closed by authenticating user nobody 92.255.85.189 port 30354 [preauth] Mar 25 02:41:58.643758 systemd[1]: sshd@30-147.75.90.239:22-92.255.85.189:30354.service: Deactivated successfully. 
Mar 25 02:42:07.089936 update_engine[1805]: I20250325 02:42:07.089793 1805 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 25 02:42:07.091801 update_engine[1805]: I20250325 02:42:07.090361 1805 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 25 02:42:07.091801 update_engine[1805]: I20250325 02:42:07.090955 1805 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 25 02:42:07.091801 update_engine[1805]: E20250325 02:42:07.091429 1805 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 25 02:42:07.091801 update_engine[1805]: I20250325 02:42:07.091600 1805 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Mar 25 02:42:07.091801 update_engine[1805]: I20250325 02:42:07.091645 1805 omaha_request_action.cc:617] Omaha request response: Mar 25 02:42:07.091801 update_engine[1805]: E20250325 02:42:07.091717 1805 omaha_request_action.cc:636] Omaha request network transfer failed. Mar 25 02:42:07.091801 update_engine[1805]: I20250325 02:42:07.091738 1805 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Mar 25 02:42:07.091801 update_engine[1805]: I20250325 02:42:07.091746 1805 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 25 02:42:07.091801 update_engine[1805]: I20250325 02:42:07.091753 1805 update_attempter.cc:306] Processing Done. Mar 25 02:42:07.091801 update_engine[1805]: E20250325 02:42:07.091766 1805 update_attempter.cc:619] Update failed. 
Mar 25 02:42:07.091801 update_engine[1805]: I20250325 02:42:07.091773 1805 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Mar 25 02:42:07.091801 update_engine[1805]: I20250325 02:42:07.091780 1805 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Mar 25 02:42:07.091801 update_engine[1805]: I20250325 02:42:07.091787 1805 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Mar 25 02:42:07.092265 update_engine[1805]: I20250325 02:42:07.091858 1805 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Mar 25 02:42:07.092265 update_engine[1805]: I20250325 02:42:07.091882 1805 omaha_request_action.cc:271] Posting an Omaha request to disabled Mar 25 02:42:07.092265 update_engine[1805]: I20250325 02:42:07.091892 1805 omaha_request_action.cc:272] Request: Mar 25 02:42:07.092265 update_engine[1805]: Mar 25 02:42:07.092265 update_engine[1805]: Mar 25 02:42:07.092265 update_engine[1805]: Mar 25 02:42:07.092265 update_engine[1805]: Mar 25 02:42:07.092265 update_engine[1805]: Mar 25 02:42:07.092265 update_engine[1805]: Mar 25 02:42:07.092265 update_engine[1805]: I20250325 02:42:07.091897 1805 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 25 02:42:07.092265 update_engine[1805]: I20250325 02:42:07.092088 1805 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 25 02:42:07.092646 locksmithd[1856]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Mar 25 02:42:07.092915 update_engine[1805]: I20250325 02:42:07.092307 1805 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Mar 25 02:42:07.092915 update_engine[1805]: E20250325 02:42:07.092698 1805 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 25 02:42:07.092915 update_engine[1805]: I20250325 02:42:07.092757 1805 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Mar 25 02:42:07.092915 update_engine[1805]: I20250325 02:42:07.092770 1805 omaha_request_action.cc:617] Omaha request response: Mar 25 02:42:07.092915 update_engine[1805]: I20250325 02:42:07.092778 1805 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 25 02:42:07.092915 update_engine[1805]: I20250325 02:42:07.092785 1805 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 25 02:42:07.092915 update_engine[1805]: I20250325 02:42:07.092792 1805 update_attempter.cc:306] Processing Done. Mar 25 02:42:07.092915 update_engine[1805]: I20250325 02:42:07.092797 1805 update_attempter.cc:310] Error event sent. Mar 25 02:42:07.092915 update_engine[1805]: I20250325 02:42:07.092808 1805 update_check_scheduler.cc:74] Next update check in 40m7s Mar 25 02:42:07.093220 locksmithd[1856]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Mar 25 02:42:09.678936 containerd[1819]: time="2025-03-25T02:42:09.678876678Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6091df3589d8c62afe7769ac0707ff9c581a0d6baa929d76ed2da64aa5fb00ca\" id:\"83fdf6f26d54bf861497ea40d42ebe2e37fbc81f421e99edc41adbcf6a3b926f\" pid:6397 exited_at:{seconds:1742870529 nanos:678685763}" Mar 25 02:42:15.970050 systemd[1]: Started sshd@31-147.75.90.239:22-103.63.108.25:51316.service - OpenSSH per-connection server daemon (103.63.108.25:51316). 
Mar 25 02:42:17.007275 sshd[6417]: Invalid user morteza from 103.63.108.25 port 51316 Mar 25 02:42:17.130959 systemd[1]: Started sshd@32-147.75.90.239:22-160.30.159.175:38604.service - OpenSSH per-connection server daemon (160.30.159.175:38604). Mar 25 02:42:17.206568 sshd[6417]: Received disconnect from 103.63.108.25 port 51316:11: Bye Bye [preauth] Mar 25 02:42:17.206568 sshd[6417]: Disconnected from invalid user morteza 103.63.108.25 port 51316 [preauth] Mar 25 02:42:17.209843 systemd[1]: sshd@31-147.75.90.239:22-103.63.108.25:51316.service: Deactivated successfully. Mar 25 02:42:18.141349 sshd[6422]: Invalid user ang from 160.30.159.175 port 38604 Mar 25 02:42:18.332167 sshd[6422]: Received disconnect from 160.30.159.175 port 38604:11: Bye Bye [preauth] Mar 25 02:42:18.332167 sshd[6422]: Disconnected from invalid user ang 160.30.159.175 port 38604 [preauth] Mar 25 02:42:18.335538 systemd[1]: sshd@32-147.75.90.239:22-160.30.159.175:38604.service: Deactivated successfully. Mar 25 02:42:18.735687 systemd[1]: Started sshd@33-147.75.90.239:22-189.8.108.39:58174.service - OpenSSH per-connection server daemon (189.8.108.39:58174). Mar 25 02:42:19.380916 containerd[1819]: time="2025-03-25T02:42:19.380885919Z" level=info msg="TaskExit event in podsandbox handler container_id:\"424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d\" id:\"80cccb1424a1e50a7715b97aa2e61354e2c7c7ad3676accac629452a11581e7b\" pid:6443 exited_at:{seconds:1742870539 nanos:380736689}" Mar 25 02:42:19.827881 sshd[6429]: Invalid user souradeep from 189.8.108.39 port 58174 Mar 25 02:42:20.030565 sshd[6429]: Received disconnect from 189.8.108.39 port 58174:11: Bye Bye [preauth] Mar 25 02:42:20.030565 sshd[6429]: Disconnected from invalid user souradeep 189.8.108.39 port 58174 [preauth] Mar 25 02:42:20.034259 systemd[1]: sshd@33-147.75.90.239:22-189.8.108.39:58174.service: Deactivated successfully. 
Mar 25 02:42:24.640335 systemd[1]: Started sshd@34-147.75.90.239:22-41.73.244.116:37440.service - OpenSSH per-connection server daemon (41.73.244.116:37440). Mar 25 02:42:26.580599 sshd[6456]: Invalid user media from 41.73.244.116 port 37440 Mar 25 02:42:26.957916 sshd[6456]: Received disconnect from 41.73.244.116 port 37440:11: Bye Bye [preauth] Mar 25 02:42:26.957916 sshd[6456]: Disconnected from invalid user media 41.73.244.116 port 37440 [preauth] Mar 25 02:42:26.960624 systemd[1]: sshd@34-147.75.90.239:22-41.73.244.116:37440.service: Deactivated successfully. Mar 25 02:42:35.345540 containerd[1819]: time="2025-03-25T02:42:35.345300863Z" level=warning msg="container event discarded" container=9c8d2a8228c0ef71c92ba7a19366940edb94b33a24a6a8170eb6f024e6780338 type=CONTAINER_CREATED_EVENT Mar 25 02:42:35.356914 containerd[1819]: time="2025-03-25T02:42:35.356765499Z" level=warning msg="container event discarded" container=9c8d2a8228c0ef71c92ba7a19366940edb94b33a24a6a8170eb6f024e6780338 type=CONTAINER_STARTED_EVENT Mar 25 02:42:35.373549 containerd[1819]: time="2025-03-25T02:42:35.373397577Z" level=warning msg="container event discarded" container=5c8051c2621d5c81e979a3b79f81d7eb8f010a96a44194e2fa899d08308e55e3 type=CONTAINER_CREATED_EVENT Mar 25 02:42:35.395375 containerd[1819]: time="2025-03-25T02:42:35.395208734Z" level=warning msg="container event discarded" container=0ae21a2aabd6e8d479126c947482647ba187ee17c9e390d21fae21cc300c9389 type=CONTAINER_CREATED_EVENT Mar 25 02:42:35.395375 containerd[1819]: time="2025-03-25T02:42:35.395323484Z" level=warning msg="container event discarded" container=0ae21a2aabd6e8d479126c947482647ba187ee17c9e390d21fae21cc300c9389 type=CONTAINER_STARTED_EVENT Mar 25 02:42:35.395375 containerd[1819]: time="2025-03-25T02:42:35.395371253Z" level=warning msg="container event discarded" container=451fb8aa2b071234e7ff3017b9cde36ca59eae8974baed7c5b7c939149b60423 type=CONTAINER_CREATED_EVENT Mar 25 02:42:35.395903 containerd[1819]: 
time="2025-03-25T02:42:35.395420317Z" level=warning msg="container event discarded" container=451fb8aa2b071234e7ff3017b9cde36ca59eae8974baed7c5b7c939149b60423 type=CONTAINER_STARTED_EVENT Mar 25 02:42:35.395903 containerd[1819]: time="2025-03-25T02:42:35.395460542Z" level=warning msg="container event discarded" container=40d5cd5a48d7baaae7fa7fd264bd02e6dfe7e60f61dd1d188cf2b17d30dcf7bf type=CONTAINER_CREATED_EVENT Mar 25 02:42:35.395903 containerd[1819]: time="2025-03-25T02:42:35.395500952Z" level=warning msg="container event discarded" container=c382d59abf21517243f158a110991dd5851e124f63ddeecc09ecd2d7a5cea1ec type=CONTAINER_CREATED_EVENT Mar 25 02:42:35.420999 containerd[1819]: time="2025-03-25T02:42:35.420854888Z" level=warning msg="container event discarded" container=5c8051c2621d5c81e979a3b79f81d7eb8f010a96a44194e2fa899d08308e55e3 type=CONTAINER_STARTED_EVENT Mar 25 02:42:35.447379 containerd[1819]: time="2025-03-25T02:42:35.447273163Z" level=warning msg="container event discarded" container=40d5cd5a48d7baaae7fa7fd264bd02e6dfe7e60f61dd1d188cf2b17d30dcf7bf type=CONTAINER_STARTED_EVENT Mar 25 02:42:35.447379 containerd[1819]: time="2025-03-25T02:42:35.447356019Z" level=warning msg="container event discarded" container=c382d59abf21517243f158a110991dd5851e124f63ddeecc09ecd2d7a5cea1ec type=CONTAINER_STARTED_EVENT Mar 25 02:42:39.678497 containerd[1819]: time="2025-03-25T02:42:39.678444511Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6091df3589d8c62afe7769ac0707ff9c581a0d6baa929d76ed2da64aa5fb00ca\" id:\"dffa6f3050d055ef0ce6643f4fc946b8ffed6f97ad0ba5edec9d8d0239d40aa7\" pid:6474 exited_at:{seconds:1742870559 nanos:678206563}" Mar 25 02:42:40.963952 systemd[1]: Started sshd@35-147.75.90.239:22-103.31.39.72:34294.service - OpenSSH per-connection server daemon (103.31.39.72:34294). 
Mar 25 02:42:42.030585 sshd[6492]: Invalid user div from 103.31.39.72 port 34294 Mar 25 02:42:42.230034 sshd[6492]: Received disconnect from 103.31.39.72 port 34294:11: Bye Bye [preauth] Mar 25 02:42:42.230034 sshd[6492]: Disconnected from invalid user div 103.31.39.72 port 34294 [preauth] Mar 25 02:42:42.233355 systemd[1]: sshd@35-147.75.90.239:22-103.31.39.72:34294.service: Deactivated successfully. Mar 25 02:42:44.660959 containerd[1819]: time="2025-03-25T02:42:44.660785385Z" level=warning msg="container event discarded" container=517847e68fc0456f8362f33cdfe4f58def6f81de7f980d5f3d01b3837592875d type=CONTAINER_CREATED_EVENT Mar 25 02:42:44.660959 containerd[1819]: time="2025-03-25T02:42:44.660937780Z" level=warning msg="container event discarded" container=517847e68fc0456f8362f33cdfe4f58def6f81de7f980d5f3d01b3837592875d type=CONTAINER_STARTED_EVENT Mar 25 02:42:44.660959 containerd[1819]: time="2025-03-25T02:42:44.660967363Z" level=warning msg="container event discarded" container=ea3fecab24943c844276aa10a1037a07e68c60d82d993d38c56e81134710bd33 type=CONTAINER_CREATED_EVENT Mar 25 02:42:44.757766 containerd[1819]: time="2025-03-25T02:42:44.757687186Z" level=warning msg="container event discarded" container=ea3fecab24943c844276aa10a1037a07e68c60d82d993d38c56e81134710bd33 type=CONTAINER_STARTED_EVENT Mar 25 02:42:44.757766 containerd[1819]: time="2025-03-25T02:42:44.757731421Z" level=warning msg="container event discarded" container=db66b328053df6ad6552874937d39a13eded766c5e16c6e5e5fe95d43910d0b7 type=CONTAINER_CREATED_EVENT Mar 25 02:42:44.757766 containerd[1819]: time="2025-03-25T02:42:44.757741561Z" level=warning msg="container event discarded" container=db66b328053df6ad6552874937d39a13eded766c5e16c6e5e5fe95d43910d0b7 type=CONTAINER_STARTED_EVENT Mar 25 02:42:47.044203 containerd[1819]: time="2025-03-25T02:42:47.044177387Z" level=info msg="TaskExit event in podsandbox handler container_id:\"424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d\" 
id:\"3acdbca5058cb939da1566acf59098563e9d5df1b921db617606f58787cddefe\" pid:6510 exited_at:{seconds:1742870567 nanos:44031526}" Mar 25 02:42:47.696238 containerd[1819]: time="2025-03-25T02:42:47.696120519Z" level=warning msg="container event discarded" container=1f35fa9decb7831284427ca16bd6f48f2fe46d10a2d4e41f9d6870ff656f3ce4 type=CONTAINER_CREATED_EVENT Mar 25 02:42:47.733842 containerd[1819]: time="2025-03-25T02:42:47.733700233Z" level=warning msg="container event discarded" container=1f35fa9decb7831284427ca16bd6f48f2fe46d10a2d4e41f9d6870ff656f3ce4 type=CONTAINER_STARTED_EVENT Mar 25 02:42:49.363433 containerd[1819]: time="2025-03-25T02:42:49.363401695Z" level=info msg="TaskExit event in podsandbox handler container_id:\"424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d\" id:\"fbdacbac437a5cceb6313e331c084197dec24aea194fea14c3ff749dde8933ba\" pid:6537 exited_at:{seconds:1742870569 nanos:363247764}" Mar 25 02:42:50.862270 containerd[1819]: time="2025-03-25T02:42:50.862109917Z" level=warning msg="container event discarded" container=a7a9a9e61787f4ea212d42f7bcff30b7eb5b80c1af314a353b0a4b80d9ca4938 type=CONTAINER_CREATED_EVENT Mar 25 02:42:50.862270 containerd[1819]: time="2025-03-25T02:42:50.862202700Z" level=warning msg="container event discarded" container=a7a9a9e61787f4ea212d42f7bcff30b7eb5b80c1af314a353b0a4b80d9ca4938 type=CONTAINER_STARTED_EVENT Mar 25 02:42:50.862270 containerd[1819]: time="2025-03-25T02:42:50.862237381Z" level=warning msg="container event discarded" container=9a81a81544f352ef4445be84906d49d058763020bb81a0714dc3d9e6de316589 type=CONTAINER_CREATED_EVENT Mar 25 02:42:50.862270 containerd[1819]: time="2025-03-25T02:42:50.862262829Z" level=warning msg="container event discarded" container=9a81a81544f352ef4445be84906d49d058763020bb81a0714dc3d9e6de316589 type=CONTAINER_STARTED_EVENT Mar 25 02:42:52.476264 containerd[1819]: time="2025-03-25T02:42:52.476116351Z" level=warning msg="container event discarded" 
container=0d0f48863e5af7875492191d5a8482e4ea7fc3974ac1364c9d301ddcd64ce0a0 type=CONTAINER_CREATED_EVENT Mar 25 02:42:52.515891 containerd[1819]: time="2025-03-25T02:42:52.515706416Z" level=warning msg="container event discarded" container=0d0f48863e5af7875492191d5a8482e4ea7fc3974ac1364c9d301ddcd64ce0a0 type=CONTAINER_STARTED_EVENT Mar 25 02:42:52.739079 containerd[1819]: time="2025-03-25T02:42:52.738800317Z" level=warning msg="container event discarded" container=0d0f48863e5af7875492191d5a8482e4ea7fc3974ac1364c9d301ddcd64ce0a0 type=CONTAINER_STOPPED_EVENT Mar 25 02:42:55.184661 containerd[1819]: time="2025-03-25T02:42:55.184535090Z" level=warning msg="container event discarded" container=d27a06727ed437b45fdeb63da62f856f535df7f7b68bf63afa7defd6713686e6 type=CONTAINER_CREATED_EVENT Mar 25 02:42:55.242273 containerd[1819]: time="2025-03-25T02:42:55.242119725Z" level=warning msg="container event discarded" container=d27a06727ed437b45fdeb63da62f856f535df7f7b68bf63afa7defd6713686e6 type=CONTAINER_STARTED_EVENT Mar 25 02:42:58.867947 containerd[1819]: time="2025-03-25T02:42:58.867757547Z" level=warning msg="container event discarded" container=033ee3654c7bea15d2de09c8da89c2ada1902398c2d6c13763147c64e1a42666 type=CONTAINER_CREATED_EVENT Mar 25 02:42:58.914465 containerd[1819]: time="2025-03-25T02:42:58.914324130Z" level=warning msg="container event discarded" container=033ee3654c7bea15d2de09c8da89c2ada1902398c2d6c13763147c64e1a42666 type=CONTAINER_STARTED_EVENT Mar 25 02:43:00.140276 containerd[1819]: time="2025-03-25T02:43:00.140034828Z" level=warning msg="container event discarded" container=033ee3654c7bea15d2de09c8da89c2ada1902398c2d6c13763147c64e1a42666 type=CONTAINER_STOPPED_EVENT Mar 25 02:43:05.467433 containerd[1819]: time="2025-03-25T02:43:05.467184530Z" level=warning msg="container event discarded" container=6091df3589d8c62afe7769ac0707ff9c581a0d6baa929d76ed2da64aa5fb00ca type=CONTAINER_CREATED_EVENT Mar 25 02:43:05.520914 containerd[1819]: 
time="2025-03-25T02:43:05.520769151Z" level=warning msg="container event discarded" container=6091df3589d8c62afe7769ac0707ff9c581a0d6baa929d76ed2da64aa5fb00ca type=CONTAINER_STARTED_EVENT Mar 25 02:43:09.686474 containerd[1819]: time="2025-03-25T02:43:09.686446213Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6091df3589d8c62afe7769ac0707ff9c581a0d6baa929d76ed2da64aa5fb00ca\" id:\"c7b3a185d712f8b0b47f39ffa0a8386f558acd445f7abd8a8f3dfd5ef8f27b21\" pid:6574 exited_at:{seconds:1742870589 nanos:686244910}" Mar 25 02:43:11.278975 containerd[1819]: time="2025-03-25T02:43:11.278786159Z" level=warning msg="container event discarded" container=6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725 type=CONTAINER_CREATED_EVENT Mar 25 02:43:11.278975 containerd[1819]: time="2025-03-25T02:43:11.278924361Z" level=warning msg="container event discarded" container=6bb8301b914748577461ac38e4bad06b3aaab50396ff36827b77f72358320725 type=CONTAINER_STARTED_EVENT Mar 25 02:43:11.500870 containerd[1819]: time="2025-03-25T02:43:11.500709398Z" level=warning msg="container event discarded" container=c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699 type=CONTAINER_CREATED_EVENT Mar 25 02:43:11.500870 containerd[1819]: time="2025-03-25T02:43:11.500796358Z" level=warning msg="container event discarded" container=c0e1ca40172bf65cc95041b7427d59499ea4153ee5cd3997df0b2df11e8ef699 type=CONTAINER_STARTED_EVENT Mar 25 02:43:12.276460 containerd[1819]: time="2025-03-25T02:43:12.276355633Z" level=warning msg="container event discarded" container=418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a type=CONTAINER_CREATED_EVENT Mar 25 02:43:12.276460 containerd[1819]: time="2025-03-25T02:43:12.276443473Z" level=warning msg="container event discarded" container=418c3030c5aaefe59fcbb04d3220c7f86fa381c913f6b1c61f2302e42fbcdf3a type=CONTAINER_STARTED_EVENT Mar 25 02:43:12.276460 containerd[1819]: time="2025-03-25T02:43:12.276473859Z" level=warning 
msg="container event discarded" container=8eb15cbca15a4f76470ed982c40f545813d625ceccf9fc7f20c4e1f4af529551 type=CONTAINER_CREATED_EVENT Mar 25 02:43:12.351922 containerd[1819]: time="2025-03-25T02:43:12.351791413Z" level=warning msg="container event discarded" container=8eb15cbca15a4f76470ed982c40f545813d625ceccf9fc7f20c4e1f4af529551 type=CONTAINER_STARTED_EVENT Mar 25 02:43:13.805821 containerd[1819]: time="2025-03-25T02:43:13.805686532Z" level=warning msg="container event discarded" container=17ee943974f228681dde969514030cde45aead08b532417f2c6afcff19ccc809 type=CONTAINER_CREATED_EVENT Mar 25 02:43:13.856296 containerd[1819]: time="2025-03-25T02:43:13.856184115Z" level=warning msg="container event discarded" container=17ee943974f228681dde969514030cde45aead08b532417f2c6afcff19ccc809 type=CONTAINER_STARTED_EVENT Mar 25 02:43:14.223840 containerd[1819]: time="2025-03-25T02:43:14.223711296Z" level=warning msg="container event discarded" container=a743079ca52daa4cf2bbc9a3ce100bc9594ee125a8129c12350614d98259668b type=CONTAINER_CREATED_EVENT Mar 25 02:43:14.264109 containerd[1819]: time="2025-03-25T02:43:14.263998930Z" level=warning msg="container event discarded" container=4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b type=CONTAINER_CREATED_EVENT Mar 25 02:43:14.264109 containerd[1819]: time="2025-03-25T02:43:14.264087294Z" level=warning msg="container event discarded" container=4a7477ca7b1d92063599c7a4889afa449864c79c87052dd03b57163fcd1cab1b type=CONTAINER_STARTED_EVENT Mar 25 02:43:14.277703 containerd[1819]: time="2025-03-25T02:43:14.277546086Z" level=warning msg="container event discarded" container=a743079ca52daa4cf2bbc9a3ce100bc9594ee125a8129c12350614d98259668b type=CONTAINER_STARTED_EVENT Mar 25 02:43:14.404430 containerd[1819]: time="2025-03-25T02:43:14.404319492Z" level=warning msg="container event discarded" container=849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68 type=CONTAINER_CREATED_EVENT Mar 25 02:43:14.404430 
containerd[1819]: time="2025-03-25T02:43:14.404414817Z" level=warning msg="container event discarded" container=849aa2e1f9d4227197feb1b571276ff570d00cf2901254003652bf8201ccfc68 type=CONTAINER_STARTED_EVENT Mar 25 02:43:15.283593 containerd[1819]: time="2025-03-25T02:43:15.283477972Z" level=warning msg="container event discarded" container=86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435 type=CONTAINER_CREATED_EVENT Mar 25 02:43:15.283593 containerd[1819]: time="2025-03-25T02:43:15.283569993Z" level=warning msg="container event discarded" container=86e770a85f46325b06f9d77642b9dfcb08420f914934a0cdada0ae6fab715435 type=CONTAINER_STARTED_EVENT Mar 25 02:43:15.284669 containerd[1819]: time="2025-03-25T02:43:15.283618970Z" level=warning msg="container event discarded" container=09ac4cf426061b249584c47064f3bfbab0b2f490c9848812d37bed735ea7eb1a type=CONTAINER_CREATED_EVENT Mar 25 02:43:15.325180 containerd[1819]: time="2025-03-25T02:43:15.325056325Z" level=warning msg="container event discarded" container=09ac4cf426061b249584c47064f3bfbab0b2f490c9848812d37bed735ea7eb1a type=CONTAINER_STARTED_EVENT Mar 25 02:43:15.866734 containerd[1819]: time="2025-03-25T02:43:15.866554795Z" level=warning msg="container event discarded" container=44c0f7cfe8d2c80b950a98a6eba466298d81dc1f8c03a69cb28e2666f6fb51d5 type=CONTAINER_CREATED_EVENT Mar 25 02:43:15.898284 containerd[1819]: time="2025-03-25T02:43:15.898097564Z" level=warning msg="container event discarded" container=44c0f7cfe8d2c80b950a98a6eba466298d81dc1f8c03a69cb28e2666f6fb51d5 type=CONTAINER_STARTED_EVENT Mar 25 02:43:18.152390 containerd[1819]: time="2025-03-25T02:43:18.152219798Z" level=warning msg="container event discarded" container=424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d type=CONTAINER_CREATED_EVENT Mar 25 02:43:18.230917 containerd[1819]: time="2025-03-25T02:43:18.230851586Z" level=warning msg="container event discarded" 
container=424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d type=CONTAINER_STARTED_EVENT Mar 25 02:43:19.377247 containerd[1819]: time="2025-03-25T02:43:19.377220715Z" level=info msg="TaskExit event in podsandbox handler container_id:\"424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d\" id:\"1fa2648606cadf8e39f2abf0de4d81b36c13c34c1faf3b3ce69a3638479a1d94\" pid:6604 exited_at:{seconds:1742870599 nanos:377033882}" Mar 25 02:43:19.936053 containerd[1819]: time="2025-03-25T02:43:19.935935734Z" level=warning msg="container event discarded" container=d52270326e36ae019d1944685fa44a7c013865dfb668d3d982460a92ad187a21 type=CONTAINER_CREATED_EVENT Mar 25 02:43:19.976291 containerd[1819]: time="2025-03-25T02:43:19.976250568Z" level=warning msg="container event discarded" container=d52270326e36ae019d1944685fa44a7c013865dfb668d3d982460a92ad187a21 type=CONTAINER_STARTED_EVENT Mar 25 02:43:38.825384 systemd[1]: Started sshd@36-147.75.90.239:22-189.8.108.39:55650.service - OpenSSH per-connection server daemon (189.8.108.39:55650). Mar 25 02:43:39.688378 containerd[1819]: time="2025-03-25T02:43:39.688336056Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6091df3589d8c62afe7769ac0707ff9c581a0d6baa929d76ed2da64aa5fb00ca\" id:\"1b941166a14aae9c5e8863ecce90999be06651444cc32c45d1600c3cbf517e27\" pid:6639 exited_at:{seconds:1742870619 nanos:688099819}" Mar 25 02:43:39.891242 sshd[6623]: Invalid user arena from 189.8.108.39 port 55650 Mar 25 02:43:40.090126 sshd[6623]: Received disconnect from 189.8.108.39 port 55650:11: Bye Bye [preauth] Mar 25 02:43:40.090126 sshd[6623]: Disconnected from invalid user arena 189.8.108.39 port 55650 [preauth] Mar 25 02:43:40.093515 systemd[1]: sshd@36-147.75.90.239:22-189.8.108.39:55650.service: Deactivated successfully. Mar 25 02:43:41.107148 systemd[1]: Started sshd@37-147.75.90.239:22-103.63.108.25:49612.service - OpenSSH per-connection server daemon (103.63.108.25:49612). 
Mar 25 02:43:42.118621 sshd[6662]: Invalid user testyl from 103.63.108.25 port 49612 Mar 25 02:43:42.308314 sshd[6662]: Received disconnect from 103.63.108.25 port 49612:11: Bye Bye [preauth] Mar 25 02:43:42.308314 sshd[6662]: Disconnected from invalid user testyl 103.63.108.25 port 49612 [preauth] Mar 25 02:43:42.313121 systemd[1]: sshd@37-147.75.90.239:22-103.63.108.25:49612.service: Deactivated successfully. Mar 25 02:43:47.039871 containerd[1819]: time="2025-03-25T02:43:47.039850086Z" level=info msg="TaskExit event in podsandbox handler container_id:\"424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d\" id:\"2a243e173e26220686f7d804fa26a79ffaeac94d6046b9f224dfc40b5c5d8c09\" pid:6680 exited_at:{seconds:1742870627 nanos:39753888}" Mar 25 02:43:49.367269 containerd[1819]: time="2025-03-25T02:43:49.367235477Z" level=info msg="TaskExit event in podsandbox handler container_id:\"424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d\" id:\"2665ef07f572822d06eb031e199af47d9f0a8e9008bfcccce66222c7eaff24b8\" pid:6702 exited_at:{seconds:1742870629 nanos:367058786}" Mar 25 02:43:51.393900 systemd[1]: Started sshd@38-147.75.90.239:22-41.73.244.116:35370.service - OpenSSH per-connection server daemon (41.73.244.116:35370). Mar 25 02:43:53.357990 sshd[6713]: Invalid user souradeep from 41.73.244.116 port 35370 Mar 25 02:43:53.732923 sshd[6713]: Received disconnect from 41.73.244.116 port 35370:11: Bye Bye [preauth] Mar 25 02:43:53.732923 sshd[6713]: Disconnected from invalid user souradeep 41.73.244.116 port 35370 [preauth] Mar 25 02:43:53.737501 systemd[1]: sshd@38-147.75.90.239:22-41.73.244.116:35370.service: Deactivated successfully. Mar 25 02:44:03.365872 systemd[1]: Started sshd@39-147.75.90.239:22-160.30.159.175:35808.service - OpenSSH per-connection server daemon (160.30.159.175:35808). 
Mar 25 02:44:04.550823 sshd[6718]: Invalid user t128 from 160.30.159.175 port 35808 Mar 25 02:44:04.746676 sshd[6718]: Received disconnect from 160.30.159.175 port 35808:11: Bye Bye [preauth] Mar 25 02:44:04.746676 sshd[6718]: Disconnected from invalid user t128 160.30.159.175 port 35808 [preauth] Mar 25 02:44:04.750003 systemd[1]: sshd@39-147.75.90.239:22-160.30.159.175:35808.service: Deactivated successfully. Mar 25 02:44:08.996410 systemd[1]: Started sshd@40-147.75.90.239:22-103.31.39.72:56126.service - OpenSSH per-connection server daemon (103.31.39.72:56126). Mar 25 02:44:09.687188 containerd[1819]: time="2025-03-25T02:44:09.687154832Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6091df3589d8c62afe7769ac0707ff9c581a0d6baa929d76ed2da64aa5fb00ca\" id:\"b71652a863ff8d222b500c8eb94ba89eb8e326b5ec74d4ac79a59b41707b5023\" pid:6736 exited_at:{seconds:1742870649 nanos:686893287}" Mar 25 02:44:10.555884 sshd[6723]: Invalid user rstudio-server from 103.31.39.72 port 56126 Mar 25 02:44:10.846321 sshd[6723]: Received disconnect from 103.31.39.72 port 56126:11: Bye Bye [preauth] Mar 25 02:44:10.846321 sshd[6723]: Disconnected from invalid user rstudio-server 103.31.39.72 port 56126 [preauth] Mar 25 02:44:10.849695 systemd[1]: sshd@40-147.75.90.239:22-103.31.39.72:56126.service: Deactivated successfully. Mar 25 02:44:18.200378 systemd[1]: Started sshd@41-147.75.90.239:22-139.178.68.195:55510.service - OpenSSH per-connection server daemon (139.178.68.195:55510). Mar 25 02:44:18.291642 sshd[6761]: Accepted publickey for core from 139.178.68.195 port 55510 ssh2: RSA SHA256:uTfG5sovPUqPrARqt2owfRXVYFyJtX+vlYONPCrvLPw Mar 25 02:44:18.292273 sshd-session[6761]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 02:44:18.295264 systemd-logind[1800]: New session 12 of user core. Mar 25 02:44:18.320830 systemd[1]: Started session-12.scope - Session 12 of User core. 
Mar 25 02:44:18.464470 sshd[6763]: Connection closed by 139.178.68.195 port 55510
Mar 25 02:44:18.464588 sshd-session[6761]: pam_unix(sshd:session): session closed for user core
Mar 25 02:44:18.466342 systemd[1]: sshd@41-147.75.90.239:22-139.178.68.195:55510.service: Deactivated successfully.
Mar 25 02:44:18.467300 systemd[1]: session-12.scope: Deactivated successfully.
Mar 25 02:44:18.468052 systemd-logind[1800]: Session 12 logged out. Waiting for processes to exit.
Mar 25 02:44:18.468543 systemd-logind[1800]: Removed session 12.
Mar 25 02:44:19.372740 containerd[1819]: time="2025-03-25T02:44:19.372583443Z" level=info msg="TaskExit event in podsandbox handler container_id:\"424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d\" id:\"3488e13a5f7ae61ed0f075f0877866e322b2a7291a1ec59cbf6afa329ee08758\" pid:6801 exited_at:{seconds:1742870659 nanos:372257553}"
Mar 25 02:44:23.485842 systemd[1]: Started sshd@42-147.75.90.239:22-139.178.68.195:55512.service - OpenSSH per-connection server daemon (139.178.68.195:55512).
Mar 25 02:44:23.513447 sshd[6819]: Accepted publickey for core from 139.178.68.195 port 55512 ssh2: RSA SHA256:uTfG5sovPUqPrARqt2owfRXVYFyJtX+vlYONPCrvLPw
Mar 25 02:44:23.514109 sshd-session[6819]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:44:23.516938 systemd-logind[1800]: New session 13 of user core.
Mar 25 02:44:23.537815 systemd[1]: Started session-13.scope - Session 13 of User core.
Mar 25 02:44:23.625787 sshd[6821]: Connection closed by 139.178.68.195 port 55512
Mar 25 02:44:23.625965 sshd-session[6819]: pam_unix(sshd:session): session closed for user core
Mar 25 02:44:23.627493 systemd[1]: sshd@42-147.75.90.239:22-139.178.68.195:55512.service: Deactivated successfully.
Mar 25 02:44:23.628489 systemd[1]: session-13.scope: Deactivated successfully.
Mar 25 02:44:23.629239 systemd-logind[1800]: Session 13 logged out. Waiting for processes to exit.
Mar 25 02:44:23.629829 systemd-logind[1800]: Removed session 13.
Mar 25 02:44:28.641616 systemd[1]: Started sshd@43-147.75.90.239:22-139.178.68.195:54998.service - OpenSSH per-connection server daemon (139.178.68.195:54998).
Mar 25 02:44:28.669331 sshd[6862]: Accepted publickey for core from 139.178.68.195 port 54998 ssh2: RSA SHA256:uTfG5sovPUqPrARqt2owfRXVYFyJtX+vlYONPCrvLPw
Mar 25 02:44:28.669951 sshd-session[6862]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:44:28.672636 systemd-logind[1800]: New session 14 of user core.
Mar 25 02:44:28.692882 systemd[1]: Started session-14.scope - Session 14 of User core.
Mar 25 02:44:28.779890 sshd[6864]: Connection closed by 139.178.68.195 port 54998
Mar 25 02:44:28.780048 sshd-session[6862]: pam_unix(sshd:session): session closed for user core
Mar 25 02:44:28.800911 systemd[1]: sshd@43-147.75.90.239:22-139.178.68.195:54998.service: Deactivated successfully.
Mar 25 02:44:28.801832 systemd[1]: session-14.scope: Deactivated successfully.
Mar 25 02:44:28.802554 systemd-logind[1800]: Session 14 logged out. Waiting for processes to exit.
Mar 25 02:44:28.803416 systemd[1]: Started sshd@44-147.75.90.239:22-139.178.68.195:55002.service - OpenSSH per-connection server daemon (139.178.68.195:55002).
Mar 25 02:44:28.804134 systemd-logind[1800]: Removed session 14.
Mar 25 02:44:28.831837 sshd[6889]: Accepted publickey for core from 139.178.68.195 port 55002 ssh2: RSA SHA256:uTfG5sovPUqPrARqt2owfRXVYFyJtX+vlYONPCrvLPw
Mar 25 02:44:28.832521 sshd-session[6889]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:44:28.835430 systemd-logind[1800]: New session 15 of user core.
Mar 25 02:44:28.858049 systemd[1]: Started session-15.scope - Session 15 of User core.
Mar 25 02:44:28.957909 sshd[6892]: Connection closed by 139.178.68.195 port 55002
Mar 25 02:44:28.958011 sshd-session[6889]: pam_unix(sshd:session): session closed for user core
Mar 25 02:44:28.973808 systemd[1]: sshd@44-147.75.90.239:22-139.178.68.195:55002.service: Deactivated successfully.
Mar 25 02:44:28.974686 systemd[1]: session-15.scope: Deactivated successfully.
Mar 25 02:44:28.975401 systemd-logind[1800]: Session 15 logged out. Waiting for processes to exit.
Mar 25 02:44:28.976089 systemd[1]: Started sshd@45-147.75.90.239:22-139.178.68.195:55018.service - OpenSSH per-connection server daemon (139.178.68.195:55018).
Mar 25 02:44:28.976555 systemd-logind[1800]: Removed session 15.
Mar 25 02:44:29.003998 sshd[6915]: Accepted publickey for core from 139.178.68.195 port 55018 ssh2: RSA SHA256:uTfG5sovPUqPrARqt2owfRXVYFyJtX+vlYONPCrvLPw
Mar 25 02:44:29.004663 sshd-session[6915]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:44:29.007654 systemd-logind[1800]: New session 16 of user core.
Mar 25 02:44:29.025097 systemd[1]: Started session-16.scope - Session 16 of User core.
Mar 25 02:44:29.120078 sshd[6919]: Connection closed by 139.178.68.195 port 55018
Mar 25 02:44:29.120256 sshd-session[6915]: pam_unix(sshd:session): session closed for user core
Mar 25 02:44:29.122014 systemd[1]: sshd@45-147.75.90.239:22-139.178.68.195:55018.service: Deactivated successfully.
Mar 25 02:44:29.123087 systemd[1]: session-16.scope: Deactivated successfully.
Mar 25 02:44:29.123904 systemd-logind[1800]: Session 16 logged out. Waiting for processes to exit.
Mar 25 02:44:29.124537 systemd-logind[1800]: Removed session 16.
Mar 25 02:44:34.142676 systemd[1]: Started sshd@46-147.75.90.239:22-139.178.68.195:55022.service - OpenSSH per-connection server daemon (139.178.68.195:55022).
Mar 25 02:44:34.178223 sshd[6948]: Accepted publickey for core from 139.178.68.195 port 55022 ssh2: RSA SHA256:uTfG5sovPUqPrARqt2owfRXVYFyJtX+vlYONPCrvLPw
Mar 25 02:44:34.181786 sshd-session[6948]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:44:34.194196 systemd-logind[1800]: New session 17 of user core.
Mar 25 02:44:34.211066 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 25 02:44:34.305601 sshd[6950]: Connection closed by 139.178.68.195 port 55022
Mar 25 02:44:34.305810 sshd-session[6948]: pam_unix(sshd:session): session closed for user core
Mar 25 02:44:34.307577 systemd[1]: sshd@46-147.75.90.239:22-139.178.68.195:55022.service: Deactivated successfully.
Mar 25 02:44:34.308587 systemd[1]: session-17.scope: Deactivated successfully.
Mar 25 02:44:34.309370 systemd-logind[1800]: Session 17 logged out. Waiting for processes to exit.
Mar 25 02:44:34.310085 systemd-logind[1800]: Removed session 17.
Mar 25 02:44:39.331605 systemd[1]: Started sshd@47-147.75.90.239:22-139.178.68.195:41960.service - OpenSSH per-connection server daemon (139.178.68.195:41960).
Mar 25 02:44:39.373056 sshd[6976]: Accepted publickey for core from 139.178.68.195 port 41960 ssh2: RSA SHA256:uTfG5sovPUqPrARqt2owfRXVYFyJtX+vlYONPCrvLPw
Mar 25 02:44:39.373721 sshd-session[6976]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:44:39.376570 systemd-logind[1800]: New session 18 of user core.
Mar 25 02:44:39.385821 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 25 02:44:39.477478 sshd[6978]: Connection closed by 139.178.68.195 port 41960
Mar 25 02:44:39.477657 sshd-session[6976]: pam_unix(sshd:session): session closed for user core
Mar 25 02:44:39.479407 systemd[1]: sshd@47-147.75.90.239:22-139.178.68.195:41960.service: Deactivated successfully.
Mar 25 02:44:39.480396 systemd[1]: session-18.scope: Deactivated successfully.
Mar 25 02:44:39.481173 systemd-logind[1800]: Session 18 logged out. Waiting for processes to exit.
Mar 25 02:44:39.481802 systemd-logind[1800]: Removed session 18.
Mar 25 02:44:39.677549 containerd[1819]: time="2025-03-25T02:44:39.677473503Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6091df3589d8c62afe7769ac0707ff9c581a0d6baa929d76ed2da64aa5fb00ca\" id:\"b0f3db547836f42697135303816aeeb87309fa728abac4f7a4cb35795f5b9677\" pid:7015 exited_at:{seconds:1742870679 nanos:677273933}"
Mar 25 02:44:44.494972 systemd[1]: Started sshd@48-147.75.90.239:22-139.178.68.195:41968.service - OpenSSH per-connection server daemon (139.178.68.195:41968).
Mar 25 02:44:44.546300 sshd[7035]: Accepted publickey for core from 139.178.68.195 port 41968 ssh2: RSA SHA256:uTfG5sovPUqPrARqt2owfRXVYFyJtX+vlYONPCrvLPw
Mar 25 02:44:44.546965 sshd-session[7035]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:44:44.549628 systemd-logind[1800]: New session 19 of user core.
Mar 25 02:44:44.572885 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 25 02:44:44.661817 sshd[7037]: Connection closed by 139.178.68.195 port 41968
Mar 25 02:44:44.662014 sshd-session[7035]: pam_unix(sshd:session): session closed for user core
Mar 25 02:44:44.663706 systemd[1]: sshd@48-147.75.90.239:22-139.178.68.195:41968.service: Deactivated successfully.
Mar 25 02:44:44.664832 systemd[1]: session-19.scope: Deactivated successfully.
Mar 25 02:44:44.665550 systemd-logind[1800]: Session 19 logged out. Waiting for processes to exit.
Mar 25 02:44:44.666261 systemd-logind[1800]: Removed session 19.
Mar 25 02:44:47.034666 containerd[1819]: time="2025-03-25T02:44:47.034642444Z" level=info msg="TaskExit event in podsandbox handler container_id:\"424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d\" id:\"6283ee6d18de423e3defc71996612ef577d283b59dc4c19fa0c818c17fb9ef55\" pid:7077 exited_at:{seconds:1742870687 nanos:34547800}"
Mar 25 02:44:49.323297 containerd[1819]: time="2025-03-25T02:44:49.323264443Z" level=info msg="TaskExit event in podsandbox handler container_id:\"424817f9f848b4e0c7db1cfec681032d3466ca9ab1247fe40a00b27b7deb147d\" id:\"a13e98d012b21890b19eaaf01044c9a82788c3ef00b36aaca316ed27a333fbc6\" pid:7099 exited_at:{seconds:1742870689 nanos:323078872}"
Mar 25 02:44:49.685139 systemd[1]: Started sshd@49-147.75.90.239:22-139.178.68.195:33760.service - OpenSSH per-connection server daemon (139.178.68.195:33760).
Mar 25 02:44:49.735092 sshd[7109]: Accepted publickey for core from 139.178.68.195 port 33760 ssh2: RSA SHA256:uTfG5sovPUqPrARqt2owfRXVYFyJtX+vlYONPCrvLPw
Mar 25 02:44:49.735828 sshd-session[7109]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:44:49.738575 systemd-logind[1800]: New session 20 of user core.
Mar 25 02:44:49.759897 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 25 02:44:49.847506 sshd[7111]: Connection closed by 139.178.68.195 port 33760
Mar 25 02:44:49.847686 sshd-session[7109]: pam_unix(sshd:session): session closed for user core
Mar 25 02:44:49.859890 systemd[1]: sshd@49-147.75.90.239:22-139.178.68.195:33760.service: Deactivated successfully.
Mar 25 02:44:49.860751 systemd[1]: session-20.scope: Deactivated successfully.
Mar 25 02:44:49.861502 systemd-logind[1800]: Session 20 logged out. Waiting for processes to exit.
Mar 25 02:44:49.862203 systemd[1]: Started sshd@50-147.75.90.239:22-139.178.68.195:33766.service - OpenSSH per-connection server daemon (139.178.68.195:33766).
Mar 25 02:44:49.862632 systemd-logind[1800]: Removed session 20.
Mar 25 02:44:49.894054 sshd[7135]: Accepted publickey for core from 139.178.68.195 port 33766 ssh2: RSA SHA256:uTfG5sovPUqPrARqt2owfRXVYFyJtX+vlYONPCrvLPw
Mar 25 02:44:49.897428 sshd-session[7135]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:44:49.910520 systemd-logind[1800]: New session 21 of user core.
Mar 25 02:44:49.930163 systemd[1]: Started session-21.scope - Session 21 of User core.
Mar 25 02:44:50.058065 sshd[7138]: Connection closed by 139.178.68.195 port 33766
Mar 25 02:44:50.058229 sshd-session[7135]: pam_unix(sshd:session): session closed for user core
Mar 25 02:44:50.073755 systemd[1]: sshd@50-147.75.90.239:22-139.178.68.195:33766.service: Deactivated successfully.
Mar 25 02:44:50.074672 systemd[1]: session-21.scope: Deactivated successfully.
Mar 25 02:44:50.075437 systemd-logind[1800]: Session 21 logged out. Waiting for processes to exit.
Mar 25 02:44:50.076125 systemd[1]: Started sshd@51-147.75.90.239:22-139.178.68.195:33778.service - OpenSSH per-connection server daemon (139.178.68.195:33778).
Mar 25 02:44:50.076648 systemd-logind[1800]: Removed session 21.
Mar 25 02:44:50.104033 sshd[7160]: Accepted publickey for core from 139.178.68.195 port 33778 ssh2: RSA SHA256:uTfG5sovPUqPrARqt2owfRXVYFyJtX+vlYONPCrvLPw
Mar 25 02:44:50.104839 sshd-session[7160]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:44:50.107330 systemd-logind[1800]: New session 22 of user core.
Mar 25 02:44:50.120754 systemd[1]: Started session-22.scope - Session 22 of User core.
Mar 25 02:44:50.978889 sshd[7163]: Connection closed by 139.178.68.195 port 33778
Mar 25 02:44:50.979155 sshd-session[7160]: pam_unix(sshd:session): session closed for user core
Mar 25 02:44:50.992390 systemd[1]: sshd@51-147.75.90.239:22-139.178.68.195:33778.service: Deactivated successfully.
Mar 25 02:44:50.993857 systemd[1]: session-22.scope: Deactivated successfully.
Mar 25 02:44:50.994952 systemd-logind[1800]: Session 22 logged out. Waiting for processes to exit.
Mar 25 02:44:50.996001 systemd[1]: Started sshd@52-147.75.90.239:22-139.178.68.195:33792.service - OpenSSH per-connection server daemon (139.178.68.195:33792).
Mar 25 02:44:50.996924 systemd-logind[1800]: Removed session 22.
Mar 25 02:44:51.054611 sshd[7192]: Accepted publickey for core from 139.178.68.195 port 33792 ssh2: RSA SHA256:uTfG5sovPUqPrARqt2owfRXVYFyJtX+vlYONPCrvLPw
Mar 25 02:44:51.056027 sshd-session[7192]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:44:51.060959 systemd-logind[1800]: New session 23 of user core.
Mar 25 02:44:51.074937 systemd[1]: Started session-23.scope - Session 23 of User core.
Mar 25 02:44:51.279093 sshd[7197]: Connection closed by 139.178.68.195 port 33792
Mar 25 02:44:51.279255 sshd-session[7192]: pam_unix(sshd:session): session closed for user core
Mar 25 02:44:51.298136 systemd[1]: sshd@52-147.75.90.239:22-139.178.68.195:33792.service: Deactivated successfully.
Mar 25 02:44:51.299132 systemd[1]: session-23.scope: Deactivated successfully.
Mar 25 02:44:51.299970 systemd-logind[1800]: Session 23 logged out. Waiting for processes to exit.
Mar 25 02:44:51.300809 systemd[1]: Started sshd@53-147.75.90.239:22-139.178.68.195:33798.service - OpenSSH per-connection server daemon (139.178.68.195:33798).
Mar 25 02:44:51.301415 systemd-logind[1800]: Removed session 23.
Mar 25 02:44:51.334719 sshd[7219]: Accepted publickey for core from 139.178.68.195 port 33798 ssh2: RSA SHA256:uTfG5sovPUqPrARqt2owfRXVYFyJtX+vlYONPCrvLPw
Mar 25 02:44:51.335705 sshd-session[7219]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:44:51.339419 systemd-logind[1800]: New session 24 of user core.
Mar 25 02:44:51.353804 systemd[1]: Started session-24.scope - Session 24 of User core.
Mar 25 02:44:51.499907 sshd[7222]: Connection closed by 139.178.68.195 port 33798
Mar 25 02:44:51.500219 sshd-session[7219]: pam_unix(sshd:session): session closed for user core
Mar 25 02:44:51.501920 systemd[1]: sshd@53-147.75.90.239:22-139.178.68.195:33798.service: Deactivated successfully.
Mar 25 02:44:51.502948 systemd[1]: session-24.scope: Deactivated successfully.
Mar 25 02:44:51.503680 systemd-logind[1800]: Session 24 logged out. Waiting for processes to exit.
Mar 25 02:44:51.504435 systemd-logind[1800]: Removed session 24.
Mar 25 02:44:56.239947 systemd[1]: Started sshd@54-147.75.90.239:22-189.8.108.39:57802.service - OpenSSH per-connection server daemon (189.8.108.39:57802).
Mar 25 02:44:56.526881 systemd[1]: Started sshd@55-147.75.90.239:22-139.178.68.195:57002.service - OpenSSH per-connection server daemon (139.178.68.195:57002).
Mar 25 02:44:56.571979 sshd[7253]: Accepted publickey for core from 139.178.68.195 port 57002 ssh2: RSA SHA256:uTfG5sovPUqPrARqt2owfRXVYFyJtX+vlYONPCrvLPw
Mar 25 02:44:56.572793 sshd-session[7253]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:44:56.575896 systemd-logind[1800]: New session 25 of user core.
Mar 25 02:44:56.597072 systemd[1]: Started session-25.scope - Session 25 of User core.
Mar 25 02:44:56.693656 sshd[7255]: Connection closed by 139.178.68.195 port 57002
Mar 25 02:44:56.693837 sshd-session[7253]: pam_unix(sshd:session): session closed for user core
Mar 25 02:44:56.695568 systemd[1]: sshd@55-147.75.90.239:22-139.178.68.195:57002.service: Deactivated successfully.
Mar 25 02:44:56.696573 systemd[1]: session-25.scope: Deactivated successfully.
Mar 25 02:44:56.697370 systemd-logind[1800]: Session 25 logged out. Waiting for processes to exit.
Mar 25 02:44:56.698125 systemd-logind[1800]: Removed session 25.
Mar 25 02:44:57.322277 sshd[7250]: Invalid user rajat from 189.8.108.39 port 57802
Mar 25 02:44:57.523616 sshd[7250]: Received disconnect from 189.8.108.39 port 57802:11: Bye Bye [preauth]
Mar 25 02:44:57.523616 sshd[7250]: Disconnected from invalid user rajat 189.8.108.39 port 57802 [preauth]
Mar 25 02:44:57.527040 systemd[1]: sshd@54-147.75.90.239:22-189.8.108.39:57802.service: Deactivated successfully.
Mar 25 02:45:01.716283 systemd[1]: Started sshd@56-147.75.90.239:22-139.178.68.195:57016.service - OpenSSH per-connection server daemon (139.178.68.195:57016).
Mar 25 02:45:01.755929 sshd[7282]: Accepted publickey for core from 139.178.68.195 port 57016 ssh2: RSA SHA256:uTfG5sovPUqPrARqt2owfRXVYFyJtX+vlYONPCrvLPw
Mar 25 02:45:01.756657 sshd-session[7282]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:45:01.759640 systemd-logind[1800]: New session 26 of user core.
Mar 25 02:45:01.770902 systemd[1]: Started session-26.scope - Session 26 of User core.
Mar 25 02:45:01.855882 sshd[7284]: Connection closed by 139.178.68.195 port 57016
Mar 25 02:45:01.856074 sshd-session[7282]: pam_unix(sshd:session): session closed for user core
Mar 25 02:45:01.857886 systemd[1]: sshd@56-147.75.90.239:22-139.178.68.195:57016.service: Deactivated successfully.
Mar 25 02:45:01.858891 systemd[1]: session-26.scope: Deactivated successfully.
Mar 25 02:45:01.859613 systemd-logind[1800]: Session 26 logged out. Waiting for processes to exit.
Mar 25 02:45:01.860356 systemd-logind[1800]: Removed session 26.
Mar 25 02:45:04.594731 systemd[1]: Started sshd@57-147.75.90.239:22-103.63.108.25:47946.service - OpenSSH per-connection server daemon (103.63.108.25:47946).
Mar 25 02:45:05.646005 sshd[7306]: Invalid user cer2 from 103.63.108.25 port 47946
Mar 25 02:45:05.839751 sshd[7306]: Received disconnect from 103.63.108.25 port 47946:11: Bye Bye [preauth]
Mar 25 02:45:05.839751 sshd[7306]: Disconnected from invalid user cer2 103.63.108.25 port 47946 [preauth]
Mar 25 02:45:05.843150 systemd[1]: sshd@57-147.75.90.239:22-103.63.108.25:47946.service: Deactivated successfully.
Mar 25 02:45:06.877585 systemd[1]: Started sshd@58-147.75.90.239:22-139.178.68.195:42514.service - OpenSSH per-connection server daemon (139.178.68.195:42514).
Mar 25 02:45:06.916533 sshd[7311]: Accepted publickey for core from 139.178.68.195 port 42514 ssh2: RSA SHA256:uTfG5sovPUqPrARqt2owfRXVYFyJtX+vlYONPCrvLPw
Mar 25 02:45:06.917387 sshd-session[7311]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:45:06.920712 systemd-logind[1800]: New session 27 of user core.
Mar 25 02:45:06.936892 systemd[1]: Started session-27.scope - Session 27 of User core.
Mar 25 02:45:07.028391 sshd[7313]: Connection closed by 139.178.68.195 port 42514
Mar 25 02:45:07.028611 sshd-session[7311]: pam_unix(sshd:session): session closed for user core
Mar 25 02:45:07.030413 systemd[1]: sshd@58-147.75.90.239:22-139.178.68.195:42514.service: Deactivated successfully.
Mar 25 02:45:07.031529 systemd[1]: session-27.scope: Deactivated successfully.
Mar 25 02:45:07.032346 systemd-logind[1800]: Session 27 logged out. Waiting for processes to exit.
Mar 25 02:45:07.033018 systemd-logind[1800]: Removed session 27.