Jan 14 13:35:29.326135 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Jan 14 13:35:29.326157 kernel: Linux version 6.6.71-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241116 p3) 14.2.1 20241116, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT Mon Jan 13 18:56:28 -00 2025
Jan 14 13:35:29.326165 kernel: KASLR enabled
Jan 14 13:35:29.326170 kernel: earlycon: pl11 at MMIO 0x00000000effec000 (options '')
Jan 14 13:35:29.326178 kernel: printk: bootconsole [pl11] enabled
Jan 14 13:35:29.326183 kernel: efi: EFI v2.7 by EDK II
Jan 14 13:35:29.326190 kernel: efi: ACPI 2.0=0x3fd5f018 SMBIOS=0x3e580000 SMBIOS 3.0=0x3e560000 MEMATTR=0x3f20e698 RNG=0x3fd5f998 MEMRESERVE=0x3e477598 
Jan 14 13:35:29.326196 kernel: random: crng init done
Jan 14 13:35:29.326202 kernel: secureboot: Secure boot disabled
Jan 14 13:35:29.326208 kernel: ACPI: Early table checksum verification disabled
Jan 14 13:35:29.326214 kernel: ACPI: RSDP 0x000000003FD5F018 000024 (v02 VRTUAL)
Jan 14 13:35:29.326219 kernel: ACPI: XSDT 0x000000003FD5FF18 00006C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Jan 14 13:35:29.326225 kernel: ACPI: FACP 0x000000003FD5FC18 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
Jan 14 13:35:29.326232 kernel: ACPI: DSDT 0x000000003FD41018 01DFCD (v02 MSFTVM DSDT01   00000001 INTL 20230628)
Jan 14 13:35:29.326240 kernel: ACPI: DBG2 0x000000003FD5FB18 000072 (v00 VRTUAL MICROSFT 00000001 MSFT 00000001)
Jan 14 13:35:29.326246 kernel: ACPI: GTDT 0x000000003FD5FD98 000060 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Jan 14 13:35:29.326252 kernel: ACPI: OEM0 0x000000003FD5F098 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Jan 14 13:35:29.326260 kernel: ACPI: SPCR 0x000000003FD5FA98 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Jan 14 13:35:29.326266 kernel: ACPI: APIC 0x000000003FD5F818 0000FC (v04 VRTUAL MICROSFT 00000001 MSFT 00000001)
Jan 14 13:35:29.326272 kernel: ACPI: SRAT 0x000000003FD5F198 000234 (v03 VRTUAL MICROSFT 00000001 MSFT 00000001)
Jan 14 13:35:29.326278 kernel: ACPI: PPTT 0x000000003FD5F418 000120 (v01 VRTUAL MICROSFT 00000000 MSFT 00000000)
Jan 14 13:35:29.326284 kernel: ACPI: BGRT 0x000000003FD5FE98 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Jan 14 13:35:29.326290 kernel: ACPI: SPCR: console: pl011,mmio32,0xeffec000,115200
Jan 14 13:35:29.326296 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff]
Jan 14 13:35:29.326302 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x1bfffffff]
Jan 14 13:35:29.326309 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1c0000000-0xfbfffffff]
Jan 14 13:35:29.326315 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff]
Jan 14 13:35:29.326321 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff]
Jan 14 13:35:29.326329 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff]
Jan 14 13:35:29.326335 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff]
Jan 14 13:35:29.326341 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff]
Jan 14 13:35:29.326347 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff]
Jan 14 13:35:29.326353 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff]
Jan 14 13:35:29.326359 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff]
Jan 14 13:35:29.326365 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff]
Jan 14 13:35:29.326371 kernel: NUMA: NODE_DATA [mem 0x1bf7ee800-0x1bf7f3fff]
Jan 14 13:35:29.326377 kernel: Zone ranges:
Jan 14 13:35:29.326383 kernel:   DMA      [mem 0x0000000000000000-0x00000000ffffffff]
Jan 14 13:35:29.326389 kernel:   DMA32    empty
Jan 14 13:35:29.326395 kernel:   Normal   [mem 0x0000000100000000-0x00000001bfffffff]
Jan 14 13:35:29.326405 kernel: Movable zone start for each node
Jan 14 13:35:29.326411 kernel: Early memory node ranges
Jan 14 13:35:29.326418 kernel:   node   0: [mem 0x0000000000000000-0x00000000007fffff]
Jan 14 13:35:29.326424 kernel:   node   0: [mem 0x0000000000824000-0x000000003e45ffff]
Jan 14 13:35:29.326431 kernel:   node   0: [mem 0x000000003e460000-0x000000003e46ffff]
Jan 14 13:35:29.326438 kernel:   node   0: [mem 0x000000003e470000-0x000000003e54ffff]
Jan 14 13:35:29.326445 kernel:   node   0: [mem 0x000000003e550000-0x000000003e87ffff]
Jan 14 13:35:29.326451 kernel:   node   0: [mem 0x000000003e880000-0x000000003fc7ffff]
Jan 14 13:35:29.326457 kernel:   node   0: [mem 0x000000003fc80000-0x000000003fcfffff]
Jan 14 13:35:29.326464 kernel:   node   0: [mem 0x000000003fd00000-0x000000003fffffff]
Jan 14 13:35:29.326470 kernel:   node   0: [mem 0x0000000100000000-0x00000001bfffffff]
Jan 14 13:35:29.326477 kernel: Initmem setup node 0 [mem 0x0000000000000000-0x00000001bfffffff]
Jan 14 13:35:29.326484 kernel: On node 0, zone DMA: 36 pages in unavailable ranges
Jan 14 13:35:29.326490 kernel: psci: probing for conduit method from ACPI.
Jan 14 13:35:29.326497 kernel: psci: PSCIv1.1 detected in firmware.
Jan 14 13:35:29.326503 kernel: psci: Using standard PSCI v0.2 function IDs
Jan 14 13:35:29.326510 kernel: psci: MIGRATE_INFO_TYPE not supported.
Jan 14 13:35:29.326518 kernel: psci: SMC Calling Convention v1.4
Jan 14 13:35:29.328563 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Jan 14 13:35:29.328571 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Jan 14 13:35:29.328578 kernel: percpu: Embedded 31 pages/cpu s86696 r8192 d32088 u126976
Jan 14 13:35:29.328585 kernel: pcpu-alloc: s86696 r8192 d32088 u126976 alloc=31*4096
Jan 14 13:35:29.328592 kernel: pcpu-alloc: [0] 0 [0] 1 
Jan 14 13:35:29.328599 kernel: Detected PIPT I-cache on CPU0
Jan 14 13:35:29.328606 kernel: CPU features: detected: GIC system register CPU interface
Jan 14 13:35:29.328613 kernel: CPU features: detected: Hardware dirty bit management
Jan 14 13:35:29.328619 kernel: CPU features: detected: Spectre-BHB
Jan 14 13:35:29.328626 kernel: CPU features: kernel page table isolation forced ON by KASLR
Jan 14 13:35:29.328638 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Jan 14 13:35:29.328645 kernel: CPU features: detected: ARM erratum 1418040
Jan 14 13:35:29.328651 kernel: CPU features: detected: ARM erratum 1542419 (kernel portion)
Jan 14 13:35:29.328658 kernel: CPU features: detected: SSBS not fully self-synchronizing
Jan 14 13:35:29.328664 kernel: alternatives: applying boot alternatives
Jan 14 13:35:29.328672 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=9798117b3b15ef802e3d618077f87253cc08e0d5280b8fe28b307e7558b7ebcc
Jan 14 13:35:29.328680 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Jan 14 13:35:29.328686 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 14 13:35:29.328693 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Jan 14 13:35:29.328699 kernel: Fallback order for Node 0: 0 
Jan 14 13:35:29.328706 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 1032156
Jan 14 13:35:29.328715 kernel: Policy zone: Normal
Jan 14 13:35:29.328721 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 14 13:35:29.328728 kernel: software IO TLB: area num 2.
Jan 14 13:35:29.328734 kernel: software IO TLB: mapped [mem 0x000000003a460000-0x000000003e460000] (64MB)
Jan 14 13:35:29.328741 kernel: Memory: 3982052K/4194160K available (10304K kernel code, 2184K rwdata, 8092K rodata, 39936K init, 897K bss, 212108K reserved, 0K cma-reserved)
Jan 14 13:35:29.328748 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Jan 14 13:35:29.328754 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 14 13:35:29.328762 kernel: rcu:         RCU event tracing is enabled.
Jan 14 13:35:29.328768 kernel: rcu:         RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Jan 14 13:35:29.328775 kernel:         Trampoline variant of Tasks RCU enabled.
Jan 14 13:35:29.328781 kernel:         Tracing variant of Tasks RCU enabled.
Jan 14 13:35:29.328789 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 14 13:35:29.328796 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Jan 14 13:35:29.328803 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Jan 14 13:35:29.328809 kernel: GICv3: 960 SPIs implemented
Jan 14 13:35:29.328816 kernel: GICv3: 0 Extended SPIs implemented
Jan 14 13:35:29.328822 kernel: Root IRQ handler: gic_handle_irq
Jan 14 13:35:29.328829 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Jan 14 13:35:29.328836 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000effee000
Jan 14 13:35:29.328842 kernel: ITS: No ITS available, not enabling LPIs
Jan 14 13:35:29.328849 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 14 13:35:29.328855 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 14 13:35:29.328862 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Jan 14 13:35:29.328870 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Jan 14 13:35:29.328877 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Jan 14 13:35:29.328884 kernel: Console: colour dummy device 80x25
Jan 14 13:35:29.328891 kernel: printk: console [tty1] enabled
Jan 14 13:35:29.328897 kernel: ACPI: Core revision 20230628
Jan 14 13:35:29.328904 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Jan 14 13:35:29.328911 kernel: pid_max: default: 32768 minimum: 301
Jan 14 13:35:29.328918 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Jan 14 13:35:29.328925 kernel: landlock: Up and running.
Jan 14 13:35:29.328933 kernel: SELinux:  Initializing.
Jan 14 13:35:29.328939 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Jan 14 13:35:29.328946 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Jan 14 13:35:29.328953 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jan 14 13:35:29.328960 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jan 14 13:35:29.328967 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3a8030, hints 0xe, misc 0x31e1
Jan 14 13:35:29.328974 kernel: Hyper-V: Host Build 10.0.22477.1594-1-0
Jan 14 13:35:29.328987 kernel: Hyper-V: enabling crash_kexec_post_notifiers
Jan 14 13:35:29.328995 kernel: rcu: Hierarchical SRCU implementation.
Jan 14 13:35:29.329002 kernel: rcu:         Max phase no-delay instances is 400.
Jan 14 13:35:29.329009 kernel: Remapping and enabling EFI services.
Jan 14 13:35:29.329016 kernel: smp: Bringing up secondary CPUs ...
Jan 14 13:35:29.329024 kernel: Detected PIPT I-cache on CPU1
Jan 14 13:35:29.329031 kernel: GICv3: CPU1: found redistributor 1 region 1:0x00000000f000e000
Jan 14 13:35:29.329039 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 14 13:35:29.329046 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Jan 14 13:35:29.329053 kernel: smp: Brought up 1 node, 2 CPUs
Jan 14 13:35:29.329061 kernel: SMP: Total of 2 processors activated.
Jan 14 13:35:29.329068 kernel: CPU features: detected: 32-bit EL0 Support
Jan 14 13:35:29.329075 kernel: CPU features: detected: Instruction cache invalidation not required for I/D coherence
Jan 14 13:35:29.329082 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Jan 14 13:35:29.329090 kernel: CPU features: detected: CRC32 instructions
Jan 14 13:35:29.329097 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Jan 14 13:35:29.329104 kernel: CPU features: detected: LSE atomic instructions
Jan 14 13:35:29.329111 kernel: CPU features: detected: Privileged Access Never
Jan 14 13:35:29.329118 kernel: CPU: All CPU(s) started at EL1
Jan 14 13:35:29.329126 kernel: alternatives: applying system-wide alternatives
Jan 14 13:35:29.329133 kernel: devtmpfs: initialized
Jan 14 13:35:29.329141 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 14 13:35:29.329148 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Jan 14 13:35:29.329155 kernel: pinctrl core: initialized pinctrl subsystem
Jan 14 13:35:29.329162 kernel: SMBIOS 3.1.0 present.
Jan 14 13:35:29.329169 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 09/28/2024
Jan 14 13:35:29.329176 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 14 13:35:29.329184 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Jan 14 13:35:29.329192 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 14 13:35:29.329200 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 14 13:35:29.329207 kernel: audit: initializing netlink subsys (disabled)
Jan 14 13:35:29.329214 kernel: audit: type=2000 audit(0.047:1): state=initialized audit_enabled=0 res=1
Jan 14 13:35:29.329221 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 14 13:35:29.329229 kernel: cpuidle: using governor menu
Jan 14 13:35:29.329236 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Jan 14 13:35:29.329243 kernel: ASID allocator initialised with 32768 entries
Jan 14 13:35:29.329250 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 14 13:35:29.329258 kernel: Serial: AMBA PL011 UART driver
Jan 14 13:35:29.329266 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Jan 14 13:35:29.329273 kernel: Modules: 0 pages in range for non-PLT usage
Jan 14 13:35:29.329280 kernel: Modules: 508880 pages in range for PLT usage
Jan 14 13:35:29.329288 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 14 13:35:29.329295 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Jan 14 13:35:29.329302 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Jan 14 13:35:29.329309 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Jan 14 13:35:29.329316 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 14 13:35:29.329325 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Jan 14 13:35:29.329332 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Jan 14 13:35:29.329339 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Jan 14 13:35:29.329346 kernel: ACPI: Added _OSI(Module Device)
Jan 14 13:35:29.329353 kernel: ACPI: Added _OSI(Processor Device)
Jan 14 13:35:29.329360 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Jan 14 13:35:29.329367 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 14 13:35:29.329374 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 14 13:35:29.329381 kernel: ACPI: Interpreter enabled
Jan 14 13:35:29.329390 kernel: ACPI: Using GIC for interrupt routing
Jan 14 13:35:29.329397 kernel: ARMH0011:00: ttyAMA0 at MMIO 0xeffec000 (irq = 12, base_baud = 0) is a SBSA
Jan 14 13:35:29.329404 kernel: printk: console [ttyAMA0] enabled
Jan 14 13:35:29.329411 kernel: printk: bootconsole [pl11] disabled
Jan 14 13:35:29.329418 kernel: ARMH0011:01: ttyAMA1 at MMIO 0xeffeb000 (irq = 13, base_baud = 0) is a SBSA
Jan 14 13:35:29.329426 kernel: iommu: Default domain type: Translated
Jan 14 13:35:29.329433 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Jan 14 13:35:29.329440 kernel: efivars: Registered efivars operations
Jan 14 13:35:29.329447 kernel: vgaarb: loaded
Jan 14 13:35:29.329455 kernel: clocksource: Switched to clocksource arch_sys_counter
Jan 14 13:35:29.329462 kernel: VFS: Disk quotas dquot_6.6.0
Jan 14 13:35:29.329470 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 14 13:35:29.329477 kernel: pnp: PnP ACPI init
Jan 14 13:35:29.329484 kernel: pnp: PnP ACPI: found 0 devices
Jan 14 13:35:29.329491 kernel: NET: Registered PF_INET protocol family
Jan 14 13:35:29.329498 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 14 13:35:29.329505 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Jan 14 13:35:29.329512 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 14 13:35:29.329529 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 14 13:35:29.329536 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Jan 14 13:35:29.329544 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Jan 14 13:35:29.329551 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Jan 14 13:35:29.329558 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Jan 14 13:35:29.329565 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 14 13:35:29.329572 kernel: PCI: CLS 0 bytes, default 64
Jan 14 13:35:29.329579 kernel: kvm [1]: HYP mode not available
Jan 14 13:35:29.329587 kernel: Initialise system trusted keyrings
Jan 14 13:35:29.329595 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Jan 14 13:35:29.329603 kernel: Key type asymmetric registered
Jan 14 13:35:29.329609 kernel: Asymmetric key parser 'x509' registered
Jan 14 13:35:29.329616 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Jan 14 13:35:29.329623 kernel: io scheduler mq-deadline registered
Jan 14 13:35:29.329630 kernel: io scheduler kyber registered
Jan 14 13:35:29.329638 kernel: io scheduler bfq registered
Jan 14 13:35:29.329645 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 14 13:35:29.329652 kernel: thunder_xcv, ver 1.0
Jan 14 13:35:29.329660 kernel: thunder_bgx, ver 1.0
Jan 14 13:35:29.329667 kernel: nicpf, ver 1.0
Jan 14 13:35:29.329674 kernel: nicvf, ver 1.0
Jan 14 13:35:29.329810 kernel: rtc-efi rtc-efi.0: registered as rtc0
Jan 14 13:35:29.329882 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-01-14T13:35:28 UTC (1736861728)
Jan 14 13:35:29.329892 kernel: efifb: probing for efifb
Jan 14 13:35:29.329899 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
Jan 14 13:35:29.329906 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
Jan 14 13:35:29.329915 kernel: efifb: scrolling: redraw
Jan 14 13:35:29.329923 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Jan 14 13:35:29.329930 kernel: Console: switching to colour frame buffer device 128x48
Jan 14 13:35:29.329937 kernel: fb0: EFI VGA frame buffer device
Jan 14 13:35:29.329944 kernel: SMCCC: SOC_ID: ARCH_SOC_ID not implemented, skipping ....
Jan 14 13:35:29.329951 kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 14 13:35:29.329958 kernel: No ACPI PMU IRQ for CPU0
Jan 14 13:35:29.329965 kernel: No ACPI PMU IRQ for CPU1
Jan 14 13:35:29.329972 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 1 counters available
Jan 14 13:35:29.329981 kernel: watchdog: Delayed init of the lockup detector failed: -19
Jan 14 13:35:29.329988 kernel: watchdog: Hard watchdog permanently disabled
Jan 14 13:35:29.329995 kernel: NET: Registered PF_INET6 protocol family
Jan 14 13:35:29.330001 kernel: Segment Routing with IPv6
Jan 14 13:35:29.330009 kernel: In-situ OAM (IOAM) with IPv6
Jan 14 13:35:29.330016 kernel: NET: Registered PF_PACKET protocol family
Jan 14 13:35:29.330023 kernel: Key type dns_resolver registered
Jan 14 13:35:29.330029 kernel: registered taskstats version 1
Jan 14 13:35:29.330036 kernel: Loading compiled-in X.509 certificates
Jan 14 13:35:29.330045 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.71-flatcar: 46cb4d1b22f3a5974766fe7d7b651e2f296d4fe0'
Jan 14 13:35:29.330052 kernel: Key type .fscrypt registered
Jan 14 13:35:29.330059 kernel: Key type fscrypt-provisioning registered
Jan 14 13:35:29.330066 kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 14 13:35:29.330073 kernel: ima: Allocated hash algorithm: sha1
Jan 14 13:35:29.330080 kernel: ima: No architecture policies found
Jan 14 13:35:29.330087 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Jan 14 13:35:29.330095 kernel: clk: Disabling unused clocks
Jan 14 13:35:29.330102 kernel: Freeing unused kernel memory: 39936K
Jan 14 13:35:29.330110 kernel: Run /init as init process
Jan 14 13:35:29.330117 kernel:   with arguments:
Jan 14 13:35:29.330124 kernel:     /init
Jan 14 13:35:29.330131 kernel:   with environment:
Jan 14 13:35:29.330138 kernel:     HOME=/
Jan 14 13:35:29.330144 kernel:     TERM=linux
Jan 14 13:35:29.330151 kernel:     BOOT_IMAGE=/flatcar/vmlinuz-a
Jan 14 13:35:29.330160 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Jan 14 13:35:29.330171 systemd[1]: Detected virtualization microsoft.
Jan 14 13:35:29.330179 systemd[1]: Detected architecture arm64.
Jan 14 13:35:29.330186 systemd[1]: Running in initrd.
Jan 14 13:35:29.330194 systemd[1]: No hostname configured, using default hostname.
Jan 14 13:35:29.330201 systemd[1]: Hostname set to <localhost>.
Jan 14 13:35:29.330209 systemd[1]: Initializing machine ID from random generator.
Jan 14 13:35:29.330216 systemd[1]: Queued start job for default target initrd.target.
Jan 14 13:35:29.330224 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 14 13:35:29.330233 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 14 13:35:29.330241 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Jan 14 13:35:29.330249 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jan 14 13:35:29.330256 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Jan 14 13:35:29.330264 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Jan 14 13:35:29.330273 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Jan 14 13:35:29.330282 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Jan 14 13:35:29.330290 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 14 13:35:29.330298 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jan 14 13:35:29.330305 systemd[1]: Reached target paths.target - Path Units.
Jan 14 13:35:29.330313 systemd[1]: Reached target slices.target - Slice Units.
Jan 14 13:35:29.330320 systemd[1]: Reached target swap.target - Swaps.
Jan 14 13:35:29.330328 systemd[1]: Reached target timers.target - Timer Units.
Jan 14 13:35:29.330335 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Jan 14 13:35:29.330343 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jan 14 13:35:29.330352 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Jan 14 13:35:29.330360 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Jan 14 13:35:29.330367 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jan 14 13:35:29.330375 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jan 14 13:35:29.330382 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 14 13:35:29.330390 systemd[1]: Reached target sockets.target - Socket Units.
Jan 14 13:35:29.330397 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Jan 14 13:35:29.330405 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jan 14 13:35:29.330413 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Jan 14 13:35:29.330422 systemd[1]: Starting systemd-fsck-usr.service...
Jan 14 13:35:29.330430 systemd[1]: Starting systemd-journald.service - Journal Service...
Jan 14 13:35:29.330437 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jan 14 13:35:29.330463 systemd-journald[218]: Collecting audit messages is disabled.
Jan 14 13:35:29.330484 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 14 13:35:29.330493 systemd-journald[218]: Journal started
Jan 14 13:35:29.330515 systemd-journald[218]: Runtime Journal (/run/log/journal/235341efae424829a482ae4890fb2fb9) is 8.0M, max 78.5M, 70.5M free.
Jan 14 13:35:29.343262 systemd-modules-load[219]: Inserted module 'overlay'
Jan 14 13:35:29.354656 systemd[1]: Started systemd-journald.service - Journal Service.
Jan 14 13:35:29.360817 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Jan 14 13:35:29.370865 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 14 13:35:29.410631 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 14 13:35:29.410654 kernel: Bridge firewalling registered
Jan 14 13:35:29.399292 systemd[1]: Finished systemd-fsck-usr.service.
Jan 14 13:35:29.414939 systemd-modules-load[219]: Inserted module 'br_netfilter'
Jan 14 13:35:29.415804 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jan 14 13:35:29.428356 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 14 13:35:29.454761 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 14 13:35:29.469723 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jan 14 13:35:29.482320 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Jan 14 13:35:29.493731 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Jan 14 13:35:29.515417 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 14 13:35:29.530841 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jan 14 13:35:29.543971 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jan 14 13:35:29.557722 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 14 13:35:29.586831 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Jan 14 13:35:29.600646 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jan 14 13:35:29.620059 dracut-cmdline[251]: dracut-dracut-053
Jan 14 13:35:29.624962 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jan 14 13:35:29.642888 dracut-cmdline[251]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=9798117b3b15ef802e3d618077f87253cc08e0d5280b8fe28b307e7558b7ebcc
Jan 14 13:35:29.683156 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 14 13:35:29.696026 systemd-resolved[252]: Positive Trust Anchors:
Jan 14 13:35:29.696036 systemd-resolved[252]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jan 14 13:35:29.696067 systemd-resolved[252]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jan 14 13:35:29.698322 systemd-resolved[252]: Defaulting to hostname 'linux'.
Jan 14 13:35:29.699159 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jan 14 13:35:29.706755 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jan 14 13:35:29.829550 kernel: SCSI subsystem initialized
Jan 14 13:35:29.838545 kernel: Loading iSCSI transport class v2.0-870.
Jan 14 13:35:29.848544 kernel: iscsi: registered transport (tcp)
Jan 14 13:35:29.866995 kernel: iscsi: registered transport (qla4xxx)
Jan 14 13:35:29.867054 kernel: QLogic iSCSI HBA Driver
Jan 14 13:35:29.899895 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Jan 14 13:35:29.916666 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Jan 14 13:35:29.949292 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 14 13:35:29.949352 kernel: device-mapper: uevent: version 1.0.3
Jan 14 13:35:29.955938 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Jan 14 13:35:30.003542 kernel: raid6: neonx8   gen() 15766 MB/s
Jan 14 13:35:30.023534 kernel: raid6: neonx4   gen() 15801 MB/s
Jan 14 13:35:30.044534 kernel: raid6: neonx2   gen() 13410 MB/s
Jan 14 13:35:30.065530 kernel: raid6: neonx1   gen() 10428 MB/s
Jan 14 13:35:30.085529 kernel: raid6: int64x8  gen()  6795 MB/s
Jan 14 13:35:30.105530 kernel: raid6: int64x4  gen()  7350 MB/s
Jan 14 13:35:30.126530 kernel: raid6: int64x2  gen()  6109 MB/s
Jan 14 13:35:30.149707 kernel: raid6: int64x1  gen()  5061 MB/s
Jan 14 13:35:30.149717 kernel: raid6: using algorithm neonx4 gen() 15801 MB/s
Jan 14 13:35:30.173745 kernel: raid6: .... xor() 12320 MB/s, rmw enabled
Jan 14 13:35:30.173757 kernel: raid6: using neon recovery algorithm
Jan 14 13:35:30.182532 kernel: xor: measuring software checksum speed
Jan 14 13:35:30.182546 kernel:    8regs           : 20434 MB/sec
Jan 14 13:35:30.189146 kernel:    32regs          : 21676 MB/sec
Jan 14 13:35:30.192557 kernel:    arm64_neon      : 27860 MB/sec
Jan 14 13:35:30.196904 kernel: xor: using function: arm64_neon (27860 MB/sec)
Jan 14 13:35:30.246541 kernel: Btrfs loaded, zoned=no, fsverity=no
Jan 14 13:35:30.256031 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Jan 14 13:35:30.272658 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 14 13:35:30.295257 systemd-udevd[437]: Using default interface naming scheme 'v255'.
Jan 14 13:35:30.300733 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 14 13:35:30.322729 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Jan 14 13:35:30.335445 dracut-pre-trigger[441]: rd.md=0: removing MD RAID activation
Jan 14 13:35:30.362233 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Jan 14 13:35:30.377773 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jan 14 13:35:30.417841 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 14 13:35:30.436693 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Jan 14 13:35:30.460666 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Jan 14 13:35:30.467556 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Jan 14 13:35:30.485642 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 14 13:35:30.506703 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jan 14 13:35:30.533717 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Jan 14 13:35:30.560338 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Jan 14 13:35:30.582800 kernel: hv_vmbus: Vmbus version:5.3
Jan 14 13:35:30.582823 kernel: hv_vmbus: registering driver hyperv_keyboard
Jan 14 13:35:30.573310 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jan 14 13:35:30.573491 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 14 13:35:30.634365 kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 14 13:35:30.634393 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input0
Jan 14 13:35:30.634403 kernel: hv_vmbus: registering driver hv_netvsc
Jan 14 13:35:30.634412 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 14 13:35:30.613453 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 14 13:35:30.699305 kernel: hv_vmbus: registering driver hv_storvsc
Jan 14 13:35:30.699329 kernel: hv_vmbus: registering driver hid_hyperv
Jan 14 13:35:30.699338 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input1
Jan 14 13:35:30.699348 kernel: scsi host1: storvsc_host_t
Jan 14 13:35:30.699515 kernel: scsi host0: storvsc_host_t
Jan 14 13:35:30.699621 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on 
Jan 14 13:35:30.699702 kernel: PTP clock support registered
Jan 14 13:35:30.699718 kernel: scsi 0:0:0:0: Direct-Access     Msft     Virtual Disk     1.0  PQ: 0 ANSI: 5
Jan 14 13:35:30.699739 kernel: scsi 0:0:0:2: CD-ROM            Msft     Virtual DVD-ROM  1.0  PQ: 0 ANSI: 0
Jan 14 13:35:30.657007 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 14 13:35:30.657263 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 14 13:35:30.712361 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Jan 14 13:35:30.741047 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 14 13:35:30.765824 kernel: hv_utils: Registering HyperV Utility Driver
Jan 14 13:35:30.765847 kernel: hv_vmbus: registering driver hv_utils
Jan 14 13:35:30.766565 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 14 13:35:30.784291 kernel: hv_netvsc 0022487b-74b7-0022-487b-74b70022487b eth0: VF slot 1 added
Jan 14 13:35:30.784443 kernel: sr 0:0:0:2: [sr0] scsi-1 drive
Jan 14 13:35:30.810187 kernel: hv_vmbus: registering driver hv_pci
Jan 14 13:35:30.810210 kernel: hv_utils: Heartbeat IC version 3.0
Jan 14 13:35:30.810219 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 14 13:35:30.810228 kernel: hv_utils: Shutdown IC version 3.2
Jan 14 13:35:30.810237 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0
Jan 14 13:35:30.810352 kernel: hv_utils: TimeSync IC version 4.0
Jan 14 13:35:30.767661 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 14 13:35:31.117173 kernel: hv_pci 4894d9d5-7fe0-4e09-a4df-dd0411651e4b: PCI VMBus probing: Using version 0x10004
Jan 14 13:35:31.235151 kernel: hv_pci 4894d9d5-7fe0-4e09-a4df-dd0411651e4b: PCI host bridge to bus 7fe0:00
Jan 14 13:35:31.235276 kernel: pci_bus 7fe0:00: root bus resource [mem 0xfc0000000-0xfc00fffff window]
Jan 14 13:35:31.235400 kernel: pci_bus 7fe0:00: No busn resource found for root bus, will use [bus 00-ff]
Jan 14 13:35:31.235485 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB)
Jan 14 13:35:31.235589 kernel: pci 7fe0:00:02.0: [15b3:1018] type 00 class 0x020000
Jan 14 13:35:31.235681 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks
Jan 14 13:35:31.235770 kernel: pci 7fe0:00:02.0: reg 0x10: [mem 0xfc0000000-0xfc00fffff 64bit pref]
Jan 14 13:35:31.235866 kernel: sd 0:0:0:0: [sda] Write Protect is off
Jan 14 13:35:31.235955 kernel: pci 7fe0:00:02.0: enabling Extended Tags
Jan 14 13:35:31.237704 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00
Jan 14 13:35:31.237825 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA
Jan 14 13:35:31.238036 kernel: pci 7fe0:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at 7fe0:00:02.0 (capable of 126.016 Gb/s with 8.0 GT/s PCIe x16 link)
Jan 14 13:35:31.238167 kernel:  sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Jan 14 13:35:31.238179 kernel: pci_bus 7fe0:00: busn_res: [bus 00-ff] end is updated to 00
Jan 14 13:35:31.238280 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Jan 14 13:35:31.238383 kernel: pci 7fe0:00:02.0: BAR 0: assigned [mem 0xfc0000000-0xfc00fffff 64bit pref]
Jan 14 13:35:31.102072 systemd-resolved[252]: Clock change detected. Flushing caches.
Jan 14 13:35:31.119286 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 14 13:35:31.184886 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 14 13:35:31.246593 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 14 13:35:31.292129 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 14 13:35:31.316407 kernel: mlx5_core 7fe0:00:02.0: enabling device (0000 -> 0002)
Jan 14 13:35:31.540141 kernel: mlx5_core 7fe0:00:02.0: firmware version: 16.30.1284
Jan 14 13:35:31.540313 kernel: hv_netvsc 0022487b-74b7-0022-487b-74b70022487b eth0: VF registering: eth1
Jan 14 13:35:31.540408 kernel: mlx5_core 7fe0:00:02.0 eth1: joined to eth0
Jan 14 13:35:31.540501 kernel: mlx5_core 7fe0:00:02.0: MLX5E: StrdRq(1) RqSz(8) StrdSz(2048) RxCqeCmprss(0 basic)
Jan 14 13:35:31.548018 kernel: mlx5_core 7fe0:00:02.0 enP32736s1: renamed from eth1
Jan 14 13:35:31.737619 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM.
Jan 14 13:35:31.854199 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT.
Jan 14 13:35:31.880032 kernel: BTRFS: device fsid 2be7cc1c-29d4-4496-b29b-8561323213d2 devid 1 transid 38 /dev/sda3 scanned by (udev-worker) (499)
Jan 14 13:35:31.893355 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A.
Jan 14 13:35:31.901405 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A.
Jan 14 13:35:31.935218 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Jan 14 13:35:31.959037 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by (udev-worker) (488)
Jan 14 13:35:31.978108 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Jan 14 13:35:32.978068 kernel:  sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Jan 14 13:35:32.978122 disk-uuid[604]: The operation has completed successfully.
Jan 14 13:35:33.044459 systemd[1]: disk-uuid.service: Deactivated successfully.
Jan 14 13:35:33.046016 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Jan 14 13:35:33.080171 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Jan 14 13:35:33.093749 sh[691]: Success
Jan 14 13:35:33.125029 kernel: device-mapper: verity: sha256 using implementation "sha256-ce"
Jan 14 13:35:33.336485 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Jan 14 13:35:33.361123 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Jan 14 13:35:33.366551 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Jan 14 13:35:33.405115 kernel: BTRFS info (device dm-0): first mount of filesystem 2be7cc1c-29d4-4496-b29b-8561323213d2
Jan 14 13:35:33.405182 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Jan 14 13:35:33.412680 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Jan 14 13:35:33.418296 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Jan 14 13:35:33.422674 kernel: BTRFS info (device dm-0): using free space tree
Jan 14 13:35:33.711347 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Jan 14 13:35:33.717448 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Jan 14 13:35:33.742237 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Jan 14 13:35:33.750170 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Jan 14 13:35:33.788629 kernel: BTRFS info (device sda6): first mount of filesystem 9f8ecb6c-ace6-4d16-8781-f4e964dc0779
Jan 14 13:35:33.788675 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Jan 14 13:35:33.793917 kernel: BTRFS info (device sda6): using free space tree
Jan 14 13:35:33.818060 kernel: BTRFS info (device sda6): auto enabling async discard
Jan 14 13:35:33.833817 systemd[1]: mnt-oem.mount: Deactivated successfully.
Jan 14 13:35:33.840065 kernel: BTRFS info (device sda6): last unmount of filesystem 9f8ecb6c-ace6-4d16-8781-f4e964dc0779
Jan 14 13:35:33.847798 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Jan 14 13:35:33.866230 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Jan 14 13:35:33.875239 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jan 14 13:35:33.897231 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jan 14 13:35:33.928607 systemd-networkd[875]: lo: Link UP
Jan 14 13:35:33.928625 systemd-networkd[875]: lo: Gained carrier
Jan 14 13:35:33.930601 systemd-networkd[875]: Enumeration completed
Jan 14 13:35:33.930713 systemd[1]: Started systemd-networkd.service - Network Configuration.
Jan 14 13:35:33.942764 systemd[1]: Reached target network.target - Network.
Jan 14 13:35:33.946970 systemd-networkd[875]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jan 14 13:35:33.946973 systemd-networkd[875]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Jan 14 13:35:34.029016 kernel: mlx5_core 7fe0:00:02.0 enP32736s1: Link up
Jan 14 13:35:34.072020 kernel: hv_netvsc 0022487b-74b7-0022-487b-74b70022487b eth0: Data path switched to VF: enP32736s1
Jan 14 13:35:34.072821 systemd-networkd[875]: enP32736s1: Link UP
Jan 14 13:35:34.073104 systemd-networkd[875]: eth0: Link UP
Jan 14 13:35:34.073463 systemd-networkd[875]: eth0: Gained carrier
Jan 14 13:35:34.073473 systemd-networkd[875]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jan 14 13:35:34.085864 systemd-networkd[875]: enP32736s1: Gained carrier
Jan 14 13:35:34.109042 systemd-networkd[875]: eth0: DHCPv4 address 10.200.20.15/24, gateway 10.200.20.1 acquired from 168.63.129.16
Jan 14 13:35:34.947085 ignition[870]: Ignition 2.20.0
Jan 14 13:35:34.947096 ignition[870]: Stage: fetch-offline
Jan 14 13:35:34.951536 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Jan 14 13:35:34.947132 ignition[870]: no configs at "/usr/lib/ignition/base.d"
Jan 14 13:35:34.947141 ignition[870]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Jan 14 13:35:34.947226 ignition[870]: parsed url from cmdline: ""
Jan 14 13:35:34.947229 ignition[870]: no config URL provided
Jan 14 13:35:34.947233 ignition[870]: reading system config file "/usr/lib/ignition/user.ign"
Jan 14 13:35:34.978197 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Jan 14 13:35:34.947241 ignition[870]: no config at "/usr/lib/ignition/user.ign"
Jan 14 13:35:34.947245 ignition[870]: failed to fetch config: resource requires networking
Jan 14 13:35:34.947424 ignition[870]: Ignition finished successfully
Jan 14 13:35:34.997217 ignition[883]: Ignition 2.20.0
Jan 14 13:35:34.997223 ignition[883]: Stage: fetch
Jan 14 13:35:34.997411 ignition[883]: no configs at "/usr/lib/ignition/base.d"
Jan 14 13:35:34.997421 ignition[883]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Jan 14 13:35:34.997523 ignition[883]: parsed url from cmdline: ""
Jan 14 13:35:34.997526 ignition[883]: no config URL provided
Jan 14 13:35:34.997531 ignition[883]: reading system config file "/usr/lib/ignition/user.ign"
Jan 14 13:35:34.997547 ignition[883]: no config at "/usr/lib/ignition/user.ign"
Jan 14 13:35:34.997572 ignition[883]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1
Jan 14 13:35:35.113153 ignition[883]: GET result: OK
Jan 14 13:35:35.113245 ignition[883]: config has been read from IMDS userdata
Jan 14 13:35:35.113316 ignition[883]: parsing config with SHA512: 4d8868346995f696b41a6b84974154998c194de7f6be1e9c81fa3ecb9346efaf80c02405c47f5bdc31dccfb726b749eb7b2fb1007bf354fe998f92e9f873dda3
Jan 14 13:35:35.118499 unknown[883]: fetched base config from "system"
Jan 14 13:35:35.118947 ignition[883]: fetch: fetch complete
Jan 14 13:35:35.118506 unknown[883]: fetched base config from "system"
Jan 14 13:35:35.118952 ignition[883]: fetch: fetch passed
Jan 14 13:35:35.118511 unknown[883]: fetched user config from "azure"
Jan 14 13:35:35.119024 ignition[883]: Ignition finished successfully
Jan 14 13:35:35.122139 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Jan 14 13:35:35.142098 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Jan 14 13:35:35.161702 ignition[890]: Ignition 2.20.0
Jan 14 13:35:35.173292 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Jan 14 13:35:35.161710 ignition[890]: Stage: kargs
Jan 14 13:35:35.161879 ignition[890]: no configs at "/usr/lib/ignition/base.d"
Jan 14 13:35:35.161888 ignition[890]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Jan 14 13:35:35.191276 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Jan 14 13:35:35.162771 ignition[890]: kargs: kargs passed
Jan 14 13:35:35.162811 ignition[890]: Ignition finished successfully
Jan 14 13:35:35.217943 ignition[896]: Ignition 2.20.0
Jan 14 13:35:35.221583 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Jan 14 13:35:35.217950 ignition[896]: Stage: disks
Jan 14 13:35:35.231107 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Jan 14 13:35:35.218169 ignition[896]: no configs at "/usr/lib/ignition/base.d"
Jan 14 13:35:35.242069 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Jan 14 13:35:35.218179 ignition[896]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Jan 14 13:35:35.251799 systemd-networkd[875]: enP32736s1: Gained IPv6LL
Jan 14 13:35:35.219274 ignition[896]: disks: disks passed
Jan 14 13:35:35.252070 systemd[1]: Reached target local-fs.target - Local File Systems.
Jan 14 13:35:35.219321 ignition[896]: Ignition finished successfully
Jan 14 13:35:35.263682 systemd[1]: Reached target sysinit.target - System Initialization.
Jan 14 13:35:35.274645 systemd[1]: Reached target basic.target - Basic System.
Jan 14 13:35:35.294228 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Jan 14 13:35:35.387201 systemd-fsck[905]: ROOT: clean, 14/7326000 files, 477710/7359488 blocks
Jan 14 13:35:35.397714 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Jan 14 13:35:35.416200 systemd[1]: Mounting sysroot.mount - /sysroot...
Jan 14 13:35:35.476057 kernel: EXT4-fs (sda9): mounted filesystem f9a95e53-2d63-4443-b523-cb2108fb48f6 r/w with ordered data mode. Quota mode: none.
Jan 14 13:35:35.476826 systemd[1]: Mounted sysroot.mount - /sysroot.
Jan 14 13:35:35.486335 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Jan 14 13:35:35.506126 systemd-networkd[875]: eth0: Gained IPv6LL
Jan 14 13:35:35.533073 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jan 14 13:35:35.543105 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Jan 14 13:35:35.550189 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Jan 14 13:35:35.592107 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (916)
Jan 14 13:35:35.592131 kernel: BTRFS info (device sda6): first mount of filesystem 9f8ecb6c-ace6-4d16-8781-f4e964dc0779
Jan 14 13:35:35.592141 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Jan 14 13:35:35.578378 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Jan 14 13:35:35.615447 kernel: BTRFS info (device sda6): using free space tree
Jan 14 13:35:35.578415 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Jan 14 13:35:35.603666 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Jan 14 13:35:35.623197 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Jan 14 13:35:35.653154 kernel: BTRFS info (device sda6): auto enabling async discard
Jan 14 13:35:35.654703 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jan 14 13:35:36.191612 coreos-metadata[918]: Jan 14 13:35:36.191 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Jan 14 13:35:36.200221 coreos-metadata[918]: Jan 14 13:35:36.200 INFO Fetch successful
Jan 14 13:35:36.200221 coreos-metadata[918]: Jan 14 13:35:36.200 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1
Jan 14 13:35:36.218373 coreos-metadata[918]: Jan 14 13:35:36.218 INFO Fetch successful
Jan 14 13:35:36.234056 coreos-metadata[918]: Jan 14 13:35:36.234 INFO wrote hostname ci-4186.1.0-a-e83668d6e0 to /sysroot/etc/hostname
Jan 14 13:35:36.244708 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Jan 14 13:35:36.437983 initrd-setup-root[947]: cut: /sysroot/etc/passwd: No such file or directory
Jan 14 13:35:36.507782 initrd-setup-root[954]: cut: /sysroot/etc/group: No such file or directory
Jan 14 13:35:36.515433 initrd-setup-root[961]: cut: /sysroot/etc/shadow: No such file or directory
Jan 14 13:35:36.524753 initrd-setup-root[968]: cut: /sysroot/etc/gshadow: No such file or directory
Jan 14 13:35:37.148893 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Jan 14 13:35:37.165279 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Jan 14 13:35:37.174221 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Jan 14 13:35:37.196277 kernel: BTRFS info (device sda6): last unmount of filesystem 9f8ecb6c-ace6-4d16-8781-f4e964dc0779
Jan 14 13:35:37.192118 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Jan 14 13:35:37.219782 ignition[1035]: INFO     : Ignition 2.20.0
Jan 14 13:35:37.219782 ignition[1035]: INFO     : Stage: mount
Jan 14 13:35:37.240904 ignition[1035]: INFO     : no configs at "/usr/lib/ignition/base.d"
Jan 14 13:35:37.240904 ignition[1035]: INFO     : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Jan 14 13:35:37.240904 ignition[1035]: INFO     : mount: mount passed
Jan 14 13:35:37.240904 ignition[1035]: INFO     : Ignition finished successfully
Jan 14 13:35:37.221487 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Jan 14 13:35:37.227476 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Jan 14 13:35:37.258246 systemd[1]: Starting ignition-files.service - Ignition (files)...
Jan 14 13:35:37.278201 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jan 14 13:35:37.311078 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/sda6 scanned by mount (1047)
Jan 14 13:35:37.324006 kernel: BTRFS info (device sda6): first mount of filesystem 9f8ecb6c-ace6-4d16-8781-f4e964dc0779
Jan 14 13:35:37.324049 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Jan 14 13:35:37.328161 kernel: BTRFS info (device sda6): using free space tree
Jan 14 13:35:37.335176 kernel: BTRFS info (device sda6): auto enabling async discard
Jan 14 13:35:37.336326 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jan 14 13:35:37.367659 ignition[1064]: INFO     : Ignition 2.20.0
Jan 14 13:35:37.367659 ignition[1064]: INFO     : Stage: files
Jan 14 13:35:37.367659 ignition[1064]: INFO     : no configs at "/usr/lib/ignition/base.d"
Jan 14 13:35:37.367659 ignition[1064]: INFO     : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Jan 14 13:35:37.367659 ignition[1064]: DEBUG    : files: compiled without relabeling support, skipping
Jan 14 13:35:37.398284 ignition[1064]: INFO     : files: ensureUsers: op(1): [started]  creating or modifying user "core"
Jan 14 13:35:37.398284 ignition[1064]: DEBUG    : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Jan 14 13:35:37.447351 ignition[1064]: INFO     : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Jan 14 13:35:37.455187 ignition[1064]: INFO     : files: ensureUsers: op(2): [started]  adding ssh keys to user "core"
Jan 14 13:35:37.455187 ignition[1064]: INFO     : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Jan 14 13:35:37.448353 unknown[1064]: wrote ssh authorized keys file for user: core
Jan 14 13:35:37.475160 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(3): [started]  writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Jan 14 13:35:37.475160 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1
Jan 14 13:35:37.660254 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Jan 14 13:35:37.783546 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Jan 14 13:35:37.783546 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(4): [started]  writing file "/sysroot/home/core/install.sh"
Jan 14 13:35:37.805070 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Jan 14 13:35:37.805070 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(5): [started]  writing file "/sysroot/home/core/nginx.yaml"
Jan 14 13:35:37.805070 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Jan 14 13:35:37.805070 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(6): [started]  writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 14 13:35:37.805070 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 14 13:35:37.805070 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(7): [started]  writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 14 13:35:37.805070 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 14 13:35:37.805070 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(8): [started]  writing file "/sysroot/etc/flatcar/update.conf"
Jan 14 13:35:37.805070 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Jan 14 13:35:37.805070 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(9): [started]  writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.29.2-arm64.raw"
Jan 14 13:35:37.805070 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.29.2-arm64.raw"
Jan 14 13:35:37.805070 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(a): [started]  writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.29.2-arm64.raw"
Jan 14 13:35:37.805070 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.29.2-arm64.raw: attempt #1
Jan 14 13:35:38.284676 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Jan 14 13:35:39.120601 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.29.2-arm64.raw"
Jan 14 13:35:39.120601 ignition[1064]: INFO     : files: op(b): [started]  processing unit "prepare-helm.service"
Jan 14 13:35:39.181677 ignition[1064]: INFO     : files: op(b): op(c): [started]  writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 14 13:35:39.193853 ignition[1064]: INFO     : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 14 13:35:39.193853 ignition[1064]: INFO     : files: op(b): [finished] processing unit "prepare-helm.service"
Jan 14 13:35:39.193853 ignition[1064]: INFO     : files: op(d): [started]  setting preset to enabled for "prepare-helm.service"
Jan 14 13:35:39.193853 ignition[1064]: INFO     : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Jan 14 13:35:39.193853 ignition[1064]: INFO     : files: createResultFile: createFiles: op(e): [started]  writing file "/sysroot/etc/.ignition-result.json"
Jan 14 13:35:39.193853 ignition[1064]: INFO     : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Jan 14 13:35:39.193853 ignition[1064]: INFO     : files: files passed
Jan 14 13:35:39.193853 ignition[1064]: INFO     : Ignition finished successfully
Jan 14 13:35:39.194086 systemd[1]: Finished ignition-files.service - Ignition (files).
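The ignition[1064] messages above are the "files" stage acting on a user-supplied Ignition config. The config itself is not part of this log; the following is a minimal, hypothetical Python sketch of an Ignition v3-style document that would produce roughly the operations shown (SSH key for "core", the Helm tarball and YAML files, the kubernetes sysext link, and an enabled prepare-helm.service unit). The /sysroot prefix in the log is just the initramfs mount point of the real root; the config uses paths relative to /. Every value below that is not literally visible in the log (spec version, SSH key, unit body) is a placeholder.

import json

# Hypothetical reconstruction: field names follow the Ignition v3 spec,
# but the SSH key, unit body, and spec version are placeholders.
config = {
    "ignition": {"version": "3.3.0"},
    "passwd": {
        "users": [
            {"name": "core",
             "sshAuthorizedKeys": ["ssh-ed25519 AAAA... core@example"]},
        ]
    },
    "storage": {
        "files": [
            {"path": "/opt/helm-v3.13.2-linux-arm64.tar.gz",
             "contents": {"source": "https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz"}},
            {"path": "/home/core/install.sh"},
            {"path": "/home/core/nginx.yaml"},
            {"path": "/home/core/nfs-pod.yaml"},
            {"path": "/home/core/nfs-pvc.yaml"},
            {"path": "/etc/flatcar/update.conf"},
            {"path": "/opt/extensions/kubernetes/kubernetes-v1.29.2-arm64.raw",
             "contents": {"source": "https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.29.2-arm64.raw"}},
        ],
        "links": [
            {"path": "/etc/extensions/kubernetes.raw",
             "target": "/opt/extensions/kubernetes/kubernetes-v1.29.2-arm64.raw"},
        ],
    },
    "systemd": {
        "units": [
            {"name": "prepare-helm.service",
             "enabled": True,
             "contents": "[Unit]\nDescription=placeholder unit body\n"},
        ]
    },
}

print(json.dumps(config, indent=2))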
Jan 14 13:35:39.242248 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Jan 14 13:35:39.261164 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Jan 14 13:35:39.316561 initrd-setup-root-after-ignition[1091]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jan 14 13:35:39.316561 initrd-setup-root-after-ignition[1091]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Jan 14 13:35:39.274595 systemd[1]: ignition-quench.service: Deactivated successfully.
Jan 14 13:35:39.341073 initrd-setup-root-after-ignition[1095]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jan 14 13:35:39.274681 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Jan 14 13:35:39.283266 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Jan 14 13:35:39.298634 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Jan 14 13:35:39.334269 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Jan 14 13:35:39.376940 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 14 13:35:39.377067 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Jan 14 13:35:39.388781 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Jan 14 13:35:39.400614 systemd[1]: Reached target initrd.target - Initrd Default Target.
Jan 14 13:35:39.413393 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Jan 14 13:35:39.416182 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Jan 14 13:35:39.453166 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jan 14 13:35:39.469209 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Jan 14 13:35:39.488625 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 14 13:35:39.490018 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Jan 14 13:35:39.501086 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Jan 14 13:35:39.513849 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 14 13:35:39.526497 systemd[1]: Stopped target timers.target - Timer Units.
Jan 14 13:35:39.538040 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 14 13:35:39.538110 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jan 14 13:35:39.554344 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Jan 14 13:35:39.566837 systemd[1]: Stopped target basic.target - Basic System.
Jan 14 13:35:39.577668 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Jan 14 13:35:39.589298 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Jan 14 13:35:39.602562 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Jan 14 13:35:39.615549 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Jan 14 13:35:39.627759 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Jan 14 13:35:39.640146 systemd[1]: Stopped target sysinit.target - System Initialization.
Jan 14 13:35:39.652394 systemd[1]: Stopped target local-fs.target - Local File Systems.
Jan 14 13:35:39.663319 systemd[1]: Stopped target swap.target - Swaps.
Jan 14 13:35:39.672855 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 14 13:35:39.672933 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Jan 14 13:35:39.687567 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Jan 14 13:35:39.694076 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 14 13:35:39.706490 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Jan 14 13:35:39.712056 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 14 13:35:39.719006 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 14 13:35:39.719075 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Jan 14 13:35:39.736462 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Jan 14 13:35:39.736507 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Jan 14 13:35:39.751306 systemd[1]: ignition-files.service: Deactivated successfully.
Jan 14 13:35:39.751355 systemd[1]: Stopped ignition-files.service - Ignition (files).
Jan 14 13:35:39.762339 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Jan 14 13:35:39.762382 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Jan 14 13:35:39.826783 ignition[1117]: INFO     : Ignition 2.20.0
Jan 14 13:35:39.826783 ignition[1117]: INFO     : Stage: umount
Jan 14 13:35:39.826783 ignition[1117]: INFO     : no configs at "/usr/lib/ignition/base.d"
Jan 14 13:35:39.826783 ignition[1117]: INFO     : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Jan 14 13:35:39.826783 ignition[1117]: INFO     : umount: umount passed
Jan 14 13:35:39.826783 ignition[1117]: INFO     : Ignition finished successfully
Jan 14 13:35:39.794184 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Jan 14 13:35:39.811356 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 14 13:35:39.811431 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 14 13:35:39.823110 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Jan 14 13:35:39.831808 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 14 13:35:39.831875 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 14 13:35:39.845415 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 14 13:35:39.845475 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Jan 14 13:35:39.866750 systemd[1]: ignition-mount.service: Deactivated successfully.
Jan 14 13:35:39.866840 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Jan 14 13:35:39.876437 systemd[1]: ignition-disks.service: Deactivated successfully.
Jan 14 13:35:39.876500 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Jan 14 13:35:39.886499 systemd[1]: ignition-kargs.service: Deactivated successfully.
Jan 14 13:35:39.886544 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Jan 14 13:35:39.892492 systemd[1]: ignition-fetch.service: Deactivated successfully.
Jan 14 13:35:39.892533 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Jan 14 13:35:39.904046 systemd[1]: Stopped target network.target - Network.
Jan 14 13:35:39.917754 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Jan 14 13:35:39.917810 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Jan 14 13:35:39.937818 systemd[1]: Stopped target paths.target - Path Units.
Jan 14 13:35:39.948741 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 14 13:35:39.954382 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 14 13:35:39.962016 systemd[1]: Stopped target slices.target - Slice Units.
Jan 14 13:35:39.978427 systemd[1]: Stopped target sockets.target - Socket Units.
Jan 14 13:35:39.988197 systemd[1]: iscsid.socket: Deactivated successfully.
Jan 14 13:35:39.988241 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Jan 14 13:35:39.998553 systemd[1]: iscsiuio.socket: Deactivated successfully.
Jan 14 13:35:39.998590 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jan 14 13:35:40.010063 systemd[1]: ignition-setup.service: Deactivated successfully.
Jan 14 13:35:40.010115 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Jan 14 13:35:40.020373 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Jan 14 13:35:40.020412 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Jan 14 13:35:40.231126 kernel: hv_netvsc 0022487b-74b7-0022-487b-74b70022487b eth0: Data path switched from VF: enP32736s1
Jan 14 13:35:40.031898 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Jan 14 13:35:40.042085 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Jan 14 13:35:40.053939 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Jan 14 13:35:40.059281 systemd-networkd[875]: eth0: DHCPv6 lease lost
Jan 14 13:35:40.061306 systemd[1]: systemd-networkd.service: Deactivated successfully.
Jan 14 13:35:40.061421 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Jan 14 13:35:40.072251 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Jan 14 13:35:40.072312 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Jan 14 13:35:40.093147 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Jan 14 13:35:40.101611 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Jan 14 13:35:40.101679 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jan 14 13:35:40.114447 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 14 13:35:40.137332 systemd[1]: systemd-resolved.service: Deactivated successfully.
Jan 14 13:35:40.137429 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Jan 14 13:35:40.159302 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 14 13:35:40.161954 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 14 13:35:40.171518 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 14 13:35:40.171589 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Jan 14 13:35:40.188461 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 14 13:35:40.188515 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 14 13:35:40.200289 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 14 13:35:40.200349 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Jan 14 13:35:40.225482 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 14 13:35:40.225549 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Jan 14 13:35:40.242531 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jan 14 13:35:40.242592 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 14 13:35:40.271213 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Jan 14 13:35:40.288130 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 14 13:35:40.288190 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Jan 14 13:35:40.300376 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 14 13:35:40.512102 systemd-journald[218]: Received SIGTERM from PID 1 (systemd).
Jan 14 13:35:40.300422 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Jan 14 13:35:40.312374 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 14 13:35:40.312421 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 14 13:35:40.325055 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 14 13:35:40.325101 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 14 13:35:40.337985 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 14 13:35:40.338052 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 14 13:35:40.352481 systemd[1]: sysroot-boot.service: Deactivated successfully.
Jan 14 13:35:40.352599 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Jan 14 13:35:40.364909 systemd[1]: network-cleanup.service: Deactivated successfully.
Jan 14 13:35:40.365028 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Jan 14 13:35:40.376709 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 14 13:35:40.376786 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Jan 14 13:35:40.389938 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Jan 14 13:35:40.401284 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Jan 14 13:35:40.401364 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Jan 14 13:35:40.432236 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Jan 14 13:35:40.453691 systemd[1]: Switching root.
Jan 14 13:35:40.616271 systemd-journald[218]: Journal stopped
Jan 14 13:35:29.326135 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Jan 14 13:35:29.326157 kernel: Linux version 6.6.71-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241116 p3) 14.2.1 20241116, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT Mon Jan 13 18:56:28 -00 2025
Jan 14 13:35:29.326165 kernel: KASLR enabled
Jan 14 13:35:29.326170 kernel: earlycon: pl11 at MMIO 0x00000000effec000 (options '')
Jan 14 13:35:29.326178 kernel: printk: bootconsole [pl11] enabled
Jan 14 13:35:29.326183 kernel: efi: EFI v2.7 by EDK II
Jan 14 13:35:29.326190 kernel: efi: ACPI 2.0=0x3fd5f018 SMBIOS=0x3e580000 SMBIOS 3.0=0x3e560000 MEMATTR=0x3f20e698 RNG=0x3fd5f998 MEMRESERVE=0x3e477598 
Jan 14 13:35:29.326196 kernel: random: crng init done
Jan 14 13:35:29.326202 kernel: secureboot: Secure boot disabled
Jan 14 13:35:29.326208 kernel: ACPI: Early table checksum verification disabled
Jan 14 13:35:29.326214 kernel: ACPI: RSDP 0x000000003FD5F018 000024 (v02 VRTUAL)
Jan 14 13:35:29.326219 kernel: ACPI: XSDT 0x000000003FD5FF18 00006C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Jan 14 13:35:29.326225 kernel: ACPI: FACP 0x000000003FD5FC18 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
Jan 14 13:35:29.326232 kernel: ACPI: DSDT 0x000000003FD41018 01DFCD (v02 MSFTVM DSDT01   00000001 INTL 20230628)
Jan 14 13:35:29.326240 kernel: ACPI: DBG2 0x000000003FD5FB18 000072 (v00 VRTUAL MICROSFT 00000001 MSFT 00000001)
Jan 14 13:35:29.326246 kernel: ACPI: GTDT 0x000000003FD5FD98 000060 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Jan 14 13:35:29.326252 kernel: ACPI: OEM0 0x000000003FD5F098 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Jan 14 13:35:29.326260 kernel: ACPI: SPCR 0x000000003FD5FA98 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Jan 14 13:35:29.326266 kernel: ACPI: APIC 0x000000003FD5F818 0000FC (v04 VRTUAL MICROSFT 00000001 MSFT 00000001)
Jan 14 13:35:29.326272 kernel: ACPI: SRAT 0x000000003FD5F198 000234 (v03 VRTUAL MICROSFT 00000001 MSFT 00000001)
Jan 14 13:35:29.326278 kernel: ACPI: PPTT 0x000000003FD5F418 000120 (v01 VRTUAL MICROSFT 00000000 MSFT 00000000)
Jan 14 13:35:29.326284 kernel: ACPI: BGRT 0x000000003FD5FE98 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Jan 14 13:35:29.326290 kernel: ACPI: SPCR: console: pl011,mmio32,0xeffec000,115200
Jan 14 13:35:29.326296 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff]
Jan 14 13:35:29.326302 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x1bfffffff]
Jan 14 13:35:29.326309 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1c0000000-0xfbfffffff]
Jan 14 13:35:29.326315 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff]
Jan 14 13:35:29.326321 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff]
Jan 14 13:35:29.326329 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff]
Jan 14 13:35:29.326335 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff]
Jan 14 13:35:29.326341 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff]
Jan 14 13:35:29.326347 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff]
Jan 14 13:35:29.326353 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff]
Jan 14 13:35:29.326359 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff]
Jan 14 13:35:29.326365 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff]
Jan 14 13:35:29.326371 kernel: NUMA: NODE_DATA [mem 0x1bf7ee800-0x1bf7f3fff]
Jan 14 13:35:29.326377 kernel: Zone ranges:
Jan 14 13:35:29.326383 kernel:   DMA      [mem 0x0000000000000000-0x00000000ffffffff]
Jan 14 13:35:29.326389 kernel:   DMA32    empty
Jan 14 13:35:29.326395 kernel:   Normal   [mem 0x0000000100000000-0x00000001bfffffff]
Jan 14 13:35:29.326405 kernel: Movable zone start for each node
Jan 14 13:35:29.326411 kernel: Early memory node ranges
Jan 14 13:35:29.326418 kernel:   node   0: [mem 0x0000000000000000-0x00000000007fffff]
Jan 14 13:35:29.326424 kernel:   node   0: [mem 0x0000000000824000-0x000000003e45ffff]
Jan 14 13:35:29.326431 kernel:   node   0: [mem 0x000000003e460000-0x000000003e46ffff]
Jan 14 13:35:29.326438 kernel:   node   0: [mem 0x000000003e470000-0x000000003e54ffff]
Jan 14 13:35:29.326445 kernel:   node   0: [mem 0x000000003e550000-0x000000003e87ffff]
Jan 14 13:35:29.326451 kernel:   node   0: [mem 0x000000003e880000-0x000000003fc7ffff]
Jan 14 13:35:29.326457 kernel:   node   0: [mem 0x000000003fc80000-0x000000003fcfffff]
Jan 14 13:35:29.326464 kernel:   node   0: [mem 0x000000003fd00000-0x000000003fffffff]
Jan 14 13:35:29.326470 kernel:   node   0: [mem 0x0000000100000000-0x00000001bfffffff]
Jan 14 13:35:29.326477 kernel: Initmem setup node 0 [mem 0x0000000000000000-0x00000001bfffffff]
Jan 14 13:35:29.326484 kernel: On node 0, zone DMA: 36 pages in unavailable ranges
Jan 14 13:35:29.326490 kernel: psci: probing for conduit method from ACPI.
Jan 14 13:35:29.326497 kernel: psci: PSCIv1.1 detected in firmware.
Jan 14 13:35:29.326503 kernel: psci: Using standard PSCI v0.2 function IDs
Jan 14 13:35:29.326510 kernel: psci: MIGRATE_INFO_TYPE not supported.
Jan 14 13:35:29.326518 kernel: psci: SMC Calling Convention v1.4
Jan 14 13:35:29.328563 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Jan 14 13:35:29.328571 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Jan 14 13:35:29.328578 kernel: percpu: Embedded 31 pages/cpu s86696 r8192 d32088 u126976
Jan 14 13:35:29.328585 kernel: pcpu-alloc: s86696 r8192 d32088 u126976 alloc=31*4096
Jan 14 13:35:29.328592 kernel: pcpu-alloc: [0] 0 [0] 1 
Jan 14 13:35:29.328599 kernel: Detected PIPT I-cache on CPU0
Jan 14 13:35:29.328606 kernel: CPU features: detected: GIC system register CPU interface
Jan 14 13:35:29.328613 kernel: CPU features: detected: Hardware dirty bit management
Jan 14 13:35:29.328619 kernel: CPU features: detected: Spectre-BHB
Jan 14 13:35:29.328626 kernel: CPU features: kernel page table isolation forced ON by KASLR
Jan 14 13:35:29.328638 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Jan 14 13:35:29.328645 kernel: CPU features: detected: ARM erratum 1418040
Jan 14 13:35:29.328651 kernel: CPU features: detected: ARM erratum 1542419 (kernel portion)
Jan 14 13:35:29.328658 kernel: CPU features: detected: SSBS not fully self-synchronizing
Jan 14 13:35:29.328664 kernel: alternatives: applying boot alternatives
Jan 14 13:35:29.328672 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=9798117b3b15ef802e3d618077f87253cc08e0d5280b8fe28b307e7558b7ebcc
Jan 14 13:35:29.328680 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
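The command line above mixes standard kernel parameters with Flatcar- and Ignition-specific ones (flatcar.first_boot=detected, flatcar.oem.id=azure, verity.usrhash=..., mount.usr=/dev/mapper/usr). As a rough, hypothetical illustration of how initramfs tooling splits such a line into flags and key=value pairs (this is not the actual Flatcar or dracut code), a few lines of Python suffice:

# Hypothetical helper: split a kernel command line into flags and key=value pairs.
# Note: it does not handle quoted values containing spaces.
def parse_cmdline(cmdline: str) -> dict:
    params = {}
    for token in cmdline.split():
        key, sep, value = token.partition("=")
        params[key] = value if sep else True  # bare tokens become boolean flags
    return params

# Excerpt of the command line logged above.
sample = "flatcar.first_boot=detected flatcar.oem.id=azure console=ttyAMA0,115200n8 acpi=force"
print(parse_cmdline(sample))
# {'flatcar.first_boot': 'detected', 'flatcar.oem.id': 'azure', 'console': 'ttyAMA0,115200n8', 'acpi': 'force'}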
Jan 14 13:35:29.328686 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 14 13:35:29.328693 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Jan 14 13:35:29.328699 kernel: Fallback order for Node 0: 0 
Jan 14 13:35:29.328706 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 1032156
Jan 14 13:35:29.328715 kernel: Policy zone: Normal
Jan 14 13:35:29.328721 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 14 13:35:29.328728 kernel: software IO TLB: area num 2.
Jan 14 13:35:29.328734 kernel: software IO TLB: mapped [mem 0x000000003a460000-0x000000003e460000] (64MB)
Jan 14 13:35:29.328741 kernel: Memory: 3982052K/4194160K available (10304K kernel code, 2184K rwdata, 8092K rodata, 39936K init, 897K bss, 212108K reserved, 0K cma-reserved)
Jan 14 13:35:29.328748 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Jan 14 13:35:29.328754 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 14 13:35:29.328762 kernel: rcu:         RCU event tracing is enabled.
Jan 14 13:35:29.328768 kernel: rcu:         RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Jan 14 13:35:29.328775 kernel:         Trampoline variant of Tasks RCU enabled.
Jan 14 13:35:29.328781 kernel:         Tracing variant of Tasks RCU enabled.
Jan 14 13:35:29.328789 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 14 13:35:29.328796 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Jan 14 13:35:29.328803 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Jan 14 13:35:29.328809 kernel: GICv3: 960 SPIs implemented
Jan 14 13:35:29.328816 kernel: GICv3: 0 Extended SPIs implemented
Jan 14 13:35:29.328822 kernel: Root IRQ handler: gic_handle_irq
Jan 14 13:35:29.328829 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Jan 14 13:35:29.328836 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000effee000
Jan 14 13:35:29.328842 kernel: ITS: No ITS available, not enabling LPIs
Jan 14 13:35:29.328849 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 14 13:35:29.328855 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 14 13:35:29.328862 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Jan 14 13:35:29.328870 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Jan 14 13:35:29.328877 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Jan 14 13:35:29.328884 kernel: Console: colour dummy device 80x25
Jan 14 13:35:29.328891 kernel: printk: console [tty1] enabled
Jan 14 13:35:29.328897 kernel: ACPI: Core revision 20230628
Jan 14 13:35:29.328904 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Jan 14 13:35:29.328911 kernel: pid_max: default: 32768 minimum: 301
Jan 14 13:35:29.328918 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Jan 14 13:35:29.328925 kernel: landlock: Up and running.
Jan 14 13:35:29.328933 kernel: SELinux:  Initializing.
Jan 14 13:35:29.328939 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Jan 14 13:35:29.328946 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Jan 14 13:35:29.328953 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jan 14 13:35:29.328960 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jan 14 13:35:29.328967 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3a8030, hints 0xe, misc 0x31e1
Jan 14 13:35:29.328974 kernel: Hyper-V: Host Build 10.0.22477.1594-1-0
Jan 14 13:35:29.328987 kernel: Hyper-V: enabling crash_kexec_post_notifiers
Jan 14 13:35:29.328995 kernel: rcu: Hierarchical SRCU implementation.
Jan 14 13:35:29.329002 kernel: rcu:         Max phase no-delay instances is 400.
Jan 14 13:35:29.329009 kernel: Remapping and enabling EFI services.
Jan 14 13:35:29.329016 kernel: smp: Bringing up secondary CPUs ...
Jan 14 13:35:29.329024 kernel: Detected PIPT I-cache on CPU1
Jan 14 13:35:29.329031 kernel: GICv3: CPU1: found redistributor 1 region 1:0x00000000f000e000
Jan 14 13:35:29.329039 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 14 13:35:29.329046 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Jan 14 13:35:29.329053 kernel: smp: Brought up 1 node, 2 CPUs
Jan 14 13:35:29.329061 kernel: SMP: Total of 2 processors activated.
Jan 14 13:35:29.329068 kernel: CPU features: detected: 32-bit EL0 Support
Jan 14 13:35:29.329075 kernel: CPU features: detected: Instruction cache invalidation not required for I/D coherence
Jan 14 13:35:29.329082 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Jan 14 13:35:29.329090 kernel: CPU features: detected: CRC32 instructions
Jan 14 13:35:29.329097 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Jan 14 13:35:29.329104 kernel: CPU features: detected: LSE atomic instructions
Jan 14 13:35:29.329111 kernel: CPU features: detected: Privileged Access Never
Jan 14 13:35:29.329118 kernel: CPU: All CPU(s) started at EL1
Jan 14 13:35:29.329126 kernel: alternatives: applying system-wide alternatives
Jan 14 13:35:29.329133 kernel: devtmpfs: initialized
Jan 14 13:35:29.329141 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 14 13:35:29.329148 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Jan 14 13:35:29.329155 kernel: pinctrl core: initialized pinctrl subsystem
Jan 14 13:35:29.329162 kernel: SMBIOS 3.1.0 present.
Jan 14 13:35:29.329169 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 09/28/2024
Jan 14 13:35:29.329176 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 14 13:35:29.329184 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Jan 14 13:35:29.329192 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 14 13:35:29.329200 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 14 13:35:29.329207 kernel: audit: initializing netlink subsys (disabled)
Jan 14 13:35:29.329214 kernel: audit: type=2000 audit(0.047:1): state=initialized audit_enabled=0 res=1
Jan 14 13:35:29.329221 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 14 13:35:29.329229 kernel: cpuidle: using governor menu
Jan 14 13:35:29.329236 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Jan 14 13:35:29.329243 kernel: ASID allocator initialised with 32768 entries
Jan 14 13:35:29.329250 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 14 13:35:29.329258 kernel: Serial: AMBA PL011 UART driver
Jan 14 13:35:29.329266 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Jan 14 13:35:29.329273 kernel: Modules: 0 pages in range for non-PLT usage
Jan 14 13:35:29.329280 kernel: Modules: 508880 pages in range for PLT usage
Jan 14 13:35:29.329288 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 14 13:35:29.329295 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Jan 14 13:35:29.329302 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Jan 14 13:35:29.329309 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Jan 14 13:35:29.329316 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 14 13:35:29.329325 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Jan 14 13:35:29.329332 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Jan 14 13:35:29.329339 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Jan 14 13:35:29.329346 kernel: ACPI: Added _OSI(Module Device)
Jan 14 13:35:29.329353 kernel: ACPI: Added _OSI(Processor Device)
Jan 14 13:35:29.329360 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Jan 14 13:35:29.329367 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 14 13:35:29.329374 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 14 13:35:29.329381 kernel: ACPI: Interpreter enabled
Jan 14 13:35:29.329390 kernel: ACPI: Using GIC for interrupt routing
Jan 14 13:35:29.329397 kernel: ARMH0011:00: ttyAMA0 at MMIO 0xeffec000 (irq = 12, base_baud = 0) is a SBSA
Jan 14 13:35:29.329404 kernel: printk: console [ttyAMA0] enabled
Jan 14 13:35:29.329411 kernel: printk: bootconsole [pl11] disabled
Jan 14 13:35:29.329418 kernel: ARMH0011:01: ttyAMA1 at MMIO 0xeffeb000 (irq = 13, base_baud = 0) is a SBSA
Jan 14 13:35:29.329426 kernel: iommu: Default domain type: Translated
Jan 14 13:35:29.329433 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Jan 14 13:35:29.329440 kernel: efivars: Registered efivars operations
Jan 14 13:35:29.329447 kernel: vgaarb: loaded
Jan 14 13:35:29.329455 kernel: clocksource: Switched to clocksource arch_sys_counter
Jan 14 13:35:29.329462 kernel: VFS: Disk quotas dquot_6.6.0
Jan 14 13:35:29.329470 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 14 13:35:29.329477 kernel: pnp: PnP ACPI init
Jan 14 13:35:29.329484 kernel: pnp: PnP ACPI: found 0 devices
Jan 14 13:35:29.329491 kernel: NET: Registered PF_INET protocol family
Jan 14 13:35:29.329498 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 14 13:35:29.329505 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Jan 14 13:35:29.329512 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 14 13:35:29.329529 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 14 13:35:29.329536 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Jan 14 13:35:29.329544 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Jan 14 13:35:29.329551 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Jan 14 13:35:29.329558 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Jan 14 13:35:29.329565 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 14 13:35:29.329572 kernel: PCI: CLS 0 bytes, default 64
Jan 14 13:35:29.329579 kernel: kvm [1]: HYP mode not available
Jan 14 13:35:29.329587 kernel: Initialise system trusted keyrings
Jan 14 13:35:29.329595 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Jan 14 13:35:29.329603 kernel: Key type asymmetric registered
Jan 14 13:35:29.329609 kernel: Asymmetric key parser 'x509' registered
Jan 14 13:35:29.329616 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Jan 14 13:35:29.329623 kernel: io scheduler mq-deadline registered
Jan 14 13:35:29.329630 kernel: io scheduler kyber registered
Jan 14 13:35:29.329638 kernel: io scheduler bfq registered
Jan 14 13:35:29.329645 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 14 13:35:29.329652 kernel: thunder_xcv, ver 1.0
Jan 14 13:35:29.329660 kernel: thunder_bgx, ver 1.0
Jan 14 13:35:29.329667 kernel: nicpf, ver 1.0
Jan 14 13:35:29.329674 kernel: nicvf, ver 1.0
Jan 14 13:35:29.329810 kernel: rtc-efi rtc-efi.0: registered as rtc0
Jan 14 13:35:29.329882 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-01-14T13:35:28 UTC (1736861728)
Jan 14 13:35:29.329892 kernel: efifb: probing for efifb
Jan 14 13:35:29.329899 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
Jan 14 13:35:29.329906 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
Jan 14 13:35:29.329915 kernel: efifb: scrolling: redraw
Jan 14 13:35:29.329923 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Jan 14 13:35:29.329930 kernel: Console: switching to colour frame buffer device 128x48
Jan 14 13:35:29.329937 kernel: fb0: EFI VGA frame buffer device
Jan 14 13:35:29.329944 kernel: SMCCC: SOC_ID: ARCH_SOC_ID not implemented, skipping ....
Jan 14 13:35:29.329951 kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 14 13:35:29.329958 kernel: No ACPI PMU IRQ for CPU0
Jan 14 13:35:29.329965 kernel: No ACPI PMU IRQ for CPU1
Jan 14 13:35:29.329972 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 1 counters available
Jan 14 13:35:29.329981 kernel: watchdog: Delayed init of the lockup detector failed: -19
Jan 14 13:35:29.329988 kernel: watchdog: Hard watchdog permanently disabled
Jan 14 13:35:29.329995 kernel: NET: Registered PF_INET6 protocol family
Jan 14 13:35:29.330001 kernel: Segment Routing with IPv6
Jan 14 13:35:29.330009 kernel: In-situ OAM (IOAM) with IPv6
Jan 14 13:35:29.330016 kernel: NET: Registered PF_PACKET protocol family
Jan 14 13:35:29.330023 kernel: Key type dns_resolver registered
Jan 14 13:35:29.330029 kernel: registered taskstats version 1
Jan 14 13:35:29.330036 kernel: Loading compiled-in X.509 certificates
Jan 14 13:35:29.330045 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.71-flatcar: 46cb4d1b22f3a5974766fe7d7b651e2f296d4fe0'
Jan 14 13:35:29.330052 kernel: Key type .fscrypt registered
Jan 14 13:35:29.330059 kernel: Key type fscrypt-provisioning registered
Jan 14 13:35:29.330066 kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 14 13:35:29.330073 kernel: ima: Allocated hash algorithm: sha1
Jan 14 13:35:29.330080 kernel: ima: No architecture policies found
Jan 14 13:35:29.330087 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Jan 14 13:35:29.330095 kernel: clk: Disabling unused clocks
Jan 14 13:35:29.330102 kernel: Freeing unused kernel memory: 39936K
Jan 14 13:35:29.330110 kernel: Run /init as init process
Jan 14 13:35:29.330117 kernel:   with arguments:
Jan 14 13:35:29.330124 kernel:     /init
Jan 14 13:35:29.330131 kernel:   with environment:
Jan 14 13:35:29.330138 kernel:     HOME=/
Jan 14 13:35:29.330144 kernel:     TERM=linux
Jan 14 13:35:29.330151 kernel:     BOOT_IMAGE=/flatcar/vmlinuz-a
Jan 14 13:35:29.330160 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Jan 14 13:35:29.330171 systemd[1]: Detected virtualization microsoft.
Jan 14 13:35:29.330179 systemd[1]: Detected architecture arm64.
Jan 14 13:35:29.330186 systemd[1]: Running in initrd.
Jan 14 13:35:29.330194 systemd[1]: No hostname configured, using default hostname.
Jan 14 13:35:29.330201 systemd[1]: Hostname set to <localhost>.
Jan 14 13:35:29.330209 systemd[1]: Initializing machine ID from random generator.
Jan 14 13:35:29.330216 systemd[1]: Queued start job for default target initrd.target.
Jan 14 13:35:29.330224 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 14 13:35:29.330233 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 14 13:35:29.330241 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Jan 14 13:35:29.330249 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jan 14 13:35:29.330256 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Jan 14 13:35:29.330264 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Jan 14 13:35:29.330273 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Jan 14 13:35:29.330282 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Jan 14 13:35:29.330290 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 14 13:35:29.330298 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jan 14 13:35:29.330305 systemd[1]: Reached target paths.target - Path Units.
Jan 14 13:35:29.330313 systemd[1]: Reached target slices.target - Slice Units.
Jan 14 13:35:29.330320 systemd[1]: Reached target swap.target - Swaps.
Jan 14 13:35:29.330328 systemd[1]: Reached target timers.target - Timer Units.
Jan 14 13:35:29.330335 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Jan 14 13:35:29.330343 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jan 14 13:35:29.330352 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Jan 14 13:35:29.330360 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Jan 14 13:35:29.330367 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jan 14 13:35:29.330375 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jan 14 13:35:29.330382 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 14 13:35:29.330390 systemd[1]: Reached target sockets.target - Socket Units.
Jan 14 13:35:29.330397 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Jan 14 13:35:29.330405 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jan 14 13:35:29.330413 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Jan 14 13:35:29.330422 systemd[1]: Starting systemd-fsck-usr.service...
Jan 14 13:35:29.330430 systemd[1]: Starting systemd-journald.service - Journal Service...
Jan 14 13:35:29.330437 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jan 14 13:35:29.330463 systemd-journald[218]: Collecting audit messages is disabled.
Jan 14 13:35:29.330484 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 14 13:35:29.330493 systemd-journald[218]: Journal started
Jan 14 13:35:29.330515 systemd-journald[218]: Runtime Journal (/run/log/journal/235341efae424829a482ae4890fb2fb9) is 8.0M, max 78.5M, 70.5M free.
Jan 14 13:35:29.343262 systemd-modules-load[219]: Inserted module 'overlay'
Jan 14 13:35:29.354656 systemd[1]: Started systemd-journald.service - Journal Service.
Jan 14 13:35:29.360817 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Jan 14 13:35:29.370865 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 14 13:35:29.410631 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 14 13:35:29.410654 kernel: Bridge firewalling registered
Jan 14 13:35:29.399292 systemd[1]: Finished systemd-fsck-usr.service.
Jan 14 13:35:29.414939 systemd-modules-load[219]: Inserted module 'br_netfilter'
Jan 14 13:35:29.415804 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jan 14 13:35:29.428356 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 14 13:35:29.454761 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 14 13:35:29.469723 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jan 14 13:35:29.482320 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Jan 14 13:35:29.493731 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Jan 14 13:35:29.515417 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 14 13:35:29.530841 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jan 14 13:35:29.543971 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jan 14 13:35:29.557722 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 14 13:35:29.586831 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Jan 14 13:35:29.600646 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jan 14 13:35:29.620059 dracut-cmdline[251]: dracut-dracut-053
Jan 14 13:35:29.624962 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jan 14 13:35:29.642888 dracut-cmdline[251]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=9798117b3b15ef802e3d618077f87253cc08e0d5280b8fe28b307e7558b7ebcc
Jan 14 13:35:29.683156 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 14 13:35:29.696026 systemd-resolved[252]: Positive Trust Anchors:
Jan 14 13:35:29.696036 systemd-resolved[252]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jan 14 13:35:29.696067 systemd-resolved[252]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jan 14 13:35:29.698322 systemd-resolved[252]: Defaulting to hostname 'linux'.
Jan 14 13:35:29.699159 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jan 14 13:35:29.706755 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jan 14 13:35:29.829550 kernel: SCSI subsystem initialized
Jan 14 13:35:29.838545 kernel: Loading iSCSI transport class v2.0-870.
Jan 14 13:35:29.848544 kernel: iscsi: registered transport (tcp)
Jan 14 13:35:29.866995 kernel: iscsi: registered transport (qla4xxx)
Jan 14 13:35:29.867054 kernel: QLogic iSCSI HBA Driver
Jan 14 13:35:29.899895 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Jan 14 13:35:29.916666 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Jan 14 13:35:29.949292 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 14 13:35:29.949352 kernel: device-mapper: uevent: version 1.0.3
Jan 14 13:35:29.955938 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Jan 14 13:35:30.003542 kernel: raid6: neonx8   gen() 15766 MB/s
Jan 14 13:35:30.023534 kernel: raid6: neonx4   gen() 15801 MB/s
Jan 14 13:35:30.044534 kernel: raid6: neonx2   gen() 13410 MB/s
Jan 14 13:35:30.065530 kernel: raid6: neonx1   gen() 10428 MB/s
Jan 14 13:35:30.085529 kernel: raid6: int64x8  gen()  6795 MB/s
Jan 14 13:35:30.105530 kernel: raid6: int64x4  gen()  7350 MB/s
Jan 14 13:35:30.126530 kernel: raid6: int64x2  gen()  6109 MB/s
Jan 14 13:35:30.149707 kernel: raid6: int64x1  gen()  5061 MB/s
Jan 14 13:35:30.149717 kernel: raid6: using algorithm neonx4 gen() 15801 MB/s
Jan 14 13:35:30.173745 kernel: raid6: .... xor() 12320 MB/s, rmw enabled
Jan 14 13:35:30.173757 kernel: raid6: using neon recovery algorithm
Jan 14 13:35:30.182532 kernel: xor: measuring software checksum speed
Jan 14 13:35:30.182546 kernel:    8regs           : 20434 MB/sec
Jan 14 13:35:30.189146 kernel:    32regs          : 21676 MB/sec
Jan 14 13:35:30.192557 kernel:    arm64_neon      : 27860 MB/sec
Jan 14 13:35:30.196904 kernel: xor: using function: arm64_neon (27860 MB/sec)
Jan 14 13:35:30.246541 kernel: Btrfs loaded, zoned=no, fsverity=no
Jan 14 13:35:30.256031 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Jan 14 13:35:30.272658 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 14 13:35:30.295257 systemd-udevd[437]: Using default interface naming scheme 'v255'.
Jan 14 13:35:30.300733 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 14 13:35:30.322729 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Jan 14 13:35:30.335445 dracut-pre-trigger[441]: rd.md=0: removing MD RAID activation
Jan 14 13:35:30.362233 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Jan 14 13:35:30.377773 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jan 14 13:35:30.417841 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 14 13:35:30.436693 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Jan 14 13:35:30.460666 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Jan 14 13:35:30.467556 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Jan 14 13:35:30.485642 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 14 13:35:30.506703 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jan 14 13:35:30.533717 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Jan 14 13:35:30.560338 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Jan 14 13:35:30.582800 kernel: hv_vmbus: Vmbus version:5.3
Jan 14 13:35:30.582823 kernel: hv_vmbus: registering driver hyperv_keyboard
Jan 14 13:35:30.573310 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jan 14 13:35:30.573491 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 14 13:35:30.634365 kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 14 13:35:30.634393 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input0
Jan 14 13:35:30.634403 kernel: hv_vmbus: registering driver hv_netvsc
Jan 14 13:35:30.634412 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 14 13:35:30.613453 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 14 13:35:30.699305 kernel: hv_vmbus: registering driver hv_storvsc
Jan 14 13:35:30.699329 kernel: hv_vmbus: registering driver hid_hyperv
Jan 14 13:35:30.699338 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input1
Jan 14 13:35:30.699348 kernel: scsi host1: storvsc_host_t
Jan 14 13:35:30.699515 kernel: scsi host0: storvsc_host_t
Jan 14 13:35:30.699621 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on 
Jan 14 13:35:30.699702 kernel: PTP clock support registered
Jan 14 13:35:30.699718 kernel: scsi 0:0:0:0: Direct-Access     Msft     Virtual Disk     1.0  PQ: 0 ANSI: 5
Jan 14 13:35:30.699739 kernel: scsi 0:0:0:2: CD-ROM            Msft     Virtual DVD-ROM  1.0  PQ: 0 ANSI: 0
Jan 14 13:35:30.657007 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 14 13:35:30.657263 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 14 13:35:30.712361 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Jan 14 13:35:30.741047 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 14 13:35:30.765824 kernel: hv_utils: Registering HyperV Utility Driver
Jan 14 13:35:30.765847 kernel: hv_vmbus: registering driver hv_utils
Jan 14 13:35:30.766565 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 14 13:35:30.784291 kernel: hv_netvsc 0022487b-74b7-0022-487b-74b70022487b eth0: VF slot 1 added
Jan 14 13:35:30.784443 kernel: sr 0:0:0:2: [sr0] scsi-1 drive
Jan 14 13:35:30.810187 kernel: hv_vmbus: registering driver hv_pci
Jan 14 13:35:30.810210 kernel: hv_utils: Heartbeat IC version 3.0
Jan 14 13:35:30.810219 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 14 13:35:30.810228 kernel: hv_utils: Shutdown IC version 3.2
Jan 14 13:35:30.810237 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0
Jan 14 13:35:30.810352 kernel: hv_utils: TimeSync IC version 4.0
Jan 14 13:35:30.767661 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 14 13:35:31.117173 kernel: hv_pci 4894d9d5-7fe0-4e09-a4df-dd0411651e4b: PCI VMBus probing: Using version 0x10004
Jan 14 13:35:31.235151 kernel: hv_pci 4894d9d5-7fe0-4e09-a4df-dd0411651e4b: PCI host bridge to bus 7fe0:00
Jan 14 13:35:31.235276 kernel: pci_bus 7fe0:00: root bus resource [mem 0xfc0000000-0xfc00fffff window]
Jan 14 13:35:31.235400 kernel: pci_bus 7fe0:00: No busn resource found for root bus, will use [bus 00-ff]
Jan 14 13:35:31.235485 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB)
Jan 14 13:35:31.235589 kernel: pci 7fe0:00:02.0: [15b3:1018] type 00 class 0x020000
Jan 14 13:35:31.235681 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks
Jan 14 13:35:31.235770 kernel: pci 7fe0:00:02.0: reg 0x10: [mem 0xfc0000000-0xfc00fffff 64bit pref]
Jan 14 13:35:31.235866 kernel: sd 0:0:0:0: [sda] Write Protect is off
Jan 14 13:35:31.235955 kernel: pci 7fe0:00:02.0: enabling Extended Tags
Jan 14 13:35:31.237704 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00
Jan 14 13:35:31.237825 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA
Jan 14 13:35:31.238036 kernel: pci 7fe0:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at 7fe0:00:02.0 (capable of 126.016 Gb/s with 8.0 GT/s PCIe x16 link)
Jan 14 13:35:31.238167 kernel:  sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Jan 14 13:35:31.238179 kernel: pci_bus 7fe0:00: busn_res: [bus 00-ff] end is updated to 00
Jan 14 13:35:31.238280 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Jan 14 13:35:31.238383 kernel: pci 7fe0:00:02.0: BAR 0: assigned [mem 0xfc0000000-0xfc00fffff 64bit pref]
Jan 14 13:35:31.102072 systemd-resolved[252]: Clock change detected. Flushing caches.
Jan 14 13:35:31.119286 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 14 13:35:31.184886 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 14 13:35:31.246593 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 14 13:35:31.292129 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 14 13:35:31.316407 kernel: mlx5_core 7fe0:00:02.0: enabling device (0000 -> 0002)
Jan 14 13:35:31.540141 kernel: mlx5_core 7fe0:00:02.0: firmware version: 16.30.1284
Jan 14 13:35:31.540313 kernel: hv_netvsc 0022487b-74b7-0022-487b-74b70022487b eth0: VF registering: eth1
Jan 14 13:35:31.540408 kernel: mlx5_core 7fe0:00:02.0 eth1: joined to eth0
Jan 14 13:35:31.540501 kernel: mlx5_core 7fe0:00:02.0: MLX5E: StrdRq(1) RqSz(8) StrdSz(2048) RxCqeCmprss(0 basic)
Jan 14 13:35:31.548018 kernel: mlx5_core 7fe0:00:02.0 enP32736s1: renamed from eth1
Jan 14 13:35:31.737619 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM.
Jan 14 13:35:31.854199 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT.
Jan 14 13:35:31.880032 kernel: BTRFS: device fsid 2be7cc1c-29d4-4496-b29b-8561323213d2 devid 1 transid 38 /dev/sda3 scanned by (udev-worker) (499)
Jan 14 13:35:31.893355 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A.
Jan 14 13:35:31.901405 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A.
Jan 14 13:35:31.935218 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Jan 14 13:35:31.959037 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by (udev-worker) (488)
Jan 14 13:35:31.978108 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Jan 14 13:35:32.978068 kernel:  sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Jan 14 13:35:32.978122 disk-uuid[604]: The operation has completed successfully.
Jan 14 13:35:33.044459 systemd[1]: disk-uuid.service: Deactivated successfully.
Jan 14 13:35:33.046016 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Jan 14 13:35:33.080171 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Jan 14 13:35:33.093749 sh[691]: Success
Jan 14 13:35:33.125029 kernel: device-mapper: verity: sha256 using implementation "sha256-ce"
Jan 14 13:35:33.336485 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Jan 14 13:35:33.361123 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Jan 14 13:35:33.366551 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
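verity-setup.service assembled /dev/mapper/usr from the USR-A partition, using the root hash passed as verity.usrhash= on the kernel command line. As a rough illustration only (Flatcar's real unit locates the hash tree and devices itself), a dm-verity mapping can be opened with cryptsetup's veritysetup tool; the hash device below is an assumption, while the data device and root hash are taken from the command line logged earlier:

import subprocess

data_dev = "/dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132"  # verity.usr= from the cmdline
hash_dev = "/dev/disk/by-partlabel/USR-A-hash"  # placeholder: not a partition from this log
root_hash = "9798117b3b15ef802e3d618077f87253cc08e0d5280b8fe28b307e7558b7ebcc"  # verity.usrhash= from the cmdline

# veritysetup open <data_device> <name> <hash_device> <root_hash> creates /dev/mapper/<name>.
subprocess.run(["veritysetup", "open", data_dev, "usr", hash_dev, root_hash], check=True)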
Jan 14 13:35:33.405115 kernel: BTRFS info (device dm-0): first mount of filesystem 2be7cc1c-29d4-4496-b29b-8561323213d2
Jan 14 13:35:33.405182 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Jan 14 13:35:33.412680 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Jan 14 13:35:33.418296 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Jan 14 13:35:33.422674 kernel: BTRFS info (device dm-0): using free space tree
Jan 14 13:35:33.711347 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Jan 14 13:35:33.717448 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Jan 14 13:35:33.742237 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Jan 14 13:35:33.750170 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Jan 14 13:35:33.788629 kernel: BTRFS info (device sda6): first mount of filesystem 9f8ecb6c-ace6-4d16-8781-f4e964dc0779
Jan 14 13:35:33.788675 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Jan 14 13:35:33.793917 kernel: BTRFS info (device sda6): using free space tree
Jan 14 13:35:33.818060 kernel: BTRFS info (device sda6): auto enabling async discard
Jan 14 13:35:33.833817 systemd[1]: mnt-oem.mount: Deactivated successfully.
Jan 14 13:35:33.840065 kernel: BTRFS info (device sda6): last unmount of filesystem 9f8ecb6c-ace6-4d16-8781-f4e964dc0779
Jan 14 13:35:33.847798 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Jan 14 13:35:33.866230 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Jan 14 13:35:33.875239 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jan 14 13:35:33.897231 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jan 14 13:35:33.928607 systemd-networkd[875]: lo: Link UP
Jan 14 13:35:33.928625 systemd-networkd[875]: lo: Gained carrier
Jan 14 13:35:33.930601 systemd-networkd[875]: Enumeration completed
Jan 14 13:35:33.930713 systemd[1]: Started systemd-networkd.service - Network Configuration.
Jan 14 13:35:33.942764 systemd[1]: Reached target network.target - Network.
Jan 14 13:35:33.946970 systemd-networkd[875]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jan 14 13:35:33.946973 systemd-networkd[875]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Jan 14 13:35:34.029016 kernel: mlx5_core 7fe0:00:02.0 enP32736s1: Link up
Jan 14 13:35:34.072020 kernel: hv_netvsc 0022487b-74b7-0022-487b-74b70022487b eth0: Data path switched to VF: enP32736s1
Jan 14 13:35:34.072821 systemd-networkd[875]: enP32736s1: Link UP
Jan 14 13:35:34.073104 systemd-networkd[875]: eth0: Link UP
Jan 14 13:35:34.073463 systemd-networkd[875]: eth0: Gained carrier
Jan 14 13:35:34.073473 systemd-networkd[875]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jan 14 13:35:34.085864 systemd-networkd[875]: enP32736s1: Gained carrier
Jan 14 13:35:34.109042 systemd-networkd[875]: eth0: DHCPv4 address 10.200.20.15/24, gateway 10.200.20.1 acquired from 168.63.129.16
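The DHCPv4 lease line above carries the address, prefix length, gateway, and the Azure platform DHCP endpoint (168.63.129.16). Purely as an illustration of how such a journal line can be picked apart, a hedged Python sketch follows; the regular expression mirrors only the wording of this one log line and is not a systemd-networkd format guarantee.

```python
# Sketch: parse the systemd-networkd DHCPv4 lease line quoted above.
# The regex reflects only that line's wording; it is not an official format.
import re

LINE = ("eth0: DHCPv4 address 10.200.20.15/24, gateway 10.200.20.1 "
        "acquired from 168.63.129.16")

PATTERN = re.compile(
    r"(?P<ifname>\S+): DHCPv4 address (?P<addr>[\d.]+)/(?P<prefix>\d+), "
    r"gateway (?P<gw>[\d.]+) acquired from (?P<server>[\d.]+)"
)

match = PATTERN.search(LINE)
if match:
    lease = match.groupdict()
    # e.g. {'ifname': 'eth0', 'addr': '10.200.20.15', 'prefix': '24',
    #       'gw': '10.200.20.1', 'server': '168.63.129.16'}
    print(lease)
```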
Jan 14 13:35:34.947085 ignition[870]: Ignition 2.20.0
Jan 14 13:35:34.947096 ignition[870]: Stage: fetch-offline
Jan 14 13:35:34.947132 ignition[870]: no configs at "/usr/lib/ignition/base.d"
Jan 14 13:35:34.947141 ignition[870]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Jan 14 13:35:34.947226 ignition[870]: parsed url from cmdline: ""
Jan 14 13:35:34.947229 ignition[870]: no config URL provided
Jan 14 13:35:34.947233 ignition[870]: reading system config file "/usr/lib/ignition/user.ign"
Jan 14 13:35:34.947241 ignition[870]: no config at "/usr/lib/ignition/user.ign"
Jan 14 13:35:34.947245 ignition[870]: failed to fetch config: resource requires networking
Jan 14 13:35:34.947424 ignition[870]: Ignition finished successfully
Jan 14 13:35:34.951536 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Jan 14 13:35:34.978197 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Jan 14 13:35:34.997217 ignition[883]: Ignition 2.20.0
Jan 14 13:35:34.997223 ignition[883]: Stage: fetch
Jan 14 13:35:34.997411 ignition[883]: no configs at "/usr/lib/ignition/base.d"
Jan 14 13:35:34.997421 ignition[883]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Jan 14 13:35:34.997523 ignition[883]: parsed url from cmdline: ""
Jan 14 13:35:34.997526 ignition[883]: no config URL provided
Jan 14 13:35:34.997531 ignition[883]: reading system config file "/usr/lib/ignition/user.ign"
Jan 14 13:35:34.997547 ignition[883]: no config at "/usr/lib/ignition/user.ign"
Jan 14 13:35:34.997572 ignition[883]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1
Jan 14 13:35:35.113153 ignition[883]: GET result: OK
Jan 14 13:35:35.113245 ignition[883]: config has been read from IMDS userdata
Jan 14 13:35:35.113316 ignition[883]: parsing config with SHA512: 4d8868346995f696b41a6b84974154998c194de7f6be1e9c81fa3ecb9346efaf80c02405c47f5bdc31dccfb726b749eb7b2fb1007bf354fe998f92e9f873dda3
Jan 14 13:35:35.118499 unknown[883]: fetched base config from "system"
Jan 14 13:35:35.118947 ignition[883]: fetch: fetch complete
Jan 14 13:35:35.118506 unknown[883]: fetched base config from "system"
Jan 14 13:35:35.118952 ignition[883]: fetch: fetch passed
Jan 14 13:35:35.118511 unknown[883]: fetched user config from "azure"
Jan 14 13:35:35.119024 ignition[883]: Ignition finished successfully
Jan 14 13:35:35.122139 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
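The fetch stage above pulls the user data from the Azure IMDS endpoint shown in the log and reports the SHA-512 of the parsed config. As a hedged sketch of that request (not Ignition's actual Go implementation), here is a Python snippet against the same URL; the `Metadata: true` header is required by IMDS, and the base64 decoding step is an assumption about how the text-format userData is encoded.

```python
# Sketch: fetch instance userData from Azure IMDS, as the Ignition "fetch"
# stage does above, and hash it. Illustration only, not Ignition itself.
import base64
import hashlib
import urllib.request

URL = ("http://169.254.169.254/metadata/instance/compute/userData"
       "?api-version=2021-01-01&format=text")

req = urllib.request.Request(URL, headers={"Metadata": "true"})  # required by IMDS
with urllib.request.urlopen(req, timeout=5) as resp:
    raw = resp.read()

# Assumption: the text-format userData comes back base64-encoded.
config = base64.b64decode(raw)
print("config SHA512:", hashlib.sha512(config).hexdigest())
```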
Jan 14 13:35:35.142098 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Jan 14 13:35:35.161702 ignition[890]: Ignition 2.20.0
Jan 14 13:35:35.161710 ignition[890]: Stage: kargs
Jan 14 13:35:35.161879 ignition[890]: no configs at "/usr/lib/ignition/base.d"
Jan 14 13:35:35.161888 ignition[890]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Jan 14 13:35:35.162771 ignition[890]: kargs: kargs passed
Jan 14 13:35:35.162811 ignition[890]: Ignition finished successfully
Jan 14 13:35:35.173292 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Jan 14 13:35:35.191276 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Jan 14 13:35:35.217943 ignition[896]: Ignition 2.20.0
Jan 14 13:35:35.217950 ignition[896]: Stage: disks
Jan 14 13:35:35.218169 ignition[896]: no configs at "/usr/lib/ignition/base.d"
Jan 14 13:35:35.218179 ignition[896]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Jan 14 13:35:35.219274 ignition[896]: disks: disks passed
Jan 14 13:35:35.219321 ignition[896]: Ignition finished successfully
Jan 14 13:35:35.221583 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Jan 14 13:35:35.231107 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Jan 14 13:35:35.242069 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Jan 14 13:35:35.251799 systemd-networkd[875]: enP32736s1: Gained IPv6LL
Jan 14 13:35:35.252070 systemd[1]: Reached target local-fs.target - Local File Systems.
Jan 14 13:35:35.263682 systemd[1]: Reached target sysinit.target - System Initialization.
Jan 14 13:35:35.274645 systemd[1]: Reached target basic.target - Basic System.
Jan 14 13:35:35.294228 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Jan 14 13:35:35.387201 systemd-fsck[905]: ROOT: clean, 14/7326000 files, 477710/7359488 blocks
Jan 14 13:35:35.397714 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Jan 14 13:35:35.416200 systemd[1]: Mounting sysroot.mount - /sysroot...
Jan 14 13:35:35.476057 kernel: EXT4-fs (sda9): mounted filesystem f9a95e53-2d63-4443-b523-cb2108fb48f6 r/w with ordered data mode. Quota mode: none.
Jan 14 13:35:35.476826 systemd[1]: Mounted sysroot.mount - /sysroot.
Jan 14 13:35:35.486335 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
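The fsck summary above ("14/7326000 files, 477710/7359488 blocks") already implies how full the freshly checked ROOT filesystem is; the tiny calculation below just makes the percentages explicit, using only the numbers from that log line.

```python
# Arithmetic from the systemd-fsck line above: inode and block usage of ROOT.
used_inodes, total_inodes = 14, 7_326_000
used_blocks, total_blocks = 477_710, 7_359_488

print(f"inodes used: {used_inodes / total_inodes:.6%}")   # ~0.000191%
print(f"blocks used: {used_blocks / total_blocks:.2%}")   # ~6.49%
```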
Jan 14 13:35:35.506126 systemd-networkd[875]: eth0: Gained IPv6LL
Jan 14 13:35:35.533073 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jan 14 13:35:35.543105 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Jan 14 13:35:35.550189 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Jan 14 13:35:35.592107 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (916)
Jan 14 13:35:35.592131 kernel: BTRFS info (device sda6): first mount of filesystem 9f8ecb6c-ace6-4d16-8781-f4e964dc0779
Jan 14 13:35:35.592141 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Jan 14 13:35:35.578378 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Jan 14 13:35:35.615447 kernel: BTRFS info (device sda6): using free space tree
Jan 14 13:35:35.578415 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Jan 14 13:35:35.603666 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Jan 14 13:35:35.623197 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Jan 14 13:35:35.653154 kernel: BTRFS info (device sda6): auto enabling async discard
Jan 14 13:35:35.654703 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jan 14 13:35:36.191612 coreos-metadata[918]: Jan 14 13:35:36.191 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Jan 14 13:35:36.200221 coreos-metadata[918]: Jan 14 13:35:36.200 INFO Fetch successful
Jan 14 13:35:36.200221 coreos-metadata[918]: Jan 14 13:35:36.200 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1
Jan 14 13:35:36.218373 coreos-metadata[918]: Jan 14 13:35:36.218 INFO Fetch successful
Jan 14 13:35:36.234056 coreos-metadata[918]: Jan 14 13:35:36.234 INFO wrote hostname ci-4186.1.0-a-e83668d6e0 to /sysroot/etc/hostname
Jan 14 13:35:36.244708 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
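flatcar-metadata-hostname.service above fetches the compute name from IMDS and writes it to /sysroot/etc/hostname. Here is a hedged Python sketch of those two steps, using the URL from the log; the header and the target path /etc/hostname (rather than the /sysroot prefix used inside the initrd) are assumptions for a running system.

```python
# Sketch: fetch the VM name from Azure IMDS and write it as the hostname,
# mirroring what flatcar-metadata-hostname.service reports above.
import urllib.request

URL = ("http://169.254.169.254/metadata/instance/compute/name"
       "?api-version=2017-08-01&format=text")

req = urllib.request.Request(URL, headers={"Metadata": "true"})  # required by IMDS
with urllib.request.urlopen(req, timeout=5) as resp:
    name = resp.read().decode().strip()

# On a live system this would be /etc/hostname; inside the initrd the log
# shows the path prefixed with /sysroot.
with open("/etc/hostname", "w") as f:
    f.write(name + "\n")
print("wrote hostname:", name)
```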
Jan 14 13:35:36.437983 initrd-setup-root[947]: cut: /sysroot/etc/passwd: No such file or directory
Jan 14 13:35:36.507782 initrd-setup-root[954]: cut: /sysroot/etc/group: No such file or directory
Jan 14 13:35:36.515433 initrd-setup-root[961]: cut: /sysroot/etc/shadow: No such file or directory
Jan 14 13:35:36.524753 initrd-setup-root[968]: cut: /sysroot/etc/gshadow: No such file or directory
Jan 14 13:35:37.148893 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Jan 14 13:35:37.165279 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Jan 14 13:35:37.174221 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Jan 14 13:35:37.192118 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Jan 14 13:35:37.196277 kernel: BTRFS info (device sda6): last unmount of filesystem 9f8ecb6c-ace6-4d16-8781-f4e964dc0779
Jan 14 13:35:37.219782 ignition[1035]: INFO     : Ignition 2.20.0
Jan 14 13:35:37.219782 ignition[1035]: INFO     : Stage: mount
Jan 14 13:35:37.240904 ignition[1035]: INFO     : no configs at "/usr/lib/ignition/base.d"
Jan 14 13:35:37.240904 ignition[1035]: INFO     : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Jan 14 13:35:37.240904 ignition[1035]: INFO     : mount: mount passed
Jan 14 13:35:37.240904 ignition[1035]: INFO     : Ignition finished successfully
Jan 14 13:35:37.221487 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Jan 14 13:35:37.227476 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Jan 14 13:35:37.258246 systemd[1]: Starting ignition-files.service - Ignition (files)...
Jan 14 13:35:37.278201 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jan 14 13:35:37.311078 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/sda6 scanned by mount (1047)
Jan 14 13:35:37.324006 kernel: BTRFS info (device sda6): first mount of filesystem 9f8ecb6c-ace6-4d16-8781-f4e964dc0779
Jan 14 13:35:37.324049 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Jan 14 13:35:37.328161 kernel: BTRFS info (device sda6): using free space tree
Jan 14 13:35:37.335176 kernel: BTRFS info (device sda6): auto enabling async discard
Jan 14 13:35:37.336326 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jan 14 13:35:37.367659 ignition[1064]: INFO     : Ignition 2.20.0
Jan 14 13:35:37.367659 ignition[1064]: INFO     : Stage: files
Jan 14 13:35:37.367659 ignition[1064]: INFO     : no configs at "/usr/lib/ignition/base.d"
Jan 14 13:35:37.367659 ignition[1064]: INFO     : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Jan 14 13:35:37.367659 ignition[1064]: DEBUG    : files: compiled without relabeling support, skipping
Jan 14 13:35:37.398284 ignition[1064]: INFO     : files: ensureUsers: op(1): [started]  creating or modifying user "core"
Jan 14 13:35:37.398284 ignition[1064]: DEBUG    : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Jan 14 13:35:37.447351 ignition[1064]: INFO     : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Jan 14 13:35:37.455187 ignition[1064]: INFO     : files: ensureUsers: op(2): [started]  adding ssh keys to user "core"
Jan 14 13:35:37.455187 ignition[1064]: INFO     : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Jan 14 13:35:37.448353 unknown[1064]: wrote ssh authorized keys file for user: core
Jan 14 13:35:37.475160 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(3): [started]  writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Jan 14 13:35:37.475160 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1
Jan 14 13:35:37.660254 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Jan 14 13:35:37.783546 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Jan 14 13:35:37.783546 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(4): [started]  writing file "/sysroot/home/core/install.sh"
Jan 14 13:35:37.805070 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Jan 14 13:35:37.805070 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(5): [started]  writing file "/sysroot/home/core/nginx.yaml"
Jan 14 13:35:37.805070 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Jan 14 13:35:37.805070 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(6): [started]  writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 14 13:35:37.805070 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 14 13:35:37.805070 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(7): [started]  writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 14 13:35:37.805070 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 14 13:35:37.805070 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(8): [started]  writing file "/sysroot/etc/flatcar/update.conf"
Jan 14 13:35:37.805070 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Jan 14 13:35:37.805070 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(9): [started]  writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.29.2-arm64.raw"
Jan 14 13:35:37.805070 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.29.2-arm64.raw"
Jan 14 13:35:37.805070 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(a): [started]  writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.29.2-arm64.raw"
Jan 14 13:35:37.805070 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.29.2-arm64.raw: attempt #1
Jan 14 13:35:38.284676 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Jan 14 13:35:39.120601 ignition[1064]: INFO     : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.29.2-arm64.raw"
Jan 14 13:35:39.120601 ignition[1064]: INFO     : files: op(b): [started]  processing unit "prepare-helm.service"
Jan 14 13:35:39.181677 ignition[1064]: INFO     : files: op(b): op(c): [started]  writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 14 13:35:39.193853 ignition[1064]: INFO     : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 14 13:35:39.193853 ignition[1064]: INFO     : files: op(b): [finished] processing unit "prepare-helm.service"
Jan 14 13:35:39.193853 ignition[1064]: INFO     : files: op(d): [started]  setting preset to enabled for "prepare-helm.service"
Jan 14 13:35:39.193853 ignition[1064]: INFO     : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Jan 14 13:35:39.193853 ignition[1064]: INFO     : files: createResultFile: createFiles: op(e): [started]  writing file "/sysroot/etc/.ignition-result.json"
Jan 14 13:35:39.193853 ignition[1064]: INFO     : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Jan 14 13:35:39.193853 ignition[1064]: INFO     : files: files passed
Jan 14 13:35:39.193853 ignition[1064]: INFO     : Ignition finished successfully
Jan 14 13:35:39.194086 systemd[1]: Finished ignition-files.service - Ignition (files).
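The files stage above writes several files; op(3), for example, downloads helm-v3.13.2-linux-arm64.tar.gz from get.helm.sh into /sysroot/opt. Purely as an illustration of that one operation (not the Ignition implementation, which drives this from the rendered config), here is a hedged Python sketch of the download; the destination path is a placeholder.

```python
# Sketch of Ignition's op(3) above: fetch the helm tarball and store it.
# Destination path is illustrative; Ignition writes under /sysroot/opt.
import urllib.request
from pathlib import Path

URL = "https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz"
dest = Path("/tmp/helm-v3.13.2-linux-arm64.tar.gz")  # placeholder location

with urllib.request.urlopen(URL, timeout=30) as resp:
    dest.write_bytes(resp.read())

print(f"wrote {dest} ({dest.stat().st_size} bytes)")
```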
Jan 14 13:35:39.242248 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Jan 14 13:35:39.261164 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Jan 14 13:35:39.316561 initrd-setup-root-after-ignition[1091]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jan 14 13:35:39.316561 initrd-setup-root-after-ignition[1091]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Jan 14 13:35:39.274595 systemd[1]: ignition-quench.service: Deactivated successfully.
Jan 14 13:35:39.341073 initrd-setup-root-after-ignition[1095]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jan 14 13:35:39.274681 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Jan 14 13:35:39.283266 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Jan 14 13:35:39.298634 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Jan 14 13:35:39.334269 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Jan 14 13:35:39.376940 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 14 13:35:39.377067 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Jan 14 13:35:39.388781 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Jan 14 13:35:39.400614 systemd[1]: Reached target initrd.target - Initrd Default Target.
Jan 14 13:35:39.413393 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Jan 14 13:35:39.416182 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Jan 14 13:35:39.453166 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jan 14 13:35:39.469209 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Jan 14 13:35:39.488625 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 14 13:35:39.490018 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Jan 14 13:35:39.501086 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Jan 14 13:35:39.513849 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 14 13:35:39.526497 systemd[1]: Stopped target timers.target - Timer Units.
Jan 14 13:35:39.538040 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 14 13:35:39.538110 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jan 14 13:35:39.554344 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Jan 14 13:35:39.566837 systemd[1]: Stopped target basic.target - Basic System.
Jan 14 13:35:39.577668 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Jan 14 13:35:39.589298 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Jan 14 13:35:39.602562 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Jan 14 13:35:39.615549 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Jan 14 13:35:39.627759 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Jan 14 13:35:39.640146 systemd[1]: Stopped target sysinit.target - System Initialization.
Jan 14 13:35:39.652394 systemd[1]: Stopped target local-fs.target - Local File Systems.
Jan 14 13:35:39.663319 systemd[1]: Stopped target swap.target - Swaps.
Jan 14 13:35:39.672855 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 14 13:35:39.672933 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Jan 14 13:35:39.687567 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Jan 14 13:35:39.694076 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 14 13:35:39.706490 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Jan 14 13:35:39.712056 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 14 13:35:39.719006 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 14 13:35:39.719075 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Jan 14 13:35:39.736462 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Jan 14 13:35:39.736507 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Jan 14 13:35:39.751306 systemd[1]: ignition-files.service: Deactivated successfully.
Jan 14 13:35:39.751355 systemd[1]: Stopped ignition-files.service - Ignition (files).
Jan 14 13:35:39.762339 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Jan 14 13:35:39.762382 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Jan 14 13:35:39.794184 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Jan 14 13:35:39.811356 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 14 13:35:39.811431 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 14 13:35:39.823110 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Jan 14 13:35:39.826783 ignition[1117]: INFO     : Ignition 2.20.0
Jan 14 13:35:39.826783 ignition[1117]: INFO     : Stage: umount
Jan 14 13:35:39.826783 ignition[1117]: INFO     : no configs at "/usr/lib/ignition/base.d"
Jan 14 13:35:39.826783 ignition[1117]: INFO     : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Jan 14 13:35:39.826783 ignition[1117]: INFO     : umount: umount passed
Jan 14 13:35:39.826783 ignition[1117]: INFO     : Ignition finished successfully
Jan 14 13:35:39.831808 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 14 13:35:39.831875 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 14 13:35:39.845415 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 14 13:35:39.845475 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Jan 14 13:35:39.866750 systemd[1]: ignition-mount.service: Deactivated successfully.
Jan 14 13:35:39.866840 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Jan 14 13:35:39.876437 systemd[1]: ignition-disks.service: Deactivated successfully.
Jan 14 13:35:39.876500 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Jan 14 13:35:39.886499 systemd[1]: ignition-kargs.service: Deactivated successfully.
Jan 14 13:35:39.886544 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Jan 14 13:35:39.892492 systemd[1]: ignition-fetch.service: Deactivated successfully.
Jan 14 13:35:39.892533 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Jan 14 13:35:39.904046 systemd[1]: Stopped target network.target - Network.
Jan 14 13:35:39.917754 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Jan 14 13:35:39.917810 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Jan 14 13:35:39.937818 systemd[1]: Stopped target paths.target - Path Units.
Jan 14 13:35:39.948741 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 14 13:35:39.954382 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 14 13:35:39.962016 systemd[1]: Stopped target slices.target - Slice Units.
Jan 14 13:35:39.978427 systemd[1]: Stopped target sockets.target - Socket Units.
Jan 14 13:35:39.988197 systemd[1]: iscsid.socket: Deactivated successfully.
Jan 14 13:35:39.988241 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Jan 14 13:35:39.998553 systemd[1]: iscsiuio.socket: Deactivated successfully.
Jan 14 13:35:39.998590 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jan 14 13:35:40.010063 systemd[1]: ignition-setup.service: Deactivated successfully.
Jan 14 13:35:40.010115 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Jan 14 13:35:40.020373 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Jan 14 13:35:40.020412 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Jan 14 13:35:40.231126 kernel: hv_netvsc 0022487b-74b7-0022-487b-74b70022487b eth0: Data path switched from VF: enP32736s1
Jan 14 13:35:40.031898 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Jan 14 13:35:40.042085 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Jan 14 13:35:40.053939 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Jan 14 13:35:40.059281 systemd-networkd[875]: eth0: DHCPv6 lease lost
Jan 14 13:35:40.061306 systemd[1]: systemd-networkd.service: Deactivated successfully.
Jan 14 13:35:40.061421 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Jan 14 13:35:40.072251 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Jan 14 13:35:40.072312 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Jan 14 13:35:40.093147 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Jan 14 13:35:40.101611 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Jan 14 13:35:40.101679 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jan 14 13:35:40.114447 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 14 13:35:40.137332 systemd[1]: systemd-resolved.service: Deactivated successfully.
Jan 14 13:35:40.137429 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Jan 14 13:35:40.159302 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 14 13:35:40.161954 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 14 13:35:40.171518 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 14 13:35:40.171589 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Jan 14 13:35:40.188461 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 14 13:35:40.188515 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 14 13:35:40.200289 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 14 13:35:40.200349 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Jan 14 13:35:40.225482 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 14 13:35:40.225549 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Jan 14 13:35:40.242531 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jan 14 13:35:40.242592 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 14 13:35:40.271213 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Jan 14 13:35:40.288130 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 14 13:35:40.288190 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Jan 14 13:35:40.300376 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 14 13:35:40.512102 systemd-journald[218]: Received SIGTERM from PID 1 (systemd).
Jan 14 13:35:40.300422 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Jan 14 13:35:40.312374 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 14 13:35:40.312421 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 14 13:35:40.325055 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 14 13:35:40.325101 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 14 13:35:40.337985 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 14 13:35:40.338052 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 14 13:35:40.352481 systemd[1]: sysroot-boot.service: Deactivated successfully.
Jan 14 13:35:40.352599 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Jan 14 13:35:40.364909 systemd[1]: network-cleanup.service: Deactivated successfully.
Jan 14 13:35:40.365028 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Jan 14 13:35:40.376709 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 14 13:35:40.376786 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Jan 14 13:35:40.389938 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Jan 14 13:35:40.401284 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Jan 14 13:35:40.401364 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Jan 14 13:35:40.432236 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Jan 14 13:35:40.453691 systemd[1]: Switching root.
Jan 14 13:35:40.616271 systemd-journald[218]: Journal stopped
Jan 14 13:35:45.977246 kernel: SELinux:  policy capability network_peer_controls=1
Jan 14 13:35:45.977269 kernel: SELinux:  policy capability open_perms=1
Jan 14 13:35:45.977279 kernel: SELinux:  policy capability extended_socket_class=1
Jan 14 13:35:45.977287 kernel: SELinux:  policy capability always_check_network=0
Jan 14 13:35:45.977297 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 14 13:35:45.977304 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 14 13:35:45.977313 kernel: SELinux:  policy capability genfs_seclabel_symlinks=0
Jan 14 13:35:45.977321 kernel: SELinux:  policy capability ioctl_skip_cloexec=0
Jan 14 13:35:45.977329 kernel: audit: type=1403 audit(1736861741.711:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 14 13:35:45.977340 systemd[1]: Successfully loaded SELinux policy in 160.980ms.
Jan 14 13:35:45.977352 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 10.625ms.
Jan 14 13:35:45.977363 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Jan 14 13:35:45.977371 systemd[1]: Detected virtualization microsoft.
Jan 14 13:35:45.977380 systemd[1]: Detected architecture arm64.
Jan 14 13:35:45.977389 systemd[1]: Detected first boot.
Jan 14 13:35:45.977400 systemd[1]: Hostname set to <ci-4186.1.0-a-e83668d6e0>.
Jan 14 13:35:45.977409 systemd[1]: Initializing machine ID from random generator.
Jan 14 13:35:45.977418 zram_generator::config[1159]: No configuration found.
Jan 14 13:35:45.977428 systemd[1]: Populated /etc with preset unit settings.
Jan 14 13:35:45.977436 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jan 14 13:35:45.977445 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Jan 14 13:35:45.977454 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 14 13:35:45.977465 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Jan 14 13:35:45.977474 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Jan 14 13:35:45.977484 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Jan 14 13:35:45.977493 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Jan 14 13:35:45.977502 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Jan 14 13:35:45.977511 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Jan 14 13:35:45.977520 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Jan 14 13:35:45.977530 systemd[1]: Created slice user.slice - User and Session Slice.
Jan 14 13:35:45.977541 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 14 13:35:45.977550 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 14 13:35:45.977559 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Jan 14 13:35:45.977568 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Jan 14 13:35:45.977577 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Jan 14 13:35:45.977586 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jan 14 13:35:45.977595 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Jan 14 13:35:45.977605 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 14 13:35:45.977615 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Jan 14 13:35:45.977624 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Jan 14 13:35:45.977635 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Jan 14 13:35:45.977644 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Jan 14 13:35:45.977654 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 14 13:35:45.977663 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jan 14 13:35:45.977672 systemd[1]: Reached target slices.target - Slice Units.
Jan 14 13:35:45.977683 systemd[1]: Reached target swap.target - Swaps.
Jan 14 13:35:45.977692 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Jan 14 13:35:45.977701 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Jan 14 13:35:45.977710 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jan 14 13:35:45.977719 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jan 14 13:35:45.977729 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 14 13:35:45.977741 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Jan 14 13:35:45.977750 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Jan 14 13:35:45.977760 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Jan 14 13:35:45.977769 systemd[1]: Mounting media.mount - External Media Directory...
Jan 14 13:35:45.977778 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Jan 14 13:35:45.977788 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Jan 14 13:35:45.977797 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Jan 14 13:35:45.977808 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 14 13:35:45.977818 systemd[1]: Reached target machines.target - Containers.
Jan 14 13:35:45.977827 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Jan 14 13:35:45.977836 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jan 14 13:35:45.977846 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jan 14 13:35:45.977855 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Jan 14 13:35:45.977865 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jan 14 13:35:45.977874 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jan 14 13:35:45.977885 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jan 14 13:35:45.977894 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Jan 14 13:35:45.977903 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jan 14 13:35:45.977913 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Jan 14 13:35:45.977922 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jan 14 13:35:45.977931 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Jan 14 13:35:45.977942 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Jan 14 13:35:45.977951 systemd[1]: Stopped systemd-fsck-usr.service.
Jan 14 13:35:45.977961 systemd[1]: Starting systemd-journald.service - Journal Service...
Jan 14 13:35:45.977970 kernel: fuse: init (API version 7.39)
Jan 14 13:35:45.977979 kernel: loop: module loaded
Jan 14 13:35:45.978010 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jan 14 13:35:45.978022 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Jan 14 13:35:45.978032 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Jan 14 13:35:45.978056 systemd-journald[1262]: Collecting audit messages is disabled.
Jan 14 13:35:45.978079 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jan 14 13:35:45.978089 systemd-journald[1262]: Journal started
Jan 14 13:35:45.978114 systemd-journald[1262]: Runtime Journal (/run/log/journal/085da10940fc43fdb77d0f10bdb20665) is 8.0M, max 78.5M, 70.5M free.
Jan 14 13:35:45.021298 systemd[1]: Queued start job for default target multi-user.target.
Jan 14 13:35:45.114749 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Jan 14 13:35:45.115114 systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 14 13:35:45.115390 systemd[1]: systemd-journald.service: Consumed 3.263s CPU time.
Jan 14 13:35:45.997015 systemd[1]: verity-setup.service: Deactivated successfully.
Jan 14 13:35:45.997069 systemd[1]: Stopped verity-setup.service.
Jan 14 13:35:46.004980 kernel: ACPI: bus type drm_connector registered
Jan 14 13:35:46.018380 systemd[1]: Started systemd-journald.service - Journal Service.
Jan 14 13:35:46.019121 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Jan 14 13:35:46.025096 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Jan 14 13:35:46.031231 systemd[1]: Mounted media.mount - External Media Directory.
Jan 14 13:35:46.036845 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Jan 14 13:35:46.042793 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Jan 14 13:35:46.049131 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Jan 14 13:35:46.055535 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Jan 14 13:35:46.063527 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 14 13:35:46.070875 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 14 13:35:46.073041 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Jan 14 13:35:46.079703 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jan 14 13:35:46.079818 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jan 14 13:35:46.086378 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 14 13:35:46.086520 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jan 14 13:35:46.092601 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 14 13:35:46.092716 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jan 14 13:35:46.099725 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 14 13:35:46.099845 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Jan 14 13:35:46.108516 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jan 14 13:35:46.108639 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jan 14 13:35:46.115506 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jan 14 13:35:46.122269 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Jan 14 13:35:46.129696 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Jan 14 13:35:46.137600 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 14 13:35:46.155299 systemd[1]: Reached target network-pre.target - Preparation for Network.
Jan 14 13:35:46.166099 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Jan 14 13:35:46.175114 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Jan 14 13:35:46.182479 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Jan 14 13:35:46.182517 systemd[1]: Reached target local-fs.target - Local File Systems.
Jan 14 13:35:46.189210 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Jan 14 13:35:46.204119 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Jan 14 13:35:46.211842 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Jan 14 13:35:46.217351 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 14 13:35:46.264137 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Jan 14 13:35:46.272063 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Jan 14 13:35:46.278524 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 14 13:35:46.279578 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Jan 14 13:35:46.285639 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 14 13:35:46.288165 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jan 14 13:35:46.297232 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Jan 14 13:35:46.306278 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Jan 14 13:35:46.314192 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Jan 14 13:35:46.324556 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Jan 14 13:35:46.332410 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Jan 14 13:35:46.343027 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Jan 14 13:35:46.355531 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Jan 14 13:35:46.362837 systemd-journald[1262]: Time spent on flushing to /var/log/journal/085da10940fc43fdb77d0f10bdb20665 is 16.308ms for 901 entries.
Jan 14 13:35:46.362837 systemd-journald[1262]: System Journal (/var/log/journal/085da10940fc43fdb77d0f10bdb20665) is 8.0M, max 2.6G, 2.6G free.
Jan 14 13:35:46.421975 systemd-journald[1262]: Received client request to flush runtime journal.
Jan 14 13:35:46.422068 kernel: loop0: detected capacity change from 0 to 116784
Jan 14 13:35:46.372381 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Jan 14 13:35:46.378057 udevadm[1296]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Jan 14 13:35:46.394221 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Jan 14 13:35:46.408057 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jan 14 13:35:46.423953 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
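The journald flush report above (16.308 ms for 901 entries) works out to roughly 18 µs per entry; the two-line calculation below just derives that figure from the numbers in the log.

```python
# Per-entry flush cost from the systemd-journald line above.
flush_ms, entries = 16.308, 901
print(f"{flush_ms / entries * 1000:.1f} µs per entry")  # ~18.1 µs
```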
Jan 14 13:35:46.442795 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Jan 14 13:35:46.443417 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Jan 14 13:35:46.754011 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Jan 14 13:35:46.754081 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Jan 14 13:35:46.767238 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jan 14 13:35:46.824010 kernel: loop1: detected capacity change from 0 to 113552
Jan 14 13:35:46.916093 systemd-tmpfiles[1312]: ACLs are not supported, ignoring.
Jan 14 13:35:46.916115 systemd-tmpfiles[1312]: ACLs are not supported, ignoring.
Jan 14 13:35:46.920775 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 14 13:35:47.158020 kernel: loop2: detected capacity change from 0 to 28752
Jan 14 13:35:47.318090 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Jan 14 13:35:47.339217 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 14 13:35:47.357500 systemd-udevd[1317]: Using default interface naming scheme 'v255'.
Jan 14 13:35:47.446346 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 14 13:35:47.462187 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jan 14 13:35:47.499439 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Jan 14 13:35:47.528291 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Jan 14 13:35:47.587545 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Jan 14 13:35:47.607299 kernel: loop3: detected capacity change from 0 to 194512
Jan 14 13:35:47.638063 kernel: loop4: detected capacity change from 0 to 116784
Jan 14 13:35:47.658576 kernel: mousedev: PS/2 mouse device common for all mice
Jan 14 13:35:47.658684 kernel: hv_vmbus: registering driver hyperv_fb
Jan 14 13:35:47.658709 kernel: loop5: detected capacity change from 0 to 113552
Jan 14 13:35:47.658727 kernel: hyperv_fb: Synthvid Version major 3, minor 5
Jan 14 13:35:47.679004 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608
Jan 14 13:35:47.679091 kernel: Console: switching to colour dummy device 80x25
Jan 14 13:35:47.685128 kernel: Console: switching to colour frame buffer device 128x48
Jan 14 13:35:47.696973 kernel: hv_vmbus: registering driver hv_balloon
Jan 14 13:35:47.712385 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0
Jan 14 13:35:47.712467 kernel: loop6: detected capacity change from 0 to 28752
Jan 14 13:35:47.712484 kernel: hv_balloon: Memory hot add disabled on ARM64
Jan 14 13:35:47.733067 kernel: loop7: detected capacity change from 0 to 194512
Jan 14 13:35:47.734513 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 14 13:35:47.752025 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 14 13:35:47.754287 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 14 13:35:47.766121 (sd-merge)[1362]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'.
Jan 14 13:35:47.766526 (sd-merge)[1362]: Merged extensions into '/usr'.
Jan 14 13:35:47.779721 systemd-networkd[1331]: lo: Link UP
Jan 14 13:35:47.779731 systemd-networkd[1331]: lo: Gained carrier
Jan 14 13:35:47.781188 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 14 13:35:47.782821 systemd-networkd[1331]: Enumeration completed
Jan 14 13:35:47.784241 systemd-networkd[1331]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jan 14 13:35:47.784247 systemd-networkd[1331]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Jan 14 13:35:47.798492 systemd[1]: Started systemd-networkd.service - Network Configuration.
Jan 14 13:35:47.806073 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 38 scanned by (udev-worker) (1330)
Jan 14 13:35:47.810448 systemd[1]: Reloading requested from client PID 1293 ('systemd-sysext') (unit systemd-sysext.service)...
Jan 14 13:35:47.810464 systemd[1]: Reloading...
Jan 14 13:35:47.854686 kernel: mlx5_core 7fe0:00:02.0 enP32736s1: Link up
Jan 14 13:35:47.887128 kernel: hv_netvsc 0022487b-74b7-0022-487b-74b70022487b eth0: Data path switched to VF: enP32736s1
Jan 14 13:35:47.887803 systemd-networkd[1331]: enP32736s1: Link UP
Jan 14 13:35:47.888129 systemd-networkd[1331]: eth0: Link UP
Jan 14 13:35:47.888135 systemd-networkd[1331]: eth0: Gained carrier
Jan 14 13:35:47.888182 systemd-networkd[1331]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jan 14 13:35:47.892514 systemd-networkd[1331]: enP32736s1: Gained carrier
Jan 14 13:35:47.906026 zram_generator::config[1458]: No configuration found.
Jan 14 13:35:47.907273 systemd-networkd[1331]: eth0: DHCPv4 address 10.200.20.15/24, gateway 10.200.20.1 acquired from 168.63.129.16
Jan 14 13:35:48.026847 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jan 14 13:35:48.101482 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Jan 14 13:35:48.110240 systemd[1]: Reloading finished in 299 ms.
Jan 14 13:35:48.145073 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Jan 14 13:35:48.171043 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Jan 14 13:35:48.186164 systemd[1]: Starting ensure-sysext.service...
Jan 14 13:35:48.192201 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Jan 14 13:35:48.202064 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Jan 14 13:35:48.209337 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Jan 14 13:35:48.221173 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Jan 14 13:35:48.236285 systemd[1]: Reloading requested from client PID 1515 ('systemctl') (unit ensure-sysext.service)...
Jan 14 13:35:48.236306 systemd[1]: Reloading...
Jan 14 13:35:48.273465 systemd-tmpfiles[1519]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Jan 14 13:35:48.274569 systemd-tmpfiles[1519]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Jan 14 13:35:48.275823 systemd-tmpfiles[1519]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Jan 14 13:35:48.277304 systemd-tmpfiles[1519]: ACLs are not supported, ignoring.
Jan 14 13:35:48.277351 systemd-tmpfiles[1519]: ACLs are not supported, ignoring.
Jan 14 13:35:48.298030 zram_generator::config[1550]: No configuration found.
Jan 14 13:35:48.314454 systemd-tmpfiles[1519]: Detected autofs mount point /boot during canonicalization of boot.
Jan 14 13:35:48.314464 systemd-tmpfiles[1519]: Skipping /boot
Jan 14 13:35:48.323609 lvm[1516]:   WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Jan 14 13:35:48.326092 systemd-tmpfiles[1519]: Detected autofs mount point /boot during canonicalization of boot.
Jan 14 13:35:48.326102 systemd-tmpfiles[1519]: Skipping /boot
Jan 14 13:35:48.422409 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jan 14 13:35:48.496278 systemd[1]: Reloading finished in 259 ms.
Jan 14 13:35:48.521425 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Jan 14 13:35:48.529789 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Jan 14 13:35:48.538069 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 14 13:35:48.546617 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 14 13:35:48.546795 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 14 13:35:48.559580 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jan 14 13:35:48.573242 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Jan 14 13:35:48.605262 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Jan 14 13:35:48.613878 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Jan 14 13:35:48.622682 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Jan 14 13:35:48.630799 lvm[1621]:   WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Jan 14 13:35:48.639329 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jan 14 13:35:48.653223 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Jan 14 13:35:48.668347 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 14 13:35:48.678028 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Jan 14 13:35:48.695602 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jan 14 13:35:48.699068 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jan 14 13:35:48.713060 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jan 14 13:35:48.733377 augenrules[1647]: No rules
Jan 14 13:35:48.727426 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jan 14 13:35:48.738541 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 14 13:35:48.739494 systemd[1]: audit-rules.service: Deactivated successfully.
Jan 14 13:35:48.739688 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Jan 14 13:35:48.746704 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Jan 14 13:35:48.758468 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Jan 14 13:35:48.766399 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jan 14 13:35:48.766624 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jan 14 13:35:48.775611 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 14 13:35:48.775734 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jan 14 13:35:48.783781 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jan 14 13:35:48.783906 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jan 14 13:35:48.800575 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jan 14 13:35:48.804030 systemd-resolved[1626]: Positive Trust Anchors:
Jan 14 13:35:48.804044 systemd-resolved[1626]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jan 14 13:35:48.804080 systemd-resolved[1626]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
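The positive trust anchor that systemd-resolved prints above is the DNS root DS record (key tag 20326, algorithm 8 = RSA/SHA-256, digest type 2 = SHA-256). Here is a small hedged sketch that splits that record into its fields; the field names follow RFC 4034, and the parsing itself is plain string handling, not a resolved API.

```python
# Sketch: break the root DS record logged by systemd-resolved into its fields
# (". IN DS <key tag> <algorithm> <digest type> <digest>", per RFC 4034).
DS = (". IN DS 20326 8 2 "
      "e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d")

owner, klass, rrtype, key_tag, algorithm, digest_type, digest = DS.split()
print({
    "owner": owner,                   # "." = the DNS root
    "key_tag": int(key_tag),          # 20326 (the 2017 root KSK)
    "algorithm": int(algorithm),      # 8 = RSA/SHA-256
    "digest_type": int(digest_type),  # 2 = SHA-256
    "digest_len": len(digest),        # 64 hex chars = 256-bit digest
})
```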
Jan 14 13:35:48.810345 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jan 14 13:35:48.818726 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jan 14 13:35:48.827507 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jan 14 13:35:48.834553 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 14 13:35:48.836683 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 14 13:35:48.844711 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jan 14 13:35:48.844878 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jan 14 13:35:48.853569 systemd-resolved[1626]: Using system hostname 'ci-4186.1.0-a-e83668d6e0'.
Jan 14 13:35:48.854404 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 14 13:35:48.854538 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jan 14 13:35:48.862703 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jan 14 13:35:48.870356 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jan 14 13:35:48.871085 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jan 14 13:35:48.882563 systemd[1]: Reached target network.target - Network.
Jan 14 13:35:48.888520 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jan 14 13:35:48.901192 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Jan 14 13:35:48.908199 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jan 14 13:35:48.910237 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jan 14 13:35:48.922709 augenrules[1666]: /sbin/augenrules: No change
Jan 14 13:35:48.927255 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jan 14 13:35:48.928646 augenrules[1685]: No rules
Jan 14 13:35:48.934455 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jan 14 13:35:48.944275 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jan 14 13:35:48.950319 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 14 13:35:48.950685 systemd[1]: Reached target time-set.target - System Time Set.
Jan 14 13:35:48.959267 systemd[1]: audit-rules.service: Deactivated successfully.
Jan 14 13:35:48.960067 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Jan 14 13:35:48.965941 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jan 14 13:35:48.966112 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jan 14 13:35:48.973294 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 14 13:35:48.973419 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jan 14 13:35:48.979734 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 14 13:35:48.979876 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jan 14 13:35:48.987476 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jan 14 13:35:48.987601 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jan 14 13:35:48.995529 systemd[1]: Finished ensure-sysext.service.
Jan 14 13:35:49.003532 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 14 13:35:49.003612 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 14 13:35:49.498247 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Jan 14 13:35:49.505568 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Jan 14 13:35:49.522097 systemd-networkd[1331]: eth0: Gained IPv6LL
Jan 14 13:35:49.524806 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Jan 14 13:35:49.533296 systemd[1]: Reached target network-online.target - Network is Online.
Jan 14 13:35:49.714141 systemd-networkd[1331]: enP32736s1: Gained IPv6LL
Jan 14 13:35:52.110014 ldconfig[1288]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Jan 14 13:35:52.138452 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Jan 14 13:35:52.152121 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Jan 14 13:35:52.167422 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Jan 14 13:35:52.174033 systemd[1]: Reached target sysinit.target - System Initialization.
Jan 14 13:35:52.180055 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Jan 14 13:35:52.187063 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Jan 14 13:35:52.194205 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Jan 14 13:35:52.200503 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Jan 14 13:35:52.208400 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Jan 14 13:35:52.215865 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Jan 14 13:35:52.215897 systemd[1]: Reached target paths.target - Path Units.
Jan 14 13:35:52.222141 systemd[1]: Reached target timers.target - Timer Units.
Jan 14 13:35:52.229042 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Jan 14 13:35:52.237279 systemd[1]: Starting docker.socket - Docker Socket for the API...
Jan 14 13:35:52.246514 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Jan 14 13:35:52.252787 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Jan 14 13:35:52.259198 systemd[1]: Reached target sockets.target - Socket Units.
Jan 14 13:35:52.264929 systemd[1]: Reached target basic.target - Basic System.
Jan 14 13:35:52.271123 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Jan 14 13:35:52.271151 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Jan 14 13:35:52.277084 systemd[1]: Starting chronyd.service - NTP client/server...
Jan 14 13:35:52.288152 systemd[1]: Starting containerd.service - containerd container runtime...
Jan 14 13:35:52.298162 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Jan 14 13:35:52.305478 (chronyd)[1705]: chronyd.service: Referenced but unset environment variable evaluates to an empty string: OPTIONS
Jan 14 13:35:52.307034 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Jan 14 13:35:52.314212 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Jan 14 13:35:52.323221 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Jan 14 13:35:52.323807 jq[1712]: false
Jan 14 13:35:52.329577 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Jan 14 13:35:52.329618 systemd[1]: hv_fcopy_daemon.service - Hyper-V FCOPY daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_fcopy).
Jan 14 13:35:52.331226 systemd[1]: Started hv_kvp_daemon.service - Hyper-V KVP daemon.
Jan 14 13:35:52.337130 systemd[1]: hv_vss_daemon.service - Hyper-V VSS daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_vss).
Jan 14 13:35:52.339158 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 14 13:35:52.341972 KVP[1714]: KVP starting; pid is:1714
Jan 14 13:35:52.346075 chronyd[1717]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +ASYNCDNS +NTS +SECHASH +IPV6 -DEBUG)
Jan 14 13:35:52.362206 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Jan 14 13:35:52.366973 chronyd[1717]: Timezone right/UTC failed leap second check, ignoring
Jan 14 13:35:52.368475 chronyd[1717]: Loaded seccomp filter (level 2)
Jan 14 13:35:52.371661 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Jan 14 13:35:52.380157 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Jan 14 13:35:52.391198 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Jan 14 13:35:52.398926 extend-filesystems[1713]: Found loop4
Jan 14 13:35:52.410799 extend-filesystems[1713]: Found loop5
Jan 14 13:35:52.410799 extend-filesystems[1713]: Found loop6
Jan 14 13:35:52.410799 extend-filesystems[1713]: Found loop7
Jan 14 13:35:52.410799 extend-filesystems[1713]: Found sda
Jan 14 13:35:52.410799 extend-filesystems[1713]: Found sda1
Jan 14 13:35:52.410799 extend-filesystems[1713]: Found sda2
Jan 14 13:35:52.410799 extend-filesystems[1713]: Found sda3
Jan 14 13:35:52.410799 extend-filesystems[1713]: Found usr
Jan 14 13:35:52.410799 extend-filesystems[1713]: Found sda4
Jan 14 13:35:52.410799 extend-filesystems[1713]: Found sda6
Jan 14 13:35:52.410799 extend-filesystems[1713]: Found sda7
Jan 14 13:35:52.410799 extend-filesystems[1713]: Found sda9
Jan 14 13:35:52.410799 extend-filesystems[1713]: Checking size of /dev/sda9
Jan 14 13:35:52.616119 kernel: hv_utils: KVP IC version 4.0
Jan 14 13:35:52.403198 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Jan 14 13:35:52.616313 extend-filesystems[1713]: Old size kept for /dev/sda9
Jan 14 13:35:52.616313 extend-filesystems[1713]: Found sr0
Jan 14 13:35:52.436465 KVP[1714]: KVP LIC Version: 3.1
Jan 14 13:35:52.438566 systemd[1]: Starting systemd-logind.service - User Login Management...
Jan 14 13:35:52.660987 coreos-metadata[1707]: Jan 14 13:35:52.629 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Jan 14 13:35:52.660987 coreos-metadata[1707]: Jan 14 13:35:52.635 INFO Fetch successful
Jan 14 13:35:52.660987 coreos-metadata[1707]: Jan 14 13:35:52.635 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1
Jan 14 13:35:52.660987 coreos-metadata[1707]: Jan 14 13:35:52.639 INFO Fetch successful
Jan 14 13:35:52.660987 coreos-metadata[1707]: Jan 14 13:35:52.640 INFO Fetching http://168.63.129.16/machine/00cd0636-7da8-41e5-a746-907eef01b5f0/b47ea196%2D6203%2D482d%2D88ef%2D9ea6fff606f6.%5Fci%2D4186.1.0%2Da%2De83668d6e0?comp=config&type=sharedConfig&incarnation=1: Attempt #1
Jan 14 13:35:52.660987 coreos-metadata[1707]: Jan 14 13:35:52.642 INFO Fetch successful
Jan 14 13:35:52.660987 coreos-metadata[1707]: Jan 14 13:35:52.642 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1
Jan 14 13:35:52.660987 coreos-metadata[1707]: Jan 14 13:35:52.658 INFO Fetch successful
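
Note: the coreos-metadata fetches above hit two different Azure endpoints: the wireserver at 168.63.129.16 and the Instance Metadata Service (IMDS) at 169.254.169.254. A minimal Python sketch of the IMDS query logged above (IMDS requires the "Metadata: true" header; the vmSize value returned is not recorded in this log):

    import urllib.request

    # Sketch of the IMDS request coreos-metadata logs above; IMDS rejects
    # requests that do not carry the "Metadata: true" header.
    url = ("http://169.254.169.254/metadata/instance/compute/vmSize"
           "?api-version=2017-08-01&format=text")
    req = urllib.request.Request(url, headers={"Metadata": "true"})
    with urllib.request.urlopen(req, timeout=5) as resp:
        print(resp.read().decode())  # a VM size string; value not shown in this log
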
Jan 14 13:35:52.540427 dbus-daemon[1708]: [system] SELinux support is enabled
Jan 14 13:35:52.448743 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Jan 14 13:35:52.449309 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Jan 14 13:35:52.668894 update_engine[1740]: I20250114 13:35:52.539098  1740 main.cc:92] Flatcar Update Engine starting
Jan 14 13:35:52.668894 update_engine[1740]: I20250114 13:35:52.551440  1740 update_check_scheduler.cc:74] Next update check in 3m3s
Jan 14 13:35:52.462170 systemd[1]: Starting update-engine.service - Update Engine...
Jan 14 13:35:52.683863 jq[1741]: true
Jan 14 13:35:52.683647 dbus-daemon[1708]: [system] Successfully activated service 'org.freedesktop.systemd1'
Jan 14 13:35:52.471234 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Jan 14 13:35:52.504640 systemd[1]: Started chronyd.service - NTP client/server.
Jan 14 13:35:52.522342 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Jan 14 13:35:52.522530 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Jan 14 13:35:52.525328 systemd[1]: motdgen.service: Deactivated successfully.
Jan 14 13:35:52.525473 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Jan 14 13:35:52.554550 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Jan 14 13:35:52.564419 systemd-logind[1736]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Jan 14 13:35:52.571043 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Jan 14 13:35:52.573293 systemd-logind[1736]: New seat seat0.
Jan 14 13:35:52.589286 systemd[1]: Started systemd-logind.service - User Login Management.
Jan 14 13:35:52.603488 systemd[1]: extend-filesystems.service: Deactivated successfully.
Jan 14 13:35:52.603652 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Jan 14 13:35:52.630431 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Jan 14 13:35:52.631087 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Jan 14 13:35:52.674372 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Jan 14 13:35:52.674414 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Jan 14 13:35:52.694156 jq[1758]: true
Jan 14 13:35:52.675511 (ntainerd)[1759]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Jan 14 13:35:52.686492 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Jan 14 13:35:52.686511 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Jan 14 13:35:52.703859 systemd[1]: Started update-engine.service - Update Engine.
Jan 14 13:35:52.721876 tar[1754]: linux-arm64/helm
Jan 14 13:35:52.725738 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Jan 14 13:35:52.763383 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Jan 14 13:35:52.771572 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Jan 14 13:35:52.879125 bash[1799]: Updated "/home/core/.ssh/authorized_keys"
Jan 14 13:35:52.880452 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Jan 14 13:35:52.889514 sshd_keygen[1739]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Jan 14 13:35:52.891532 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 38 scanned by (udev-worker) (1751)
Jan 14 13:35:52.894370 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Jan 14 13:35:52.981639 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Jan 14 13:35:53.008434 systemd[1]: Starting issuegen.service - Generate /run/issue...
Jan 14 13:35:53.019454 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent...
Jan 14 13:35:53.059502 systemd[1]: issuegen.service: Deactivated successfully.
Jan 14 13:35:53.059705 systemd[1]: Finished issuegen.service - Generate /run/issue.
Jan 14 13:35:53.076333 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Jan 14 13:35:53.112264 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Jan 14 13:35:53.135293 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent.
Jan 14 13:35:53.151174 systemd[1]: Started getty@tty1.service - Getty on tty1.
Jan 14 13:35:53.163458 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0.
Jan 14 13:35:53.166430 locksmithd[1779]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Jan 14 13:35:53.171134 systemd[1]: Reached target getty.target - Login Prompts.
Jan 14 13:35:53.261306 containerd[1759]: time="2025-01-14T13:35:53.261202360Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23
Jan 14 13:35:53.314122 containerd[1759]: time="2025-01-14T13:35:53.313883880Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
Jan 14 13:35:53.316431 containerd[1759]: time="2025-01-14T13:35:53.316380840Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.71-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
Jan 14 13:35:53.317025 containerd[1759]: time="2025-01-14T13:35:53.316638680Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
Jan 14 13:35:53.317025 containerd[1759]: time="2025-01-14T13:35:53.316666680Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
Jan 14 13:35:53.317025 containerd[1759]: time="2025-01-14T13:35:53.316822800Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
Jan 14 13:35:53.317025 containerd[1759]: time="2025-01-14T13:35:53.316840160Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
Jan 14 13:35:53.317025 containerd[1759]: time="2025-01-14T13:35:53.316909720Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
Jan 14 13:35:53.317025 containerd[1759]: time="2025-01-14T13:35:53.316921960Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
Jan 14 13:35:53.317309 containerd[1759]: time="2025-01-14T13:35:53.317288800Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Jan 14 13:35:53.317363 containerd[1759]: time="2025-01-14T13:35:53.317350840Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
Jan 14 13:35:53.317426 containerd[1759]: time="2025-01-14T13:35:53.317412200Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
Jan 14 13:35:53.317567 containerd[1759]: time="2025-01-14T13:35:53.317471760Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
Jan 14 13:35:53.317664 containerd[1759]: time="2025-01-14T13:35:53.317649880Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
Jan 14 13:35:53.318298 containerd[1759]: time="2025-01-14T13:35:53.317974320Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
Jan 14 13:35:53.318298 containerd[1759]: time="2025-01-14T13:35:53.318121920Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Jan 14 13:35:53.318298 containerd[1759]: time="2025-01-14T13:35:53.318135880Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
Jan 14 13:35:53.318298 containerd[1759]: time="2025-01-14T13:35:53.318225000Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
Jan 14 13:35:53.318298 containerd[1759]: time="2025-01-14T13:35:53.318265800Z" level=info msg="metadata content store policy set" policy=shared
Jan 14 13:35:53.330496 containerd[1759]: time="2025-01-14T13:35:53.330454080Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
Jan 14 13:35:53.330647 containerd[1759]: time="2025-01-14T13:35:53.330633960Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
Jan 14 13:35:53.330888 containerd[1759]: time="2025-01-14T13:35:53.330869480Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
Jan 14 13:35:53.332627 containerd[1759]: time="2025-01-14T13:35:53.331028840Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
Jan 14 13:35:53.332627 containerd[1759]: time="2025-01-14T13:35:53.331052360Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
Jan 14 13:35:53.332627 containerd[1759]: time="2025-01-14T13:35:53.331225880Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
Jan 14 13:35:53.332627 containerd[1759]: time="2025-01-14T13:35:53.331446160Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
Jan 14 13:35:53.332627 containerd[1759]: time="2025-01-14T13:35:53.331533920Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
Jan 14 13:35:53.332627 containerd[1759]: time="2025-01-14T13:35:53.331550200Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
Jan 14 13:35:53.332627 containerd[1759]: time="2025-01-14T13:35:53.331565000Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
Jan 14 13:35:53.332627 containerd[1759]: time="2025-01-14T13:35:53.331580200Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
Jan 14 13:35:53.332627 containerd[1759]: time="2025-01-14T13:35:53.331593240Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Jan 14 13:35:53.332627 containerd[1759]: time="2025-01-14T13:35:53.331605920Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Jan 14 13:35:53.332627 containerd[1759]: time="2025-01-14T13:35:53.331619640Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Jan 14 13:35:53.332627 containerd[1759]: time="2025-01-14T13:35:53.331634120Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Jan 14 13:35:53.332627 containerd[1759]: time="2025-01-14T13:35:53.331649640Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Jan 14 13:35:53.332627 containerd[1759]: time="2025-01-14T13:35:53.331662120Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Jan 14 13:35:53.332943 containerd[1759]: time="2025-01-14T13:35:53.331675040Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Jan 14 13:35:53.332943 containerd[1759]: time="2025-01-14T13:35:53.331698160Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Jan 14 13:35:53.332943 containerd[1759]: time="2025-01-14T13:35:53.331713120Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
Jan 14 13:35:53.332943 containerd[1759]: time="2025-01-14T13:35:53.331728000Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
Jan 14 13:35:53.332943 containerd[1759]: time="2025-01-14T13:35:53.331741200Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
Jan 14 13:35:53.332943 containerd[1759]: time="2025-01-14T13:35:53.331752800Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Jan 14 13:35:53.332943 containerd[1759]: time="2025-01-14T13:35:53.331764960Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Jan 14 13:35:53.332943 containerd[1759]: time="2025-01-14T13:35:53.331776120Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Jan 14 13:35:53.332943 containerd[1759]: time="2025-01-14T13:35:53.331787880Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Jan 14 13:35:53.332943 containerd[1759]: time="2025-01-14T13:35:53.331800400Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Jan 14 13:35:53.332943 containerd[1759]: time="2025-01-14T13:35:53.331813720Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Jan 14 13:35:53.332943 containerd[1759]: time="2025-01-14T13:35:53.331824920Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Jan 14 13:35:53.332943 containerd[1759]: time="2025-01-14T13:35:53.331836320Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Jan 14 13:35:53.332943 containerd[1759]: time="2025-01-14T13:35:53.331848160Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
Jan 14 13:35:53.332943 containerd[1759]: time="2025-01-14T13:35:53.331862880Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
Jan 14 13:35:53.333215 containerd[1759]: time="2025-01-14T13:35:53.331885440Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
Jan 14 13:35:53.333215 containerd[1759]: time="2025-01-14T13:35:53.331897800Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
Jan 14 13:35:53.333215 containerd[1759]: time="2025-01-14T13:35:53.331907560Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
Jan 14 13:35:53.333215 containerd[1759]: time="2025-01-14T13:35:53.331952840Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
Jan 14 13:35:53.333215 containerd[1759]: time="2025-01-14T13:35:53.331971120Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
Jan 14 13:35:53.333215 containerd[1759]: time="2025-01-14T13:35:53.331982320Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
Jan 14 13:35:53.333215 containerd[1759]: time="2025-01-14T13:35:53.332015960Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
Jan 14 13:35:53.333215 containerd[1759]: time="2025-01-14T13:35:53.332025880Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
Jan 14 13:35:53.333215 containerd[1759]: time="2025-01-14T13:35:53.332037520Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
Jan 14 13:35:53.333215 containerd[1759]: time="2025-01-14T13:35:53.332047560Z" level=info msg="NRI interface is disabled by configuration."
Jan 14 13:35:53.333215 containerd[1759]: time="2025-01-14T13:35:53.332062040Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
Jan 14 13:35:53.333399 containerd[1759]: time="2025-01-14T13:35:53.332340240Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}"
Jan 14 13:35:53.333399 containerd[1759]: time="2025-01-14T13:35:53.332391280Z" level=info msg="Connect containerd service"
Jan 14 13:35:53.333399 containerd[1759]: time="2025-01-14T13:35:53.332428680Z" level=info msg="using legacy CRI server"
Jan 14 13:35:53.333399 containerd[1759]: time="2025-01-14T13:35:53.332435400Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Jan 14 13:35:53.333399 containerd[1759]: time="2025-01-14T13:35:53.332546720Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\""
Jan 14 13:35:53.334170 containerd[1759]: time="2025-01-14T13:35:53.334144400Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Jan 14 13:35:53.334355 containerd[1759]: time="2025-01-14T13:35:53.334305120Z" level=info msg="Start subscribing containerd event"
Jan 14 13:35:53.334400 containerd[1759]: time="2025-01-14T13:35:53.334373280Z" level=info msg="Start recovering state"
Jan 14 13:35:53.334466 containerd[1759]: time="2025-01-14T13:35:53.334445800Z" level=info msg="Start event monitor"
Jan 14 13:35:53.334466 containerd[1759]: time="2025-01-14T13:35:53.334462520Z" level=info msg="Start snapshots syncer"
Jan 14 13:35:53.334513 containerd[1759]: time="2025-01-14T13:35:53.334477960Z" level=info msg="Start cni network conf syncer for default"
Jan 14 13:35:53.334513 containerd[1759]: time="2025-01-14T13:35:53.334486560Z" level=info msg="Start streaming server"
Jan 14 13:35:53.334722 containerd[1759]: time="2025-01-14T13:35:53.334695160Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Jan 14 13:35:53.334765 containerd[1759]: time="2025-01-14T13:35:53.334744320Z" level=info msg=serving... address=/run/containerd/containerd.sock
Jan 14 13:35:53.334889 systemd[1]: Started containerd.service - containerd container runtime.
Jan 14 13:35:53.343475 containerd[1759]: time="2025-01-14T13:35:53.343139440Z" level=info msg="containerd successfully booted in 0.085144s"
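
Note: the "Start cri plugin with config" line at 13:35:53.332 dumps containerd's effective CRI configuration. A config.toml sketch that would yield the key settings visible there (overlayfs snapshotter, runc via io.containerd.runc.v2 with SystemdCgroup=true, sandbox image registry.k8s.io/pause:3.8, CNI config under /etc/cni/net.d) might look like the following; this is an illustration derived from the logged values, not the file actually shipped on this host:

    # Illustrative sketch only; values taken from the CRI config dump above.
    version = 2

    [plugins."io.containerd.grpc.v1.cri"]
      sandbox_image = "registry.k8s.io/pause:3.8"

    [plugins."io.containerd.grpc.v1.cri".containerd]
      snapshotter = "overlayfs"
      default_runtime_name = "runc"

    [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc]
      runtime_type = "io.containerd.runc.v2"

    [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
      SystemdCgroup = true

    [plugins."io.containerd.grpc.v1.cri".cni]
      bin_dir = "/opt/cni/bin"
      conf_dir = "/etc/cni/net.d"
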
Jan 14 13:35:53.438237 tar[1754]: linux-arm64/LICENSE
Jan 14 13:35:53.438319 tar[1754]: linux-arm64/README.md
Jan 14 13:35:53.448716 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Jan 14 13:35:53.540300 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 14 13:35:53.548368 systemd[1]: Reached target multi-user.target - Multi-User System.
Jan 14 13:35:53.554532 systemd[1]: Startup finished in 664ms (kernel) + 12.516s (initrd) + 12.002s (userspace) = 25.183s.
Jan 14 13:35:53.718176 (kubelet)[1891]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jan 14 13:35:53.737376 agetty[1878]: failed to open credentials directory
Jan 14 13:35:53.738490 agetty[1875]: failed to open credentials directory
Jan 14 13:35:54.079158 login[1875]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 13:35:54.085298 login[1878]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 13:35:54.090622 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Jan 14 13:35:54.096253 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Jan 14 13:35:54.099165 systemd-logind[1736]: New session 2 of user core.
Jan 14 13:35:54.103135 systemd-logind[1736]: New session 1 of user core.
Jan 14 13:35:54.114929 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Jan 14 13:35:54.122374 systemd[1]: Starting user@500.service - User Manager for UID 500...
Jan 14 13:35:54.124798 (systemd)[1904]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Jan 14 13:35:54.215858 kubelet[1891]: E0114 13:35:54.215778    1891 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jan 14 13:35:54.218521 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jan 14 13:35:54.218662 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jan 14 13:35:54.380016 systemd[1904]: Queued start job for default target default.target.
Jan 14 13:35:54.386866 systemd[1904]: Created slice app.slice - User Application Slice.
Jan 14 13:35:54.386895 systemd[1904]: Reached target paths.target - Paths.
Jan 14 13:35:54.386907 systemd[1904]: Reached target timers.target - Timers.
Jan 14 13:35:54.388052 systemd[1904]: Starting dbus.socket - D-Bus User Message Bus Socket...
Jan 14 13:35:54.398132 systemd[1904]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Jan 14 13:35:54.398311 systemd[1904]: Reached target sockets.target - Sockets.
Jan 14 13:35:54.398326 systemd[1904]: Reached target basic.target - Basic System.
Jan 14 13:35:54.398367 systemd[1904]: Reached target default.target - Main User Target.
Jan 14 13:35:54.398392 systemd[1904]: Startup finished in 265ms.
Jan 14 13:35:54.398641 systemd[1]: Started user@500.service - User Manager for UID 500.
Jan 14 13:35:54.399894 systemd[1]: Started session-1.scope - Session 1 of User core.
Jan 14 13:35:54.400544 systemd[1]: Started session-2.scope - Session 2 of User core.
Jan 14 13:35:55.113021 waagent[1871]: 2025-01-14T13:35:55.111060Z INFO Daemon Daemon Azure Linux Agent Version: 2.9.1.1
Jan 14 13:35:55.116950 waagent[1871]: 2025-01-14T13:35:55.116886Z INFO Daemon Daemon OS: flatcar 4186.1.0
Jan 14 13:35:55.121829 waagent[1871]: 2025-01-14T13:35:55.121778Z INFO Daemon Daemon Python: 3.11.10
Jan 14 13:35:55.126863 waagent[1871]: 2025-01-14T13:35:55.126796Z INFO Daemon Daemon Run daemon
Jan 14 13:35:55.131167 waagent[1871]: 2025-01-14T13:35:55.131122Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4186.1.0'
Jan 14 13:35:55.140457 waagent[1871]: 2025-01-14T13:35:55.140404Z INFO Daemon Daemon Using waagent for provisioning
Jan 14 13:35:55.146465 waagent[1871]: 2025-01-14T13:35:55.146423Z INFO Daemon Daemon Activate resource disk
Jan 14 13:35:55.151418 waagent[1871]: 2025-01-14T13:35:55.151375Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb
Jan 14 13:35:55.163971 waagent[1871]: 2025-01-14T13:35:55.163921Z INFO Daemon Daemon Found device: None
Jan 14 13:35:55.168419 waagent[1871]: 2025-01-14T13:35:55.168377Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology
Jan 14 13:35:55.177521 waagent[1871]: 2025-01-14T13:35:55.177472Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0
Jan 14 13:35:55.189370 waagent[1871]: 2025-01-14T13:35:55.189323Z INFO Daemon Daemon Clean protocol and wireserver endpoint
Jan 14 13:35:55.195118 waagent[1871]: 2025-01-14T13:35:55.195077Z INFO Daemon Daemon Running default provisioning handler
Jan 14 13:35:55.206930 waagent[1871]: 2025-01-14T13:35:55.206404Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4.
Jan 14 13:35:55.220707 waagent[1871]: 2025-01-14T13:35:55.220643Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service'
Jan 14 13:35:55.230505 waagent[1871]: 2025-01-14T13:35:55.230456Z INFO Daemon Daemon cloud-init is enabled: False
Jan 14 13:35:55.235772 waagent[1871]: 2025-01-14T13:35:55.235729Z INFO Daemon Daemon Copying ovf-env.xml
Jan 14 13:35:55.334020 waagent[1871]: 2025-01-14T13:35:55.328685Z INFO Daemon Daemon Successfully mounted dvd
Jan 14 13:35:55.343665 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully.
Jan 14 13:35:55.345955 waagent[1871]: 2025-01-14T13:35:55.345885Z INFO Daemon Daemon Detect protocol endpoint
Jan 14 13:35:55.350884 waagent[1871]: 2025-01-14T13:35:55.350835Z INFO Daemon Daemon Clean protocol and wireserver endpoint
Jan 14 13:35:55.356957 waagent[1871]: 2025-01-14T13:35:55.356914Z INFO Daemon Daemon WireServer endpoint is not found. Rerun dhcp handler
Jan 14 13:35:55.364163 waagent[1871]: 2025-01-14T13:35:55.364089Z INFO Daemon Daemon Test for route to 168.63.129.16
Jan 14 13:35:55.369451 waagent[1871]: 2025-01-14T13:35:55.369409Z INFO Daemon Daemon Route to 168.63.129.16 exists
Jan 14 13:35:55.374574 waagent[1871]: 2025-01-14T13:35:55.374536Z INFO Daemon Daemon Wire server endpoint:168.63.129.16
Jan 14 13:35:55.423654 waagent[1871]: 2025-01-14T13:35:55.423604Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05
Jan 14 13:35:55.430426 waagent[1871]: 2025-01-14T13:35:55.430399Z INFO Daemon Daemon Wire protocol version:2012-11-30
Jan 14 13:35:55.436059 waagent[1871]: 2025-01-14T13:35:55.436021Z INFO Daemon Daemon Server preferred version:2015-04-05
Jan 14 13:35:55.631537 waagent[1871]: 2025-01-14T13:35:55.631382Z INFO Daemon Daemon Initializing goal state during protocol detection
Jan 14 13:35:55.639064 waagent[1871]: 2025-01-14T13:35:55.638980Z INFO Daemon Daemon Forcing an update of the goal state.
Jan 14 13:35:55.650809 waagent[1871]: 2025-01-14T13:35:55.650754Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1]
Jan 14 13:35:55.715086 waagent[1871]: 2025-01-14T13:35:55.715033Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.159
Jan 14 13:35:55.721177 waagent[1871]: 2025-01-14T13:35:55.721126Z INFO Daemon
Jan 14 13:35:55.724109 waagent[1871]: 2025-01-14T13:35:55.724064Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: 07b792b3-cf36-4963-9d9f-386a48dfb2a2 eTag: 18217793258176285486 source: Fabric]
Jan 14 13:35:55.736089 waagent[1871]: 2025-01-14T13:35:55.736043Z INFO Daemon The vmSettings originated via Fabric; will ignore them.
Jan 14 13:35:55.743028 waagent[1871]: 2025-01-14T13:35:55.742961Z INFO Daemon
Jan 14 13:35:55.745817 waagent[1871]: 2025-01-14T13:35:55.745771Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1]
Jan 14 13:35:55.758846 waagent[1871]: 2025-01-14T13:35:55.758808Z INFO Daemon Daemon Downloading artifacts profile blob
Jan 14 13:35:55.848060 waagent[1871]: 2025-01-14T13:35:55.847958Z INFO Daemon Downloaded certificate {'thumbprint': 'ECA07113A60CA0324DC8E0395AB43A70A2DE4979', 'hasPrivateKey': False}
Jan 14 13:35:55.860265 waagent[1871]: 2025-01-14T13:35:55.860211Z INFO Daemon Downloaded certificate {'thumbprint': '732F73DDB4A3CA51C36634AB02F5B2AE037A6AB6', 'hasPrivateKey': True}
Jan 14 13:35:55.870903 waagent[1871]: 2025-01-14T13:35:55.870852Z INFO Daemon Fetch goal state completed
Jan 14 13:35:55.882201 waagent[1871]: 2025-01-14T13:35:55.882103Z INFO Daemon Daemon Starting provisioning
Jan 14 13:35:55.888360 waagent[1871]: 2025-01-14T13:35:55.888305Z INFO Daemon Daemon Handle ovf-env.xml.
Jan 14 13:35:55.893623 waagent[1871]: 2025-01-14T13:35:55.893579Z INFO Daemon Daemon Set hostname [ci-4186.1.0-a-e83668d6e0]
Jan 14 13:35:55.935017 waagent[1871]: 2025-01-14T13:35:55.930065Z INFO Daemon Daemon Publish hostname [ci-4186.1.0-a-e83668d6e0]
Jan 14 13:35:55.936615 waagent[1871]: 2025-01-14T13:35:55.936559Z INFO Daemon Daemon Examine /proc/net/route for primary interface
Jan 14 13:35:55.942918 waagent[1871]: 2025-01-14T13:35:55.942870Z INFO Daemon Daemon Primary interface is [eth0]
Jan 14 13:35:55.972348 systemd-networkd[1331]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jan 14 13:35:55.972357 systemd-networkd[1331]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Jan 14 13:35:55.972383 systemd-networkd[1331]: eth0: DHCP lease lost
Jan 14 13:35:55.974023 waagent[1871]: 2025-01-14T13:35:55.973287Z INFO Daemon Daemon Create user account if not exists
Jan 14 13:35:55.979510 waagent[1871]: 2025-01-14T13:35:55.979450Z INFO Daemon Daemon User core already exists, skip useradd
Jan 14 13:35:55.985720 waagent[1871]: 2025-01-14T13:35:55.985664Z INFO Daemon Daemon Configure sudoer
Jan 14 13:35:55.990858 waagent[1871]: 2025-01-14T13:35:55.990804Z INFO Daemon Daemon Configure sshd
Jan 14 13:35:55.995729 waagent[1871]: 2025-01-14T13:35:55.995675Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive.
Jan 14 13:35:55.995822 systemd-networkd[1331]: eth0: DHCPv6 lease lost
Jan 14 13:35:56.011212 waagent[1871]: 2025-01-14T13:35:56.011134Z INFO Daemon Daemon Deploy ssh public key.
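
Note: the "Configure sshd" step above reports adding a snippet that disables password-based authentication and enables server-side keepalive probing; the snippet's exact contents are not recorded in this log. A typical drop-in of that kind might contain directives such as:

    # Hypothetical sshd_config drop-in; the actual file written by waagent is
    # not shown in this log.
    PasswordAuthentication no
    ChallengeResponseAuthentication no
    ClientAliveInterval 180
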
Jan 14 13:35:56.025063 systemd-networkd[1331]: eth0: DHCPv4 address 10.200.20.15/24, gateway 10.200.20.1 acquired from 168.63.129.16
Jan 14 13:35:57.155675 waagent[1871]: 2025-01-14T13:35:57.155592Z INFO Daemon Daemon Provisioning complete
Jan 14 13:35:57.174175 waagent[1871]: 2025-01-14T13:35:57.174129Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping
Jan 14 13:35:57.180853 waagent[1871]: 2025-01-14T13:35:57.180800Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions.
Jan 14 13:35:57.190883 waagent[1871]: 2025-01-14T13:35:57.190831Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.9.1.1 is the most current agent
Jan 14 13:35:57.317706 waagent[1961]: 2025-01-14T13:35:57.317223Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.9.1.1)
Jan 14 13:35:57.317706 waagent[1961]: 2025-01-14T13:35:57.317362Z INFO ExtHandler ExtHandler OS: flatcar 4186.1.0
Jan 14 13:35:57.317706 waagent[1961]: 2025-01-14T13:35:57.317414Z INFO ExtHandler ExtHandler Python: 3.11.10
Jan 14 13:35:57.366228 waagent[1961]: 2025-01-14T13:35:57.366151Z INFO ExtHandler ExtHandler Distro: flatcar-4186.1.0; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.10; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.20.1;
Jan 14 13:35:57.366539 waagent[1961]: 2025-01-14T13:35:57.366503Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
Jan 14 13:35:57.366661 waagent[1961]: 2025-01-14T13:35:57.366631Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16
Jan 14 13:35:57.374693 waagent[1961]: 2025-01-14T13:35:57.374638Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1]
Jan 14 13:35:57.383982 waagent[1961]: 2025-01-14T13:35:57.383941Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.159
Jan 14 13:35:57.386023 waagent[1961]: 2025-01-14T13:35:57.384545Z INFO ExtHandler
Jan 14 13:35:57.386023 waagent[1961]: 2025-01-14T13:35:57.384617Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: d66aa29a-d29d-49af-982b-4d0a0c34c3fe eTag: 18217793258176285486 source: Fabric]
Jan 14 13:35:57.386023 waagent[1961]: 2025-01-14T13:35:57.384890Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them.
Jan 14 13:35:57.386023 waagent[1961]: 2025-01-14T13:35:57.385431Z INFO ExtHandler
Jan 14 13:35:57.386023 waagent[1961]: 2025-01-14T13:35:57.385500Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1]
Jan 14 13:35:57.389184 waagent[1961]: 2025-01-14T13:35:57.389151Z INFO ExtHandler ExtHandler Downloading artifacts profile blob
Jan 14 13:35:57.464198 waagent[1961]: 2025-01-14T13:35:57.464058Z INFO ExtHandler Downloaded certificate {'thumbprint': 'ECA07113A60CA0324DC8E0395AB43A70A2DE4979', 'hasPrivateKey': False}
Jan 14 13:35:57.464569 waagent[1961]: 2025-01-14T13:35:57.464522Z INFO ExtHandler Downloaded certificate {'thumbprint': '732F73DDB4A3CA51C36634AB02F5B2AE037A6AB6', 'hasPrivateKey': True}
Jan 14 13:35:57.464969 waagent[1961]: 2025-01-14T13:35:57.464928Z INFO ExtHandler Fetch goal state completed
Jan 14 13:35:57.484290 waagent[1961]: 2025-01-14T13:35:57.484220Z INFO ExtHandler ExtHandler WALinuxAgent-2.9.1.1 running as process 1961
Jan 14 13:35:57.484459 waagent[1961]: 2025-01-14T13:35:57.484423Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ********
Jan 14 13:35:57.486081 waagent[1961]: 2025-01-14T13:35:57.486034Z INFO ExtHandler ExtHandler Cgroup monitoring is not supported on ['flatcar', '4186.1.0', '', 'Flatcar Container Linux by Kinvolk']
Jan 14 13:35:57.486458 waagent[1961]: 2025-01-14T13:35:57.486421Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules
Jan 14 13:35:57.505940 waagent[1961]: 2025-01-14T13:35:57.505894Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service
Jan 14 13:35:57.506161 waagent[1961]: 2025-01-14T13:35:57.506121Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup
Jan 14 13:35:57.512723 waagent[1961]: 2025-01-14T13:35:57.512678Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now
Jan 14 13:35:57.519044 systemd[1]: Reloading requested from client PID 1976 ('systemctl') (unit waagent.service)...
Jan 14 13:35:57.519059 systemd[1]: Reloading...
Jan 14 13:35:57.600159 zram_generator::config[2013]: No configuration found.
Jan 14 13:35:57.699067 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jan 14 13:35:57.776404 systemd[1]: Reloading finished in 257 ms.
Jan 14 13:35:57.802153 waagent[1961]: 2025-01-14T13:35:57.800219Z INFO ExtHandler ExtHandler Executing systemctl daemon-reload for setting up waagent-network-setup.service
Jan 14 13:35:57.807952 systemd[1]: Reloading requested from client PID 2064 ('systemctl') (unit waagent.service)...
Jan 14 13:35:57.807967 systemd[1]: Reloading...
Jan 14 13:35:57.898068 zram_generator::config[2101]: No configuration found.
Jan 14 13:35:57.995854 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jan 14 13:35:58.073234 systemd[1]: Reloading finished in 264 ms.
Jan 14 13:35:58.097125 waagent[1961]: 2025-01-14T13:35:58.095179Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service
Jan 14 13:35:58.097125 waagent[1961]: 2025-01-14T13:35:58.095358Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully
Jan 14 13:35:58.916298 waagent[1961]: 2025-01-14T13:35:58.915107Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up.
Jan 14 13:35:58.916298 waagent[1961]: 2025-01-14T13:35:58.915717Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: configuration enabled [True], cgroups enabled [False], python supported: [True]
Jan 14 13:35:58.916634 waagent[1961]: 2025-01-14T13:35:58.916501Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
Jan 14 13:35:58.916634 waagent[1961]: 2025-01-14T13:35:58.916582Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16
Jan 14 13:35:58.916819 waagent[1961]: 2025-01-14T13:35:58.916773Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled.
Jan 14 13:35:58.916933 waagent[1961]: 2025-01-14T13:35:58.916881Z INFO ExtHandler ExtHandler Starting env monitor service.
Jan 14 13:35:58.917155 waagent[1961]: 2025-01-14T13:35:58.917101Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route:
Jan 14 13:35:58.917155 waagent[1961]: Iface        Destination        Gateway         Flags        RefCnt        Use        Metric        Mask                MTU        Window        IRTT
Jan 14 13:35:58.917155 waagent[1961]: eth0        00000000        0114C80A        0003        0        0        1024        00000000        0        0        0
Jan 14 13:35:58.917155 waagent[1961]: eth0        0014C80A        00000000        0001        0        0        1024        00FFFFFF        0        0        0
Jan 14 13:35:58.917155 waagent[1961]: eth0        0114C80A        00000000        0005        0        0        1024        FFFFFFFF        0        0        0
Jan 14 13:35:58.917155 waagent[1961]: eth0        10813FA8        0114C80A        0007        0        0        1024        FFFFFFFF        0        0        0
Jan 14 13:35:58.917155 waagent[1961]: eth0        FEA9FEA9        0114C80A        0007        0        0        1024        FFFFFFFF        0        0        0
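
Note: the /proc/net/route dump above encodes addresses as little-endian hex. A small Python sketch that decodes the values printed there:

    import socket, struct

    # Decode the little-endian hex fields from the /proc/net/route dump above.
    def hex_to_ip(h):
        return socket.inet_ntoa(struct.pack("<L", int(h, 16)))

    print(hex_to_ip("0114C80A"))   # 10.200.20.1     (default gateway)
    print(hex_to_ip("0014C80A"))   # 10.200.20.0     (on-link /24 subnet)
    print(hex_to_ip("10813FA8"))   # 168.63.129.16   (wireserver host route)
    print(hex_to_ip("FEA9FEA9"))   # 169.254.169.254 (IMDS host route)
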
Jan 14 13:35:58.917711 waagent[1961]: 2025-01-14T13:35:58.917655Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service.
Jan 14 13:35:58.917947 waagent[1961]: 2025-01-14T13:35:58.917901Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
Jan 14 13:35:58.918309 waagent[1961]: 2025-01-14T13:35:58.918259Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16
Jan 14 13:35:58.918458 waagent[1961]: 2025-01-14T13:35:58.918417Z INFO EnvHandler ExtHandler Configure routes
Jan 14 13:35:58.918518 waagent[1961]: 2025-01-14T13:35:58.918490Z INFO EnvHandler ExtHandler Gateway:None
Jan 14 13:35:58.918566 waagent[1961]: 2025-01-14T13:35:58.918541Z INFO EnvHandler ExtHandler Routes:None
Jan 14 13:35:58.919018 waagent[1961]: 2025-01-14T13:35:58.918897Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread
Jan 14 13:35:58.919018 waagent[1961]: 2025-01-14T13:35:58.918949Z INFO ExtHandler ExtHandler Start Extension Telemetry service.
Jan 14 13:35:58.919589 waagent[1961]: 2025-01-14T13:35:58.919528Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True
Jan 14 13:35:58.919808 waagent[1961]: 2025-01-14T13:35:58.919765Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status.
Jan 14 13:35:58.920481 waagent[1961]: 2025-01-14T13:35:58.920418Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread
Jan 14 13:35:58.925639 waagent[1961]: 2025-01-14T13:35:58.925583Z INFO ExtHandler ExtHandler
Jan 14 13:35:58.926157 waagent[1961]: 2025-01-14T13:35:58.926104Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: 529b48da-9b60-4fa2-bde9-e373ed65a8eb correlation 854e4af6-02de-4ffe-8bc6-4970247d73a6 created: 2025-01-14T13:34:43.312973Z]
Jan 14 13:35:58.927074 waagent[1961]: 2025-01-14T13:35:58.927017Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything.
Jan 14 13:35:58.929043 waagent[1961]: 2025-01-14T13:35:58.928356Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 2 ms]
Jan 14 13:35:58.969681 waagent[1961]: 2025-01-14T13:35:58.969618Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.9.1.1 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: 66FB5469-19E3-417E-BC6B-39496511EB6A;DroppedPackets: 0;UpdateGSErrors: 0;AutoUpdate: 0]
Jan 14 13:35:58.977499 waagent[1961]: 2025-01-14T13:35:58.977064Z INFO MonitorHandler ExtHandler Network interfaces:
Jan 14 13:35:58.977499 waagent[1961]: Executing ['ip', '-a', '-o', 'link']:
Jan 14 13:35:58.977499 waagent[1961]: 1: lo: <LOOPBACK,UP,LOWER_UP> mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\    link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
Jan 14 13:35:58.977499 waagent[1961]: 2: eth0: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\    link/ether 00:22:48:7b:74:b7 brd ff:ff:ff:ff:ff:ff
Jan 14 13:35:58.977499 waagent[1961]: 3: enP32736s1: <BROADCAST,MULTICAST,SLAVE,UP,LOWER_UP> mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\    link/ether 00:22:48:7b:74:b7 brd ff:ff:ff:ff:ff:ff\    altname enP32736p0s2
Jan 14 13:35:58.977499 waagent[1961]: Executing ['ip', '-4', '-a', '-o', 'address']:
Jan 14 13:35:58.977499 waagent[1961]: 1: lo    inet 127.0.0.1/8 scope host lo\       valid_lft forever preferred_lft forever
Jan 14 13:35:58.977499 waagent[1961]: 2: eth0    inet 10.200.20.15/24 metric 1024 brd 10.200.20.255 scope global eth0\       valid_lft forever preferred_lft forever
Jan 14 13:35:58.977499 waagent[1961]: Executing ['ip', '-6', '-a', '-o', 'address']:
Jan 14 13:35:58.977499 waagent[1961]: 1: lo    inet6 ::1/128 scope host noprefixroute \       valid_lft forever preferred_lft forever
Jan 14 13:35:58.977499 waagent[1961]: 2: eth0    inet6 fe80::222:48ff:fe7b:74b7/64 scope link proto kernel_ll \       valid_lft forever preferred_lft forever
Jan 14 13:35:58.977499 waagent[1961]: 3: enP32736s1    inet6 fe80::222:48ff:fe7b:74b7/64 scope link proto kernel_ll \       valid_lft forever preferred_lft forever
Jan 14 13:35:59.018315 waagent[1961]: 2025-01-14T13:35:59.018244Z INFO EnvHandler ExtHandler Successfully added Azure fabric firewall rules. Current Firewall rules:
Jan 14 13:35:59.018315 waagent[1961]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
Jan 14 13:35:59.018315 waagent[1961]:     pkts      bytes target     prot opt in     out     source               destination
Jan 14 13:35:59.018315 waagent[1961]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
Jan 14 13:35:59.018315 waagent[1961]:     pkts      bytes target     prot opt in     out     source               destination
Jan 14 13:35:59.018315 waagent[1961]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes)
Jan 14 13:35:59.018315 waagent[1961]:     pkts      bytes target     prot opt in     out     source               destination
Jan 14 13:35:59.018315 waagent[1961]:        0        0 ACCEPT     tcp  --  *      *       0.0.0.0/0            168.63.129.16        tcp dpt:53
Jan 14 13:35:59.018315 waagent[1961]:        0        0 ACCEPT     tcp  --  *      *       0.0.0.0/0            168.63.129.16        owner UID match 0
Jan 14 13:35:59.018315 waagent[1961]:        0        0 DROP       tcp  --  *      *       0.0.0.0/0            168.63.129.16        ctstate INVALID,NEW
Jan 14 13:35:59.021194 waagent[1961]: 2025-01-14T13:35:59.021134Z INFO EnvHandler ExtHandler Current Firewall rules:
Jan 14 13:35:59.021194 waagent[1961]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
Jan 14 13:35:59.021194 waagent[1961]:     pkts      bytes target     prot opt in     out     source               destination
Jan 14 13:35:59.021194 waagent[1961]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
Jan 14 13:35:59.021194 waagent[1961]:     pkts      bytes target     prot opt in     out     source               destination
Jan 14 13:35:59.021194 waagent[1961]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes)
Jan 14 13:35:59.021194 waagent[1961]:     pkts      bytes target     prot opt in     out     source               destination
Jan 14 13:35:59.021194 waagent[1961]:        0        0 ACCEPT     tcp  --  *      *       0.0.0.0/0            168.63.129.16        tcp dpt:53
Jan 14 13:35:59.021194 waagent[1961]:        0        0 ACCEPT     tcp  --  *      *       0.0.0.0/0            168.63.129.16        owner UID match 0
Jan 14 13:35:59.021194 waagent[1961]:        0        0 DROP       tcp  --  *      *       0.0.0.0/0            168.63.129.16        ctstate INVALID,NEW
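
The three OUTPUT rules above gate traffic to 168.63.129.16 (the Azure wireserver endpoint): DNS over TCP 53 is accepted, root-owned (UID 0) traffic is accepted, and any other new or invalid connection is dropped. A rough equivalent expressed as iptables invocations, illustrative only and not the agent's own code (assumes iptables is present and the script runs as root):

    # Illustrative only: iptables commands equivalent to the OUTPUT rules logged above.
    import subprocess

    WIRESERVER = "168.63.129.16"

    RULES = [
        # Allow DNS to the wireserver.
        ["-A", "OUTPUT", "-d", WIRESERVER, "-p", "tcp", "--dport", "53", "-j", "ACCEPT"],
        # Allow root-owned traffic (the agent itself runs as UID 0).
        ["-A", "OUTPUT", "-d", WIRESERVER, "-p", "tcp",
         "-m", "owner", "--uid-owner", "0", "-j", "ACCEPT"],
        # Drop anything else trying to open a new (or invalid) connection.
        ["-A", "OUTPUT", "-d", WIRESERVER, "-p", "tcp",
         "-m", "conntrack", "--ctstate", "INVALID,NEW", "-j", "DROP"],
    ]

    for rule in RULES:
        subprocess.run(["iptables", "-w"] + rule, check=True)
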
Jan 14 13:35:59.021430 waagent[1961]: 2025-01-14T13:35:59.021392Z INFO EnvHandler ExtHandler Set block dev timeout: sda with timeout: 300
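
"Set block dev timeout: sda with timeout: 300" corresponds to the SCSI command timeout that the kernel exposes under sysfs. A minimal sketch of that write, assuming the standard /sys/block/<dev>/device/timeout attribute (requires root):

    # Sketch: set the SCSI command timeout for a block device via sysfs,
    # mirroring the value in the log line above.
    def set_block_dev_timeout(dev: str, seconds: int) -> None:
        with open(f"/sys/block/{dev}/device/timeout", "w") as f:
            f.write(str(seconds))

    # set_block_dev_timeout("sda", 300)
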
Jan 14 13:36:04.355252 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Jan 14 13:36:04.365239 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 14 13:36:04.445702 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 14 13:36:04.449129 (kubelet)[2191]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jan 14 13:36:04.526570 kubelet[2191]: E0114 13:36:04.526512    2191 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jan 14 13:36:04.529337 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jan 14 13:36:04.529457 systemd[1]: kubelet.service: Failed with result 'exit-code'.
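
The kubelet exits immediately because /var/lib/kubelet/config.yaml has not been written yet (kubeadm generates it during init/join), and systemd reschedules the unit roughly every ten seconds, which matches the cadence of the restart-counter entries around this point. A small sketch of the same precondition check, purely illustrative:

    # Illustrative: poll for the kubeadm-generated kubelet config the unit is failing on.
    import os
    import time

    CONFIG = "/var/lib/kubelet/config.yaml"

    def wait_for_kubelet_config(interval: float = 10.0) -> None:
        while not os.path.exists(CONFIG):
            # Same condition the kubelet reports: "no such file or directory".
            print(f"{CONFIG} missing; retrying in {interval:.0f}s")
            time.sleep(interval)
        print(f"{CONFIG} present; kubelet can start")
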
Jan 14 13:36:14.605357 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Jan 14 13:36:14.612192 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 14 13:36:14.713681 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 14 13:36:14.725377 (kubelet)[2207]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jan 14 13:36:14.781472 kubelet[2207]: E0114 13:36:14.781412    2207 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jan 14 13:36:14.784441 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jan 14 13:36:14.784577 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jan 14 13:36:16.177410 chronyd[1717]: Selected source PHC0
Jan 14 13:36:24.855321 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Jan 14 13:36:24.865148 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 14 13:36:24.943679 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 14 13:36:24.946956 (kubelet)[2224]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jan 14 13:36:25.048056 kubelet[2224]: E0114 13:36:25.048009    2224 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jan 14 13:36:25.050150 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jan 14 13:36:25.050268 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jan 14 13:36:35.105323 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Jan 14 13:36:35.111189 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 14 13:36:35.194193 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 14 13:36:35.197851 (kubelet)[2240]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jan 14 13:36:35.259258 kubelet[2240]: E0114 13:36:35.259175    2240 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jan 14 13:36:35.261499 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jan 14 13:36:35.261620 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jan 14 13:36:35.839316 kernel: hv_balloon: Max. dynamic memory size: 4096 MB
Jan 14 13:36:37.402114 update_engine[1740]: I20250114 13:36:37.402030  1740 update_attempter.cc:509] Updating boot flags...
Jan 14 13:36:37.476159 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 38 scanned by (udev-worker) (2263)
Jan 14 13:36:37.565017 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 38 scanned by (udev-worker) (2269)
Jan 14 13:36:45.355313 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
Jan 14 13:36:45.363211 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 14 13:36:45.439265 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 14 13:36:45.442427 (kubelet)[2370]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jan 14 13:36:45.479485 kubelet[2370]: E0114 13:36:45.479428    2370 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jan 14 13:36:45.481373 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jan 14 13:36:45.481490 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jan 14 13:36:50.313978 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Jan 14 13:36:50.315889 systemd[1]: Started sshd@0-10.200.20.15:22-10.200.16.10:33166.service - OpenSSH per-connection server daemon (10.200.16.10:33166).
Jan 14 13:36:50.965402 sshd[2380]: Accepted publickey for core from 10.200.16.10 port 33166 ssh2: RSA SHA256:AMUBWb04LkINjl6iymCQ58zI8KSkiZGdP88JbHPzCuU
Jan 14 13:36:50.966651 sshd-session[2380]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 13:36:50.970460 systemd-logind[1736]: New session 3 of user core.
Jan 14 13:36:50.974126 systemd[1]: Started session-3.scope - Session 3 of User core.
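
The "SHA256:AMUB..." string in the Accepted publickey lines is OpenSSH's fingerprint format: unpadded base64 of the SHA-256 digest of the raw public key blob. A small sketch that computes it from an authorized_keys-style line (helper name is illustrative):

    # Compute an OpenSSH-style SHA256 fingerprint from an authorized_keys entry.
    import base64
    import hashlib

    def ssh_fingerprint(authorized_keys_line: str) -> str:
        # Format: "<type> <base64-blob> [comment]"
        blob_b64 = authorized_keys_line.split()[1]
        digest = hashlib.sha256(base64.b64decode(blob_b64)).digest()
        # OpenSSH prints the base64 digest without '=' padding.
        return "SHA256:" + base64.b64encode(digest).decode().rstrip("=")

    # Example with a hypothetical key:
    # print(ssh_fingerprint("ssh-rsa AAAAB3NzaC1yc2E... core@example"))
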
Jan 14 13:36:51.381159 systemd[1]: Started sshd@1-10.200.20.15:22-10.200.16.10:33170.service - OpenSSH per-connection server daemon (10.200.16.10:33170).
Jan 14 13:36:51.838228 sshd[2385]: Accepted publickey for core from 10.200.16.10 port 33170 ssh2: RSA SHA256:AMUBWb04LkINjl6iymCQ58zI8KSkiZGdP88JbHPzCuU
Jan 14 13:36:51.839467 sshd-session[2385]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 13:36:51.843049 systemd-logind[1736]: New session 4 of user core.
Jan 14 13:36:51.855131 systemd[1]: Started session-4.scope - Session 4 of User core.
Jan 14 13:36:52.175319 sshd[2387]: Connection closed by 10.200.16.10 port 33170
Jan 14 13:36:52.175846 sshd-session[2385]: pam_unix(sshd:session): session closed for user core
Jan 14 13:36:52.178986 systemd[1]: sshd@1-10.200.20.15:22-10.200.16.10:33170.service: Deactivated successfully.
Jan 14 13:36:52.180449 systemd[1]: session-4.scope: Deactivated successfully.
Jan 14 13:36:52.181034 systemd-logind[1736]: Session 4 logged out. Waiting for processes to exit.
Jan 14 13:36:52.182111 systemd-logind[1736]: Removed session 4.
Jan 14 13:36:52.263360 systemd[1]: Started sshd@2-10.200.20.15:22-10.200.16.10:33174.service - OpenSSH per-connection server daemon (10.200.16.10:33174).
Jan 14 13:36:52.715667 sshd[2392]: Accepted publickey for core from 10.200.16.10 port 33174 ssh2: RSA SHA256:AMUBWb04LkINjl6iymCQ58zI8KSkiZGdP88JbHPzCuU
Jan 14 13:36:52.716897 sshd-session[2392]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 13:36:52.720426 systemd-logind[1736]: New session 5 of user core.
Jan 14 13:36:52.728124 systemd[1]: Started session-5.scope - Session 5 of User core.
Jan 14 13:36:53.048417 sshd[2394]: Connection closed by 10.200.16.10 port 33174
Jan 14 13:36:53.047908 sshd-session[2392]: pam_unix(sshd:session): session closed for user core
Jan 14 13:36:53.050651 systemd[1]: sshd@2-10.200.20.15:22-10.200.16.10:33174.service: Deactivated successfully.
Jan 14 13:36:53.052395 systemd[1]: session-5.scope: Deactivated successfully.
Jan 14 13:36:53.053757 systemd-logind[1736]: Session 5 logged out. Waiting for processes to exit.
Jan 14 13:36:53.054730 systemd-logind[1736]: Removed session 5.
Jan 14 13:36:53.135331 systemd[1]: Started sshd@3-10.200.20.15:22-10.200.16.10:33190.service - OpenSSH per-connection server daemon (10.200.16.10:33190).
Jan 14 13:36:53.591258 sshd[2399]: Accepted publickey for core from 10.200.16.10 port 33190 ssh2: RSA SHA256:AMUBWb04LkINjl6iymCQ58zI8KSkiZGdP88JbHPzCuU
Jan 14 13:36:53.592513 sshd-session[2399]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 13:36:53.596055 systemd-logind[1736]: New session 6 of user core.
Jan 14 13:36:53.604133 systemd[1]: Started session-6.scope - Session 6 of User core.
Jan 14 13:36:53.929120 sshd[2401]: Connection closed by 10.200.16.10 port 33190
Jan 14 13:36:53.928986 sshd-session[2399]: pam_unix(sshd:session): session closed for user core
Jan 14 13:36:53.932393 systemd[1]: sshd@3-10.200.20.15:22-10.200.16.10:33190.service: Deactivated successfully.
Jan 14 13:36:53.934056 systemd[1]: session-6.scope: Deactivated successfully.
Jan 14 13:36:53.934636 systemd-logind[1736]: Session 6 logged out. Waiting for processes to exit.
Jan 14 13:36:53.935518 systemd-logind[1736]: Removed session 6.
Jan 14 13:36:54.011789 systemd[1]: Started sshd@4-10.200.20.15:22-10.200.16.10:33206.service - OpenSSH per-connection server daemon (10.200.16.10:33206).
Jan 14 13:36:54.468542 sshd[2406]: Accepted publickey for core from 10.200.16.10 port 33206 ssh2: RSA SHA256:AMUBWb04LkINjl6iymCQ58zI8KSkiZGdP88JbHPzCuU
Jan 14 13:36:54.469781 sshd-session[2406]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 13:36:54.474656 systemd-logind[1736]: New session 7 of user core.
Jan 14 13:36:54.480145 systemd[1]: Started session-7.scope - Session 7 of User core.
Jan 14 13:36:54.858270 sudo[2409]:     core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Jan 14 13:36:54.858567 sudo[2409]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Jan 14 13:36:54.870795 sudo[2409]: pam_unix(sudo:session): session closed for user root
Jan 14 13:36:54.945480 sshd[2408]: Connection closed by 10.200.16.10 port 33206
Jan 14 13:36:54.944653 sshd-session[2406]: pam_unix(sshd:session): session closed for user core
Jan 14 13:36:54.947835 systemd-logind[1736]: Session 7 logged out. Waiting for processes to exit.
Jan 14 13:36:54.948125 systemd[1]: sshd@4-10.200.20.15:22-10.200.16.10:33206.service: Deactivated successfully.
Jan 14 13:36:54.949614 systemd[1]: session-7.scope: Deactivated successfully.
Jan 14 13:36:54.951489 systemd-logind[1736]: Removed session 7.
Jan 14 13:36:55.025747 systemd[1]: Started sshd@5-10.200.20.15:22-10.200.16.10:33218.service - OpenSSH per-connection server daemon (10.200.16.10:33218).
Jan 14 13:36:55.482460 sshd[2414]: Accepted publickey for core from 10.200.16.10 port 33218 ssh2: RSA SHA256:AMUBWb04LkINjl6iymCQ58zI8KSkiZGdP88JbHPzCuU
Jan 14 13:36:55.483771 sshd-session[2414]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 13:36:55.484623 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6.
Jan 14 13:36:55.493172 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 14 13:36:55.498210 systemd-logind[1736]: New session 8 of user core.
Jan 14 13:36:55.498774 systemd[1]: Started session-8.scope - Session 8 of User core.
Jan 14 13:36:55.578272 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 14 13:36:55.582054 (kubelet)[2425]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jan 14 13:36:55.629904 kubelet[2425]: E0114 13:36:55.629829    2425 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jan 14 13:36:55.632533 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jan 14 13:36:55.632688 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jan 14 13:36:55.739878 sudo[2435]:     core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Jan 14 13:36:55.740304 sudo[2435]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Jan 14 13:36:55.743487 sudo[2435]: pam_unix(sudo:session): session closed for user root
Jan 14 13:36:55.747795 sudo[2434]:     core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Jan 14 13:36:55.748311 sudo[2434]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Jan 14 13:36:55.766280 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Jan 14 13:36:55.787010 augenrules[2457]: No rules
Jan 14 13:36:55.788372 systemd[1]: audit-rules.service: Deactivated successfully.
Jan 14 13:36:55.788549 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Jan 14 13:36:55.790190 sudo[2434]: pam_unix(sudo:session): session closed for user root
Jan 14 13:36:55.871664 sshd[2419]: Connection closed by 10.200.16.10 port 33218
Jan 14 13:36:55.872222 sshd-session[2414]: pam_unix(sshd:session): session closed for user core
Jan 14 13:36:55.874986 systemd[1]: sshd@5-10.200.20.15:22-10.200.16.10:33218.service: Deactivated successfully.
Jan 14 13:36:55.876516 systemd[1]: session-8.scope: Deactivated successfully.
Jan 14 13:36:55.877870 systemd-logind[1736]: Session 8 logged out. Waiting for processes to exit.
Jan 14 13:36:55.878873 systemd-logind[1736]: Removed session 8.
Jan 14 13:36:55.955190 systemd[1]: Started sshd@6-10.200.20.15:22-10.200.16.10:47846.service - OpenSSH per-connection server daemon (10.200.16.10:47846).
Jan 14 13:36:56.407088 sshd[2465]: Accepted publickey for core from 10.200.16.10 port 47846 ssh2: RSA SHA256:AMUBWb04LkINjl6iymCQ58zI8KSkiZGdP88JbHPzCuU
Jan 14 13:36:56.408350 sshd-session[2465]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 13:36:56.411915 systemd-logind[1736]: New session 9 of user core.
Jan 14 13:36:56.419162 systemd[1]: Started session-9.scope - Session 9 of User core.
Jan 14 13:36:56.661665 sudo[2468]:     core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Jan 14 13:36:56.662477 sudo[2468]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Jan 14 13:36:57.621298 systemd[1]: Starting docker.service - Docker Application Container Engine...
Jan 14 13:36:57.621418 (dockerd)[2487]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Jan 14 13:36:58.292232 dockerd[2487]: time="2025-01-14T13:36:58.292174255Z" level=info msg="Starting up"
Jan 14 13:36:58.588676 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport3630169517-merged.mount: Deactivated successfully.
Jan 14 13:36:58.637784 dockerd[2487]: time="2025-01-14T13:36:58.637730064Z" level=info msg="Loading containers: start."
Jan 14 13:36:58.836027 kernel: Initializing XFRM netlink socket
Jan 14 13:36:58.942764 systemd-networkd[1331]: docker0: Link UP
Jan 14 13:36:58.972119 dockerd[2487]: time="2025-01-14T13:36:58.972037464Z" level=info msg="Loading containers: done."
Jan 14 13:36:58.981700 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1914381345-merged.mount: Deactivated successfully.
Jan 14 13:36:58.990028 dockerd[2487]: time="2025-01-14T13:36:58.989810518Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Jan 14 13:36:58.990028 dockerd[2487]: time="2025-01-14T13:36:58.989906998Z" level=info msg="Docker daemon" commit=41ca978a0a5400cc24b274137efa9f25517fcc0b containerd-snapshotter=false storage-driver=overlay2 version=27.3.1
Jan 14 13:36:58.990146 dockerd[2487]: time="2025-01-14T13:36:58.990049279Z" level=info msg="Daemon has completed initialization"
Jan 14 13:36:59.036604 dockerd[2487]: time="2025-01-14T13:36:59.036535157Z" level=info msg="API listen on /run/docker.sock"
Jan 14 13:36:59.037028 systemd[1]: Started docker.service - Docker Application Container Engine.
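
Once dockerd reports "API listen on /run/docker.sock", the daemon answers plain HTTP over that unix socket. A minimal stdlib-only probe, illustrative and assuming the default socket path is readable by the caller:

    # Minimal health probe against the Docker API over its unix socket.
    import socket

    def docker_ping(sock_path: str = "/run/docker.sock") -> str:
        with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as s:
            s.connect(sock_path)
            s.sendall(b"GET /_ping HTTP/1.0\r\nHost: docker\r\n\r\n")
            chunks = []
            while (data := s.recv(4096)):
                chunks.append(data)
        return b"".join(chunks).decode(errors="replace")

    # A healthy daemon responds 200 with body "OK".
    # print(docker_ping())
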
Jan 14 13:37:00.471198 containerd[1759]: time="2025-01-14T13:37:00.471156997Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.29.12\""
Jan 14 13:37:01.243473 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3260325639.mount: Deactivated successfully.
Jan 14 13:37:03.382056 containerd[1759]: time="2025-01-14T13:37:03.381331670Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.29.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:37:03.385684 containerd[1759]: time="2025-01-14T13:37:03.385631514Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.29.12: active requests=0, bytes read=32201250"
Jan 14 13:37:03.390211 containerd[1759]: time="2025-01-14T13:37:03.390165677Z" level=info msg="ImageCreate event name:\"sha256:50c86b7f73fdd28bacd4abf45260c9d3abc3b57eb038fa61fc45b5d0f2763e6f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:37:03.395663 containerd[1759]: time="2025-01-14T13:37:03.395618522Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:2804b1e7b9e08f3a3468f8fd2f6487c55968b9293ee51b9efb865b3298acfa26\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:37:03.396625 containerd[1759]: time="2025-01-14T13:37:03.396592643Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.29.12\" with image id \"sha256:50c86b7f73fdd28bacd4abf45260c9d3abc3b57eb038fa61fc45b5d0f2763e6f\", repo tag \"registry.k8s.io/kube-apiserver:v1.29.12\", repo digest \"registry.k8s.io/kube-apiserver@sha256:2804b1e7b9e08f3a3468f8fd2f6487c55968b9293ee51b9efb865b3298acfa26\", size \"32198050\" in 2.925397086s"
Jan 14 13:37:03.396855 containerd[1759]: time="2025-01-14T13:37:03.396717403Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.29.12\" returns image reference \"sha256:50c86b7f73fdd28bacd4abf45260c9d3abc3b57eb038fa61fc45b5d0f2763e6f\""
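
The containerd lines report both the image size and the wall-clock pull time, so pull throughput can be read straight off the log; for kube-apiserver that is 32198050 bytes in 2.925397086s, roughly 10.5 MiB/s:

    # Pull throughput from the values logged above.
    size_bytes = 32_198_050        # size "32198050"
    seconds = 2.925397086          # "in 2.925397086s"
    print(f"{size_bytes / seconds / 2**20:.1f} MiB/s")   # ~10.5 MiB/s
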
Jan 14 13:37:03.414129 containerd[1759]: time="2025-01-14T13:37:03.413933417Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.29.12\""
Jan 14 13:37:05.855328 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7.
Jan 14 13:37:05.863636 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 14 13:37:05.986221 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 14 13:37:05.997730 (kubelet)[2750]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jan 14 13:37:06.057102 kubelet[2750]: E0114 13:37:06.057018    2750 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jan 14 13:37:06.061169 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jan 14 13:37:06.061543 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jan 14 13:37:06.297128 containerd[1759]: time="2025-01-14T13:37:06.296770836Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.29.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:37:06.301406 containerd[1759]: time="2025-01-14T13:37:06.301131440Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.29.12: active requests=0, bytes read=29381297"
Jan 14 13:37:06.305779 containerd[1759]: time="2025-01-14T13:37:06.305712724Z" level=info msg="ImageCreate event name:\"sha256:2d47abaa6ccc533f84ef74fff6d509de10bb040317351b45afe95a8021a1ddf7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:37:06.311865 containerd[1759]: time="2025-01-14T13:37:06.311785809Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:e2f26a3f5ef3fd01f6330cab8b078cf303cfb6d36911a210d0915d535910e412\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:37:06.313007 containerd[1759]: time="2025-01-14T13:37:06.312858050Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.29.12\" with image id \"sha256:2d47abaa6ccc533f84ef74fff6d509de10bb040317351b45afe95a8021a1ddf7\", repo tag \"registry.k8s.io/kube-controller-manager:v1.29.12\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:e2f26a3f5ef3fd01f6330cab8b078cf303cfb6d36911a210d0915d535910e412\", size \"30783618\" in 2.898890353s"
Jan 14 13:37:06.313007 containerd[1759]: time="2025-01-14T13:37:06.312893610Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.29.12\" returns image reference \"sha256:2d47abaa6ccc533f84ef74fff6d509de10bb040317351b45afe95a8021a1ddf7\""
Jan 14 13:37:06.336532 containerd[1759]: time="2025-01-14T13:37:06.336483069Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.29.12\""
Jan 14 13:37:07.642946 containerd[1759]: time="2025-01-14T13:37:07.641982286Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.29.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:37:07.645544 containerd[1759]: time="2025-01-14T13:37:07.645316049Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.29.12: active requests=0, bytes read=15765640"
Jan 14 13:37:07.649097 containerd[1759]: time="2025-01-14T13:37:07.649068612Z" level=info msg="ImageCreate event name:\"sha256:ae633c52a23907b58f7a7867d2cccf3d3f5ebd8977beb6788e20fbecd3f446db\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:37:07.657678 containerd[1759]: time="2025-01-14T13:37:07.657606299Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:ed66e2102f4705d45de7513decf3ac61879704984409323779d19e98b970568c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:37:07.659045 containerd[1759]: time="2025-01-14T13:37:07.658676980Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.29.12\" with image id \"sha256:ae633c52a23907b58f7a7867d2cccf3d3f5ebd8977beb6788e20fbecd3f446db\", repo tag \"registry.k8s.io/kube-scheduler:v1.29.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:ed66e2102f4705d45de7513decf3ac61879704984409323779d19e98b970568c\", size \"17167979\" in 1.322146591s"
Jan 14 13:37:07.659045 containerd[1759]: time="2025-01-14T13:37:07.658716060Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.29.12\" returns image reference \"sha256:ae633c52a23907b58f7a7867d2cccf3d3f5ebd8977beb6788e20fbecd3f446db\""
Jan 14 13:37:07.678875 containerd[1759]: time="2025-01-14T13:37:07.678823557Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.29.12\""
Jan 14 13:37:09.598177 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4114145644.mount: Deactivated successfully.
Jan 14 13:37:10.474146 containerd[1759]: time="2025-01-14T13:37:10.474074786Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.29.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:37:10.477819 containerd[1759]: time="2025-01-14T13:37:10.477757669Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.29.12: active requests=0, bytes read=25273977"
Jan 14 13:37:10.480774 containerd[1759]: time="2025-01-14T13:37:10.480733471Z" level=info msg="ImageCreate event name:\"sha256:768ee8cfd9311233d038d18430c18136e1ae4dd2e6de40fcf1c670bba2da6d06\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:37:10.484145 containerd[1759]: time="2025-01-14T13:37:10.484069834Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:bc761494b78fa152a759457f42bc9b86ee9d18f5929bb127bd5f72f8e2112c39\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:37:10.484798 containerd[1759]: time="2025-01-14T13:37:10.484649315Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.29.12\" with image id \"sha256:768ee8cfd9311233d038d18430c18136e1ae4dd2e6de40fcf1c670bba2da6d06\", repo tag \"registry.k8s.io/kube-proxy:v1.29.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:bc761494b78fa152a759457f42bc9b86ee9d18f5929bb127bd5f72f8e2112c39\", size \"25272996\" in 2.805759238s"
Jan 14 13:37:10.484798 containerd[1759]: time="2025-01-14T13:37:10.484686835Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.29.12\" returns image reference \"sha256:768ee8cfd9311233d038d18430c18136e1ae4dd2e6de40fcf1c670bba2da6d06\""
Jan 14 13:37:10.503936 containerd[1759]: time="2025-01-14T13:37:10.503887411Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\""
Jan 14 13:37:11.170560 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount355866204.mount: Deactivated successfully.
Jan 14 13:37:12.129032 containerd[1759]: time="2025-01-14T13:37:12.128525176Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:37:12.131876 containerd[1759]: time="2025-01-14T13:37:12.131632339Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=16485381"
Jan 14 13:37:12.134428 containerd[1759]: time="2025-01-14T13:37:12.134372101Z" level=info msg="ImageCreate event name:\"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:37:12.139877 containerd[1759]: time="2025-01-14T13:37:12.139806785Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:37:12.141347 containerd[1759]: time="2025-01-14T13:37:12.140888786Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"16482581\" in 1.636955815s"
Jan 14 13:37:12.141347 containerd[1759]: time="2025-01-14T13:37:12.140928706Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\""
Jan 14 13:37:12.163297 containerd[1759]: time="2025-01-14T13:37:12.163256965Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\""
Jan 14 13:37:12.796044 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1029512243.mount: Deactivated successfully.
Jan 14 13:37:12.822030 containerd[1759]: time="2025-01-14T13:37:12.821893959Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:37:12.824041 containerd[1759]: time="2025-01-14T13:37:12.823981040Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=268821"
Jan 14 13:37:12.829627 containerd[1759]: time="2025-01-14T13:37:12.829573365Z" level=info msg="ImageCreate event name:\"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:37:12.834360 containerd[1759]: time="2025-01-14T13:37:12.834299569Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:37:12.835268 containerd[1759]: time="2025-01-14T13:37:12.835029730Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"268051\" in 671.731765ms"
Jan 14 13:37:12.835268 containerd[1759]: time="2025-01-14T13:37:12.835065610Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\""
Jan 14 13:37:12.858201 containerd[1759]: time="2025-01-14T13:37:12.857483549Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.10-0\""
Jan 14 13:37:13.557492 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount436538097.mount: Deactivated successfully.
Jan 14 13:37:16.105288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 8.
Jan 14 13:37:16.114235 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 14 13:37:16.239243 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 14 13:37:16.241380 (kubelet)[2894]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jan 14 13:37:16.292645 kubelet[2894]: E0114 13:37:16.292156    2894 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jan 14 13:37:16.295393 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jan 14 13:37:16.295527 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jan 14 13:37:16.848349 containerd[1759]: time="2025-01-14T13:37:16.848299663Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.10-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:37:16.851929 containerd[1759]: time="2025-01-14T13:37:16.851854906Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.10-0: active requests=0, bytes read=65200786"
Jan 14 13:37:16.856531 containerd[1759]: time="2025-01-14T13:37:16.856478070Z" level=info msg="ImageCreate event name:\"sha256:79f8d13ae8b8839cadfb2f83416935f5184206d386028e2d1263577f0ab3620b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:37:16.861985 containerd[1759]: time="2025-01-14T13:37:16.861911595Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:22f892d7672adc0b9c86df67792afdb8b2dc08880f49f669eaaa59c47d7908c2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:37:16.863346 containerd[1759]: time="2025-01-14T13:37:16.863012236Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.10-0\" with image id \"sha256:79f8d13ae8b8839cadfb2f83416935f5184206d386028e2d1263577f0ab3620b\", repo tag \"registry.k8s.io/etcd:3.5.10-0\", repo digest \"registry.k8s.io/etcd@sha256:22f892d7672adc0b9c86df67792afdb8b2dc08880f49f669eaaa59c47d7908c2\", size \"65198393\" in 4.005468966s"
Jan 14 13:37:16.863346 containerd[1759]: time="2025-01-14T13:37:16.863052196Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.10-0\" returns image reference \"sha256:79f8d13ae8b8839cadfb2f83416935f5184206d386028e2d1263577f0ab3620b\""
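
With etcd pulled, the node appears to have the full set kubeadm pre-pulls for a control-plane member of this release line (kube-apiserver, kube-controller-manager, kube-scheduler, kube-proxy, coredns, pause, etcd). Summing the sizes reported in the "Pulled image" lines above gives roughly 187 MB fetched before the control plane starts:

    # Sizes (bytes) exactly as reported by containerd above.
    images = {
        "kube-apiserver:v1.29.12": 32_198_050,
        "kube-controller-manager:v1.29.12": 30_783_618,
        "kube-scheduler:v1.29.12": 17_167_979,
        "kube-proxy:v1.29.12": 25_272_996,
        "coredns:v1.11.1": 16_482_581,
        "pause:3.9": 268_051,
        "etcd:3.5.10-0": 65_198_393,
    }
    total = sum(images.values())
    print(total, "bytes ~", round(total / 1e6), "MB")   # 187371668 bytes ~ 187 MB
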
Jan 14 13:37:21.668747 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 14 13:37:21.679259 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 14 13:37:21.709292 systemd[1]: Reloading requested from client PID 2965 ('systemctl') (unit session-9.scope)...
Jan 14 13:37:21.709308 systemd[1]: Reloading...
Jan 14 13:37:21.828132 zram_generator::config[3011]: No configuration found.
Jan 14 13:37:21.910412 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jan 14 13:37:21.987898 systemd[1]: Reloading finished in 278 ms.
Jan 14 13:37:22.459433 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Jan 14 13:37:22.459525 systemd[1]: kubelet.service: Failed with result 'signal'.
Jan 14 13:37:22.459932 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 14 13:37:22.465415 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 14 13:37:22.565588 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 14 13:37:22.570953 (kubelet)[3069]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Jan 14 13:37:22.614525 kubelet[3069]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 14 13:37:22.614525 kubelet[3069]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Jan 14 13:37:22.614525 kubelet[3069]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 14 13:37:22.614872 kubelet[3069]: I0114 13:37:22.614524    3069 server.go:204] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Jan 14 13:37:23.403386 kubelet[3069]: I0114 13:37:23.403350    3069 server.go:487] "Kubelet version" kubeletVersion="v1.29.2"
Jan 14 13:37:23.403764 kubelet[3069]: I0114 13:37:23.403571    3069 server.go:489] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Jan 14 13:37:23.405811 kubelet[3069]: I0114 13:37:23.405756    3069 server.go:919] "Client rotation is on, will bootstrap in background"
Jan 14 13:37:23.422921 kubelet[3069]: I0114 13:37:23.422711    3069 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Jan 14 13:37:23.423121 kubelet[3069]: E0114 13:37:23.423093    3069 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://10.200.20.15:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:23.433099 kubelet[3069]: I0114 13:37:23.433063    3069 server.go:745] "--cgroups-per-qos enabled, but --cgroup-root was not specified.  defaulting to /"
Jan 14 13:37:23.433890 kubelet[3069]: I0114 13:37:23.433850    3069 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Jan 14 13:37:23.434089 kubelet[3069]: I0114 13:37:23.434063    3069 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null}
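
The nodeConfig dump above carries the default hard eviction thresholds: memory.available < 100Mi, nodefs.available < 10%, nodefs.inodesFree < 5%, imagefs.available < 15%. A small worked check of how such a threshold reads, illustrative only and not kubelet code:

    # Illustrative evaluation of a hard eviction threshold from the nodeConfig above.
    def breaches(available: float, capacity: float, quantity=None, percentage=None) -> bool:
        # A signal breaches when the available amount drops below the threshold,
        # given either as an absolute quantity or as a fraction of capacity.
        limit = quantity if quantity is not None else percentage * capacity
        return available < limit

    # e.g. nodefs.available < 10% on a 30 GiB filesystem with 2 GiB free:
    print(breaches(available=2 * 2**30, capacity=30 * 2**30, percentage=0.10))  # True
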
Jan 14 13:37:23.434089 kubelet[3069]: I0114 13:37:23.434092    3069 topology_manager.go:138] "Creating topology manager with none policy"
Jan 14 13:37:23.434212 kubelet[3069]: I0114 13:37:23.434102    3069 container_manager_linux.go:301] "Creating device plugin manager"
Jan 14 13:37:23.435371 kubelet[3069]: I0114 13:37:23.435342    3069 state_mem.go:36] "Initialized new in-memory state store"
Jan 14 13:37:23.437783 kubelet[3069]: I0114 13:37:23.437749    3069 kubelet.go:396] "Attempting to sync node with API server"
Jan 14 13:37:23.437783 kubelet[3069]: I0114 13:37:23.437783    3069 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests"
Jan 14 13:37:23.438059 kubelet[3069]: I0114 13:37:23.437809    3069 kubelet.go:312] "Adding apiserver pod source"
Jan 14 13:37:23.438059 kubelet[3069]: I0114 13:37:23.437824    3069 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
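
"Adding static pod path" means the kubelet will also run pods described by manifest files under /etc/kubernetes/manifests; that is how the control-plane pods admitted further down (kube-apiserver, kube-controller-manager, kube-scheduler) can start while the API server itself is still unreachable. A sketch that simply lists what would be picked up there (the extension filter is illustrative):

    # List static pod manifests the kubelet would load from its staticPodPath.
    import os

    MANIFESTS = "/etc/kubernetes/manifests"

    def static_pod_manifests(path: str = MANIFESTS) -> list[str]:
        if not os.path.isdir(path):
            return []
        return sorted(
            os.path.join(path, name)
            for name in os.listdir(path)
            if name.endswith((".yaml", ".yml", ".json"))
        )

    # print(static_pod_manifests())
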
Jan 14 13:37:23.441611 kubelet[3069]: W0114 13:37:23.441235    3069 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Service: Get "https://10.200.20.15:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:23.441611 kubelet[3069]: E0114 13:37:23.441287    3069 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.200.20.15:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:23.441611 kubelet[3069]: W0114 13:37:23.441532    3069 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Node: Get "https://10.200.20.15:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4186.1.0-a-e83668d6e0&limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:23.441611 kubelet[3069]: E0114 13:37:23.441562    3069 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.200.20.15:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4186.1.0-a-e83668d6e0&limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:23.441959 kubelet[3069]: I0114 13:37:23.441923    3069 kuberuntime_manager.go:258] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1"
Jan 14 13:37:23.442259 kubelet[3069]: I0114 13:37:23.442232    3069 kubelet.go:809] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Jan 14 13:37:23.442306 kubelet[3069]: W0114 13:37:23.442300    3069 probe.go:268] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Jan 14 13:37:23.443087 kubelet[3069]: I0114 13:37:23.443041    3069 server.go:1256] "Started kubelet"
Jan 14 13:37:23.444047 kubelet[3069]: I0114 13:37:23.443795    3069 server.go:162] "Starting to listen" address="0.0.0.0" port=10250
Jan 14 13:37:23.444645 kubelet[3069]: I0114 13:37:23.444585    3069 server.go:461] "Adding debug handlers to kubelet server"
Jan 14 13:37:23.445911 kubelet[3069]: I0114 13:37:23.445419    3069 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Jan 14 13:37:23.445911 kubelet[3069]: I0114 13:37:23.445670    3069 server.go:233] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Jan 14 13:37:23.447367 kubelet[3069]: I0114 13:37:23.447331    3069 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Jan 14 13:37:23.448817 kubelet[3069]: E0114 13:37:23.448305    3069 event.go:355] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.20.15:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.15:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4186.1.0-a-e83668d6e0.181a92a4f259d658  default    0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4186.1.0-a-e83668d6e0,UID:ci-4186.1.0-a-e83668d6e0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4186.1.0-a-e83668d6e0,},FirstTimestamp:2025-01-14 13:37:23.443013208 +0000 UTC m=+0.868285411,LastTimestamp:2025-01-14 13:37:23.443013208 +0000 UTC m=+0.868285411,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4186.1.0-a-e83668d6e0,}"
Jan 14 13:37:23.452569 kubelet[3069]: I0114 13:37:23.452513    3069 volume_manager.go:291] "Starting Kubelet Volume Manager"
Jan 14 13:37:23.453157 kubelet[3069]: I0114 13:37:23.453124    3069 desired_state_of_world_populator.go:151] "Desired state populator starts to run"
Jan 14 13:37:23.453240 kubelet[3069]: I0114 13:37:23.453218    3069 reconciler_new.go:29] "Reconciler: start to sync state"
Jan 14 13:37:23.454030 kubelet[3069]: W0114 13:37:23.453767    3069 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.CSIDriver: Get "https://10.200.20.15:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:23.454030 kubelet[3069]: E0114 13:37:23.453835    3069 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.200.20.15:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:23.454030 kubelet[3069]: E0114 13:37:23.453935    3069 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.15:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4186.1.0-a-e83668d6e0?timeout=10s\": dial tcp 10.200.20.15:6443: connect: connection refused" interval="200ms"
Jan 14 13:37:23.455919 kubelet[3069]: I0114 13:37:23.455819    3069 factory.go:221] Registration of the systemd container factory successfully
Jan 14 13:37:23.456235 kubelet[3069]: I0114 13:37:23.455943    3069 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Jan 14 13:37:23.458761 kubelet[3069]: I0114 13:37:23.458681    3069 factory.go:221] Registration of the containerd container factory successfully
Jan 14 13:37:23.465785 kubelet[3069]: E0114 13:37:23.465753    3069 kubelet.go:1462] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Jan 14 13:37:23.475786 kubelet[3069]: I0114 13:37:23.474300    3069 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Jan 14 13:37:23.478257 kubelet[3069]: I0114 13:37:23.477485    3069 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Jan 14 13:37:23.478590 kubelet[3069]: I0114 13:37:23.478560    3069 status_manager.go:217] "Starting to sync pod status with apiserver"
Jan 14 13:37:23.478636 kubelet[3069]: I0114 13:37:23.478596    3069 kubelet.go:2329] "Starting kubelet main sync loop"
Jan 14 13:37:23.479896 kubelet[3069]: E0114 13:37:23.479700    3069 kubelet.go:2353] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Jan 14 13:37:23.479896 kubelet[3069]: W0114 13:37:23.479790    3069 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.RuntimeClass: Get "https://10.200.20.15:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:23.479896 kubelet[3069]: E0114 13:37:23.479815    3069 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.200.20.15:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:23.499233 kubelet[3069]: I0114 13:37:23.499200    3069 cpu_manager.go:214] "Starting CPU manager" policy="none"
Jan 14 13:37:23.499233 kubelet[3069]: I0114 13:37:23.499226    3069 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Jan 14 13:37:23.499395 kubelet[3069]: I0114 13:37:23.499248    3069 state_mem.go:36] "Initialized new in-memory state store"
Jan 14 13:37:23.555055 kubelet[3069]: I0114 13:37:23.555022    3069 kubelet_node_status.go:73] "Attempting to register node" node="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:23.555440 kubelet[3069]: E0114 13:37:23.555417    3069 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.200.20.15:6443/api/v1/nodes\": dial tcp 10.200.20.15:6443: connect: connection refused" node="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:23.580639 kubelet[3069]: E0114 13:37:23.580612    3069 kubelet.go:2353] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Jan 14 13:37:23.655877 kubelet[3069]: E0114 13:37:23.655264    3069 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.15:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4186.1.0-a-e83668d6e0?timeout=10s\": dial tcp 10.200.20.15:6443: connect: connection refused" interval="400ms"
Jan 14 13:37:23.757474 kubelet[3069]: I0114 13:37:23.757143    3069 kubelet_node_status.go:73] "Attempting to register node" node="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:23.757626 kubelet[3069]: E0114 13:37:23.757536    3069 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.200.20.15:6443/api/v1/nodes\": dial tcp 10.200.20.15:6443: connect: connection refused" node="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:23.780856 kubelet[3069]: E0114 13:37:23.780836    3069 kubelet.go:2353] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Jan 14 13:37:24.056359 kubelet[3069]: E0114 13:37:24.056263    3069 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.15:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4186.1.0-a-e83668d6e0?timeout=10s\": dial tcp 10.200.20.15:6443: connect: connection refused" interval="800ms"
Jan 14 13:37:24.159152 kubelet[3069]: I0114 13:37:24.159117    3069 kubelet_node_status.go:73] "Attempting to register node" node="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:24.159450 kubelet[3069]: E0114 13:37:24.159427    3069 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.200.20.15:6443/api/v1/nodes\": dial tcp 10.200.20.15:6443: connect: connection refused" node="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:24.181606 kubelet[3069]: E0114 13:37:24.181578    3069 kubelet.go:2353] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Jan 14 13:37:24.514665 kubelet[3069]: W0114 13:37:24.514631    3069 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.CSIDriver: Get "https://10.200.20.15:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:24.514665 kubelet[3069]: E0114 13:37:24.514671    3069 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.200.20.15:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:24.538049 kubelet[3069]: W0114 13:37:24.537985    3069 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Service: Get "https://10.200.20.15:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:24.538049 kubelet[3069]: E0114 13:37:24.538049    3069 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.200.20.15:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:24.715011 kubelet[3069]: W0114 13:37:24.714955    3069 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.RuntimeClass: Get "https://10.200.20.15:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:24.715011 kubelet[3069]: E0114 13:37:24.715011    3069 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.200.20.15:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:24.857034 kubelet[3069]: E0114 13:37:24.857003    3069 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.15:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4186.1.0-a-e83668d6e0?timeout=10s\": dial tcp 10.200.20.15:6443: connect: connection refused" interval="1.6s"
Jan 14 13:37:24.932772 kubelet[3069]: W0114 13:37:24.932717    3069 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Node: Get "https://10.200.20.15:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4186.1.0-a-e83668d6e0&limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:24.932916 kubelet[3069]: E0114 13:37:24.932787    3069 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.200.20.15:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4186.1.0-a-e83668d6e0&limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:24.961509 kubelet[3069]: I0114 13:37:24.961477    3069 kubelet_node_status.go:73] "Attempting to register node" node="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:24.961812 kubelet[3069]: E0114 13:37:24.961785    3069 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.200.20.15:6443/api/v1/nodes\": dial tcp 10.200.20.15:6443: connect: connection refused" node="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:24.981979 kubelet[3069]: E0114 13:37:24.981957    3069 kubelet.go:2353] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Jan 14 13:37:25.576586 kubelet[3069]: E0114 13:37:25.576556    3069 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://10.200.20.15:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:25.666640 kubelet[3069]: I0114 13:37:25.666327    3069 policy_none.go:49] "None policy: Start"
Jan 14 13:37:25.667349 kubelet[3069]: I0114 13:37:25.667316    3069 memory_manager.go:170] "Starting memorymanager" policy="None"
Jan 14 13:37:25.667457 kubelet[3069]: I0114 13:37:25.667376    3069 state_mem.go:35] "Initializing new in-memory state store"
Jan 14 13:37:25.762170 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Jan 14 13:37:25.772460 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Jan 14 13:37:25.779980 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Jan 14 13:37:25.781909 kubelet[3069]: I0114 13:37:25.781143    3069 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Jan 14 13:37:25.781909 kubelet[3069]: I0114 13:37:25.781403    3069 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Jan 14 13:37:25.783859 kubelet[3069]: E0114 13:37:25.783835    3069 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4186.1.0-a-e83668d6e0\" not found"
Jan 14 13:37:26.458122 kubelet[3069]: E0114 13:37:26.458088    3069 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.15:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4186.1.0-a-e83668d6e0?timeout=10s\": dial tcp 10.200.20.15:6443: connect: connection refused" interval="3.2s"
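
Note the retry interval on the "Failed to ensure lease exists" errors doubling across this section: 200ms, 400ms, 800ms, 1.6s and now 3.2s, i.e. exponential backoff while the API server at 10.200.20.15:6443 still refuses connections. The observed sequence reproduced:

    # Reproduce the lease-controller retry intervals observed above.
    interval_ms = 200                 # first attempt: interval="200ms"
    for _ in range(5):
        print(f"{interval_ms} ms")
        interval_ms *= 2              # doubles each retry: 200, 400, 800, 1600, 3200
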
Jan 14 13:37:26.564163 kubelet[3069]: I0114 13:37:26.564130    3069 kubelet_node_status.go:73] "Attempting to register node" node="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:26.564488 kubelet[3069]: E0114 13:37:26.564463    3069 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.200.20.15:6443/api/v1/nodes\": dial tcp 10.200.20.15:6443: connect: connection refused" node="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:26.569094 kubelet[3069]: W0114 13:37:26.569062    3069 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Service: Get "https://10.200.20.15:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:26.569155 kubelet[3069]: E0114 13:37:26.569104    3069 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.200.20.15:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:26.582096 kubelet[3069]: I0114 13:37:26.582070    3069 topology_manager.go:215] "Topology Admit Handler" podUID="10ccfcf93b347a6d7cb84ddeb2e79cd1" podNamespace="kube-system" podName="kube-apiserver-ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:26.583898 kubelet[3069]: I0114 13:37:26.583853    3069 topology_manager.go:215] "Topology Admit Handler" podUID="5f8e24e91e237b13c726e3ae93873aac" podNamespace="kube-system" podName="kube-controller-manager-ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:26.585886 kubelet[3069]: I0114 13:37:26.585809    3069 topology_manager.go:215] "Topology Admit Handler" podUID="dc4486a3a33639b031f5be51e1899536" podNamespace="kube-system" podName="kube-scheduler-ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:26.593809 systemd[1]: Created slice kubepods-burstable-pod10ccfcf93b347a6d7cb84ddeb2e79cd1.slice - libcontainer container kubepods-burstable-pod10ccfcf93b347a6d7cb84ddeb2e79cd1.slice.
Jan 14 13:37:26.614636 systemd[1]: Created slice kubepods-burstable-pod5f8e24e91e237b13c726e3ae93873aac.slice - libcontainer container kubepods-burstable-pod5f8e24e91e237b13c726e3ae93873aac.slice.
Jan 14 13:37:26.618892 systemd[1]: Created slice kubepods-burstable-poddc4486a3a33639b031f5be51e1899536.slice - libcontainer container kubepods-burstable-poddc4486a3a33639b031f5be51e1899536.slice.
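
Each static pod above gets its own libcontainer slice under the burstable QoS parent, named kubepods-burstable-pod<UID>.slice; UIDs that contain dashes have them replaced with underscores (visible later in kubepods-besteffort-pod35b2267f_b2df_4533_a731_2357292a8d77.slice), and systemd separately escapes literal dashes in unit names as \x2d, as in the tmpmount units. A small sketch, based only on the names visible in this log rather than kubelet source, of deriving such a slice name from a pod UID:

```go
// Illustrative derivation of a kubepods slice name from a pod UID, based only
// on the slice names visible in this log.
package main

import (
	"fmt"
	"strings"
)

// sliceName builds e.g. "kubepods-burstable-pod10ccfcf93b347a6d7cb84ddeb2e79cd1.slice".
// Dashes inside the UID become underscores before the name reaches systemd.
func sliceName(qosClass, podUID string) string {
	escaped := strings.ReplaceAll(podUID, "-", "_")
	return fmt.Sprintf("kubepods-%s-pod%s.slice", qosClass, escaped)
}

func main() {
	fmt.Println(sliceName("burstable", "10ccfcf93b347a6d7cb84ddeb2e79cd1"))
	fmt.Println(sliceName("besteffort", "35b2267f-b2df-4533-a731-2357292a8d77"))
}
```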
Jan 14 13:37:26.669782 kubelet[3069]: I0114 13:37:26.669738    3069 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/10ccfcf93b347a6d7cb84ddeb2e79cd1-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4186.1.0-a-e83668d6e0\" (UID: \"10ccfcf93b347a6d7cb84ddeb2e79cd1\") " pod="kube-system/kube-apiserver-ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:26.669782 kubelet[3069]: I0114 13:37:26.669781    3069 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/5f8e24e91e237b13c726e3ae93873aac-flexvolume-dir\") pod \"kube-controller-manager-ci-4186.1.0-a-e83668d6e0\" (UID: \"5f8e24e91e237b13c726e3ae93873aac\") " pod="kube-system/kube-controller-manager-ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:26.670063 kubelet[3069]: I0114 13:37:26.669803    3069 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5f8e24e91e237b13c726e3ae93873aac-kubeconfig\") pod \"kube-controller-manager-ci-4186.1.0-a-e83668d6e0\" (UID: \"5f8e24e91e237b13c726e3ae93873aac\") " pod="kube-system/kube-controller-manager-ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:26.670063 kubelet[3069]: I0114 13:37:26.669824    3069 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5f8e24e91e237b13c726e3ae93873aac-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4186.1.0-a-e83668d6e0\" (UID: \"5f8e24e91e237b13c726e3ae93873aac\") " pod="kube-system/kube-controller-manager-ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:26.670063 kubelet[3069]: I0114 13:37:26.669851    3069 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/dc4486a3a33639b031f5be51e1899536-kubeconfig\") pod \"kube-scheduler-ci-4186.1.0-a-e83668d6e0\" (UID: \"dc4486a3a33639b031f5be51e1899536\") " pod="kube-system/kube-scheduler-ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:26.670063 kubelet[3069]: I0114 13:37:26.669869    3069 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/10ccfcf93b347a6d7cb84ddeb2e79cd1-k8s-certs\") pod \"kube-apiserver-ci-4186.1.0-a-e83668d6e0\" (UID: \"10ccfcf93b347a6d7cb84ddeb2e79cd1\") " pod="kube-system/kube-apiserver-ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:26.670063 kubelet[3069]: I0114 13:37:26.669886    3069 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5f8e24e91e237b13c726e3ae93873aac-ca-certs\") pod \"kube-controller-manager-ci-4186.1.0-a-e83668d6e0\" (UID: \"5f8e24e91e237b13c726e3ae93873aac\") " pod="kube-system/kube-controller-manager-ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:26.670175 kubelet[3069]: I0114 13:37:26.669904    3069 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5f8e24e91e237b13c726e3ae93873aac-k8s-certs\") pod \"kube-controller-manager-ci-4186.1.0-a-e83668d6e0\" (UID: \"5f8e24e91e237b13c726e3ae93873aac\") " pod="kube-system/kube-controller-manager-ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:26.670175 kubelet[3069]: I0114 13:37:26.669921    3069 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/10ccfcf93b347a6d7cb84ddeb2e79cd1-ca-certs\") pod \"kube-apiserver-ci-4186.1.0-a-e83668d6e0\" (UID: \"10ccfcf93b347a6d7cb84ddeb2e79cd1\") " pod="kube-system/kube-apiserver-ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:26.811715 kubelet[3069]: E0114 13:37:26.811610    3069 event.go:355] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.20.15:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.15:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4186.1.0-a-e83668d6e0.181a92a4f259d658  default    0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4186.1.0-a-e83668d6e0,UID:ci-4186.1.0-a-e83668d6e0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4186.1.0-a-e83668d6e0,},FirstTimestamp:2025-01-14 13:37:23.443013208 +0000 UTC m=+0.868285411,LastTimestamp:2025-01-14 13:37:23.443013208 +0000 UTC m=+0.868285411,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4186.1.0-a-e83668d6e0,}"
Jan 14 13:37:26.913000 containerd[1759]: time="2025-01-14T13:37:26.912943127Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4186.1.0-a-e83668d6e0,Uid:10ccfcf93b347a6d7cb84ddeb2e79cd1,Namespace:kube-system,Attempt:0,}"
Jan 14 13:37:26.917731 containerd[1759]: time="2025-01-14T13:37:26.917592011Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4186.1.0-a-e83668d6e0,Uid:5f8e24e91e237b13c726e3ae93873aac,Namespace:kube-system,Attempt:0,}"
Jan 14 13:37:26.921678 containerd[1759]: time="2025-01-14T13:37:26.921503854Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4186.1.0-a-e83668d6e0,Uid:dc4486a3a33639b031f5be51e1899536,Namespace:kube-system,Attempt:0,}"
Jan 14 13:37:27.055929 kubelet[3069]: W0114 13:37:27.055894    3069 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.RuntimeClass: Get "https://10.200.20.15:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:27.056180 kubelet[3069]: E0114 13:37:27.056159    3069 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.200.20.15:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:27.368832 kubelet[3069]: W0114 13:37:27.368767    3069 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.CSIDriver: Get "https://10.200.20.15:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:27.368832 kubelet[3069]: E0114 13:37:27.368809    3069 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.200.20.15:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:28.052310 kubelet[3069]: W0114 13:37:28.052271    3069 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Node: Get "https://10.200.20.15:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4186.1.0-a-e83668d6e0&limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:28.052310 kubelet[3069]: E0114 13:37:28.052311    3069 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.200.20.15:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4186.1.0-a-e83668d6e0&limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
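
The reflector list calls failing above encode their options directly in the request query string: limit=500, resourceVersion=0, and, for the Node informer, a fieldSelector pinning the list to this node's name. A minimal sketch of building the same URL with the standard library; real clients go through client-go's generated clients, so this only illustrates the shape of the resulting request:

```go
// Illustration of the list URL seen in the reflector errors above; client-go
// builds this through its generated clients rather than by hand.
package main

import (
	"fmt"
	"net/url"
)

func main() {
	q := url.Values{}
	q.Set("fieldSelector", "metadata.name=ci-4186.1.0-a-e83668d6e0")
	q.Set("limit", "500")
	q.Set("resourceVersion", "0")

	u := url.URL{
		Scheme:   "https",
		Host:     "10.200.20.15:6443",
		Path:     "/api/v1/nodes",
		RawQuery: q.Encode(),
	}
	// Prints the same request the log shows failing with "connection refused";
	// the "=" in the selector is percent-encoded as %3D, exactly as logged.
	fmt.Println(u.String())
}
```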
Jan 14 13:37:29.218549 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1569505314.mount: Deactivated successfully.
Jan 14 13:37:29.508086 containerd[1759]: time="2025-01-14T13:37:29.507940469Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Jan 14 13:37:29.607135 containerd[1759]: time="2025-01-14T13:37:29.607073032Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=269173"
Jan 14 13:37:29.659017 kubelet[3069]: E0114 13:37:29.658964    3069 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.15:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4186.1.0-a-e83668d6e0?timeout=10s\": dial tcp 10.200.20.15:6443: connect: connection refused" interval="6.4s"
Jan 14 13:37:29.668986 containerd[1759]: time="2025-01-14T13:37:29.668896444Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Jan 14 13:37:29.715768 containerd[1759]: time="2025-01-14T13:37:29.715692404Z" level=info msg="ImageCreate event name:\"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Jan 14 13:37:29.767219 kubelet[3069]: I0114 13:37:29.767126    3069 kubelet_node_status.go:73] "Attempting to register node" node="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:29.767620 kubelet[3069]: E0114 13:37:29.767576    3069 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.200.20.15:6443/api/v1/nodes\": dial tcp 10.200.20.15:6443: connect: connection refused" node="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:29.806457 containerd[1759]: time="2025-01-14T13:37:29.806382560Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Jan 14 13:37:29.808963 containerd[1759]: time="2025-01-14T13:37:29.808904882Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Jan 14 13:37:29.854869 containerd[1759]: time="2025-01-14T13:37:29.854812841Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Jan 14 13:37:29.856430 containerd[1759]: time="2025-01-14T13:37:29.856287602Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 2.943239435s"
Jan 14 13:37:29.886879 kubelet[3069]: E0114 13:37:29.886851    3069 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://10.200.20.15:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:29.901267 containerd[1759]: time="2025-01-14T13:37:29.901070799Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Jan 14 13:37:29.967540 containerd[1759]: time="2025-01-14T13:37:29.967285415Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 3.049628884s"
Jan 14 13:37:30.054977 containerd[1759]: time="2025-01-14T13:37:30.054405968Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 3.132838234s"
Jan 14 13:37:31.171641 containerd[1759]: time="2025-01-14T13:37:31.171308107Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 14 13:37:31.171641 containerd[1759]: time="2025-01-14T13:37:31.171364907Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 14 13:37:31.171641 containerd[1759]: time="2025-01-14T13:37:31.171379347Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 14 13:37:31.171641 containerd[1759]: time="2025-01-14T13:37:31.171447987Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 14 13:37:31.175276 containerd[1759]: time="2025-01-14T13:37:31.175174191Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 14 13:37:31.175276 containerd[1759]: time="2025-01-14T13:37:31.175238711Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 14 13:37:31.176138 containerd[1759]: time="2025-01-14T13:37:31.175254871Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 14 13:37:31.176138 containerd[1759]: time="2025-01-14T13:37:31.175690551Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 14 13:37:31.187379 containerd[1759]: time="2025-01-14T13:37:31.187245281Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 14 13:37:31.187878 containerd[1759]: time="2025-01-14T13:37:31.187567641Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 14 13:37:31.187878 containerd[1759]: time="2025-01-14T13:37:31.187619481Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 14 13:37:31.188224 containerd[1759]: time="2025-01-14T13:37:31.188179962Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 14 13:37:31.202725 systemd[1]: Started cri-containerd-95730ae593ba594bbc992842869c5e33b2e041a3d813b2d6d85cdbf0c3f2bc60.scope - libcontainer container 95730ae593ba594bbc992842869c5e33b2e041a3d813b2d6d85cdbf0c3f2bc60.
Jan 14 13:37:31.217171 systemd[1]: Started cri-containerd-4122ecac76b4cc8a4e2b5dd84794b1f3a10818d6fb68b3a2a1ba992b71660aa2.scope - libcontainer container 4122ecac76b4cc8a4e2b5dd84794b1f3a10818d6fb68b3a2a1ba992b71660aa2.
Jan 14 13:37:31.218972 systemd[1]: Started cri-containerd-551c8ca826a7c4b252f2ce2216f24102753a1b75d550cefd021c53c175b6d936.scope - libcontainer container 551c8ca826a7c4b252f2ce2216f24102753a1b75d550cefd021c53c175b6d936.
Jan 14 13:37:31.264535 containerd[1759]: time="2025-01-14T13:37:31.264484786Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4186.1.0-a-e83668d6e0,Uid:5f8e24e91e237b13c726e3ae93873aac,Namespace:kube-system,Attempt:0,} returns sandbox id \"95730ae593ba594bbc992842869c5e33b2e041a3d813b2d6d85cdbf0c3f2bc60\""
Jan 14 13:37:31.274675 containerd[1759]: time="2025-01-14T13:37:31.274530394Z" level=info msg="CreateContainer within sandbox \"95730ae593ba594bbc992842869c5e33b2e041a3d813b2d6d85cdbf0c3f2bc60\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Jan 14 13:37:31.275432 containerd[1759]: time="2025-01-14T13:37:31.275321835Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4186.1.0-a-e83668d6e0,Uid:10ccfcf93b347a6d7cb84ddeb2e79cd1,Namespace:kube-system,Attempt:0,} returns sandbox id \"4122ecac76b4cc8a4e2b5dd84794b1f3a10818d6fb68b3a2a1ba992b71660aa2\""
Jan 14 13:37:31.280779 containerd[1759]: time="2025-01-14T13:37:31.280732159Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4186.1.0-a-e83668d6e0,Uid:dc4486a3a33639b031f5be51e1899536,Namespace:kube-system,Attempt:0,} returns sandbox id \"551c8ca826a7c4b252f2ce2216f24102753a1b75d550cefd021c53c175b6d936\""
Jan 14 13:37:31.282413 containerd[1759]: time="2025-01-14T13:37:31.282367841Z" level=info msg="CreateContainer within sandbox \"4122ecac76b4cc8a4e2b5dd84794b1f3a10818d6fb68b3a2a1ba992b71660aa2\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Jan 14 13:37:31.285378 containerd[1759]: time="2025-01-14T13:37:31.285074763Z" level=info msg="CreateContainer within sandbox \"551c8ca826a7c4b252f2ce2216f24102753a1b75d550cefd021c53c175b6d936\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
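
The sequence above is the CRI sandbox/container lifecycle as containerd logs it: RunPodSandbox returns a sandbox id, CreateContainer is then issued inside that sandbox, and (further below) StartContainer runs the container under its own systemd scope. The same flow can be driven by hand with crictl; the sketch below shells out to it, and pod.json / container.json are placeholder spec files, not files present on this host.

```go
// Hypothetical walk through RunPodSandbox -> CreateContainer -> StartContainer
// via crictl; pod.json and container.json are placeholder spec files.
package main

import (
	"fmt"
	"os/exec"
	"strings"
)

func run(args ...string) string {
	out, err := exec.Command("crictl", args...).CombinedOutput()
	if err != nil {
		panic(fmt.Sprintf("crictl %v: %v\n%s", args, err, out))
	}
	return strings.TrimSpace(string(out))
}

func main() {
	podID := run("runp", "pod.json")                             // RunPodSandbox
	ctrID := run("create", podID, "container.json", "pod.json")  // CreateContainer
	run("start", ctrID)                                          // StartContainer
	fmt.Println("sandbox:", podID, "container:", ctrID)
}
```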
Jan 14 13:37:31.668934 kubelet[3069]: W0114 13:37:31.668860    3069 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.CSIDriver: Get "https://10.200.20.15:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:31.668934 kubelet[3069]: E0114 13:37:31.668940    3069 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.200.20.15:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:31.823969 kubelet[3069]: W0114 13:37:31.823909    3069 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Service: Get "https://10.200.20.15:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:31.823969 kubelet[3069]: E0114 13:37:31.823972    3069 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.200.20.15:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:32.163183 containerd[1759]: time="2025-01-14T13:37:32.163060261Z" level=info msg="CreateContainer within sandbox \"95730ae593ba594bbc992842869c5e33b2e041a3d813b2d6d85cdbf0c3f2bc60\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"80131fb008518de5fae24033edc182308d5306341648ec5d1e22610427eec863\""
Jan 14 13:37:32.164170 containerd[1759]: time="2025-01-14T13:37:32.163832942Z" level=info msg="StartContainer for \"80131fb008518de5fae24033edc182308d5306341648ec5d1e22610427eec863\""
Jan 14 13:37:32.192191 systemd[1]: Started cri-containerd-80131fb008518de5fae24033edc182308d5306341648ec5d1e22610427eec863.scope - libcontainer container 80131fb008518de5fae24033edc182308d5306341648ec5d1e22610427eec863.
Jan 14 13:37:32.401223 kubelet[3069]: W0114 13:37:32.401161    3069 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.RuntimeClass: Get "https://10.200.20.15:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:32.401223 kubelet[3069]: E0114 13:37:32.401226    3069 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.200.20.15:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Jan 14 13:37:32.417810 containerd[1759]: time="2025-01-14T13:37:32.417587075Z" level=info msg="CreateContainer within sandbox \"551c8ca826a7c4b252f2ce2216f24102753a1b75d550cefd021c53c175b6d936\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"0724787c81691fcc8657b1790d1ba450fb93e7eb5c8ddac28403b009fb376a5d\""
Jan 14 13:37:32.417810 containerd[1759]: time="2025-01-14T13:37:32.417681315Z" level=info msg="StartContainer for \"80131fb008518de5fae24033edc182308d5306341648ec5d1e22610427eec863\" returns successfully"
Jan 14 13:37:32.419349 containerd[1759]: time="2025-01-14T13:37:32.418423796Z" level=info msg="StartContainer for \"0724787c81691fcc8657b1790d1ba450fb93e7eb5c8ddac28403b009fb376a5d\""
Jan 14 13:37:32.419457 containerd[1759]: time="2025-01-14T13:37:32.419433077Z" level=info msg="CreateContainer within sandbox \"4122ecac76b4cc8a4e2b5dd84794b1f3a10818d6fb68b3a2a1ba992b71660aa2\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"6b8850e7fdda829af89828a3765fb526934ec99dfa693ea27f0a46da6cdadf7e\""
Jan 14 13:37:32.422328 containerd[1759]: time="2025-01-14T13:37:32.421348598Z" level=info msg="StartContainer for \"6b8850e7fdda829af89828a3765fb526934ec99dfa693ea27f0a46da6cdadf7e\""
Jan 14 13:37:32.470168 systemd[1]: Started cri-containerd-6b8850e7fdda829af89828a3765fb526934ec99dfa693ea27f0a46da6cdadf7e.scope - libcontainer container 6b8850e7fdda829af89828a3765fb526934ec99dfa693ea27f0a46da6cdadf7e.
Jan 14 13:37:32.473214 systemd[1]: Started cri-containerd-0724787c81691fcc8657b1790d1ba450fb93e7eb5c8ddac28403b009fb376a5d.scope - libcontainer container 0724787c81691fcc8657b1790d1ba450fb93e7eb5c8ddac28403b009fb376a5d.
Jan 14 13:37:34.445162 kubelet[3069]: I0114 13:37:34.445124    3069 apiserver.go:52] "Watching apiserver"
Jan 14 13:37:34.513830 kubelet[3069]: I0114 13:37:34.454054    3069 desired_state_of_world_populator.go:159] "Finished populating initial desired state of world"
Jan 14 13:37:34.519980 containerd[1759]: time="2025-01-14T13:37:34.519936403Z" level=info msg="StartContainer for \"6b8850e7fdda829af89828a3765fb526934ec99dfa693ea27f0a46da6cdadf7e\" returns successfully"
Jan 14 13:37:34.520305 containerd[1759]: time="2025-01-14T13:37:34.520006483Z" level=info msg="StartContainer for \"0724787c81691fcc8657b1790d1ba450fb93e7eb5c8ddac28403b009fb376a5d\" returns successfully"
Jan 14 13:37:34.955417 kubelet[3069]: E0114 13:37:34.955364    3069 csi_plugin.go:300] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ci-4186.1.0-a-e83668d6e0" not found
Jan 14 13:37:35.365111 kubelet[3069]: E0114 13:37:35.365078    3069 csi_plugin.go:300] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ci-4186.1.0-a-e83668d6e0" not found
Jan 14 13:37:35.784251 kubelet[3069]: E0114 13:37:35.783953    3069 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4186.1.0-a-e83668d6e0\" not found"
Jan 14 13:37:35.807104 kubelet[3069]: E0114 13:37:35.807054    3069 csi_plugin.go:300] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ci-4186.1.0-a-e83668d6e0" not found
Jan 14 13:37:36.062883 kubelet[3069]: E0114 13:37:36.062651    3069 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4186.1.0-a-e83668d6e0\" not found" node="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:36.170014 kubelet[3069]: I0114 13:37:36.169959    3069 kubelet_node_status.go:73] "Attempting to register node" node="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:36.177239 kubelet[3069]: I0114 13:37:36.177196    3069 kubelet_node_status.go:76] "Successfully registered node" node="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:36.539354 kubelet[3069]: W0114 13:37:36.539194    3069 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Jan 14 13:37:37.202274 systemd[1]: Reloading requested from client PID 3338 ('systemctl') (unit session-9.scope)...
Jan 14 13:37:37.202288 systemd[1]: Reloading...
Jan 14 13:37:37.285027 zram_generator::config[3378]: No configuration found.
Jan 14 13:37:37.385507 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jan 14 13:37:37.474373 systemd[1]: Reloading finished in 271 ms.
Jan 14 13:37:37.507263 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 14 13:37:37.523024 systemd[1]: kubelet.service: Deactivated successfully.
Jan 14 13:37:37.523298 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 14 13:37:37.523359 systemd[1]: kubelet.service: Consumed 1.222s CPU time, 111.1M memory peak, 0B memory swap peak.
Jan 14 13:37:37.530324 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 14 13:37:37.634039 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 14 13:37:37.640636 (kubelet)[3442]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Jan 14 13:37:37.686774 kubelet[3442]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 14 13:37:37.686774 kubelet[3442]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Jan 14 13:37:37.686774 kubelet[3442]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 14 13:37:37.689674 kubelet[3442]: I0114 13:37:37.689611    3442 server.go:204] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Jan 14 13:37:37.695753 kubelet[3442]: I0114 13:37:37.695637    3442 server.go:487] "Kubelet version" kubeletVersion="v1.29.2"
Jan 14 13:37:37.697029 kubelet[3442]: I0114 13:37:37.695921    3442 server.go:489] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Jan 14 13:37:37.697029 kubelet[3442]: I0114 13:37:37.696128    3442 server.go:919] "Client rotation is on, will bootstrap in background"
Jan 14 13:37:37.697845 kubelet[3442]: I0114 13:37:37.697821    3442 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Jan 14 13:37:37.699753 kubelet[3442]: I0114 13:37:37.699630    3442 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Jan 14 13:37:37.708514 kubelet[3442]: I0114 13:37:37.708482    3442 server.go:745] "--cgroups-per-qos enabled, but --cgroup-root was not specified.  defaulting to /"
Jan 14 13:37:37.709132 kubelet[3442]: I0114 13:37:37.708687    3442 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Jan 14 13:37:37.709132 kubelet[3442]: I0114 13:37:37.708840    3442 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null}
Jan 14 13:37:37.709132 kubelet[3442]: I0114 13:37:37.708876    3442 topology_manager.go:138] "Creating topology manager with none policy"
Jan 14 13:37:37.709132 kubelet[3442]: I0114 13:37:37.708885    3442 container_manager_linux.go:301] "Creating device plugin manager"
Jan 14 13:37:37.709132 kubelet[3442]: I0114 13:37:37.708916    3442 state_mem.go:36] "Initialized new in-memory state store"
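
The nodeConfig dump above carries the hard-eviction thresholds this kubelet runs with: nodefs.available < 10%, nodefs.inodesFree < 5%, imagefs.available < 15%, and memory.available < 100Mi. A small worked sketch of how such thresholds compare against observed free resources; the thresholds come from the log, while the capacity and availability figures below are made-up inputs for illustration only:

```go
// Worked example of the hard eviction thresholds printed in the nodeConfig
// above; the capacity/available figures are invented inputs, not node data.
package main

import "fmt"

const (
	mi = uint64(1) << 20
	gi = uint64(1) << 30
)

func breachesPercent(available, capacity uint64, pct float64) bool {
	return float64(available) < pct*float64(capacity)
}

func main() {
	// nodefs.available < 10% of capacity triggers eviction: 3Gi free of 40Gi -> true.
	fmt.Println("nodefs breached:", breachesPercent(3*gi, 40*gi, 0.10))
	// imagefs.available < 15%: 10Gi free of 40Gi -> false.
	fmt.Println("imagefs breached:", breachesPercent(10*gi, 40*gi, 0.15))
	// memory.available < 100Mi is an absolute threshold: 80Mi free -> true.
	fmt.Println("memory breached:", 80*mi < 100*mi)
}
```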
Jan 14 13:37:37.710266 kubelet[3442]: I0114 13:37:37.709417    3442 kubelet.go:396] "Attempting to sync node with API server"
Jan 14 13:37:37.710266 kubelet[3442]: I0114 13:37:37.709450    3442 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests"
Jan 14 13:37:37.710266 kubelet[3442]: I0114 13:37:37.709474    3442 kubelet.go:312] "Adding apiserver pod source"
Jan 14 13:37:37.710266 kubelet[3442]: I0114 13:37:37.709508    3442 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Jan 14 13:37:37.711105 kubelet[3442]: I0114 13:37:37.711070    3442 kuberuntime_manager.go:258] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1"
Jan 14 13:37:37.711374 kubelet[3442]: I0114 13:37:37.711359    3442 kubelet.go:809] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Jan 14 13:37:37.711826 kubelet[3442]: I0114 13:37:37.711808    3442 server.go:1256] "Started kubelet"
Jan 14 13:37:37.714376 kubelet[3442]: I0114 13:37:37.714347    3442 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Jan 14 13:37:37.719550 kubelet[3442]: I0114 13:37:37.719525    3442 server.go:162] "Starting to listen" address="0.0.0.0" port=10250
Jan 14 13:37:37.727256 kubelet[3442]: I0114 13:37:37.727146    3442 server.go:461] "Adding debug handlers to kubelet server"
Jan 14 13:37:37.729055 kubelet[3442]: I0114 13:37:37.728530    3442 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Jan 14 13:37:37.730445 kubelet[3442]: I0114 13:37:37.728696    3442 server.go:233] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Jan 14 13:37:37.733316 kubelet[3442]: I0114 13:37:37.732308    3442 volume_manager.go:291] "Starting Kubelet Volume Manager"
Jan 14 13:37:37.733316 kubelet[3442]: I0114 13:37:37.732425    3442 desired_state_of_world_populator.go:151] "Desired state populator starts to run"
Jan 14 13:37:37.733316 kubelet[3442]: I0114 13:37:37.732552    3442 reconciler_new.go:29] "Reconciler: start to sync state"
Jan 14 13:37:37.742157 kubelet[3442]: I0114 13:37:37.741680    3442 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Jan 14 13:37:37.742157 kubelet[3442]: I0114 13:37:37.741885    3442 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Jan 14 13:37:37.744239 kubelet[3442]: I0114 13:37:37.743665    3442 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Jan 14 13:37:37.744239 kubelet[3442]: I0114 13:37:37.743693    3442 status_manager.go:217] "Starting to sync pod status with apiserver"
Jan 14 13:37:37.744239 kubelet[3442]: I0114 13:37:37.743708    3442 kubelet.go:2329] "Starting kubelet main sync loop"
Jan 14 13:37:37.744239 kubelet[3442]: E0114 13:37:37.743749    3442 kubelet.go:2353] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Jan 14 13:37:37.758468 kubelet[3442]: I0114 13:37:37.758017    3442 factory.go:221] Registration of the containerd container factory successfully
Jan 14 13:37:37.758468 kubelet[3442]: I0114 13:37:37.758038    3442 factory.go:221] Registration of the systemd container factory successfully
Jan 14 13:37:37.811728 kubelet[3442]: I0114 13:37:37.811694    3442 cpu_manager.go:214] "Starting CPU manager" policy="none"
Jan 14 13:37:37.812117 kubelet[3442]: I0114 13:37:37.811923    3442 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Jan 14 13:37:37.812117 kubelet[3442]: I0114 13:37:37.811967    3442 state_mem.go:36] "Initialized new in-memory state store"
Jan 14 13:37:37.812434 kubelet[3442]: I0114 13:37:37.812327    3442 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Jan 14 13:37:37.812434 kubelet[3442]: I0114 13:37:37.812353    3442 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Jan 14 13:37:37.812434 kubelet[3442]: I0114 13:37:37.812360    3442 policy_none.go:49] "None policy: Start"
Jan 14 13:37:37.813272 kubelet[3442]: I0114 13:37:37.812930    3442 memory_manager.go:170] "Starting memorymanager" policy="None"
Jan 14 13:37:37.813272 kubelet[3442]: I0114 13:37:37.812955    3442 state_mem.go:35] "Initializing new in-memory state store"
Jan 14 13:37:37.813272 kubelet[3442]: I0114 13:37:37.813146    3442 state_mem.go:75] "Updated machine memory state"
Jan 14 13:37:37.822003 kubelet[3442]: I0114 13:37:37.821967    3442 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Jan 14 13:37:37.822309 kubelet[3442]: I0114 13:37:37.822293    3442 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Jan 14 13:37:37.835849 kubelet[3442]: I0114 13:37:37.835826    3442 kubelet_node_status.go:73] "Attempting to register node" node="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:37.844651 kubelet[3442]: I0114 13:37:37.844559    3442 topology_manager.go:215] "Topology Admit Handler" podUID="5f8e24e91e237b13c726e3ae93873aac" podNamespace="kube-system" podName="kube-controller-manager-ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:37.844772 kubelet[3442]: I0114 13:37:37.844732    3442 topology_manager.go:215] "Topology Admit Handler" podUID="dc4486a3a33639b031f5be51e1899536" podNamespace="kube-system" podName="kube-scheduler-ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:37.844798 kubelet[3442]: I0114 13:37:37.844793    3442 topology_manager.go:215] "Topology Admit Handler" podUID="10ccfcf93b347a6d7cb84ddeb2e79cd1" podNamespace="kube-system" podName="kube-apiserver-ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:37.851826 kubelet[3442]: W0114 13:37:37.851690    3442 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Jan 14 13:37:37.855878 kubelet[3442]: W0114 13:37:37.855278    3442 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Jan 14 13:37:37.855878 kubelet[3442]: I0114 13:37:37.855394    3442 kubelet_node_status.go:112] "Node was previously registered" node="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:37.855878 kubelet[3442]: I0114 13:37:37.855469    3442 kubelet_node_status.go:76] "Successfully registered node" node="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:37.855878 kubelet[3442]: W0114 13:37:37.855765    3442 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Jan 14 13:37:37.855878 kubelet[3442]: E0114 13:37:37.855807    3442 kubelet.go:1921] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4186.1.0-a-e83668d6e0\" already exists" pod="kube-system/kube-apiserver-ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:38.034367 kubelet[3442]: I0114 13:37:38.034247    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5f8e24e91e237b13c726e3ae93873aac-k8s-certs\") pod \"kube-controller-manager-ci-4186.1.0-a-e83668d6e0\" (UID: \"5f8e24e91e237b13c726e3ae93873aac\") " pod="kube-system/kube-controller-manager-ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:38.034367 kubelet[3442]: I0114 13:37:38.034309    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5f8e24e91e237b13c726e3ae93873aac-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4186.1.0-a-e83668d6e0\" (UID: \"5f8e24e91e237b13c726e3ae93873aac\") " pod="kube-system/kube-controller-manager-ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:38.034367 kubelet[3442]: I0114 13:37:38.034332    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5f8e24e91e237b13c726e3ae93873aac-ca-certs\") pod \"kube-controller-manager-ci-4186.1.0-a-e83668d6e0\" (UID: \"5f8e24e91e237b13c726e3ae93873aac\") " pod="kube-system/kube-controller-manager-ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:38.034367 kubelet[3442]: I0114 13:37:38.034352    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/5f8e24e91e237b13c726e3ae93873aac-flexvolume-dir\") pod \"kube-controller-manager-ci-4186.1.0-a-e83668d6e0\" (UID: \"5f8e24e91e237b13c726e3ae93873aac\") " pod="kube-system/kube-controller-manager-ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:38.034551 kubelet[3442]: I0114 13:37:38.034380    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/10ccfcf93b347a6d7cb84ddeb2e79cd1-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4186.1.0-a-e83668d6e0\" (UID: \"10ccfcf93b347a6d7cb84ddeb2e79cd1\") " pod="kube-system/kube-apiserver-ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:38.034551 kubelet[3442]: I0114 13:37:38.034400    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5f8e24e91e237b13c726e3ae93873aac-kubeconfig\") pod \"kube-controller-manager-ci-4186.1.0-a-e83668d6e0\" (UID: \"5f8e24e91e237b13c726e3ae93873aac\") " pod="kube-system/kube-controller-manager-ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:38.034551 kubelet[3442]: I0114 13:37:38.034427    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/dc4486a3a33639b031f5be51e1899536-kubeconfig\") pod \"kube-scheduler-ci-4186.1.0-a-e83668d6e0\" (UID: \"dc4486a3a33639b031f5be51e1899536\") " pod="kube-system/kube-scheduler-ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:38.034551 kubelet[3442]: I0114 13:37:38.034446    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/10ccfcf93b347a6d7cb84ddeb2e79cd1-ca-certs\") pod \"kube-apiserver-ci-4186.1.0-a-e83668d6e0\" (UID: \"10ccfcf93b347a6d7cb84ddeb2e79cd1\") " pod="kube-system/kube-apiserver-ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:38.034551 kubelet[3442]: I0114 13:37:38.034470    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/10ccfcf93b347a6d7cb84ddeb2e79cd1-k8s-certs\") pod \"kube-apiserver-ci-4186.1.0-a-e83668d6e0\" (UID: \"10ccfcf93b347a6d7cb84ddeb2e79cd1\") " pod="kube-system/kube-apiserver-ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:38.710269 kubelet[3442]: I0114 13:37:38.710230    3442 apiserver.go:52] "Watching apiserver"
Jan 14 13:37:38.733389 kubelet[3442]: I0114 13:37:38.733337    3442 desired_state_of_world_populator.go:159] "Finished populating initial desired state of world"
Jan 14 13:37:38.840429 kubelet[3442]: W0114 13:37:38.838933    3442 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Jan 14 13:37:38.840429 kubelet[3442]: E0114 13:37:38.839016    3442 kubelet.go:1921] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4186.1.0-a-e83668d6e0\" already exists" pod="kube-system/kube-apiserver-ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:38.840429 kubelet[3442]: W0114 13:37:38.839672    3442 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Jan 14 13:37:38.840429 kubelet[3442]: E0114 13:37:38.839714    3442 kubelet.go:1921] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-ci-4186.1.0-a-e83668d6e0\" already exists" pod="kube-system/kube-controller-manager-ci-4186.1.0-a-e83668d6e0"
Jan 14 13:37:38.899009 kubelet[3442]: I0114 13:37:38.897206    3442 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4186.1.0-a-e83668d6e0" podStartSLOduration=1.897049875 podStartE2EDuration="1.897049875s" podCreationTimestamp="2025-01-14 13:37:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-14 13:37:38.896985234 +0000 UTC m=+1.250930048" watchObservedRunningTime="2025-01-14 13:37:38.897049875 +0000 UTC m=+1.250994689"
Jan 14 13:37:38.899009 kubelet[3442]: I0114 13:37:38.897452    3442 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4186.1.0-a-e83668d6e0" podStartSLOduration=1.897401795 podStartE2EDuration="1.897401795s" podCreationTimestamp="2025-01-14 13:37:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-14 13:37:38.874205495 +0000 UTC m=+1.228150309" watchObservedRunningTime="2025-01-14 13:37:38.897401795 +0000 UTC m=+1.251346609"
Jan 14 13:37:38.968332 kubelet[3442]: I0114 13:37:38.968221    3442 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4186.1.0-a-e83668d6e0" podStartSLOduration=2.968180734 podStartE2EDuration="2.968180734s" podCreationTimestamp="2025-01-14 13:37:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-14 13:37:38.947742117 +0000 UTC m=+1.301686931" watchObservedRunningTime="2025-01-14 13:37:38.968180734 +0000 UTC m=+1.322125548"
Jan 14 13:37:42.300569 sudo[2468]: pam_unix(sudo:session): session closed for user root
Jan 14 13:37:42.382359 sshd[2467]: Connection closed by 10.200.16.10 port 47846
Jan 14 13:37:42.382934 sshd-session[2465]: pam_unix(sshd:session): session closed for user core
Jan 14 13:37:42.386924 systemd-logind[1736]: Session 9 logged out. Waiting for processes to exit.
Jan 14 13:37:42.387795 systemd[1]: sshd@6-10.200.20.15:22-10.200.16.10:47846.service: Deactivated successfully.
Jan 14 13:37:42.390611 systemd[1]: session-9.scope: Deactivated successfully.
Jan 14 13:37:42.391016 systemd[1]: session-9.scope: Consumed 5.918s CPU time, 185.8M memory peak, 0B memory swap peak.
Jan 14 13:37:42.391961 systemd-logind[1736]: Removed session 9.
Jan 14 13:37:52.186038 kubelet[3442]: I0114 13:37:52.185884    3442 kuberuntime_manager.go:1529] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Jan 14 13:37:52.187789 containerd[1759]: time="2025-01-14T13:37:52.187505531Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Jan 14 13:37:52.188164 kubelet[3442]: I0114 13:37:52.187708    3442 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
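
With the API server reachable, the kubelet receives its pod CIDR (192.168.0.0/24) and pushes it to containerd through the CRI runtime config; containerd then waits for a CNI plugin to drop a matching network config. A small sketch, using only the CIDR from the log, of what that range covers:

```go
// Inspect the pod CIDR pushed to the runtime in the log entries above.
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	prefix := netip.MustParsePrefix("192.168.0.0/24")
	fmt.Println("network:", prefix.Masked().Addr()) // 192.168.0.0
	fmt.Println("bits:", prefix.Bits())             // /24 -> 256 addresses for this node's pods

	// Example membership check for a pod IP a CNI plugin might allocate.
	fmt.Println("contains 192.168.0.5:", prefix.Contains(netip.MustParseAddr("192.168.0.5")))
}
```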
Jan 14 13:37:53.074962 kubelet[3442]: I0114 13:37:53.074396    3442 topology_manager.go:215] "Topology Admit Handler" podUID="35b2267f-b2df-4533-a731-2357292a8d77" podNamespace="kube-system" podName="kube-proxy-hz95q"
Jan 14 13:37:53.083817 systemd[1]: Created slice kubepods-besteffort-pod35b2267f_b2df_4533_a731_2357292a8d77.slice - libcontainer container kubepods-besteffort-pod35b2267f_b2df_4533_a731_2357292a8d77.slice.
Jan 14 13:37:53.124295 kubelet[3442]: I0114 13:37:53.124205    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/35b2267f-b2df-4533-a731-2357292a8d77-xtables-lock\") pod \"kube-proxy-hz95q\" (UID: \"35b2267f-b2df-4533-a731-2357292a8d77\") " pod="kube-system/kube-proxy-hz95q"
Jan 14 13:37:53.124295 kubelet[3442]: I0114 13:37:53.124247    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b2w9\" (UniqueName: \"kubernetes.io/projected/35b2267f-b2df-4533-a731-2357292a8d77-kube-api-access-9b2w9\") pod \"kube-proxy-hz95q\" (UID: \"35b2267f-b2df-4533-a731-2357292a8d77\") " pod="kube-system/kube-proxy-hz95q"
Jan 14 13:37:53.124295 kubelet[3442]: I0114 13:37:53.124273    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/35b2267f-b2df-4533-a731-2357292a8d77-kube-proxy\") pod \"kube-proxy-hz95q\" (UID: \"35b2267f-b2df-4533-a731-2357292a8d77\") " pod="kube-system/kube-proxy-hz95q"
Jan 14 13:37:53.124295 kubelet[3442]: I0114 13:37:53.124294    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/35b2267f-b2df-4533-a731-2357292a8d77-lib-modules\") pod \"kube-proxy-hz95q\" (UID: \"35b2267f-b2df-4533-a731-2357292a8d77\") " pod="kube-system/kube-proxy-hz95q"
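
The kube-proxy volumes above are the usual mix: two host paths (xtables-lock, lib-modules), the kube-proxy ConfigMap, and a projected service-account token volume (kube-api-access-9b2w9). Inside the container such a projected volume is mounted at the conventional /var/run/secrets/kubernetes.io/serviceaccount path (that path is a Kubernetes convention, not something this log states); a minimal sketch of reading it the way any in-cluster client would:

```go
// Minimal sketch of consuming a projected service-account volume from inside a
// pod; the mount path is the standard kubernetes.io/serviceaccount location.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/var/run/secrets/kubernetes.io/serviceaccount"
	for _, name := range []string{"token", "ca.crt", "namespace"} {
		b, err := os.ReadFile(filepath.Join(dir, name))
		if err != nil {
			fmt.Println(name, "not mounted:", err)
			continue
		}
		fmt.Printf("%s: %d bytes\n", name, len(b))
	}
}
```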
Jan 14 13:37:53.202160 kubelet[3442]: I0114 13:37:53.201235    3442 topology_manager.go:215] "Topology Admit Handler" podUID="fb80bf25-abf6-461a-9fa4-e92bd525fd8e" podNamespace="tigera-operator" podName="tigera-operator-c7ccbd65-ppnsp"
Jan 14 13:37:53.210183 systemd[1]: Created slice kubepods-besteffort-podfb80bf25_abf6_461a_9fa4_e92bd525fd8e.slice - libcontainer container kubepods-besteffort-podfb80bf25_abf6_461a_9fa4_e92bd525fd8e.slice.
Jan 14 13:37:53.225483 kubelet[3442]: I0114 13:37:53.225198    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/fb80bf25-abf6-461a-9fa4-e92bd525fd8e-var-lib-calico\") pod \"tigera-operator-c7ccbd65-ppnsp\" (UID: \"fb80bf25-abf6-461a-9fa4-e92bd525fd8e\") " pod="tigera-operator/tigera-operator-c7ccbd65-ppnsp"
Jan 14 13:37:53.225483 kubelet[3442]: I0114 13:37:53.225239    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf6ct\" (UniqueName: \"kubernetes.io/projected/fb80bf25-abf6-461a-9fa4-e92bd525fd8e-kube-api-access-zf6ct\") pod \"tigera-operator-c7ccbd65-ppnsp\" (UID: \"fb80bf25-abf6-461a-9fa4-e92bd525fd8e\") " pod="tigera-operator/tigera-operator-c7ccbd65-ppnsp"
Jan 14 13:37:53.391918 containerd[1759]: time="2025-01-14T13:37:53.391869305Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-hz95q,Uid:35b2267f-b2df-4533-a731-2357292a8d77,Namespace:kube-system,Attempt:0,}"
Jan 14 13:37:53.447645 containerd[1759]: time="2025-01-14T13:37:53.447399310Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 14 13:37:53.447645 containerd[1759]: time="2025-01-14T13:37:53.447446030Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 14 13:37:53.447645 containerd[1759]: time="2025-01-14T13:37:53.447460390Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 14 13:37:53.447645 containerd[1759]: time="2025-01-14T13:37:53.447523590Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 14 13:37:53.470135 systemd[1]: Started cri-containerd-f91016983eb27693bca86dd3b7b00b1d641cb39bfa78ccb20d3abdcfc3d172df.scope - libcontainer container f91016983eb27693bca86dd3b7b00b1d641cb39bfa78ccb20d3abdcfc3d172df.
Jan 14 13:37:53.490811 containerd[1759]: time="2025-01-14T13:37:53.490746105Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-hz95q,Uid:35b2267f-b2df-4533-a731-2357292a8d77,Namespace:kube-system,Attempt:0,} returns sandbox id \"f91016983eb27693bca86dd3b7b00b1d641cb39bfa78ccb20d3abdcfc3d172df\""
Jan 14 13:37:53.493914 containerd[1759]: time="2025-01-14T13:37:53.493824227Z" level=info msg="CreateContainer within sandbox \"f91016983eb27693bca86dd3b7b00b1d641cb39bfa78ccb20d3abdcfc3d172df\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Jan 14 13:37:53.519814 containerd[1759]: time="2025-01-14T13:37:53.519578208Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-c7ccbd65-ppnsp,Uid:fb80bf25-abf6-461a-9fa4-e92bd525fd8e,Namespace:tigera-operator,Attempt:0,}"
Jan 14 13:37:53.553676 containerd[1759]: time="2025-01-14T13:37:53.553637956Z" level=info msg="CreateContainer within sandbox \"f91016983eb27693bca86dd3b7b00b1d641cb39bfa78ccb20d3abdcfc3d172df\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"83399f88640b89f994b9f439f4bb55fa8ab9a5533f2950c3b9048ffd5b317123\""
Jan 14 13:37:53.554360 containerd[1759]: time="2025-01-14T13:37:53.554330436Z" level=info msg="StartContainer for \"83399f88640b89f994b9f439f4bb55fa8ab9a5533f2950c3b9048ffd5b317123\""
Jan 14 13:37:53.580256 systemd[1]: Started cri-containerd-83399f88640b89f994b9f439f4bb55fa8ab9a5533f2950c3b9048ffd5b317123.scope - libcontainer container 83399f88640b89f994b9f439f4bb55fa8ab9a5533f2950c3b9048ffd5b317123.
Jan 14 13:37:53.587793 containerd[1759]: time="2025-01-14T13:37:53.587585783Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 14 13:37:53.587793 containerd[1759]: time="2025-01-14T13:37:53.587671223Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 14 13:37:53.587793 containerd[1759]: time="2025-01-14T13:37:53.587684543Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 14 13:37:53.588152 containerd[1759]: time="2025-01-14T13:37:53.587867143Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 14 13:37:53.605227 systemd[1]: Started cri-containerd-15f310c339de143d205be1200490f94f25d72857e04544fca3b48de89a4b5d8e.scope - libcontainer container 15f310c339de143d205be1200490f94f25d72857e04544fca3b48de89a4b5d8e.
Jan 14 13:37:53.621317 containerd[1759]: time="2025-01-14T13:37:53.619830049Z" level=info msg="StartContainer for \"83399f88640b89f994b9f439f4bb55fa8ab9a5533f2950c3b9048ffd5b317123\" returns successfully"
Jan 14 13:37:53.652516 containerd[1759]: time="2025-01-14T13:37:53.651971035Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-c7ccbd65-ppnsp,Uid:fb80bf25-abf6-461a-9fa4-e92bd525fd8e,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"15f310c339de143d205be1200490f94f25d72857e04544fca3b48de89a4b5d8e\""
Jan 14 13:37:53.656927 containerd[1759]: time="2025-01-14T13:37:53.656765039Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\""
Jan 14 13:37:55.208844 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2737600517.mount: Deactivated successfully.
Jan 14 13:37:55.542400 containerd[1759]: time="2025-01-14T13:37:55.542277799Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:37:55.545064 containerd[1759]: time="2025-01-14T13:37:55.545012882Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.2: active requests=0, bytes read=19125948"
Jan 14 13:37:55.553052 containerd[1759]: time="2025-01-14T13:37:55.552983168Z" level=info msg="ImageCreate event name:\"sha256:30d521e4e84764b396aacbb2a373ca7a573f84571e3955b34329652acccfb73c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:37:55.557269 containerd[1759]: time="2025-01-14T13:37:55.557222011Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:37:55.558334 containerd[1759]: time="2025-01-14T13:37:55.557794692Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.2\" with image id \"sha256:30d521e4e84764b396aacbb2a373ca7a573f84571e3955b34329652acccfb73c\", repo tag \"quay.io/tigera/operator:v1.36.2\", repo digest \"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\", size \"19120155\" in 1.899903372s"
Jan 14 13:37:55.558334 containerd[1759]: time="2025-01-14T13:37:55.557828572Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\" returns image reference \"sha256:30d521e4e84764b396aacbb2a373ca7a573f84571e3955b34329652acccfb73c\""
Jan 14 13:37:55.559530 containerd[1759]: time="2025-01-14T13:37:55.559505093Z" level=info msg="CreateContainer within sandbox \"15f310c339de143d205be1200490f94f25d72857e04544fca3b48de89a4b5d8e\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Jan 14 13:37:55.605594 containerd[1759]: time="2025-01-14T13:37:55.605483570Z" level=info msg="CreateContainer within sandbox \"15f310c339de143d205be1200490f94f25d72857e04544fca3b48de89a4b5d8e\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"46bbc3bfa4db79e69b72c676b2438a9409d60c42586d38e77b4d7e0ea1277c8f\""
Jan 14 13:37:55.606238 containerd[1759]: time="2025-01-14T13:37:55.606104251Z" level=info msg="StartContainer for \"46bbc3bfa4db79e69b72c676b2438a9409d60c42586d38e77b4d7e0ea1277c8f\""
Jan 14 13:37:55.633148 systemd[1]: Started cri-containerd-46bbc3bfa4db79e69b72c676b2438a9409d60c42586d38e77b4d7e0ea1277c8f.scope - libcontainer container 46bbc3bfa4db79e69b72c676b2438a9409d60c42586d38e77b4d7e0ea1277c8f.
Jan 14 13:37:55.658677 containerd[1759]: time="2025-01-14T13:37:55.658608693Z" level=info msg="StartContainer for \"46bbc3bfa4db79e69b72c676b2438a9409d60c42586d38e77b4d7e0ea1277c8f\" returns successfully"
Jan 14 13:37:55.844373 kubelet[3442]: I0114 13:37:55.844261    3442 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-proxy-hz95q" podStartSLOduration=2.8442211630000003 podStartE2EDuration="2.844221163s" podCreationTimestamp="2025-01-14 13:37:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-14 13:37:53.840271307 +0000 UTC m=+16.194216121" watchObservedRunningTime="2025-01-14 13:37:55.844221163 +0000 UTC m=+18.198165937"
Jan 14 13:37:55.845612 kubelet[3442]: I0114 13:37:55.845566    3442 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="tigera-operator/tigera-operator-c7ccbd65-ppnsp" podStartSLOduration=0.942335309 podStartE2EDuration="2.845528004s" podCreationTimestamp="2025-01-14 13:37:53 +0000 UTC" firstStartedPulling="2025-01-14 13:37:53.654941837 +0000 UTC m=+16.008886651" lastFinishedPulling="2025-01-14 13:37:55.558134532 +0000 UTC m=+17.912079346" observedRunningTime="2025-01-14 13:37:55.845521804 +0000 UTC m=+18.199466618" watchObservedRunningTime="2025-01-14 13:37:55.845528004 +0000 UTC m=+18.199472818"
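
The two "Observed pod startup duration" records above make the bookkeeping visible: kube-proxy had nothing to pull (zero-valued pull timestamps), so its SLO duration equals the end-to-end duration, while for tigera-operator the 2.845528004s end-to-end time minus the ~1.903s spent pulling quay.io/tigera/operator:v1.36.2 leaves the 0.942335309s SLO duration. A small check of that arithmetic using only the timestamps from the log:

```go
// Recompute the tigera-operator startup figures from the timestamps logged above.
package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	firstPull := mustParse("2025-01-14 13:37:53.654941837 +0000 UTC")
	lastPull := mustParse("2025-01-14 13:37:55.558134532 +0000 UTC")

	pull := lastPull.Sub(firstPull)
	e2e := 2845528004 * time.Nanosecond // podStartE2EDuration=2.845528004s from the log

	fmt.Println("pull time:", pull)        // ~1.903192695s
	fmt.Println("SLO duration:", e2e-pull) // ~0.942335309s, matching podStartSLOduration
}
```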
Jan 14 13:37:59.265054 kubelet[3442]: I0114 13:37:59.263162    3442 topology_manager.go:215] "Topology Admit Handler" podUID="e54f8306-d45e-4df6-8b63-054930d85a11" podNamespace="calico-system" podName="calico-typha-dc9f4f78d-nfd7p"
Jan 14 13:37:59.271901 systemd[1]: Created slice kubepods-besteffort-pode54f8306_d45e_4df6_8b63_054930d85a11.slice - libcontainer container kubepods-besteffort-pode54f8306_d45e_4df6_8b63_054930d85a11.slice.
Jan 14 13:37:59.363519 kubelet[3442]: I0114 13:37:59.363173    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt9bg\" (UniqueName: \"kubernetes.io/projected/e54f8306-d45e-4df6-8b63-054930d85a11-kube-api-access-bt9bg\") pod \"calico-typha-dc9f4f78d-nfd7p\" (UID: \"e54f8306-d45e-4df6-8b63-054930d85a11\") " pod="calico-system/calico-typha-dc9f4f78d-nfd7p"
Jan 14 13:37:59.363519 kubelet[3442]: I0114 13:37:59.363217    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e54f8306-d45e-4df6-8b63-054930d85a11-tigera-ca-bundle\") pod \"calico-typha-dc9f4f78d-nfd7p\" (UID: \"e54f8306-d45e-4df6-8b63-054930d85a11\") " pod="calico-system/calico-typha-dc9f4f78d-nfd7p"
Jan 14 13:37:59.363519 kubelet[3442]: I0114 13:37:59.363239    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/e54f8306-d45e-4df6-8b63-054930d85a11-typha-certs\") pod \"calico-typha-dc9f4f78d-nfd7p\" (UID: \"e54f8306-d45e-4df6-8b63-054930d85a11\") " pod="calico-system/calico-typha-dc9f4f78d-nfd7p"
Jan 14 13:37:59.363519 kubelet[3442]: I0114 13:37:59.363405    3442 topology_manager.go:215] "Topology Admit Handler" podUID="0afbb13e-6495-4a3d-bde4-b65c77a5a21f" podNamespace="calico-system" podName="calico-node-mkcgd"
Jan 14 13:37:59.371833 systemd[1]: Created slice kubepods-besteffort-pod0afbb13e_6495_4a3d_bde4_b65c77a5a21f.slice - libcontainer container kubepods-besteffort-pod0afbb13e_6495_4a3d_bde4_b65c77a5a21f.slice.
Jan 14 13:37:59.463530 kubelet[3442]: I0114 13:37:59.463478    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/0afbb13e-6495-4a3d-bde4-b65c77a5a21f-policysync\") pod \"calico-node-mkcgd\" (UID: \"0afbb13e-6495-4a3d-bde4-b65c77a5a21f\") " pod="calico-system/calico-node-mkcgd"
Jan 14 13:37:59.463676 kubelet[3442]: I0114 13:37:59.463545    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/0afbb13e-6495-4a3d-bde4-b65c77a5a21f-cni-log-dir\") pod \"calico-node-mkcgd\" (UID: \"0afbb13e-6495-4a3d-bde4-b65c77a5a21f\") " pod="calico-system/calico-node-mkcgd"
Jan 14 13:37:59.463676 kubelet[3442]: I0114 13:37:59.463615    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0afbb13e-6495-4a3d-bde4-b65c77a5a21f-lib-modules\") pod \"calico-node-mkcgd\" (UID: \"0afbb13e-6495-4a3d-bde4-b65c77a5a21f\") " pod="calico-system/calico-node-mkcgd"
Jan 14 13:37:59.463676 kubelet[3442]: I0114 13:37:59.463639    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0afbb13e-6495-4a3d-bde4-b65c77a5a21f-tigera-ca-bundle\") pod \"calico-node-mkcgd\" (UID: \"0afbb13e-6495-4a3d-bde4-b65c77a5a21f\") " pod="calico-system/calico-node-mkcgd"
Jan 14 13:37:59.463749 kubelet[3442]: I0114 13:37:59.463690    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6sxm\" (UniqueName: \"kubernetes.io/projected/0afbb13e-6495-4a3d-bde4-b65c77a5a21f-kube-api-access-v6sxm\") pod \"calico-node-mkcgd\" (UID: \"0afbb13e-6495-4a3d-bde4-b65c77a5a21f\") " pod="calico-system/calico-node-mkcgd"
Jan 14 13:37:59.463749 kubelet[3442]: I0114 13:37:59.463712    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/0afbb13e-6495-4a3d-bde4-b65c77a5a21f-var-run-calico\") pod \"calico-node-mkcgd\" (UID: \"0afbb13e-6495-4a3d-bde4-b65c77a5a21f\") " pod="calico-system/calico-node-mkcgd"
Jan 14 13:37:59.463749 kubelet[3442]: I0114 13:37:59.463733    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/0afbb13e-6495-4a3d-bde4-b65c77a5a21f-xtables-lock\") pod \"calico-node-mkcgd\" (UID: \"0afbb13e-6495-4a3d-bde4-b65c77a5a21f\") " pod="calico-system/calico-node-mkcgd"
Jan 14 13:37:59.463808 kubelet[3442]: I0114 13:37:59.463755    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/0afbb13e-6495-4a3d-bde4-b65c77a5a21f-flexvol-driver-host\") pod \"calico-node-mkcgd\" (UID: \"0afbb13e-6495-4a3d-bde4-b65c77a5a21f\") " pod="calico-system/calico-node-mkcgd"
Jan 14 13:37:59.463808 kubelet[3442]: I0114 13:37:59.463774    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/0afbb13e-6495-4a3d-bde4-b65c77a5a21f-node-certs\") pod \"calico-node-mkcgd\" (UID: \"0afbb13e-6495-4a3d-bde4-b65c77a5a21f\") " pod="calico-system/calico-node-mkcgd"
Jan 14 13:37:59.463808 kubelet[3442]: I0114 13:37:59.463793    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/0afbb13e-6495-4a3d-bde4-b65c77a5a21f-cni-bin-dir\") pod \"calico-node-mkcgd\" (UID: \"0afbb13e-6495-4a3d-bde4-b65c77a5a21f\") " pod="calico-system/calico-node-mkcgd"
Jan 14 13:37:59.463868 kubelet[3442]: I0114 13:37:59.463816    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/0afbb13e-6495-4a3d-bde4-b65c77a5a21f-cni-net-dir\") pod \"calico-node-mkcgd\" (UID: \"0afbb13e-6495-4a3d-bde4-b65c77a5a21f\") " pod="calico-system/calico-node-mkcgd"
Jan 14 13:37:59.463868 kubelet[3442]: I0114 13:37:59.463846    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/0afbb13e-6495-4a3d-bde4-b65c77a5a21f-var-lib-calico\") pod \"calico-node-mkcgd\" (UID: \"0afbb13e-6495-4a3d-bde4-b65c77a5a21f\") " pod="calico-system/calico-node-mkcgd"
Jan 14 13:37:59.505107 kubelet[3442]: I0114 13:37:59.504699    3442 topology_manager.go:215] "Topology Admit Handler" podUID="af70d9ef-1b20-4f9f-93e1-f55f680c58b4" podNamespace="calico-system" podName="csi-node-driver-7bk8l"
Jan 14 13:37:59.505107 kubelet[3442]: E0114 13:37:59.504975    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7bk8l" podUID="af70d9ef-1b20-4f9f-93e1-f55f680c58b4"
Jan 14 13:37:59.565615 kubelet[3442]: I0114 13:37:59.564561    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/af70d9ef-1b20-4f9f-93e1-f55f680c58b4-varrun\") pod \"csi-node-driver-7bk8l\" (UID: \"af70d9ef-1b20-4f9f-93e1-f55f680c58b4\") " pod="calico-system/csi-node-driver-7bk8l"
Jan 14 13:37:59.565615 kubelet[3442]: I0114 13:37:59.564608    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af70d9ef-1b20-4f9f-93e1-f55f680c58b4-kubelet-dir\") pod \"csi-node-driver-7bk8l\" (UID: \"af70d9ef-1b20-4f9f-93e1-f55f680c58b4\") " pod="calico-system/csi-node-driver-7bk8l"
Jan 14 13:37:59.565615 kubelet[3442]: I0114 13:37:59.564662    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/af70d9ef-1b20-4f9f-93e1-f55f680c58b4-socket-dir\") pod \"csi-node-driver-7bk8l\" (UID: \"af70d9ef-1b20-4f9f-93e1-f55f680c58b4\") " pod="calico-system/csi-node-driver-7bk8l"
Jan 14 13:37:59.565615 kubelet[3442]: I0114 13:37:59.564733    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ktpt\" (UniqueName: \"kubernetes.io/projected/af70d9ef-1b20-4f9f-93e1-f55f680c58b4-kube-api-access-9ktpt\") pod \"csi-node-driver-7bk8l\" (UID: \"af70d9ef-1b20-4f9f-93e1-f55f680c58b4\") " pod="calico-system/csi-node-driver-7bk8l"
Jan 14 13:37:59.565615 kubelet[3442]: I0114 13:37:59.564755    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/af70d9ef-1b20-4f9f-93e1-f55f680c58b4-registration-dir\") pod \"csi-node-driver-7bk8l\" (UID: \"af70d9ef-1b20-4f9f-93e1-f55f680c58b4\") " pod="calico-system/csi-node-driver-7bk8l"
Jan 14 13:37:59.569156 kubelet[3442]: E0114 13:37:59.569135    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.569380 kubelet[3442]: W0114 13:37:59.569363    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.569449 kubelet[3442]: E0114 13:37:59.569435    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.573665 kubelet[3442]: E0114 13:37:59.573520    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.573665 kubelet[3442]: W0114 13:37:59.573546    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.573665 kubelet[3442]: E0114 13:37:59.573569    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.575115 kubelet[3442]: E0114 13:37:59.575089    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.575191 kubelet[3442]: W0114 13:37:59.575109    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.575191 kubelet[3442]: E0114 13:37:59.575147    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.579633 kubelet[3442]: E0114 13:37:59.579606    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.579633 kubelet[3442]: W0114 13:37:59.579624    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.579779 kubelet[3442]: E0114 13:37:59.579766    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.579839 containerd[1759]: time="2025-01-14T13:37:59.579801935Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-dc9f4f78d-nfd7p,Uid:e54f8306-d45e-4df6-8b63-054930d85a11,Namespace:calico-system,Attempt:0,}"
Jan 14 13:37:59.580829 kubelet[3442]: E0114 13:37:59.580809    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.580829 kubelet[3442]: W0114 13:37:59.580826    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.580936 kubelet[3442]: E0114 13:37:59.580916    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.582079 kubelet[3442]: E0114 13:37:59.582051    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.582079 kubelet[3442]: W0114 13:37:59.582072    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.582318 kubelet[3442]: E0114 13:37:59.582194    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.582374 kubelet[3442]: E0114 13:37:59.582365    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.582409 kubelet[3442]: W0114 13:37:59.582378    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.582487 kubelet[3442]: E0114 13:37:59.582467    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.582654 kubelet[3442]: E0114 13:37:59.582633    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.582654 kubelet[3442]: W0114 13:37:59.582648    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.582770 kubelet[3442]: E0114 13:37:59.582748    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.582975 kubelet[3442]: E0114 13:37:59.582928    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.582975 kubelet[3442]: W0114 13:37:59.582963    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.583103 kubelet[3442]: E0114 13:37:59.583035    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.583605 kubelet[3442]: E0114 13:37:59.583585    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.583605 kubelet[3442]: W0114 13:37:59.583601    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.583705 kubelet[3442]: E0114 13:37:59.583687    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.584982 kubelet[3442]: E0114 13:37:59.584889    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.584982 kubelet[3442]: W0114 13:37:59.584905    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.585314 kubelet[3442]: E0114 13:37:59.585212    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.585314 kubelet[3442]: W0114 13:37:59.585224    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.585892 kubelet[3442]: E0114 13:37:59.585880    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.585953 kubelet[3442]: W0114 13:37:59.585943    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.587150 kubelet[3442]: E0114 13:37:59.587133    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.587236 kubelet[3442]: W0114 13:37:59.587224    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.587646 kubelet[3442]: E0114 13:37:59.587632    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.587748 kubelet[3442]: W0114 13:37:59.587729    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.588074 kubelet[3442]: E0114 13:37:59.588043    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.588152 kubelet[3442]: W0114 13:37:59.588141    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.588210 kubelet[3442]: E0114 13:37:59.588201    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.588557 kubelet[3442]: E0114 13:37:59.588544    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.588917 kubelet[3442]: W0114 13:37:59.588901    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.589611 kubelet[3442]: E0114 13:37:59.589537    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.589875 kubelet[3442]: E0114 13:37:59.589762    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.589875 kubelet[3442]: W0114 13:37:59.589772    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.589875 kubelet[3442]: E0114 13:37:59.589785    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.592050 kubelet[3442]: E0114 13:37:59.590924    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.592050 kubelet[3442]: E0114 13:37:59.590951    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.592050 kubelet[3442]: E0114 13:37:59.590976    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.592050 kubelet[3442]: E0114 13:37:59.591018    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.592050 kubelet[3442]: E0114 13:37:59.591030    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.593922 kubelet[3442]: E0114 13:37:59.593807    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.593922 kubelet[3442]: W0114 13:37:59.593821    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.593922 kubelet[3442]: E0114 13:37:59.593844    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.594254 kubelet[3442]: E0114 13:37:59.594184    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.594254 kubelet[3442]: W0114 13:37:59.594196    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.594254 kubelet[3442]: E0114 13:37:59.594216    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.594537 kubelet[3442]: E0114 13:37:59.594475    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.594537 kubelet[3442]: W0114 13:37:59.594485    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.594537 kubelet[3442]: E0114 13:37:59.594503    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.594866 kubelet[3442]: E0114 13:37:59.594793    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.594866 kubelet[3442]: W0114 13:37:59.594807    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.594866 kubelet[3442]: E0114 13:37:59.594845    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.595215 kubelet[3442]: E0114 13:37:59.595084    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.595215 kubelet[3442]: W0114 13:37:59.595095    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.595215 kubelet[3442]: E0114 13:37:59.595131    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.595455 kubelet[3442]: E0114 13:37:59.595358    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.595455 kubelet[3442]: W0114 13:37:59.595367    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.595455 kubelet[3442]: E0114 13:37:59.595419    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.595747 kubelet[3442]: E0114 13:37:59.595695    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.595747 kubelet[3442]: W0114 13:37:59.595706    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.595827 kubelet[3442]: E0114 13:37:59.595742    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.596099 kubelet[3442]: E0114 13:37:59.595977    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.596099 kubelet[3442]: W0114 13:37:59.596014    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.596099 kubelet[3442]: E0114 13:37:59.596071    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.596467 kubelet[3442]: E0114 13:37:59.596335    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.596467 kubelet[3442]: W0114 13:37:59.596345    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.596639 kubelet[3442]: E0114 13:37:59.596575    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.596639 kubelet[3442]: W0114 13:37:59.596584    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.596856 kubelet[3442]: E0114 13:37:59.596781    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.596856 kubelet[3442]: W0114 13:37:59.596790    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.597106 kubelet[3442]: E0114 13:37:59.597025    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.597106 kubelet[3442]: W0114 13:37:59.597037    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.597106 kubelet[3442]: E0114 13:37:59.597087    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.597211 kubelet[3442]: E0114 13:37:59.597132    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.597211 kubelet[3442]: E0114 13:37:59.597155    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.597211 kubelet[3442]: E0114 13:37:59.597166    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.597368 kubelet[3442]: E0114 13:37:59.597310    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.597368 kubelet[3442]: W0114 13:37:59.597319    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.597493 kubelet[3442]: E0114 13:37:59.597437    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.597888 kubelet[3442]: E0114 13:37:59.597877    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.598015 kubelet[3442]: W0114 13:37:59.597936    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.598015 kubelet[3442]: E0114 13:37:59.597956    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.598907 kubelet[3442]: E0114 13:37:59.598875    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.598907 kubelet[3442]: W0114 13:37:59.598887    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.598907 kubelet[3442]: E0114 13:37:59.598907    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.599182 kubelet[3442]: E0114 13:37:59.599172    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.599182 kubelet[3442]: W0114 13:37:59.599181    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.599266 kubelet[3442]: E0114 13:37:59.599193    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.599361 kubelet[3442]: E0114 13:37:59.599339    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.599361 kubelet[3442]: W0114 13:37:59.599354    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.599417 kubelet[3442]: E0114 13:37:59.599365    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.666211 kubelet[3442]: E0114 13:37:59.666145    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.666211 kubelet[3442]: W0114 13:37:59.666166    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.666211 kubelet[3442]: E0114 13:37:59.666186    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.666600 kubelet[3442]: E0114 13:37:59.666358    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.666600 kubelet[3442]: W0114 13:37:59.666366    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.666600 kubelet[3442]: E0114 13:37:59.666383    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.666804 kubelet[3442]: E0114 13:37:59.666685    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.666804 kubelet[3442]: W0114 13:37:59.666696    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.666973 kubelet[3442]: E0114 13:37:59.666909    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.667298 kubelet[3442]: E0114 13:37:59.667226    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.667298 kubelet[3442]: W0114 13:37:59.667239    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.667298 kubelet[3442]: E0114 13:37:59.667260    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.667656 kubelet[3442]: E0114 13:37:59.667535    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.667656 kubelet[3442]: W0114 13:37:59.667547    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.667656 kubelet[3442]: E0114 13:37:59.667566    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.668036 kubelet[3442]: E0114 13:37:59.667957    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.668036 kubelet[3442]: W0114 13:37:59.667978    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.668109 kubelet[3442]: E0114 13:37:59.668070    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.668380 kubelet[3442]: E0114 13:37:59.668288    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.668380 kubelet[3442]: W0114 13:37:59.668299    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.668380 kubelet[3442]: E0114 13:37:59.668327    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.668546 kubelet[3442]: E0114 13:37:59.668534    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.668724 kubelet[3442]: W0114 13:37:59.668639    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.668724 kubelet[3442]: E0114 13:37:59.668669    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.668976 kubelet[3442]: E0114 13:37:59.668916    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.668976 kubelet[3442]: W0114 13:37:59.668927    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.668976 kubelet[3442]: E0114 13:37:59.668945    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.669343 kubelet[3442]: E0114 13:37:59.669250    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.669343 kubelet[3442]: W0114 13:37:59.669264    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.669343 kubelet[3442]: E0114 13:37:59.669285    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.669918 kubelet[3442]: E0114 13:37:59.669834    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.669918 kubelet[3442]: W0114 13:37:59.669851    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.669918 kubelet[3442]: E0114 13:37:59.669899    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.670526 kubelet[3442]: E0114 13:37:59.670344    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.670526 kubelet[3442]: W0114 13:37:59.670360    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.670526 kubelet[3442]: E0114 13:37:59.670407    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.671273 kubelet[3442]: E0114 13:37:59.671200    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.671273 kubelet[3442]: W0114 13:37:59.671220    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.671778 kubelet[3442]: E0114 13:37:59.671604    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.672395 kubelet[3442]: E0114 13:37:59.671935    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.672395 kubelet[3442]: W0114 13:37:59.671948    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.672395 kubelet[3442]: E0114 13:37:59.671964    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.672566 kubelet[3442]: E0114 13:37:59.672554    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.673100 kubelet[3442]: W0114 13:37:59.672937    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.673100 kubelet[3442]: E0114 13:37:59.673045    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.673397 kubelet[3442]: E0114 13:37:59.673314    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.673397 kubelet[3442]: W0114 13:37:59.673329    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.673397 kubelet[3442]: E0114 13:37:59.673360    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.673659 kubelet[3442]: E0114 13:37:59.673581    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.673659 kubelet[3442]: W0114 13:37:59.673595    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.673659 kubelet[3442]: E0114 13:37:59.673626    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.673949 kubelet[3442]: E0114 13:37:59.673842    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.673949 kubelet[3442]: W0114 13:37:59.673854    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.673949 kubelet[3442]: E0114 13:37:59.673873    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.674190 kubelet[3442]: E0114 13:37:59.674176    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.674350 kubelet[3442]: W0114 13:37:59.674235    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.674350 kubelet[3442]: E0114 13:37:59.674262    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.674535 kubelet[3442]: E0114 13:37:59.674523    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.675196 kubelet[3442]: W0114 13:37:59.674579    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.675196 kubelet[3442]: E0114 13:37:59.674605    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.676204 kubelet[3442]: E0114 13:37:59.676147    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.676204 kubelet[3442]: W0114 13:37:59.676162    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.676204 kubelet[3442]: E0114 13:37:59.676192    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.676624 kubelet[3442]: E0114 13:37:59.676384    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.676624 kubelet[3442]: W0114 13:37:59.676400    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.676624 kubelet[3442]: E0114 13:37:59.676411    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.676735 kubelet[3442]: E0114 13:37:59.676708    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.676735 kubelet[3442]: W0114 13:37:59.676718    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.676735 kubelet[3442]: E0114 13:37:59.676730    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.677693 kubelet[3442]: E0114 13:37:59.677667    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.677693 kubelet[3442]: W0114 13:37:59.677684    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.677765 kubelet[3442]: E0114 13:37:59.677699    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.677957 containerd[1759]: time="2025-01-14T13:37:59.677914174Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-mkcgd,Uid:0afbb13e-6495-4a3d-bde4-b65c77a5a21f,Namespace:calico-system,Attempt:0,}"
Jan 14 13:37:59.678363 kubelet[3442]: E0114 13:37:59.678336    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.678363 kubelet[3442]: W0114 13:37:59.678353    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.678539 kubelet[3442]: E0114 13:37:59.678477    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.690695 kubelet[3442]: E0114 13:37:59.690662    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:37:59.690695 kubelet[3442]: W0114 13:37:59.690685    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:37:59.690695 kubelet[3442]: E0114 13:37:59.690704    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 14 13:37:59.887041 containerd[1759]: time="2025-01-14T13:37:59.886941823Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 14 13:37:59.887041 containerd[1759]: time="2025-01-14T13:37:59.887018303Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 14 13:37:59.887041 containerd[1759]: time="2025-01-14T13:37:59.887061423Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 14 13:37:59.887041 containerd[1759]: time="2025-01-14T13:37:59.887145263Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 14 13:37:59.903163 systemd[1]: Started cri-containerd-eaaaa7b73f6a02173fb913ab73136d1605ef9f14d72a2706ed960db413cd81aa.scope - libcontainer container eaaaa7b73f6a02173fb913ab73136d1605ef9f14d72a2706ed960db413cd81aa.
Jan 14 13:37:59.934917 containerd[1759]: time="2025-01-14T13:37:59.934867581Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-dc9f4f78d-nfd7p,Uid:e54f8306-d45e-4df6-8b63-054930d85a11,Namespace:calico-system,Attempt:0,} returns sandbox id \"eaaaa7b73f6a02173fb913ab73136d1605ef9f14d72a2706ed960db413cd81aa\""
Jan 14 13:37:59.937075 containerd[1759]: time="2025-01-14T13:37:59.936659063Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\""
Jan 14 13:38:00.037863 containerd[1759]: time="2025-01-14T13:38:00.037472464Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 14 13:38:00.037863 containerd[1759]: time="2025-01-14T13:38:00.037521064Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 14 13:38:00.037863 containerd[1759]: time="2025-01-14T13:38:00.037532704Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 14 13:38:00.037863 containerd[1759]: time="2025-01-14T13:38:00.037600584Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 14 13:38:00.055193 systemd[1]: Started cri-containerd-2bfbdef55f246b1c65be1142c8ccc79669bcb65247943c9f2ada4e457c3083de.scope - libcontainer container 2bfbdef55f246b1c65be1142c8ccc79669bcb65247943c9f2ada4e457c3083de.
Jan 14 13:38:00.077635 containerd[1759]: time="2025-01-14T13:38:00.077596576Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-mkcgd,Uid:0afbb13e-6495-4a3d-bde4-b65c77a5a21f,Namespace:calico-system,Attempt:0,} returns sandbox id \"2bfbdef55f246b1c65be1142c8ccc79669bcb65247943c9f2ada4e457c3083de\""
Jan 14 13:38:01.746112 kubelet[3442]: E0114 13:38:01.744886    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7bk8l" podUID="af70d9ef-1b20-4f9f-93e1-f55f680c58b4"
Jan 14 13:38:03.745435 kubelet[3442]: E0114 13:38:03.745137    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7bk8l" podUID="af70d9ef-1b20-4f9f-93e1-f55f680c58b4"
Jan 14 13:38:05.505761 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3668905820.mount: Deactivated successfully.
Jan 14 13:38:05.745505 kubelet[3442]: E0114 13:38:05.745193    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7bk8l" podUID="af70d9ef-1b20-4f9f-93e1-f55f680c58b4"
Jan 14 13:38:06.376012 containerd[1759]: time="2025-01-14T13:38:06.374231221Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:38:06.378899 containerd[1759]: time="2025-01-14T13:38:06.378847541Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.1: active requests=0, bytes read=29231308"
Jan 14 13:38:06.438159 containerd[1759]: time="2025-01-14T13:38:06.438099057Z" level=info msg="ImageCreate event name:\"sha256:1d1fc316829ae1650b0b1629b54232520f297e7c3b1444eecd290ae088902a28\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:38:06.442957 containerd[1759]: time="2025-01-14T13:38:06.442916097Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:38:06.443566 containerd[1759]: time="2025-01-14T13:38:06.443529337Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.1\" with image id \"sha256:1d1fc316829ae1650b0b1629b54232520f297e7c3b1444eecd290ae088902a28\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\", size \"29231162\" in 6.506837754s"
Jan 14 13:38:06.443566 containerd[1759]: time="2025-01-14T13:38:06.443561897Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\" returns image reference \"sha256:1d1fc316829ae1650b0b1629b54232520f297e7c3b1444eecd290ae088902a28\""
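The two lines above report both the transferred size (29231162 bytes) and the elapsed time (6.506837754s) for the typha image pull, so the effective pull rate can be derived directly from the log; a quick arithmetic sketch:

    # pull_rate.py - effective pull rate for the typha image, using only numbers from the log line above.
    size_bytes = 29_231_162       # "size" reported in the Pulled image message
    elapsed_s = 6.506837754       # pull duration from the same message
    print(f"{size_bytes / elapsed_s / 1e6:.2f} MB/s")  # ~4.49 MB/s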
Jan 14 13:38:06.444137 containerd[1759]: time="2025-01-14T13:38:06.444101697Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\""
Jan 14 13:38:06.454901 containerd[1759]: time="2025-01-14T13:38:06.454860056Z" level=info msg="CreateContainer within sandbox \"eaaaa7b73f6a02173fb913ab73136d1605ef9f14d72a2706ed960db413cd81aa\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Jan 14 13:38:06.831092 containerd[1759]: time="2025-01-14T13:38:06.830994992Z" level=info msg="CreateContainer within sandbox \"eaaaa7b73f6a02173fb913ab73136d1605ef9f14d72a2706ed960db413cd81aa\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"ba793e0beaf8aad7093449386fcb7fcea9acc271f08ec88bebf137102c3e20ff\""
Jan 14 13:38:06.831919 containerd[1759]: time="2025-01-14T13:38:06.831895592Z" level=info msg="StartContainer for \"ba793e0beaf8aad7093449386fcb7fcea9acc271f08ec88bebf137102c3e20ff\""
Jan 14 13:38:06.863130 systemd[1]: Started cri-containerd-ba793e0beaf8aad7093449386fcb7fcea9acc271f08ec88bebf137102c3e20ff.scope - libcontainer container ba793e0beaf8aad7093449386fcb7fcea9acc271f08ec88bebf137102c3e20ff.
Jan 14 13:38:06.930854 containerd[1759]: time="2025-01-14T13:38:06.930820345Z" level=info msg="StartContainer for \"ba793e0beaf8aad7093449386fcb7fcea9acc271f08ec88bebf137102c3e20ff\" returns successfully"
Jan 14 13:38:07.745813 kubelet[3442]: E0114 13:38:07.744602    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7bk8l" podUID="af70d9ef-1b20-4f9f-93e1-f55f680c58b4"
Jan 14 13:38:07.877941 kubelet[3442]: I0114 13:38:07.877528    3442 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-typha-dc9f4f78d-nfd7p" podStartSLOduration=2.3693110219999998 podStartE2EDuration="8.876870296s" podCreationTimestamp="2025-01-14 13:37:59 +0000 UTC" firstStartedPulling="2025-01-14 13:37:59.936349063 +0000 UTC m=+22.290293877" lastFinishedPulling="2025-01-14 13:38:06.443908297 +0000 UTC m=+28.797853151" observedRunningTime="2025-01-14 13:38:07.876171135 +0000 UTC m=+30.230115909" watchObservedRunningTime="2025-01-14 13:38:07.876870296 +0000 UTC m=+30.230815110"
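The startup-latency record above can be re-derived from its own fields: podStartE2EDuration is watchObservedRunningTime (13:38:07.876870296) minus podCreationTimestamp (13:37:59), and podStartSLOduration is that E2E duration minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A small sketch of the arithmetic using the m=+ monotonic offsets printed in the line:

    # startup_latency_check.py - recompute the durations from the pod_startup_latency_tracker line above.
    e2e = 8.876870296                       # watchObservedRunningTime - podCreationTimestamp
    pull = 28.797853151 - 22.290293877      # lastFinishedPulling - firstStartedPulling (m=+ offsets)
    print(round(e2e - pull, 9))             # 2.369311022, i.e. the reported podStartSLOduration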
Jan 14 13:38:07.911089 kubelet[3442]: E0114 13:38:07.910899    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:38:07.911089 kubelet[3442]: W0114 13:38:07.910922    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:38:07.911089 kubelet[3442]: E0114 13:38:07.910943    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[the three kubelet lines above (driver-call.go:262, driver-call.go:149, plugins.go:730) repeat a further 32 times between 13:38:07.911 and 13:38:07.933; identical messages omitted]
Jan 14 13:38:08.863563 kubelet[3442]: I0114 13:38:08.863531    3442 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 14 13:38:08.923290 kubelet[3442]: E0114 13:38:08.923257    3442 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 14 13:38:08.923290 kubelet[3442]: W0114 13:38:08.923283    3442 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 14 13:38:08.923451 kubelet[3442]: E0114 13:38:08.923309    3442 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[the same three kubelet lines repeat a further 32 times between 13:38:08.923 and 13:38:08.939; identical messages omitted]
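Each FlexVolume triplet above records the same condition three ways: kubelet's dynamic plugin prober finds the nodeagent~uds driver directory under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/, its call to the driver's init command fails because the uds executable is not present ("executable file not found in $PATH"), and the resulting empty output cannot be unmarshalled as JSON. This is expected noise until Calico's flexvol-driver init container (started at 13:38:09 below) installs the binary. A minimal sketch of the check kubelet is effectively making, using the path from the log:

    # flexvol_check.py - sketch: is the FlexVolume driver binary named in the log present and executable?
    import os

    DRIVER = "/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds"
    if os.path.isfile(DRIVER) and os.access(DRIVER, os.X_OK):
        print("driver present:", DRIVER)
    else:
        print("driver missing or not executable:", DRIVER,
              "- kubelet's init probe will keep failing until the flexvol-driver container installs it")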
Jan 14 13:38:09.021937 containerd[1759]: time="2025-01-14T13:38:09.021405869Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:38:09.023861 containerd[1759]: time="2025-01-14T13:38:09.023768711Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1: active requests=0, bytes read=5117811"
Jan 14 13:38:09.087353 containerd[1759]: time="2025-01-14T13:38:09.087277436Z" level=info msg="ImageCreate event name:\"sha256:ece9bca32e64e726de8bbfc9e175a3ca91e0881cd40352bfcd1d107411f4f348\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:38:09.093436 containerd[1759]: time="2025-01-14T13:38:09.093398800Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:38:09.094330 containerd[1759]: time="2025-01-14T13:38:09.093818920Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" with image id \"sha256:ece9bca32e64e726de8bbfc9e175a3ca91e0881cd40352bfcd1d107411f4f348\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\", size \"6487425\" in 2.649682943s"
Jan 14 13:38:09.094330 containerd[1759]: time="2025-01-14T13:38:09.093848600Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:ece9bca32e64e726de8bbfc9e175a3ca91e0881cd40352bfcd1d107411f4f348\""
Jan 14 13:38:09.096236 containerd[1759]: time="2025-01-14T13:38:09.096187522Z" level=info msg="CreateContainer within sandbox \"2bfbdef55f246b1c65be1142c8ccc79669bcb65247943c9f2ada4e457c3083de\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Jan 14 13:38:09.432022 containerd[1759]: time="2025-01-14T13:38:09.431948398Z" level=info msg="CreateContainer within sandbox \"2bfbdef55f246b1c65be1142c8ccc79669bcb65247943c9f2ada4e457c3083de\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"8c8222d55941d5baa2c4dc5c598465434af8d9549860d1a2bcae9b3f76d20c43\""
Jan 14 13:38:09.432556 containerd[1759]: time="2025-01-14T13:38:09.432507879Z" level=info msg="StartContainer for \"8c8222d55941d5baa2c4dc5c598465434af8d9549860d1a2bcae9b3f76d20c43\""
Jan 14 13:38:09.460153 systemd[1]: Started cri-containerd-8c8222d55941d5baa2c4dc5c598465434af8d9549860d1a2bcae9b3f76d20c43.scope - libcontainer container 8c8222d55941d5baa2c4dc5c598465434af8d9549860d1a2bcae9b3f76d20c43.
Jan 14 13:38:09.490827 containerd[1759]: time="2025-01-14T13:38:09.490333039Z" level=info msg="StartContainer for \"8c8222d55941d5baa2c4dc5c598465434af8d9549860d1a2bcae9b3f76d20c43\" returns successfully"
Jan 14 13:38:09.496430 systemd[1]: cri-containerd-8c8222d55941d5baa2c4dc5c598465434af8d9549860d1a2bcae9b3f76d20c43.scope: Deactivated successfully.
Jan 14 13:38:09.517768 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8c8222d55941d5baa2c4dc5c598465434af8d9549860d1a2bcae9b3f76d20c43-rootfs.mount: Deactivated successfully.
Jan 14 13:38:09.745965 kubelet[3442]: E0114 13:38:09.744655    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7bk8l" podUID="af70d9ef-1b20-4f9f-93e1-f55f680c58b4"
Jan 14 13:38:11.745539 kubelet[3442]: E0114 13:38:11.744775    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7bk8l" podUID="af70d9ef-1b20-4f9f-93e1-f55f680c58b4"
Jan 14 13:38:13.745550 kubelet[3442]: E0114 13:38:13.745137    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7bk8l" podUID="af70d9ef-1b20-4f9f-93e1-f55f680c58b4"
Jan 14 13:38:15.745045 kubelet[3442]: E0114 13:38:15.744820    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7bk8l" podUID="af70d9ef-1b20-4f9f-93e1-f55f680c58b4"
Jan 14 13:38:16.196838 containerd[1759]: time="2025-01-14T13:38:16.196748680Z" level=info msg="shim disconnected" id=8c8222d55941d5baa2c4dc5c598465434af8d9549860d1a2bcae9b3f76d20c43 namespace=k8s.io
Jan 14 13:38:16.196838 containerd[1759]: time="2025-01-14T13:38:16.196832520Z" level=warning msg="cleaning up after shim disconnected" id=8c8222d55941d5baa2c4dc5c598465434af8d9549860d1a2bcae9b3f76d20c43 namespace=k8s.io
Jan 14 13:38:16.196838 containerd[1759]: time="2025-01-14T13:38:16.196842120Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Jan 14 13:38:16.880564 containerd[1759]: time="2025-01-14T13:38:16.880502069Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\""
Jan 14 13:38:17.745710 kubelet[3442]: E0114 13:38:17.744602    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7bk8l" podUID="af70d9ef-1b20-4f9f-93e1-f55f680c58b4"
Jan 14 13:38:19.744223 kubelet[3442]: E0114 13:38:19.744189    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7bk8l" podUID="af70d9ef-1b20-4f9f-93e1-f55f680c58b4"
Jan 14 13:38:20.878792 containerd[1759]: time="2025-01-14T13:38:20.878740655Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:38:20.880846 containerd[1759]: time="2025-01-14T13:38:20.880806496Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.1: active requests=0, bytes read=89703123"
Jan 14 13:38:20.925752 containerd[1759]: time="2025-01-14T13:38:20.925708283Z" level=info msg="ImageCreate event name:\"sha256:e5ca62af4ff61b88f55fe4e0d7723151103d3f6a470fd4ebb311a2de27a9597f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:38:20.930050 containerd[1759]: time="2025-01-14T13:38:20.929980246Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:38:20.931031 containerd[1759]: time="2025-01-14T13:38:20.930523726Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.1\" with image id \"sha256:e5ca62af4ff61b88f55fe4e0d7723151103d3f6a470fd4ebb311a2de27a9597f\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\", size \"91072777\" in 4.049978417s"
Jan 14 13:38:20.931031 containerd[1759]: time="2025-01-14T13:38:20.930554526Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image reference \"sha256:e5ca62af4ff61b88f55fe4e0d7723151103d3f6a470fd4ebb311a2de27a9597f\""
Jan 14 13:38:20.932385 containerd[1759]: time="2025-01-14T13:38:20.932344087Z" level=info msg="CreateContainer within sandbox \"2bfbdef55f246b1c65be1142c8ccc79669bcb65247943c9f2ada4e457c3083de\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Jan 14 13:38:21.276138 containerd[1759]: time="2025-01-14T13:38:21.275910816Z" level=info msg="CreateContainer within sandbox \"2bfbdef55f246b1c65be1142c8ccc79669bcb65247943c9f2ada4e457c3083de\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"36961f03308957093be39a31ddd69edf632f054cb9d6d857188198d52dd55163\""
Jan 14 13:38:21.277123 containerd[1759]: time="2025-01-14T13:38:21.277078296Z" level=info msg="StartContainer for \"36961f03308957093be39a31ddd69edf632f054cb9d6d857188198d52dd55163\""
Jan 14 13:38:21.305158 systemd[1]: Started cri-containerd-36961f03308957093be39a31ddd69edf632f054cb9d6d857188198d52dd55163.scope - libcontainer container 36961f03308957093be39a31ddd69edf632f054cb9d6d857188198d52dd55163.
Jan 14 13:38:21.335932 containerd[1759]: time="2025-01-14T13:38:21.335881892Z" level=info msg="StartContainer for \"36961f03308957093be39a31ddd69edf632f054cb9d6d857188198d52dd55163\" returns successfully"
Jan 14 13:38:21.745026 kubelet[3442]: E0114 13:38:21.744664    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7bk8l" podUID="af70d9ef-1b20-4f9f-93e1-f55f680c58b4"
Jan 14 13:38:23.686602 kubelet[3442]: I0114 13:38:23.686495    3442 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 14 13:38:23.744796 kubelet[3442]: E0114 13:38:23.744432    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7bk8l" podUID="af70d9ef-1b20-4f9f-93e1-f55f680c58b4"
Jan 14 13:38:25.745336 kubelet[3442]: E0114 13:38:25.744193    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7bk8l" podUID="af70d9ef-1b20-4f9f-93e1-f55f680c58b4"
Jan 14 13:38:27.745580 kubelet[3442]: E0114 13:38:27.744973    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7bk8l" podUID="af70d9ef-1b20-4f9f-93e1-f55f680c58b4"
Jan 14 13:38:30.076109 kubelet[3442]: E0114 13:38:29.744461    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7bk8l" podUID="af70d9ef-1b20-4f9f-93e1-f55f680c58b4"
Jan 14 13:38:31.745493 kubelet[3442]: E0114 13:38:31.744322    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7bk8l" podUID="af70d9ef-1b20-4f9f-93e1-f55f680c58b4"
Jan 14 13:38:31.853969 containerd[1759]: time="2025-01-14T13:38:31.853920342Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE         \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Jan 14 13:38:31.856193 systemd[1]: cri-containerd-36961f03308957093be39a31ddd69edf632f054cb9d6d857188198d52dd55163.scope: Deactivated successfully.
Jan 14 13:38:31.876813 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-36961f03308957093be39a31ddd69edf632f054cb9d6d857188198d52dd55163-rootfs.mount: Deactivated successfully.
Jan 14 13:38:31.905106 kubelet[3442]: I0114 13:38:31.904875    3442 kubelet_node_status.go:497] "Fast updating node status as it just became ready"
Jan 14 13:38:33.581286 kubelet[3442]: I0114 13:38:31.938790    3442 topology_manager.go:215] "Topology Admit Handler" podUID="af2992f6-a261-4e1c-9cfc-1ef759086d8d" podNamespace="kube-system" podName="coredns-76f75df574-2fhlb"
Jan 14 13:38:33.581286 kubelet[3442]: I0114 13:38:31.947646    3442 topology_manager.go:215] "Topology Admit Handler" podUID="c15a84fb-ac68-4acd-b385-a152a2911116" podNamespace="kube-system" podName="coredns-76f75df574-msgc9"
Jan 14 13:38:33.581286 kubelet[3442]: I0114 13:38:31.948493    3442 topology_manager.go:215] "Topology Admit Handler" podUID="35d51e43-1c53-427a-a8d2-bd422d727c5b" podNamespace="calico-apiserver" podName="calico-apiserver-7478669d4-zcsrg"
Jan 14 13:38:33.581286 kubelet[3442]: I0114 13:38:31.948608    3442 topology_manager.go:215] "Topology Admit Handler" podUID="f475c534-3126-4484-b2ac-7f780fe28e12" podNamespace="calico-system" podName="calico-kube-controllers-8555cc7446-bv6h2"
Jan 14 13:38:33.581286 kubelet[3442]: I0114 13:38:31.953380    3442 topology_manager.go:215] "Topology Admit Handler" podUID="f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1" podNamespace="calico-apiserver" podName="calico-apiserver-7478669d4-9xx9c"
Jan 14 13:38:33.581286 kubelet[3442]: I0114 13:38:31.984830    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af2992f6-a261-4e1c-9cfc-1ef759086d8d-config-volume\") pod \"coredns-76f75df574-2fhlb\" (UID: \"af2992f6-a261-4e1c-9cfc-1ef759086d8d\") " pod="kube-system/coredns-76f75df574-2fhlb"
Jan 14 13:38:33.581286 kubelet[3442]: I0114 13:38:31.984946    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f475c534-3126-4484-b2ac-7f780fe28e12-tigera-ca-bundle\") pod \"calico-kube-controllers-8555cc7446-bv6h2\" (UID: \"f475c534-3126-4484-b2ac-7f780fe28e12\") " pod="calico-system/calico-kube-controllers-8555cc7446-bv6h2"
Jan 14 13:38:31.947801 systemd[1]: Created slice kubepods-burstable-podaf2992f6_a261_4e1c_9cfc_1ef759086d8d.slice - libcontainer container kubepods-burstable-podaf2992f6_a261_4e1c_9cfc_1ef759086d8d.slice.
Jan 14 13:38:33.581922 kubelet[3442]: I0114 13:38:31.985089    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpsrp\" (UniqueName: \"kubernetes.io/projected/af2992f6-a261-4e1c-9cfc-1ef759086d8d-kube-api-access-tpsrp\") pod \"coredns-76f75df574-2fhlb\" (UID: \"af2992f6-a261-4e1c-9cfc-1ef759086d8d\") " pod="kube-system/coredns-76f75df574-2fhlb"
Jan 14 13:38:33.581922 kubelet[3442]: I0114 13:38:31.985178    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/35d51e43-1c53-427a-a8d2-bd422d727c5b-calico-apiserver-certs\") pod \"calico-apiserver-7478669d4-zcsrg\" (UID: \"35d51e43-1c53-427a-a8d2-bd422d727c5b\") " pod="calico-apiserver/calico-apiserver-7478669d4-zcsrg"
Jan 14 13:38:33.581922 kubelet[3442]: I0114 13:38:31.985406    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxw4t\" (UniqueName: \"kubernetes.io/projected/c15a84fb-ac68-4acd-b385-a152a2911116-kube-api-access-qxw4t\") pod \"coredns-76f75df574-msgc9\" (UID: \"c15a84fb-ac68-4acd-b385-a152a2911116\") " pod="kube-system/coredns-76f75df574-msgc9"
Jan 14 13:38:33.581922 kubelet[3442]: I0114 13:38:31.985527    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjlc4\" (UniqueName: \"kubernetes.io/projected/f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1-kube-api-access-rjlc4\") pod \"calico-apiserver-7478669d4-9xx9c\" (UID: \"f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1\") " pod="calico-apiserver/calico-apiserver-7478669d4-9xx9c"
Jan 14 13:38:33.581922 kubelet[3442]: I0114 13:38:31.985633    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c15a84fb-ac68-4acd-b385-a152a2911116-config-volume\") pod \"coredns-76f75df574-msgc9\" (UID: \"c15a84fb-ac68-4acd-b385-a152a2911116\") " pod="kube-system/coredns-76f75df574-msgc9"
Jan 14 13:38:31.959446 systemd[1]: Created slice kubepods-burstable-podc15a84fb_ac68_4acd_b385_a152a2911116.slice - libcontainer container kubepods-burstable-podc15a84fb_ac68_4acd_b385_a152a2911116.slice.
Jan 14 13:38:33.582109 kubelet[3442]: I0114 13:38:31.985796    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkpdj\" (UniqueName: \"kubernetes.io/projected/35d51e43-1c53-427a-a8d2-bd422d727c5b-kube-api-access-qkpdj\") pod \"calico-apiserver-7478669d4-zcsrg\" (UID: \"35d51e43-1c53-427a-a8d2-bd422d727c5b\") " pod="calico-apiserver/calico-apiserver-7478669d4-zcsrg"
Jan 14 13:38:33.582109 kubelet[3442]: I0114 13:38:31.986055    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8tqf\" (UniqueName: \"kubernetes.io/projected/f475c534-3126-4484-b2ac-7f780fe28e12-kube-api-access-x8tqf\") pod \"calico-kube-controllers-8555cc7446-bv6h2\" (UID: \"f475c534-3126-4484-b2ac-7f780fe28e12\") " pod="calico-system/calico-kube-controllers-8555cc7446-bv6h2"
Jan 14 13:38:33.582109 kubelet[3442]: I0114 13:38:31.986091    3442 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1-calico-apiserver-certs\") pod \"calico-apiserver-7478669d4-9xx9c\" (UID: \"f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1\") " pod="calico-apiserver/calico-apiserver-7478669d4-9xx9c"
Jan 14 13:38:31.967185 systemd[1]: Created slice kubepods-besteffort-podf475c534_3126_4484_b2ac_7f780fe28e12.slice - libcontainer container kubepods-besteffort-podf475c534_3126_4484_b2ac_7f780fe28e12.slice.
Jan 14 13:38:31.980716 systemd[1]: Created slice kubepods-besteffort-pod35d51e43_1c53_427a_a8d2_bd422d727c5b.slice - libcontainer container kubepods-besteffort-pod35d51e43_1c53_427a_a8d2_bd422d727c5b.slice.
Jan 14 13:38:31.994845 systemd[1]: Created slice kubepods-besteffort-podf5f02a24_03f2_49f5_ae2a_c9d89b6de9c1.slice - libcontainer container kubepods-besteffort-podf5f02a24_03f2_49f5_ae2a_c9d89b6de9c1.slice.
Jan 14 13:38:33.611901 containerd[1759]: time="2025-01-14T13:38:33.611831821Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-9xx9c,Uid:f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1,Namespace:calico-apiserver,Attempt:0,}"
Jan 14 13:38:33.750180 systemd[1]: Created slice kubepods-besteffort-podaf70d9ef_1b20_4f9f_93e1_f55f680c58b4.slice - libcontainer container kubepods-besteffort-podaf70d9ef_1b20_4f9f_93e1_f55f680c58b4.slice.
Jan 14 13:38:33.752271 containerd[1759]: time="2025-01-14T13:38:33.752163592Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7bk8l,Uid:af70d9ef-1b20-4f9f-93e1-f55f680c58b4,Namespace:calico-system,Attempt:0,}"
Jan 14 13:38:33.881982 containerd[1759]: time="2025-01-14T13:38:33.881922516Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2fhlb,Uid:af2992f6-a261-4e1c-9cfc-1ef759086d8d,Namespace:kube-system,Attempt:0,}"
Jan 14 13:38:33.884696 containerd[1759]: time="2025-01-14T13:38:33.884494718Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-zcsrg,Uid:35d51e43-1c53-427a-a8d2-bd422d727c5b,Namespace:calico-apiserver,Attempt:0,}"
Jan 14 13:38:33.894646 containerd[1759]: time="2025-01-14T13:38:33.894341124Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8555cc7446-bv6h2,Uid:f475c534-3126-4484-b2ac-7f780fe28e12,Namespace:calico-system,Attempt:0,}"
Jan 14 13:38:33.894754 containerd[1759]: time="2025-01-14T13:38:33.894723645Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-msgc9,Uid:c15a84fb-ac68-4acd-b385-a152a2911116,Namespace:kube-system,Attempt:0,}"
Jan 14 13:38:37.484695 containerd[1759]: time="2025-01-14T13:38:37.484631334Z" level=info msg="shim disconnected" id=36961f03308957093be39a31ddd69edf632f054cb9d6d857188198d52dd55163 namespace=k8s.io
Jan 14 13:38:37.485719 containerd[1759]: time="2025-01-14T13:38:37.485058134Z" level=warning msg="cleaning up after shim disconnected" id=36961f03308957093be39a31ddd69edf632f054cb9d6d857188198d52dd55163 namespace=k8s.io
Jan 14 13:38:37.485719 containerd[1759]: time="2025-01-14T13:38:37.485076614Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Jan 14 13:38:37.916364 containerd[1759]: time="2025-01-14T13:38:37.916274414Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\""
Jan 14 13:38:39.440176 containerd[1759]: time="2025-01-14T13:38:39.440056923Z" level=error msg="Failed to destroy network for sandbox \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:39.440506 containerd[1759]: time="2025-01-14T13:38:39.440364283Z" level=error msg="encountered an error cleaning up failed sandbox \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:39.440506 containerd[1759]: time="2025-01-14T13:38:39.440424043Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-msgc9,Uid:c15a84fb-ac68-4acd-b385-a152a2911116,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:39.440694 kubelet[3442]: E0114 13:38:39.440666    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:39.440918 kubelet[3442]: E0114 13:38:39.440727    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-msgc9"
Jan 14 13:38:39.440918 kubelet[3442]: E0114 13:38:39.440748    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-msgc9"
Jan 14 13:38:39.440918 kubelet[3442]: E0114 13:38:39.440802    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-msgc9_kube-system(c15a84fb-ac68-4acd-b385-a152a2911116)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-msgc9_kube-system(c15a84fb-ac68-4acd-b385-a152a2911116)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-msgc9" podUID="c15a84fb-ac68-4acd-b385-a152a2911116"
Jan 14 13:38:39.625018 containerd[1759]: time="2025-01-14T13:38:39.624693203Z" level=error msg="Failed to destroy network for sandbox \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:39.625142 containerd[1759]: time="2025-01-14T13:38:39.625020523Z" level=error msg="encountered an error cleaning up failed sandbox \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:39.625142 containerd[1759]: time="2025-01-14T13:38:39.625095763Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8555cc7446-bv6h2,Uid:f475c534-3126-4484-b2ac-7f780fe28e12,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:39.626033 kubelet[3442]: E0114 13:38:39.625349    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:39.626033 kubelet[3442]: E0114 13:38:39.625400    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8555cc7446-bv6h2"
Jan 14 13:38:39.626033 kubelet[3442]: E0114 13:38:39.625419    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8555cc7446-bv6h2"
Jan 14 13:38:39.626184 kubelet[3442]: E0114 13:38:39.625472    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-8555cc7446-bv6h2_calico-system(f475c534-3126-4484-b2ac-7f780fe28e12)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-8555cc7446-bv6h2_calico-system(f475c534-3126-4484-b2ac-7f780fe28e12)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-8555cc7446-bv6h2" podUID="f475c534-3126-4484-b2ac-7f780fe28e12"
Jan 14 13:38:39.670791 containerd[1759]: time="2025-01-14T13:38:39.670741553Z" level=error msg="Failed to destroy network for sandbox \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:39.671096 containerd[1759]: time="2025-01-14T13:38:39.671069553Z" level=error msg="encountered an error cleaning up failed sandbox \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:39.671165 containerd[1759]: time="2025-01-14T13:38:39.671135993Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-9xx9c,Uid:f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:39.671507 kubelet[3442]: E0114 13:38:39.671370    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:39.671507 kubelet[3442]: E0114 13:38:39.671418    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-9xx9c"
Jan 14 13:38:39.671887 kubelet[3442]: E0114 13:38:39.671608    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-9xx9c"
Jan 14 13:38:39.671887 kubelet[3442]: E0114 13:38:39.671670    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7478669d4-9xx9c_calico-apiserver(f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7478669d4-9xx9c_calico-apiserver(f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7478669d4-9xx9c" podUID="f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1"
Jan 14 13:38:39.722985 containerd[1759]: time="2025-01-14T13:38:39.722834546Z" level=error msg="Failed to destroy network for sandbox \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:39.723657 containerd[1759]: time="2025-01-14T13:38:39.723200547Z" level=error msg="encountered an error cleaning up failed sandbox \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:39.723657 containerd[1759]: time="2025-01-14T13:38:39.723260547Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-zcsrg,Uid:35d51e43-1c53-427a-a8d2-bd422d727c5b,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:39.723784 kubelet[3442]: E0114 13:38:39.723509    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:39.723784 kubelet[3442]: E0114 13:38:39.723564    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-zcsrg"
Jan 14 13:38:39.723784 kubelet[3442]: E0114 13:38:39.723585    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-zcsrg"
Jan 14 13:38:39.723910 kubelet[3442]: E0114 13:38:39.723639    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7478669d4-zcsrg_calico-apiserver(35d51e43-1c53-427a-a8d2-bd422d727c5b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7478669d4-zcsrg_calico-apiserver(35d51e43-1c53-427a-a8d2-bd422d727c5b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7478669d4-zcsrg" podUID="35d51e43-1c53-427a-a8d2-bd422d727c5b"
Jan 14 13:38:39.767339 containerd[1759]: time="2025-01-14T13:38:39.767297175Z" level=error msg="Failed to destroy network for sandbox \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:39.767864 containerd[1759]: time="2025-01-14T13:38:39.767714016Z" level=error msg="encountered an error cleaning up failed sandbox \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:39.767864 containerd[1759]: time="2025-01-14T13:38:39.767773536Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2fhlb,Uid:af2992f6-a261-4e1c-9cfc-1ef759086d8d,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:39.768071 kubelet[3442]: E0114 13:38:39.768039    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:39.768126 kubelet[3442]: E0114 13:38:39.768091    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-2fhlb"
Jan 14 13:38:39.768126 kubelet[3442]: E0114 13:38:39.768117    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-2fhlb"
Jan 14 13:38:39.768199 kubelet[3442]: E0114 13:38:39.768173    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-2fhlb_kube-system(af2992f6-a261-4e1c-9cfc-1ef759086d8d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-2fhlb_kube-system(af2992f6-a261-4e1c-9cfc-1ef759086d8d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-2fhlb" podUID="af2992f6-a261-4e1c-9cfc-1ef759086d8d"
Jan 14 13:38:39.859213 containerd[1759]: time="2025-01-14T13:38:39.859168075Z" level=error msg="Failed to destroy network for sandbox \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:39.859479 containerd[1759]: time="2025-01-14T13:38:39.859451755Z" level=error msg="encountered an error cleaning up failed sandbox \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:39.859536 containerd[1759]: time="2025-01-14T13:38:39.859511195Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7bk8l,Uid:af70d9ef-1b20-4f9f-93e1-f55f680c58b4,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:39.859755 kubelet[3442]: E0114 13:38:39.859735    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:39.859982 kubelet[3442]: E0114 13:38:39.859864    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7bk8l"
Jan 14 13:38:39.859982 kubelet[3442]: E0114 13:38:39.859926    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7bk8l"
Jan 14 13:38:39.860369 kubelet[3442]: E0114 13:38:39.860215    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-7bk8l_calico-system(af70d9ef-1b20-4f9f-93e1-f55f680c58b4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-7bk8l_calico-system(af70d9ef-1b20-4f9f-93e1-f55f680c58b4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-7bk8l" podUID="af70d9ef-1b20-4f9f-93e1-f55f680c58b4"
Jan 14 13:38:39.920039 kubelet[3442]: I0114 13:38:39.919864    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27"
Jan 14 13:38:39.922472 kubelet[3442]: I0114 13:38:39.921147    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e"
Jan 14 13:38:39.922557 containerd[1759]: time="2025-01-14T13:38:39.921460115Z" level=info msg="StopPodSandbox for \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\""
Jan 14 13:38:39.922557 containerd[1759]: time="2025-01-14T13:38:39.921618595Z" level=info msg="Ensure that sandbox a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27 in task-service has been cleanup successfully"
Jan 14 13:38:39.922557 containerd[1759]: time="2025-01-14T13:38:39.922151556Z" level=info msg="TearDown network for sandbox \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\" successfully"
Jan 14 13:38:39.922557 containerd[1759]: time="2025-01-14T13:38:39.922169036Z" level=info msg="StopPodSandbox for \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\" returns successfully"
Jan 14 13:38:39.922931 containerd[1759]: time="2025-01-14T13:38:39.922699996Z" level=info msg="StopPodSandbox for \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\""
Jan 14 13:38:39.922931 containerd[1759]: time="2025-01-14T13:38:39.922814076Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2fhlb,Uid:af2992f6-a261-4e1c-9cfc-1ef759086d8d,Namespace:kube-system,Attempt:1,}"
Jan 14 13:38:39.922931 containerd[1759]: time="2025-01-14T13:38:39.922881756Z" level=info msg="Ensure that sandbox 3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e in task-service has been cleanup successfully"
Jan 14 13:38:39.923716 containerd[1759]: time="2025-01-14T13:38:39.923298757Z" level=info msg="TearDown network for sandbox \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\" successfully"
Jan 14 13:38:39.923768 kubelet[3442]: I0114 13:38:39.923446    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519"
Jan 14 13:38:39.924162 containerd[1759]: time="2025-01-14T13:38:39.923858517Z" level=info msg="StopPodSandbox for \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\" returns successfully"
Jan 14 13:38:39.924162 containerd[1759]: time="2025-01-14T13:38:39.923801957Z" level=info msg="StopPodSandbox for \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\""
Jan 14 13:38:39.924162 containerd[1759]: time="2025-01-14T13:38:39.924057917Z" level=info msg="Ensure that sandbox 5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519 in task-service has been cleanup successfully"
Jan 14 13:38:39.924379 containerd[1759]: time="2025-01-14T13:38:39.924342837Z" level=info msg="TearDown network for sandbox \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\" successfully"
Jan 14 13:38:39.924532 containerd[1759]: time="2025-01-14T13:38:39.924464637Z" level=info msg="StopPodSandbox for \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\" returns successfully"
Jan 14 13:38:39.924575 containerd[1759]: time="2025-01-14T13:38:39.924528237Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8555cc7446-bv6h2,Uid:f475c534-3126-4484-b2ac-7f780fe28e12,Namespace:calico-system,Attempt:1,}"
Jan 14 13:38:39.925106 kubelet[3442]: I0114 13:38:39.925081    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9"
Jan 14 13:38:39.925854 containerd[1759]: time="2025-01-14T13:38:39.925450678Z" level=info msg="StopPodSandbox for \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\""
Jan 14 13:38:39.925854 containerd[1759]: time="2025-01-14T13:38:39.925563358Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-msgc9,Uid:c15a84fb-ac68-4acd-b385-a152a2911116,Namespace:kube-system,Attempt:1,}"
Jan 14 13:38:39.926213 containerd[1759]: time="2025-01-14T13:38:39.926157878Z" level=info msg="Ensure that sandbox 7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9 in task-service has been cleanup successfully"
Jan 14 13:38:39.926538 containerd[1759]: time="2025-01-14T13:38:39.926493759Z" level=info msg="TearDown network for sandbox \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\" successfully"
Jan 14 13:38:39.926979 containerd[1759]: time="2025-01-14T13:38:39.926859319Z" level=info msg="StopPodSandbox for \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\" returns successfully"
Jan 14 13:38:39.927793 kubelet[3442]: I0114 13:38:39.926701    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b"
Jan 14 13:38:39.928281 containerd[1759]: time="2025-01-14T13:38:39.927868360Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7bk8l,Uid:af70d9ef-1b20-4f9f-93e1-f55f680c58b4,Namespace:calico-system,Attempt:1,}"
Jan 14 13:38:39.928281 containerd[1759]: time="2025-01-14T13:38:39.927983640Z" level=info msg="StopPodSandbox for \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\""
Jan 14 13:38:39.928281 containerd[1759]: time="2025-01-14T13:38:39.928156120Z" level=info msg="Ensure that sandbox 8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b in task-service has been cleanup successfully"
Jan 14 13:38:39.928454 containerd[1759]: time="2025-01-14T13:38:39.928425320Z" level=info msg="TearDown network for sandbox \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\" successfully"
Jan 14 13:38:39.928656 containerd[1759]: time="2025-01-14T13:38:39.928631080Z" level=info msg="StopPodSandbox for \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\" returns successfully"
Jan 14 13:38:39.929306 containerd[1759]: time="2025-01-14T13:38:39.929274800Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-9xx9c,Uid:f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1,Namespace:calico-apiserver,Attempt:1,}"
Jan 14 13:38:39.929600 kubelet[3442]: I0114 13:38:39.929583    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f"
Jan 14 13:38:39.930287 containerd[1759]: time="2025-01-14T13:38:39.930068961Z" level=info msg="StopPodSandbox for \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\""
Jan 14 13:38:39.930287 containerd[1759]: time="2025-01-14T13:38:39.930192401Z" level=info msg="Ensure that sandbox 2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f in task-service has been cleanup successfully"
Jan 14 13:38:39.930396 containerd[1759]: time="2025-01-14T13:38:39.930331121Z" level=info msg="TearDown network for sandbox \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\" successfully"
Jan 14 13:38:39.930396 containerd[1759]: time="2025-01-14T13:38:39.930346801Z" level=info msg="StopPodSandbox for \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\" returns successfully"
Jan 14 13:38:39.931056 containerd[1759]: time="2025-01-14T13:38:39.930881401Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-zcsrg,Uid:35d51e43-1c53-427a-a8d2-bd422d727c5b,Namespace:calico-apiserver,Attempt:1,}"
Jan 14 13:38:39.942669 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b-shm.mount: Deactivated successfully.
Jan 14 13:38:39.943020 systemd[1]: run-netns-cni\x2da6e2ca25\x2d8405\x2d0c61\x2d5928\x2db4ca564d71dc.mount: Deactivated successfully.
Jan 14 13:38:39.943166 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e-shm.mount: Deactivated successfully.
Jan 14 13:38:39.943312 systemd[1]: run-netns-cni\x2d76d4113f\x2df2fa\x2dea61\x2dd423\x2db03bc8d82685.mount: Deactivated successfully.
Jan 14 13:38:39.943427 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519-shm.mount: Deactivated successfully.
Jan 14 13:38:40.774571 containerd[1759]: time="2025-01-14T13:38:40.774439734Z" level=error msg="Failed to destroy network for sandbox \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:40.775437 containerd[1759]: time="2025-01-14T13:38:40.775116534Z" level=error msg="encountered an error cleaning up failed sandbox \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:40.775437 containerd[1759]: time="2025-01-14T13:38:40.775185974Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8555cc7446-bv6h2,Uid:f475c534-3126-4484-b2ac-7f780fe28e12,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:40.775551 kubelet[3442]: E0114 13:38:40.775404    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:40.775551 kubelet[3442]: E0114 13:38:40.775454    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8555cc7446-bv6h2"
Jan 14 13:38:40.775551 kubelet[3442]: E0114 13:38:40.775475    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8555cc7446-bv6h2"
Jan 14 13:38:40.777411 kubelet[3442]: E0114 13:38:40.775530    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-8555cc7446-bv6h2_calico-system(f475c534-3126-4484-b2ac-7f780fe28e12)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-8555cc7446-bv6h2_calico-system(f475c534-3126-4484-b2ac-7f780fe28e12)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-8555cc7446-bv6h2" podUID="f475c534-3126-4484-b2ac-7f780fe28e12"
Jan 14 13:38:40.844079 containerd[1759]: time="2025-01-14T13:38:40.844023369Z" level=error msg="Failed to destroy network for sandbox \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:40.844380 containerd[1759]: time="2025-01-14T13:38:40.844343650Z" level=error msg="encountered an error cleaning up failed sandbox \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:40.844426 containerd[1759]: time="2025-01-14T13:38:40.844407090Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2fhlb,Uid:af2992f6-a261-4e1c-9cfc-1ef759086d8d,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:40.844629 kubelet[3442]: E0114 13:38:40.844595    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:40.844686 kubelet[3442]: E0114 13:38:40.844650    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-2fhlb"
Jan 14 13:38:40.844686 kubelet[3442]: E0114 13:38:40.844674    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-2fhlb"
Jan 14 13:38:40.844736 kubelet[3442]: E0114 13:38:40.844723    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-2fhlb_kube-system(af2992f6-a261-4e1c-9cfc-1ef759086d8d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-2fhlb_kube-system(af2992f6-a261-4e1c-9cfc-1ef759086d8d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-2fhlb" podUID="af2992f6-a261-4e1c-9cfc-1ef759086d8d"
Jan 14 13:38:40.866184 containerd[1759]: time="2025-01-14T13:38:40.866140301Z" level=error msg="Failed to destroy network for sandbox \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:40.867498 containerd[1759]: time="2025-01-14T13:38:40.867344581Z" level=error msg="encountered an error cleaning up failed sandbox \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:40.867693 containerd[1759]: time="2025-01-14T13:38:40.867669021Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-msgc9,Uid:c15a84fb-ac68-4acd-b385-a152a2911116,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:40.868164 kubelet[3442]: E0114 13:38:40.868138    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:40.868271 kubelet[3442]: E0114 13:38:40.868190    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-msgc9"
Jan 14 13:38:40.868271 kubelet[3442]: E0114 13:38:40.868212    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-msgc9"
Jan 14 13:38:40.868427 kubelet[3442]: E0114 13:38:40.868280    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-msgc9_kube-system(c15a84fb-ac68-4acd-b385-a152a2911116)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-msgc9_kube-system(c15a84fb-ac68-4acd-b385-a152a2911116)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-msgc9" podUID="c15a84fb-ac68-4acd-b385-a152a2911116"
Jan 14 13:38:40.870741 containerd[1759]: time="2025-01-14T13:38:40.870675143Z" level=error msg="Failed to destroy network for sandbox \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:40.871059 containerd[1759]: time="2025-01-14T13:38:40.870937423Z" level=error msg="encountered an error cleaning up failed sandbox \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:40.871059 containerd[1759]: time="2025-01-14T13:38:40.870987423Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-9xx9c,Uid:f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:40.871267 kubelet[3442]: E0114 13:38:40.871213    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:40.871267 kubelet[3442]: E0114 13:38:40.871249    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-9xx9c"
Jan 14 13:38:40.871267 kubelet[3442]: E0114 13:38:40.871269    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-9xx9c"
Jan 14 13:38:40.871354 kubelet[3442]: E0114 13:38:40.871330    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7478669d4-9xx9c_calico-apiserver(f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7478669d4-9xx9c_calico-apiserver(f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7478669d4-9xx9c" podUID="f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1"
Jan 14 13:38:40.879719 containerd[1759]: time="2025-01-14T13:38:40.879513868Z" level=error msg="Failed to destroy network for sandbox \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:40.879968 containerd[1759]: time="2025-01-14T13:38:40.879936188Z" level=error msg="encountered an error cleaning up failed sandbox \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:40.880139 containerd[1759]: time="2025-01-14T13:38:40.880052668Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-zcsrg,Uid:35d51e43-1c53-427a-a8d2-bd422d727c5b,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:40.880748 kubelet[3442]: E0114 13:38:40.880345    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:40.880748 kubelet[3442]: E0114 13:38:40.880384    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-zcsrg"
Jan 14 13:38:40.880748 kubelet[3442]: E0114 13:38:40.880415    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-zcsrg"
Jan 14 13:38:40.880879 kubelet[3442]: E0114 13:38:40.880464    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7478669d4-zcsrg_calico-apiserver(35d51e43-1c53-427a-a8d2-bd422d727c5b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7478669d4-zcsrg_calico-apiserver(35d51e43-1c53-427a-a8d2-bd422d727c5b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7478669d4-zcsrg" podUID="35d51e43-1c53-427a-a8d2-bd422d727c5b"
Jan 14 13:38:40.882318 containerd[1759]: time="2025-01-14T13:38:40.882136029Z" level=error msg="Failed to destroy network for sandbox \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:40.883100 containerd[1759]: time="2025-01-14T13:38:40.882979389Z" level=error msg="encountered an error cleaning up failed sandbox \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:40.883100 containerd[1759]: time="2025-01-14T13:38:40.883050669Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7bk8l,Uid:af70d9ef-1b20-4f9f-93e1-f55f680c58b4,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:40.883239 kubelet[3442]: E0114 13:38:40.883217    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:40.883280 kubelet[3442]: E0114 13:38:40.883273    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7bk8l"
Jan 14 13:38:40.883316 kubelet[3442]: E0114 13:38:40.883291    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7bk8l"
Jan 14 13:38:40.883346 kubelet[3442]: E0114 13:38:40.883340    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-7bk8l_calico-system(af70d9ef-1b20-4f9f-93e1-f55f680c58b4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-7bk8l_calico-system(af70d9ef-1b20-4f9f-93e1-f55f680c58b4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-7bk8l" podUID="af70d9ef-1b20-4f9f-93e1-f55f680c58b4"
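Every failure in the burst above is the same underlying problem: the Calico CNI plugin is invoked for a pod sandbox before the calico/node container has written /var/lib/calico/nodename, so the plugin's stat of that file fails and both the network ADD and the cleanup DEL are rejected. A minimal sketch of that precondition check, assuming nothing beyond the file path quoted in the log (illustrative only, not the actual Calico source):

// nodename_check.go: approximate the check behind the repeated
// "stat /var/lib/calico/nodename: no such file or directory" errors above.
package main

import (
	"fmt"
	"os"
)

const nodenameFile = "/var/lib/calico/nodename" // path taken verbatim from the log

func main() {
	data, err := os.ReadFile(nodenameFile)
	if err != nil {
		if os.IsNotExist(err) {
			// Mirrors the remediation hint printed by containerd/kubelet above.
			fmt.Fprintf(os.Stderr,
				"stat %s: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\n",
				nodenameFile)
			os.Exit(1)
		}
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Printf("CNI would use node name %q\n", string(data))
}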
Jan 14 13:38:40.933564 kubelet[3442]: I0114 13:38:40.933530    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c"
Jan 14 13:38:40.934309 containerd[1759]: time="2025-01-14T13:38:40.934097375Z" level=info msg="StopPodSandbox for \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\""
Jan 14 13:38:40.934309 containerd[1759]: time="2025-01-14T13:38:40.934275616Z" level=info msg="Ensure that sandbox 9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c in task-service has been cleanup successfully"
Jan 14 13:38:40.935252 containerd[1759]: time="2025-01-14T13:38:40.934775536Z" level=info msg="TearDown network for sandbox \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\" successfully"
Jan 14 13:38:40.935252 containerd[1759]: time="2025-01-14T13:38:40.934858456Z" level=info msg="StopPodSandbox for \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\" returns successfully"
Jan 14 13:38:40.935682 containerd[1759]: time="2025-01-14T13:38:40.935551856Z" level=info msg="StopPodSandbox for \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\""
Jan 14 13:38:40.935682 containerd[1759]: time="2025-01-14T13:38:40.935627176Z" level=info msg="TearDown network for sandbox \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\" successfully"
Jan 14 13:38:40.935682 containerd[1759]: time="2025-01-14T13:38:40.935637376Z" level=info msg="StopPodSandbox for \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\" returns successfully"
Jan 14 13:38:40.936359 containerd[1759]: time="2025-01-14T13:38:40.936274457Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-zcsrg,Uid:35d51e43-1c53-427a-a8d2-bd422d727c5b,Namespace:calico-apiserver,Attempt:2,}"
Jan 14 13:38:40.937721 kubelet[3442]: I0114 13:38:40.937687    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2"
Jan 14 13:38:40.938546 containerd[1759]: time="2025-01-14T13:38:40.938178778Z" level=info msg="StopPodSandbox for \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\""
Jan 14 13:38:40.938877 containerd[1759]: time="2025-01-14T13:38:40.938747258Z" level=info msg="Ensure that sandbox 7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2 in task-service has been cleanup successfully"
Jan 14 13:38:40.939305 containerd[1759]: time="2025-01-14T13:38:40.939032218Z" level=info msg="TearDown network for sandbox \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\" successfully"
Jan 14 13:38:40.939305 containerd[1759]: time="2025-01-14T13:38:40.939050458Z" level=info msg="StopPodSandbox for \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\" returns successfully"
Jan 14 13:38:40.939626 containerd[1759]: time="2025-01-14T13:38:40.939557338Z" level=info msg="StopPodSandbox for \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\""
Jan 14 13:38:40.939683 containerd[1759]: time="2025-01-14T13:38:40.939649298Z" level=info msg="TearDown network for sandbox \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\" successfully"
Jan 14 13:38:40.939683 containerd[1759]: time="2025-01-14T13:38:40.939660498Z" level=info msg="StopPodSandbox for \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\" returns successfully"
Jan 14 13:38:40.940470 kubelet[3442]: I0114 13:38:40.940181    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b"
Jan 14 13:38:40.940550 containerd[1759]: time="2025-01-14T13:38:40.940270139Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2fhlb,Uid:af2992f6-a261-4e1c-9cfc-1ef759086d8d,Namespace:kube-system,Attempt:2,}"
Jan 14 13:38:40.940812 containerd[1759]: time="2025-01-14T13:38:40.940702739Z" level=info msg="StopPodSandbox for \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\""
Jan 14 13:38:40.942934 containerd[1759]: time="2025-01-14T13:38:40.942877620Z" level=info msg="Ensure that sandbox e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b in task-service has been cleanup successfully"
Jan 14 13:38:40.943866 containerd[1759]: time="2025-01-14T13:38:40.943529260Z" level=info msg="TearDown network for sandbox \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\" successfully"
Jan 14 13:38:40.943866 containerd[1759]: time="2025-01-14T13:38:40.943558460Z" level=info msg="StopPodSandbox for \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\" returns successfully"
Jan 14 13:38:40.944144 containerd[1759]: time="2025-01-14T13:38:40.944083741Z" level=info msg="StopPodSandbox for \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\""
Jan 14 13:38:40.944195 containerd[1759]: time="2025-01-14T13:38:40.944186341Z" level=info msg="TearDown network for sandbox \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\" successfully"
Jan 14 13:38:40.944195 containerd[1759]: time="2025-01-14T13:38:40.944198181Z" level=info msg="StopPodSandbox for \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\" returns successfully"
Jan 14 13:38:40.945224 systemd[1]: run-netns-cni\x2dcf3f4713\x2d2e73\x2dc30b\x2d62dc\x2d9180cb77b65c.mount: Deactivated successfully.
Jan 14 13:38:40.946501 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2-shm.mount: Deactivated successfully.
Jan 14 13:38:40.946564 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b-shm.mount: Deactivated successfully.
Jan 14 13:38:40.950315 containerd[1759]: time="2025-01-14T13:38:40.948456383Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8555cc7446-bv6h2,Uid:f475c534-3126-4484-b2ac-7f780fe28e12,Namespace:calico-system,Attempt:2,}"
Jan 14 13:38:40.952132 kubelet[3442]: I0114 13:38:40.952113    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b"
Jan 14 13:38:40.952366 systemd[1]: run-netns-cni\x2dd24088ab\x2de583\x2db332\x2dd4cc\x2d25a1966d47aa.mount: Deactivated successfully.
Jan 14 13:38:40.954630 containerd[1759]: time="2025-01-14T13:38:40.954568146Z" level=info msg="StopPodSandbox for \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\""
Jan 14 13:38:40.955402 containerd[1759]: time="2025-01-14T13:38:40.955278626Z" level=info msg="Ensure that sandbox 91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b in task-service has been cleanup successfully"
Jan 14 13:38:40.955744 kubelet[3442]: I0114 13:38:40.955481    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c"
Jan 14 13:38:40.957271 containerd[1759]: time="2025-01-14T13:38:40.956141787Z" level=info msg="StopPodSandbox for \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\""
Jan 14 13:38:40.958156 systemd[1]: run-netns-cni\x2d47232525\x2d97a5\x2db7c5\x2d1284\x2d19510cb02bb3.mount: Deactivated successfully.
Jan 14 13:38:40.958812 containerd[1759]: time="2025-01-14T13:38:40.957711428Z" level=info msg="TearDown network for sandbox \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\" successfully"
Jan 14 13:38:40.958812 containerd[1759]: time="2025-01-14T13:38:40.957753508Z" level=info msg="StopPodSandbox for \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\" returns successfully"
Jan 14 13:38:40.959907 containerd[1759]: time="2025-01-14T13:38:40.959216428Z" level=info msg="Ensure that sandbox 30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c in task-service has been cleanup successfully"
Jan 14 13:38:40.960143 containerd[1759]: time="2025-01-14T13:38:40.960123549Z" level=info msg="TearDown network for sandbox \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\" successfully"
Jan 14 13:38:40.960254 containerd[1759]: time="2025-01-14T13:38:40.960238429Z" level=info msg="StopPodSandbox for \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\" returns successfully"
Jan 14 13:38:40.963807 containerd[1759]: time="2025-01-14T13:38:40.962791550Z" level=info msg="StopPodSandbox for \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\""
Jan 14 13:38:40.963807 containerd[1759]: time="2025-01-14T13:38:40.962867190Z" level=info msg="TearDown network for sandbox \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\" successfully"
Jan 14 13:38:40.963807 containerd[1759]: time="2025-01-14T13:38:40.962876110Z" level=info msg="StopPodSandbox for \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\" returns successfully"
Jan 14 13:38:40.963807 containerd[1759]: time="2025-01-14T13:38:40.963005870Z" level=info msg="StopPodSandbox for \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\""
Jan 14 13:38:40.963807 containerd[1759]: time="2025-01-14T13:38:40.963059990Z" level=info msg="TearDown network for sandbox \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\" successfully"
Jan 14 13:38:40.963807 containerd[1759]: time="2025-01-14T13:38:40.963069510Z" level=info msg="StopPodSandbox for \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\" returns successfully"
Jan 14 13:38:40.964406 systemd[1]: run-netns-cni\x2d40ade2fb\x2d2754\x2d1071\x2d75d3\x2da499418b7fb7.mount: Deactivated successfully.
Jan 14 13:38:40.965680 containerd[1759]: time="2025-01-14T13:38:40.965337551Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-msgc9,Uid:c15a84fb-ac68-4acd-b385-a152a2911116,Namespace:kube-system,Attempt:2,}"
Jan 14 13:38:40.966026 containerd[1759]: time="2025-01-14T13:38:40.966005912Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-9xx9c,Uid:f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1,Namespace:calico-apiserver,Attempt:2,}"
Jan 14 13:38:40.966884 kubelet[3442]: I0114 13:38:40.966857    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824"
Jan 14 13:38:40.972439 containerd[1759]: time="2025-01-14T13:38:40.972398755Z" level=info msg="StopPodSandbox for \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\""
Jan 14 13:38:40.972638 containerd[1759]: time="2025-01-14T13:38:40.972610155Z" level=info msg="Ensure that sandbox 1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824 in task-service has been cleanup successfully"
Jan 14 13:38:40.974800 systemd[1]: run-netns-cni\x2d9ece5faa\x2dae04\x2d8a83\x2d8fe3\x2de72dd0c3156a.mount: Deactivated successfully.
Jan 14 13:38:40.975338 containerd[1759]: time="2025-01-14T13:38:40.975164596Z" level=info msg="TearDown network for sandbox \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\" successfully"
Jan 14 13:38:40.975338 containerd[1759]: time="2025-01-14T13:38:40.975192877Z" level=info msg="StopPodSandbox for \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\" returns successfully"
Jan 14 13:38:40.975942 containerd[1759]: time="2025-01-14T13:38:40.975575317Z" level=info msg="StopPodSandbox for \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\""
Jan 14 13:38:40.975942 containerd[1759]: time="2025-01-14T13:38:40.975652357Z" level=info msg="TearDown network for sandbox \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\" successfully"
Jan 14 13:38:40.975942 containerd[1759]: time="2025-01-14T13:38:40.975661437Z" level=info msg="StopPodSandbox for \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\" returns successfully"
Jan 14 13:38:40.978704 containerd[1759]: time="2025-01-14T13:38:40.978439798Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7bk8l,Uid:af70d9ef-1b20-4f9f-93e1-f55f680c58b4,Namespace:calico-system,Attempt:2,}"
Jan 14 13:38:41.281252 containerd[1759]: time="2025-01-14T13:38:41.281188833Z" level=error msg="Failed to destroy network for sandbox \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:41.281945 containerd[1759]: time="2025-01-14T13:38:41.281673953Z" level=error msg="encountered an error cleaning up failed sandbox \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:41.281945 containerd[1759]: time="2025-01-14T13:38:41.281732353Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-zcsrg,Uid:35d51e43-1c53-427a-a8d2-bd422d727c5b,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:41.291873 kubelet[3442]: E0114 13:38:41.291541    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:41.291873 kubelet[3442]: E0114 13:38:41.291595    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-zcsrg"
Jan 14 13:38:41.291873 kubelet[3442]: E0114 13:38:41.291616    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-zcsrg"
Jan 14 13:38:41.292078 kubelet[3442]: E0114 13:38:41.291671    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7478669d4-zcsrg_calico-apiserver(35d51e43-1c53-427a-a8d2-bd422d727c5b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7478669d4-zcsrg_calico-apiserver(35d51e43-1c53-427a-a8d2-bd422d727c5b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7478669d4-zcsrg" podUID="35d51e43-1c53-427a-a8d2-bd422d727c5b"
Jan 14 13:38:41.293474 containerd[1759]: time="2025-01-14T13:38:41.293244999Z" level=error msg="Failed to destroy network for sandbox \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:41.299194 containerd[1759]: time="2025-01-14T13:38:41.299159522Z" level=error msg="encountered an error cleaning up failed sandbox \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:41.299520 containerd[1759]: time="2025-01-14T13:38:41.299366562Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2fhlb,Uid:af2992f6-a261-4e1c-9cfc-1ef759086d8d,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:41.299640 kubelet[3442]: E0114 13:38:41.299607    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:41.299844 kubelet[3442]: E0114 13:38:41.299667    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-2fhlb"
Jan 14 13:38:41.299844 kubelet[3442]: E0114 13:38:41.299688    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-2fhlb"
Jan 14 13:38:41.299844 kubelet[3442]: E0114 13:38:41.299743    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-2fhlb_kube-system(af2992f6-a261-4e1c-9cfc-1ef759086d8d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-2fhlb_kube-system(af2992f6-a261-4e1c-9cfc-1ef759086d8d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-2fhlb" podUID="af2992f6-a261-4e1c-9cfc-1ef759086d8d"
Jan 14 13:38:41.346351 containerd[1759]: time="2025-01-14T13:38:41.346231586Z" level=error msg="Failed to destroy network for sandbox \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:41.348412 containerd[1759]: time="2025-01-14T13:38:41.348257267Z" level=error msg="encountered an error cleaning up failed sandbox \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:41.348412 containerd[1759]: time="2025-01-14T13:38:41.348320467Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8555cc7446-bv6h2,Uid:f475c534-3126-4484-b2ac-7f780fe28e12,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:41.348948 kubelet[3442]: E0114 13:38:41.348773    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:41.348948 kubelet[3442]: E0114 13:38:41.348891    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8555cc7446-bv6h2"
Jan 14 13:38:41.348948 kubelet[3442]: E0114 13:38:41.348912    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8555cc7446-bv6h2"
Jan 14 13:38:41.352591 kubelet[3442]: E0114 13:38:41.352547    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-8555cc7446-bv6h2_calico-system(f475c534-3126-4484-b2ac-7f780fe28e12)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-8555cc7446-bv6h2_calico-system(f475c534-3126-4484-b2ac-7f780fe28e12)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-8555cc7446-bv6h2" podUID="f475c534-3126-4484-b2ac-7f780fe28e12"
Jan 14 13:38:41.365398 containerd[1759]: time="2025-01-14T13:38:41.365358716Z" level=error msg="Failed to destroy network for sandbox \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:41.365865 containerd[1759]: time="2025-01-14T13:38:41.365841156Z" level=error msg="encountered an error cleaning up failed sandbox \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:41.366168 containerd[1759]: time="2025-01-14T13:38:41.366065437Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-9xx9c,Uid:f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:41.366700 kubelet[3442]: E0114 13:38:41.366675    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:41.367341 kubelet[3442]: E0114 13:38:41.366839    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-9xx9c"
Jan 14 13:38:41.367341 kubelet[3442]: E0114 13:38:41.366876    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-9xx9c"
Jan 14 13:38:41.367341 kubelet[3442]: E0114 13:38:41.366936    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7478669d4-9xx9c_calico-apiserver(f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7478669d4-9xx9c_calico-apiserver(f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7478669d4-9xx9c" podUID="f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1"
Jan 14 13:38:41.373065 containerd[1759]: time="2025-01-14T13:38:41.373031880Z" level=error msg="Failed to destroy network for sandbox \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:41.373468 containerd[1759]: time="2025-01-14T13:38:41.373442200Z" level=error msg="encountered an error cleaning up failed sandbox \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:41.374065 containerd[1759]: time="2025-01-14T13:38:41.374040121Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-msgc9,Uid:c15a84fb-ac68-4acd-b385-a152a2911116,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:41.374334 kubelet[3442]: E0114 13:38:41.374320    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:41.374450 kubelet[3442]: E0114 13:38:41.374440    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-msgc9"
Jan 14 13:38:41.374620 kubelet[3442]: E0114 13:38:41.374522    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-msgc9"
Jan 14 13:38:41.374620 kubelet[3442]: E0114 13:38:41.374595    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-msgc9_kube-system(c15a84fb-ac68-4acd-b385-a152a2911116)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-msgc9_kube-system(c15a84fb-ac68-4acd-b385-a152a2911116)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-msgc9" podUID="c15a84fb-ac68-4acd-b385-a152a2911116"
Jan 14 13:38:41.375881 containerd[1759]: time="2025-01-14T13:38:41.375846522Z" level=error msg="Failed to destroy network for sandbox \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:41.377353 containerd[1759]: time="2025-01-14T13:38:41.377326082Z" level=error msg="encountered an error cleaning up failed sandbox \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:41.377470 containerd[1759]: time="2025-01-14T13:38:41.377452082Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7bk8l,Uid:af70d9ef-1b20-4f9f-93e1-f55f680c58b4,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:41.377854 kubelet[3442]: E0114 13:38:41.377722    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:41.377854 kubelet[3442]: E0114 13:38:41.377758    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7bk8l"
Jan 14 13:38:41.377854 kubelet[3442]: E0114 13:38:41.377782    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7bk8l"
Jan 14 13:38:41.377978 kubelet[3442]: E0114 13:38:41.377826    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-7bk8l_calico-system(af70d9ef-1b20-4f9f-93e1-f55f680c58b4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-7bk8l_calico-system(af70d9ef-1b20-4f9f-93e1-f55f680c58b4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-7bk8l" podUID="af70d9ef-1b20-4f9f-93e1-f55f680c58b4"
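The Attempt counter in the RunPodSandbox metadata keeps incrementing because the kubelet retries sandbox creation while /var/lib/calico/nodename is still absent; per the error's own hint, these failures should clear once the calico/node container is running and has populated /var/lib/calico/. Purely to illustrate that precondition, a small wait loop one might use to confirm the file has appeared (path from the log; the timeout and poll interval are assumed values):

// wait_nodename.go: illustrative readiness wait, not part of Calico or kubelet.
package main

import (
	"fmt"
	"os"
	"time"
)

func waitForNodename(path string, timeout, interval time.Duration) error {
	deadline := time.Now().Add(timeout)
	for {
		if _, err := os.Stat(path); err == nil {
			return nil // file present: calico/node has recorded this node's name
		}
		if time.Now().After(deadline) {
			return fmt.Errorf("timed out waiting for %s", path)
		}
		time.Sleep(interval)
	}
}

func main() {
	if err := waitForNodename("/var/lib/calico/nodename", 2*time.Minute, 5*time.Second); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("/var/lib/calico/nodename present; sandbox creation retries should start succeeding")
}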
Jan 14 13:38:41.970105 kubelet[3442]: I0114 13:38:41.969876    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7"
Jan 14 13:38:41.970699 containerd[1759]: time="2025-01-14T13:38:41.970491266Z" level=info msg="StopPodSandbox for \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\""
Jan 14 13:38:41.970699 containerd[1759]: time="2025-01-14T13:38:41.970650106Z" level=info msg="Ensure that sandbox 65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7 in task-service has been cleanup successfully"
Jan 14 13:38:41.972885 containerd[1759]: time="2025-01-14T13:38:41.971766666Z" level=info msg="TearDown network for sandbox \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\" successfully"
Jan 14 13:38:41.973131 systemd[1]: run-netns-cni\x2dc0ee5e7b\x2d7fba\x2d24ae\x2db118\x2d7f156136ee82.mount: Deactivated successfully.
Jan 14 13:38:41.974486 containerd[1759]: time="2025-01-14T13:38:41.971787546Z" level=info msg="StopPodSandbox for \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\" returns successfully"
Jan 14 13:38:41.975594 containerd[1759]: time="2025-01-14T13:38:41.974700148Z" level=info msg="StopPodSandbox for \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\""
Jan 14 13:38:41.975594 containerd[1759]: time="2025-01-14T13:38:41.974792228Z" level=info msg="TearDown network for sandbox \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\" successfully"
Jan 14 13:38:41.975594 containerd[1759]: time="2025-01-14T13:38:41.974802788Z" level=info msg="StopPodSandbox for \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\" returns successfully"
Jan 14 13:38:41.976205 containerd[1759]: time="2025-01-14T13:38:41.975971069Z" level=info msg="StopPodSandbox for \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\""
Jan 14 13:38:41.976521 containerd[1759]: time="2025-01-14T13:38:41.976464749Z" level=info msg="TearDown network for sandbox \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\" successfully"
Jan 14 13:38:41.976521 containerd[1759]: time="2025-01-14T13:38:41.976480909Z" level=info msg="StopPodSandbox for \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\" returns successfully"
Jan 14 13:38:41.976636 kubelet[3442]: I0114 13:38:41.976592    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a"
Jan 14 13:38:41.977035 containerd[1759]: time="2025-01-14T13:38:41.976965389Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-9xx9c,Uid:f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1,Namespace:calico-apiserver,Attempt:3,}"
Jan 14 13:38:41.977713 containerd[1759]: time="2025-01-14T13:38:41.977442629Z" level=info msg="StopPodSandbox for \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\""
Jan 14 13:38:41.977713 containerd[1759]: time="2025-01-14T13:38:41.977603749Z" level=info msg="Ensure that sandbox f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a in task-service has been cleanup successfully"
Jan 14 13:38:41.979465 systemd[1]: run-netns-cni\x2d16fe9102\x2daaba\x2db03f\x2d9104\x2d0bb5c7c691aa.mount: Deactivated successfully.
Jan 14 13:38:41.980462 containerd[1759]: time="2025-01-14T13:38:41.979712111Z" level=info msg="TearDown network for sandbox \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\" successfully"
Jan 14 13:38:41.980462 containerd[1759]: time="2025-01-14T13:38:41.979736831Z" level=info msg="StopPodSandbox for \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\" returns successfully"
Jan 14 13:38:41.981784 containerd[1759]: time="2025-01-14T13:38:41.981510391Z" level=info msg="StopPodSandbox for \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\""
Jan 14 13:38:41.981784 containerd[1759]: time="2025-01-14T13:38:41.981604152Z" level=info msg="TearDown network for sandbox \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\" successfully"
Jan 14 13:38:41.982498 containerd[1759]: time="2025-01-14T13:38:41.981615712Z" level=info msg="StopPodSandbox for \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\" returns successfully"
Jan 14 13:38:41.983161 kubelet[3442]: I0114 13:38:41.982820    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600"
Jan 14 13:38:41.983738 containerd[1759]: time="2025-01-14T13:38:41.983710753Z" level=info msg="StopPodSandbox for \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\""
Jan 14 13:38:41.983830 containerd[1759]: time="2025-01-14T13:38:41.983792033Z" level=info msg="TearDown network for sandbox \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\" successfully"
Jan 14 13:38:41.983830 containerd[1759]: time="2025-01-14T13:38:41.983802353Z" level=info msg="StopPodSandbox for \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\" returns successfully"
Jan 14 13:38:41.984079 containerd[1759]: time="2025-01-14T13:38:41.983921353Z" level=info msg="StopPodSandbox for \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\""
Jan 14 13:38:41.984575 containerd[1759]: time="2025-01-14T13:38:41.984405833Z" level=info msg="Ensure that sandbox dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600 in task-service has been cleanup successfully"
Jan 14 13:38:41.984872 containerd[1759]: time="2025-01-14T13:38:41.984848673Z" level=info msg="TearDown network for sandbox \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\" successfully"
Jan 14 13:38:41.986032 containerd[1759]: time="2025-01-14T13:38:41.985083193Z" level=info msg="StopPodSandbox for \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\" returns successfully"
Jan 14 13:38:41.986609 containerd[1759]: time="2025-01-14T13:38:41.986387314Z" level=info msg="StopPodSandbox for \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\""
Jan 14 13:38:41.986609 containerd[1759]: time="2025-01-14T13:38:41.986468194Z" level=info msg="TearDown network for sandbox \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\" successfully"
Jan 14 13:38:41.986609 containerd[1759]: time="2025-01-14T13:38:41.986477474Z" level=info msg="StopPodSandbox for \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\" returns successfully"
Jan 14 13:38:41.987491 containerd[1759]: time="2025-01-14T13:38:41.986876874Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-zcsrg,Uid:35d51e43-1c53-427a-a8d2-bd422d727c5b,Namespace:calico-apiserver,Attempt:3,}"
Jan 14 13:38:41.987645 systemd[1]: run-netns-cni\x2d25367287\x2d8252\x2d9918\x2dfb0c\x2d65e8a17fa9f3.mount: Deactivated successfully.
Jan 14 13:38:41.987941 containerd[1759]: time="2025-01-14T13:38:41.987830995Z" level=info msg="StopPodSandbox for \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\""
Jan 14 13:38:41.988296 containerd[1759]: time="2025-01-14T13:38:41.988197995Z" level=info msg="TearDown network for sandbox \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\" successfully"
Jan 14 13:38:41.988296 containerd[1759]: time="2025-01-14T13:38:41.988213955Z" level=info msg="StopPodSandbox for \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\" returns successfully"
Jan 14 13:38:41.989037 kubelet[3442]: I0114 13:38:41.988553    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245"
Jan 14 13:38:41.989153 containerd[1759]: time="2025-01-14T13:38:41.989124675Z" level=info msg="StopPodSandbox for \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\""
Jan 14 13:38:41.989911 containerd[1759]: time="2025-01-14T13:38:41.989876036Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2fhlb,Uid:af2992f6-a261-4e1c-9cfc-1ef759086d8d,Namespace:kube-system,Attempt:3,}"
Jan 14 13:38:41.990103 containerd[1759]: time="2025-01-14T13:38:41.990075756Z" level=info msg="Ensure that sandbox e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245 in task-service has been cleanup successfully"
Jan 14 13:38:41.990481 containerd[1759]: time="2025-01-14T13:38:41.990444756Z" level=info msg="TearDown network for sandbox \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\" successfully"
Jan 14 13:38:41.990481 containerd[1759]: time="2025-01-14T13:38:41.990476636Z" level=info msg="StopPodSandbox for \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\" returns successfully"
Jan 14 13:38:41.991406 containerd[1759]: time="2025-01-14T13:38:41.991375917Z" level=info msg="StopPodSandbox for \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\""
Jan 14 13:38:41.991465 containerd[1759]: time="2025-01-14T13:38:41.991448797Z" level=info msg="TearDown network for sandbox \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\" successfully"
Jan 14 13:38:41.991465 containerd[1759]: time="2025-01-14T13:38:41.991458637Z" level=info msg="StopPodSandbox for \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\" returns successfully"
Jan 14 13:38:41.992548 systemd[1]: run-netns-cni\x2df8495c12\x2d2fb1\x2dec2a\x2de61e\x2da2d3497b979a.mount: Deactivated successfully.
Jan 14 13:38:41.993360 containerd[1759]: time="2025-01-14T13:38:41.993331838Z" level=info msg="StopPodSandbox for \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\""
Jan 14 13:38:41.994650 containerd[1759]: time="2025-01-14T13:38:41.994590798Z" level=info msg="TearDown network for sandbox \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\" successfully"
Jan 14 13:38:41.994650 containerd[1759]: time="2025-01-14T13:38:41.994621198Z" level=info msg="StopPodSandbox for \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\" returns successfully"
Jan 14 13:38:41.995286 kubelet[3442]: I0114 13:38:41.995170    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6"
Jan 14 13:38:41.995961 containerd[1759]: time="2025-01-14T13:38:41.995921719Z" level=info msg="StopPodSandbox for \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\""
Jan 14 13:38:41.996192 containerd[1759]: time="2025-01-14T13:38:41.996164319Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8555cc7446-bv6h2,Uid:f475c534-3126-4484-b2ac-7f780fe28e12,Namespace:calico-system,Attempt:3,}"
Jan 14 13:38:41.996429 containerd[1759]: time="2025-01-14T13:38:41.996307719Z" level=info msg="Ensure that sandbox 343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6 in task-service has been cleanup successfully"
Jan 14 13:38:41.997040 containerd[1759]: time="2025-01-14T13:38:41.996835919Z" level=info msg="TearDown network for sandbox \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\" successfully"
Jan 14 13:38:41.997040 containerd[1759]: time="2025-01-14T13:38:41.996969799Z" level=info msg="StopPodSandbox for \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\" returns successfully"
Jan 14 13:38:41.998600 containerd[1759]: time="2025-01-14T13:38:41.998383560Z" level=info msg="StopPodSandbox for \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\""
Jan 14 13:38:41.998600 containerd[1759]: time="2025-01-14T13:38:41.998470880Z" level=info msg="TearDown network for sandbox \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\" successfully"
Jan 14 13:38:41.998600 containerd[1759]: time="2025-01-14T13:38:41.998481920Z" level=info msg="StopPodSandbox for \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\" returns successfully"
Jan 14 13:38:41.999636 containerd[1759]: time="2025-01-14T13:38:41.999542921Z" level=info msg="StopPodSandbox for \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\""
Jan 14 13:38:41.999895 containerd[1759]: time="2025-01-14T13:38:41.999808921Z" level=info msg="TearDown network for sandbox \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\" successfully"
Jan 14 13:38:41.999895 containerd[1759]: time="2025-01-14T13:38:41.999832281Z" level=info msg="StopPodSandbox for \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\" returns successfully"
Jan 14 13:38:42.000495 kubelet[3442]: I0114 13:38:42.000437    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d"
Jan 14 13:38:42.001591 containerd[1759]: time="2025-01-14T13:38:42.001568322Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-msgc9,Uid:c15a84fb-ac68-4acd-b385-a152a2911116,Namespace:kube-system,Attempt:3,}"
Jan 14 13:38:42.002061 containerd[1759]: time="2025-01-14T13:38:42.002033642Z" level=info msg="StopPodSandbox for \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\""
Jan 14 13:38:42.002339 containerd[1759]: time="2025-01-14T13:38:42.002314402Z" level=info msg="Ensure that sandbox b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d in task-service has been cleanup successfully"
Jan 14 13:38:42.002461 containerd[1759]: time="2025-01-14T13:38:42.002439122Z" level=info msg="TearDown network for sandbox \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\" successfully"
Jan 14 13:38:42.002461 containerd[1759]: time="2025-01-14T13:38:42.002459562Z" level=info msg="StopPodSandbox for \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\" returns successfully"
Jan 14 13:38:42.002855 containerd[1759]: time="2025-01-14T13:38:42.002713562Z" level=info msg="StopPodSandbox for \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\""
Jan 14 13:38:42.002855 containerd[1759]: time="2025-01-14T13:38:42.002797882Z" level=info msg="TearDown network for sandbox \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\" successfully"
Jan 14 13:38:42.002855 containerd[1759]: time="2025-01-14T13:38:42.002808082Z" level=info msg="StopPodSandbox for \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\" returns successfully"
Jan 14 13:38:42.003135 containerd[1759]: time="2025-01-14T13:38:42.003110083Z" level=info msg="StopPodSandbox for \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\""
Jan 14 13:38:42.003209 containerd[1759]: time="2025-01-14T13:38:42.003188003Z" level=info msg="TearDown network for sandbox \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\" successfully"
Jan 14 13:38:42.003209 containerd[1759]: time="2025-01-14T13:38:42.003204283Z" level=info msg="StopPodSandbox for \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\" returns successfully"
Jan 14 13:38:42.003804 containerd[1759]: time="2025-01-14T13:38:42.003768723Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7bk8l,Uid:af70d9ef-1b20-4f9f-93e1-f55f680c58b4,Namespace:calico-system,Attempt:3,}"
Jan 14 13:38:42.741681 containerd[1759]: time="2025-01-14T13:38:42.741535700Z" level=error msg="Failed to destroy network for sandbox \"1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:42.746612 containerd[1759]: time="2025-01-14T13:38:42.746484583Z" level=error msg="encountered an error cleaning up failed sandbox \"1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:42.746612 containerd[1759]: time="2025-01-14T13:38:42.746566143Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-zcsrg,Uid:35d51e43-1c53-427a-a8d2-bd422d727c5b,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:42.747258 kubelet[3442]: E0114 13:38:42.747067    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:42.747258 kubelet[3442]: E0114 13:38:42.747151    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-zcsrg"
Jan 14 13:38:42.747258 kubelet[3442]: E0114 13:38:42.747176    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-zcsrg"
Jan 14 13:38:42.748716 kubelet[3442]: E0114 13:38:42.747256    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7478669d4-zcsrg_calico-apiserver(35d51e43-1c53-427a-a8d2-bd422d727c5b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7478669d4-zcsrg_calico-apiserver(35d51e43-1c53-427a-a8d2-bd422d727c5b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7478669d4-zcsrg" podUID="35d51e43-1c53-427a-a8d2-bd422d727c5b"
Jan 14 13:38:42.760790 containerd[1759]: time="2025-01-14T13:38:42.760682590Z" level=error msg="Failed to destroy network for sandbox \"841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:42.762851 containerd[1759]: time="2025-01-14T13:38:42.762808551Z" level=error msg="encountered an error cleaning up failed sandbox \"841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:42.762932 containerd[1759]: time="2025-01-14T13:38:42.762883271Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-9xx9c,Uid:f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:42.763474 kubelet[3442]: E0114 13:38:42.763436    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:42.763547 kubelet[3442]: E0114 13:38:42.763495    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-9xx9c"
Jan 14 13:38:42.763547 kubelet[3442]: E0114 13:38:42.763517    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-9xx9c"
Jan 14 13:38:42.764394 kubelet[3442]: E0114 13:38:42.763715    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7478669d4-9xx9c_calico-apiserver(f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7478669d4-9xx9c_calico-apiserver(f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7478669d4-9xx9c" podUID="f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1"
Jan 14 13:38:42.771082 containerd[1759]: time="2025-01-14T13:38:42.771040595Z" level=error msg="Failed to destroy network for sandbox \"e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:42.771418 containerd[1759]: time="2025-01-14T13:38:42.771382116Z" level=error msg="encountered an error cleaning up failed sandbox \"e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:42.771565 containerd[1759]: time="2025-01-14T13:38:42.771466076Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2fhlb,Uid:af2992f6-a261-4e1c-9cfc-1ef759086d8d,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:42.772065 kubelet[3442]: E0114 13:38:42.772043    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:42.772358 kubelet[3442]: E0114 13:38:42.772261    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-2fhlb"
Jan 14 13:38:42.772358 kubelet[3442]: E0114 13:38:42.772299    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-2fhlb"
Jan 14 13:38:42.772680 kubelet[3442]: E0114 13:38:42.772643    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-2fhlb_kube-system(af2992f6-a261-4e1c-9cfc-1ef759086d8d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-2fhlb_kube-system(af2992f6-a261-4e1c-9cfc-1ef759086d8d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-2fhlb" podUID="af2992f6-a261-4e1c-9cfc-1ef759086d8d"
Jan 14 13:38:42.781820 containerd[1759]: time="2025-01-14T13:38:42.781662961Z" level=error msg="Failed to destroy network for sandbox \"a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:42.782583 containerd[1759]: time="2025-01-14T13:38:42.782543521Z" level=error msg="encountered an error cleaning up failed sandbox \"a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:42.782650 containerd[1759]: time="2025-01-14T13:38:42.782617801Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7bk8l,Uid:af70d9ef-1b20-4f9f-93e1-f55f680c58b4,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:42.782936 kubelet[3442]: E0114 13:38:42.782899    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:42.783023 kubelet[3442]: E0114 13:38:42.782953    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7bk8l"
Jan 14 13:38:42.783023 kubelet[3442]: E0114 13:38:42.782973    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7bk8l"
Jan 14 13:38:42.783095 kubelet[3442]: E0114 13:38:42.783041    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-7bk8l_calico-system(af70d9ef-1b20-4f9f-93e1-f55f680c58b4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-7bk8l_calico-system(af70d9ef-1b20-4f9f-93e1-f55f680c58b4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-7bk8l" podUID="af70d9ef-1b20-4f9f-93e1-f55f680c58b4"
Jan 14 13:38:42.784538 containerd[1759]: time="2025-01-14T13:38:42.784417322Z" level=error msg="Failed to destroy network for sandbox \"60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:42.786223 containerd[1759]: time="2025-01-14T13:38:42.785569443Z" level=error msg="encountered an error cleaning up failed sandbox \"60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:42.786400 containerd[1759]: time="2025-01-14T13:38:42.786256443Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8555cc7446-bv6h2,Uid:f475c534-3126-4484-b2ac-7f780fe28e12,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:42.786478 kubelet[3442]: E0114 13:38:42.786410    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:42.786478 kubelet[3442]: E0114 13:38:42.786456    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8555cc7446-bv6h2"
Jan 14 13:38:42.786478 kubelet[3442]: E0114 13:38:42.786475    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8555cc7446-bv6h2"
Jan 14 13:38:42.786703 kubelet[3442]: E0114 13:38:42.786520    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-8555cc7446-bv6h2_calico-system(f475c534-3126-4484-b2ac-7f780fe28e12)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-8555cc7446-bv6h2_calico-system(f475c534-3126-4484-b2ac-7f780fe28e12)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-8555cc7446-bv6h2" podUID="f475c534-3126-4484-b2ac-7f780fe28e12"
Jan 14 13:38:42.793637 containerd[1759]: time="2025-01-14T13:38:42.793581767Z" level=error msg="Failed to destroy network for sandbox \"4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:42.794352 containerd[1759]: time="2025-01-14T13:38:42.794321047Z" level=error msg="encountered an error cleaning up failed sandbox \"4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:42.794419 containerd[1759]: time="2025-01-14T13:38:42.794380047Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-msgc9,Uid:c15a84fb-ac68-4acd-b385-a152a2911116,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:42.794665 kubelet[3442]: E0114 13:38:42.794642    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:42.794871 kubelet[3442]: E0114 13:38:42.794751    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-msgc9"
Jan 14 13:38:42.794938 kubelet[3442]: E0114 13:38:42.794881    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-msgc9"
Jan 14 13:38:42.794967 kubelet[3442]: E0114 13:38:42.794950    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-msgc9_kube-system(c15a84fb-ac68-4acd-b385-a152a2911116)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-msgc9_kube-system(c15a84fb-ac68-4acd-b385-a152a2911116)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-msgc9" podUID="c15a84fb-ac68-4acd-b385-a152a2911116"
Jan 14 13:38:42.946504 systemd[1]: run-netns-cni\x2d9f728366\x2dc23a\x2d6b9e\x2d6374\x2d214913a2e045.mount: Deactivated successfully.
Jan 14 13:38:42.946596 systemd[1]: run-netns-cni\x2ddeefc2ce\x2dbe24\x2d4af4\x2d7289\x2d636a1a39643f.mount: Deactivated successfully.
Jan 14 13:38:43.006328 kubelet[3442]: I0114 13:38:43.006020    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92"
Jan 14 13:38:43.008345 containerd[1759]: time="2025-01-14T13:38:43.008306797Z" level=info msg="StopPodSandbox for \"60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92\""
Jan 14 13:38:43.009039 containerd[1759]: time="2025-01-14T13:38:43.008477637Z" level=info msg="Ensure that sandbox 60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92 in task-service has been cleanup successfully"
Jan 14 13:38:43.013924 containerd[1759]: time="2025-01-14T13:38:43.013866680Z" level=info msg="TearDown network for sandbox \"60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92\" successfully"
Jan 14 13:38:43.013924 containerd[1759]: time="2025-01-14T13:38:43.013899000Z" level=info msg="StopPodSandbox for \"60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92\" returns successfully"
Jan 14 13:38:43.014384 systemd[1]: run-netns-cni\x2dce4b3d7c\x2d2a97\x2d7c99\x2de37c\x2d7bf770c95032.mount: Deactivated successfully.
Jan 14 13:38:43.016282 containerd[1759]: time="2025-01-14T13:38:43.016243241Z" level=info msg="StopPodSandbox for \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\""
Jan 14 13:38:43.016375 containerd[1759]: time="2025-01-14T13:38:43.016354561Z" level=info msg="TearDown network for sandbox \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\" successfully"
Jan 14 13:38:43.016375 containerd[1759]: time="2025-01-14T13:38:43.016364801Z" level=info msg="StopPodSandbox for \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\" returns successfully"
Jan 14 13:38:43.016857 kubelet[3442]: I0114 13:38:43.016737    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931"
Jan 14 13:38:43.017638 containerd[1759]: time="2025-01-14T13:38:43.017500802Z" level=info msg="StopPodSandbox for \"841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931\""
Jan 14 13:38:43.019129 containerd[1759]: time="2025-01-14T13:38:43.018959162Z" level=info msg="StopPodSandbox for \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\""
Jan 14 13:38:43.019129 containerd[1759]: time="2025-01-14T13:38:43.019056882Z" level=info msg="TearDown network for sandbox \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\" successfully"
Jan 14 13:38:43.019129 containerd[1759]: time="2025-01-14T13:38:43.019067922Z" level=info msg="StopPodSandbox for \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\" returns successfully"
Jan 14 13:38:43.019752 containerd[1759]: time="2025-01-14T13:38:43.019631403Z" level=info msg="Ensure that sandbox 841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931 in task-service has been cleanup successfully"
Jan 14 13:38:43.021594 containerd[1759]: time="2025-01-14T13:38:43.020502083Z" level=info msg="StopPodSandbox for \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\""
Jan 14 13:38:43.021594 containerd[1759]: time="2025-01-14T13:38:43.021588244Z" level=info msg="TearDown network for sandbox \"841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931\" successfully"
Jan 14 13:38:43.021751 containerd[1759]: time="2025-01-14T13:38:43.021607044Z" level=info msg="StopPodSandbox for \"841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931\" returns successfully"
Jan 14 13:38:43.022455 containerd[1759]: time="2025-01-14T13:38:43.022298084Z" level=info msg="TearDown network for sandbox \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\" successfully"
Jan 14 13:38:43.022455 containerd[1759]: time="2025-01-14T13:38:43.022318924Z" level=info msg="StopPodSandbox for \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\" returns successfully"
Jan 14 13:38:43.022839 containerd[1759]: time="2025-01-14T13:38:43.022577044Z" level=info msg="StopPodSandbox for \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\""
Jan 14 13:38:43.022839 containerd[1759]: time="2025-01-14T13:38:43.022661644Z" level=info msg="TearDown network for sandbox \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\" successfully"
Jan 14 13:38:43.022839 containerd[1759]: time="2025-01-14T13:38:43.022672724Z" level=info msg="StopPodSandbox for \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\" returns successfully"
Jan 14 13:38:43.023210 systemd[1]: run-netns-cni\x2d9e0267c7\x2d4745\x2da86b\x2d6300\x2dc4edac7040d3.mount: Deactivated successfully.
Jan 14 13:38:43.025680 containerd[1759]: time="2025-01-14T13:38:43.024948845Z" level=info msg="StopPodSandbox for \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\""
Jan 14 13:38:43.025680 containerd[1759]: time="2025-01-14T13:38:43.025069165Z" level=info msg="TearDown network for sandbox \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\" successfully"
Jan 14 13:38:43.025680 containerd[1759]: time="2025-01-14T13:38:43.025080445Z" level=info msg="StopPodSandbox for \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\" returns successfully"
Jan 14 13:38:43.025798 kubelet[3442]: I0114 13:38:43.025288    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b"
Jan 14 13:38:43.026554 containerd[1759]: time="2025-01-14T13:38:43.026441966Z" level=info msg="StopPodSandbox for \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\""
Jan 14 13:38:43.026554 containerd[1759]: time="2025-01-14T13:38:43.026522446Z" level=info msg="TearDown network for sandbox \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\" successfully"
Jan 14 13:38:43.026554 containerd[1759]: time="2025-01-14T13:38:43.026534566Z" level=info msg="StopPodSandbox for \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\" returns successfully"
Jan 14 13:38:43.026905 containerd[1759]: time="2025-01-14T13:38:43.026744086Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8555cc7446-bv6h2,Uid:f475c534-3126-4484-b2ac-7f780fe28e12,Namespace:calico-system,Attempt:4,}"
Jan 14 13:38:43.027729 containerd[1759]: time="2025-01-14T13:38:43.027553447Z" level=info msg="StopPodSandbox for \"4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b\""
Jan 14 13:38:43.027985 containerd[1759]: time="2025-01-14T13:38:43.027783007Z" level=info msg="Ensure that sandbox 4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b in task-service has been cleanup successfully"
Jan 14 13:38:43.028216 containerd[1759]: time="2025-01-14T13:38:43.028122887Z" level=info msg="TearDown network for sandbox \"4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b\" successfully"
Jan 14 13:38:43.028216 containerd[1759]: time="2025-01-14T13:38:43.028144047Z" level=info msg="StopPodSandbox for \"4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b\" returns successfully"
Jan 14 13:38:43.030735 containerd[1759]: time="2025-01-14T13:38:43.028661287Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-9xx9c,Uid:f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1,Namespace:calico-apiserver,Attempt:4,}"
Jan 14 13:38:43.030872 systemd[1]: run-netns-cni\x2de9508ae2\x2db653\x2d1bf1\x2dffae\x2dd01d566056c7.mount: Deactivated successfully.
Jan 14 13:38:43.032188 containerd[1759]: time="2025-01-14T13:38:43.032141009Z" level=info msg="StopPodSandbox for \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\""
Jan 14 13:38:43.032257 containerd[1759]: time="2025-01-14T13:38:43.032216769Z" level=info msg="TearDown network for sandbox \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\" successfully"
Jan 14 13:38:43.032257 containerd[1759]: time="2025-01-14T13:38:43.032227529Z" level=info msg="StopPodSandbox for \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\" returns successfully"
Jan 14 13:38:43.035497 containerd[1759]: time="2025-01-14T13:38:43.035367731Z" level=info msg="StopPodSandbox for \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\""
Jan 14 13:38:43.035497 containerd[1759]: time="2025-01-14T13:38:43.035444531Z" level=info msg="TearDown network for sandbox \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\" successfully"
Jan 14 13:38:43.035497 containerd[1759]: time="2025-01-14T13:38:43.035455291Z" level=info msg="StopPodSandbox for \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\" returns successfully"
Jan 14 13:38:43.036118 containerd[1759]: time="2025-01-14T13:38:43.035839931Z" level=info msg="StopPodSandbox for \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\""
Jan 14 13:38:43.036118 containerd[1759]: time="2025-01-14T13:38:43.035911011Z" level=info msg="TearDown network for sandbox \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\" successfully"
Jan 14 13:38:43.036118 containerd[1759]: time="2025-01-14T13:38:43.035921011Z" level=info msg="StopPodSandbox for \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\" returns successfully"
Jan 14 13:38:43.037222 kubelet[3442]: I0114 13:38:43.037144    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d"
Jan 14 13:38:43.037567 containerd[1759]: time="2025-01-14T13:38:43.037544412Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-msgc9,Uid:c15a84fb-ac68-4acd-b385-a152a2911116,Namespace:kube-system,Attempt:4,}"
Jan 14 13:38:43.040904 containerd[1759]: time="2025-01-14T13:38:43.040826894Z" level=info msg="StopPodSandbox for \"a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d\""
Jan 14 13:38:43.041645 containerd[1759]: time="2025-01-14T13:38:43.041619734Z" level=info msg="Ensure that sandbox a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d in task-service has been cleanup successfully"
Jan 14 13:38:43.043936 systemd[1]: run-netns-cni\x2de494a3ed\x2d4397\x2d9292\x2d5254\x2d4520f8cb117a.mount: Deactivated successfully.
Jan 14 13:38:43.044650 containerd[1759]: time="2025-01-14T13:38:43.044523935Z" level=info msg="TearDown network for sandbox \"a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d\" successfully"
Jan 14 13:38:43.045111 containerd[1759]: time="2025-01-14T13:38:43.044546615Z" level=info msg="StopPodSandbox for \"a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d\" returns successfully"
Jan 14 13:38:43.047238 containerd[1759]: time="2025-01-14T13:38:43.047152737Z" level=info msg="StopPodSandbox for \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\""
Jan 14 13:38:43.047238 containerd[1759]: time="2025-01-14T13:38:43.047233697Z" level=info msg="TearDown network for sandbox \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\" successfully"
Jan 14 13:38:43.047476 containerd[1759]: time="2025-01-14T13:38:43.047243457Z" level=info msg="StopPodSandbox for \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\" returns successfully"
Jan 14 13:38:43.048424 containerd[1759]: time="2025-01-14T13:38:43.048284177Z" level=info msg="StopPodSandbox for \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\""
Jan 14 13:38:43.048424 containerd[1759]: time="2025-01-14T13:38:43.048370337Z" level=info msg="TearDown network for sandbox \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\" successfully"
Jan 14 13:38:43.048424 containerd[1759]: time="2025-01-14T13:38:43.048380017Z" level=info msg="StopPodSandbox for \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\" returns successfully"
Jan 14 13:38:43.048799 kubelet[3442]: I0114 13:38:43.048617    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5"
Jan 14 13:38:43.049425 containerd[1759]: time="2025-01-14T13:38:43.049242538Z" level=info msg="StopPodSandbox for \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\""
Jan 14 13:38:43.049425 containerd[1759]: time="2025-01-14T13:38:43.049322538Z" level=info msg="TearDown network for sandbox \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\" successfully"
Jan 14 13:38:43.049425 containerd[1759]: time="2025-01-14T13:38:43.049332058Z" level=info msg="StopPodSandbox for \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\" returns successfully"
Jan 14 13:38:43.050496 containerd[1759]: time="2025-01-14T13:38:43.050458578Z" level=info msg="StopPodSandbox for \"1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5\""
Jan 14 13:38:43.050637 containerd[1759]: time="2025-01-14T13:38:43.050613139Z" level=info msg="Ensure that sandbox 1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5 in task-service has been cleanup successfully"
Jan 14 13:38:43.051982 containerd[1759]: time="2025-01-14T13:38:43.051901899Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7bk8l,Uid:af70d9ef-1b20-4f9f-93e1-f55f680c58b4,Namespace:calico-system,Attempt:4,}"
Jan 14 13:38:43.053192 containerd[1759]: time="2025-01-14T13:38:43.053111340Z" level=info msg="TearDown network for sandbox \"1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5\" successfully"
Jan 14 13:38:43.053192 containerd[1759]: time="2025-01-14T13:38:43.053135860Z" level=info msg="StopPodSandbox for \"1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5\" returns successfully"
Jan 14 13:38:43.055819 containerd[1759]: time="2025-01-14T13:38:43.055732461Z" level=info msg="StopPodSandbox for \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\""
Jan 14 13:38:43.055819 containerd[1759]: time="2025-01-14T13:38:43.055812581Z" level=info msg="TearDown network for sandbox \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\" successfully"
Jan 14 13:38:43.055819 containerd[1759]: time="2025-01-14T13:38:43.055822701Z" level=info msg="StopPodSandbox for \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\" returns successfully"
Jan 14 13:38:43.058064 containerd[1759]: time="2025-01-14T13:38:43.058033542Z" level=info msg="StopPodSandbox for \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\""
Jan 14 13:38:43.058226 containerd[1759]: time="2025-01-14T13:38:43.058113142Z" level=info msg="TearDown network for sandbox \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\" successfully"
Jan 14 13:38:43.058226 containerd[1759]: time="2025-01-14T13:38:43.058123422Z" level=info msg="StopPodSandbox for \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\" returns successfully"
Jan 14 13:38:43.058647 kubelet[3442]: I0114 13:38:43.058480    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf"
Jan 14 13:38:43.058815 containerd[1759]: time="2025-01-14T13:38:43.058735023Z" level=info msg="StopPodSandbox for \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\""
Jan 14 13:38:43.059031 containerd[1759]: time="2025-01-14T13:38:43.058924503Z" level=info msg="TearDown network for sandbox \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\" successfully"
Jan 14 13:38:43.059031 containerd[1759]: time="2025-01-14T13:38:43.058939463Z" level=info msg="StopPodSandbox for \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\" returns successfully"
Jan 14 13:38:43.059158 containerd[1759]: time="2025-01-14T13:38:43.059106263Z" level=info msg="StopPodSandbox for \"e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf\""
Jan 14 13:38:43.060577 containerd[1759]: time="2025-01-14T13:38:43.060288223Z" level=info msg="Ensure that sandbox e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf in task-service has been cleanup successfully"
Jan 14 13:38:43.061841 containerd[1759]: time="2025-01-14T13:38:43.060987544Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-zcsrg,Uid:35d51e43-1c53-427a-a8d2-bd422d727c5b,Namespace:calico-apiserver,Attempt:4,}"
Jan 14 13:38:43.062544 containerd[1759]: time="2025-01-14T13:38:43.062518665Z" level=info msg="TearDown network for sandbox \"e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf\" successfully"
Jan 14 13:38:43.062732 containerd[1759]: time="2025-01-14T13:38:43.062715585Z" level=info msg="StopPodSandbox for \"e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf\" returns successfully"
Jan 14 13:38:43.064579 containerd[1759]: time="2025-01-14T13:38:43.064552146Z" level=info msg="StopPodSandbox for \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\""
Jan 14 13:38:43.064653 containerd[1759]: time="2025-01-14T13:38:43.064634746Z" level=info msg="TearDown network for sandbox \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\" successfully"
Jan 14 13:38:43.064653 containerd[1759]: time="2025-01-14T13:38:43.064644346Z" level=info msg="StopPodSandbox for \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\" returns successfully"
Jan 14 13:38:43.065146 containerd[1759]: time="2025-01-14T13:38:43.065123306Z" level=info msg="StopPodSandbox for \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\""
Jan 14 13:38:43.065303 containerd[1759]: time="2025-01-14T13:38:43.065286546Z" level=info msg="TearDown network for sandbox \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\" successfully"
Jan 14 13:38:43.065370 containerd[1759]: time="2025-01-14T13:38:43.065355826Z" level=info msg="StopPodSandbox for \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\" returns successfully"
Jan 14 13:38:43.066158 containerd[1759]: time="2025-01-14T13:38:43.066092026Z" level=info msg="StopPodSandbox for \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\""
Jan 14 13:38:43.066238 containerd[1759]: time="2025-01-14T13:38:43.066183227Z" level=info msg="TearDown network for sandbox \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\" successfully"
Jan 14 13:38:43.066238 containerd[1759]: time="2025-01-14T13:38:43.066194187Z" level=info msg="StopPodSandbox for \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\" returns successfully"
Jan 14 13:38:43.067273 containerd[1759]: time="2025-01-14T13:38:43.067098867Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2fhlb,Uid:af2992f6-a261-4e1c-9cfc-1ef759086d8d,Namespace:kube-system,Attempt:4,}"
Jan 14 13:38:43.343907 containerd[1759]: time="2025-01-14T13:38:43.342955568Z" level=error msg="Failed to destroy network for sandbox \"5f47f137b039997796869fe562c299eeb7c123bbc85094e6a6cb54bdfe48aaf4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:43.345024 containerd[1759]: time="2025-01-14T13:38:43.344452369Z" level=error msg="encountered an error cleaning up failed sandbox \"5f47f137b039997796869fe562c299eeb7c123bbc85094e6a6cb54bdfe48aaf4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:43.345024 containerd[1759]: time="2025-01-14T13:38:43.344728089Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-9xx9c,Uid:f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"5f47f137b039997796869fe562c299eeb7c123bbc85094e6a6cb54bdfe48aaf4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:43.346715 kubelet[3442]: E0114 13:38:43.345324    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f47f137b039997796869fe562c299eeb7c123bbc85094e6a6cb54bdfe48aaf4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:43.346715 kubelet[3442]: E0114 13:38:43.345375    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f47f137b039997796869fe562c299eeb7c123bbc85094e6a6cb54bdfe48aaf4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-9xx9c"
Jan 14 13:38:43.346715 kubelet[3442]: E0114 13:38:43.345394    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f47f137b039997796869fe562c299eeb7c123bbc85094e6a6cb54bdfe48aaf4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-9xx9c"
Jan 14 13:38:43.346838 kubelet[3442]: E0114 13:38:43.345442    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7478669d4-9xx9c_calico-apiserver(f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7478669d4-9xx9c_calico-apiserver(f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5f47f137b039997796869fe562c299eeb7c123bbc85094e6a6cb54bdfe48aaf4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7478669d4-9xx9c" podUID="f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1"
Jan 14 13:38:43.444751 containerd[1759]: time="2025-01-14T13:38:43.444542180Z" level=error msg="Failed to destroy network for sandbox \"b25e2b01bbc2a4e604157740144b7276644b40fc3d95844bb4b4dfb88f0b6101\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:43.449491 containerd[1759]: time="2025-01-14T13:38:43.448714222Z" level=error msg="encountered an error cleaning up failed sandbox \"b25e2b01bbc2a4e604157740144b7276644b40fc3d95844bb4b4dfb88f0b6101\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:43.449491 containerd[1759]: time="2025-01-14T13:38:43.449385063Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-msgc9,Uid:c15a84fb-ac68-4acd-b385-a152a2911116,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"b25e2b01bbc2a4e604157740144b7276644b40fc3d95844bb4b4dfb88f0b6101\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:43.450049 kubelet[3442]: E0114 13:38:43.449810    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b25e2b01bbc2a4e604157740144b7276644b40fc3d95844bb4b4dfb88f0b6101\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:43.450049 kubelet[3442]: E0114 13:38:43.449865    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b25e2b01bbc2a4e604157740144b7276644b40fc3d95844bb4b4dfb88f0b6101\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-msgc9"
Jan 14 13:38:43.450049 kubelet[3442]: E0114 13:38:43.449885    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b25e2b01bbc2a4e604157740144b7276644b40fc3d95844bb4b4dfb88f0b6101\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-msgc9"
Jan 14 13:38:43.450152 kubelet[3442]: E0114 13:38:43.449934    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-msgc9_kube-system(c15a84fb-ac68-4acd-b385-a152a2911116)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-msgc9_kube-system(c15a84fb-ac68-4acd-b385-a152a2911116)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b25e2b01bbc2a4e604157740144b7276644b40fc3d95844bb4b4dfb88f0b6101\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-msgc9" podUID="c15a84fb-ac68-4acd-b385-a152a2911116"
Jan 14 13:38:43.453826 containerd[1759]: time="2025-01-14T13:38:43.453736425Z" level=error msg="Failed to destroy network for sandbox \"ea32c5d5841c18b721b720c17a89e0f12374a58e62e420be87dbcd53c441bb77\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:43.454537 containerd[1759]: time="2025-01-14T13:38:43.454176585Z" level=error msg="encountered an error cleaning up failed sandbox \"ea32c5d5841c18b721b720c17a89e0f12374a58e62e420be87dbcd53c441bb77\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:43.454537 containerd[1759]: time="2025-01-14T13:38:43.454233065Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7bk8l,Uid:af70d9ef-1b20-4f9f-93e1-f55f680c58b4,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"ea32c5d5841c18b721b720c17a89e0f12374a58e62e420be87dbcd53c441bb77\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:43.454652 kubelet[3442]: E0114 13:38:43.454390    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ea32c5d5841c18b721b720c17a89e0f12374a58e62e420be87dbcd53c441bb77\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:43.454652 kubelet[3442]: E0114 13:38:43.454432    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ea32c5d5841c18b721b720c17a89e0f12374a58e62e420be87dbcd53c441bb77\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7bk8l"
Jan 14 13:38:43.454652 kubelet[3442]: E0114 13:38:43.454453    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ea32c5d5841c18b721b720c17a89e0f12374a58e62e420be87dbcd53c441bb77\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7bk8l"
Jan 14 13:38:43.455380 kubelet[3442]: E0114 13:38:43.454496    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-7bk8l_calico-system(af70d9ef-1b20-4f9f-93e1-f55f680c58b4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-7bk8l_calico-system(af70d9ef-1b20-4f9f-93e1-f55f680c58b4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ea32c5d5841c18b721b720c17a89e0f12374a58e62e420be87dbcd53c441bb77\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-7bk8l" podUID="af70d9ef-1b20-4f9f-93e1-f55f680c58b4"
Jan 14 13:38:43.463529 containerd[1759]: time="2025-01-14T13:38:43.463490190Z" level=error msg="Failed to destroy network for sandbox \"5624843d13cc48752b4f9f6b4233747dc115df765e427c6b3220b7e716612083\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:43.463947 containerd[1759]: time="2025-01-14T13:38:43.463911990Z" level=error msg="encountered an error cleaning up failed sandbox \"5624843d13cc48752b4f9f6b4233747dc115df765e427c6b3220b7e716612083\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:43.464171 containerd[1759]: time="2025-01-14T13:38:43.464099110Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8555cc7446-bv6h2,Uid:f475c534-3126-4484-b2ac-7f780fe28e12,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"5624843d13cc48752b4f9f6b4233747dc115df765e427c6b3220b7e716612083\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:43.464802 kubelet[3442]: E0114 13:38:43.464467    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5624843d13cc48752b4f9f6b4233747dc115df765e427c6b3220b7e716612083\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:43.464802 kubelet[3442]: E0114 13:38:43.464515    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5624843d13cc48752b4f9f6b4233747dc115df765e427c6b3220b7e716612083\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8555cc7446-bv6h2"
Jan 14 13:38:43.464802 kubelet[3442]: E0114 13:38:43.464536    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5624843d13cc48752b4f9f6b4233747dc115df765e427c6b3220b7e716612083\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8555cc7446-bv6h2"
Jan 14 13:38:43.464933 kubelet[3442]: E0114 13:38:43.464581    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-8555cc7446-bv6h2_calico-system(f475c534-3126-4484-b2ac-7f780fe28e12)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-8555cc7446-bv6h2_calico-system(f475c534-3126-4484-b2ac-7f780fe28e12)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5624843d13cc48752b4f9f6b4233747dc115df765e427c6b3220b7e716612083\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-8555cc7446-bv6h2" podUID="f475c534-3126-4484-b2ac-7f780fe28e12"
Jan 14 13:38:43.466420 containerd[1759]: time="2025-01-14T13:38:43.466353911Z" level=error msg="Failed to destroy network for sandbox \"6db7c174b4a2e2a91c30fbadca47ed59a9e1253d3c7673ccdcdd2e0f1e9663da\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:43.467435 containerd[1759]: time="2025-01-14T13:38:43.467314312Z" level=error msg="encountered an error cleaning up failed sandbox \"6db7c174b4a2e2a91c30fbadca47ed59a9e1253d3c7673ccdcdd2e0f1e9663da\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:43.467435 containerd[1759]: time="2025-01-14T13:38:43.467369752Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-zcsrg,Uid:35d51e43-1c53-427a-a8d2-bd422d727c5b,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"6db7c174b4a2e2a91c30fbadca47ed59a9e1253d3c7673ccdcdd2e0f1e9663da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:43.467979 kubelet[3442]: E0114 13:38:43.467873    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6db7c174b4a2e2a91c30fbadca47ed59a9e1253d3c7673ccdcdd2e0f1e9663da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:43.468165 kubelet[3442]: E0114 13:38:43.468073    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6db7c174b4a2e2a91c30fbadca47ed59a9e1253d3c7673ccdcdd2e0f1e9663da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-zcsrg"
Jan 14 13:38:43.468165 kubelet[3442]: E0114 13:38:43.468096    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6db7c174b4a2e2a91c30fbadca47ed59a9e1253d3c7673ccdcdd2e0f1e9663da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-zcsrg"
Jan 14 13:38:43.468576 kubelet[3442]: E0114 13:38:43.468351    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7478669d4-zcsrg_calico-apiserver(35d51e43-1c53-427a-a8d2-bd422d727c5b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7478669d4-zcsrg_calico-apiserver(35d51e43-1c53-427a-a8d2-bd422d727c5b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6db7c174b4a2e2a91c30fbadca47ed59a9e1253d3c7673ccdcdd2e0f1e9663da\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7478669d4-zcsrg" podUID="35d51e43-1c53-427a-a8d2-bd422d727c5b"
Jan 14 13:38:43.483735 containerd[1759]: time="2025-01-14T13:38:43.483615400Z" level=error msg="Failed to destroy network for sandbox \"52c1e2cb647327fef74966b5ff40fa665c7dfab5ab87df1cfe2b9f72c1c82bfe\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:43.484611 containerd[1759]: time="2025-01-14T13:38:43.484502241Z" level=error msg="encountered an error cleaning up failed sandbox \"52c1e2cb647327fef74966b5ff40fa665c7dfab5ab87df1cfe2b9f72c1c82bfe\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:43.484611 containerd[1759]: time="2025-01-14T13:38:43.484597241Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2fhlb,Uid:af2992f6-a261-4e1c-9cfc-1ef759086d8d,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"52c1e2cb647327fef74966b5ff40fa665c7dfab5ab87df1cfe2b9f72c1c82bfe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:43.485238 kubelet[3442]: E0114 13:38:43.484858    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"52c1e2cb647327fef74966b5ff40fa665c7dfab5ab87df1cfe2b9f72c1c82bfe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:43.485238 kubelet[3442]: E0114 13:38:43.484907    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"52c1e2cb647327fef74966b5ff40fa665c7dfab5ab87df1cfe2b9f72c1c82bfe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-2fhlb"
Jan 14 13:38:43.485238 kubelet[3442]: E0114 13:38:43.484927    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"52c1e2cb647327fef74966b5ff40fa665c7dfab5ab87df1cfe2b9f72c1c82bfe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-2fhlb"
Jan 14 13:38:43.485326 kubelet[3442]: E0114 13:38:43.484982    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-2fhlb_kube-system(af2992f6-a261-4e1c-9cfc-1ef759086d8d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-2fhlb_kube-system(af2992f6-a261-4e1c-9cfc-1ef759086d8d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"52c1e2cb647327fef74966b5ff40fa665c7dfab5ab87df1cfe2b9f72c1c82bfe\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-2fhlb" podUID="af2992f6-a261-4e1c-9cfc-1ef759086d8d"
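Every failure in the burst above bottoms out in the same CNI message: the calico plugin cannot stat /var/lib/calico/nodename, the file the calico/node container is expected to write and bind-mount for the plugin. Below is a minimal Go sketch of the kind of check that message describes; the path and the appended hint text are taken verbatim from the log lines, while the constant and function names are purely illustrative and are not Calico's actual code.

```go
package main

import (
	"fmt"
	"os"
)

// nodenamePath is the file the error lines above point at; the calico/node
// container is expected to write the node name here and bind-mount the
// directory so the CNI plugin can read it. (Illustrative constant name.)
const nodenamePath = "/var/lib/calico/nodename"

// readNodename is an illustrative stand-in, not Calico's real implementation,
// for the check behind "stat /var/lib/calico/nodename: no such file or directory".
func readNodename() (string, error) {
	if _, err := os.Stat(nodenamePath); err != nil {
		// Mirror the hint that the plugin appends in the log lines above.
		return "", fmt.Errorf("stat %s: %w: check that the calico/node container is running and has mounted /var/lib/calico/", nodenamePath, err)
	}
	b, err := os.ReadFile(nodenamePath)
	if err != nil {
		return "", err
	}
	return string(b), nil
}

func main() {
	name, err := readNodename()
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("node name:", name)
}
```
Until that file exists on the node, every pod-sandbox add and delete fails the same way, which is why the identical error repeats for each pod below.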
Jan 14 13:38:43.945331 systemd[1]: run-netns-cni\x2daf88834e\x2de33f\x2d123f\x2d5aa5\x2db5c39d348ddc.mount: Deactivated successfully.
Jan 14 13:38:43.945566 systemd[1]: run-netns-cni\x2d6954abe1\x2d5b4e\x2d7488\x2de923\x2d5e2af6e7afaf.mount: Deactivated successfully.
Jan 14 13:38:44.067269 kubelet[3442]: I0114 13:38:44.067055    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6db7c174b4a2e2a91c30fbadca47ed59a9e1253d3c7673ccdcdd2e0f1e9663da"
Jan 14 13:38:44.070792 containerd[1759]: time="2025-01-14T13:38:44.069671700Z" level=info msg="StopPodSandbox for \"6db7c174b4a2e2a91c30fbadca47ed59a9e1253d3c7673ccdcdd2e0f1e9663da\""
Jan 14 13:38:44.070792 containerd[1759]: time="2025-01-14T13:38:44.069849300Z" level=info msg="Ensure that sandbox 6db7c174b4a2e2a91c30fbadca47ed59a9e1253d3c7673ccdcdd2e0f1e9663da in task-service has been cleanup successfully"
Jan 14 13:38:44.075846 containerd[1759]: time="2025-01-14T13:38:44.073131422Z" level=info msg="TearDown network for sandbox \"6db7c174b4a2e2a91c30fbadca47ed59a9e1253d3c7673ccdcdd2e0f1e9663da\" successfully"
Jan 14 13:38:44.075846 containerd[1759]: time="2025-01-14T13:38:44.073161422Z" level=info msg="StopPodSandbox for \"6db7c174b4a2e2a91c30fbadca47ed59a9e1253d3c7673ccdcdd2e0f1e9663da\" returns successfully"
Jan 14 13:38:44.075876 systemd[1]: run-netns-cni\x2d4ddcd71f\x2de12d\x2d53e8\x2d337c\x2d40b79aa720d0.mount: Deactivated successfully.
Jan 14 13:38:44.076977 containerd[1759]: time="2025-01-14T13:38:44.076537504Z" level=info msg="StopPodSandbox for \"1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5\""
Jan 14 13:38:44.076977 containerd[1759]: time="2025-01-14T13:38:44.076620624Z" level=info msg="TearDown network for sandbox \"1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5\" successfully"
Jan 14 13:38:44.076977 containerd[1759]: time="2025-01-14T13:38:44.076632024Z" level=info msg="StopPodSandbox for \"1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5\" returns successfully"
Jan 14 13:38:44.079654 containerd[1759]: time="2025-01-14T13:38:44.079552545Z" level=info msg="StopPodSandbox for \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\""
Jan 14 13:38:44.079654 containerd[1759]: time="2025-01-14T13:38:44.079655985Z" level=info msg="TearDown network for sandbox \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\" successfully"
Jan 14 13:38:44.079874 containerd[1759]: time="2025-01-14T13:38:44.079666905Z" level=info msg="StopPodSandbox for \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\" returns successfully"
Jan 14 13:38:44.080023 containerd[1759]: time="2025-01-14T13:38:44.079963225Z" level=info msg="StopPodSandbox for \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\""
Jan 14 13:38:44.080125 containerd[1759]: time="2025-01-14T13:38:44.080068145Z" level=info msg="TearDown network for sandbox \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\" successfully"
Jan 14 13:38:44.080125 containerd[1759]: time="2025-01-14T13:38:44.080084825Z" level=info msg="StopPodSandbox for \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\" returns successfully"
Jan 14 13:38:44.080821 containerd[1759]: time="2025-01-14T13:38:44.080784106Z" level=info msg="StopPodSandbox for \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\""
Jan 14 13:38:44.080892 containerd[1759]: time="2025-01-14T13:38:44.080860226Z" level=info msg="TearDown network for sandbox \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\" successfully"
Jan 14 13:38:44.080892 containerd[1759]: time="2025-01-14T13:38:44.080871706Z" level=info msg="StopPodSandbox for \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\" returns successfully"
Jan 14 13:38:44.082319 containerd[1759]: time="2025-01-14T13:38:44.082279226Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-zcsrg,Uid:35d51e43-1c53-427a-a8d2-bd422d727c5b,Namespace:calico-apiserver,Attempt:5,}"
Jan 14 13:38:44.083482 kubelet[3442]: I0114 13:38:44.083306    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52c1e2cb647327fef74966b5ff40fa665c7dfab5ab87df1cfe2b9f72c1c82bfe"
Jan 14 13:38:44.084806 containerd[1759]: time="2025-01-14T13:38:44.084682508Z" level=info msg="StopPodSandbox for \"52c1e2cb647327fef74966b5ff40fa665c7dfab5ab87df1cfe2b9f72c1c82bfe\""
Jan 14 13:38:44.084881 containerd[1759]: time="2025-01-14T13:38:44.084833788Z" level=info msg="Ensure that sandbox 52c1e2cb647327fef74966b5ff40fa665c7dfab5ab87df1cfe2b9f72c1c82bfe in task-service has been cleanup successfully"
Jan 14 13:38:44.087650 containerd[1759]: time="2025-01-14T13:38:44.087453149Z" level=info msg="TearDown network for sandbox \"52c1e2cb647327fef74966b5ff40fa665c7dfab5ab87df1cfe2b9f72c1c82bfe\" successfully"
Jan 14 13:38:44.087650 containerd[1759]: time="2025-01-14T13:38:44.087479149Z" level=info msg="StopPodSandbox for \"52c1e2cb647327fef74966b5ff40fa665c7dfab5ab87df1cfe2b9f72c1c82bfe\" returns successfully"
Jan 14 13:38:44.089429 systemd[1]: run-netns-cni\x2d3df65af7\x2dfb57\x2d4b65\x2d9513\x2dc6b8eca8198e.mount: Deactivated successfully.
Jan 14 13:38:44.090514 containerd[1759]: time="2025-01-14T13:38:44.090439551Z" level=info msg="StopPodSandbox for \"e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf\""
Jan 14 13:38:44.092294 containerd[1759]: time="2025-01-14T13:38:44.092232712Z" level=info msg="TearDown network for sandbox \"e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf\" successfully"
Jan 14 13:38:44.092294 containerd[1759]: time="2025-01-14T13:38:44.092258072Z" level=info msg="StopPodSandbox for \"e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf\" returns successfully"
Jan 14 13:38:44.093457 kubelet[3442]: I0114 13:38:44.093350    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5624843d13cc48752b4f9f6b4233747dc115df765e427c6b3220b7e716612083"
Jan 14 13:38:44.097207 containerd[1759]: time="2025-01-14T13:38:44.094571393Z" level=info msg="StopPodSandbox for \"5624843d13cc48752b4f9f6b4233747dc115df765e427c6b3220b7e716612083\""
Jan 14 13:38:44.097207 containerd[1759]: time="2025-01-14T13:38:44.094723313Z" level=info msg="Ensure that sandbox 5624843d13cc48752b4f9f6b4233747dc115df765e427c6b3220b7e716612083 in task-service has been cleanup successfully"
Jan 14 13:38:44.097207 containerd[1759]: time="2025-01-14T13:38:44.094853113Z" level=info msg="StopPodSandbox for \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\""
Jan 14 13:38:44.097207 containerd[1759]: time="2025-01-14T13:38:44.094911033Z" level=info msg="TearDown network for sandbox \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\" successfully"
Jan 14 13:38:44.097207 containerd[1759]: time="2025-01-14T13:38:44.094919753Z" level=info msg="StopPodSandbox for \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\" returns successfully"
Jan 14 13:38:44.097207 containerd[1759]: time="2025-01-14T13:38:44.095303073Z" level=info msg="StopPodSandbox for \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\""
Jan 14 13:38:44.097207 containerd[1759]: time="2025-01-14T13:38:44.095370473Z" level=info msg="TearDown network for sandbox \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\" successfully"
Jan 14 13:38:44.097207 containerd[1759]: time="2025-01-14T13:38:44.095379113Z" level=info msg="StopPodSandbox for \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\" returns successfully"
Jan 14 13:38:44.097207 containerd[1759]: time="2025-01-14T13:38:44.095582873Z" level=info msg="StopPodSandbox for \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\""
Jan 14 13:38:44.097207 containerd[1759]: time="2025-01-14T13:38:44.095679633Z" level=info msg="TearDown network for sandbox \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\" successfully"
Jan 14 13:38:44.097207 containerd[1759]: time="2025-01-14T13:38:44.095688793Z" level=info msg="StopPodSandbox for \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\" returns successfully"
Jan 14 13:38:44.097685 containerd[1759]: time="2025-01-14T13:38:44.097325314Z" level=info msg="TearDown network for sandbox \"5624843d13cc48752b4f9f6b4233747dc115df765e427c6b3220b7e716612083\" successfully"
Jan 14 13:38:44.097685 containerd[1759]: time="2025-01-14T13:38:44.097342994Z" level=info msg="StopPodSandbox for \"5624843d13cc48752b4f9f6b4233747dc115df765e427c6b3220b7e716612083\" returns successfully"
Jan 14 13:38:44.098077 containerd[1759]: time="2025-01-14T13:38:44.097875154Z" level=info msg="StopPodSandbox for \"60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92\""
Jan 14 13:38:44.098077 containerd[1759]: time="2025-01-14T13:38:44.097951314Z" level=info msg="TearDown network for sandbox \"60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92\" successfully"
Jan 14 13:38:44.098077 containerd[1759]: time="2025-01-14T13:38:44.097960114Z" level=info msg="StopPodSandbox for \"60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92\" returns successfully"
Jan 14 13:38:44.098886 systemd[1]: run-netns-cni\x2df8c7dfde\x2d4263\x2d0172\x2d79d3\x2d9126a505b359.mount: Deactivated successfully.
Jan 14 13:38:44.100352 containerd[1759]: time="2025-01-14T13:38:44.100312116Z" level=info msg="StopPodSandbox for \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\""
Jan 14 13:38:44.101099 containerd[1759]: time="2025-01-14T13:38:44.100396796Z" level=info msg="TearDown network for sandbox \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\" successfully"
Jan 14 13:38:44.101099 containerd[1759]: time="2025-01-14T13:38:44.100407876Z" level=info msg="StopPodSandbox for \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\" returns successfully"
Jan 14 13:38:44.101099 containerd[1759]: time="2025-01-14T13:38:44.100780516Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2fhlb,Uid:af2992f6-a261-4e1c-9cfc-1ef759086d8d,Namespace:kube-system,Attempt:5,}"
Jan 14 13:38:44.102316 containerd[1759]: time="2025-01-14T13:38:44.102291997Z" level=info msg="StopPodSandbox for \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\""
Jan 14 13:38:44.107133 containerd[1759]: time="2025-01-14T13:38:44.102508277Z" level=info msg="TearDown network for sandbox \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\" successfully"
Jan 14 13:38:44.107133 containerd[1759]: time="2025-01-14T13:38:44.102522317Z" level=info msg="StopPodSandbox for \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\" returns successfully"
Jan 14 13:38:44.107133 containerd[1759]: time="2025-01-14T13:38:44.102764597Z" level=info msg="StopPodSandbox for \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\""
Jan 14 13:38:44.107133 containerd[1759]: time="2025-01-14T13:38:44.102983077Z" level=info msg="TearDown network for sandbox \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\" successfully"
Jan 14 13:38:44.107133 containerd[1759]: time="2025-01-14T13:38:44.103115597Z" level=info msg="StopPodSandbox for \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\" returns successfully"
Jan 14 13:38:44.107133 containerd[1759]: time="2025-01-14T13:38:44.104026598Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8555cc7446-bv6h2,Uid:f475c534-3126-4484-b2ac-7f780fe28e12,Namespace:calico-system,Attempt:5,}"
Jan 14 13:38:44.107133 containerd[1759]: time="2025-01-14T13:38:44.106656999Z" level=info msg="StopPodSandbox for \"b25e2b01bbc2a4e604157740144b7276644b40fc3d95844bb4b4dfb88f0b6101\""
Jan 14 13:38:44.107320 kubelet[3442]: I0114 13:38:44.105686    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b25e2b01bbc2a4e604157740144b7276644b40fc3d95844bb4b4dfb88f0b6101"
Jan 14 13:38:44.108596 containerd[1759]: time="2025-01-14T13:38:44.108433600Z" level=info msg="Ensure that sandbox b25e2b01bbc2a4e604157740144b7276644b40fc3d95844bb4b4dfb88f0b6101 in task-service has been cleanup successfully"
Jan 14 13:38:44.109344 containerd[1759]: time="2025-01-14T13:38:44.108771640Z" level=info msg="TearDown network for sandbox \"b25e2b01bbc2a4e604157740144b7276644b40fc3d95844bb4b4dfb88f0b6101\" successfully"
Jan 14 13:38:44.109344 containerd[1759]: time="2025-01-14T13:38:44.108792560Z" level=info msg="StopPodSandbox for \"b25e2b01bbc2a4e604157740144b7276644b40fc3d95844bb4b4dfb88f0b6101\" returns successfully"
Jan 14 13:38:44.111515 systemd[1]: run-netns-cni\x2dc338e5e3\x2d9084\x2d20f3\x2d1211\x2d104a3d80a00c.mount: Deactivated successfully.
Jan 14 13:38:44.113819 containerd[1759]: time="2025-01-14T13:38:44.113789163Z" level=info msg="StopPodSandbox for \"4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b\""
Jan 14 13:38:44.114446 containerd[1759]: time="2025-01-14T13:38:44.114170883Z" level=info msg="TearDown network for sandbox \"4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b\" successfully"
Jan 14 13:38:44.114446 containerd[1759]: time="2025-01-14T13:38:44.114189803Z" level=info msg="StopPodSandbox for \"4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b\" returns successfully"
Jan 14 13:38:44.115081 containerd[1759]: time="2025-01-14T13:38:44.114815803Z" level=info msg="StopPodSandbox for \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\""
Jan 14 13:38:44.115081 containerd[1759]: time="2025-01-14T13:38:44.114903243Z" level=info msg="TearDown network for sandbox \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\" successfully"
Jan 14 13:38:44.115081 containerd[1759]: time="2025-01-14T13:38:44.114913563Z" level=info msg="StopPodSandbox for \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\" returns successfully"
Jan 14 13:38:44.117075 containerd[1759]: time="2025-01-14T13:38:44.116957844Z" level=info msg="StopPodSandbox for \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\""
Jan 14 13:38:44.117075 containerd[1759]: time="2025-01-14T13:38:44.117050284Z" level=info msg="TearDown network for sandbox \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\" successfully"
Jan 14 13:38:44.117486 containerd[1759]: time="2025-01-14T13:38:44.117059964Z" level=info msg="StopPodSandbox for \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\" returns successfully"
Jan 14 13:38:44.118292 containerd[1759]: time="2025-01-14T13:38:44.118244525Z" level=info msg="StopPodSandbox for \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\""
Jan 14 13:38:44.118821 containerd[1759]: time="2025-01-14T13:38:44.118317605Z" level=info msg="TearDown network for sandbox \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\" successfully"
Jan 14 13:38:44.118821 containerd[1759]: time="2025-01-14T13:38:44.118327445Z" level=info msg="StopPodSandbox for \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\" returns successfully"
Jan 14 13:38:44.118821 containerd[1759]: time="2025-01-14T13:38:44.118715365Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-msgc9,Uid:c15a84fb-ac68-4acd-b385-a152a2911116,Namespace:kube-system,Attempt:5,}"
Jan 14 13:38:44.119447 kubelet[3442]: I0114 13:38:44.118678    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea32c5d5841c18b721b720c17a89e0f12374a58e62e420be87dbcd53c441bb77"
Jan 14 13:38:44.120680 containerd[1759]: time="2025-01-14T13:38:44.120444806Z" level=info msg="StopPodSandbox for \"ea32c5d5841c18b721b720c17a89e0f12374a58e62e420be87dbcd53c441bb77\""
Jan 14 13:38:44.121094 containerd[1759]: time="2025-01-14T13:38:44.120790166Z" level=info msg="Ensure that sandbox ea32c5d5841c18b721b720c17a89e0f12374a58e62e420be87dbcd53c441bb77 in task-service has been cleanup successfully"
Jan 14 13:38:44.122006 containerd[1759]: time="2025-01-14T13:38:44.121968007Z" level=info msg="TearDown network for sandbox \"ea32c5d5841c18b721b720c17a89e0f12374a58e62e420be87dbcd53c441bb77\" successfully"
Jan 14 13:38:44.122006 containerd[1759]: time="2025-01-14T13:38:44.122005647Z" level=info msg="StopPodSandbox for \"ea32c5d5841c18b721b720c17a89e0f12374a58e62e420be87dbcd53c441bb77\" returns successfully"
Jan 14 13:38:44.122442 containerd[1759]: time="2025-01-14T13:38:44.122398887Z" level=info msg="StopPodSandbox for \"a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d\""
Jan 14 13:38:44.122489 containerd[1759]: time="2025-01-14T13:38:44.122467567Z" level=info msg="TearDown network for sandbox \"a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d\" successfully"
Jan 14 13:38:44.122489 containerd[1759]: time="2025-01-14T13:38:44.122477807Z" level=info msg="StopPodSandbox for \"a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d\" returns successfully"
Jan 14 13:38:44.122795 containerd[1759]: time="2025-01-14T13:38:44.122766127Z" level=info msg="StopPodSandbox for \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\""
Jan 14 13:38:44.122961 containerd[1759]: time="2025-01-14T13:38:44.122830527Z" level=info msg="TearDown network for sandbox \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\" successfully"
Jan 14 13:38:44.122961 containerd[1759]: time="2025-01-14T13:38:44.122840007Z" level=info msg="StopPodSandbox for \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\" returns successfully"
Jan 14 13:38:44.123422 containerd[1759]: time="2025-01-14T13:38:44.123312807Z" level=info msg="StopPodSandbox for \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\""
Jan 14 13:38:44.123422 containerd[1759]: time="2025-01-14T13:38:44.123389928Z" level=info msg="TearDown network for sandbox \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\" successfully"
Jan 14 13:38:44.123422 containerd[1759]: time="2025-01-14T13:38:44.123399928Z" level=info msg="StopPodSandbox for \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\" returns successfully"
Jan 14 13:38:44.123857 kubelet[3442]: I0114 13:38:44.123681    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f47f137b039997796869fe562c299eeb7c123bbc85094e6a6cb54bdfe48aaf4"
Jan 14 13:38:44.125960 containerd[1759]: time="2025-01-14T13:38:44.125911009Z" level=info msg="StopPodSandbox for \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\""
Jan 14 13:38:44.126216 containerd[1759]: time="2025-01-14T13:38:44.126042769Z" level=info msg="TearDown network for sandbox \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\" successfully"
Jan 14 13:38:44.126216 containerd[1759]: time="2025-01-14T13:38:44.126054129Z" level=info msg="StopPodSandbox for \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\" returns successfully"
Jan 14 13:38:44.128523 containerd[1759]: time="2025-01-14T13:38:44.128475130Z" level=info msg="StopPodSandbox for \"5f47f137b039997796869fe562c299eeb7c123bbc85094e6a6cb54bdfe48aaf4\""
Jan 14 13:38:44.129162 containerd[1759]: time="2025-01-14T13:38:44.129130290Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7bk8l,Uid:af70d9ef-1b20-4f9f-93e1-f55f680c58b4,Namespace:calico-system,Attempt:5,}"
Jan 14 13:38:44.131087 containerd[1759]: time="2025-01-14T13:38:44.131046411Z" level=info msg="Ensure that sandbox 5f47f137b039997796869fe562c299eeb7c123bbc85094e6a6cb54bdfe48aaf4 in task-service has been cleanup successfully"
Jan 14 13:38:44.131271 containerd[1759]: time="2025-01-14T13:38:44.131220412Z" level=info msg="TearDown network for sandbox \"5f47f137b039997796869fe562c299eeb7c123bbc85094e6a6cb54bdfe48aaf4\" successfully"
Jan 14 13:38:44.131271 containerd[1759]: time="2025-01-14T13:38:44.131237572Z" level=info msg="StopPodSandbox for \"5f47f137b039997796869fe562c299eeb7c123bbc85094e6a6cb54bdfe48aaf4\" returns successfully"
Jan 14 13:38:44.132451 containerd[1759]: time="2025-01-14T13:38:44.132286812Z" level=info msg="StopPodSandbox for \"841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931\""
Jan 14 13:38:44.133370 containerd[1759]: time="2025-01-14T13:38:44.133281533Z" level=info msg="TearDown network for sandbox \"841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931\" successfully"
Jan 14 13:38:44.133370 containerd[1759]: time="2025-01-14T13:38:44.133303213Z" level=info msg="StopPodSandbox for \"841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931\" returns successfully"
Jan 14 13:38:44.137170 containerd[1759]: time="2025-01-14T13:38:44.137086255Z" level=info msg="StopPodSandbox for \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\""
Jan 14 13:38:44.137274 containerd[1759]: time="2025-01-14T13:38:44.137176535Z" level=info msg="TearDown network for sandbox \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\" successfully"
Jan 14 13:38:44.137274 containerd[1759]: time="2025-01-14T13:38:44.137188215Z" level=info msg="StopPodSandbox for \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\" returns successfully"
Jan 14 13:38:44.140972 containerd[1759]: time="2025-01-14T13:38:44.140939856Z" level=info msg="StopPodSandbox for \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\""
Jan 14 13:38:44.143080 containerd[1759]: time="2025-01-14T13:38:44.143045338Z" level=info msg="TearDown network for sandbox \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\" successfully"
Jan 14 13:38:44.143080 containerd[1759]: time="2025-01-14T13:38:44.143071578Z" level=info msg="StopPodSandbox for \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\" returns successfully"
Jan 14 13:38:44.144173 containerd[1759]: time="2025-01-14T13:38:44.144046498Z" level=info msg="StopPodSandbox for \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\""
Jan 14 13:38:44.144437 containerd[1759]: time="2025-01-14T13:38:44.144394938Z" level=info msg="TearDown network for sandbox \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\" successfully"
Jan 14 13:38:44.144437 containerd[1759]: time="2025-01-14T13:38:44.144418938Z" level=info msg="StopPodSandbox for \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\" returns successfully"
Jan 14 13:38:44.145279 containerd[1759]: time="2025-01-14T13:38:44.145241459Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-9xx9c,Uid:f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1,Namespace:calico-apiserver,Attempt:5,}"
Jan 14 13:38:44.249525 containerd[1759]: time="2025-01-14T13:38:44.249344872Z" level=error msg="Failed to destroy network for sandbox \"fa943c87d584052299723783e0bfa7c22ccf7dd8b2d12c134357dbb5cf1d49a2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:44.251841 containerd[1759]: time="2025-01-14T13:38:44.251803833Z" level=error msg="encountered an error cleaning up failed sandbox \"fa943c87d584052299723783e0bfa7c22ccf7dd8b2d12c134357dbb5cf1d49a2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:44.251929 containerd[1759]: time="2025-01-14T13:38:44.251874593Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-zcsrg,Uid:35d51e43-1c53-427a-a8d2-bd422d727c5b,Namespace:calico-apiserver,Attempt:5,} failed, error" error="failed to setup network for sandbox \"fa943c87d584052299723783e0bfa7c22ccf7dd8b2d12c134357dbb5cf1d49a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:44.252740 kubelet[3442]: E0114 13:38:44.252706    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fa943c87d584052299723783e0bfa7c22ccf7dd8b2d12c134357dbb5cf1d49a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:44.252740 kubelet[3442]: E0114 13:38:44.252768    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fa943c87d584052299723783e0bfa7c22ccf7dd8b2d12c134357dbb5cf1d49a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-zcsrg"
Jan 14 13:38:44.252911 kubelet[3442]: E0114 13:38:44.252888    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fa943c87d584052299723783e0bfa7c22ccf7dd8b2d12c134357dbb5cf1d49a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-zcsrg"
Jan 14 13:38:44.253161 kubelet[3442]: E0114 13:38:44.252955    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7478669d4-zcsrg_calico-apiserver(35d51e43-1c53-427a-a8d2-bd422d727c5b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7478669d4-zcsrg_calico-apiserver(35d51e43-1c53-427a-a8d2-bd422d727c5b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fa943c87d584052299723783e0bfa7c22ccf7dd8b2d12c134357dbb5cf1d49a2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7478669d4-zcsrg" podUID="35d51e43-1c53-427a-a8d2-bd422d727c5b"
Jan 14 13:38:44.388172 containerd[1759]: time="2025-01-14T13:38:44.388121543Z" level=error msg="Failed to destroy network for sandbox \"67b9a4dcac185723bb47072319b142a2f5a946493f9410b499f7a76e22329fb7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:44.388454 containerd[1759]: time="2025-01-14T13:38:44.388424423Z" level=error msg="encountered an error cleaning up failed sandbox \"67b9a4dcac185723bb47072319b142a2f5a946493f9410b499f7a76e22329fb7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:44.388614 containerd[1759]: time="2025-01-14T13:38:44.388482863Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2fhlb,Uid:af2992f6-a261-4e1c-9cfc-1ef759086d8d,Namespace:kube-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"67b9a4dcac185723bb47072319b142a2f5a946493f9410b499f7a76e22329fb7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:44.388768 kubelet[3442]: E0114 13:38:44.388742    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"67b9a4dcac185723bb47072319b142a2f5a946493f9410b499f7a76e22329fb7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:44.388926 kubelet[3442]: E0114 13:38:44.388805    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"67b9a4dcac185723bb47072319b142a2f5a946493f9410b499f7a76e22329fb7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-2fhlb"
Jan 14 13:38:44.388926 kubelet[3442]: E0114 13:38:44.388826    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"67b9a4dcac185723bb47072319b142a2f5a946493f9410b499f7a76e22329fb7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-2fhlb"
Jan 14 13:38:44.388926 kubelet[3442]: E0114 13:38:44.388890    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-2fhlb_kube-system(af2992f6-a261-4e1c-9cfc-1ef759086d8d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-2fhlb_kube-system(af2992f6-a261-4e1c-9cfc-1ef759086d8d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"67b9a4dcac185723bb47072319b142a2f5a946493f9410b499f7a76e22329fb7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-2fhlb" podUID="af2992f6-a261-4e1c-9cfc-1ef759086d8d"
Jan 14 13:38:44.445096 containerd[1759]: time="2025-01-14T13:38:44.444964332Z" level=error msg="Failed to destroy network for sandbox \"b2e96b6775ecfba35b59af46249769cdebbbb6bb7c875319cd566a8827b720b5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:44.447188 containerd[1759]: time="2025-01-14T13:38:44.447032133Z" level=error msg="encountered an error cleaning up failed sandbox \"b2e96b6775ecfba35b59af46249769cdebbbb6bb7c875319cd566a8827b720b5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:44.447880 containerd[1759]: time="2025-01-14T13:38:44.447851454Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-msgc9,Uid:c15a84fb-ac68-4acd-b385-a152a2911116,Namespace:kube-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"b2e96b6775ecfba35b59af46249769cdebbbb6bb7c875319cd566a8827b720b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:44.449162 kubelet[3442]: E0114 13:38:44.449037    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2e96b6775ecfba35b59af46249769cdebbbb6bb7c875319cd566a8827b720b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:44.449162 kubelet[3442]: E0114 13:38:44.449108    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2e96b6775ecfba35b59af46249769cdebbbb6bb7c875319cd566a8827b720b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-msgc9"
Jan 14 13:38:44.449162 kubelet[3442]: E0114 13:38:44.449133    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2e96b6775ecfba35b59af46249769cdebbbb6bb7c875319cd566a8827b720b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-msgc9"
Jan 14 13:38:44.449639 kubelet[3442]: E0114 13:38:44.449573    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-msgc9_kube-system(c15a84fb-ac68-4acd-b385-a152a2911116)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-msgc9_kube-system(c15a84fb-ac68-4acd-b385-a152a2911116)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b2e96b6775ecfba35b59af46249769cdebbbb6bb7c875319cd566a8827b720b5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-msgc9" podUID="c15a84fb-ac68-4acd-b385-a152a2911116"
Jan 14 13:38:44.458300 containerd[1759]: time="2025-01-14T13:38:44.458204939Z" level=error msg="Failed to destroy network for sandbox \"1d31ee1f06b5ba8e5167d75dc1363c0587f7ded1ed0460416729f6a6cf73d2d1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:44.459227 containerd[1759]: time="2025-01-14T13:38:44.459102939Z" level=error msg="encountered an error cleaning up failed sandbox \"1d31ee1f06b5ba8e5167d75dc1363c0587f7ded1ed0460416729f6a6cf73d2d1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:44.459227 containerd[1759]: time="2025-01-14T13:38:44.459202779Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8555cc7446-bv6h2,Uid:f475c534-3126-4484-b2ac-7f780fe28e12,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"1d31ee1f06b5ba8e5167d75dc1363c0587f7ded1ed0460416729f6a6cf73d2d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:44.459923 kubelet[3442]: E0114 13:38:44.459393    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d31ee1f06b5ba8e5167d75dc1363c0587f7ded1ed0460416729f6a6cf73d2d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:44.459923 kubelet[3442]: E0114 13:38:44.459433    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d31ee1f06b5ba8e5167d75dc1363c0587f7ded1ed0460416729f6a6cf73d2d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8555cc7446-bv6h2"
Jan 14 13:38:44.459923 kubelet[3442]: E0114 13:38:44.459458    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d31ee1f06b5ba8e5167d75dc1363c0587f7ded1ed0460416729f6a6cf73d2d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8555cc7446-bv6h2"
Jan 14 13:38:44.460169 kubelet[3442]: E0114 13:38:44.459511    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-8555cc7446-bv6h2_calico-system(f475c534-3126-4484-b2ac-7f780fe28e12)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-8555cc7446-bv6h2_calico-system(f475c534-3126-4484-b2ac-7f780fe28e12)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1d31ee1f06b5ba8e5167d75dc1363c0587f7ded1ed0460416729f6a6cf73d2d1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-8555cc7446-bv6h2" podUID="f475c534-3126-4484-b2ac-7f780fe28e12"
Jan 14 13:38:44.463196 containerd[1759]: time="2025-01-14T13:38:44.463066861Z" level=error msg="Failed to destroy network for sandbox \"d12ffc587626672804360aa4ee4df91d71f582754398565ceec309d37e3dd117\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:44.463444 containerd[1759]: time="2025-01-14T13:38:44.463380541Z" level=error msg="encountered an error cleaning up failed sandbox \"d12ffc587626672804360aa4ee4df91d71f582754398565ceec309d37e3dd117\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:44.463555 containerd[1759]: time="2025-01-14T13:38:44.463439942Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7bk8l,Uid:af70d9ef-1b20-4f9f-93e1-f55f680c58b4,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"d12ffc587626672804360aa4ee4df91d71f582754398565ceec309d37e3dd117\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:44.463691 kubelet[3442]: E0114 13:38:44.463601    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d12ffc587626672804360aa4ee4df91d71f582754398565ceec309d37e3dd117\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:44.463691 kubelet[3442]: E0114 13:38:44.463637    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d12ffc587626672804360aa4ee4df91d71f582754398565ceec309d37e3dd117\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7bk8l"
Jan 14 13:38:44.463691 kubelet[3442]: E0114 13:38:44.463657    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d12ffc587626672804360aa4ee4df91d71f582754398565ceec309d37e3dd117\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7bk8l"
Jan 14 13:38:44.463807 kubelet[3442]: E0114 13:38:44.463698    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-7bk8l_calico-system(af70d9ef-1b20-4f9f-93e1-f55f680c58b4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-7bk8l_calico-system(af70d9ef-1b20-4f9f-93e1-f55f680c58b4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d12ffc587626672804360aa4ee4df91d71f582754398565ceec309d37e3dd117\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-7bk8l" podUID="af70d9ef-1b20-4f9f-93e1-f55f680c58b4"
Jan 14 13:38:44.487017 containerd[1759]: time="2025-01-14T13:38:44.485834313Z" level=error msg="Failed to destroy network for sandbox \"b2e7c993e731d79ffdbc9e501948b9ef4683bd5d915ec929dd75e1598e563288\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:44.487017 containerd[1759]: time="2025-01-14T13:38:44.486152913Z" level=error msg="encountered an error cleaning up failed sandbox \"b2e7c993e731d79ffdbc9e501948b9ef4683bd5d915ec929dd75e1598e563288\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:44.487017 containerd[1759]: time="2025-01-14T13:38:44.486210233Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-9xx9c,Uid:f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1,Namespace:calico-apiserver,Attempt:5,} failed, error" error="failed to setup network for sandbox \"b2e7c993e731d79ffdbc9e501948b9ef4683bd5d915ec929dd75e1598e563288\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:44.487215 kubelet[3442]: E0114 13:38:44.486530    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2e7c993e731d79ffdbc9e501948b9ef4683bd5d915ec929dd75e1598e563288\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:44.487215 kubelet[3442]: E0114 13:38:44.486584    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2e7c993e731d79ffdbc9e501948b9ef4683bd5d915ec929dd75e1598e563288\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-9xx9c"
Jan 14 13:38:44.487215 kubelet[3442]: E0114 13:38:44.486604    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2e7c993e731d79ffdbc9e501948b9ef4683bd5d915ec929dd75e1598e563288\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-9xx9c"
Jan 14 13:38:44.487295 kubelet[3442]: E0114 13:38:44.486656    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7478669d4-9xx9c_calico-apiserver(f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7478669d4-9xx9c_calico-apiserver(f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b2e7c993e731d79ffdbc9e501948b9ef4683bd5d915ec929dd75e1598e563288\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7478669d4-9xx9c" podUID="f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1"
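The kubelet lines above show the same containerd error again after it has crossed the CRI gRPC boundary: a plain error returned by the runtime reaches kubelet as code = Unknown with the original text carried as the description, which kubelet then prints as "rpc error: code = Unknown desc = ...". A small hedged sketch of that mapping using the standard google.golang.org/grpc/status package; the error string is abbreviated from the log and the program is illustrative only, not kubelet or containerd code.

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

func main() {
	// A plain runtime-side error surfaces over gRPC as code Unknown with the
	// message as the description, matching the kubelet lines above.
	err := status.Error(codes.Unknown,
		`failed to setup network for sandbox: plugin type="calico" failed (add): stat /var/lib/calico/nodename: no such file or directory`)

	if s, ok := status.FromError(err); ok {
		fmt.Println("code:", s.Code())    // Unknown
		fmt.Println("desc:", s.Message()) // the CNI error text kubelet re-logs
	}
}
```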
Jan 14 13:38:44.947825 systemd[1]: run-netns-cni\x2dac453c08\x2da02f\x2d6d8e\x2dcb99\x2dcb633be047c7.mount: Deactivated successfully.
Jan 14 13:38:44.948176 systemd[1]: run-netns-cni\x2d228edab0\x2da582\x2dfe3c\x2d2d9a\x2dfd27e56c4415.mount: Deactivated successfully.
Jan 14 13:38:45.127566 kubelet[3442]: I0114 13:38:45.127532    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2e96b6775ecfba35b59af46249769cdebbbb6bb7c875319cd566a8827b720b5"
Jan 14 13:38:45.129279 containerd[1759]: time="2025-01-14T13:38:45.129138962Z" level=info msg="StopPodSandbox for \"b2e96b6775ecfba35b59af46249769cdebbbb6bb7c875319cd566a8827b720b5\""
Jan 14 13:38:45.129696 containerd[1759]: time="2025-01-14T13:38:45.129314002Z" level=info msg="Ensure that sandbox b2e96b6775ecfba35b59af46249769cdebbbb6bb7c875319cd566a8827b720b5 in task-service has been cleanup successfully"
Jan 14 13:38:45.132696 systemd[1]: run-netns-cni\x2d850afd33\x2de258\x2d80ef\x2d3b12\x2de74f87a1c4ff.mount: Deactivated successfully.
Jan 14 13:38:45.133005 containerd[1759]: time="2025-01-14T13:38:45.132961764Z" level=info msg="TearDown network for sandbox \"b2e96b6775ecfba35b59af46249769cdebbbb6bb7c875319cd566a8827b720b5\" successfully"
Jan 14 13:38:45.133060 containerd[1759]: time="2025-01-14T13:38:45.133007684Z" level=info msg="StopPodSandbox for \"b2e96b6775ecfba35b59af46249769cdebbbb6bb7c875319cd566a8827b720b5\" returns successfully"
Jan 14 13:38:45.133713 containerd[1759]: time="2025-01-14T13:38:45.133683724Z" level=info msg="StopPodSandbox for \"b25e2b01bbc2a4e604157740144b7276644b40fc3d95844bb4b4dfb88f0b6101\""
Jan 14 13:38:45.133789 containerd[1759]: time="2025-01-14T13:38:45.133762125Z" level=info msg="TearDown network for sandbox \"b25e2b01bbc2a4e604157740144b7276644b40fc3d95844bb4b4dfb88f0b6101\" successfully"
Jan 14 13:38:45.133789 containerd[1759]: time="2025-01-14T13:38:45.133771125Z" level=info msg="StopPodSandbox for \"b25e2b01bbc2a4e604157740144b7276644b40fc3d95844bb4b4dfb88f0b6101\" returns successfully"
Jan 14 13:38:45.135273 containerd[1759]: time="2025-01-14T13:38:45.135138165Z" level=info msg="StopPodSandbox for \"4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b\""
Jan 14 13:38:45.135273 containerd[1759]: time="2025-01-14T13:38:45.135213005Z" level=info msg="TearDown network for sandbox \"4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b\" successfully"
Jan 14 13:38:45.135273 containerd[1759]: time="2025-01-14T13:38:45.135222365Z" level=info msg="StopPodSandbox for \"4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b\" returns successfully"
Jan 14 13:38:45.135893 containerd[1759]: time="2025-01-14T13:38:45.135865846Z" level=info msg="StopPodSandbox for \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\""
Jan 14 13:38:45.135963 containerd[1759]: time="2025-01-14T13:38:45.135943286Z" level=info msg="TearDown network for sandbox \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\" successfully"
Jan 14 13:38:45.135963 containerd[1759]: time="2025-01-14T13:38:45.135958646Z" level=info msg="StopPodSandbox for \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\" returns successfully"
Jan 14 13:38:45.136410 containerd[1759]: time="2025-01-14T13:38:45.136386246Z" level=info msg="StopPodSandbox for \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\""
Jan 14 13:38:45.136465 containerd[1759]: time="2025-01-14T13:38:45.136450246Z" level=info msg="TearDown network for sandbox \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\" successfully"
Jan 14 13:38:45.136465 containerd[1759]: time="2025-01-14T13:38:45.136459446Z" level=info msg="StopPodSandbox for \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\" returns successfully"
Jan 14 13:38:45.136858 containerd[1759]: time="2025-01-14T13:38:45.136833646Z" level=info msg="StopPodSandbox for \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\""
Jan 14 13:38:45.136929 containerd[1759]: time="2025-01-14T13:38:45.136908366Z" level=info msg="TearDown network for sandbox \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\" successfully"
Jan 14 13:38:45.136929 containerd[1759]: time="2025-01-14T13:38:45.136922646Z" level=info msg="StopPodSandbox for \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\" returns successfully"
Jan 14 13:38:45.138098 containerd[1759]: time="2025-01-14T13:38:45.137763967Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-msgc9,Uid:c15a84fb-ac68-4acd-b385-a152a2911116,Namespace:kube-system,Attempt:6,}"
Jan 14 13:38:45.141886 kubelet[3442]: I0114 13:38:45.141865    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d12ffc587626672804360aa4ee4df91d71f582754398565ceec309d37e3dd117"
Jan 14 13:38:45.142923 containerd[1759]: time="2025-01-14T13:38:45.142888729Z" level=info msg="StopPodSandbox for \"d12ffc587626672804360aa4ee4df91d71f582754398565ceec309d37e3dd117\""
Jan 14 13:38:45.143192 containerd[1759]: time="2025-01-14T13:38:45.143166809Z" level=info msg="Ensure that sandbox d12ffc587626672804360aa4ee4df91d71f582754398565ceec309d37e3dd117 in task-service has been cleanup successfully"
Jan 14 13:38:45.143423 containerd[1759]: time="2025-01-14T13:38:45.143395089Z" level=info msg="TearDown network for sandbox \"d12ffc587626672804360aa4ee4df91d71f582754398565ceec309d37e3dd117\" successfully"
Jan 14 13:38:45.143487 containerd[1759]: time="2025-01-14T13:38:45.143427369Z" level=info msg="StopPodSandbox for \"d12ffc587626672804360aa4ee4df91d71f582754398565ceec309d37e3dd117\" returns successfully"
Jan 14 13:38:45.145414 systemd[1]: run-netns-cni\x2d983bb9eb\x2d09ab\x2df098\x2d1951\x2d1953fa167b89.mount: Deactivated successfully.
Jan 14 13:38:45.146048 containerd[1759]: time="2025-01-14T13:38:45.146011851Z" level=info msg="StopPodSandbox for \"ea32c5d5841c18b721b720c17a89e0f12374a58e62e420be87dbcd53c441bb77\""
Jan 14 13:38:45.146122 containerd[1759]: time="2025-01-14T13:38:45.146098851Z" level=info msg="TearDown network for sandbox \"ea32c5d5841c18b721b720c17a89e0f12374a58e62e420be87dbcd53c441bb77\" successfully"
Jan 14 13:38:45.146122 containerd[1759]: time="2025-01-14T13:38:45.146109411Z" level=info msg="StopPodSandbox for \"ea32c5d5841c18b721b720c17a89e0f12374a58e62e420be87dbcd53c441bb77\" returns successfully"
Jan 14 13:38:45.147325 containerd[1759]: time="2025-01-14T13:38:45.146955011Z" level=info msg="StopPodSandbox for \"a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d\""
Jan 14 13:38:45.147325 containerd[1759]: time="2025-01-14T13:38:45.147084931Z" level=info msg="TearDown network for sandbox \"a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d\" successfully"
Jan 14 13:38:45.147325 containerd[1759]: time="2025-01-14T13:38:45.147097491Z" level=info msg="StopPodSandbox for \"a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d\" returns successfully"
Jan 14 13:38:45.148193 containerd[1759]: time="2025-01-14T13:38:45.148155732Z" level=info msg="StopPodSandbox for \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\""
Jan 14 13:38:45.148255 containerd[1759]: time="2025-01-14T13:38:45.148233372Z" level=info msg="TearDown network for sandbox \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\" successfully"
Jan 14 13:38:45.148255 containerd[1759]: time="2025-01-14T13:38:45.148244212Z" level=info msg="StopPodSandbox for \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\" returns successfully"
Jan 14 13:38:45.149589 containerd[1759]: time="2025-01-14T13:38:45.149563493Z" level=info msg="StopPodSandbox for \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\""
Jan 14 13:38:45.150011 containerd[1759]: time="2025-01-14T13:38:45.149725373Z" level=info msg="TearDown network for sandbox \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\" successfully"
Jan 14 13:38:45.150011 containerd[1759]: time="2025-01-14T13:38:45.149764293Z" level=info msg="StopPodSandbox for \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\" returns successfully"
Jan 14 13:38:45.150437 containerd[1759]: time="2025-01-14T13:38:45.150317893Z" level=info msg="StopPodSandbox for \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\""
Jan 14 13:38:45.150437 containerd[1759]: time="2025-01-14T13:38:45.150406933Z" level=info msg="TearDown network for sandbox \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\" successfully"
Jan 14 13:38:45.150612 containerd[1759]: time="2025-01-14T13:38:45.150417053Z" level=info msg="StopPodSandbox for \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\" returns successfully"
Jan 14 13:38:45.151106 kubelet[3442]: I0114 13:38:45.150944    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2e7c993e731d79ffdbc9e501948b9ef4683bd5d915ec929dd75e1598e563288"
Jan 14 13:38:45.152536 containerd[1759]: time="2025-01-14T13:38:45.151369814Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7bk8l,Uid:af70d9ef-1b20-4f9f-93e1-f55f680c58b4,Namespace:calico-system,Attempt:6,}"
Jan 14 13:38:45.154063 containerd[1759]: time="2025-01-14T13:38:45.154037655Z" level=info msg="StopPodSandbox for \"b2e7c993e731d79ffdbc9e501948b9ef4683bd5d915ec929dd75e1598e563288\""
Jan 14 13:38:45.154599 containerd[1759]: time="2025-01-14T13:38:45.154548095Z" level=info msg="Ensure that sandbox b2e7c993e731d79ffdbc9e501948b9ef4683bd5d915ec929dd75e1598e563288 in task-service has been cleanup successfully"
Jan 14 13:38:45.158091 systemd[1]: run-netns-cni\x2dc81ad9d8\x2d4c9a\x2d51c1\x2d6776\x2d6a08af8c2240.mount: Deactivated successfully.
Jan 14 13:38:45.159948 containerd[1759]: time="2025-01-14T13:38:45.159530778Z" level=info msg="TearDown network for sandbox \"b2e7c993e731d79ffdbc9e501948b9ef4683bd5d915ec929dd75e1598e563288\" successfully"
Jan 14 13:38:45.159948 containerd[1759]: time="2025-01-14T13:38:45.159553258Z" level=info msg="StopPodSandbox for \"b2e7c993e731d79ffdbc9e501948b9ef4683bd5d915ec929dd75e1598e563288\" returns successfully"
Jan 14 13:38:45.160362 containerd[1759]: time="2025-01-14T13:38:45.160273418Z" level=info msg="StopPodSandbox for \"5f47f137b039997796869fe562c299eeb7c123bbc85094e6a6cb54bdfe48aaf4\""
Jan 14 13:38:45.160362 containerd[1759]: time="2025-01-14T13:38:45.160350098Z" level=info msg="TearDown network for sandbox \"5f47f137b039997796869fe562c299eeb7c123bbc85094e6a6cb54bdfe48aaf4\" successfully"
Jan 14 13:38:45.160362 containerd[1759]: time="2025-01-14T13:38:45.160359298Z" level=info msg="StopPodSandbox for \"5f47f137b039997796869fe562c299eeb7c123bbc85094e6a6cb54bdfe48aaf4\" returns successfully"
Jan 14 13:38:45.160940 kubelet[3442]: I0114 13:38:45.160913    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa943c87d584052299723783e0bfa7c22ccf7dd8b2d12c134357dbb5cf1d49a2"
Jan 14 13:38:45.161375 containerd[1759]: time="2025-01-14T13:38:45.161352699Z" level=info msg="StopPodSandbox for \"841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931\""
Jan 14 13:38:45.161722 containerd[1759]: time="2025-01-14T13:38:45.161550179Z" level=info msg="TearDown network for sandbox \"841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931\" successfully"
Jan 14 13:38:45.161722 containerd[1759]: time="2025-01-14T13:38:45.161566819Z" level=info msg="StopPodSandbox for \"841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931\" returns successfully"
Jan 14 13:38:45.163114 containerd[1759]: time="2025-01-14T13:38:45.163090580Z" level=info msg="StopPodSandbox for \"fa943c87d584052299723783e0bfa7c22ccf7dd8b2d12c134357dbb5cf1d49a2\""
Jan 14 13:38:45.163114 containerd[1759]: time="2025-01-14T13:38:45.163270140Z" level=info msg="Ensure that sandbox fa943c87d584052299723783e0bfa7c22ccf7dd8b2d12c134357dbb5cf1d49a2 in task-service has been cleanup successfully"
Jan 14 13:38:45.165690 systemd[1]: run-netns-cni\x2dc845f34b\x2d2604\x2d9c19\x2d3523\x2de2785f8a9f35.mount: Deactivated successfully.
Jan 14 13:38:45.170151 containerd[1759]: time="2025-01-14T13:38:45.167941502Z" level=info msg="TearDown network for sandbox \"fa943c87d584052299723783e0bfa7c22ccf7dd8b2d12c134357dbb5cf1d49a2\" successfully"
Jan 14 13:38:45.170151 containerd[1759]: time="2025-01-14T13:38:45.167962662Z" level=info msg="StopPodSandbox for \"fa943c87d584052299723783e0bfa7c22ccf7dd8b2d12c134357dbb5cf1d49a2\" returns successfully"
Jan 14 13:38:45.170151 containerd[1759]: time="2025-01-14T13:38:45.168214462Z" level=info msg="StopPodSandbox for \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\""
Jan 14 13:38:45.170151 containerd[1759]: time="2025-01-14T13:38:45.168275942Z" level=info msg="TearDown network for sandbox \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\" successfully"
Jan 14 13:38:45.170151 containerd[1759]: time="2025-01-14T13:38:45.168285182Z" level=info msg="StopPodSandbox for \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\" returns successfully"
Jan 14 13:38:45.172147 containerd[1759]: time="2025-01-14T13:38:45.172115144Z" level=info msg="StopPodSandbox for \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\""
Jan 14 13:38:45.172283 containerd[1759]: time="2025-01-14T13:38:45.172239824Z" level=info msg="TearDown network for sandbox \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\" successfully"
Jan 14 13:38:45.172283 containerd[1759]: time="2025-01-14T13:38:45.172278024Z" level=info msg="StopPodSandbox for \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\" returns successfully"
Jan 14 13:38:45.173218 containerd[1759]: time="2025-01-14T13:38:45.173171425Z" level=info msg="StopPodSandbox for \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\""
Jan 14 13:38:45.177171 containerd[1759]: time="2025-01-14T13:38:45.176939987Z" level=info msg="StopPodSandbox for \"6db7c174b4a2e2a91c30fbadca47ed59a9e1253d3c7673ccdcdd2e0f1e9663da\""
Jan 14 13:38:45.177372 containerd[1759]: time="2025-01-14T13:38:45.177355227Z" level=info msg="TearDown network for sandbox \"6db7c174b4a2e2a91c30fbadca47ed59a9e1253d3c7673ccdcdd2e0f1e9663da\" successfully"
Jan 14 13:38:45.177804 containerd[1759]: time="2025-01-14T13:38:45.177784987Z" level=info msg="StopPodSandbox for \"6db7c174b4a2e2a91c30fbadca47ed59a9e1253d3c7673ccdcdd2e0f1e9663da\" returns successfully"
Jan 14 13:38:45.179939 containerd[1759]: time="2025-01-14T13:38:45.179910988Z" level=info msg="StopPodSandbox for \"1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5\""
Jan 14 13:38:45.180090 containerd[1759]: time="2025-01-14T13:38:45.180034028Z" level=info msg="TearDown network for sandbox \"1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5\" successfully"
Jan 14 13:38:45.180090 containerd[1759]: time="2025-01-14T13:38:45.180050428Z" level=info msg="StopPodSandbox for \"1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5\" returns successfully"
Jan 14 13:38:45.182633 containerd[1759]: time="2025-01-14T13:38:45.182608390Z" level=info msg="TearDown network for sandbox \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\" successfully"
Jan 14 13:38:45.183685 containerd[1759]: time="2025-01-14T13:38:45.183487470Z" level=info msg="StopPodSandbox for \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\" returns successfully"
Jan 14 13:38:45.183685 containerd[1759]: time="2025-01-14T13:38:45.183175950Z" level=info msg="StopPodSandbox for \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\""
Jan 14 13:38:45.183685 containerd[1759]: time="2025-01-14T13:38:45.183596750Z" level=info msg="TearDown network for sandbox \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\" successfully"
Jan 14 13:38:45.183685 containerd[1759]: time="2025-01-14T13:38:45.183617270Z" level=info msg="StopPodSandbox for \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\" returns successfully"
Jan 14 13:38:45.184173 containerd[1759]: time="2025-01-14T13:38:45.184114230Z" level=info msg="StopPodSandbox for \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\""
Jan 14 13:38:45.184401 containerd[1759]: time="2025-01-14T13:38:45.184221710Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-9xx9c,Uid:f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1,Namespace:calico-apiserver,Attempt:6,}"
Jan 14 13:38:45.184401 containerd[1759]: time="2025-01-14T13:38:45.184303590Z" level=info msg="TearDown network for sandbox \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\" successfully"
Jan 14 13:38:45.184401 containerd[1759]: time="2025-01-14T13:38:45.184317950Z" level=info msg="StopPodSandbox for \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\" returns successfully"
Jan 14 13:38:45.186006 containerd[1759]: time="2025-01-14T13:38:45.185223751Z" level=info msg="StopPodSandbox for \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\""
Jan 14 13:38:45.186006 containerd[1759]: time="2025-01-14T13:38:45.185335431Z" level=info msg="TearDown network for sandbox \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\" successfully"
Jan 14 13:38:45.186006 containerd[1759]: time="2025-01-14T13:38:45.185346431Z" level=info msg="StopPodSandbox for \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\" returns successfully"
Jan 14 13:38:45.186360 kubelet[3442]: I0114 13:38:45.186307    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67b9a4dcac185723bb47072319b142a2f5a946493f9410b499f7a76e22329fb7"
Jan 14 13:38:45.186897 containerd[1759]: time="2025-01-14T13:38:45.186856592Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-zcsrg,Uid:35d51e43-1c53-427a-a8d2-bd422d727c5b,Namespace:calico-apiserver,Attempt:6,}"
Jan 14 13:38:45.187757 containerd[1759]: time="2025-01-14T13:38:45.187290432Z" level=info msg="StopPodSandbox for \"67b9a4dcac185723bb47072319b142a2f5a946493f9410b499f7a76e22329fb7\""
Jan 14 13:38:45.188081 containerd[1759]: time="2025-01-14T13:38:45.187945552Z" level=info msg="Ensure that sandbox 67b9a4dcac185723bb47072319b142a2f5a946493f9410b499f7a76e22329fb7 in task-service has been cleanup successfully"
Jan 14 13:38:45.188555 containerd[1759]: time="2025-01-14T13:38:45.188398712Z" level=info msg="TearDown network for sandbox \"67b9a4dcac185723bb47072319b142a2f5a946493f9410b499f7a76e22329fb7\" successfully"
Jan 14 13:38:45.188555 containerd[1759]: time="2025-01-14T13:38:45.188470873Z" level=info msg="StopPodSandbox for \"67b9a4dcac185723bb47072319b142a2f5a946493f9410b499f7a76e22329fb7\" returns successfully"
Jan 14 13:38:45.190950 containerd[1759]: time="2025-01-14T13:38:45.190926674Z" level=info msg="StopPodSandbox for \"52c1e2cb647327fef74966b5ff40fa665c7dfab5ab87df1cfe2b9f72c1c82bfe\""
Jan 14 13:38:45.192431 containerd[1759]: time="2025-01-14T13:38:45.192400115Z" level=info msg="TearDown network for sandbox \"52c1e2cb647327fef74966b5ff40fa665c7dfab5ab87df1cfe2b9f72c1c82bfe\" successfully"
Jan 14 13:38:45.192431 containerd[1759]: time="2025-01-14T13:38:45.192423115Z" level=info msg="StopPodSandbox for \"52c1e2cb647327fef74966b5ff40fa665c7dfab5ab87df1cfe2b9f72c1c82bfe\" returns successfully"
Jan 14 13:38:45.194013 containerd[1759]: time="2025-01-14T13:38:45.193786875Z" level=info msg="StopPodSandbox for \"e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf\""
Jan 14 13:38:45.194013 containerd[1759]: time="2025-01-14T13:38:45.193865275Z" level=info msg="TearDown network for sandbox \"e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf\" successfully"
Jan 14 13:38:45.194013 containerd[1759]: time="2025-01-14T13:38:45.193874275Z" level=info msg="StopPodSandbox for \"e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf\" returns successfully"
Jan 14 13:38:45.195344 containerd[1759]: time="2025-01-14T13:38:45.195258916Z" level=info msg="StopPodSandbox for \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\""
Jan 14 13:38:45.201065 containerd[1759]: time="2025-01-14T13:38:45.195348436Z" level=info msg="TearDown network for sandbox \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\" successfully"
Jan 14 13:38:45.201065 containerd[1759]: time="2025-01-14T13:38:45.195358316Z" level=info msg="StopPodSandbox for \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\" returns successfully"
Jan 14 13:38:45.201065 containerd[1759]: time="2025-01-14T13:38:45.195904676Z" level=info msg="StopPodSandbox for \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\""
Jan 14 13:38:45.201065 containerd[1759]: time="2025-01-14T13:38:45.196188236Z" level=info msg="TearDown network for sandbox \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\" successfully"
Jan 14 13:38:45.201065 containerd[1759]: time="2025-01-14T13:38:45.196202516Z" level=info msg="StopPodSandbox for \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\" returns successfully"
Jan 14 13:38:45.201065 containerd[1759]: time="2025-01-14T13:38:45.196679477Z" level=info msg="StopPodSandbox for \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\""
Jan 14 13:38:45.201065 containerd[1759]: time="2025-01-14T13:38:45.196771397Z" level=info msg="TearDown network for sandbox \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\" successfully"
Jan 14 13:38:45.201065 containerd[1759]: time="2025-01-14T13:38:45.196782757Z" level=info msg="StopPodSandbox for \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\" returns successfully"
Jan 14 13:38:45.201065 containerd[1759]: time="2025-01-14T13:38:45.197323757Z" level=info msg="StopPodSandbox for \"1d31ee1f06b5ba8e5167d75dc1363c0587f7ded1ed0460416729f6a6cf73d2d1\""
Jan 14 13:38:45.201065 containerd[1759]: time="2025-01-14T13:38:45.197679077Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2fhlb,Uid:af2992f6-a261-4e1c-9cfc-1ef759086d8d,Namespace:kube-system,Attempt:6,}"
Jan 14 13:38:45.201065 containerd[1759]: time="2025-01-14T13:38:45.197791117Z" level=info msg="Ensure that sandbox 1d31ee1f06b5ba8e5167d75dc1363c0587f7ded1ed0460416729f6a6cf73d2d1 in task-service has been cleanup successfully"
Jan 14 13:38:45.201065 containerd[1759]: time="2025-01-14T13:38:45.198230078Z" level=info msg="TearDown network for sandbox \"1d31ee1f06b5ba8e5167d75dc1363c0587f7ded1ed0460416729f6a6cf73d2d1\" successfully"
Jan 14 13:38:45.201065 containerd[1759]: time="2025-01-14T13:38:45.198283998Z" level=info msg="StopPodSandbox for \"1d31ee1f06b5ba8e5167d75dc1363c0587f7ded1ed0460416729f6a6cf73d2d1\" returns successfully"
Jan 14 13:38:45.201065 containerd[1759]: time="2025-01-14T13:38:45.198987718Z" level=info msg="StopPodSandbox for \"5624843d13cc48752b4f9f6b4233747dc115df765e427c6b3220b7e716612083\""
Jan 14 13:38:45.201065 containerd[1759]: time="2025-01-14T13:38:45.199130238Z" level=info msg="TearDown network for sandbox \"5624843d13cc48752b4f9f6b4233747dc115df765e427c6b3220b7e716612083\" successfully"
Jan 14 13:38:45.201065 containerd[1759]: time="2025-01-14T13:38:45.199143118Z" level=info msg="StopPodSandbox for \"5624843d13cc48752b4f9f6b4233747dc115df765e427c6b3220b7e716612083\" returns successfully"
Jan 14 13:38:45.201065 containerd[1759]: time="2025-01-14T13:38:45.199540598Z" level=info msg="StopPodSandbox for \"60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92\""
Jan 14 13:38:45.201065 containerd[1759]: time="2025-01-14T13:38:45.199641238Z" level=info msg="TearDown network for sandbox \"60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92\" successfully"
Jan 14 13:38:45.201065 containerd[1759]: time="2025-01-14T13:38:45.199653278Z" level=info msg="StopPodSandbox for \"60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92\" returns successfully"
Jan 14 13:38:45.201065 containerd[1759]: time="2025-01-14T13:38:45.200380799Z" level=info msg="StopPodSandbox for \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\""
Jan 14 13:38:45.201065 containerd[1759]: time="2025-01-14T13:38:45.200480439Z" level=info msg="TearDown network for sandbox \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\" successfully"
Jan 14 13:38:45.201065 containerd[1759]: time="2025-01-14T13:38:45.200491279Z" level=info msg="StopPodSandbox for \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\" returns successfully"
Jan 14 13:38:45.201065 containerd[1759]: time="2025-01-14T13:38:45.200751239Z" level=info msg="StopPodSandbox for \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\""
Jan 14 13:38:45.201065 containerd[1759]: time="2025-01-14T13:38:45.200811759Z" level=info msg="TearDown network for sandbox \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\" successfully"
Jan 14 13:38:45.201065 containerd[1759]: time="2025-01-14T13:38:45.200820679Z" level=info msg="StopPodSandbox for \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\" returns successfully"
Jan 14 13:38:45.201595 kubelet[3442]: I0114 13:38:45.196704    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d31ee1f06b5ba8e5167d75dc1363c0587f7ded1ed0460416729f6a6cf73d2d1"
Jan 14 13:38:45.204310 containerd[1759]: time="2025-01-14T13:38:45.204184401Z" level=info msg="StopPodSandbox for \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\""
Jan 14 13:38:45.204594 containerd[1759]: time="2025-01-14T13:38:45.204558641Z" level=info msg="TearDown network for sandbox \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\" successfully"
Jan 14 13:38:45.204594 containerd[1759]: time="2025-01-14T13:38:45.204587481Z" level=info msg="StopPodSandbox for \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\" returns successfully"
Jan 14 13:38:45.208369 containerd[1759]: time="2025-01-14T13:38:45.207934282Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8555cc7446-bv6h2,Uid:f475c534-3126-4484-b2ac-7f780fe28e12,Namespace:calico-system,Attempt:6,}"
Jan 14 13:38:45.299475 containerd[1759]: time="2025-01-14T13:38:45.299413609Z" level=error msg="Failed to destroy network for sandbox \"b670bc37149820437c0e0eebd0c955b7bf278e320b34d3bef403389074a178b2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:45.299760 containerd[1759]: time="2025-01-14T13:38:45.299718249Z" level=error msg="encountered an error cleaning up failed sandbox \"b670bc37149820437c0e0eebd0c955b7bf278e320b34d3bef403389074a178b2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:45.299830 containerd[1759]: time="2025-01-14T13:38:45.299781409Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-msgc9,Uid:c15a84fb-ac68-4acd-b385-a152a2911116,Namespace:kube-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"b670bc37149820437c0e0eebd0c955b7bf278e320b34d3bef403389074a178b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:45.300103 kubelet[3442]: E0114 13:38:45.300074    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b670bc37149820437c0e0eebd0c955b7bf278e320b34d3bef403389074a178b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:45.300480 kubelet[3442]: E0114 13:38:45.300138    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b670bc37149820437c0e0eebd0c955b7bf278e320b34d3bef403389074a178b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-msgc9"
Jan 14 13:38:45.300480 kubelet[3442]: E0114 13:38:45.300158    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b670bc37149820437c0e0eebd0c955b7bf278e320b34d3bef403389074a178b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-msgc9"
Jan 14 13:38:45.300480 kubelet[3442]: E0114 13:38:45.300227    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-msgc9_kube-system(c15a84fb-ac68-4acd-b385-a152a2911116)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-msgc9_kube-system(c15a84fb-ac68-4acd-b385-a152a2911116)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b670bc37149820437c0e0eebd0c955b7bf278e320b34d3bef403389074a178b2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-msgc9" podUID="c15a84fb-ac68-4acd-b385-a152a2911116"
Jan 14 13:38:45.371887 containerd[1759]: time="2025-01-14T13:38:45.371759166Z" level=error msg="Failed to destroy network for sandbox \"69eb62339ee12efc5793c9acfda373ad4383ea47c3f7a16201f9b83fc4864995\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:45.372748 containerd[1759]: time="2025-01-14T13:38:45.372625887Z" level=error msg="encountered an error cleaning up failed sandbox \"69eb62339ee12efc5793c9acfda373ad4383ea47c3f7a16201f9b83fc4864995\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:45.372748 containerd[1759]: time="2025-01-14T13:38:45.372703167Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7bk8l,Uid:af70d9ef-1b20-4f9f-93e1-f55f680c58b4,Namespace:calico-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"69eb62339ee12efc5793c9acfda373ad4383ea47c3f7a16201f9b83fc4864995\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:45.375284 kubelet[3442]: E0114 13:38:45.375183    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"69eb62339ee12efc5793c9acfda373ad4383ea47c3f7a16201f9b83fc4864995\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:45.375284 kubelet[3442]: E0114 13:38:45.375237    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"69eb62339ee12efc5793c9acfda373ad4383ea47c3f7a16201f9b83fc4864995\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7bk8l"
Jan 14 13:38:45.375284 kubelet[3442]: E0114 13:38:45.375258    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"69eb62339ee12efc5793c9acfda373ad4383ea47c3f7a16201f9b83fc4864995\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7bk8l"
Jan 14 13:38:45.375510 kubelet[3442]: E0114 13:38:45.375316    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-7bk8l_calico-system(af70d9ef-1b20-4f9f-93e1-f55f680c58b4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-7bk8l_calico-system(af70d9ef-1b20-4f9f-93e1-f55f680c58b4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"69eb62339ee12efc5793c9acfda373ad4383ea47c3f7a16201f9b83fc4864995\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-7bk8l" podUID="af70d9ef-1b20-4f9f-93e1-f55f680c58b4"
Jan 14 13:38:45.832807 containerd[1759]: time="2025-01-14T13:38:45.832268322Z" level=error msg="Failed to destroy network for sandbox \"efff37b38705ce54a05c4b314fb44e896f62187f93a6bf65db8521c6acc0da89\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:45.832807 containerd[1759]: time="2025-01-14T13:38:45.832599522Z" level=error msg="encountered an error cleaning up failed sandbox \"efff37b38705ce54a05c4b314fb44e896f62187f93a6bf65db8521c6acc0da89\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:45.832807 containerd[1759]: time="2025-01-14T13:38:45.832684402Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-9xx9c,Uid:f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1,Namespace:calico-apiserver,Attempt:6,} failed, error" error="failed to setup network for sandbox \"efff37b38705ce54a05c4b314fb44e896f62187f93a6bf65db8521c6acc0da89\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:45.833042 kubelet[3442]: E0114 13:38:45.832950    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"efff37b38705ce54a05c4b314fb44e896f62187f93a6bf65db8521c6acc0da89\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:45.833042 kubelet[3442]: E0114 13:38:45.833024    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"efff37b38705ce54a05c4b314fb44e896f62187f93a6bf65db8521c6acc0da89\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-9xx9c"
Jan 14 13:38:45.833120 kubelet[3442]: E0114 13:38:45.833051    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"efff37b38705ce54a05c4b314fb44e896f62187f93a6bf65db8521c6acc0da89\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-9xx9c"
Jan 14 13:38:45.833120 kubelet[3442]: E0114 13:38:45.833113    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7478669d4-9xx9c_calico-apiserver(f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7478669d4-9xx9c_calico-apiserver(f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"efff37b38705ce54a05c4b314fb44e896f62187f93a6bf65db8521c6acc0da89\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7478669d4-9xx9c" podUID="f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1"
Jan 14 13:38:45.839895 containerd[1759]: time="2025-01-14T13:38:45.839856286Z" level=error msg="Failed to destroy network for sandbox \"00c3c6ec83186ef1c0a5ae07af8cae4c396a43d61e23752c36865a6b634b59f6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:45.850544 containerd[1759]: time="2025-01-14T13:38:45.850474971Z" level=error msg="encountered an error cleaning up failed sandbox \"00c3c6ec83186ef1c0a5ae07af8cae4c396a43d61e23752c36865a6b634b59f6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:45.850679 containerd[1759]: time="2025-01-14T13:38:45.850573051Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-zcsrg,Uid:35d51e43-1c53-427a-a8d2-bd422d727c5b,Namespace:calico-apiserver,Attempt:6,} failed, error" error="failed to setup network for sandbox \"00c3c6ec83186ef1c0a5ae07af8cae4c396a43d61e23752c36865a6b634b59f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:45.851300 kubelet[3442]: E0114 13:38:45.850917    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"00c3c6ec83186ef1c0a5ae07af8cae4c396a43d61e23752c36865a6b634b59f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:45.851300 kubelet[3442]: E0114 13:38:45.850976    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"00c3c6ec83186ef1c0a5ae07af8cae4c396a43d61e23752c36865a6b634b59f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-zcsrg"
Jan 14 13:38:45.851300 kubelet[3442]: E0114 13:38:45.851021    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"00c3c6ec83186ef1c0a5ae07af8cae4c396a43d61e23752c36865a6b634b59f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7478669d4-zcsrg"
Jan 14 13:38:45.851434 kubelet[3442]: E0114 13:38:45.851082    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7478669d4-zcsrg_calico-apiserver(35d51e43-1c53-427a-a8d2-bd422d727c5b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7478669d4-zcsrg_calico-apiserver(35d51e43-1c53-427a-a8d2-bd422d727c5b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"00c3c6ec83186ef1c0a5ae07af8cae4c396a43d61e23752c36865a6b634b59f6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7478669d4-zcsrg" podUID="35d51e43-1c53-427a-a8d2-bd422d727c5b"
Jan 14 13:38:45.852311 containerd[1759]: time="2025-01-14T13:38:45.852278292Z" level=error msg="Failed to destroy network for sandbox \"3b24aa1db19b744819395afc45a3494a664b194c26e93f32d75b2a8190e66598\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:45.852712 containerd[1759]: time="2025-01-14T13:38:45.852684172Z" level=error msg="encountered an error cleaning up failed sandbox \"3b24aa1db19b744819395afc45a3494a664b194c26e93f32d75b2a8190e66598\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:45.852916 containerd[1759]: time="2025-01-14T13:38:45.852891213Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2fhlb,Uid:af2992f6-a261-4e1c-9cfc-1ef759086d8d,Namespace:kube-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"3b24aa1db19b744819395afc45a3494a664b194c26e93f32d75b2a8190e66598\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:45.853218 kubelet[3442]: E0114 13:38:45.853201    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b24aa1db19b744819395afc45a3494a664b194c26e93f32d75b2a8190e66598\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:45.853515 kubelet[3442]: E0114 13:38:45.853388    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b24aa1db19b744819395afc45a3494a664b194c26e93f32d75b2a8190e66598\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-2fhlb"
Jan 14 13:38:45.853515 kubelet[3442]: E0114 13:38:45.853414    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b24aa1db19b744819395afc45a3494a664b194c26e93f32d75b2a8190e66598\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-2fhlb"
Jan 14 13:38:45.853515 kubelet[3442]: E0114 13:38:45.853483    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-2fhlb_kube-system(af2992f6-a261-4e1c-9cfc-1ef759086d8d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-2fhlb_kube-system(af2992f6-a261-4e1c-9cfc-1ef759086d8d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3b24aa1db19b744819395afc45a3494a664b194c26e93f32d75b2a8190e66598\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-2fhlb" podUID="af2992f6-a261-4e1c-9cfc-1ef759086d8d"
Jan 14 13:38:45.873479 containerd[1759]: time="2025-01-14T13:38:45.873361543Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:38:45.878213 containerd[1759]: time="2025-01-14T13:38:45.878154425Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.1: active requests=0, bytes read=137671762"
Jan 14 13:38:45.882489 containerd[1759]: time="2025-01-14T13:38:45.881611147Z" level=info msg="ImageCreate event name:\"sha256:680b8c280812d12c035ca9f0deedea7c761afe0f1cc65109ea2f96bf63801758\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:38:45.886109 containerd[1759]: time="2025-01-14T13:38:45.886075909Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:38:45.886389 containerd[1759]: time="2025-01-14T13:38:45.886363510Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.1\" with image id \"sha256:680b8c280812d12c035ca9f0deedea7c761afe0f1cc65109ea2f96bf63801758\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\", size \"137671624\" in 7.970052536s"
Jan 14 13:38:45.886422 containerd[1759]: time="2025-01-14T13:38:45.886390790Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference \"sha256:680b8c280812d12c035ca9f0deedea7c761afe0f1cc65109ea2f96bf63801758\""
Jan 14 13:38:45.897925 containerd[1759]: time="2025-01-14T13:38:45.897788195Z" level=info msg="CreateContainer within sandbox \"2bfbdef55f246b1c65be1142c8ccc79669bcb65247943c9f2ada4e457c3083de\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}"
Jan 14 13:38:45.916168 containerd[1759]: time="2025-01-14T13:38:45.916051205Z" level=error msg="Failed to destroy network for sandbox \"9073c3e5084e944117cf74bb7005b63d11f0eb4631b6b564b21227b06a20970a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:45.916413 containerd[1759]: time="2025-01-14T13:38:45.916380325Z" level=error msg="encountered an error cleaning up failed sandbox \"9073c3e5084e944117cf74bb7005b63d11f0eb4631b6b564b21227b06a20970a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:45.916468 containerd[1759]: time="2025-01-14T13:38:45.916442725Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8555cc7446-bv6h2,Uid:f475c534-3126-4484-b2ac-7f780fe28e12,Namespace:calico-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"9073c3e5084e944117cf74bb7005b63d11f0eb4631b6b564b21227b06a20970a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:45.916713 kubelet[3442]: E0114 13:38:45.916691    3442 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9073c3e5084e944117cf74bb7005b63d11f0eb4631b6b564b21227b06a20970a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 14 13:38:45.917780 kubelet[3442]: E0114 13:38:45.916843    3442 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9073c3e5084e944117cf74bb7005b63d11f0eb4631b6b564b21227b06a20970a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8555cc7446-bv6h2"
Jan 14 13:38:45.917780 kubelet[3442]: E0114 13:38:45.916882    3442 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9073c3e5084e944117cf74bb7005b63d11f0eb4631b6b564b21227b06a20970a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8555cc7446-bv6h2"
Jan 14 13:38:45.917780 kubelet[3442]: E0114 13:38:45.916937    3442 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-8555cc7446-bv6h2_calico-system(f475c534-3126-4484-b2ac-7f780fe28e12)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-8555cc7446-bv6h2_calico-system(f475c534-3126-4484-b2ac-7f780fe28e12)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9073c3e5084e944117cf74bb7005b63d11f0eb4631b6b564b21227b06a20970a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-8555cc7446-bv6h2" podUID="f475c534-3126-4484-b2ac-7f780fe28e12"
Jan 14 13:38:45.947453 systemd[1]: run-netns-cni\x2d35a7540d\x2dd61c\x2d994b\x2d373f\x2d74d931cc0a36.mount: Deactivated successfully.
Jan 14 13:38:45.947672 systemd[1]: run-netns-cni\x2d0fca006c\x2ddf41\x2d1be7\x2d9b8c\x2da775a933d527.mount: Deactivated successfully.
Jan 14 13:38:45.947835 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount141930938.mount: Deactivated successfully.
Jan 14 13:38:45.961673 containerd[1759]: time="2025-01-14T13:38:45.961627908Z" level=info msg="CreateContainer within sandbox \"2bfbdef55f246b1c65be1142c8ccc79669bcb65247943c9f2ada4e457c3083de\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"fe32ac6a3afdc97a519736de0bb2610a669eb4b204fc42136816ac7e3ea13b4d\""
Jan 14 13:38:45.962236 containerd[1759]: time="2025-01-14T13:38:45.962201028Z" level=info msg="StartContainer for \"fe32ac6a3afdc97a519736de0bb2610a669eb4b204fc42136816ac7e3ea13b4d\""
Jan 14 13:38:45.994142 systemd[1]: Started cri-containerd-fe32ac6a3afdc97a519736de0bb2610a669eb4b204fc42136816ac7e3ea13b4d.scope - libcontainer container fe32ac6a3afdc97a519736de0bb2610a669eb4b204fc42136816ac7e3ea13b4d.
Jan 14 13:38:46.024454 containerd[1759]: time="2025-01-14T13:38:46.024328860Z" level=info msg="StartContainer for \"fe32ac6a3afdc97a519736de0bb2610a669eb4b204fc42136816ac7e3ea13b4d\" returns successfully"
Jan 14 13:38:46.209402 kubelet[3442]: I0114 13:38:46.205163    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efff37b38705ce54a05c4b314fb44e896f62187f93a6bf65db8521c6acc0da89"
Jan 14 13:38:46.209743 containerd[1759]: time="2025-01-14T13:38:46.206019713Z" level=info msg="StopPodSandbox for \"efff37b38705ce54a05c4b314fb44e896f62187f93a6bf65db8521c6acc0da89\""
Jan 14 13:38:46.209743 containerd[1759]: time="2025-01-14T13:38:46.206183033Z" level=info msg="Ensure that sandbox efff37b38705ce54a05c4b314fb44e896f62187f93a6bf65db8521c6acc0da89 in task-service has been cleanup successfully"
Jan 14 13:38:46.209743 containerd[1759]: time="2025-01-14T13:38:46.209464355Z" level=info msg="TearDown network for sandbox \"efff37b38705ce54a05c4b314fb44e896f62187f93a6bf65db8521c6acc0da89\" successfully"
Jan 14 13:38:46.218737 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information.
Jan 14 13:38:46.218854 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld <Jason@zx2c4.com>. All Rights Reserved.
Jan 14 13:38:46.218873 containerd[1759]: time="2025-01-14T13:38:46.209484995Z" level=info msg="StopPodSandbox for \"efff37b38705ce54a05c4b314fb44e896f62187f93a6bf65db8521c6acc0da89\" returns successfully"
Jan 14 13:38:46.218873 containerd[1759]: time="2025-01-14T13:38:46.210901796Z" level=info msg="StopPodSandbox for \"b2e7c993e731d79ffdbc9e501948b9ef4683bd5d915ec929dd75e1598e563288\""
Jan 14 13:38:46.218873 containerd[1759]: time="2025-01-14T13:38:46.211250356Z" level=info msg="TearDown network for sandbox \"b2e7c993e731d79ffdbc9e501948b9ef4683bd5d915ec929dd75e1598e563288\" successfully"
Jan 14 13:38:46.218873 containerd[1759]: time="2025-01-14T13:38:46.211299996Z" level=info msg="StopPodSandbox for \"b2e7c993e731d79ffdbc9e501948b9ef4683bd5d915ec929dd75e1598e563288\" returns successfully"
Jan 14 13:38:46.218873 containerd[1759]: time="2025-01-14T13:38:46.212971197Z" level=info msg="StopPodSandbox for \"5f47f137b039997796869fe562c299eeb7c123bbc85094e6a6cb54bdfe48aaf4\""
Jan 14 13:38:46.218873 containerd[1759]: time="2025-01-14T13:38:46.213065597Z" level=info msg="TearDown network for sandbox \"5f47f137b039997796869fe562c299eeb7c123bbc85094e6a6cb54bdfe48aaf4\" successfully"
Jan 14 13:38:46.218873 containerd[1759]: time="2025-01-14T13:38:46.213076117Z" level=info msg="StopPodSandbox for \"5f47f137b039997796869fe562c299eeb7c123bbc85094e6a6cb54bdfe48aaf4\" returns successfully"
Jan 14 13:38:46.218873 containerd[1759]: time="2025-01-14T13:38:46.213710517Z" level=info msg="StopPodSandbox for \"841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931\""
Jan 14 13:38:46.218873 containerd[1759]: time="2025-01-14T13:38:46.213786637Z" level=info msg="TearDown network for sandbox \"841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931\" successfully"
Jan 14 13:38:46.218873 containerd[1759]: time="2025-01-14T13:38:46.213795277Z" level=info msg="StopPodSandbox for \"841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931\" returns successfully"
Jan 14 13:38:46.218873 containerd[1759]: time="2025-01-14T13:38:46.214186957Z" level=info msg="StopPodSandbox for \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\""
Jan 14 13:38:46.218873 containerd[1759]: time="2025-01-14T13:38:46.214260997Z" level=info msg="TearDown network for sandbox \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\" successfully"
Jan 14 13:38:46.218873 containerd[1759]: time="2025-01-14T13:38:46.214271077Z" level=info msg="StopPodSandbox for \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\" returns successfully"
Jan 14 13:38:46.218873 containerd[1759]: time="2025-01-14T13:38:46.214681398Z" level=info msg="StopPodSandbox for \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\""
Jan 14 13:38:46.218873 containerd[1759]: time="2025-01-14T13:38:46.214816198Z" level=info msg="TearDown network for sandbox \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\" successfully"
Jan 14 13:38:46.218873 containerd[1759]: time="2025-01-14T13:38:46.214835958Z" level=info msg="StopPodSandbox for \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\" returns successfully"
Jan 14 13:38:46.218873 containerd[1759]: time="2025-01-14T13:38:46.215285518Z" level=info msg="StopPodSandbox for \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\""
Jan 14 13:38:46.218873 containerd[1759]: time="2025-01-14T13:38:46.215380158Z" level=info msg="TearDown network for sandbox \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\" successfully"
Jan 14 13:38:46.218873 containerd[1759]: time="2025-01-14T13:38:46.215393038Z" level=info msg="StopPodSandbox for \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\" returns successfully"
Jan 14 13:38:46.218873 containerd[1759]: time="2025-01-14T13:38:46.215922518Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-9xx9c,Uid:f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1,Namespace:calico-apiserver,Attempt:7,}"
Jan 14 13:38:46.221783 kubelet[3442]: I0114 13:38:46.221758    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00c3c6ec83186ef1c0a5ae07af8cae4c396a43d61e23752c36865a6b634b59f6"
Jan 14 13:38:46.224460 containerd[1759]: time="2025-01-14T13:38:46.224429563Z" level=info msg="StopPodSandbox for \"00c3c6ec83186ef1c0a5ae07af8cae4c396a43d61e23752c36865a6b634b59f6\""
Jan 14 13:38:46.224745 containerd[1759]: time="2025-01-14T13:38:46.224573963Z" level=info msg="Ensure that sandbox 00c3c6ec83186ef1c0a5ae07af8cae4c396a43d61e23752c36865a6b634b59f6 in task-service has been cleanup successfully"
Jan 14 13:38:46.226100 containerd[1759]: time="2025-01-14T13:38:46.225846083Z" level=info msg="TearDown network for sandbox \"00c3c6ec83186ef1c0a5ae07af8cae4c396a43d61e23752c36865a6b634b59f6\" successfully"
Jan 14 13:38:46.226100 containerd[1759]: time="2025-01-14T13:38:46.225871083Z" level=info msg="StopPodSandbox for \"00c3c6ec83186ef1c0a5ae07af8cae4c396a43d61e23752c36865a6b634b59f6\" returns successfully"
Jan 14 13:38:46.228597 containerd[1759]: time="2025-01-14T13:38:46.227368244Z" level=info msg="StopPodSandbox for \"fa943c87d584052299723783e0bfa7c22ccf7dd8b2d12c134357dbb5cf1d49a2\""
Jan 14 13:38:46.228597 containerd[1759]: time="2025-01-14T13:38:46.227475844Z" level=info msg="TearDown network for sandbox \"fa943c87d584052299723783e0bfa7c22ccf7dd8b2d12c134357dbb5cf1d49a2\" successfully"
Jan 14 13:38:46.228597 containerd[1759]: time="2025-01-14T13:38:46.227489684Z" level=info msg="StopPodSandbox for \"fa943c87d584052299723783e0bfa7c22ccf7dd8b2d12c134357dbb5cf1d49a2\" returns successfully"
Jan 14 13:38:46.229741 containerd[1759]: time="2025-01-14T13:38:46.229469285Z" level=info msg="StopPodSandbox for \"6db7c174b4a2e2a91c30fbadca47ed59a9e1253d3c7673ccdcdd2e0f1e9663da\""
Jan 14 13:38:46.232664 containerd[1759]: time="2025-01-14T13:38:46.232437087Z" level=info msg="TearDown network for sandbox \"6db7c174b4a2e2a91c30fbadca47ed59a9e1253d3c7673ccdcdd2e0f1e9663da\" successfully"
Jan 14 13:38:46.232664 containerd[1759]: time="2025-01-14T13:38:46.232461567Z" level=info msg="StopPodSandbox for \"6db7c174b4a2e2a91c30fbadca47ed59a9e1253d3c7673ccdcdd2e0f1e9663da\" returns successfully"
Jan 14 13:38:46.234657 containerd[1759]: time="2025-01-14T13:38:46.234386288Z" level=info msg="StopPodSandbox for \"1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5\""
Jan 14 13:38:46.234657 containerd[1759]: time="2025-01-14T13:38:46.234473648Z" level=info msg="TearDown network for sandbox \"1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5\" successfully"
Jan 14 13:38:46.234657 containerd[1759]: time="2025-01-14T13:38:46.234483768Z" level=info msg="StopPodSandbox for \"1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5\" returns successfully"
Jan 14 13:38:46.235077 containerd[1759]: time="2025-01-14T13:38:46.235049568Z" level=info msg="StopPodSandbox for \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\""
Jan 14 13:38:46.237097 containerd[1759]: time="2025-01-14T13:38:46.235239248Z" level=info msg="TearDown network for sandbox \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\" successfully"
Jan 14 13:38:46.237097 containerd[1759]: time="2025-01-14T13:38:46.235360368Z" level=info msg="StopPodSandbox for \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\" returns successfully"
Jan 14 13:38:46.237220 kubelet[3442]: I0114 13:38:46.235453    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b24aa1db19b744819395afc45a3494a664b194c26e93f32d75b2a8190e66598"
Jan 14 13:38:46.238617 containerd[1759]: time="2025-01-14T13:38:46.237861450Z" level=info msg="StopPodSandbox for \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\""
Jan 14 13:38:46.239574 containerd[1759]: time="2025-01-14T13:38:46.238627130Z" level=info msg="StopPodSandbox for \"3b24aa1db19b744819395afc45a3494a664b194c26e93f32d75b2a8190e66598\""
Jan 14 13:38:46.239574 containerd[1759]: time="2025-01-14T13:38:46.239149010Z" level=info msg="Ensure that sandbox 3b24aa1db19b744819395afc45a3494a664b194c26e93f32d75b2a8190e66598 in task-service has been cleanup successfully"
Jan 14 13:38:46.239574 containerd[1759]: time="2025-01-14T13:38:46.238982970Z" level=info msg="TearDown network for sandbox \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\" successfully"
Jan 14 13:38:46.239574 containerd[1759]: time="2025-01-14T13:38:46.239405610Z" level=info msg="StopPodSandbox for \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\" returns successfully"
Jan 14 13:38:46.240321 containerd[1759]: time="2025-01-14T13:38:46.240273611Z" level=info msg="TearDown network for sandbox \"3b24aa1db19b744819395afc45a3494a664b194c26e93f32d75b2a8190e66598\" successfully"
Jan 14 13:38:46.240683 containerd[1759]: time="2025-01-14T13:38:46.240662091Z" level=info msg="StopPodSandbox for \"3b24aa1db19b744819395afc45a3494a664b194c26e93f32d75b2a8190e66598\" returns successfully"
Jan 14 13:38:46.241435 containerd[1759]: time="2025-01-14T13:38:46.241389211Z" level=info msg="StopPodSandbox for \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\""
Jan 14 13:38:46.241799 containerd[1759]: time="2025-01-14T13:38:46.241492971Z" level=info msg="TearDown network for sandbox \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\" successfully"
Jan 14 13:38:46.241799 containerd[1759]: time="2025-01-14T13:38:46.241507531Z" level=info msg="StopPodSandbox for \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\" returns successfully"
Jan 14 13:38:46.243158 containerd[1759]: time="2025-01-14T13:38:46.243136932Z" level=info msg="StopPodSandbox for \"67b9a4dcac185723bb47072319b142a2f5a946493f9410b499f7a76e22329fb7\""
Jan 14 13:38:46.244680 containerd[1759]: time="2025-01-14T13:38:46.243649732Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-zcsrg,Uid:35d51e43-1c53-427a-a8d2-bd422d727c5b,Namespace:calico-apiserver,Attempt:7,}"
Jan 14 13:38:46.245465 containerd[1759]: time="2025-01-14T13:38:46.245228173Z" level=info msg="TearDown network for sandbox \"67b9a4dcac185723bb47072319b142a2f5a946493f9410b499f7a76e22329fb7\" successfully"
Jan 14 13:38:46.245465 containerd[1759]: time="2025-01-14T13:38:46.245250773Z" level=info msg="StopPodSandbox for \"67b9a4dcac185723bb47072319b142a2f5a946493f9410b499f7a76e22329fb7\" returns successfully"
Jan 14 13:38:46.246092 containerd[1759]: time="2025-01-14T13:38:46.246072374Z" level=info msg="StopPodSandbox for \"52c1e2cb647327fef74966b5ff40fa665c7dfab5ab87df1cfe2b9f72c1c82bfe\""
Jan 14 13:38:46.246290 containerd[1759]: time="2025-01-14T13:38:46.246240614Z" level=info msg="TearDown network for sandbox \"52c1e2cb647327fef74966b5ff40fa665c7dfab5ab87df1cfe2b9f72c1c82bfe\" successfully"
Jan 14 13:38:46.246290 containerd[1759]: time="2025-01-14T13:38:46.246255614Z" level=info msg="StopPodSandbox for \"52c1e2cb647327fef74966b5ff40fa665c7dfab5ab87df1cfe2b9f72c1c82bfe\" returns successfully"
Jan 14 13:38:46.247262 containerd[1759]: time="2025-01-14T13:38:46.246838694Z" level=info msg="StopPodSandbox for \"e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf\""
Jan 14 13:38:46.247262 containerd[1759]: time="2025-01-14T13:38:46.247188054Z" level=info msg="TearDown network for sandbox \"e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf\" successfully"
Jan 14 13:38:46.247262 containerd[1759]: time="2025-01-14T13:38:46.247204214Z" level=info msg="StopPodSandbox for \"e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf\" returns successfully"
Jan 14 13:38:46.248399 kubelet[3442]: I0114 13:38:46.248045    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9073c3e5084e944117cf74bb7005b63d11f0eb4631b6b564b21227b06a20970a"
Jan 14 13:38:46.248671 containerd[1759]: time="2025-01-14T13:38:46.248639455Z" level=info msg="StopPodSandbox for \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\""
Jan 14 13:38:46.249135 containerd[1759]: time="2025-01-14T13:38:46.249105335Z" level=info msg="TearDown network for sandbox \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\" successfully"
Jan 14 13:38:46.249135 containerd[1759]: time="2025-01-14T13:38:46.249127135Z" level=info msg="StopPodSandbox for \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\" returns successfully"
Jan 14 13:38:46.249258 containerd[1759]: time="2025-01-14T13:38:46.248937695Z" level=info msg="StopPodSandbox for \"9073c3e5084e944117cf74bb7005b63d11f0eb4631b6b564b21227b06a20970a\""
Jan 14 13:38:46.250514 containerd[1759]: time="2025-01-14T13:38:46.250360776Z" level=info msg="Ensure that sandbox 9073c3e5084e944117cf74bb7005b63d11f0eb4631b6b564b21227b06a20970a in task-service has been cleanup successfully"
Jan 14 13:38:46.250514 containerd[1759]: time="2025-01-14T13:38:46.250403296Z" level=info msg="StopPodSandbox for \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\""
Jan 14 13:38:46.250514 containerd[1759]: time="2025-01-14T13:38:46.250481256Z" level=info msg="TearDown network for sandbox \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\" successfully"
Jan 14 13:38:46.250514 containerd[1759]: time="2025-01-14T13:38:46.250494816Z" level=info msg="StopPodSandbox for \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\" returns successfully"
Jan 14 13:38:46.250943 containerd[1759]: time="2025-01-14T13:38:46.250918976Z" level=info msg="TearDown network for sandbox \"9073c3e5084e944117cf74bb7005b63d11f0eb4631b6b564b21227b06a20970a\" successfully"
Jan 14 13:38:46.251040 containerd[1759]: time="2025-01-14T13:38:46.251027216Z" level=info msg="StopPodSandbox for \"9073c3e5084e944117cf74bb7005b63d11f0eb4631b6b564b21227b06a20970a\" returns successfully"
Jan 14 13:38:46.251342 containerd[1759]: time="2025-01-14T13:38:46.251307736Z" level=info msg="StopPodSandbox for \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\""
Jan 14 13:38:46.251397 containerd[1759]: time="2025-01-14T13:38:46.251386776Z" level=info msg="TearDown network for sandbox \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\" successfully"
Jan 14 13:38:46.251418 containerd[1759]: time="2025-01-14T13:38:46.251396576Z" level=info msg="StopPodSandbox for \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\" returns successfully"
Jan 14 13:38:46.252288 containerd[1759]: time="2025-01-14T13:38:46.252250297Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2fhlb,Uid:af2992f6-a261-4e1c-9cfc-1ef759086d8d,Namespace:kube-system,Attempt:7,}"
Jan 14 13:38:46.254901 containerd[1759]: time="2025-01-14T13:38:46.254840978Z" level=info msg="StopPodSandbox for \"1d31ee1f06b5ba8e5167d75dc1363c0587f7ded1ed0460416729f6a6cf73d2d1\""
Jan 14 13:38:46.254959 containerd[1759]: time="2025-01-14T13:38:46.254936818Z" level=info msg="TearDown network for sandbox \"1d31ee1f06b5ba8e5167d75dc1363c0587f7ded1ed0460416729f6a6cf73d2d1\" successfully"
Jan 14 13:38:46.254959 containerd[1759]: time="2025-01-14T13:38:46.254947058Z" level=info msg="StopPodSandbox for \"1d31ee1f06b5ba8e5167d75dc1363c0587f7ded1ed0460416729f6a6cf73d2d1\" returns successfully"
Jan 14 13:38:46.257149 containerd[1759]: time="2025-01-14T13:38:46.255736339Z" level=info msg="StopPodSandbox for \"5624843d13cc48752b4f9f6b4233747dc115df765e427c6b3220b7e716612083\""
Jan 14 13:38:46.257149 containerd[1759]: time="2025-01-14T13:38:46.255835539Z" level=info msg="TearDown network for sandbox \"5624843d13cc48752b4f9f6b4233747dc115df765e427c6b3220b7e716612083\" successfully"
Jan 14 13:38:46.257149 containerd[1759]: time="2025-01-14T13:38:46.255861259Z" level=info msg="StopPodSandbox for \"5624843d13cc48752b4f9f6b4233747dc115df765e427c6b3220b7e716612083\" returns successfully"
Jan 14 13:38:46.257862 kubelet[3442]: I0114 13:38:46.257823    3442 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-node-mkcgd" podStartSLOduration=1.450148128 podStartE2EDuration="47.25771778s" podCreationTimestamp="2025-01-14 13:37:59 +0000 UTC" firstStartedPulling="2025-01-14 13:38:00.079293818 +0000 UTC m=+22.433238632" lastFinishedPulling="2025-01-14 13:38:45.88686347 +0000 UTC m=+68.240808284" observedRunningTime="2025-01-14 13:38:46.253749658 +0000 UTC m=+68.607694432" watchObservedRunningTime="2025-01-14 13:38:46.25771778 +0000 UTC m=+68.611662634"
Jan 14 13:38:46.259703 containerd[1759]: time="2025-01-14T13:38:46.259671221Z" level=info msg="StopPodSandbox for \"60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92\""
Jan 14 13:38:46.261209 containerd[1759]: time="2025-01-14T13:38:46.261174301Z" level=info msg="TearDown network for sandbox \"60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92\" successfully"
Jan 14 13:38:46.261951 kubelet[3442]: I0114 13:38:46.261783    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b670bc37149820437c0e0eebd0c955b7bf278e320b34d3bef403389074a178b2"
Jan 14 13:38:46.262495 containerd[1759]: time="2025-01-14T13:38:46.261699542Z" level=info msg="StopPodSandbox for \"60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92\" returns successfully"
Jan 14 13:38:46.263915 containerd[1759]: time="2025-01-14T13:38:46.263791383Z" level=info msg="StopPodSandbox for \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\""
Jan 14 13:38:46.263915 containerd[1759]: time="2025-01-14T13:38:46.263880703Z" level=info msg="TearDown network for sandbox \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\" successfully"
Jan 14 13:38:46.263915 containerd[1759]: time="2025-01-14T13:38:46.263891383Z" level=info msg="StopPodSandbox for \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\" returns successfully"
Jan 14 13:38:46.264280 containerd[1759]: time="2025-01-14T13:38:46.263939263Z" level=info msg="StopPodSandbox for \"b670bc37149820437c0e0eebd0c955b7bf278e320b34d3bef403389074a178b2\""
Jan 14 13:38:46.264701 containerd[1759]: time="2025-01-14T13:38:46.264671103Z" level=info msg="StopPodSandbox for \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\""
Jan 14 13:38:46.265595 containerd[1759]: time="2025-01-14T13:38:46.264760703Z" level=info msg="TearDown network for sandbox \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\" successfully"
Jan 14 13:38:46.265595 containerd[1759]: time="2025-01-14T13:38:46.264775703Z" level=info msg="StopPodSandbox for \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\" returns successfully"
Jan 14 13:38:46.266903 containerd[1759]: time="2025-01-14T13:38:46.266869744Z" level=info msg="Ensure that sandbox b670bc37149820437c0e0eebd0c955b7bf278e320b34d3bef403389074a178b2 in task-service has been cleanup successfully"
Jan 14 13:38:46.267444 containerd[1759]: time="2025-01-14T13:38:46.267391585Z" level=info msg="StopPodSandbox for \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\""
Jan 14 13:38:46.268026 containerd[1759]: time="2025-01-14T13:38:46.267469065Z" level=info msg="TearDown network for sandbox \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\" successfully"
Jan 14 13:38:46.268026 containerd[1759]: time="2025-01-14T13:38:46.267478705Z" level=info msg="StopPodSandbox for \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\" returns successfully"
Jan 14 13:38:46.268415 containerd[1759]: time="2025-01-14T13:38:46.268247865Z" level=info msg="TearDown network for sandbox \"b670bc37149820437c0e0eebd0c955b7bf278e320b34d3bef403389074a178b2\" successfully"
Jan 14 13:38:46.268415 containerd[1759]: time="2025-01-14T13:38:46.268274585Z" level=info msg="StopPodSandbox for \"b670bc37149820437c0e0eebd0c955b7bf278e320b34d3bef403389074a178b2\" returns successfully"
Jan 14 13:38:46.268489 containerd[1759]: time="2025-01-14T13:38:46.268441665Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8555cc7446-bv6h2,Uid:f475c534-3126-4484-b2ac-7f780fe28e12,Namespace:calico-system,Attempt:7,}"
Jan 14 13:38:46.271856 containerd[1759]: time="2025-01-14T13:38:46.271735307Z" level=info msg="StopPodSandbox for \"b2e96b6775ecfba35b59af46249769cdebbbb6bb7c875319cd566a8827b720b5\""
Jan 14 13:38:46.272791 containerd[1759]: time="2025-01-14T13:38:46.272272227Z" level=info msg="TearDown network for sandbox \"b2e96b6775ecfba35b59af46249769cdebbbb6bb7c875319cd566a8827b720b5\" successfully"
Jan 14 13:38:46.272791 containerd[1759]: time="2025-01-14T13:38:46.272288787Z" level=info msg="StopPodSandbox for \"b2e96b6775ecfba35b59af46249769cdebbbb6bb7c875319cd566a8827b720b5\" returns successfully"
Jan 14 13:38:46.273164 containerd[1759]: time="2025-01-14T13:38:46.273129628Z" level=info msg="StopPodSandbox for \"b25e2b01bbc2a4e604157740144b7276644b40fc3d95844bb4b4dfb88f0b6101\""
Jan 14 13:38:46.273323 containerd[1759]: time="2025-01-14T13:38:46.273211788Z" level=info msg="TearDown network for sandbox \"b25e2b01bbc2a4e604157740144b7276644b40fc3d95844bb4b4dfb88f0b6101\" successfully"
Jan 14 13:38:46.273323 containerd[1759]: time="2025-01-14T13:38:46.273225908Z" level=info msg="StopPodSandbox for \"b25e2b01bbc2a4e604157740144b7276644b40fc3d95844bb4b4dfb88f0b6101\" returns successfully"
Jan 14 13:38:46.274322 containerd[1759]: time="2025-01-14T13:38:46.274178948Z" level=info msg="StopPodSandbox for \"4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b\""
Jan 14 13:38:46.275931 containerd[1759]: time="2025-01-14T13:38:46.275726029Z" level=info msg="TearDown network for sandbox \"4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b\" successfully"
Jan 14 13:38:46.275931 containerd[1759]: time="2025-01-14T13:38:46.275747229Z" level=info msg="StopPodSandbox for \"4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b\" returns successfully"
Jan 14 13:38:46.282381 kubelet[3442]: I0114 13:38:46.282347    3442 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69eb62339ee12efc5793c9acfda373ad4383ea47c3f7a16201f9b83fc4864995"
Jan 14 13:38:46.282693 containerd[1759]: time="2025-01-14T13:38:46.282354712Z" level=info msg="StopPodSandbox for \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\""
Jan 14 13:38:46.284258 containerd[1759]: time="2025-01-14T13:38:46.284186353Z" level=info msg="TearDown network for sandbox \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\" successfully"
Jan 14 13:38:46.284258 containerd[1759]: time="2025-01-14T13:38:46.284211113Z" level=info msg="StopPodSandbox for \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\" returns successfully"
Jan 14 13:38:46.286565 containerd[1759]: time="2025-01-14T13:38:46.286259834Z" level=info msg="StopPodSandbox for \"69eb62339ee12efc5793c9acfda373ad4383ea47c3f7a16201f9b83fc4864995\""
Jan 14 13:38:46.288259 containerd[1759]: time="2025-01-14T13:38:46.287808155Z" level=info msg="Ensure that sandbox 69eb62339ee12efc5793c9acfda373ad4383ea47c3f7a16201f9b83fc4864995 in task-service has been cleanup successfully"
Jan 14 13:38:46.288471 containerd[1759]: time="2025-01-14T13:38:46.288417555Z" level=info msg="StopPodSandbox for \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\""
Jan 14 13:38:46.288846 containerd[1759]: time="2025-01-14T13:38:46.288596595Z" level=info msg="TearDown network for sandbox \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\" successfully"
Jan 14 13:38:46.288846 containerd[1759]: time="2025-01-14T13:38:46.288610515Z" level=info msg="StopPodSandbox for \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\" returns successfully"
Jan 14 13:38:46.289563 containerd[1759]: time="2025-01-14T13:38:46.289440476Z" level=info msg="StopPodSandbox for \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\""
Jan 14 13:38:46.289563 containerd[1759]: time="2025-01-14T13:38:46.289528316Z" level=info msg="TearDown network for sandbox \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\" successfully"
Jan 14 13:38:46.289563 containerd[1759]: time="2025-01-14T13:38:46.289537956Z" level=info msg="StopPodSandbox for \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\" returns successfully"
Jan 14 13:38:46.291229 containerd[1759]: time="2025-01-14T13:38:46.291030957Z" level=info msg="TearDown network for sandbox \"69eb62339ee12efc5793c9acfda373ad4383ea47c3f7a16201f9b83fc4864995\" successfully"
Jan 14 13:38:46.291229 containerd[1759]: time="2025-01-14T13:38:46.291162597Z" level=info msg="StopPodSandbox for \"69eb62339ee12efc5793c9acfda373ad4383ea47c3f7a16201f9b83fc4864995\" returns successfully"
Jan 14 13:38:46.297383 containerd[1759]: time="2025-01-14T13:38:46.297254440Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-msgc9,Uid:c15a84fb-ac68-4acd-b385-a152a2911116,Namespace:kube-system,Attempt:7,}"
Jan 14 13:38:46.297907 containerd[1759]: time="2025-01-14T13:38:46.297541880Z" level=info msg="StopPodSandbox for \"d12ffc587626672804360aa4ee4df91d71f582754398565ceec309d37e3dd117\""
Jan 14 13:38:46.297907 containerd[1759]: time="2025-01-14T13:38:46.297627400Z" level=info msg="TearDown network for sandbox \"d12ffc587626672804360aa4ee4df91d71f582754398565ceec309d37e3dd117\" successfully"
Jan 14 13:38:46.297907 containerd[1759]: time="2025-01-14T13:38:46.297637000Z" level=info msg="StopPodSandbox for \"d12ffc587626672804360aa4ee4df91d71f582754398565ceec309d37e3dd117\" returns successfully"
Jan 14 13:38:46.302429 containerd[1759]: time="2025-01-14T13:38:46.302392843Z" level=info msg="StopPodSandbox for \"ea32c5d5841c18b721b720c17a89e0f12374a58e62e420be87dbcd53c441bb77\""
Jan 14 13:38:46.302602 containerd[1759]: time="2025-01-14T13:38:46.302489243Z" level=info msg="TearDown network for sandbox \"ea32c5d5841c18b721b720c17a89e0f12374a58e62e420be87dbcd53c441bb77\" successfully"
Jan 14 13:38:46.302602 containerd[1759]: time="2025-01-14T13:38:46.302499363Z" level=info msg="StopPodSandbox for \"ea32c5d5841c18b721b720c17a89e0f12374a58e62e420be87dbcd53c441bb77\" returns successfully"
Jan 14 13:38:46.304397 containerd[1759]: time="2025-01-14T13:38:46.303961043Z" level=info msg="StopPodSandbox for \"a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d\""
Jan 14 13:38:46.304519 containerd[1759]: time="2025-01-14T13:38:46.304287204Z" level=info msg="TearDown network for sandbox \"a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d\" successfully"
Jan 14 13:38:46.304614 containerd[1759]: time="2025-01-14T13:38:46.304599764Z" level=info msg="StopPodSandbox for \"a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d\" returns successfully"
Jan 14 13:38:46.306365 containerd[1759]: time="2025-01-14T13:38:46.305972964Z" level=info msg="StopPodSandbox for \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\""
Jan 14 13:38:46.307209 containerd[1759]: time="2025-01-14T13:38:46.307092005Z" level=info msg="TearDown network for sandbox \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\" successfully"
Jan 14 13:38:46.307209 containerd[1759]: time="2025-01-14T13:38:46.307136805Z" level=info msg="StopPodSandbox for \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\" returns successfully"
Jan 14 13:38:46.308682 containerd[1759]: time="2025-01-14T13:38:46.308649046Z" level=info msg="StopPodSandbox for \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\""
Jan 14 13:38:46.308783 containerd[1759]: time="2025-01-14T13:38:46.308739926Z" level=info msg="TearDown network for sandbox \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\" successfully"
Jan 14 13:38:46.308783 containerd[1759]: time="2025-01-14T13:38:46.308750606Z" level=info msg="StopPodSandbox for \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\" returns successfully"
Jan 14 13:38:46.309235 containerd[1759]: time="2025-01-14T13:38:46.309213526Z" level=info msg="StopPodSandbox for \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\""
Jan 14 13:38:46.310066 containerd[1759]: time="2025-01-14T13:38:46.309449766Z" level=info msg="TearDown network for sandbox \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\" successfully"
Jan 14 13:38:46.310178 containerd[1759]: time="2025-01-14T13:38:46.309466446Z" level=info msg="StopPodSandbox for \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\" returns successfully"
Jan 14 13:38:46.311757 containerd[1759]: time="2025-01-14T13:38:46.311643127Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7bk8l,Uid:af70d9ef-1b20-4f9f-93e1-f55f680c58b4,Namespace:calico-system,Attempt:7,}"
Jan 14 13:38:46.716202 systemd-networkd[1331]: cali98d6c895af8: Link UP
Jan 14 13:38:46.716434 systemd-networkd[1331]: cali98d6c895af8: Gained carrier
Jan 14 13:38:46.718599 systemd-networkd[1331]: cali6b2f1bd0bdd: Link UP
Jan 14 13:38:46.719602 systemd-networkd[1331]: cali6b2f1bd0bdd: Gained carrier
Jan 14 13:38:46.773661 containerd[1759]: 2025-01-14 13:38:46.433 [INFO][5595] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist
Jan 14 13:38:46.773661 containerd[1759]: 2025-01-14 13:38:46.453 [INFO][5595] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--zcsrg-eth0 calico-apiserver-7478669d4- calico-apiserver  35d51e43-1c53-427a-a8d2-bd422d727c5b 775 0 2025-01-14 13:37:59 +0000 UTC <nil> <nil> map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7478669d4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s  ci-4186.1.0-a-e83668d6e0  calico-apiserver-7478669d4-zcsrg eth0 calico-apiserver [] []   [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali6b2f1bd0bdd  [] []}} ContainerID="95853ddbdc095492878251f449a5be1dfbbeac971b0e19ffd50fb9e4b952d1e5" Namespace="calico-apiserver" Pod="calico-apiserver-7478669d4-zcsrg" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--zcsrg-"
Jan 14 13:38:46.773661 containerd[1759]: 2025-01-14 13:38:46.454 [INFO][5595] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="95853ddbdc095492878251f449a5be1dfbbeac971b0e19ffd50fb9e4b952d1e5" Namespace="calico-apiserver" Pod="calico-apiserver-7478669d4-zcsrg" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--zcsrg-eth0"
Jan 14 13:38:46.773661 containerd[1759]: 2025-01-14 13:38:46.574 [INFO][5616] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="95853ddbdc095492878251f449a5be1dfbbeac971b0e19ffd50fb9e4b952d1e5" HandleID="k8s-pod-network.95853ddbdc095492878251f449a5be1dfbbeac971b0e19ffd50fb9e4b952d1e5" Workload="ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--zcsrg-eth0"
Jan 14 13:38:46.773661 containerd[1759]: 2025-01-14 13:38:46.598 [INFO][5616] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="95853ddbdc095492878251f449a5be1dfbbeac971b0e19ffd50fb9e4b952d1e5" HandleID="k8s-pod-network.95853ddbdc095492878251f449a5be1dfbbeac971b0e19ffd50fb9e4b952d1e5" Workload="ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--zcsrg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400030deb0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4186.1.0-a-e83668d6e0", "pod":"calico-apiserver-7478669d4-zcsrg", "timestamp":"2025-01-14 13:38:46.574246902 +0000 UTC"}, Hostname:"ci-4186.1.0-a-e83668d6e0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Jan 14 13:38:46.773661 containerd[1759]: 2025-01-14 13:38:46.598 [INFO][5616] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Jan 14 13:38:46.773661 containerd[1759]: 2025-01-14 13:38:46.622 [INFO][5616] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Jan 14 13:38:46.773661 containerd[1759]: 2025-01-14 13:38:46.622 [INFO][5616] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4186.1.0-a-e83668d6e0'
Jan 14 13:38:46.773661 containerd[1759]: 2025-01-14 13:38:46.627 [INFO][5616] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.95853ddbdc095492878251f449a5be1dfbbeac971b0e19ffd50fb9e4b952d1e5" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.773661 containerd[1759]: 2025-01-14 13:38:46.638 [INFO][5616] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.773661 containerd[1759]: 2025-01-14 13:38:46.646 [INFO][5616] ipam/ipam.go 489: Trying affinity for 192.168.85.0/26 host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.773661 containerd[1759]: 2025-01-14 13:38:46.649 [INFO][5616] ipam/ipam.go 155: Attempting to load block cidr=192.168.85.0/26 host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.773661 containerd[1759]: 2025-01-14 13:38:46.653 [INFO][5616] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.85.0/26 host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.773661 containerd[1759]: 2025-01-14 13:38:46.654 [INFO][5616] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.85.0/26 handle="k8s-pod-network.95853ddbdc095492878251f449a5be1dfbbeac971b0e19ffd50fb9e4b952d1e5" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.773661 containerd[1759]: 2025-01-14 13:38:46.661 [INFO][5616] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.95853ddbdc095492878251f449a5be1dfbbeac971b0e19ffd50fb9e4b952d1e5
Jan 14 13:38:46.773661 containerd[1759]: 2025-01-14 13:38:46.671 [INFO][5616] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.85.0/26 handle="k8s-pod-network.95853ddbdc095492878251f449a5be1dfbbeac971b0e19ffd50fb9e4b952d1e5" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.773661 containerd[1759]: 2025-01-14 13:38:46.687 [INFO][5616] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.85.1/26] block=192.168.85.0/26 handle="k8s-pod-network.95853ddbdc095492878251f449a5be1dfbbeac971b0e19ffd50fb9e4b952d1e5" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.773661 containerd[1759]: 2025-01-14 13:38:46.688 [INFO][5616] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.85.1/26] handle="k8s-pod-network.95853ddbdc095492878251f449a5be1dfbbeac971b0e19ffd50fb9e4b952d1e5" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.773661 containerd[1759]: 2025-01-14 13:38:46.688 [INFO][5616] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Jan 14 13:38:46.773661 containerd[1759]: 2025-01-14 13:38:46.688 [INFO][5616] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.85.1/26] IPv6=[] ContainerID="95853ddbdc095492878251f449a5be1dfbbeac971b0e19ffd50fb9e4b952d1e5" HandleID="k8s-pod-network.95853ddbdc095492878251f449a5be1dfbbeac971b0e19ffd50fb9e4b952d1e5" Workload="ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--zcsrg-eth0"
Jan 14 13:38:46.775887 containerd[1759]: 2025-01-14 13:38:46.691 [INFO][5595] cni-plugin/k8s.go 386: Populated endpoint ContainerID="95853ddbdc095492878251f449a5be1dfbbeac971b0e19ffd50fb9e4b952d1e5" Namespace="calico-apiserver" Pod="calico-apiserver-7478669d4-zcsrg" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--zcsrg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--zcsrg-eth0", GenerateName:"calico-apiserver-7478669d4-", Namespace:"calico-apiserver", SelfLink:"", UID:"35d51e43-1c53-427a-a8d2-bd422d727c5b", ResourceVersion:"775", Generation:0, CreationTimestamp:time.Date(2025, time.January, 14, 13, 37, 59, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7478669d4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186.1.0-a-e83668d6e0", ContainerID:"", Pod:"calico-apiserver-7478669d4-zcsrg", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.85.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali6b2f1bd0bdd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Jan 14 13:38:46.775887 containerd[1759]: 2025-01-14 13:38:46.691 [INFO][5595] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.85.1/32] ContainerID="95853ddbdc095492878251f449a5be1dfbbeac971b0e19ffd50fb9e4b952d1e5" Namespace="calico-apiserver" Pod="calico-apiserver-7478669d4-zcsrg" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--zcsrg-eth0"
Jan 14 13:38:46.775887 containerd[1759]: 2025-01-14 13:38:46.691 [INFO][5595] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6b2f1bd0bdd ContainerID="95853ddbdc095492878251f449a5be1dfbbeac971b0e19ffd50fb9e4b952d1e5" Namespace="calico-apiserver" Pod="calico-apiserver-7478669d4-zcsrg" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--zcsrg-eth0"
Jan 14 13:38:46.775887 containerd[1759]: 2025-01-14 13:38:46.721 [INFO][5595] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="95853ddbdc095492878251f449a5be1dfbbeac971b0e19ffd50fb9e4b952d1e5" Namespace="calico-apiserver" Pod="calico-apiserver-7478669d4-zcsrg" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--zcsrg-eth0"
Jan 14 13:38:46.775887 containerd[1759]: 2025-01-14 13:38:46.727 [INFO][5595] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="95853ddbdc095492878251f449a5be1dfbbeac971b0e19ffd50fb9e4b952d1e5" Namespace="calico-apiserver" Pod="calico-apiserver-7478669d4-zcsrg" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--zcsrg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--zcsrg-eth0", GenerateName:"calico-apiserver-7478669d4-", Namespace:"calico-apiserver", SelfLink:"", UID:"35d51e43-1c53-427a-a8d2-bd422d727c5b", ResourceVersion:"775", Generation:0, CreationTimestamp:time.Date(2025, time.January, 14, 13, 37, 59, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7478669d4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186.1.0-a-e83668d6e0", ContainerID:"95853ddbdc095492878251f449a5be1dfbbeac971b0e19ffd50fb9e4b952d1e5", Pod:"calico-apiserver-7478669d4-zcsrg", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.85.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali6b2f1bd0bdd", MAC:"be:51:a2:e5:32:d0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Jan 14 13:38:46.775887 containerd[1759]: 2025-01-14 13:38:46.769 [INFO][5595] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="95853ddbdc095492878251f449a5be1dfbbeac971b0e19ffd50fb9e4b952d1e5" Namespace="calico-apiserver" Pod="calico-apiserver-7478669d4-zcsrg" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--zcsrg-eth0"
Jan 14 13:38:46.788950 containerd[1759]: 2025-01-14 13:38:46.398 [INFO][5578] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist
Jan 14 13:38:46.788950 containerd[1759]: 2025-01-14 13:38:46.424 [INFO][5578] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--9xx9c-eth0 calico-apiserver-7478669d4- calico-apiserver  f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1 772 0 2025-01-14 13:37:58 +0000 UTC <nil> <nil> map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7478669d4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s  ci-4186.1.0-a-e83668d6e0  calico-apiserver-7478669d4-9xx9c eth0 calico-apiserver [] []   [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali98d6c895af8  [] []}} ContainerID="bdc029994400fed901d99dbf8bea2a850f1dc8c7fd7f138e702ec71b9321c24f" Namespace="calico-apiserver" Pod="calico-apiserver-7478669d4-9xx9c" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--9xx9c-"
Jan 14 13:38:46.788950 containerd[1759]: 2025-01-14 13:38:46.424 [INFO][5578] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="bdc029994400fed901d99dbf8bea2a850f1dc8c7fd7f138e702ec71b9321c24f" Namespace="calico-apiserver" Pod="calico-apiserver-7478669d4-9xx9c" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--9xx9c-eth0"
Jan 14 13:38:46.788950 containerd[1759]: 2025-01-14 13:38:46.486 [INFO][5607] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bdc029994400fed901d99dbf8bea2a850f1dc8c7fd7f138e702ec71b9321c24f" HandleID="k8s-pod-network.bdc029994400fed901d99dbf8bea2a850f1dc8c7fd7f138e702ec71b9321c24f" Workload="ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--9xx9c-eth0"
Jan 14 13:38:46.788950 containerd[1759]: 2025-01-14 13:38:46.508 [INFO][5607] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bdc029994400fed901d99dbf8bea2a850f1dc8c7fd7f138e702ec71b9321c24f" HandleID="k8s-pod-network.bdc029994400fed901d99dbf8bea2a850f1dc8c7fd7f138e702ec71b9321c24f" Workload="ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--9xx9c-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004d640), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4186.1.0-a-e83668d6e0", "pod":"calico-apiserver-7478669d4-9xx9c", "timestamp":"2025-01-14 13:38:46.486445457 +0000 UTC"}, Hostname:"ci-4186.1.0-a-e83668d6e0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Jan 14 13:38:46.788950 containerd[1759]: 2025-01-14 13:38:46.508 [INFO][5607] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Jan 14 13:38:46.788950 containerd[1759]: 2025-01-14 13:38:46.508 [INFO][5607] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Jan 14 13:38:46.788950 containerd[1759]: 2025-01-14 13:38:46.508 [INFO][5607] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4186.1.0-a-e83668d6e0'
Jan 14 13:38:46.788950 containerd[1759]: 2025-01-14 13:38:46.511 [INFO][5607] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.bdc029994400fed901d99dbf8bea2a850f1dc8c7fd7f138e702ec71b9321c24f" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.788950 containerd[1759]: 2025-01-14 13:38:46.518 [INFO][5607] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.788950 containerd[1759]: 2025-01-14 13:38:46.531 [INFO][5607] ipam/ipam.go 521: Ran out of existing affine blocks for host host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.788950 containerd[1759]: 2025-01-14 13:38:46.537 [INFO][5607] ipam/ipam.go 538: Tried all affine blocks. Looking for an affine block with space, or a new unclaimed block host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.788950 containerd[1759]: 2025-01-14 13:38:46.540 [INFO][5607] ipam/ipam_block_reader_writer.go 154: Found free block: 192.168.85.0/26
Jan 14 13:38:46.788950 containerd[1759]: 2025-01-14 13:38:46.540 [INFO][5607] ipam/ipam.go 550: Found unclaimed block host="ci-4186.1.0-a-e83668d6e0" subnet=192.168.85.0/26
Jan 14 13:38:46.788950 containerd[1759]: 2025-01-14 13:38:46.540 [INFO][5607] ipam/ipam_block_reader_writer.go 171: Trying to create affinity in pending state host="ci-4186.1.0-a-e83668d6e0" subnet=192.168.85.0/26
Jan 14 13:38:46.788950 containerd[1759]: 2025-01-14 13:38:46.550 [INFO][5607] ipam/ipam_block_reader_writer.go 201: Successfully created pending affinity for block host="ci-4186.1.0-a-e83668d6e0" subnet=192.168.85.0/26
Jan 14 13:38:46.788950 containerd[1759]: 2025-01-14 13:38:46.551 [INFO][5607] ipam/ipam.go 155: Attempting to load block cidr=192.168.85.0/26 host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.788950 containerd[1759]: 2025-01-14 13:38:46.554 [INFO][5607] ipam/ipam.go 160: The referenced block doesn't exist, trying to create it cidr=192.168.85.0/26 host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.788950 containerd[1759]: 2025-01-14 13:38:46.558 [INFO][5607] ipam/ipam.go 167: Wrote affinity as pending cidr=192.168.85.0/26 host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.788950 containerd[1759]: 2025-01-14 13:38:46.562 [INFO][5607] ipam/ipam.go 176: Attempting to claim the block cidr=192.168.85.0/26 host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.788950 containerd[1759]: 2025-01-14 13:38:46.562 [INFO][5607] ipam/ipam_block_reader_writer.go 223: Attempting to create a new block host="ci-4186.1.0-a-e83668d6e0" subnet=192.168.85.0/26
Jan 14 13:38:46.788950 containerd[1759]: 2025-01-14 13:38:46.575 [INFO][5607] ipam/ipam_block_reader_writer.go 264: Successfully created block
Jan 14 13:38:46.788950 containerd[1759]: 2025-01-14 13:38:46.576 [INFO][5607] ipam/ipam_block_reader_writer.go 275: Confirming affinity host="ci-4186.1.0-a-e83668d6e0" subnet=192.168.85.0/26
Jan 14 13:38:46.788950 containerd[1759]: 2025-01-14 13:38:46.592 [INFO][5607] ipam/ipam_block_reader_writer.go 290: Successfully confirmed affinity host="ci-4186.1.0-a-e83668d6e0" subnet=192.168.85.0/26
Jan 14 13:38:46.788950 containerd[1759]: 2025-01-14 13:38:46.592 [INFO][5607] ipam/ipam.go 585: Block '192.168.85.0/26' has 64 free ips which is more than 1 ips required. host="ci-4186.1.0-a-e83668d6e0" subnet=192.168.85.0/26
Jan 14 13:38:46.788950 containerd[1759]: 2025-01-14 13:38:46.592 [INFO][5607] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.85.0/26 handle="k8s-pod-network.bdc029994400fed901d99dbf8bea2a850f1dc8c7fd7f138e702ec71b9321c24f" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.788950 containerd[1759]: 2025-01-14 13:38:46.598 [INFO][5607] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.bdc029994400fed901d99dbf8bea2a850f1dc8c7fd7f138e702ec71b9321c24f
Jan 14 13:38:46.790036 containerd[1759]: 2025-01-14 13:38:46.612 [INFO][5607] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.85.0/26 handle="k8s-pod-network.bdc029994400fed901d99dbf8bea2a850f1dc8c7fd7f138e702ec71b9321c24f" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.790036 containerd[1759]: 2025-01-14 13:38:46.622 [INFO][5607] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.85.0/26] block=192.168.85.0/26 handle="k8s-pod-network.bdc029994400fed901d99dbf8bea2a850f1dc8c7fd7f138e702ec71b9321c24f" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.790036 containerd[1759]: 2025-01-14 13:38:46.622 [INFO][5607] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.85.0/26] handle="k8s-pod-network.bdc029994400fed901d99dbf8bea2a850f1dc8c7fd7f138e702ec71b9321c24f" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.790036 containerd[1759]: 2025-01-14 13:38:46.622 [INFO][5607] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Jan 14 13:38:46.790036 containerd[1759]: 2025-01-14 13:38:46.622 [INFO][5607] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.85.0/26] IPv6=[] ContainerID="bdc029994400fed901d99dbf8bea2a850f1dc8c7fd7f138e702ec71b9321c24f" HandleID="k8s-pod-network.bdc029994400fed901d99dbf8bea2a850f1dc8c7fd7f138e702ec71b9321c24f" Workload="ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--9xx9c-eth0"
Jan 14 13:38:46.790036 containerd[1759]: 2025-01-14 13:38:46.639 [INFO][5578] cni-plugin/k8s.go 386: Populated endpoint ContainerID="bdc029994400fed901d99dbf8bea2a850f1dc8c7fd7f138e702ec71b9321c24f" Namespace="calico-apiserver" Pod="calico-apiserver-7478669d4-9xx9c" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--9xx9c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--9xx9c-eth0", GenerateName:"calico-apiserver-7478669d4-", Namespace:"calico-apiserver", SelfLink:"", UID:"f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1", ResourceVersion:"772", Generation:0, CreationTimestamp:time.Date(2025, time.January, 14, 13, 37, 58, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7478669d4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186.1.0-a-e83668d6e0", ContainerID:"", Pod:"calico-apiserver-7478669d4-9xx9c", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.85.0/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali98d6c895af8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Jan 14 13:38:46.790036 containerd[1759]: 2025-01-14 13:38:46.641 [INFO][5578] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.85.0/32] ContainerID="bdc029994400fed901d99dbf8bea2a850f1dc8c7fd7f138e702ec71b9321c24f" Namespace="calico-apiserver" Pod="calico-apiserver-7478669d4-9xx9c" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--9xx9c-eth0"
Jan 14 13:38:46.790036 containerd[1759]: 2025-01-14 13:38:46.641 [INFO][5578] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali98d6c895af8 ContainerID="bdc029994400fed901d99dbf8bea2a850f1dc8c7fd7f138e702ec71b9321c24f" Namespace="calico-apiserver" Pod="calico-apiserver-7478669d4-9xx9c" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--9xx9c-eth0"
Jan 14 13:38:46.790036 containerd[1759]: 2025-01-14 13:38:46.717 [INFO][5578] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bdc029994400fed901d99dbf8bea2a850f1dc8c7fd7f138e702ec71b9321c24f" Namespace="calico-apiserver" Pod="calico-apiserver-7478669d4-9xx9c" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--9xx9c-eth0"
Jan 14 13:38:46.790036 containerd[1759]: 2025-01-14 13:38:46.721 [INFO][5578] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="bdc029994400fed901d99dbf8bea2a850f1dc8c7fd7f138e702ec71b9321c24f" Namespace="calico-apiserver" Pod="calico-apiserver-7478669d4-9xx9c" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--9xx9c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--9xx9c-eth0", GenerateName:"calico-apiserver-7478669d4-", Namespace:"calico-apiserver", SelfLink:"", UID:"f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1", ResourceVersion:"772", Generation:0, CreationTimestamp:time.Date(2025, time.January, 14, 13, 37, 58, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7478669d4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186.1.0-a-e83668d6e0", ContainerID:"bdc029994400fed901d99dbf8bea2a850f1dc8c7fd7f138e702ec71b9321c24f", Pod:"calico-apiserver-7478669d4-9xx9c", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.85.0/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali98d6c895af8", MAC:"32:31:98:96:3d:12", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Jan 14 13:38:46.790297 containerd[1759]: 2025-01-14 13:38:46.766 [INFO][5578] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="bdc029994400fed901d99dbf8bea2a850f1dc8c7fd7f138e702ec71b9321c24f" Namespace="calico-apiserver" Pod="calico-apiserver-7478669d4-9xx9c" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-calico--apiserver--7478669d4--9xx9c-eth0"
Jan 14 13:38:46.836673 systemd-networkd[1331]: calidc70f3b220d: Link UP
Jan 14 13:38:46.837699 systemd-networkd[1331]: calidc70f3b220d: Gained carrier
Jan 14 13:38:46.877854 containerd[1759]: 2025-01-14 13:38:46.523 [INFO][5617] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist
Jan 14 13:38:46.877854 containerd[1759]: 2025-01-14 13:38:46.562 [INFO][5617] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4186.1.0--a--e83668d6e0-k8s-csi--node--driver--7bk8l-eth0 csi-node-driver- calico-system  af70d9ef-1b20-4f9f-93e1-f55f680c58b4 632 0 2025-01-14 13:37:59 +0000 UTC <nil> <nil> map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:55b695c467 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s  ci-4186.1.0-a-e83668d6e0  csi-node-driver-7bk8l eth0 csi-node-driver [] []   [kns.calico-system ksa.calico-system.csi-node-driver] calidc70f3b220d  [] []}} ContainerID="4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7" Namespace="calico-system" Pod="csi-node-driver-7bk8l" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-csi--node--driver--7bk8l-"
Jan 14 13:38:46.877854 containerd[1759]: 2025-01-14 13:38:46.562 [INFO][5617] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7" Namespace="calico-system" Pod="csi-node-driver-7bk8l" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-csi--node--driver--7bk8l-eth0"
Jan 14 13:38:46.877854 containerd[1759]: 2025-01-14 13:38:46.634 [INFO][5649] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7" HandleID="k8s-pod-network.4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7" Workload="ci--4186.1.0--a--e83668d6e0-k8s-csi--node--driver--7bk8l-eth0"
Jan 14 13:38:46.877854 containerd[1759]: 2025-01-14 13:38:46.681 [INFO][5649] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7" HandleID="k8s-pod-network.4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7" Workload="ci--4186.1.0--a--e83668d6e0-k8s-csi--node--driver--7bk8l-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003175d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4186.1.0-a-e83668d6e0", "pod":"csi-node-driver-7bk8l", "timestamp":"2025-01-14 13:38:46.634409652 +0000 UTC"}, Hostname:"ci-4186.1.0-a-e83668d6e0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Jan 14 13:38:46.877854 containerd[1759]: 2025-01-14 13:38:46.681 [INFO][5649] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Jan 14 13:38:46.877854 containerd[1759]: 2025-01-14 13:38:46.689 [INFO][5649] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Jan 14 13:38:46.877854 containerd[1759]: 2025-01-14 13:38:46.689 [INFO][5649] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4186.1.0-a-e83668d6e0'
Jan 14 13:38:46.877854 containerd[1759]: 2025-01-14 13:38:46.696 [INFO][5649] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.877854 containerd[1759]: 2025-01-14 13:38:46.707 [INFO][5649] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.877854 containerd[1759]: 2025-01-14 13:38:46.730 [INFO][5649] ipam/ipam.go 489: Trying affinity for 192.168.85.0/26 host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.877854 containerd[1759]: 2025-01-14 13:38:46.753 [INFO][5649] ipam/ipam.go 155: Attempting to load block cidr=192.168.85.0/26 host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.877854 containerd[1759]: 2025-01-14 13:38:46.770 [INFO][5649] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.85.0/26 host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.877854 containerd[1759]: 2025-01-14 13:38:46.770 [INFO][5649] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.85.0/26 handle="k8s-pod-network.4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.877854 containerd[1759]: 2025-01-14 13:38:46.782 [INFO][5649] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7
Jan 14 13:38:46.877854 containerd[1759]: 2025-01-14 13:38:46.794 [INFO][5649] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.85.0/26 handle="k8s-pod-network.4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.877854 containerd[1759]: 2025-01-14 13:38:46.812 [INFO][5649] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.85.2/26] block=192.168.85.0/26 handle="k8s-pod-network.4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.877854 containerd[1759]: 2025-01-14 13:38:46.812 [INFO][5649] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.85.2/26] handle="k8s-pod-network.4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:46.877854 containerd[1759]: 2025-01-14 13:38:46.812 [INFO][5649] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Jan 14 13:38:46.877854 containerd[1759]: 2025-01-14 13:38:46.813 [INFO][5649] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.85.2/26] IPv6=[] ContainerID="4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7" HandleID="k8s-pod-network.4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7" Workload="ci--4186.1.0--a--e83668d6e0-k8s-csi--node--driver--7bk8l-eth0"
Jan 14 13:38:46.878983 containerd[1759]: 2025-01-14 13:38:46.825 [INFO][5617] cni-plugin/k8s.go 386: Populated endpoint ContainerID="4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7" Namespace="calico-system" Pod="csi-node-driver-7bk8l" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-csi--node--driver--7bk8l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186.1.0--a--e83668d6e0-k8s-csi--node--driver--7bk8l-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"af70d9ef-1b20-4f9f-93e1-f55f680c58b4", ResourceVersion:"632", Generation:0, CreationTimestamp:time.Date(2025, time.January, 14, 13, 37, 59, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"55b695c467", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186.1.0-a-e83668d6e0", ContainerID:"", Pod:"csi-node-driver-7bk8l", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.85.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calidc70f3b220d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Jan 14 13:38:46.878983 containerd[1759]: 2025-01-14 13:38:46.825 [INFO][5617] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.85.2/32] ContainerID="4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7" Namespace="calico-system" Pod="csi-node-driver-7bk8l" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-csi--node--driver--7bk8l-eth0"
Jan 14 13:38:46.878983 containerd[1759]: 2025-01-14 13:38:46.826 [INFO][5617] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidc70f3b220d ContainerID="4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7" Namespace="calico-system" Pod="csi-node-driver-7bk8l" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-csi--node--driver--7bk8l-eth0"
Jan 14 13:38:46.878983 containerd[1759]: 2025-01-14 13:38:46.839 [INFO][5617] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7" Namespace="calico-system" Pod="csi-node-driver-7bk8l" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-csi--node--driver--7bk8l-eth0"
Jan 14 13:38:46.878983 containerd[1759]: 2025-01-14 13:38:46.840 [INFO][5617] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7" Namespace="calico-system" Pod="csi-node-driver-7bk8l" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-csi--node--driver--7bk8l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186.1.0--a--e83668d6e0-k8s-csi--node--driver--7bk8l-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"af70d9ef-1b20-4f9f-93e1-f55f680c58b4", ResourceVersion:"632", Generation:0, CreationTimestamp:time.Date(2025, time.January, 14, 13, 37, 59, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"55b695c467", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186.1.0-a-e83668d6e0", ContainerID:"4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7", Pod:"csi-node-driver-7bk8l", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.85.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calidc70f3b220d", MAC:"8e:76:61:9b:28:0c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Jan 14 13:38:46.878983 containerd[1759]: 2025-01-14 13:38:46.872 [INFO][5617] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7" Namespace="calico-system" Pod="csi-node-driver-7bk8l" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-csi--node--driver--7bk8l-eth0"
Jan 14 13:38:46.901525 containerd[1759]: time="2025-01-14T13:38:46.901434029Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 14 13:38:46.902439 containerd[1759]: time="2025-01-14T13:38:46.902397510Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 14 13:38:46.902951 containerd[1759]: time="2025-01-14T13:38:46.902909510Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 14 13:38:46.904693 containerd[1759]: time="2025-01-14T13:38:46.904644991Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 14 13:38:46.909009 containerd[1759]: time="2025-01-14T13:38:46.906769392Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 14 13:38:46.909009 containerd[1759]: time="2025-01-14T13:38:46.907038672Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 14 13:38:46.909009 containerd[1759]: time="2025-01-14T13:38:46.907052992Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 14 13:38:46.909009 containerd[1759]: time="2025-01-14T13:38:46.907196992Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 14 13:38:46.940879 systemd-networkd[1331]: cali046d88438d6: Link UP
Jan 14 13:38:46.951247 systemd-networkd[1331]: cali046d88438d6: Gained carrier
Jan 14 13:38:46.959716 systemd[1]: run-netns-cni\x2db7a5b5b3\x2d67d2\x2db1af\x2dad6c\x2dc0983b5251a3.mount: Deactivated successfully.
Jan 14 13:38:46.960338 systemd[1]: run-netns-cni\x2d0861c260\x2de115\x2daa8b\x2d3a4c\x2d92918c570ca4.mount: Deactivated successfully.
Jan 14 13:38:46.960393 systemd[1]: run-netns-cni\x2dec492530\x2d18af\x2d44ce\x2dbb9f\x2deb451d5a3b22.mount: Deactivated successfully.
Jan 14 13:38:46.960441 systemd[1]: run-netns-cni\x2d079b2afd\x2d7dbe\x2deeb6\x2d8e9d\x2d81ef562f8814.mount: Deactivated successfully.
Jan 14 13:38:46.960486 systemd[1]: run-netns-cni\x2df9e12b50\x2de352\x2da090\x2d2532\x2da95bb4f5f3e3.mount: Deactivated successfully.
Jan 14 13:38:46.960528 systemd[1]: run-netns-cni\x2d2b4c3364\x2dd26c\x2d4792\x2d66ad\x2d0a5573563944.mount: Deactivated successfully.
Jan 14 13:38:46.977595 systemd[1]: Started cri-containerd-bdc029994400fed901d99dbf8bea2a850f1dc8c7fd7f138e702ec71b9321c24f.scope - libcontainer container bdc029994400fed901d99dbf8bea2a850f1dc8c7fd7f138e702ec71b9321c24f.
Jan 14 13:38:47.004462 containerd[1759]: 2025-01-14 13:38:46.570 [INFO][5632] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist
Jan 14 13:38:47.004462 containerd[1759]: 2025-01-14 13:38:46.613 [INFO][5632] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--msgc9-eth0 coredns-76f75df574- kube-system  c15a84fb-ac68-4acd-b385-a152a2911116 771 0 2025-01-14 13:37:53 +0000 UTC <nil> <nil> map[k8s-app:kube-dns pod-template-hash:76f75df574 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s  ci-4186.1.0-a-e83668d6e0  coredns-76f75df574-msgc9 eth0 coredns [] []   [kns.kube-system ksa.kube-system.coredns] cali046d88438d6  [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="d88fc5032b5fda138c2b2e84ece169a783ef4c3740ed0a5f2d598b1fec93b5c8" Namespace="kube-system" Pod="coredns-76f75df574-msgc9" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--msgc9-"
Jan 14 13:38:47.004462 containerd[1759]: 2025-01-14 13:38:46.613 [INFO][5632] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="d88fc5032b5fda138c2b2e84ece169a783ef4c3740ed0a5f2d598b1fec93b5c8" Namespace="kube-system" Pod="coredns-76f75df574-msgc9" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--msgc9-eth0"
Jan 14 13:38:47.004462 containerd[1759]: 2025-01-14 13:38:46.727 [INFO][5658] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d88fc5032b5fda138c2b2e84ece169a783ef4c3740ed0a5f2d598b1fec93b5c8" HandleID="k8s-pod-network.d88fc5032b5fda138c2b2e84ece169a783ef4c3740ed0a5f2d598b1fec93b5c8" Workload="ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--msgc9-eth0"
Jan 14 13:38:47.004462 containerd[1759]: 2025-01-14 13:38:46.777 [INFO][5658] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d88fc5032b5fda138c2b2e84ece169a783ef4c3740ed0a5f2d598b1fec93b5c8" HandleID="k8s-pod-network.d88fc5032b5fda138c2b2e84ece169a783ef4c3740ed0a5f2d598b1fec93b5c8" Workload="ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--msgc9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003f4b40), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4186.1.0-a-e83668d6e0", "pod":"coredns-76f75df574-msgc9", "timestamp":"2025-01-14 13:38:46.72728494 +0000 UTC"}, Hostname:"ci-4186.1.0-a-e83668d6e0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Jan 14 13:38:47.004462 containerd[1759]: 2025-01-14 13:38:46.780 [INFO][5658] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Jan 14 13:38:47.004462 containerd[1759]: 2025-01-14 13:38:46.813 [INFO][5658] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Jan 14 13:38:47.004462 containerd[1759]: 2025-01-14 13:38:46.814 [INFO][5658] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4186.1.0-a-e83668d6e0'
Jan 14 13:38:47.004462 containerd[1759]: 2025-01-14 13:38:46.818 [INFO][5658] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.d88fc5032b5fda138c2b2e84ece169a783ef4c3740ed0a5f2d598b1fec93b5c8" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:47.004462 containerd[1759]: 2025-01-14 13:38:46.833 [INFO][5658] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:47.004462 containerd[1759]: 2025-01-14 13:38:46.856 [INFO][5658] ipam/ipam.go 489: Trying affinity for 192.168.85.0/26 host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:47.004462 containerd[1759]: 2025-01-14 13:38:46.863 [INFO][5658] ipam/ipam.go 155: Attempting to load block cidr=192.168.85.0/26 host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:47.004462 containerd[1759]: 2025-01-14 13:38:46.876 [INFO][5658] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.85.0/26 host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:47.004462 containerd[1759]: 2025-01-14 13:38:46.881 [INFO][5658] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.85.0/26 handle="k8s-pod-network.d88fc5032b5fda138c2b2e84ece169a783ef4c3740ed0a5f2d598b1fec93b5c8" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:47.004462 containerd[1759]: 2025-01-14 13:38:46.891 [INFO][5658] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.d88fc5032b5fda138c2b2e84ece169a783ef4c3740ed0a5f2d598b1fec93b5c8
Jan 14 13:38:47.004462 containerd[1759]: 2025-01-14 13:38:46.899 [INFO][5658] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.85.0/26 handle="k8s-pod-network.d88fc5032b5fda138c2b2e84ece169a783ef4c3740ed0a5f2d598b1fec93b5c8" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:47.004462 containerd[1759]: 2025-01-14 13:38:46.913 [INFO][5658] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.85.4/26] block=192.168.85.0/26 handle="k8s-pod-network.d88fc5032b5fda138c2b2e84ece169a783ef4c3740ed0a5f2d598b1fec93b5c8" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:47.004462 containerd[1759]: 2025-01-14 13:38:46.913 [INFO][5658] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.85.4/26] handle="k8s-pod-network.d88fc5032b5fda138c2b2e84ece169a783ef4c3740ed0a5f2d598b1fec93b5c8" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:47.004462 containerd[1759]: 2025-01-14 13:38:46.913 [INFO][5658] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Jan 14 13:38:47.004462 containerd[1759]: 2025-01-14 13:38:46.913 [INFO][5658] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.85.4/26] IPv6=[] ContainerID="d88fc5032b5fda138c2b2e84ece169a783ef4c3740ed0a5f2d598b1fec93b5c8" HandleID="k8s-pod-network.d88fc5032b5fda138c2b2e84ece169a783ef4c3740ed0a5f2d598b1fec93b5c8" Workload="ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--msgc9-eth0"
Jan 14 13:38:47.005026 containerd[1759]: 2025-01-14 13:38:46.919 [INFO][5632] cni-plugin/k8s.go 386: Populated endpoint ContainerID="d88fc5032b5fda138c2b2e84ece169a783ef4c3740ed0a5f2d598b1fec93b5c8" Namespace="kube-system" Pod="coredns-76f75df574-msgc9" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--msgc9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--msgc9-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"c15a84fb-ac68-4acd-b385-a152a2911116", ResourceVersion:"771", Generation:0, CreationTimestamp:time.Date(2025, time.January, 14, 13, 37, 53, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186.1.0-a-e83668d6e0", ContainerID:"", Pod:"coredns-76f75df574-msgc9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.85.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali046d88438d6", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}}
Jan 14 13:38:47.005026 containerd[1759]: 2025-01-14 13:38:46.920 [INFO][5632] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.85.4/32] ContainerID="d88fc5032b5fda138c2b2e84ece169a783ef4c3740ed0a5f2d598b1fec93b5c8" Namespace="kube-system" Pod="coredns-76f75df574-msgc9" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--msgc9-eth0"
Jan 14 13:38:47.005026 containerd[1759]: 2025-01-14 13:38:46.921 [INFO][5632] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali046d88438d6 ContainerID="d88fc5032b5fda138c2b2e84ece169a783ef4c3740ed0a5f2d598b1fec93b5c8" Namespace="kube-system" Pod="coredns-76f75df574-msgc9" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--msgc9-eth0"
Jan 14 13:38:47.005026 containerd[1759]: 2025-01-14 13:38:46.967 [INFO][5632] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d88fc5032b5fda138c2b2e84ece169a783ef4c3740ed0a5f2d598b1fec93b5c8" Namespace="kube-system" Pod="coredns-76f75df574-msgc9" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--msgc9-eth0"
Jan 14 13:38:47.005026 containerd[1759]: 2025-01-14 13:38:46.971 [INFO][5632] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="d88fc5032b5fda138c2b2e84ece169a783ef4c3740ed0a5f2d598b1fec93b5c8" Namespace="kube-system" Pod="coredns-76f75df574-msgc9" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--msgc9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--msgc9-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"c15a84fb-ac68-4acd-b385-a152a2911116", ResourceVersion:"771", Generation:0, CreationTimestamp:time.Date(2025, time.January, 14, 13, 37, 53, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186.1.0-a-e83668d6e0", ContainerID:"d88fc5032b5fda138c2b2e84ece169a783ef4c3740ed0a5f2d598b1fec93b5c8", Pod:"coredns-76f75df574-msgc9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.85.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali046d88438d6", MAC:"c2:64:d5:79:c6:67", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}}
Jan 14 13:38:47.005026 containerd[1759]: 2025-01-14 13:38:46.993 [INFO][5632] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="d88fc5032b5fda138c2b2e84ece169a783ef4c3740ed0a5f2d598b1fec93b5c8" Namespace="kube-system" Pod="coredns-76f75df574-msgc9" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--msgc9-eth0"
Jan 14 13:38:47.013555 systemd[1]: Started cri-containerd-95853ddbdc095492878251f449a5be1dfbbeac971b0e19ffd50fb9e4b952d1e5.scope - libcontainer container 95853ddbdc095492878251f449a5be1dfbbeac971b0e19ffd50fb9e4b952d1e5.
Jan 14 13:38:47.014080 containerd[1759]: time="2025-01-14T13:38:47.012840606Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 14 13:38:47.014080 containerd[1759]: time="2025-01-14T13:38:47.013215926Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 14 13:38:47.014080 containerd[1759]: time="2025-01-14T13:38:47.013235286Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 14 13:38:47.014080 containerd[1759]: time="2025-01-14T13:38:47.013480726Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 14 13:38:47.078185 systemd[1]: Started cri-containerd-4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7.scope - libcontainer container 4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7.
Jan 14 13:38:47.102835 systemd-networkd[1331]: calid9f260195a8: Link UP
Jan 14 13:38:47.106518 systemd-networkd[1331]: calid9f260195a8: Gained carrier
Jan 14 13:38:47.118384 containerd[1759]: time="2025-01-14T13:38:47.118253420Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 14 13:38:47.118384 containerd[1759]: time="2025-01-14T13:38:47.118316940Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 14 13:38:47.118384 containerd[1759]: time="2025-01-14T13:38:47.118329340Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 14 13:38:47.119193 containerd[1759]: time="2025-01-14T13:38:47.119082340Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 14 13:38:47.141236 containerd[1759]: 2025-01-14 13:38:46.752 [INFO][5674] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist
Jan 14 13:38:47.141236 containerd[1759]: 2025-01-14 13:38:46.782 [INFO][5674] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--2fhlb-eth0 coredns-76f75df574- kube-system  af2992f6-a261-4e1c-9cfc-1ef759086d8d 766 0 2025-01-14 13:37:53 +0000 UTC <nil> <nil> map[k8s-app:kube-dns pod-template-hash:76f75df574 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s  ci-4186.1.0-a-e83668d6e0  coredns-76f75df574-2fhlb eth0 coredns [] []   [kns.kube-system ksa.kube-system.coredns] calid9f260195a8  [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="ee004b77219c9492a1cb15a2be63988a89f7bfeef8f4d54ef7b57c5df0d40f04" Namespace="kube-system" Pod="coredns-76f75df574-2fhlb" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--2fhlb-"
Jan 14 13:38:47.141236 containerd[1759]: 2025-01-14 13:38:46.782 [INFO][5674] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="ee004b77219c9492a1cb15a2be63988a89f7bfeef8f4d54ef7b57c5df0d40f04" Namespace="kube-system" Pod="coredns-76f75df574-2fhlb" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--2fhlb-eth0"
Jan 14 13:38:47.141236 containerd[1759]: 2025-01-14 13:38:46.876 [INFO][5708] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ee004b77219c9492a1cb15a2be63988a89f7bfeef8f4d54ef7b57c5df0d40f04" HandleID="k8s-pod-network.ee004b77219c9492a1cb15a2be63988a89f7bfeef8f4d54ef7b57c5df0d40f04" Workload="ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--2fhlb-eth0"
Jan 14 13:38:47.141236 containerd[1759]: 2025-01-14 13:38:46.919 [INFO][5708] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ee004b77219c9492a1cb15a2be63988a89f7bfeef8f4d54ef7b57c5df0d40f04" HandleID="k8s-pod-network.ee004b77219c9492a1cb15a2be63988a89f7bfeef8f4d54ef7b57c5df0d40f04" Workload="ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--2fhlb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003ba560), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4186.1.0-a-e83668d6e0", "pod":"coredns-76f75df574-2fhlb", "timestamp":"2025-01-14 13:38:46.876448256 +0000 UTC"}, Hostname:"ci-4186.1.0-a-e83668d6e0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Jan 14 13:38:47.141236 containerd[1759]: 2025-01-14 13:38:46.919 [INFO][5708] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Jan 14 13:38:47.141236 containerd[1759]: 2025-01-14 13:38:46.919 [INFO][5708] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Jan 14 13:38:47.141236 containerd[1759]: 2025-01-14 13:38:46.920 [INFO][5708] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4186.1.0-a-e83668d6e0'
Jan 14 13:38:47.141236 containerd[1759]: 2025-01-14 13:38:46.933 [INFO][5708] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.ee004b77219c9492a1cb15a2be63988a89f7bfeef8f4d54ef7b57c5df0d40f04" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:47.141236 containerd[1759]: 2025-01-14 13:38:46.964 [INFO][5708] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:47.141236 containerd[1759]: 2025-01-14 13:38:46.985 [INFO][5708] ipam/ipam.go 489: Trying affinity for 192.168.85.0/26 host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:47.141236 containerd[1759]: 2025-01-14 13:38:46.992 [INFO][5708] ipam/ipam.go 155: Attempting to load block cidr=192.168.85.0/26 host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:47.141236 containerd[1759]: 2025-01-14 13:38:47.005 [INFO][5708] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.85.0/26 host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:47.141236 containerd[1759]: 2025-01-14 13:38:47.005 [INFO][5708] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.85.0/26 handle="k8s-pod-network.ee004b77219c9492a1cb15a2be63988a89f7bfeef8f4d54ef7b57c5df0d40f04" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:47.141236 containerd[1759]: 2025-01-14 13:38:47.014 [INFO][5708] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.ee004b77219c9492a1cb15a2be63988a89f7bfeef8f4d54ef7b57c5df0d40f04
Jan 14 13:38:47.141236 containerd[1759]: 2025-01-14 13:38:47.039 [INFO][5708] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.85.0/26 handle="k8s-pod-network.ee004b77219c9492a1cb15a2be63988a89f7bfeef8f4d54ef7b57c5df0d40f04" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:47.141236 containerd[1759]: 2025-01-14 13:38:47.081 [INFO][5708] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.85.5/26] block=192.168.85.0/26 handle="k8s-pod-network.ee004b77219c9492a1cb15a2be63988a89f7bfeef8f4d54ef7b57c5df0d40f04" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:47.141236 containerd[1759]: 2025-01-14 13:38:47.081 [INFO][5708] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.85.5/26] handle="k8s-pod-network.ee004b77219c9492a1cb15a2be63988a89f7bfeef8f4d54ef7b57c5df0d40f04" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:47.141236 containerd[1759]: 2025-01-14 13:38:47.081 [INFO][5708] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Jan 14 13:38:47.141236 containerd[1759]: 2025-01-14 13:38:47.081 [INFO][5708] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.85.5/26] IPv6=[] ContainerID="ee004b77219c9492a1cb15a2be63988a89f7bfeef8f4d54ef7b57c5df0d40f04" HandleID="k8s-pod-network.ee004b77219c9492a1cb15a2be63988a89f7bfeef8f4d54ef7b57c5df0d40f04" Workload="ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--2fhlb-eth0"
Jan 14 13:38:47.142611 containerd[1759]: 2025-01-14 13:38:47.086 [INFO][5674] cni-plugin/k8s.go 386: Populated endpoint ContainerID="ee004b77219c9492a1cb15a2be63988a89f7bfeef8f4d54ef7b57c5df0d40f04" Namespace="kube-system" Pod="coredns-76f75df574-2fhlb" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--2fhlb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--2fhlb-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"af2992f6-a261-4e1c-9cfc-1ef759086d8d", ResourceVersion:"766", Generation:0, CreationTimestamp:time.Date(2025, time.January, 14, 13, 37, 53, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186.1.0-a-e83668d6e0", ContainerID:"", Pod:"coredns-76f75df574-2fhlb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.85.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid9f260195a8", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}}
Jan 14 13:38:47.142611 containerd[1759]: 2025-01-14 13:38:47.088 [INFO][5674] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.85.5/32] ContainerID="ee004b77219c9492a1cb15a2be63988a89f7bfeef8f4d54ef7b57c5df0d40f04" Namespace="kube-system" Pod="coredns-76f75df574-2fhlb" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--2fhlb-eth0"
Jan 14 13:38:47.142611 containerd[1759]: 2025-01-14 13:38:47.088 [INFO][5674] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid9f260195a8 ContainerID="ee004b77219c9492a1cb15a2be63988a89f7bfeef8f4d54ef7b57c5df0d40f04" Namespace="kube-system" Pod="coredns-76f75df574-2fhlb" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--2fhlb-eth0"
Jan 14 13:38:47.142611 containerd[1759]: 2025-01-14 13:38:47.106 [INFO][5674] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ee004b77219c9492a1cb15a2be63988a89f7bfeef8f4d54ef7b57c5df0d40f04" Namespace="kube-system" Pod="coredns-76f75df574-2fhlb" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--2fhlb-eth0"
Jan 14 13:38:47.142611 containerd[1759]: 2025-01-14 13:38:47.107 [INFO][5674] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="ee004b77219c9492a1cb15a2be63988a89f7bfeef8f4d54ef7b57c5df0d40f04" Namespace="kube-system" Pod="coredns-76f75df574-2fhlb" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--2fhlb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--2fhlb-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"af2992f6-a261-4e1c-9cfc-1ef759086d8d", ResourceVersion:"766", Generation:0, CreationTimestamp:time.Date(2025, time.January, 14, 13, 37, 53, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186.1.0-a-e83668d6e0", ContainerID:"ee004b77219c9492a1cb15a2be63988a89f7bfeef8f4d54ef7b57c5df0d40f04", Pod:"coredns-76f75df574-2fhlb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.85.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid9f260195a8", MAC:"2a:3d:0f:3b:f7:f9", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}}
Jan 14 13:38:47.142611 containerd[1759]: 2025-01-14 13:38:47.132 [INFO][5674] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="ee004b77219c9492a1cb15a2be63988a89f7bfeef8f4d54ef7b57c5df0d40f04" Namespace="kube-system" Pod="coredns-76f75df574-2fhlb" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-coredns--76f75df574--2fhlb-eth0"
Jan 14 13:38:47.155329 systemd[1]: Started cri-containerd-d88fc5032b5fda138c2b2e84ece169a783ef4c3740ed0a5f2d598b1fec93b5c8.scope - libcontainer container d88fc5032b5fda138c2b2e84ece169a783ef4c3740ed0a5f2d598b1fec93b5c8.
Jan 14 13:38:47.180245 containerd[1759]: time="2025-01-14T13:38:47.180083652Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 14 13:38:47.180245 containerd[1759]: time="2025-01-14T13:38:47.180200492Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 14 13:38:47.180599 containerd[1759]: time="2025-01-14T13:38:47.180351252Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 14 13:38:47.181110 containerd[1759]: time="2025-01-14T13:38:47.181051572Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 14 13:38:47.208207 systemd[1]: Started cri-containerd-ee004b77219c9492a1cb15a2be63988a89f7bfeef8f4d54ef7b57c5df0d40f04.scope - libcontainer container ee004b77219c9492a1cb15a2be63988a89f7bfeef8f4d54ef7b57c5df0d40f04.
Jan 14 13:38:47.222730 systemd-networkd[1331]: cali69ee0d2d57f: Link UP
Jan 14 13:38:47.222987 systemd-networkd[1331]: cali69ee0d2d57f: Gained carrier
Jan 14 13:38:47.254257 containerd[1759]: 2025-01-14 13:38:46.699 [INFO][5659] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist
Jan 14 13:38:47.254257 containerd[1759]: 2025-01-14 13:38:46.789 [INFO][5659] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4186.1.0--a--e83668d6e0-k8s-calico--kube--controllers--8555cc7446--bv6h2-eth0 calico-kube-controllers-8555cc7446- calico-system  f475c534-3126-4484-b2ac-7f780fe28e12 773 0 2025-01-14 13:37:59 +0000 UTC <nil> <nil> map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:8555cc7446 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s  ci-4186.1.0-a-e83668d6e0  calico-kube-controllers-8555cc7446-bv6h2 eth0 calico-kube-controllers [] []   [kns.calico-system ksa.calico-system.calico-kube-controllers] cali69ee0d2d57f  [] []}} ContainerID="ba19fd317e9eb782512644afa8e6e30206edf39733ebd883e97260b2a756eaa2" Namespace="calico-system" Pod="calico-kube-controllers-8555cc7446-bv6h2" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-calico--kube--controllers--8555cc7446--bv6h2-"
Jan 14 13:38:47.254257 containerd[1759]: 2025-01-14 13:38:46.789 [INFO][5659] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="ba19fd317e9eb782512644afa8e6e30206edf39733ebd883e97260b2a756eaa2" Namespace="calico-system" Pod="calico-kube-controllers-8555cc7446-bv6h2" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-calico--kube--controllers--8555cc7446--bv6h2-eth0"
Jan 14 13:38:47.254257 containerd[1759]: 2025-01-14 13:38:47.002 [INFO][5709] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ba19fd317e9eb782512644afa8e6e30206edf39733ebd883e97260b2a756eaa2" HandleID="k8s-pod-network.ba19fd317e9eb782512644afa8e6e30206edf39733ebd883e97260b2a756eaa2" Workload="ci--4186.1.0--a--e83668d6e0-k8s-calico--kube--controllers--8555cc7446--bv6h2-eth0"
Jan 14 13:38:47.254257 containerd[1759]: 2025-01-14 13:38:47.071 [INFO][5709] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ba19fd317e9eb782512644afa8e6e30206edf39733ebd883e97260b2a756eaa2" HandleID="k8s-pod-network.ba19fd317e9eb782512644afa8e6e30206edf39733ebd883e97260b2a756eaa2" Workload="ci--4186.1.0--a--e83668d6e0-k8s-calico--kube--controllers--8555cc7446--bv6h2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004ceb0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4186.1.0-a-e83668d6e0", "pod":"calico-kube-controllers-8555cc7446-bv6h2", "timestamp":"2025-01-14 13:38:47.002870001 +0000 UTC"}, Hostname:"ci-4186.1.0-a-e83668d6e0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Jan 14 13:38:47.254257 containerd[1759]: 2025-01-14 13:38:47.071 [INFO][5709] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Jan 14 13:38:47.254257 containerd[1759]: 2025-01-14 13:38:47.081 [INFO][5709] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Jan 14 13:38:47.254257 containerd[1759]: 2025-01-14 13:38:47.081 [INFO][5709] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4186.1.0-a-e83668d6e0'
Jan 14 13:38:47.254257 containerd[1759]: 2025-01-14 13:38:47.085 [INFO][5709] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.ba19fd317e9eb782512644afa8e6e30206edf39733ebd883e97260b2a756eaa2" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:47.254257 containerd[1759]: 2025-01-14 13:38:47.102 [INFO][5709] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:47.254257 containerd[1759]: 2025-01-14 13:38:47.125 [INFO][5709] ipam/ipam.go 489: Trying affinity for 192.168.85.0/26 host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:47.254257 containerd[1759]: 2025-01-14 13:38:47.133 [INFO][5709] ipam/ipam.go 155: Attempting to load block cidr=192.168.85.0/26 host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:47.254257 containerd[1759]: 2025-01-14 13:38:47.141 [INFO][5709] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.85.0/26 host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:47.254257 containerd[1759]: 2025-01-14 13:38:47.142 [INFO][5709] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.85.0/26 handle="k8s-pod-network.ba19fd317e9eb782512644afa8e6e30206edf39733ebd883e97260b2a756eaa2" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:47.254257 containerd[1759]: 2025-01-14 13:38:47.160 [INFO][5709] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.ba19fd317e9eb782512644afa8e6e30206edf39733ebd883e97260b2a756eaa2
Jan 14 13:38:47.254257 containerd[1759]: 2025-01-14 13:38:47.171 [INFO][5709] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.85.0/26 handle="k8s-pod-network.ba19fd317e9eb782512644afa8e6e30206edf39733ebd883e97260b2a756eaa2" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:47.254257 containerd[1759]: 2025-01-14 13:38:47.203 [INFO][5709] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.85.6/26] block=192.168.85.0/26 handle="k8s-pod-network.ba19fd317e9eb782512644afa8e6e30206edf39733ebd883e97260b2a756eaa2" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:47.254257 containerd[1759]: 2025-01-14 13:38:47.204 [INFO][5709] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.85.6/26] handle="k8s-pod-network.ba19fd317e9eb782512644afa8e6e30206edf39733ebd883e97260b2a756eaa2" host="ci-4186.1.0-a-e83668d6e0"
Jan 14 13:38:47.254257 containerd[1759]: 2025-01-14 13:38:47.204 [INFO][5709] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Jan 14 13:38:47.254257 containerd[1759]: 2025-01-14 13:38:47.205 [INFO][5709] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.85.6/26] IPv6=[] ContainerID="ba19fd317e9eb782512644afa8e6e30206edf39733ebd883e97260b2a756eaa2" HandleID="k8s-pod-network.ba19fd317e9eb782512644afa8e6e30206edf39733ebd883e97260b2a756eaa2" Workload="ci--4186.1.0--a--e83668d6e0-k8s-calico--kube--controllers--8555cc7446--bv6h2-eth0"
Jan 14 13:38:47.256583 containerd[1759]: 2025-01-14 13:38:47.212 [INFO][5659] cni-plugin/k8s.go 386: Populated endpoint ContainerID="ba19fd317e9eb782512644afa8e6e30206edf39733ebd883e97260b2a756eaa2" Namespace="calico-system" Pod="calico-kube-controllers-8555cc7446-bv6h2" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-calico--kube--controllers--8555cc7446--bv6h2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186.1.0--a--e83668d6e0-k8s-calico--kube--controllers--8555cc7446--bv6h2-eth0", GenerateName:"calico-kube-controllers-8555cc7446-", Namespace:"calico-system", SelfLink:"", UID:"f475c534-3126-4484-b2ac-7f780fe28e12", ResourceVersion:"773", Generation:0, CreationTimestamp:time.Date(2025, time.January, 14, 13, 37, 59, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8555cc7446", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186.1.0-a-e83668d6e0", ContainerID:"", Pod:"calico-kube-controllers-8555cc7446-bv6h2", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.85.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali69ee0d2d57f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Jan 14 13:38:47.256583 containerd[1759]: 2025-01-14 13:38:47.213 [INFO][5659] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.85.6/32] ContainerID="ba19fd317e9eb782512644afa8e6e30206edf39733ebd883e97260b2a756eaa2" Namespace="calico-system" Pod="calico-kube-controllers-8555cc7446-bv6h2" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-calico--kube--controllers--8555cc7446--bv6h2-eth0"
Jan 14 13:38:47.256583 containerd[1759]: 2025-01-14 13:38:47.213 [INFO][5659] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali69ee0d2d57f ContainerID="ba19fd317e9eb782512644afa8e6e30206edf39733ebd883e97260b2a756eaa2" Namespace="calico-system" Pod="calico-kube-controllers-8555cc7446-bv6h2" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-calico--kube--controllers--8555cc7446--bv6h2-eth0"
Jan 14 13:38:47.256583 containerd[1759]: 2025-01-14 13:38:47.222 [INFO][5659] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ba19fd317e9eb782512644afa8e6e30206edf39733ebd883e97260b2a756eaa2" Namespace="calico-system" Pod="calico-kube-controllers-8555cc7446-bv6h2" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-calico--kube--controllers--8555cc7446--bv6h2-eth0"
Jan 14 13:38:47.256583 containerd[1759]: 2025-01-14 13:38:47.224 [INFO][5659] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="ba19fd317e9eb782512644afa8e6e30206edf39733ebd883e97260b2a756eaa2" Namespace="calico-system" Pod="calico-kube-controllers-8555cc7446-bv6h2" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-calico--kube--controllers--8555cc7446--bv6h2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186.1.0--a--e83668d6e0-k8s-calico--kube--controllers--8555cc7446--bv6h2-eth0", GenerateName:"calico-kube-controllers-8555cc7446-", Namespace:"calico-system", SelfLink:"", UID:"f475c534-3126-4484-b2ac-7f780fe28e12", ResourceVersion:"773", Generation:0, CreationTimestamp:time.Date(2025, time.January, 14, 13, 37, 59, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8555cc7446", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186.1.0-a-e83668d6e0", ContainerID:"ba19fd317e9eb782512644afa8e6e30206edf39733ebd883e97260b2a756eaa2", Pod:"calico-kube-controllers-8555cc7446-bv6h2", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.85.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali69ee0d2d57f", MAC:"2e:27:16:49:ef:27", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Jan 14 13:38:47.256583 containerd[1759]: 2025-01-14 13:38:47.243 [INFO][5659] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="ba19fd317e9eb782512644afa8e6e30206edf39733ebd883e97260b2a756eaa2" Namespace="calico-system" Pod="calico-kube-controllers-8555cc7446-bv6h2" WorkloadEndpoint="ci--4186.1.0--a--e83668d6e0-k8s-calico--kube--controllers--8555cc7446--bv6h2-eth0"
Jan 14 13:38:47.280960 containerd[1759]: time="2025-01-14T13:38:47.280315543Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-msgc9,Uid:c15a84fb-ac68-4acd-b385-a152a2911116,Namespace:kube-system,Attempt:7,} returns sandbox id \"d88fc5032b5fda138c2b2e84ece169a783ef4c3740ed0a5f2d598b1fec93b5c8\""
Jan 14 13:38:47.307338 containerd[1759]: time="2025-01-14T13:38:47.307293317Z" level=info msg="CreateContainer within sandbox \"d88fc5032b5fda138c2b2e84ece169a783ef4c3740ed0a5f2d598b1fec93b5c8\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
Jan 14 13:38:47.320215 containerd[1759]: time="2025-01-14T13:38:47.319871763Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-2fhlb,Uid:af2992f6-a261-4e1c-9cfc-1ef759086d8d,Namespace:kube-system,Attempt:7,} returns sandbox id \"ee004b77219c9492a1cb15a2be63988a89f7bfeef8f4d54ef7b57c5df0d40f04\""
Jan 14 13:38:47.330506 containerd[1759]: time="2025-01-14T13:38:47.330430209Z" level=info msg="CreateContainer within sandbox \"ee004b77219c9492a1cb15a2be63988a89f7bfeef8f4d54ef7b57c5df0d40f04\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
Jan 14 13:38:47.341247 containerd[1759]: time="2025-01-14T13:38:47.341188334Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-zcsrg,Uid:35d51e43-1c53-427a-a8d2-bd422d727c5b,Namespace:calico-apiserver,Attempt:7,} returns sandbox id \"95853ddbdc095492878251f449a5be1dfbbeac971b0e19ffd50fb9e4b952d1e5\""
Jan 14 13:38:47.343808 containerd[1759]: time="2025-01-14T13:38:47.343778335Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7bk8l,Uid:af70d9ef-1b20-4f9f-93e1-f55f680c58b4,Namespace:calico-system,Attempt:7,} returns sandbox id \"4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7\""
Jan 14 13:38:47.349355 containerd[1759]: time="2025-01-14T13:38:47.348706138Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\""
Jan 14 13:38:47.360330 containerd[1759]: time="2025-01-14T13:38:47.360284384Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7478669d4-9xx9c,Uid:f5f02a24-03f2-49f5-ae2a-c9d89b6de9c1,Namespace:calico-apiserver,Attempt:7,} returns sandbox id \"bdc029994400fed901d99dbf8bea2a850f1dc8c7fd7f138e702ec71b9321c24f\""
Jan 14 13:38:47.360652 containerd[1759]: time="2025-01-14T13:38:47.360207544Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 14 13:38:47.360652 containerd[1759]: time="2025-01-14T13:38:47.360272544Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 14 13:38:47.360652 containerd[1759]: time="2025-01-14T13:38:47.360284744Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 14 13:38:47.360652 containerd[1759]: time="2025-01-14T13:38:47.360359104Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 14 13:38:47.378350 containerd[1759]: time="2025-01-14T13:38:47.377483313Z" level=info msg="CreateContainer within sandbox \"d88fc5032b5fda138c2b2e84ece169a783ef4c3740ed0a5f2d598b1fec93b5c8\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"a84c76386ac27d3f7c716199d83dcfea56bd5bee691504869009176a035b7cce\""
Jan 14 13:38:47.380524 containerd[1759]: time="2025-01-14T13:38:47.379252114Z" level=info msg="StartContainer for \"a84c76386ac27d3f7c716199d83dcfea56bd5bee691504869009176a035b7cce\""
Jan 14 13:38:47.382637 systemd[1]: Started cri-containerd-ba19fd317e9eb782512644afa8e6e30206edf39733ebd883e97260b2a756eaa2.scope - libcontainer container ba19fd317e9eb782512644afa8e6e30206edf39733ebd883e97260b2a756eaa2.
Jan 14 13:38:47.413630 containerd[1759]: time="2025-01-14T13:38:47.413583211Z" level=info msg="CreateContainer within sandbox \"ee004b77219c9492a1cb15a2be63988a89f7bfeef8f4d54ef7b57c5df0d40f04\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"a8220876206102467e97ffa5612479d6ab23c65721e9235a27591d44153f89d5\""
Jan 14 13:38:47.414861 systemd[1]: Started cri-containerd-a84c76386ac27d3f7c716199d83dcfea56bd5bee691504869009176a035b7cce.scope - libcontainer container a84c76386ac27d3f7c716199d83dcfea56bd5bee691504869009176a035b7cce.
Jan 14 13:38:47.426958 containerd[1759]: time="2025-01-14T13:38:47.426912778Z" level=info msg="StartContainer for \"a8220876206102467e97ffa5612479d6ab23c65721e9235a27591d44153f89d5\""
Jan 14 13:38:47.460419 containerd[1759]: time="2025-01-14T13:38:47.460368435Z" level=info msg="StartContainer for \"a84c76386ac27d3f7c716199d83dcfea56bd5bee691504869009176a035b7cce\" returns successfully"
Jan 14 13:38:47.471327 systemd[1]: Started cri-containerd-a8220876206102467e97ffa5612479d6ab23c65721e9235a27591d44153f89d5.scope - libcontainer container a8220876206102467e97ffa5612479d6ab23c65721e9235a27591d44153f89d5.
Jan 14 13:38:47.502385 containerd[1759]: time="2025-01-14T13:38:47.501837536Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8555cc7446-bv6h2,Uid:f475c534-3126-4484-b2ac-7f780fe28e12,Namespace:calico-system,Attempt:7,} returns sandbox id \"ba19fd317e9eb782512644afa8e6e30206edf39733ebd883e97260b2a756eaa2\""
Jan 14 13:38:47.532053 containerd[1759]: time="2025-01-14T13:38:47.531499031Z" level=info msg="StartContainer for \"a8220876206102467e97ffa5612479d6ab23c65721e9235a27591d44153f89d5\" returns successfully"
Jan 14 13:38:48.018263 systemd-networkd[1331]: cali98d6c895af8: Gained IPv6LL
Jan 14 13:38:48.424828 kubelet[3442]: I0114 13:38:48.423863    3442 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/coredns-76f75df574-msgc9" podStartSLOduration=55.42382289 podStartE2EDuration="55.42382289s" podCreationTimestamp="2025-01-14 13:37:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-14 13:38:48.402731758 +0000 UTC m=+70.756676572" watchObservedRunningTime="2025-01-14 13:38:48.42382289 +0000 UTC m=+70.777767704"
Jan 14 13:38:48.594111 systemd-networkd[1331]: cali6b2f1bd0bdd: Gained IPv6LL
Jan 14 13:38:48.638017 kernel: bpftool[6235]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set
Jan 14 13:38:48.658192 systemd-networkd[1331]: calidc70f3b220d: Gained IPv6LL
Jan 14 13:38:48.786164 systemd-networkd[1331]: cali046d88438d6: Gained IPv6LL
Jan 14 13:38:48.817400 systemd-networkd[1331]: vxlan.calico: Link UP
Jan 14 13:38:48.817408 systemd-networkd[1331]: vxlan.calico: Gained carrier
Jan 14 13:38:49.107193 systemd-networkd[1331]: calid9f260195a8: Gained IPv6LL
Jan 14 13:38:49.108150 systemd-networkd[1331]: cali69ee0d2d57f: Gained IPv6LL
Jan 14 13:38:50.642188 systemd-networkd[1331]: vxlan.calico: Gained IPv6LL
Jan 14 13:38:55.375663 update_engine[1740]: I20250114 13:38:55.375607  1740 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs
Jan 14 13:38:55.375663 update_engine[1740]: I20250114 13:38:55.375658  1740 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs
Jan 14 13:38:55.376087 update_engine[1740]: I20250114 13:38:55.375927  1740 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs
Jan 14 13:38:55.376697 update_engine[1740]: I20250114 13:38:55.376665  1740 omaha_request_params.cc:62] Current group set to beta
Jan 14 13:38:55.376876 update_engine[1740]: I20250114 13:38:55.376754  1740 update_attempter.cc:499] Already updated boot flags. Skipping.
Jan 14 13:38:55.376876 update_engine[1740]: I20250114 13:38:55.376768  1740 update_attempter.cc:643] Scheduling an action processor start.
Jan 14 13:38:55.376876 update_engine[1740]: I20250114 13:38:55.376785  1740 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Jan 14 13:38:55.376876 update_engine[1740]: I20250114 13:38:55.376816  1740 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs
Jan 14 13:38:55.376876 update_engine[1740]: I20250114 13:38:55.376865  1740 omaha_request_action.cc:271] Posting an Omaha request to disabled
Jan 14 13:38:55.376876 update_engine[1740]: I20250114 13:38:55.376873  1740 omaha_request_action.cc:272] Request: <?xml version="1.0" encoding="UTF-8"?>
Jan 14 13:38:55.376876 update_engine[1740]: <request protocol="3.0" version="update_engine-0.4.10" updaterversion="update_engine-0.4.10" installsource="scheduler" ismachine="1">
Jan 14 13:38:55.376876 update_engine[1740]:     <os version="Chateau" platform="CoreOS" sp="4186.1.0_aarch64"></os>
Jan 14 13:38:55.376876 update_engine[1740]:     <app appid="{e96281a6-d1af-4bde-9a0a-97b76e56dc57}" version="4186.1.0" track="beta" bootid="{134d0750-e4c1-426c-8a00-974c69c8bae7}" oem="azure" oemversion="2.9.1.1-r3" alephversion="4186.1.0" machineid="085da10940fc43fdb77d0f10bdb20665" machinealias="" lang="en-US" board="arm64-usr" hardware_class="" delta_okay="false" >
Jan 14 13:38:55.376876 update_engine[1740]:         <ping active="1"></ping>
Jan 14 13:38:55.376876 update_engine[1740]:         <updatecheck></updatecheck>
Jan 14 13:38:55.376876 update_engine[1740]:         <event eventtype="3" eventresult="2" previousversion="0.0.0.0"></event>
Jan 14 13:38:55.376876 update_engine[1740]:     </app>
Jan 14 13:38:55.376876 update_engine[1740]: </request>
Jan 14 13:38:55.376876 update_engine[1740]: I20250114 13:38:55.376878  1740 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Jan 14 13:38:55.378129 locksmithd[1779]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0
Jan 14 13:38:55.380597 update_engine[1740]: I20250114 13:38:55.380291  1740 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Jan 14 13:38:55.380665 update_engine[1740]: I20250114 13:38:55.380585  1740 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Jan 14 13:38:55.479478 update_engine[1740]: E20250114 13:38:55.479421  1740 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Jan 14 13:38:55.479604 update_engine[1740]: I20250114 13:38:55.479521  1740 libcurl_http_fetcher.cc:283] No HTTP response, retry 1
Jan 14 13:39:04.186299 containerd[1759]: time="2025-01-14T13:39:04.186245331Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:39:04.189276 containerd[1759]: time="2025-01-14T13:39:04.189237093Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=39298409"
Jan 14 13:39:04.234489 containerd[1759]: time="2025-01-14T13:39:04.234430604Z" level=info msg="ImageCreate event name:\"sha256:5451b31bd8d0784796fa1204c4ec22975a270e21feadf2c5095fe41a38524c6c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:39:04.284257 containerd[1759]: time="2025-01-14T13:39:04.284179957Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:39:04.285406 containerd[1759]: time="2025-01-14T13:39:04.284927958Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:5451b31bd8d0784796fa1204c4ec22975a270e21feadf2c5095fe41a38524c6c\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"40668079\" in 16.9357857s"
Jan 14 13:39:04.285406 containerd[1759]: time="2025-01-14T13:39:04.284961278Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:5451b31bd8d0784796fa1204c4ec22975a270e21feadf2c5095fe41a38524c6c\""
Jan 14 13:39:04.285985 containerd[1759]: time="2025-01-14T13:39:04.285945118Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\""
Jan 14 13:39:04.286934 containerd[1759]: time="2025-01-14T13:39:04.286796479Z" level=info msg="CreateContainer within sandbox \"95853ddbdc095492878251f449a5be1dfbbeac971b0e19ffd50fb9e4b952d1e5\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Jan 14 13:39:04.585981 containerd[1759]: time="2025-01-14T13:39:04.585867729Z" level=info msg="CreateContainer within sandbox \"95853ddbdc095492878251f449a5be1dfbbeac971b0e19ffd50fb9e4b952d1e5\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"e56808dc6c89159121bef4fff174924a40191965d0397e5c8e3bf1ec0fd8e371\""
Jan 14 13:39:04.586870 containerd[1759]: time="2025-01-14T13:39:04.586512246Z" level=info msg="StartContainer for \"e56808dc6c89159121bef4fff174924a40191965d0397e5c8e3bf1ec0fd8e371\""
Jan 14 13:39:04.620146 systemd[1]: Started cri-containerd-e56808dc6c89159121bef4fff174924a40191965d0397e5c8e3bf1ec0fd8e371.scope - libcontainer container e56808dc6c89159121bef4fff174924a40191965d0397e5c8e3bf1ec0fd8e371.
Jan 14 13:39:04.658307 containerd[1759]: time="2025-01-14T13:39:04.658255127Z" level=info msg="StartContainer for \"e56808dc6c89159121bef4fff174924a40191965d0397e5c8e3bf1ec0fd8e371\" returns successfully"
Jan 14 13:39:05.371983 update_engine[1740]: I20250114 13:39:05.371917  1740 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Jan 14 13:39:05.372338 update_engine[1740]: I20250114 13:39:05.372150  1740 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Jan 14 13:39:05.372436 update_engine[1740]: I20250114 13:39:05.372382  1740 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Jan 14 13:39:05.472017 kubelet[3442]: I0114 13:39:05.471952    3442 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/coredns-76f75df574-2fhlb" podStartSLOduration=72.471914541 podStartE2EDuration="1m12.471914541s" podCreationTimestamp="2025-01-14 13:37:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-14 13:38:48.443145341 +0000 UTC m=+70.797090155" watchObservedRunningTime="2025-01-14 13:39:05.471914541 +0000 UTC m=+87.825859355"
Jan 14 13:39:05.475241 update_engine[1740]: E20250114 13:39:05.475181  1740 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Jan 14 13:39:05.475318 update_engine[1740]: I20250114 13:39:05.475260  1740 libcurl_http_fetcher.cc:283] No HTTP response, retry 2
Jan 14 13:39:05.866420 containerd[1759]: time="2025-01-14T13:39:05.865743946Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:39:05.870421 containerd[1759]: time="2025-01-14T13:39:05.870383165Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.1: active requests=0, bytes read=7464730"
Jan 14 13:39:05.881100 containerd[1759]: time="2025-01-14T13:39:05.881076517Z" level=info msg="ImageCreate event name:\"sha256:3c11734f3001b7070e7e2b5e64938f89891cf8c44f8997e86aa23c5d5bf70163\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:39:05.892157 containerd[1759]: time="2025-01-14T13:39:05.891930749Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:39:05.893248 containerd[1759]: time="2025-01-14T13:39:05.893150904Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.1\" with image id \"sha256:3c11734f3001b7070e7e2b5e64938f89891cf8c44f8997e86aa23c5d5bf70163\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\", size \"8834384\" in 1.607167506s"
Jan 14 13:39:05.893248 containerd[1759]: time="2025-01-14T13:39:05.893178024Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference \"sha256:3c11734f3001b7070e7e2b5e64938f89891cf8c44f8997e86aa23c5d5bf70163\""
Jan 14 13:39:05.894553 containerd[1759]: time="2025-01-14T13:39:05.894323738Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\""
Jan 14 13:39:05.896694 containerd[1759]: time="2025-01-14T13:39:05.896672448Z" level=info msg="CreateContainer within sandbox \"4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Jan 14 13:39:05.946197 containerd[1759]: time="2025-01-14T13:39:05.946165427Z" level=info msg="CreateContainer within sandbox \"4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"04fe137d90804e209d4976dca8e4b97016734f514d3a0338db635648b2ff208f\""
Jan 14 13:39:05.947831 containerd[1759]: time="2025-01-14T13:39:05.946855504Z" level=info msg="StartContainer for \"04fe137d90804e209d4976dca8e4b97016734f514d3a0338db635648b2ff208f\""
Jan 14 13:39:05.980128 systemd[1]: Started cri-containerd-04fe137d90804e209d4976dca8e4b97016734f514d3a0338db635648b2ff208f.scope - libcontainer container 04fe137d90804e209d4976dca8e4b97016734f514d3a0338db635648b2ff208f.
Jan 14 13:39:06.012059 containerd[1759]: time="2025-01-14T13:39:06.012026254Z" level=info msg="StartContainer for \"04fe137d90804e209d4976dca8e4b97016734f514d3a0338db635648b2ff208f\" returns successfully"
Jan 14 13:39:06.259883 containerd[1759]: time="2025-01-14T13:39:06.259779811Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:39:06.263834 containerd[1759]: time="2025-01-14T13:39:06.263364333Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=77"
Jan 14 13:39:06.267865 containerd[1759]: time="2025-01-14T13:39:06.267827176Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:5451b31bd8d0784796fa1204c4ec22975a270e21feadf2c5095fe41a38524c6c\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"40668079\" in 373.477198ms"
Jan 14 13:39:06.270033 containerd[1759]: time="2025-01-14T13:39:06.270017177Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:5451b31bd8d0784796fa1204c4ec22975a270e21feadf2c5095fe41a38524c6c\""
Jan 14 13:39:06.271363 containerd[1759]: time="2025-01-14T13:39:06.271176738Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\""
Jan 14 13:39:06.272568 containerd[1759]: time="2025-01-14T13:39:06.272546339Z" level=info msg="CreateContainer within sandbox \"bdc029994400fed901d99dbf8bea2a850f1dc8c7fd7f138e702ec71b9321c24f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Jan 14 13:39:06.325355 containerd[1759]: time="2025-01-14T13:39:06.325261294Z" level=info msg="CreateContainer within sandbox \"bdc029994400fed901d99dbf8bea2a850f1dc8c7fd7f138e702ec71b9321c24f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"6daacdabbc4a61e9f086ee83a53045adacb25fac23ef6c2068ad6422a433d119\""
Jan 14 13:39:06.326595 containerd[1759]: time="2025-01-14T13:39:06.326562014Z" level=info msg="StartContainer for \"6daacdabbc4a61e9f086ee83a53045adacb25fac23ef6c2068ad6422a433d119\""
Jan 14 13:39:06.356167 systemd[1]: Started cri-containerd-6daacdabbc4a61e9f086ee83a53045adacb25fac23ef6c2068ad6422a433d119.scope - libcontainer container 6daacdabbc4a61e9f086ee83a53045adacb25fac23ef6c2068ad6422a433d119.
Jan 14 13:39:06.404729 containerd[1759]: time="2025-01-14T13:39:06.404135345Z" level=info msg="StartContainer for \"6daacdabbc4a61e9f086ee83a53045adacb25fac23ef6c2068ad6422a433d119\" returns successfully"
Jan 14 13:39:06.475729 kubelet[3442]: I0114 13:39:06.475694    3442 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 14 13:39:06.523623 kubelet[3442]: I0114 13:39:06.523144    3442 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7478669d4-zcsrg" podStartSLOduration=50.584503042 podStartE2EDuration="1m7.523100863s" podCreationTimestamp="2025-01-14 13:37:59 +0000 UTC" firstStartedPulling="2025-01-14 13:38:47.346646457 +0000 UTC m=+69.700591271" lastFinishedPulling="2025-01-14 13:39:04.285244278 +0000 UTC m=+86.639189092" observedRunningTime="2025-01-14 13:39:05.473163015 +0000 UTC m=+87.827107829" watchObservedRunningTime="2025-01-14 13:39:06.523100863 +0000 UTC m=+88.877045677"
Jan 14 13:39:07.478342 kubelet[3442]: I0114 13:39:07.478294    3442 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 14 13:39:09.386856 containerd[1759]: time="2025-01-14T13:39:09.386796863Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:39:09.392307 containerd[1759]: time="2025-01-14T13:39:09.389761945Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.1: active requests=0, bytes read=31953828"
Jan 14 13:39:09.397745 containerd[1759]: time="2025-01-14T13:39:09.397149830Z" level=info msg="ImageCreate event name:\"sha256:32c335fdb9d757e7ba6a76a9cfa8d292a5a229101ae7ea37b42f53c28adf2db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:39:09.404485 containerd[1759]: time="2025-01-14T13:39:09.404435595Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:39:09.405432 containerd[1759]: time="2025-01-14T13:39:09.405404195Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" with image id \"sha256:32c335fdb9d757e7ba6a76a9cfa8d292a5a229101ae7ea37b42f53c28adf2db1\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\", size \"33323450\" in 3.134195257s"
Jan 14 13:39:09.405539 containerd[1759]: time="2025-01-14T13:39:09.405523715Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" returns image reference \"sha256:32c335fdb9d757e7ba6a76a9cfa8d292a5a229101ae7ea37b42f53c28adf2db1\""
Jan 14 13:39:09.407667 containerd[1759]: time="2025-01-14T13:39:09.406145876Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\""
Jan 14 13:39:09.423841 containerd[1759]: time="2025-01-14T13:39:09.423397767Z" level=info msg="CreateContainer within sandbox \"ba19fd317e9eb782512644afa8e6e30206edf39733ebd883e97260b2a756eaa2\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Jan 14 13:39:09.464051 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1442204810.mount: Deactivated successfully.
Jan 14 13:39:09.474612 kubelet[3442]: I0114 13:39:09.474576    3442 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7478669d4-9xx9c" podStartSLOduration=52.566278128 podStartE2EDuration="1m11.474522841s" podCreationTimestamp="2025-01-14 13:37:58 +0000 UTC" firstStartedPulling="2025-01-14 13:38:47.362137345 +0000 UTC m=+69.716082159" lastFinishedPulling="2025-01-14 13:39:06.270382058 +0000 UTC m=+88.624326872" observedRunningTime="2025-01-14 13:39:06.525154345 +0000 UTC m=+88.879099159" watchObservedRunningTime="2025-01-14 13:39:09.474522841 +0000 UTC m=+91.828467655"
Jan 14 13:39:09.475708 containerd[1759]: time="2025-01-14T13:39:09.475204521Z" level=info msg="CreateContainer within sandbox \"ba19fd317e9eb782512644afa8e6e30206edf39733ebd883e97260b2a756eaa2\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"354c2dab58c75eb832be4ddb417f03b648da284c7c07951f60ee20295200c07a\""
Jan 14 13:39:09.476649 containerd[1759]: time="2025-01-14T13:39:09.476477722Z" level=info msg="StartContainer for \"354c2dab58c75eb832be4ddb417f03b648da284c7c07951f60ee20295200c07a\""
Jan 14 13:39:09.511205 systemd[1]: Started cri-containerd-354c2dab58c75eb832be4ddb417f03b648da284c7c07951f60ee20295200c07a.scope - libcontainer container 354c2dab58c75eb832be4ddb417f03b648da284c7c07951f60ee20295200c07a.
Jan 14 13:39:09.546428 containerd[1759]: time="2025-01-14T13:39:09.546389088Z" level=info msg="StartContainer for \"354c2dab58c75eb832be4ddb417f03b648da284c7c07951f60ee20295200c07a\" returns successfully"
Jan 14 13:39:10.515172 kubelet[3442]: I0114 13:39:10.514199    3442 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-8555cc7446-bv6h2" podStartSLOduration=49.614904946 podStartE2EDuration="1m11.514161243s" podCreationTimestamp="2025-01-14 13:37:59 +0000 UTC" firstStartedPulling="2025-01-14 13:38:47.506656499 +0000 UTC m=+69.860601273" lastFinishedPulling="2025-01-14 13:39:09.405912756 +0000 UTC m=+91.759857570" observedRunningTime="2025-01-14 13:39:10.510833881 +0000 UTC m=+92.864778695" watchObservedRunningTime="2025-01-14 13:39:10.514161243 +0000 UTC m=+92.868106057"
Jan 14 13:39:11.109494 containerd[1759]: time="2025-01-14T13:39:11.109438994Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:39:11.118203 containerd[1759]: time="2025-01-14T13:39:11.111646475Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1: active requests=0, bytes read=9883368"
Jan 14 13:39:11.118283 containerd[1759]: time="2025-01-14T13:39:11.116545118Z" level=info msg="ImageCreate event name:\"sha256:3eb557f7694f230afd24a75a691bcda4c0a7bfe87a981386dcd4ecf2b0701349\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:39:11.122395 containerd[1759]: time="2025-01-14T13:39:11.122320242Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 14 13:39:11.123214 containerd[1759]: time="2025-01-14T13:39:11.122903922Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" with image id \"sha256:3eb557f7694f230afd24a75a691bcda4c0a7bfe87a981386dcd4ecf2b0701349\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\", size \"11252974\" in 1.716725086s"
Jan 14 13:39:11.123214 containerd[1759]: time="2025-01-14T13:39:11.122937083Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image reference \"sha256:3eb557f7694f230afd24a75a691bcda4c0a7bfe87a981386dcd4ecf2b0701349\""
Jan 14 13:39:11.126288 containerd[1759]: time="2025-01-14T13:39:11.126241565Z" level=info msg="CreateContainer within sandbox \"4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Jan 14 13:39:11.178025 containerd[1759]: time="2025-01-14T13:39:11.177965999Z" level=info msg="CreateContainer within sandbox \"4c337d882b222e2839b90ee842cff49d14384a941b62c5002c562a8c62ab81c7\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"def9953990b5864c29ebc88d95d6c6158e6c2881abf07b0f319a819639075f1e\""
Jan 14 13:39:11.178707 containerd[1759]: time="2025-01-14T13:39:11.178506679Z" level=info msg="StartContainer for \"def9953990b5864c29ebc88d95d6c6158e6c2881abf07b0f319a819639075f1e\""
Jan 14 13:39:11.209165 systemd[1]: Started cri-containerd-def9953990b5864c29ebc88d95d6c6158e6c2881abf07b0f319a819639075f1e.scope - libcontainer container def9953990b5864c29ebc88d95d6c6158e6c2881abf07b0f319a819639075f1e.
Jan 14 13:39:11.241362 containerd[1759]: time="2025-01-14T13:39:11.241312160Z" level=info msg="StartContainer for \"def9953990b5864c29ebc88d95d6c6158e6c2881abf07b0f319a819639075f1e\" returns successfully"
Jan 14 13:39:11.521919 kubelet[3442]: I0114 13:39:11.521487    3442 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/csi-node-driver-7bk8l" podStartSLOduration=48.747328879 podStartE2EDuration="1m12.521423664s" podCreationTimestamp="2025-01-14 13:37:59 +0000 UTC" firstStartedPulling="2025-01-14 13:38:47.349116498 +0000 UTC m=+69.703061312" lastFinishedPulling="2025-01-14 13:39:11.123211283 +0000 UTC m=+93.477156097" observedRunningTime="2025-01-14 13:39:11.520956064 +0000 UTC m=+93.874900878" watchObservedRunningTime="2025-01-14 13:39:11.521423664 +0000 UTC m=+93.875368478"
Jan 14 13:39:11.884160 kubelet[3442]: I0114 13:39:11.884124    3442 csi_plugin.go:99] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Jan 14 13:39:11.884160 kubelet[3442]: I0114 13:39:11.884162    3442 csi_plugin.go:112] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Jan 14 13:39:15.371910 update_engine[1740]: I20250114 13:39:15.371843  1740 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Jan 14 13:39:15.372282 update_engine[1740]: I20250114 13:39:15.372088  1740 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Jan 14 13:39:15.372317 update_engine[1740]: I20250114 13:39:15.372303  1740 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Jan 14 13:39:15.377682 update_engine[1740]: E20250114 13:39:15.377630  1740 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Jan 14 13:39:15.377896 update_engine[1740]: I20250114 13:39:15.377848  1740 libcurl_http_fetcher.cc:283] No HTTP response, retry 3
Jan 14 13:39:19.654158 kubelet[3442]: I0114 13:39:19.653930    3442 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 14 13:39:25.381511 update_engine[1740]: I20250114 13:39:25.381238  1740 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Jan 14 13:39:25.381858 update_engine[1740]: I20250114 13:39:25.381554  1740 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Jan 14 13:39:25.381858 update_engine[1740]: I20250114 13:39:25.381784  1740 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Jan 14 13:39:25.423999 update_engine[1740]: E20250114 13:39:25.423942  1740 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Jan 14 13:39:25.424122 update_engine[1740]: I20250114 13:39:25.424046  1740 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Jan 14 13:39:25.424122 update_engine[1740]: I20250114 13:39:25.424055  1740 omaha_request_action.cc:617] Omaha request response:
Jan 14 13:39:25.424169 update_engine[1740]: E20250114 13:39:25.424132  1740 omaha_request_action.cc:636] Omaha request network transfer failed.
Jan 14 13:39:25.424169 update_engine[1740]: I20250114 13:39:25.424150  1740 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing.
Jan 14 13:39:25.424169 update_engine[1740]: I20250114 13:39:25.424155  1740 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Jan 14 13:39:25.424169 update_engine[1740]: I20250114 13:39:25.424160  1740 update_attempter.cc:306] Processing Done.
Jan 14 13:39:25.424243 update_engine[1740]: E20250114 13:39:25.424173  1740 update_attempter.cc:619] Update failed.
Jan 14 13:39:25.424243 update_engine[1740]: I20250114 13:39:25.424178  1740 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse
Jan 14 13:39:25.424243 update_engine[1740]: I20250114 13:39:25.424182  1740 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse)
Jan 14 13:39:25.424243 update_engine[1740]: I20250114 13:39:25.424187  1740 payload_state.cc:103] Ignoring failures until we get a valid Omaha response.
Jan 14 13:39:25.424337 update_engine[1740]: I20250114 13:39:25.424252  1740 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Jan 14 13:39:25.424337 update_engine[1740]: I20250114 13:39:25.424271  1740 omaha_request_action.cc:271] Posting an Omaha request to disabled
Jan 14 13:39:25.424337 update_engine[1740]: I20250114 13:39:25.424276  1740 omaha_request_action.cc:272] Request: <?xml version="1.0" encoding="UTF-8"?>
Jan 14 13:39:25.424337 update_engine[1740]: <request protocol="3.0" version="update_engine-0.4.10" updaterversion="update_engine-0.4.10" installsource="scheduler" ismachine="1">
Jan 14 13:39:25.424337 update_engine[1740]:     <os version="Chateau" platform="CoreOS" sp="4186.1.0_aarch64"></os>
Jan 14 13:39:25.424337 update_engine[1740]:     <app appid="{e96281a6-d1af-4bde-9a0a-97b76e56dc57}" version="4186.1.0" track="beta" bootid="{134d0750-e4c1-426c-8a00-974c69c8bae7}" oem="azure" oemversion="2.9.1.1-r3" alephversion="4186.1.0" machineid="085da10940fc43fdb77d0f10bdb20665" machinealias="" lang="en-US" board="arm64-usr" hardware_class="" delta_okay="false" >
Jan 14 13:39:25.424337 update_engine[1740]:         <event eventtype="3" eventresult="0" errorcode="268437456"></event>
Jan 14 13:39:25.424337 update_engine[1740]:     </app>
Jan 14 13:39:25.424337 update_engine[1740]: </request>
Jan 14 13:39:25.424337 update_engine[1740]: I20250114 13:39:25.424283  1740 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Jan 14 13:39:25.424506 update_engine[1740]: I20250114 13:39:25.424418  1740 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Jan 14 13:39:25.424771 update_engine[1740]: I20250114 13:39:25.424633  1740 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Jan 14 13:39:25.424824 locksmithd[1779]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0
Jan 14 13:39:25.432513 update_engine[1740]: E20250114 13:39:25.432474  1740 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Jan 14 13:39:25.432573 update_engine[1740]: I20250114 13:39:25.432538  1740 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Jan 14 13:39:25.432573 update_engine[1740]: I20250114 13:39:25.432545  1740 omaha_request_action.cc:617] Omaha request response:
Jan 14 13:39:25.432573 update_engine[1740]: I20250114 13:39:25.432551  1740 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Jan 14 13:39:25.432573 update_engine[1740]: I20250114 13:39:25.432556  1740 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Jan 14 13:39:25.432573 update_engine[1740]: I20250114 13:39:25.432560  1740 update_attempter.cc:306] Processing Done.
Jan 14 13:39:25.432573 update_engine[1740]: I20250114 13:39:25.432566  1740 update_attempter.cc:310] Error event sent.
Jan 14 13:39:25.432936 update_engine[1740]: I20250114 13:39:25.432574  1740 update_check_scheduler.cc:74] Next update check in 47m51s
Jan 14 13:39:25.432961 locksmithd[1779]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0
Jan 14 13:39:27.653786 kubelet[3442]: I0114 13:39:27.653541    3442 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 14 13:39:37.749525 containerd[1759]: time="2025-01-14T13:39:37.749342099Z" level=info msg="StopPodSandbox for \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\""
Jan 14 13:39:37.749525 containerd[1759]: time="2025-01-14T13:39:37.749448139Z" level=info msg="TearDown network for sandbox \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\" successfully"
Jan 14 13:39:37.749525 containerd[1759]: time="2025-01-14T13:39:37.749460139Z" level=info msg="StopPodSandbox for \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\" returns successfully"
Jan 14 13:39:37.750352 containerd[1759]: time="2025-01-14T13:39:37.750078260Z" level=info msg="RemovePodSandbox for \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\""
Jan 14 13:39:37.750352 containerd[1759]: time="2025-01-14T13:39:37.750105660Z" level=info msg="Forcibly stopping sandbox \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\""
Jan 14 13:39:37.750352 containerd[1759]: time="2025-01-14T13:39:37.750169100Z" level=info msg="TearDown network for sandbox \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\" successfully"
Jan 14 13:39:38.274239 containerd[1759]: time="2025-01-14T13:39:38.274174385Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.274374 containerd[1759]: time="2025-01-14T13:39:38.274261265Z" level=info msg="RemovePodSandbox \"3b48cb5ad11a0ea9b48bdf590d17efbae330d525642eff01297f7a30fca1eb0e\" returns successfully"
Jan 14 13:39:38.275264 containerd[1759]: time="2025-01-14T13:39:38.275184425Z" level=info msg="StopPodSandbox for \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\""
Jan 14 13:39:38.275538 containerd[1759]: time="2025-01-14T13:39:38.275513025Z" level=info msg="TearDown network for sandbox \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\" successfully"
Jan 14 13:39:38.275538 containerd[1759]: time="2025-01-14T13:39:38.275535785Z" level=info msg="StopPodSandbox for \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\" returns successfully"
Jan 14 13:39:38.276272 containerd[1759]: time="2025-01-14T13:39:38.276101386Z" level=info msg="RemovePodSandbox for \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\""
Jan 14 13:39:38.276272 containerd[1759]: time="2025-01-14T13:39:38.276153146Z" level=info msg="Forcibly stopping sandbox \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\""
Jan 14 13:39:38.276272 containerd[1759]: time="2025-01-14T13:39:38.276226346Z" level=info msg="TearDown network for sandbox \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\" successfully"
Jan 14 13:39:38.291867 containerd[1759]: time="2025-01-14T13:39:38.291747032Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.291867 containerd[1759]: time="2025-01-14T13:39:38.291810832Z" level=info msg="RemovePodSandbox \"e6726e17c5586b3756a91d7d739855baa63d9056dd68f2c725c1822c63dbdd3b\" returns successfully"
Jan 14 13:39:38.292241 containerd[1759]: time="2025-01-14T13:39:38.292227192Z" level=info msg="StopPodSandbox for \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\""
Jan 14 13:39:38.292344 containerd[1759]: time="2025-01-14T13:39:38.292318472Z" level=info msg="TearDown network for sandbox \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\" successfully"
Jan 14 13:39:38.292344 containerd[1759]: time="2025-01-14T13:39:38.292335952Z" level=info msg="StopPodSandbox for \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\" returns successfully"
Jan 14 13:39:38.292922 containerd[1759]: time="2025-01-14T13:39:38.292593072Z" level=info msg="RemovePodSandbox for \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\""
Jan 14 13:39:38.292922 containerd[1759]: time="2025-01-14T13:39:38.292618072Z" level=info msg="Forcibly stopping sandbox \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\""
Jan 14 13:39:38.292922 containerd[1759]: time="2025-01-14T13:39:38.292679072Z" level=info msg="TearDown network for sandbox \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\" successfully"
Jan 14 13:39:38.300966 containerd[1759]: time="2025-01-14T13:39:38.300923155Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.301273 containerd[1759]: time="2025-01-14T13:39:38.301009515Z" level=info msg="RemovePodSandbox \"e60362369f533a28e9b22016d264b6e2880d2c6f2127122dafd0732fa911c245\" returns successfully"
Jan 14 13:39:38.301780 containerd[1759]: time="2025-01-14T13:39:38.301597916Z" level=info msg="StopPodSandbox for \"60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92\""
Jan 14 13:39:38.301780 containerd[1759]: time="2025-01-14T13:39:38.301701196Z" level=info msg="TearDown network for sandbox \"60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92\" successfully"
Jan 14 13:39:38.301780 containerd[1759]: time="2025-01-14T13:39:38.301712876Z" level=info msg="StopPodSandbox for \"60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92\" returns successfully"
Jan 14 13:39:38.302697 containerd[1759]: time="2025-01-14T13:39:38.302030796Z" level=info msg="RemovePodSandbox for \"60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92\""
Jan 14 13:39:38.302697 containerd[1759]: time="2025-01-14T13:39:38.302073756Z" level=info msg="Forcibly stopping sandbox \"60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92\""
Jan 14 13:39:38.302697 containerd[1759]: time="2025-01-14T13:39:38.302141796Z" level=info msg="TearDown network for sandbox \"60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92\" successfully"
Jan 14 13:39:38.307955 containerd[1759]: time="2025-01-14T13:39:38.307908278Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.308038 containerd[1759]: time="2025-01-14T13:39:38.308012798Z" level=info msg="RemovePodSandbox \"60ee7281c28f2bafe3d69ebb470f10c50bdf72d8ef2c898ac0e100ba49d88e92\" returns successfully"
Jan 14 13:39:38.308565 containerd[1759]: time="2025-01-14T13:39:38.308505798Z" level=info msg="StopPodSandbox for \"5624843d13cc48752b4f9f6b4233747dc115df765e427c6b3220b7e716612083\""
Jan 14 13:39:38.308860 containerd[1759]: time="2025-01-14T13:39:38.308749758Z" level=info msg="TearDown network for sandbox \"5624843d13cc48752b4f9f6b4233747dc115df765e427c6b3220b7e716612083\" successfully"
Jan 14 13:39:38.308860 containerd[1759]: time="2025-01-14T13:39:38.308767679Z" level=info msg="StopPodSandbox for \"5624843d13cc48752b4f9f6b4233747dc115df765e427c6b3220b7e716612083\" returns successfully"
Jan 14 13:39:38.309213 containerd[1759]: time="2025-01-14T13:39:38.309186719Z" level=info msg="RemovePodSandbox for \"5624843d13cc48752b4f9f6b4233747dc115df765e427c6b3220b7e716612083\""
Jan 14 13:39:38.309282 containerd[1759]: time="2025-01-14T13:39:38.309221519Z" level=info msg="Forcibly stopping sandbox \"5624843d13cc48752b4f9f6b4233747dc115df765e427c6b3220b7e716612083\""
Jan 14 13:39:38.309310 containerd[1759]: time="2025-01-14T13:39:38.309282279Z" level=info msg="TearDown network for sandbox \"5624843d13cc48752b4f9f6b4233747dc115df765e427c6b3220b7e716612083\" successfully"
Jan 14 13:39:38.316547 containerd[1759]: time="2025-01-14T13:39:38.316409601Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5624843d13cc48752b4f9f6b4233747dc115df765e427c6b3220b7e716612083\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.316547 containerd[1759]: time="2025-01-14T13:39:38.316466882Z" level=info msg="RemovePodSandbox \"5624843d13cc48752b4f9f6b4233747dc115df765e427c6b3220b7e716612083\" returns successfully"
Jan 14 13:39:38.317012 containerd[1759]: time="2025-01-14T13:39:38.316803562Z" level=info msg="StopPodSandbox for \"1d31ee1f06b5ba8e5167d75dc1363c0587f7ded1ed0460416729f6a6cf73d2d1\""
Jan 14 13:39:38.317012 containerd[1759]: time="2025-01-14T13:39:38.316934562Z" level=info msg="TearDown network for sandbox \"1d31ee1f06b5ba8e5167d75dc1363c0587f7ded1ed0460416729f6a6cf73d2d1\" successfully"
Jan 14 13:39:38.317012 containerd[1759]: time="2025-01-14T13:39:38.316945282Z" level=info msg="StopPodSandbox for \"1d31ee1f06b5ba8e5167d75dc1363c0587f7ded1ed0460416729f6a6cf73d2d1\" returns successfully"
Jan 14 13:39:38.317193 containerd[1759]: time="2025-01-14T13:39:38.317161762Z" level=info msg="RemovePodSandbox for \"1d31ee1f06b5ba8e5167d75dc1363c0587f7ded1ed0460416729f6a6cf73d2d1\""
Jan 14 13:39:38.317193 containerd[1759]: time="2025-01-14T13:39:38.317190802Z" level=info msg="Forcibly stopping sandbox \"1d31ee1f06b5ba8e5167d75dc1363c0587f7ded1ed0460416729f6a6cf73d2d1\""
Jan 14 13:39:38.317268 containerd[1759]: time="2025-01-14T13:39:38.317249242Z" level=info msg="TearDown network for sandbox \"1d31ee1f06b5ba8e5167d75dc1363c0587f7ded1ed0460416729f6a6cf73d2d1\" successfully"
Jan 14 13:39:38.324741 containerd[1759]: time="2025-01-14T13:39:38.324692245Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1d31ee1f06b5ba8e5167d75dc1363c0587f7ded1ed0460416729f6a6cf73d2d1\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.324870 containerd[1759]: time="2025-01-14T13:39:38.324753325Z" level=info msg="RemovePodSandbox \"1d31ee1f06b5ba8e5167d75dc1363c0587f7ded1ed0460416729f6a6cf73d2d1\" returns successfully"
Jan 14 13:39:38.325224 containerd[1759]: time="2025-01-14T13:39:38.325070365Z" level=info msg="StopPodSandbox for \"9073c3e5084e944117cf74bb7005b63d11f0eb4631b6b564b21227b06a20970a\""
Jan 14 13:39:38.325224 containerd[1759]: time="2025-01-14T13:39:38.325151565Z" level=info msg="TearDown network for sandbox \"9073c3e5084e944117cf74bb7005b63d11f0eb4631b6b564b21227b06a20970a\" successfully"
Jan 14 13:39:38.325224 containerd[1759]: time="2025-01-14T13:39:38.325161045Z" level=info msg="StopPodSandbox for \"9073c3e5084e944117cf74bb7005b63d11f0eb4631b6b564b21227b06a20970a\" returns successfully"
Jan 14 13:39:38.326018 containerd[1759]: time="2025-01-14T13:39:38.325514725Z" level=info msg="RemovePodSandbox for \"9073c3e5084e944117cf74bb7005b63d11f0eb4631b6b564b21227b06a20970a\""
Jan 14 13:39:38.326018 containerd[1759]: time="2025-01-14T13:39:38.325544405Z" level=info msg="Forcibly stopping sandbox \"9073c3e5084e944117cf74bb7005b63d11f0eb4631b6b564b21227b06a20970a\""
Jan 14 13:39:38.326018 containerd[1759]: time="2025-01-14T13:39:38.325606805Z" level=info msg="TearDown network for sandbox \"9073c3e5084e944117cf74bb7005b63d11f0eb4631b6b564b21227b06a20970a\" successfully"
Jan 14 13:39:38.333259 containerd[1759]: time="2025-01-14T13:39:38.333206928Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"9073c3e5084e944117cf74bb7005b63d11f0eb4631b6b564b21227b06a20970a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.333324 containerd[1759]: time="2025-01-14T13:39:38.333305808Z" level=info msg="RemovePodSandbox \"9073c3e5084e944117cf74bb7005b63d11f0eb4631b6b564b21227b06a20970a\" returns successfully"
Jan 14 13:39:38.333901 containerd[1759]: time="2025-01-14T13:39:38.333633648Z" level=info msg="StopPodSandbox for \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\""
Jan 14 13:39:38.333901 containerd[1759]: time="2025-01-14T13:39:38.333714768Z" level=info msg="TearDown network for sandbox \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\" successfully"
Jan 14 13:39:38.333901 containerd[1759]: time="2025-01-14T13:39:38.333723808Z" level=info msg="StopPodSandbox for \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\" returns successfully"
Jan 14 13:39:38.334052 containerd[1759]: time="2025-01-14T13:39:38.333962568Z" level=info msg="RemovePodSandbox for \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\""
Jan 14 13:39:38.334052 containerd[1759]: time="2025-01-14T13:39:38.333985168Z" level=info msg="Forcibly stopping sandbox \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\""
Jan 14 13:39:38.334098 containerd[1759]: time="2025-01-14T13:39:38.334067848Z" level=info msg="TearDown network for sandbox \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\" successfully"
Jan 14 13:39:38.345946 containerd[1759]: time="2025-01-14T13:39:38.345896453Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.346045 containerd[1759]: time="2025-01-14T13:39:38.345963613Z" level=info msg="RemovePodSandbox \"7e7608342da6dd9403bb85605ac8842544e71d9e1d741ba7bbc870e3d4e236f9\" returns successfully"
Jan 14 13:39:38.346651 containerd[1759]: time="2025-01-14T13:39:38.346417333Z" level=info msg="StopPodSandbox for \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\""
Jan 14 13:39:38.346651 containerd[1759]: time="2025-01-14T13:39:38.346495773Z" level=info msg="TearDown network for sandbox \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\" successfully"
Jan 14 13:39:38.346651 containerd[1759]: time="2025-01-14T13:39:38.346504693Z" level=info msg="StopPodSandbox for \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\" returns successfully"
Jan 14 13:39:38.346920 containerd[1759]: time="2025-01-14T13:39:38.346894413Z" level=info msg="RemovePodSandbox for \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\""
Jan 14 13:39:38.346955 containerd[1759]: time="2025-01-14T13:39:38.346924773Z" level=info msg="Forcibly stopping sandbox \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\""
Jan 14 13:39:38.347023 containerd[1759]: time="2025-01-14T13:39:38.346985253Z" level=info msg="TearDown network for sandbox \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\" successfully"
Jan 14 13:39:38.356828 containerd[1759]: time="2025-01-14T13:39:38.356783537Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.356911 containerd[1759]: time="2025-01-14T13:39:38.356849097Z" level=info msg="RemovePodSandbox \"1b0d823e025a79885d654307b1364345649a31d4b9e4b0bf0f2470037b962824\" returns successfully"
Jan 14 13:39:38.357449 containerd[1759]: time="2025-01-14T13:39:38.357195617Z" level=info msg="StopPodSandbox for \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\""
Jan 14 13:39:38.357449 containerd[1759]: time="2025-01-14T13:39:38.357277938Z" level=info msg="TearDown network for sandbox \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\" successfully"
Jan 14 13:39:38.357449 containerd[1759]: time="2025-01-14T13:39:38.357287458Z" level=info msg="StopPodSandbox for \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\" returns successfully"
Jan 14 13:39:38.357596 containerd[1759]: time="2025-01-14T13:39:38.357518018Z" level=info msg="RemovePodSandbox for \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\""
Jan 14 13:39:38.357596 containerd[1759]: time="2025-01-14T13:39:38.357543338Z" level=info msg="Forcibly stopping sandbox \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\""
Jan 14 13:39:38.357653 containerd[1759]: time="2025-01-14T13:39:38.357600018Z" level=info msg="TearDown network for sandbox \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\" successfully"
Jan 14 13:39:38.368983 containerd[1759]: time="2025-01-14T13:39:38.368938742Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.369218 containerd[1759]: time="2025-01-14T13:39:38.369020702Z" level=info msg="RemovePodSandbox \"b2088de595417c6322eabeda96d10374253ed65b0a6fe14394b237707e3a935d\" returns successfully"
Jan 14 13:39:38.369603 containerd[1759]: time="2025-01-14T13:39:38.369418662Z" level=info msg="StopPodSandbox for \"a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d\""
Jan 14 13:39:38.369603 containerd[1759]: time="2025-01-14T13:39:38.369503022Z" level=info msg="TearDown network for sandbox \"a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d\" successfully"
Jan 14 13:39:38.369603 containerd[1759]: time="2025-01-14T13:39:38.369513262Z" level=info msg="StopPodSandbox for \"a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d\" returns successfully"
Jan 14 13:39:38.369841 containerd[1759]: time="2025-01-14T13:39:38.369817142Z" level=info msg="RemovePodSandbox for \"a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d\""
Jan 14 13:39:38.369955 containerd[1759]: time="2025-01-14T13:39:38.369940462Z" level=info msg="Forcibly stopping sandbox \"a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d\""
Jan 14 13:39:38.370362 containerd[1759]: time="2025-01-14T13:39:38.370069183Z" level=info msg="TearDown network for sandbox \"a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d\" successfully"
Jan 14 13:39:38.378500 containerd[1759]: time="2025-01-14T13:39:38.378467226Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.378643 containerd[1759]: time="2025-01-14T13:39:38.378628666Z" level=info msg="RemovePodSandbox \"a75eddb392264b219e530b544d47b67b74e00edd2a8f8b7106abdc8fbbbcfe2d\" returns successfully"
Jan 14 13:39:38.379148 containerd[1759]: time="2025-01-14T13:39:38.379121826Z" level=info msg="StopPodSandbox for \"ea32c5d5841c18b721b720c17a89e0f12374a58e62e420be87dbcd53c441bb77\""
Jan 14 13:39:38.379267 containerd[1759]: time="2025-01-14T13:39:38.379243946Z" level=info msg="TearDown network for sandbox \"ea32c5d5841c18b721b720c17a89e0f12374a58e62e420be87dbcd53c441bb77\" successfully"
Jan 14 13:39:38.379267 containerd[1759]: time="2025-01-14T13:39:38.379261546Z" level=info msg="StopPodSandbox for \"ea32c5d5841c18b721b720c17a89e0f12374a58e62e420be87dbcd53c441bb77\" returns successfully"
Jan 14 13:39:38.379619 containerd[1759]: time="2025-01-14T13:39:38.379546586Z" level=info msg="RemovePodSandbox for \"ea32c5d5841c18b721b720c17a89e0f12374a58e62e420be87dbcd53c441bb77\""
Jan 14 13:39:38.379619 containerd[1759]: time="2025-01-14T13:39:38.379572986Z" level=info msg="Forcibly stopping sandbox \"ea32c5d5841c18b721b720c17a89e0f12374a58e62e420be87dbcd53c441bb77\""
Jan 14 13:39:38.379732 containerd[1759]: time="2025-01-14T13:39:38.379635266Z" level=info msg="TearDown network for sandbox \"ea32c5d5841c18b721b720c17a89e0f12374a58e62e420be87dbcd53c441bb77\" successfully"
Jan 14 13:39:38.389667 containerd[1759]: time="2025-01-14T13:39:38.389593710Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ea32c5d5841c18b721b720c17a89e0f12374a58e62e420be87dbcd53c441bb77\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.389843 containerd[1759]: time="2025-01-14T13:39:38.389709230Z" level=info msg="RemovePodSandbox \"ea32c5d5841c18b721b720c17a89e0f12374a58e62e420be87dbcd53c441bb77\" returns successfully"
Jan 14 13:39:38.390524 containerd[1759]: time="2025-01-14T13:39:38.390283150Z" level=info msg="StopPodSandbox for \"d12ffc587626672804360aa4ee4df91d71f582754398565ceec309d37e3dd117\""
Jan 14 13:39:38.390524 containerd[1759]: time="2025-01-14T13:39:38.390370830Z" level=info msg="TearDown network for sandbox \"d12ffc587626672804360aa4ee4df91d71f582754398565ceec309d37e3dd117\" successfully"
Jan 14 13:39:38.390524 containerd[1759]: time="2025-01-14T13:39:38.390381510Z" level=info msg="StopPodSandbox for \"d12ffc587626672804360aa4ee4df91d71f582754398565ceec309d37e3dd117\" returns successfully"
Jan 14 13:39:38.391354 containerd[1759]: time="2025-01-14T13:39:38.390874871Z" level=info msg="RemovePodSandbox for \"d12ffc587626672804360aa4ee4df91d71f582754398565ceec309d37e3dd117\""
Jan 14 13:39:38.391354 containerd[1759]: time="2025-01-14T13:39:38.390899071Z" level=info msg="Forcibly stopping sandbox \"d12ffc587626672804360aa4ee4df91d71f582754398565ceec309d37e3dd117\""
Jan 14 13:39:38.391354 containerd[1759]: time="2025-01-14T13:39:38.390951351Z" level=info msg="TearDown network for sandbox \"d12ffc587626672804360aa4ee4df91d71f582754398565ceec309d37e3dd117\" successfully"
Jan 14 13:39:38.401144 containerd[1759]: time="2025-01-14T13:39:38.401106355Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d12ffc587626672804360aa4ee4df91d71f582754398565ceec309d37e3dd117\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.401335 containerd[1759]: time="2025-01-14T13:39:38.401319475Z" level=info msg="RemovePodSandbox \"d12ffc587626672804360aa4ee4df91d71f582754398565ceec309d37e3dd117\" returns successfully"
Jan 14 13:39:38.401876 containerd[1759]: time="2025-01-14T13:39:38.401844355Z" level=info msg="StopPodSandbox for \"69eb62339ee12efc5793c9acfda373ad4383ea47c3f7a16201f9b83fc4864995\""
Jan 14 13:39:38.401960 containerd[1759]: time="2025-01-14T13:39:38.401939195Z" level=info msg="TearDown network for sandbox \"69eb62339ee12efc5793c9acfda373ad4383ea47c3f7a16201f9b83fc4864995\" successfully"
Jan 14 13:39:38.401960 containerd[1759]: time="2025-01-14T13:39:38.401954755Z" level=info msg="StopPodSandbox for \"69eb62339ee12efc5793c9acfda373ad4383ea47c3f7a16201f9b83fc4864995\" returns successfully"
Jan 14 13:39:38.402703 containerd[1759]: time="2025-01-14T13:39:38.402284555Z" level=info msg="RemovePodSandbox for \"69eb62339ee12efc5793c9acfda373ad4383ea47c3f7a16201f9b83fc4864995\""
Jan 14 13:39:38.402703 containerd[1759]: time="2025-01-14T13:39:38.402305875Z" level=info msg="Forcibly stopping sandbox \"69eb62339ee12efc5793c9acfda373ad4383ea47c3f7a16201f9b83fc4864995\""
Jan 14 13:39:38.402703 containerd[1759]: time="2025-01-14T13:39:38.402367635Z" level=info msg="TearDown network for sandbox \"69eb62339ee12efc5793c9acfda373ad4383ea47c3f7a16201f9b83fc4864995\" successfully"
Jan 14 13:39:38.413276 containerd[1759]: time="2025-01-14T13:39:38.413241879Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"69eb62339ee12efc5793c9acfda373ad4383ea47c3f7a16201f9b83fc4864995\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.413432 containerd[1759]: time="2025-01-14T13:39:38.413417319Z" level=info msg="RemovePodSandbox \"69eb62339ee12efc5793c9acfda373ad4383ea47c3f7a16201f9b83fc4864995\" returns successfully"
Jan 14 13:39:38.413893 containerd[1759]: time="2025-01-14T13:39:38.413858360Z" level=info msg="StopPodSandbox for \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\""
Jan 14 13:39:38.413977 containerd[1759]: time="2025-01-14T13:39:38.413957080Z" level=info msg="TearDown network for sandbox \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\" successfully"
Jan 14 13:39:38.413977 containerd[1759]: time="2025-01-14T13:39:38.413973640Z" level=info msg="StopPodSandbox for \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\" returns successfully"
Jan 14 13:39:38.414415 containerd[1759]: time="2025-01-14T13:39:38.414320200Z" level=info msg="RemovePodSandbox for \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\""
Jan 14 13:39:38.414461 containerd[1759]: time="2025-01-14T13:39:38.414419640Z" level=info msg="Forcibly stopping sandbox \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\""
Jan 14 13:39:38.414523 containerd[1759]: time="2025-01-14T13:39:38.414499960Z" level=info msg="TearDown network for sandbox \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\" successfully"
Jan 14 13:39:38.423567 containerd[1759]: time="2025-01-14T13:39:38.423523203Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.423635 containerd[1759]: time="2025-01-14T13:39:38.423587163Z" level=info msg="RemovePodSandbox \"8d350eae9b839df9d1c78f2cb855ce14ea3e95d3cc2b61696def18e77b872b3b\" returns successfully"
Jan 14 13:39:38.424023 containerd[1759]: time="2025-01-14T13:39:38.423978404Z" level=info msg="StopPodSandbox for \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\""
Jan 14 13:39:38.424101 containerd[1759]: time="2025-01-14T13:39:38.424081724Z" level=info msg="TearDown network for sandbox \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\" successfully"
Jan 14 13:39:38.424101 containerd[1759]: time="2025-01-14T13:39:38.424097004Z" level=info msg="StopPodSandbox for \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\" returns successfully"
Jan 14 13:39:38.425011 containerd[1759]: time="2025-01-14T13:39:38.424453804Z" level=info msg="RemovePodSandbox for \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\""
Jan 14 13:39:38.425011 containerd[1759]: time="2025-01-14T13:39:38.424494364Z" level=info msg="Forcibly stopping sandbox \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\""
Jan 14 13:39:38.425011 containerd[1759]: time="2025-01-14T13:39:38.424555564Z" level=info msg="TearDown network for sandbox \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\" successfully"
Jan 14 13:39:38.433406 containerd[1759]: time="2025-01-14T13:39:38.433357607Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.433472 containerd[1759]: time="2025-01-14T13:39:38.433448767Z" level=info msg="RemovePodSandbox \"91a15225daffe4cb62c77358b195a94616d8fb5ad2ee0c4f7277f2156f1f8c3b\" returns successfully"
Jan 14 13:39:38.434037 containerd[1759]: time="2025-01-14T13:39:38.433855327Z" level=info msg="StopPodSandbox for \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\""
Jan 14 13:39:38.434037 containerd[1759]: time="2025-01-14T13:39:38.433948928Z" level=info msg="TearDown network for sandbox \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\" successfully"
Jan 14 13:39:38.434037 containerd[1759]: time="2025-01-14T13:39:38.433958968Z" level=info msg="StopPodSandbox for \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\" returns successfully"
Jan 14 13:39:38.434288 containerd[1759]: time="2025-01-14T13:39:38.434266728Z" level=info msg="RemovePodSandbox for \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\""
Jan 14 13:39:38.434964 containerd[1759]: time="2025-01-14T13:39:38.434354968Z" level=info msg="Forcibly stopping sandbox \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\""
Jan 14 13:39:38.434964 containerd[1759]: time="2025-01-14T13:39:38.434420768Z" level=info msg="TearDown network for sandbox \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\" successfully"
Jan 14 13:39:38.445232 containerd[1759]: time="2025-01-14T13:39:38.445190252Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.445327 containerd[1759]: time="2025-01-14T13:39:38.445253572Z" level=info msg="RemovePodSandbox \"65aef4b63a38233051f32ba17531a94b9a49b62964c3c06eda6aa069a0e152a7\" returns successfully"
Jan 14 13:39:38.445982 containerd[1759]: time="2025-01-14T13:39:38.445815932Z" level=info msg="StopPodSandbox for \"841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931\""
Jan 14 13:39:38.445982 containerd[1759]: time="2025-01-14T13:39:38.445904652Z" level=info msg="TearDown network for sandbox \"841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931\" successfully"
Jan 14 13:39:38.445982 containerd[1759]: time="2025-01-14T13:39:38.445914292Z" level=info msg="StopPodSandbox for \"841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931\" returns successfully"
Jan 14 13:39:38.446894 containerd[1759]: time="2025-01-14T13:39:38.446249252Z" level=info msg="RemovePodSandbox for \"841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931\""
Jan 14 13:39:38.446894 containerd[1759]: time="2025-01-14T13:39:38.446302012Z" level=info msg="Forcibly stopping sandbox \"841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931\""
Jan 14 13:39:38.446894 containerd[1759]: time="2025-01-14T13:39:38.446383172Z" level=info msg="TearDown network for sandbox \"841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931\" successfully"
Jan 14 13:39:38.454834 containerd[1759]: time="2025-01-14T13:39:38.454799936Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.454885 containerd[1759]: time="2025-01-14T13:39:38.454854456Z" level=info msg="RemovePodSandbox \"841e56e949fb66a97b88db0e3fe0006d6d42372730579c2399124a9d7f239931\" returns successfully"
Jan 14 13:39:38.455224 containerd[1759]: time="2025-01-14T13:39:38.455170656Z" level=info msg="StopPodSandbox for \"5f47f137b039997796869fe562c299eeb7c123bbc85094e6a6cb54bdfe48aaf4\""
Jan 14 13:39:38.455448 containerd[1759]: time="2025-01-14T13:39:38.455351536Z" level=info msg="TearDown network for sandbox \"5f47f137b039997796869fe562c299eeb7c123bbc85094e6a6cb54bdfe48aaf4\" successfully"
Jan 14 13:39:38.455448 containerd[1759]: time="2025-01-14T13:39:38.455365056Z" level=info msg="StopPodSandbox for \"5f47f137b039997796869fe562c299eeb7c123bbc85094e6a6cb54bdfe48aaf4\" returns successfully"
Jan 14 13:39:38.455689 containerd[1759]: time="2025-01-14T13:39:38.455625136Z" level=info msg="RemovePodSandbox for \"5f47f137b039997796869fe562c299eeb7c123bbc85094e6a6cb54bdfe48aaf4\""
Jan 14 13:39:38.455689 containerd[1759]: time="2025-01-14T13:39:38.455669176Z" level=info msg="Forcibly stopping sandbox \"5f47f137b039997796869fe562c299eeb7c123bbc85094e6a6cb54bdfe48aaf4\""
Jan 14 13:39:38.455785 containerd[1759]: time="2025-01-14T13:39:38.455727376Z" level=info msg="TearDown network for sandbox \"5f47f137b039997796869fe562c299eeb7c123bbc85094e6a6cb54bdfe48aaf4\" successfully"
Jan 14 13:39:38.464445 containerd[1759]: time="2025-01-14T13:39:38.464403779Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5f47f137b039997796869fe562c299eeb7c123bbc85094e6a6cb54bdfe48aaf4\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.464517 containerd[1759]: time="2025-01-14T13:39:38.464463139Z" level=info msg="RemovePodSandbox \"5f47f137b039997796869fe562c299eeb7c123bbc85094e6a6cb54bdfe48aaf4\" returns successfully"
Jan 14 13:39:38.464882 containerd[1759]: time="2025-01-14T13:39:38.464791300Z" level=info msg="StopPodSandbox for \"b2e7c993e731d79ffdbc9e501948b9ef4683bd5d915ec929dd75e1598e563288\""
Jan 14 13:39:38.465145 containerd[1759]: time="2025-01-14T13:39:38.464981820Z" level=info msg="TearDown network for sandbox \"b2e7c993e731d79ffdbc9e501948b9ef4683bd5d915ec929dd75e1598e563288\" successfully"
Jan 14 13:39:38.465145 containerd[1759]: time="2025-01-14T13:39:38.465029140Z" level=info msg="StopPodSandbox for \"b2e7c993e731d79ffdbc9e501948b9ef4683bd5d915ec929dd75e1598e563288\" returns successfully"
Jan 14 13:39:38.465930 containerd[1759]: time="2025-01-14T13:39:38.465383900Z" level=info msg="RemovePodSandbox for \"b2e7c993e731d79ffdbc9e501948b9ef4683bd5d915ec929dd75e1598e563288\""
Jan 14 13:39:38.465930 containerd[1759]: time="2025-01-14T13:39:38.465416980Z" level=info msg="Forcibly stopping sandbox \"b2e7c993e731d79ffdbc9e501948b9ef4683bd5d915ec929dd75e1598e563288\""
Jan 14 13:39:38.465930 containerd[1759]: time="2025-01-14T13:39:38.465472020Z" level=info msg="TearDown network for sandbox \"b2e7c993e731d79ffdbc9e501948b9ef4683bd5d915ec929dd75e1598e563288\" successfully"
Jan 14 13:39:38.475097 containerd[1759]: time="2025-01-14T13:39:38.475054064Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b2e7c993e731d79ffdbc9e501948b9ef4683bd5d915ec929dd75e1598e563288\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.475185 containerd[1759]: time="2025-01-14T13:39:38.475111584Z" level=info msg="RemovePodSandbox \"b2e7c993e731d79ffdbc9e501948b9ef4683bd5d915ec929dd75e1598e563288\" returns successfully"
Jan 14 13:39:38.475636 containerd[1759]: time="2025-01-14T13:39:38.475530504Z" level=info msg="StopPodSandbox for \"efff37b38705ce54a05c4b314fb44e896f62187f93a6bf65db8521c6acc0da89\""
Jan 14 13:39:38.475636 containerd[1759]: time="2025-01-14T13:39:38.475608424Z" level=info msg="TearDown network for sandbox \"efff37b38705ce54a05c4b314fb44e896f62187f93a6bf65db8521c6acc0da89\" successfully"
Jan 14 13:39:38.475636 containerd[1759]: time="2025-01-14T13:39:38.475617104Z" level=info msg="StopPodSandbox for \"efff37b38705ce54a05c4b314fb44e896f62187f93a6bf65db8521c6acc0da89\" returns successfully"
Jan 14 13:39:38.476146 containerd[1759]: time="2025-01-14T13:39:38.476116344Z" level=info msg="RemovePodSandbox for \"efff37b38705ce54a05c4b314fb44e896f62187f93a6bf65db8521c6acc0da89\""
Jan 14 13:39:38.476203 containerd[1759]: time="2025-01-14T13:39:38.476146984Z" level=info msg="Forcibly stopping sandbox \"efff37b38705ce54a05c4b314fb44e896f62187f93a6bf65db8521c6acc0da89\""
Jan 14 13:39:38.476275 containerd[1759]: time="2025-01-14T13:39:38.476253704Z" level=info msg="TearDown network for sandbox \"efff37b38705ce54a05c4b314fb44e896f62187f93a6bf65db8521c6acc0da89\" successfully"
Jan 14 13:39:38.490062 containerd[1759]: time="2025-01-14T13:39:38.489886309Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"efff37b38705ce54a05c4b314fb44e896f62187f93a6bf65db8521c6acc0da89\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.490062 containerd[1759]: time="2025-01-14T13:39:38.489954509Z" level=info msg="RemovePodSandbox \"efff37b38705ce54a05c4b314fb44e896f62187f93a6bf65db8521c6acc0da89\" returns successfully"
Jan 14 13:39:38.490397 containerd[1759]: time="2025-01-14T13:39:38.490371190Z" level=info msg="StopPodSandbox for \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\""
Jan 14 13:39:38.490483 containerd[1759]: time="2025-01-14T13:39:38.490463750Z" level=info msg="TearDown network for sandbox \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\" successfully"
Jan 14 13:39:38.490483 containerd[1759]: time="2025-01-14T13:39:38.490478550Z" level=info msg="StopPodSandbox for \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\" returns successfully"
Jan 14 13:39:38.490848 containerd[1759]: time="2025-01-14T13:39:38.490822990Z" level=info msg="RemovePodSandbox for \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\""
Jan 14 13:39:38.490908 containerd[1759]: time="2025-01-14T13:39:38.490852310Z" level=info msg="Forcibly stopping sandbox \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\""
Jan 14 13:39:38.490931 containerd[1759]: time="2025-01-14T13:39:38.490908070Z" level=info msg="TearDown network for sandbox \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\" successfully"
Jan 14 13:39:38.505191 containerd[1759]: time="2025-01-14T13:39:38.505149195Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.505296 containerd[1759]: time="2025-01-14T13:39:38.505221995Z" level=info msg="RemovePodSandbox \"2dd60e42c4f09fa00d5d338ff0186738d5f44226bd86f91746fc1a70a2d7f60f\" returns successfully"
Jan 14 13:39:38.505812 containerd[1759]: time="2025-01-14T13:39:38.505648196Z" level=info msg="StopPodSandbox for \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\""
Jan 14 13:39:38.505812 containerd[1759]: time="2025-01-14T13:39:38.505740996Z" level=info msg="TearDown network for sandbox \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\" successfully"
Jan 14 13:39:38.505812 containerd[1759]: time="2025-01-14T13:39:38.505750556Z" level=info msg="StopPodSandbox for \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\" returns successfully"
Jan 14 13:39:38.506058 containerd[1759]: time="2025-01-14T13:39:38.506029636Z" level=info msg="RemovePodSandbox for \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\""
Jan 14 13:39:38.506120 containerd[1759]: time="2025-01-14T13:39:38.506059196Z" level=info msg="Forcibly stopping sandbox \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\""
Jan 14 13:39:38.506147 containerd[1759]: time="2025-01-14T13:39:38.506123756Z" level=info msg="TearDown network for sandbox \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\" successfully"
Jan 14 13:39:38.513649 containerd[1759]: time="2025-01-14T13:39:38.513593719Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.513754 containerd[1759]: time="2025-01-14T13:39:38.513690479Z" level=info msg="RemovePodSandbox \"9d9edccca252d0200ddc1d1a06feea21632c6aba47e5373abe892ed2e67d558c\" returns successfully"
Jan 14 13:39:38.514384 containerd[1759]: time="2025-01-14T13:39:38.514099199Z" level=info msg="StopPodSandbox for \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\""
Jan 14 13:39:38.514384 containerd[1759]: time="2025-01-14T13:39:38.514185599Z" level=info msg="TearDown network for sandbox \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\" successfully"
Jan 14 13:39:38.514384 containerd[1759]: time="2025-01-14T13:39:38.514196319Z" level=info msg="StopPodSandbox for \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\" returns successfully"
Jan 14 13:39:38.514902 containerd[1759]: time="2025-01-14T13:39:38.514759959Z" level=info msg="RemovePodSandbox for \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\""
Jan 14 13:39:38.514902 containerd[1759]: time="2025-01-14T13:39:38.514784199Z" level=info msg="Forcibly stopping sandbox \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\""
Jan 14 13:39:38.514902 containerd[1759]: time="2025-01-14T13:39:38.514852679Z" level=info msg="TearDown network for sandbox \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\" successfully"
Jan 14 13:39:38.525665 containerd[1759]: time="2025-01-14T13:39:38.525545683Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.525665 containerd[1759]: time="2025-01-14T13:39:38.525611963Z" level=info msg="RemovePodSandbox \"f059e8bf2c04ef4a690b8971d959651a553cae0a328f4635db3d79e2f11e954a\" returns successfully"
Jan 14 13:39:38.526738 containerd[1759]: time="2025-01-14T13:39:38.526383444Z" level=info msg="StopPodSandbox for \"1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5\""
Jan 14 13:39:38.526738 containerd[1759]: time="2025-01-14T13:39:38.526493884Z" level=info msg="TearDown network for sandbox \"1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5\" successfully"
Jan 14 13:39:38.526738 containerd[1759]: time="2025-01-14T13:39:38.526503084Z" level=info msg="StopPodSandbox for \"1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5\" returns successfully"
Jan 14 13:39:38.527286 containerd[1759]: time="2025-01-14T13:39:38.527194644Z" level=info msg="RemovePodSandbox for \"1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5\""
Jan 14 13:39:38.527286 containerd[1759]: time="2025-01-14T13:39:38.527220684Z" level=info msg="Forcibly stopping sandbox \"1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5\""
Jan 14 13:39:38.527601 containerd[1759]: time="2025-01-14T13:39:38.527513604Z" level=info msg="TearDown network for sandbox \"1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5\" successfully"
Jan 14 13:39:38.541541 containerd[1759]: time="2025-01-14T13:39:38.541489010Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.541701 containerd[1759]: time="2025-01-14T13:39:38.541565530Z" level=info msg="RemovePodSandbox \"1264c616a5981a80fd295a4789519fedebf29faa9f62ef616b2c236bb83f3eb5\" returns successfully"
Jan 14 13:39:38.542183 containerd[1759]: time="2025-01-14T13:39:38.542155570Z" level=info msg="StopPodSandbox for \"6db7c174b4a2e2a91c30fbadca47ed59a9e1253d3c7673ccdcdd2e0f1e9663da\""
Jan 14 13:39:38.542347 containerd[1759]: time="2025-01-14T13:39:38.542324330Z" level=info msg="TearDown network for sandbox \"6db7c174b4a2e2a91c30fbadca47ed59a9e1253d3c7673ccdcdd2e0f1e9663da\" successfully"
Jan 14 13:39:38.542347 containerd[1759]: time="2025-01-14T13:39:38.542343050Z" level=info msg="StopPodSandbox for \"6db7c174b4a2e2a91c30fbadca47ed59a9e1253d3c7673ccdcdd2e0f1e9663da\" returns successfully"
Jan 14 13:39:38.542669 containerd[1759]: time="2025-01-14T13:39:38.542643250Z" level=info msg="RemovePodSandbox for \"6db7c174b4a2e2a91c30fbadca47ed59a9e1253d3c7673ccdcdd2e0f1e9663da\""
Jan 14 13:39:38.542719 containerd[1759]: time="2025-01-14T13:39:38.542670890Z" level=info msg="Forcibly stopping sandbox \"6db7c174b4a2e2a91c30fbadca47ed59a9e1253d3c7673ccdcdd2e0f1e9663da\""
Jan 14 13:39:38.542777 containerd[1759]: time="2025-01-14T13:39:38.542742490Z" level=info msg="TearDown network for sandbox \"6db7c174b4a2e2a91c30fbadca47ed59a9e1253d3c7673ccdcdd2e0f1e9663da\" successfully"
Jan 14 13:39:38.553490 containerd[1759]: time="2025-01-14T13:39:38.553398814Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6db7c174b4a2e2a91c30fbadca47ed59a9e1253d3c7673ccdcdd2e0f1e9663da\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.554025 containerd[1759]: time="2025-01-14T13:39:38.553549374Z" level=info msg="RemovePodSandbox \"6db7c174b4a2e2a91c30fbadca47ed59a9e1253d3c7673ccdcdd2e0f1e9663da\" returns successfully"
Jan 14 13:39:38.554634 containerd[1759]: time="2025-01-14T13:39:38.554558335Z" level=info msg="StopPodSandbox for \"fa943c87d584052299723783e0bfa7c22ccf7dd8b2d12c134357dbb5cf1d49a2\""
Jan 14 13:39:38.554836 containerd[1759]: time="2025-01-14T13:39:38.554646855Z" level=info msg="TearDown network for sandbox \"fa943c87d584052299723783e0bfa7c22ccf7dd8b2d12c134357dbb5cf1d49a2\" successfully"
Jan 14 13:39:38.554836 containerd[1759]: time="2025-01-14T13:39:38.554656775Z" level=info msg="StopPodSandbox for \"fa943c87d584052299723783e0bfa7c22ccf7dd8b2d12c134357dbb5cf1d49a2\" returns successfully"
Jan 14 13:39:38.556226 containerd[1759]: time="2025-01-14T13:39:38.555036775Z" level=info msg="RemovePodSandbox for \"fa943c87d584052299723783e0bfa7c22ccf7dd8b2d12c134357dbb5cf1d49a2\""
Jan 14 13:39:38.556226 containerd[1759]: time="2025-01-14T13:39:38.555063695Z" level=info msg="Forcibly stopping sandbox \"fa943c87d584052299723783e0bfa7c22ccf7dd8b2d12c134357dbb5cf1d49a2\""
Jan 14 13:39:38.556226 containerd[1759]: time="2025-01-14T13:39:38.555122535Z" level=info msg="TearDown network for sandbox \"fa943c87d584052299723783e0bfa7c22ccf7dd8b2d12c134357dbb5cf1d49a2\" successfully"
Jan 14 13:39:38.565886 containerd[1759]: time="2025-01-14T13:39:38.565836579Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"fa943c87d584052299723783e0bfa7c22ccf7dd8b2d12c134357dbb5cf1d49a2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.565982 containerd[1759]: time="2025-01-14T13:39:38.565897339Z" level=info msg="RemovePodSandbox \"fa943c87d584052299723783e0bfa7c22ccf7dd8b2d12c134357dbb5cf1d49a2\" returns successfully"
Jan 14 13:39:38.566538 containerd[1759]: time="2025-01-14T13:39:38.566382019Z" level=info msg="StopPodSandbox for \"00c3c6ec83186ef1c0a5ae07af8cae4c396a43d61e23752c36865a6b634b59f6\""
Jan 14 13:39:38.566538 containerd[1759]: time="2025-01-14T13:39:38.566470259Z" level=info msg="TearDown network for sandbox \"00c3c6ec83186ef1c0a5ae07af8cae4c396a43d61e23752c36865a6b634b59f6\" successfully"
Jan 14 13:39:38.566538 containerd[1759]: time="2025-01-14T13:39:38.566479699Z" level=info msg="StopPodSandbox for \"00c3c6ec83186ef1c0a5ae07af8cae4c396a43d61e23752c36865a6b634b59f6\" returns successfully"
Jan 14 13:39:38.567448 containerd[1759]: time="2025-01-14T13:39:38.566778260Z" level=info msg="RemovePodSandbox for \"00c3c6ec83186ef1c0a5ae07af8cae4c396a43d61e23752c36865a6b634b59f6\""
Jan 14 13:39:38.567448 containerd[1759]: time="2025-01-14T13:39:38.566801540Z" level=info msg="Forcibly stopping sandbox \"00c3c6ec83186ef1c0a5ae07af8cae4c396a43d61e23752c36865a6b634b59f6\""
Jan 14 13:39:38.567448 containerd[1759]: time="2025-01-14T13:39:38.566885980Z" level=info msg="TearDown network for sandbox \"00c3c6ec83186ef1c0a5ae07af8cae4c396a43d61e23752c36865a6b634b59f6\" successfully"
Jan 14 13:39:38.577017 containerd[1759]: time="2025-01-14T13:39:38.576895144Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"00c3c6ec83186ef1c0a5ae07af8cae4c396a43d61e23752c36865a6b634b59f6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.577017 containerd[1759]: time="2025-01-14T13:39:38.576953944Z" level=info msg="RemovePodSandbox \"00c3c6ec83186ef1c0a5ae07af8cae4c396a43d61e23752c36865a6b634b59f6\" returns successfully"
Jan 14 13:39:38.577511 containerd[1759]: time="2025-01-14T13:39:38.577484504Z" level=info msg="StopPodSandbox for \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\""
Jan 14 13:39:38.577699 containerd[1759]: time="2025-01-14T13:39:38.577673544Z" level=info msg="TearDown network for sandbox \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\" successfully"
Jan 14 13:39:38.577699 containerd[1759]: time="2025-01-14T13:39:38.577694104Z" level=info msg="StopPodSandbox for \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\" returns successfully"
Jan 14 13:39:38.578181 containerd[1759]: time="2025-01-14T13:39:38.578116384Z" level=info msg="RemovePodSandbox for \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\""
Jan 14 13:39:38.578181 containerd[1759]: time="2025-01-14T13:39:38.578139584Z" level=info msg="Forcibly stopping sandbox \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\""
Jan 14 13:39:38.578269 containerd[1759]: time="2025-01-14T13:39:38.578197264Z" level=info msg="TearDown network for sandbox \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\" successfully"
Jan 14 13:39:38.588915 containerd[1759]: time="2025-01-14T13:39:38.588878508Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.589010 containerd[1759]: time="2025-01-14T13:39:38.588940868Z" level=info msg="RemovePodSandbox \"5277ec73b0c51303428aa9e4f4ce88e03b90fc242960628c34066db03b0c1519\" returns successfully"
Jan 14 13:39:38.589313 containerd[1759]: time="2025-01-14T13:39:38.589281588Z" level=info msg="StopPodSandbox for \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\""
Jan 14 13:39:38.589402 containerd[1759]: time="2025-01-14T13:39:38.589379308Z" level=info msg="TearDown network for sandbox \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\" successfully"
Jan 14 13:39:38.589402 containerd[1759]: time="2025-01-14T13:39:38.589397308Z" level=info msg="StopPodSandbox for \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\" returns successfully"
Jan 14 13:39:38.589713 containerd[1759]: time="2025-01-14T13:39:38.589687989Z" level=info msg="RemovePodSandbox for \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\""
Jan 14 13:39:38.589751 containerd[1759]: time="2025-01-14T13:39:38.589717229Z" level=info msg="Forcibly stopping sandbox \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\""
Jan 14 13:39:38.589797 containerd[1759]: time="2025-01-14T13:39:38.589775469Z" level=info msg="TearDown network for sandbox \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\" successfully"
Jan 14 13:39:38.604810 containerd[1759]: time="2025-01-14T13:39:38.604765394Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.604961 containerd[1759]: time="2025-01-14T13:39:38.604839954Z" level=info msg="RemovePodSandbox \"30093b19ba20ec4e518777e0b208d0ee8304583360a153031ab1730c1cc7b57c\" returns successfully"
Jan 14 13:39:38.605294 containerd[1759]: time="2025-01-14T13:39:38.605266235Z" level=info msg="StopPodSandbox for \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\""
Jan 14 13:39:38.605391 containerd[1759]: time="2025-01-14T13:39:38.605369355Z" level=info msg="TearDown network for sandbox \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\" successfully"
Jan 14 13:39:38.605391 containerd[1759]: time="2025-01-14T13:39:38.605386435Z" level=info msg="StopPodSandbox for \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\" returns successfully"
Jan 14 13:39:38.606024 containerd[1759]: time="2025-01-14T13:39:38.605729675Z" level=info msg="RemovePodSandbox for \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\""
Jan 14 13:39:38.606024 containerd[1759]: time="2025-01-14T13:39:38.605755075Z" level=info msg="Forcibly stopping sandbox \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\""
Jan 14 13:39:38.606024 containerd[1759]: time="2025-01-14T13:39:38.605829795Z" level=info msg="TearDown network for sandbox \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\" successfully"
Jan 14 13:39:38.614817 containerd[1759]: time="2025-01-14T13:39:38.614777838Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.614889 containerd[1759]: time="2025-01-14T13:39:38.614837278Z" level=info msg="RemovePodSandbox \"343ce01851b09b781e7c37aecf6233297f530f4e36927dfa004a3ceca4827cc6\" returns successfully"
Jan 14 13:39:38.615388 containerd[1759]: time="2025-01-14T13:39:38.615194799Z" level=info msg="StopPodSandbox for \"4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b\""
Jan 14 13:39:38.615388 containerd[1759]: time="2025-01-14T13:39:38.615285039Z" level=info msg="TearDown network for sandbox \"4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b\" successfully"
Jan 14 13:39:38.615388 containerd[1759]: time="2025-01-14T13:39:38.615294439Z" level=info msg="StopPodSandbox for \"4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b\" returns successfully"
Jan 14 13:39:38.616022 containerd[1759]: time="2025-01-14T13:39:38.615739279Z" level=info msg="RemovePodSandbox for \"4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b\""
Jan 14 13:39:38.616022 containerd[1759]: time="2025-01-14T13:39:38.615768039Z" level=info msg="Forcibly stopping sandbox \"4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b\""
Jan 14 13:39:38.616022 containerd[1759]: time="2025-01-14T13:39:38.615861719Z" level=info msg="TearDown network for sandbox \"4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b\" successfully"
Jan 14 13:39:38.627336 containerd[1759]: time="2025-01-14T13:39:38.627237683Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.627336 containerd[1759]: time="2025-01-14T13:39:38.627320563Z" level=info msg="RemovePodSandbox \"4324b2b7b7abf7abc3ce5f6add8c590366f32b45d3142690b38119333f9a0e2b\" returns successfully"
Jan 14 13:39:38.628066 containerd[1759]: time="2025-01-14T13:39:38.627820723Z" level=info msg="StopPodSandbox for \"b25e2b01bbc2a4e604157740144b7276644b40fc3d95844bb4b4dfb88f0b6101\""
Jan 14 13:39:38.628066 containerd[1759]: time="2025-01-14T13:39:38.627925364Z" level=info msg="TearDown network for sandbox \"b25e2b01bbc2a4e604157740144b7276644b40fc3d95844bb4b4dfb88f0b6101\" successfully"
Jan 14 13:39:38.628066 containerd[1759]: time="2025-01-14T13:39:38.627937404Z" level=info msg="StopPodSandbox for \"b25e2b01bbc2a4e604157740144b7276644b40fc3d95844bb4b4dfb88f0b6101\" returns successfully"
Jan 14 13:39:38.629046 containerd[1759]: time="2025-01-14T13:39:38.628381524Z" level=info msg="RemovePodSandbox for \"b25e2b01bbc2a4e604157740144b7276644b40fc3d95844bb4b4dfb88f0b6101\""
Jan 14 13:39:38.629046 containerd[1759]: time="2025-01-14T13:39:38.628407804Z" level=info msg="Forcibly stopping sandbox \"b25e2b01bbc2a4e604157740144b7276644b40fc3d95844bb4b4dfb88f0b6101\""
Jan 14 13:39:38.629046 containerd[1759]: time="2025-01-14T13:39:38.628463964Z" level=info msg="TearDown network for sandbox \"b25e2b01bbc2a4e604157740144b7276644b40fc3d95844bb4b4dfb88f0b6101\" successfully"
Jan 14 13:39:38.636302 containerd[1759]: time="2025-01-14T13:39:38.636266207Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b25e2b01bbc2a4e604157740144b7276644b40fc3d95844bb4b4dfb88f0b6101\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.636352 containerd[1759]: time="2025-01-14T13:39:38.636327927Z" level=info msg="RemovePodSandbox \"b25e2b01bbc2a4e604157740144b7276644b40fc3d95844bb4b4dfb88f0b6101\" returns successfully"
Jan 14 13:39:38.636870 containerd[1759]: time="2025-01-14T13:39:38.636698807Z" level=info msg="StopPodSandbox for \"b2e96b6775ecfba35b59af46249769cdebbbb6bb7c875319cd566a8827b720b5\""
Jan 14 13:39:38.636870 containerd[1759]: time="2025-01-14T13:39:38.636787287Z" level=info msg="TearDown network for sandbox \"b2e96b6775ecfba35b59af46249769cdebbbb6bb7c875319cd566a8827b720b5\" successfully"
Jan 14 13:39:38.636870 containerd[1759]: time="2025-01-14T13:39:38.636797127Z" level=info msg="StopPodSandbox for \"b2e96b6775ecfba35b59af46249769cdebbbb6bb7c875319cd566a8827b720b5\" returns successfully"
Jan 14 13:39:38.637161 containerd[1759]: time="2025-01-14T13:39:38.637140647Z" level=info msg="RemovePodSandbox for \"b2e96b6775ecfba35b59af46249769cdebbbb6bb7c875319cd566a8827b720b5\""
Jan 14 13:39:38.637161 containerd[1759]: time="2025-01-14T13:39:38.637195207Z" level=info msg="Forcibly stopping sandbox \"b2e96b6775ecfba35b59af46249769cdebbbb6bb7c875319cd566a8827b720b5\""
Jan 14 13:39:38.637161 containerd[1759]: time="2025-01-14T13:39:38.637265527Z" level=info msg="TearDown network for sandbox \"b2e96b6775ecfba35b59af46249769cdebbbb6bb7c875319cd566a8827b720b5\" successfully"
Jan 14 13:39:38.649557 containerd[1759]: time="2025-01-14T13:39:38.649489052Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b2e96b6775ecfba35b59af46249769cdebbbb6bb7c875319cd566a8827b720b5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.649707 containerd[1759]: time="2025-01-14T13:39:38.649577652Z" level=info msg="RemovePodSandbox \"b2e96b6775ecfba35b59af46249769cdebbbb6bb7c875319cd566a8827b720b5\" returns successfully"
Jan 14 13:39:38.650199 containerd[1759]: time="2025-01-14T13:39:38.650032932Z" level=info msg="StopPodSandbox for \"b670bc37149820437c0e0eebd0c955b7bf278e320b34d3bef403389074a178b2\""
Jan 14 13:39:38.650199 containerd[1759]: time="2025-01-14T13:39:38.650121412Z" level=info msg="TearDown network for sandbox \"b670bc37149820437c0e0eebd0c955b7bf278e320b34d3bef403389074a178b2\" successfully"
Jan 14 13:39:38.650199 containerd[1759]: time="2025-01-14T13:39:38.650129932Z" level=info msg="StopPodSandbox for \"b670bc37149820437c0e0eebd0c955b7bf278e320b34d3bef403389074a178b2\" returns successfully"
Jan 14 13:39:38.650503 containerd[1759]: time="2025-01-14T13:39:38.650485052Z" level=info msg="RemovePodSandbox for \"b670bc37149820437c0e0eebd0c955b7bf278e320b34d3bef403389074a178b2\""
Jan 14 13:39:38.650938 containerd[1759]: time="2025-01-14T13:39:38.650574852Z" level=info msg="Forcibly stopping sandbox \"b670bc37149820437c0e0eebd0c955b7bf278e320b34d3bef403389074a178b2\""
Jan 14 13:39:38.650938 containerd[1759]: time="2025-01-14T13:39:38.650636932Z" level=info msg="TearDown network for sandbox \"b670bc37149820437c0e0eebd0c955b7bf278e320b34d3bef403389074a178b2\" successfully"
Jan 14 13:39:38.662534 containerd[1759]: time="2025-01-14T13:39:38.662501617Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b670bc37149820437c0e0eebd0c955b7bf278e320b34d3bef403389074a178b2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.662707 containerd[1759]: time="2025-01-14T13:39:38.662690537Z" level=info msg="RemovePodSandbox \"b670bc37149820437c0e0eebd0c955b7bf278e320b34d3bef403389074a178b2\" returns successfully"
Jan 14 13:39:38.663225 containerd[1759]: time="2025-01-14T13:39:38.663195417Z" level=info msg="StopPodSandbox for \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\""
Jan 14 13:39:38.663320 containerd[1759]: time="2025-01-14T13:39:38.663298537Z" level=info msg="TearDown network for sandbox \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\" successfully"
Jan 14 13:39:38.663320 containerd[1759]: time="2025-01-14T13:39:38.663314537Z" level=info msg="StopPodSandbox for \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\" returns successfully"
Jan 14 13:39:38.663662 containerd[1759]: time="2025-01-14T13:39:38.663637097Z" level=info msg="RemovePodSandbox for \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\""
Jan 14 13:39:38.663714 containerd[1759]: time="2025-01-14T13:39:38.663672138Z" level=info msg="Forcibly stopping sandbox \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\""
Jan 14 13:39:38.663748 containerd[1759]: time="2025-01-14T13:39:38.663737178Z" level=info msg="TearDown network for sandbox \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\" successfully"
Jan 14 13:39:38.676158 containerd[1759]: time="2025-01-14T13:39:38.676020742Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.676158 containerd[1759]: time="2025-01-14T13:39:38.676108422Z" level=info msg="RemovePodSandbox \"a113b3af1c8407460935e35552808b277b7d12450176e51b611747d3c6242f27\" returns successfully"
Jan 14 13:39:38.676537 containerd[1759]: time="2025-01-14T13:39:38.676510583Z" level=info msg="StopPodSandbox for \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\""
Jan 14 13:39:38.676685 containerd[1759]: time="2025-01-14T13:39:38.676658383Z" level=info msg="TearDown network for sandbox \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\" successfully"
Jan 14 13:39:38.676685 containerd[1759]: time="2025-01-14T13:39:38.676677143Z" level=info msg="StopPodSandbox for \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\" returns successfully"
Jan 14 13:39:38.677049 containerd[1759]: time="2025-01-14T13:39:38.677022783Z" level=info msg="RemovePodSandbox for \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\""
Jan 14 13:39:38.677107 containerd[1759]: time="2025-01-14T13:39:38.677051143Z" level=info msg="Forcibly stopping sandbox \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\""
Jan 14 13:39:38.677132 containerd[1759]: time="2025-01-14T13:39:38.677114743Z" level=info msg="TearDown network for sandbox \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\" successfully"
Jan 14 13:39:38.689139 containerd[1759]: time="2025-01-14T13:39:38.689089867Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.689265 containerd[1759]: time="2025-01-14T13:39:38.689165187Z" level=info msg="RemovePodSandbox \"7c9976981517f24168c73d3c336d4a6c14006e956d02e28fa5420b4185aedcf2\" returns successfully"
Jan 14 13:39:38.689807 containerd[1759]: time="2025-01-14T13:39:38.689618348Z" level=info msg="StopPodSandbox for \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\""
Jan 14 13:39:38.689807 containerd[1759]: time="2025-01-14T13:39:38.689714988Z" level=info msg="TearDown network for sandbox \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\" successfully"
Jan 14 13:39:38.689807 containerd[1759]: time="2025-01-14T13:39:38.689725348Z" level=info msg="StopPodSandbox for \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\" returns successfully"
Jan 14 13:39:38.690150 containerd[1759]: time="2025-01-14T13:39:38.690117868Z" level=info msg="RemovePodSandbox for \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\""
Jan 14 13:39:38.690236 containerd[1759]: time="2025-01-14T13:39:38.690223388Z" level=info msg="Forcibly stopping sandbox \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\""
Jan 14 13:39:38.690886 containerd[1759]: time="2025-01-14T13:39:38.690326428Z" level=info msg="TearDown network for sandbox \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\" successfully"
Jan 14 13:39:38.702055 containerd[1759]: time="2025-01-14T13:39:38.702005633Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.702304 containerd[1759]: time="2025-01-14T13:39:38.702062833Z" level=info msg="RemovePodSandbox \"dabfa3512c99376d5a086efaca566f587696b3221947e482d388918d13d87600\" returns successfully"
Jan 14 13:39:38.702610 containerd[1759]: time="2025-01-14T13:39:38.702462313Z" level=info msg="StopPodSandbox for \"e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf\""
Jan 14 13:39:38.702610 containerd[1759]: time="2025-01-14T13:39:38.702540353Z" level=info msg="TearDown network for sandbox \"e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf\" successfully"
Jan 14 13:39:38.702610 containerd[1759]: time="2025-01-14T13:39:38.702549593Z" level=info msg="StopPodSandbox for \"e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf\" returns successfully"
Jan 14 13:39:38.702839 containerd[1759]: time="2025-01-14T13:39:38.702821153Z" level=info msg="RemovePodSandbox for \"e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf\""
Jan 14 13:39:38.703570 containerd[1759]: time="2025-01-14T13:39:38.702879313Z" level=info msg="Forcibly stopping sandbox \"e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf\""
Jan 14 13:39:38.703570 containerd[1759]: time="2025-01-14T13:39:38.702953153Z" level=info msg="TearDown network for sandbox \"e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf\" successfully"
Jan 14 13:39:38.737404 containerd[1759]: time="2025-01-14T13:39:38.737356686Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.737785 containerd[1759]: time="2025-01-14T13:39:38.737430686Z" level=info msg="RemovePodSandbox \"e7375a07ba60b18b5778a2498441fe0d72d5081c743ec5c617b265edbe37b6cf\" returns successfully"
Jan 14 13:39:38.738178 containerd[1759]: time="2025-01-14T13:39:38.738042847Z" level=info msg="StopPodSandbox for \"52c1e2cb647327fef74966b5ff40fa665c7dfab5ab87df1cfe2b9f72c1c82bfe\""
Jan 14 13:39:38.738178 containerd[1759]: time="2025-01-14T13:39:38.738152967Z" level=info msg="TearDown network for sandbox \"52c1e2cb647327fef74966b5ff40fa665c7dfab5ab87df1cfe2b9f72c1c82bfe\" successfully"
Jan 14 13:39:38.738334 containerd[1759]: time="2025-01-14T13:39:38.738163367Z" level=info msg="StopPodSandbox for \"52c1e2cb647327fef74966b5ff40fa665c7dfab5ab87df1cfe2b9f72c1c82bfe\" returns successfully"
Jan 14 13:39:38.739023 containerd[1759]: time="2025-01-14T13:39:38.738550087Z" level=info msg="RemovePodSandbox for \"52c1e2cb647327fef74966b5ff40fa665c7dfab5ab87df1cfe2b9f72c1c82bfe\""
Jan 14 13:39:38.739023 containerd[1759]: time="2025-01-14T13:39:38.738573327Z" level=info msg="Forcibly stopping sandbox \"52c1e2cb647327fef74966b5ff40fa665c7dfab5ab87df1cfe2b9f72c1c82bfe\""
Jan 14 13:39:38.739023 containerd[1759]: time="2025-01-14T13:39:38.738634287Z" level=info msg="TearDown network for sandbox \"52c1e2cb647327fef74966b5ff40fa665c7dfab5ab87df1cfe2b9f72c1c82bfe\" successfully"
Jan 14 13:39:38.747372 containerd[1759]: time="2025-01-14T13:39:38.747336250Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"52c1e2cb647327fef74966b5ff40fa665c7dfab5ab87df1cfe2b9f72c1c82bfe\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.747542 containerd[1759]: time="2025-01-14T13:39:38.747489370Z" level=info msg="RemovePodSandbox \"52c1e2cb647327fef74966b5ff40fa665c7dfab5ab87df1cfe2b9f72c1c82bfe\" returns successfully"
Jan 14 13:39:38.747981 containerd[1759]: time="2025-01-14T13:39:38.747955251Z" level=info msg="StopPodSandbox for \"67b9a4dcac185723bb47072319b142a2f5a946493f9410b499f7a76e22329fb7\""
Jan 14 13:39:38.748077 containerd[1759]: time="2025-01-14T13:39:38.748055091Z" level=info msg="TearDown network for sandbox \"67b9a4dcac185723bb47072319b142a2f5a946493f9410b499f7a76e22329fb7\" successfully"
Jan 14 13:39:38.748077 containerd[1759]: time="2025-01-14T13:39:38.748073171Z" level=info msg="StopPodSandbox for \"67b9a4dcac185723bb47072319b142a2f5a946493f9410b499f7a76e22329fb7\" returns successfully"
Jan 14 13:39:38.748370 containerd[1759]: time="2025-01-14T13:39:38.748347611Z" level=info msg="RemovePodSandbox for \"67b9a4dcac185723bb47072319b142a2f5a946493f9410b499f7a76e22329fb7\""
Jan 14 13:39:38.748423 containerd[1759]: time="2025-01-14T13:39:38.748371211Z" level=info msg="Forcibly stopping sandbox \"67b9a4dcac185723bb47072319b142a2f5a946493f9410b499f7a76e22329fb7\""
Jan 14 13:39:38.748448 containerd[1759]: time="2025-01-14T13:39:38.748428691Z" level=info msg="TearDown network for sandbox \"67b9a4dcac185723bb47072319b142a2f5a946493f9410b499f7a76e22329fb7\" successfully"
Jan 14 13:39:38.761771 containerd[1759]: time="2025-01-14T13:39:38.761723496Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"67b9a4dcac185723bb47072319b142a2f5a946493f9410b499f7a76e22329fb7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.762367 containerd[1759]: time="2025-01-14T13:39:38.761804856Z" level=info msg="RemovePodSandbox \"67b9a4dcac185723bb47072319b142a2f5a946493f9410b499f7a76e22329fb7\" returns successfully"
Jan 14 13:39:38.762702 containerd[1759]: time="2025-01-14T13:39:38.762539976Z" level=info msg="StopPodSandbox for \"3b24aa1db19b744819395afc45a3494a664b194c26e93f32d75b2a8190e66598\""
Jan 14 13:39:38.762702 containerd[1759]: time="2025-01-14T13:39:38.762623976Z" level=info msg="TearDown network for sandbox \"3b24aa1db19b744819395afc45a3494a664b194c26e93f32d75b2a8190e66598\" successfully"
Jan 14 13:39:38.762702 containerd[1759]: time="2025-01-14T13:39:38.762633776Z" level=info msg="StopPodSandbox for \"3b24aa1db19b744819395afc45a3494a664b194c26e93f32d75b2a8190e66598\" returns successfully"
Jan 14 13:39:38.763347 containerd[1759]: time="2025-01-14T13:39:38.762871896Z" level=info msg="RemovePodSandbox for \"3b24aa1db19b744819395afc45a3494a664b194c26e93f32d75b2a8190e66598\""
Jan 14 13:39:38.763347 containerd[1759]: time="2025-01-14T13:39:38.762897376Z" level=info msg="Forcibly stopping sandbox \"3b24aa1db19b744819395afc45a3494a664b194c26e93f32d75b2a8190e66598\""
Jan 14 13:39:38.763347 containerd[1759]: time="2025-01-14T13:39:38.762951856Z" level=info msg="TearDown network for sandbox \"3b24aa1db19b744819395afc45a3494a664b194c26e93f32d75b2a8190e66598\" successfully"
Jan 14 13:39:38.774071 containerd[1759]: time="2025-01-14T13:39:38.774039021Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3b24aa1db19b744819395afc45a3494a664b194c26e93f32d75b2a8190e66598\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 14 13:39:38.774210 containerd[1759]: time="2025-01-14T13:39:38.774195821Z" level=info msg="RemovePodSandbox \"3b24aa1db19b744819395afc45a3494a664b194c26e93f32d75b2a8190e66598\" returns successfully"
Jan 14 13:40:03.942535 systemd[1]: Started sshd@7-10.200.20.15:22-10.200.16.10:52378.service - OpenSSH per-connection server daemon (10.200.16.10:52378).
Jan 14 13:40:04.436949 sshd[6688]: Accepted publickey for core from 10.200.16.10 port 52378 ssh2: RSA SHA256:AMUBWb04LkINjl6iymCQ58zI8KSkiZGdP88JbHPzCuU
Jan 14 13:40:04.438742 sshd-session[6688]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 13:40:04.442882 systemd-logind[1736]: New session 10 of user core.
Jan 14 13:40:04.447143 systemd[1]: Started session-10.scope - Session 10 of User core.
Jan 14 13:40:04.868493 sshd[6690]: Connection closed by 10.200.16.10 port 52378
Jan 14 13:40:04.869090 sshd-session[6688]: pam_unix(sshd:session): session closed for user core
Jan 14 13:40:04.874812 systemd-logind[1736]: Session 10 logged out. Waiting for processes to exit.
Jan 14 13:40:04.875503 systemd[1]: sshd@7-10.200.20.15:22-10.200.16.10:52378.service: Deactivated successfully.
Jan 14 13:40:04.880194 systemd[1]: session-10.scope: Deactivated successfully.
Jan 14 13:40:04.881254 systemd-logind[1736]: Removed session 10.
Jan 14 13:40:08.037908 systemd[1]: run-containerd-runc-k8s.io-354c2dab58c75eb832be4ddb417f03b648da284c7c07951f60ee20295200c07a-runc.Twzykg.mount: Deactivated successfully.
Jan 14 13:40:09.962240 systemd[1]: Started sshd@8-10.200.20.15:22-10.200.16.10:33602.service - OpenSSH per-connection server daemon (10.200.16.10:33602).
Jan 14 13:40:10.441600 sshd[6743]: Accepted publickey for core from 10.200.16.10 port 33602 ssh2: RSA SHA256:AMUBWb04LkINjl6iymCQ58zI8KSkiZGdP88JbHPzCuU
Jan 14 13:40:10.443089 sshd-session[6743]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 13:40:10.447119 systemd-logind[1736]: New session 11 of user core.
Jan 14 13:40:10.453136 systemd[1]: Started session-11.scope - Session 11 of User core.
Jan 14 13:40:10.865027 sshd[6745]: Connection closed by 10.200.16.10 port 33602
Jan 14 13:40:10.865524 sshd-session[6743]: pam_unix(sshd:session): session closed for user core
Jan 14 13:40:10.868688 systemd[1]: sshd@8-10.200.20.15:22-10.200.16.10:33602.service: Deactivated successfully.
Jan 14 13:40:10.868736 systemd-logind[1736]: Session 11 logged out. Waiting for processes to exit.
Jan 14 13:40:10.870401 systemd[1]: session-11.scope: Deactivated successfully.
Jan 14 13:40:10.871554 systemd-logind[1736]: Removed session 11.
Jan 14 13:40:15.958229 systemd[1]: Started sshd@9-10.200.20.15:22-10.200.16.10:45680.service - OpenSSH per-connection server daemon (10.200.16.10:45680).
Jan 14 13:40:16.438374 sshd[6765]: Accepted publickey for core from 10.200.16.10 port 45680 ssh2: RSA SHA256:AMUBWb04LkINjl6iymCQ58zI8KSkiZGdP88JbHPzCuU
Jan 14 13:40:16.439704 sshd-session[6765]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 13:40:16.444200 systemd-logind[1736]: New session 12 of user core.
Jan 14 13:40:16.452138 systemd[1]: Started session-12.scope - Session 12 of User core.
Jan 14 13:40:16.851830 sshd[6767]: Connection closed by 10.200.16.10 port 45680
Jan 14 13:40:16.852831 sshd-session[6765]: pam_unix(sshd:session): session closed for user core
Jan 14 13:40:16.856265 systemd-logind[1736]: Session 12 logged out. Waiting for processes to exit.
Jan 14 13:40:16.856887 systemd[1]: sshd@9-10.200.20.15:22-10.200.16.10:45680.service: Deactivated successfully.
Jan 14 13:40:16.858793 systemd[1]: session-12.scope: Deactivated successfully.
Jan 14 13:40:16.860199 systemd-logind[1736]: Removed session 12.
Jan 14 13:40:21.950035 systemd[1]: Started sshd@10-10.200.20.15:22-10.200.16.10:45686.service - OpenSSH per-connection server daemon (10.200.16.10:45686).
Jan 14 13:40:22.408574 sshd[6780]: Accepted publickey for core from 10.200.16.10 port 45686 ssh2: RSA SHA256:AMUBWb04LkINjl6iymCQ58zI8KSkiZGdP88JbHPzCuU
Jan 14 13:40:22.409839 sshd-session[6780]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 13:40:22.414246 systemd-logind[1736]: New session 13 of user core.
Jan 14 13:40:22.422151 systemd[1]: Started session-13.scope - Session 13 of User core.
Jan 14 13:40:22.811395 sshd[6782]: Connection closed by 10.200.16.10 port 45686
Jan 14 13:40:22.812229 sshd-session[6780]: pam_unix(sshd:session): session closed for user core
Jan 14 13:40:22.815949 systemd-logind[1736]: Session 13 logged out. Waiting for processes to exit.
Jan 14 13:40:22.816615 systemd[1]: sshd@10-10.200.20.15:22-10.200.16.10:45686.service: Deactivated successfully.
Jan 14 13:40:22.818922 systemd[1]: session-13.scope: Deactivated successfully.
Jan 14 13:40:22.820391 systemd-logind[1736]: Removed session 13.
Jan 14 13:40:22.899394 systemd[1]: Started sshd@11-10.200.20.15:22-10.200.16.10:45702.service - OpenSSH per-connection server daemon (10.200.16.10:45702).
Jan 14 13:40:23.348422 sshd[6794]: Accepted publickey for core from 10.200.16.10 port 45702 ssh2: RSA SHA256:AMUBWb04LkINjl6iymCQ58zI8KSkiZGdP88JbHPzCuU
Jan 14 13:40:23.349702 sshd-session[6794]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 13:40:23.353908 systemd-logind[1736]: New session 14 of user core.
Jan 14 13:40:23.357212 systemd[1]: Started session-14.scope - Session 14 of User core.
Jan 14 13:40:23.784179 sshd[6808]: Connection closed by 10.200.16.10 port 45702
Jan 14 13:40:23.784783 sshd-session[6794]: pam_unix(sshd:session): session closed for user core
Jan 14 13:40:23.788550 systemd[1]: sshd@11-10.200.20.15:22-10.200.16.10:45702.service: Deactivated successfully.
Jan 14 13:40:23.790790 systemd[1]: session-14.scope: Deactivated successfully.
Jan 14 13:40:23.792129 systemd-logind[1736]: Session 14 logged out. Waiting for processes to exit.
Jan 14 13:40:23.792940 systemd-logind[1736]: Removed session 14.
Jan 14 13:40:23.870242 systemd[1]: Started sshd@12-10.200.20.15:22-10.200.16.10:45716.service - OpenSSH per-connection server daemon (10.200.16.10:45716).
Jan 14 13:40:24.370054 sshd[6824]: Accepted publickey for core from 10.200.16.10 port 45716 ssh2: RSA SHA256:AMUBWb04LkINjl6iymCQ58zI8KSkiZGdP88JbHPzCuU
Jan 14 13:40:24.371402 sshd-session[6824]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 13:40:24.375107 systemd-logind[1736]: New session 15 of user core.
Jan 14 13:40:24.384115 systemd[1]: Started session-15.scope - Session 15 of User core.
Jan 14 13:40:24.791684 sshd[6826]: Connection closed by 10.200.16.10 port 45716
Jan 14 13:40:24.792394 sshd-session[6824]: pam_unix(sshd:session): session closed for user core
Jan 14 13:40:24.795397 systemd-logind[1736]: Session 15 logged out. Waiting for processes to exit.
Jan 14 13:40:24.796233 systemd[1]: sshd@12-10.200.20.15:22-10.200.16.10:45716.service: Deactivated successfully.
Jan 14 13:40:24.798445 systemd[1]: session-15.scope: Deactivated successfully.
Jan 14 13:40:24.799358 systemd-logind[1736]: Removed session 15.
Jan 14 13:40:29.873693 systemd[1]: Started sshd@13-10.200.20.15:22-10.200.16.10:47606.service - OpenSSH per-connection server daemon (10.200.16.10:47606).
Jan 14 13:40:30.326956 sshd[6842]: Accepted publickey for core from 10.200.16.10 port 47606 ssh2: RSA SHA256:AMUBWb04LkINjl6iymCQ58zI8KSkiZGdP88JbHPzCuU
Jan 14 13:40:30.328185 sshd-session[6842]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 13:40:30.331805 systemd-logind[1736]: New session 16 of user core.
Jan 14 13:40:30.336120 systemd[1]: Started session-16.scope - Session 16 of User core.
Jan 14 13:40:30.724752 sshd[6844]: Connection closed by 10.200.16.10 port 47606
Jan 14 13:40:30.725342 sshd-session[6842]: pam_unix(sshd:session): session closed for user core
Jan 14 13:40:30.728409 systemd[1]: sshd@13-10.200.20.15:22-10.200.16.10:47606.service: Deactivated successfully.
Jan 14 13:40:30.729903 systemd[1]: session-16.scope: Deactivated successfully.
Jan 14 13:40:30.731359 systemd-logind[1736]: Session 16 logged out. Waiting for processes to exit.
Jan 14 13:40:30.732320 systemd-logind[1736]: Removed session 16.
Jan 14 13:40:35.817131 systemd[1]: Started sshd@14-10.200.20.15:22-10.200.16.10:47620.service - OpenSSH per-connection server daemon (10.200.16.10:47620).
Jan 14 13:40:36.275166 sshd[6875]: Accepted publickey for core from 10.200.16.10 port 47620 ssh2: RSA SHA256:AMUBWb04LkINjl6iymCQ58zI8KSkiZGdP88JbHPzCuU
Jan 14 13:40:36.276901 sshd-session[6875]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 13:40:36.281367 systemd-logind[1736]: New session 17 of user core.
Jan 14 13:40:36.284126 systemd[1]: Started session-17.scope - Session 17 of User core.
Jan 14 13:40:36.675618 sshd[6877]: Connection closed by 10.200.16.10 port 47620
Jan 14 13:40:36.676190 sshd-session[6875]: pam_unix(sshd:session): session closed for user core
Jan 14 13:40:36.678907 systemd[1]: sshd@14-10.200.20.15:22-10.200.16.10:47620.service: Deactivated successfully.
Jan 14 13:40:36.681126 systemd[1]: session-17.scope: Deactivated successfully.
Jan 14 13:40:36.682743 systemd-logind[1736]: Session 17 logged out. Waiting for processes to exit.
Jan 14 13:40:36.683849 systemd-logind[1736]: Removed session 17.
Jan 14 13:40:36.765656 systemd[1]: Started sshd@15-10.200.20.15:22-10.200.16.10:49348.service - OpenSSH per-connection server daemon (10.200.16.10:49348).
Jan 14 13:40:37.249551 sshd[6887]: Accepted publickey for core from 10.200.16.10 port 49348 ssh2: RSA SHA256:AMUBWb04LkINjl6iymCQ58zI8KSkiZGdP88JbHPzCuU
Jan 14 13:40:37.250919 sshd-session[6887]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 13:40:37.255541 systemd-logind[1736]: New session 18 of user core.
Jan 14 13:40:37.259137 systemd[1]: Started session-18.scope - Session 18 of User core.
Jan 14 13:40:37.766843 sshd[6889]: Connection closed by 10.200.16.10 port 49348
Jan 14 13:40:37.767343 sshd-session[6887]: pam_unix(sshd:session): session closed for user core
Jan 14 13:40:37.770014 systemd-logind[1736]: Session 18 logged out. Waiting for processes to exit.
Jan 14 13:40:37.771793 systemd[1]: sshd@15-10.200.20.15:22-10.200.16.10:49348.service: Deactivated successfully.
Jan 14 13:40:37.773890 systemd[1]: session-18.scope: Deactivated successfully.
Jan 14 13:40:37.774972 systemd-logind[1736]: Removed session 18.
Jan 14 13:40:37.852885 systemd[1]: Started sshd@16-10.200.20.15:22-10.200.16.10:49362.service - OpenSSH per-connection server daemon (10.200.16.10:49362).
Jan 14 13:40:38.316893 sshd[6900]: Accepted publickey for core from 10.200.16.10 port 49362 ssh2: RSA SHA256:AMUBWb04LkINjl6iymCQ58zI8KSkiZGdP88JbHPzCuU
Jan 14 13:40:38.318680 sshd-session[6900]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 13:40:38.323092 systemd-logind[1736]: New session 19 of user core.
Jan 14 13:40:38.326153 systemd[1]: Started session-19.scope - Session 19 of User core.
Jan 14 13:40:40.312565 sshd[6902]: Connection closed by 10.200.16.10 port 49362
Jan 14 13:40:40.313249 sshd-session[6900]: pam_unix(sshd:session): session closed for user core
Jan 14 13:40:40.316109 systemd[1]: sshd@16-10.200.20.15:22-10.200.16.10:49362.service: Deactivated successfully.
Jan 14 13:40:40.317979 systemd[1]: session-19.scope: Deactivated successfully.
Jan 14 13:40:40.320400 systemd-logind[1736]: Session 19 logged out. Waiting for processes to exit.
Jan 14 13:40:40.321591 systemd-logind[1736]: Removed session 19.
Jan 14 13:40:40.406079 systemd[1]: Started sshd@17-10.200.20.15:22-10.200.16.10:49378.service - OpenSSH per-connection server daemon (10.200.16.10:49378).
Jan 14 13:40:40.894445 sshd[6938]: Accepted publickey for core from 10.200.16.10 port 49378 ssh2: RSA SHA256:AMUBWb04LkINjl6iymCQ58zI8KSkiZGdP88JbHPzCuU
Jan 14 13:40:40.895725 sshd-session[6938]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 13:40:40.899895 systemd-logind[1736]: New session 20 of user core.
Jan 14 13:40:40.904122 systemd[1]: Started session-20.scope - Session 20 of User core.
Jan 14 13:40:41.420112 sshd[6940]: Connection closed by 10.200.16.10 port 49378
Jan 14 13:40:41.420481 sshd-session[6938]: pam_unix(sshd:session): session closed for user core
Jan 14 13:40:41.424639 systemd-logind[1736]: Session 20 logged out. Waiting for processes to exit.
Jan 14 13:40:41.425203 systemd[1]: sshd@17-10.200.20.15:22-10.200.16.10:49378.service: Deactivated successfully.
Jan 14 13:40:41.428650 systemd[1]: session-20.scope: Deactivated successfully.
Jan 14 13:40:41.430227 systemd-logind[1736]: Removed session 20.
Jan 14 13:40:41.513246 systemd[1]: Started sshd@18-10.200.20.15:22-10.200.16.10:49394.service - OpenSSH per-connection server daemon (10.200.16.10:49394).
Jan 14 13:40:41.993391 sshd[6949]: Accepted publickey for core from 10.200.16.10 port 49394 ssh2: RSA SHA256:AMUBWb04LkINjl6iymCQ58zI8KSkiZGdP88JbHPzCuU
Jan 14 13:40:41.994671 sshd-session[6949]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 13:40:41.998693 systemd-logind[1736]: New session 21 of user core.
Jan 14 13:40:42.005123 systemd[1]: Started session-21.scope - Session 21 of User core.
Jan 14 13:40:42.408183 sshd[6951]: Connection closed by 10.200.16.10 port 49394
Jan 14 13:40:42.408767 sshd-session[6949]: pam_unix(sshd:session): session closed for user core
Jan 14 13:40:42.412041 systemd[1]: sshd@18-10.200.20.15:22-10.200.16.10:49394.service: Deactivated successfully.
Jan 14 13:40:42.414220 systemd[1]: session-21.scope: Deactivated successfully.
Jan 14 13:40:42.415260 systemd-logind[1736]: Session 21 logged out. Waiting for processes to exit.
Jan 14 13:40:42.416308 systemd-logind[1736]: Removed session 21.
Jan 14 13:40:47.499264 systemd[1]: Started sshd@19-10.200.20.15:22-10.200.16.10:58418.service - OpenSSH per-connection server daemon (10.200.16.10:58418).
Jan 14 13:40:47.981862 sshd[6967]: Accepted publickey for core from 10.200.16.10 port 58418 ssh2: RSA SHA256:AMUBWb04LkINjl6iymCQ58zI8KSkiZGdP88JbHPzCuU
Jan 14 13:40:47.983449 sshd-session[6967]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 13:40:47.992667 systemd-logind[1736]: New session 22 of user core.
Jan 14 13:40:47.996164 systemd[1]: Started session-22.scope - Session 22 of User core.
Jan 14 13:40:48.408036 sshd[6972]: Connection closed by 10.200.16.10 port 58418
Jan 14 13:40:48.408592 sshd-session[6967]: pam_unix(sshd:session): session closed for user core
Jan 14 13:40:48.411815 systemd[1]: sshd@19-10.200.20.15:22-10.200.16.10:58418.service: Deactivated successfully.
Jan 14 13:40:48.413610 systemd[1]: session-22.scope: Deactivated successfully.
Jan 14 13:40:48.414389 systemd-logind[1736]: Session 22 logged out. Waiting for processes to exit.
Jan 14 13:40:48.415611 systemd-logind[1736]: Removed session 22.
Jan 14 13:40:53.494763 systemd[1]: Started sshd@20-10.200.20.15:22-10.200.16.10:58432.service - OpenSSH per-connection server daemon (10.200.16.10:58432).
Jan 14 13:40:53.948981 sshd[6983]: Accepted publickey for core from 10.200.16.10 port 58432 ssh2: RSA SHA256:AMUBWb04LkINjl6iymCQ58zI8KSkiZGdP88JbHPzCuU
Jan 14 13:40:53.950300 sshd-session[6983]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 13:40:53.955225 systemd-logind[1736]: New session 23 of user core.
Jan 14 13:40:53.960152 systemd[1]: Started session-23.scope - Session 23 of User core.
Jan 14 13:40:54.345100 sshd[6987]: Connection closed by 10.200.16.10 port 58432
Jan 14 13:40:54.345601 sshd-session[6983]: pam_unix(sshd:session): session closed for user core
Jan 14 13:40:54.348874 systemd[1]: sshd@20-10.200.20.15:22-10.200.16.10:58432.service: Deactivated successfully.
Jan 14 13:40:54.351682 systemd[1]: session-23.scope: Deactivated successfully.
Jan 14 13:40:54.353162 systemd-logind[1736]: Session 23 logged out. Waiting for processes to exit.
Jan 14 13:40:54.354001 systemd-logind[1736]: Removed session 23.
Jan 14 13:40:59.438809 systemd[1]: Started sshd@21-10.200.20.15:22-10.200.16.10:36134.service - OpenSSH per-connection server daemon (10.200.16.10:36134).
Jan 14 13:40:59.923515 sshd[7002]: Accepted publickey for core from 10.200.16.10 port 36134 ssh2: RSA SHA256:AMUBWb04LkINjl6iymCQ58zI8KSkiZGdP88JbHPzCuU
Jan 14 13:40:59.924981 sshd-session[7002]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 13:40:59.929031 systemd-logind[1736]: New session 24 of user core.
Jan 14 13:40:59.934150 systemd[1]: Started session-24.scope - Session 24 of User core.
Jan 14 13:41:00.343549 sshd[7005]: Connection closed by 10.200.16.10 port 36134
Jan 14 13:41:00.344223 sshd-session[7002]: pam_unix(sshd:session): session closed for user core
Jan 14 13:41:00.347723 systemd[1]: sshd@21-10.200.20.15:22-10.200.16.10:36134.service: Deactivated successfully.
Jan 14 13:41:00.350115 systemd[1]: session-24.scope: Deactivated successfully.
Jan 14 13:41:00.351099 systemd-logind[1736]: Session 24 logged out. Waiting for processes to exit.
Jan 14 13:41:00.351972 systemd-logind[1736]: Removed session 24.
Jan 14 13:41:05.428862 systemd[1]: Started sshd@22-10.200.20.15:22-10.200.16.10:36146.service - OpenSSH per-connection server daemon (10.200.16.10:36146).
Jan 14 13:41:05.882714 sshd[7035]: Accepted publickey for core from 10.200.16.10 port 36146 ssh2: RSA SHA256:AMUBWb04LkINjl6iymCQ58zI8KSkiZGdP88JbHPzCuU
Jan 14 13:41:05.883980 sshd-session[7035]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 13:41:05.887760 systemd-logind[1736]: New session 25 of user core.
Jan 14 13:41:05.892143 systemd[1]: Started session-25.scope - Session 25 of User core.
Jan 14 13:41:06.282932 sshd[7037]: Connection closed by 10.200.16.10 port 36146
Jan 14 13:41:06.282419 sshd-session[7035]: pam_unix(sshd:session): session closed for user core
Jan 14 13:41:06.285233 systemd[1]: sshd@22-10.200.20.15:22-10.200.16.10:36146.service: Deactivated successfully.
Jan 14 13:41:06.287126 systemd[1]: session-25.scope: Deactivated successfully.
Jan 14 13:41:06.288748 systemd-logind[1736]: Session 25 logged out. Waiting for processes to exit.
Jan 14 13:41:06.289725 systemd-logind[1736]: Removed session 25.
Jan 14 13:41:11.379667 systemd[1]: Started sshd@23-10.200.20.15:22-10.200.16.10:47450.service - OpenSSH per-connection server daemon (10.200.16.10:47450).
Jan 14 13:41:11.836247 sshd[7091]: Accepted publickey for core from 10.200.16.10 port 47450 ssh2: RSA SHA256:AMUBWb04LkINjl6iymCQ58zI8KSkiZGdP88JbHPzCuU
Jan 14 13:41:11.837143 sshd-session[7091]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 13:41:11.841010 systemd-logind[1736]: New session 26 of user core.
Jan 14 13:41:11.846181 systemd[1]: Started session-26.scope - Session 26 of User core.
Jan 14 13:41:12.233069 sshd[7093]: Connection closed by 10.200.16.10 port 47450
Jan 14 13:41:12.233643 sshd-session[7091]: pam_unix(sshd:session): session closed for user core
Jan 14 13:41:12.236762 systemd[1]: sshd@23-10.200.20.15:22-10.200.16.10:47450.service: Deactivated successfully.
Jan 14 13:41:12.238957 systemd[1]: session-26.scope: Deactivated successfully.
Jan 14 13:41:12.240534 systemd-logind[1736]: Session 26 logged out. Waiting for processes to exit.
Jan 14 13:41:12.241563 systemd-logind[1736]: Removed session 26.
Jan 14 13:41:17.322693 systemd[1]: Started sshd@24-10.200.20.15:22-10.200.16.10:53144.service - OpenSSH per-connection server daemon (10.200.16.10:53144).
Jan 14 13:41:17.806535 sshd[7106]: Accepted publickey for core from 10.200.16.10 port 53144 ssh2: RSA SHA256:AMUBWb04LkINjl6iymCQ58zI8KSkiZGdP88JbHPzCuU
Jan 14 13:41:17.807817 sshd-session[7106]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 13:41:17.811646 systemd-logind[1736]: New session 27 of user core.
Jan 14 13:41:17.818134 systemd[1]: Started session-27.scope - Session 27 of User core.
Jan 14 13:41:18.219059 sshd[7108]: Connection closed by 10.200.16.10 port 53144
Jan 14 13:41:18.219907 sshd-session[7106]: pam_unix(sshd:session): session closed for user core
Jan 14 13:41:18.224445 systemd[1]: sshd@24-10.200.20.15:22-10.200.16.10:53144.service: Deactivated successfully.
Jan 14 13:41:18.226370 systemd[1]: session-27.scope: Deactivated successfully.
Jan 14 13:41:18.227226 systemd-logind[1736]: Session 27 logged out. Waiting for processes to exit.
Jan 14 13:41:18.228242 systemd-logind[1736]: Removed session 27.