//! help debug fibheap code in particular because additional checks are performed under `cfg(test)`
//! that will `abort` if the heap breaks (e.g. because a duplicate node was added, a node was removed
//! Asymptotic time complexity is not the whole story. Fibonacci heaps and pairing heaps are linked data
//! are not consistent and so generally pointers are used, which can introduce overhead in several
//! its parent, its first child, its next sibling, and its previous sibling. For the subheap roots,
//! Each node also stores its degree (number of children) and whether or not it has had a child removed.
//! The heap itself stores a pointer to the minimal root, and the number of elements in the heap.
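//!
//! As a rough sketch (the node field names match the ones used elsewhere in this module, but the type
//! names, cell types, and lifetimes here are illustrative assumptions rather than the real definitions),
//! the layout looks something like this:
//!
//! ```
//! use std::cell::{Cell, UnsafeCell};
//!
//! struct RawNode<'a> {
//!     parent: UnsafeCell<Option<&'a RawNode<'a>>>,      // None for subheap roots
//!     first_child: UnsafeCell<Option<&'a RawNode<'a>>>, // None for leaves
//!     prev: UnsafeCell<Option<&'a RawNode<'a>>>,        // previous sibling
//!     next: UnsafeCell<Option<&'a RawNode<'a>>>,        // next sibling
//!     degree: Cell<usize>,                              // number of children
//!     has_split: Cell<bool>,                            // has this node lost a child?
//! }
//!
//! struct FibHeap<'a> {
//!     min_root: UnsafeCell<Option<&'a RawNode<'a>>>, // pointer to the minimal subheap root
//!     count: Cell<usize>,                            // number of elements in the heap
//! }
//! ```
//!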
//! subheap whose root has k children is at least F_(k+2), where F_k is the kth Fibonacci number,
//! Each such node goes into the index in the array corresponding to its degree. If there is already a subheap there,
//! we instead remove that subheap, merge it with the current one, and keep going. When we merge the subheaps,
//! the one with the larger root is added as a child of the one with the smaller root, so the degree of the latter goes
//! up and then we try to put it in the next corresponding index in the array, repeating until we reach a degree that
//! We re-link these into a circular doubly linked list, and update the minimal root pointer to point to the minimal one.
//! If all subheaps obey the `size >= F_(degree + 2)` invariant before `extract-min`, they will also obey it after.
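//!
//! The merging pass can be illustrated with a simplified, self-contained sketch (a `Vec` of children
//! stands in for the crate's intrusive sibling links; every name here is hypothetical):
//!
//! ```
//! struct Tree { key: u32, children: Vec<Tree> }
//!
//! // The subheap with the larger root becomes a child of the other, raising the winner's degree by one.
//! fn merge(mut a: Tree, mut b: Tree) -> Tree {
//!     if b.key < a.key { std::mem::swap(&mut a, &mut b); }
//!     a.children.push(b);
//!     a
//! }
//!
//! // Merge subheaps by degree until every remaining root has a unique degree.
//! fn consolidate(roots: Vec<Tree>) -> Vec<Tree> {
//!     let mut by_degree: Vec<Option<Tree>> = Vec::new();
//!     for mut root in roots {
//!         loop {
//!             let d = root.children.len();
//!             if by_degree.len() <= d { by_degree.resize_with(d + 1, || None); }
//!             match by_degree[d].take() {
//!                 // Slot already occupied: merge the two and retry at the next degree.
//!                 Some(other) => root = merge(root, other),
//!                 // Unique degree found: park the subheap here.
//!                 None => { by_degree[d] = Some(root); break; }
//!             }
//!         }
//!     }
//!     by_degree.into_iter().flatten().collect()
//! }
//!
//! let singletons = (1..=3).map(|key| Tree { key, children: Vec::new() }).collect();
//! let roots = consolidate(singletons);
//! // Three degree-0 subheaps collapse into one root of degree 1 and one of degree 0.
//! assert_eq!(roots.len(), 2);
//! ```
//!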
//! If there are two or fewer nodes in the heap before we extract-min, we can optimize by not merging subheaps and just
//! When decrease-key is called, if the key is still greater than or equal to the parent's key, or the node is
//! a root node whose key is still greater than or equal to the minimal root's key, or the node is itself the minimal root,
//! there is nothing to be done. If the node is a root node but not the minimal root and the key becomes
//! Finally, if the key is now less than the parent's key, remove the node from its sibling list and from its parent's child list.
//! Add it as a new subheap. If the parent was not already marked as having had a child removed, mark it as such;
//! if it was already marked, remove it as well and repeat this check on its parent, and so on up the tree. However, root nodes need not be marked as such.
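//!
//! A minimal sketch of that cut logic, assuming a simplified flat node table indexed by `usize`
//! instead of the crate's pointer links (every name here is hypothetical):
//!
//! ```
//! struct Node { parent: Option<usize>, marked: bool }
//!
//! // Cut node `i` from its parent and make it a root; then walk up the ancestors,
//! // cutting every already-marked parent and stopping at the first unmarked one.
//! fn cut(nodes: &mut Vec<Node>, roots: &mut Vec<usize>, i: usize) {
//!     let mut current = Some(i);
//!     while let Some(c) = current {
//!         let parent = nodes[c].parent.take();
//!         nodes[c].marked = false; // a freshly cut node becomes an unmarked root
//!         roots.push(c);
//!         current = match parent {
//!             // The parent had already lost a child: cut it as well.
//!             Some(p) if nodes[p].marked => Some(p),
//!             // Otherwise mark it, unless it is itself a root, and stop.
//!             Some(p) => { if nodes[p].parent.is_some() { nodes[p].marked = true; } None }
//!             None => None,
//!         };
//!     }
//! }
//!
//! let mut nodes = vec![
//!     Node { parent: None, marked: false },    // 0: a subheap root
//!     Node { parent: Some(0), marked: true },  // 1: already lost a child earlier
//!     Node { parent: Some(1), marked: false }, // 2: the node whose key just decreased
//! ];
//! let mut roots = vec![0];
//! cut(&mut nodes, &mut roots, 2);
//! // 2 is cut, its marked parent 1 is cut as well, and the root 0 stays put and unmarked.
//! assert_eq!(roots, vec![0, 2, 1]);
//! assert!(!nodes[0].marked);
//! ```
//!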
//! Insert and meld just require, respectively, adding a new one-element subheap and stitching together the subheap root linked lists,
//! A very popular priority queue crate. This crate implements priority queues using a hash table to allow key-indexed node
//! access, and builds the priority queue on top of its internal hash table. It has a good API and is a good choice most of
//! the time. Crater has some advantages over this crate: our fib heap allows building priority queues with any backing
//! store, e.g. KD trees, vectors, just putting each node in `Box`, etc.; and references to nodes have generous lifetimes,
//! so storing references instead of keys is possible and makes lookup as fast as possible. However, unless you need these
/// Get a reference to the embedded raw node. This is used internally to traverse the heap and do its bookkeeping.
Self{prev: None.into(), next: None.into(), first_child: None.into(), parent: None.into(), has_split: false.into(), degree: 0.into()}
_ => unsafe { // the roots both have degree greater than one, so res has a distinct existing first and last child
for child in (*unsafe { node.get_raw().first_child.get().as_ref().unwrap() }).into_iter().flat_map(iter_siblings) {
} else if !unsafe { *child_links.prev.get() }.is_some_and(|p|ptr::eq(p, prev.unwrap())) {
// we don't need to check "prev".next because that's the termination condition of `iter_siblings`
/// - The number of elements extracted from the heap is large or proportional to the number inserted, rather than small and fixed
/// - The keys/priorities of elements don't need to be increased frequently (and the comparison function doesn't need to change)
/// Create a new Fibonacci heap whose backing storage will live at least as long as `_container`.
/// Conceptually, only the lifetime of `_container` matters: by passing it in here and holding a `PhantomData`
/// reference with the same lifetime, we force the shared reference to the container to outlive the
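///
/// A minimal sketch of that borrowing trick in isolation, with hypothetical names (not this crate's
/// actual signatures):
///
/// ```
/// use std::marker::PhantomData;
///
/// // A stand-in heap type whose lifetime parameter is tied to the backing storage.
/// struct Heap<'a> {
///     _container: PhantomData<&'a ()>,
/// }
///
/// fn new<'a, C>(_container: &'a C) -> Heap<'a> {
///     Heap { _container: PhantomData }
/// }
///
/// let storage = vec![1u32, 2, 3];
/// let heap = new(&storage);
/// // While `heap` is alive, `storage` cannot be dropped, moved, or mutably borrowed:
/// // drop(storage); // uncommenting this line would fail to compile
/// let _ = &heap;
/// ```
///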
/// Internally, [`RawNode`] uses [`UnsafeCell`] so that nodes can be mutated through a shared reference.
/// Otherwise, correctly mutating nodes is excessively complicated (it requires using raw pointers throughout,
/// and requires the backing storage to have some way to get a mutable subreference that doesn't overlap with
/// any node so we can assume it is effectively pinned; most of the time this would be a mutable reference to
/// if doing so could cause it to no longer be minimal (the key probably lives behind an [`UnsafeCell`])
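///
/// A tiny illustration of mutating through a shared reference via [`UnsafeCell`] (the `Link` type is
/// hypothetical, not part of this crate):
///
/// ```
/// use std::cell::UnsafeCell;
///
/// struct Link { next: UnsafeCell<Option<usize>> }
///
/// let link = Link { next: UnsafeCell::new(None) };
/// let shared: &Link = &link;
/// // SAFETY: nothing else reads or writes the cell's contents during these accesses.
/// unsafe { *shared.next.get() = Some(7); }
/// assert_eq!(unsafe { *shared.next.get() }, Some(7));
/// ```
///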
// The size >= F_(degree + 2) invariant bounds every root's degree by log_phi(count): compute the ceiling of the
// base 2 log of self.count, multiply by the reciprocal of the base 2 log of the golden ratio, and round up
let max_degree = ((((self.count - 1).ilog2() + 1) as f64)*1.4404200904125567).ceil() as usize;
// iterate over all roots besides min_root (other_roots.flat_map(RawNode::unlinking_siblings)),
// unlinking_siblings/children are "destructive" and will automatically remove all sibling links.
loop { // repeatedly try to insert the root into the array of roots, merging it with any existing root of the same degree until it has a unique degree
// we don't need to check "prev".next because that's the termination condition of `iter_siblings`
let node = Box::new(GenNode{multiple: (n*n).into(), prime: n.into(), _node: Default::default()});
unsafe { *self.min_dist.get() + self.h }.cmp(&unsafe { *other.min_dist.get() + other.h })
grid.extend(l.split(',').map(|t|GridNode{cost: t.parse().unwrap(), h: 0, min_dist: 0.into(), visited: false.into(), _node: Default::default()}));