Signal Processing

  • Recollections from Bernard Widrow’s Classes


    Bernard Widrow's approach to teaching neural networks and signal processing at Stanford in the early 2000s was remarkably ahead of its time, presenting neural networks as practical engineering systems rather than speculative concepts. His classes covered learning rules, stability, and hardware constraints, and he often demonstrated how concepts like reinforcement learning and adaptive filtering were already being implemented long before they became trendy. Widrow emphasized real-world applications, sharing anecdotes such as the neural network hardware prototype he carried with him, to underscore the necessity of treating learning systems as tangible engineering artifacts. His professional courtesy and engineering-oriented mindset left a lasting impression, showing that many ideas considered new today were already being explored and treated as practical challenges decades ago. This matters because it underscores foundational work in neural networks that continues to influence modern advances in the field.

    Read Full Article: Recollections from Bernard Widrow’s Classes

  • Innovative Solutions to GPS Jamming Vulnerabilities


    GPS systems are increasingly vulnerable to jamming, prompting companies like TrustPoint and Xona Space Systems to develop new satellite technologies. TrustPoint aims to deploy small satellites closer to Earth that transmit higher-frequency, encrypted signals, reducing the effectiveness of jamming devices. Xona Space Systems plans to offer signals 100 times stronger than current GPS, providing two-centimeter precision and incorporating a watermark for added security. These advances could significantly improve GPS reliability and security, which is crucial for both military and civilian applications.

    Read Full Article: Innovative Solutions to GPS Jamming Vulnerabilities

  • Distributed FFT in TensorFlow v2


    The recent integration of the Distributed Fast Fourier Transform (FFT) in TensorFlow v2, through the DTensor API, allows efficient computation of Fourier transforms on datasets that exceed the memory capacity of a single device. This is particularly beneficial for image-like datasets, enabling synchronous distributed computing and improving throughput by utilizing multiple devices. The implementation retains the original FFT API interface, requiring only a sharded tensor as input, and demonstrates significant data-processing capability, albeit with some tradeoffs in speed due to communication overhead. Future improvements are anticipated, including algorithm optimization and communication tweaks. This matters because it enables more efficient processing of large-scale data in machine learning applications, expanding the capabilities of TensorFlow.
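    The article describes the DTensor implementation only at the API level, but the communication overhead it mentions comes from the classic row-column decomposition of a 2-D FFT: each shard transforms its local rows, the data is exchanged so shards hold full columns, and a second pass transforms the columns. Below is a minimal NumPy sketch of that idea, with shards simulated as list entries and the regrouping step standing in for the all-to-all exchange between devices; the name `distributed_fft2d` is invented for this illustration and is not part of the TensorFlow API.

```python
import numpy as np

def distributed_fft2d(x, n_shards):
    """Illustrative row-column 2-D FFT with simulated shards."""
    # Shard the matrix by rows across "devices" (simulated as list entries).
    shards = np.array_split(x, n_shards, axis=0)
    # Stage 1: each shard computes 1-D FFTs along its local rows.
    shards = [np.fft.fft(s, axis=1) for s in shards]
    # Communication step: regroup so each shard holds complete columns.
    # On real hardware this resharding is the all-to-all exchange that
    # accounts for the communication overhead mentioned above.
    stage1 = np.concatenate(shards, axis=0)
    col_shards = np.array_split(stage1, n_shards, axis=1)
    # Stage 2: each shard computes 1-D FFTs along its local columns.
    col_shards = [np.fft.fft(s, axis=0) for s in col_shards]
    return np.concatenate(col_shards, axis=1)

# Sanity check against the single-device reference implementation.
x = np.random.rand(8, 8)
assert np.allclose(distributed_fft2d(x, 4), np.fft.fft2(x))
```

    Because the 2-D DFT is separable, the two 1-D passes reproduce the single-device result exactly; only the resharding between passes changes, which is why the distributed version can keep the original FFT API surface while paying a communication cost.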

    Read Full Article: Distributed FFT in TensorFlow v2