
Search Results


  • Movie | H Peter Alesso

    Henry Gallant Movie. Movie: Midshipman Henry Gallant in Space. Movie with subtitles: Midshipman Henry Gallant in Space. Podcast: Midshipman Henry Gallant in Space.

  • Home | H. Peter Alesso science fiction author

    H. Peter Alesso Portfolio: Past, Present, and Future. "Oh, why is love so complicated?" asked Henry. Alaina said, "It's not so complicated. You just have to love the other person more than yourself." Not everyone who fights is a warrior. A warrior knows what's worth fighting for.

  • New | H Peter Alesso

    New Release Sometimes, the right man in the wrong uniform can make all the difference. Ethan, a lowly recruit with an oil-stained uniform and a spirit worn down by disappointment, finds his life forever changed by a twist of fate. Squinting at his reflection, he sees that the sleeves of his borrowed jacket bear captain’s stripes. A grotesque emblem is embossed over the jacket's breast pocket—a roaring lion's head surrounded by a cluster of jagged broken bones—the symbol of the Special Operations Service. There is no way out. The ship is taking off because they think an elite SOS captain is on board to take command—him. His choices are brutally simple . . . act like the officer everyone thinks he is or be found out as a fraud. One is survival, the other . . . The consequences send a wave of panic through him. He is a mouse in a lion's skin. He has to become that lion until he finds a way out of his cage. Ethan's path intersects with Kate Haliday, the leader of the dark matter project in the Cygni star system. A subtle dance of glances and half-spoken truths begins. But the threads of connection are fragile, tangled as they are with the ambitions of Commander Varek, a skeptical officer. The emergence of an unknown alien race casts a long shadow that shifts the cosmic chessboard of a space fleet and a galactic empire. Their interest in dark matter and Earth's colonies weaves a layer of mystery and suspense. In this hard science fiction dance, Ethan must navigate the intricacies of love, rivalry, and alien invasion. The possibility of being unveiled darkens his every step. With each move, the line between the man he is and the officer he pretends to be . . . blurs. Once a misfit dropout, Mike now controls the fate of man versus machine. In a world where the boundaries between man and machine blur, your thoughts, emotions, and yearnings are no longer private. The confluence of biotech and infotech has given birth to the Algorithm—a force that predicts your every move and has the power to shape your deepest desires. But when the Algorithm starts undermining human worth, many find themselves obsolete. Grappling with their waning relevance, they find solace in a new realm. They master the skills of a surreal virtual world that requires neither gravity nor light. As technology's grip tightens, a haunting question emerges: Does anyone hold the reins of the omnipotent Algorithm? Enter an unlikely hero—an aimless dropout who unwittingly finds himself at the nexus of power. Tall and lean, Mike has deep-set blue eyes that often reflect his internal conflicts and dilemmas. His past is riddled with disappointment and insecurity. When he assumes another student’s ID in a crowded exam room, Mike's journey takes an unexpected turn as a stern figure declares, "I am Jacob Winters. Welcome to the AI career placement test. Today, we will discover which of you represents the pinnacle of human genius." Delve into Keeper of the Algorithm to discover a future where destiny is written in code and domination is the ultimate prize. For serious AI enthusiasts only!

  • About | H Peter Alesso

    My Story I love words, but that wasn't always the case. I grew up with a talent for numbers, which led me to follow a different path. I went to Annapolis and MIT and became a nuclear physicist at Lawrence Livermore National Laboratory. Only after retiring was my desire to tell stories reawakened. In recent years, I have immersed myself in the world of words, drawing on my scientific knowledge and personal experience to shape my writing. As a scientist, I explored physics and technology, which enabled me to create informative and insightful books, sharing my knowledge with readers who sought to expand their understanding in these areas—contributing to their intellectual growth while satisfying my own passion. But it was my time as a naval officer that genuinely ignited my imagination and propelled me into science fiction. After graduating from the United States Naval Academy and serving on nuclear submarines during both hot and cold wars, I witnessed firsthand the complexities and challenges of military operations that seamen face daily. This gave me a unique perspective, which I channeled into creating Henry Gallant and a 22nd-century world where a space officer fought against invading aliens. Through this narrative, I explored the depths of human resilience, the mysteries of space, and the intricacies of military conflict. My stories let me share the highlights of my journey with you. I hope you enjoy the ride.

  • Youngblood | H Peter Alesso

    Youngblood Death’s Dream Kingdom* “I can’t breathe!” Youngblood’s lungs strained to inhale the last of the thin air but drew in only an empty breath. His heart pounded against his hollow lungs. His fingers stretched wide and then clenched. The last faint echoes of whining machinery died away as the fading glow of emergency lights sputtered out. He opened his eyes wide, trying to make sense of the dark confined space. Where am I? Lying on his back in pitch black, he reached up and touched a smooth encapsulating surface without seams or latches. A coffin? Banging against the case, he let out a raspy cry, “Helppp.” He scratched with his fingers until they bled; he smashed with his fists until they bruised. “AUUGGH!” His face was a bulging purple mask with a protruding red tongue. Gory hands wiped away oozing goo dripping from his nose. Each passing second was a countdown toward imploding lungs. Coma and death were fast approaching—causing a spasm of raw cold fear—a deep primal terror like the first great scare a child experienced when his nightmare turned ‘real’ and the claws of a hideous monster squeezed his throat. “Air! I’ve got to have air,” he begged in a whisper. He balled his fist and punched it. He kicked. Again, and again. CRACK! A small fissure created a loud hiss as air trickled into the confined space. Finally able to inhale, Youngblood felt his chest rise and fall with each precious breath. The visceral threat of suffocation lessened, but the injuries continued to throb. He pulled out some of the needles that feathered his body. It’s a hibernation chamber. They were supposed to revive him when they found a cure. Did they find a cure? For a moment, he opened his mouth and raised his eyebrows, then . . . Stupid! Stupid! This wasn’t a normal recovery. There were countless things wrong with this. Was this a random system failure? Someone should have been monitoring. His brain screamed. Why isn’t there an alarm and an attendant? They might all be dead. He tried to break out of the case. The straining structure moaned as he pressed against it, but his debilitated joints and impaired muscles lacked the strength to free him. As he continued to welcome the incessant wheezing of cool air filtering into the cocoon, he waited, but no one came. The only sounds of activity were the sparking of electrical wires far off in the distance. With stiff fingers, he massaged his sore arms and legs, but all his efforts to break out of the coffin-like container failed. He dug his fingernails into his palms to escape the deadness that gnawed inside him. Dark destructive thoughts flooded in. The future that was supposed to augur health had turned into a nightmare. His mind stretched back to something very painful, a filament-thin memory: I was eighteen when the debilitating effects of the illness began. Father said hibernation was the only solution. I trusted him. Lies! All his words were lies . . . he just wanted to be rid of me. He lingered on the ghost-like memory as it waxed and waned; it felt like the end of hope. At that moment, everything came crashing down inside of him. The chamber grew claustrophobic. I’m done. He closed his eyes. It’s over. He let the minutes pass, hoping the pain would end, wishing he would end . . . but neither did. Finally, he opened his eyes and took a deep breath. Focusing his thoughts, he pushed back against the black despair. No! I won’t give up. I’m going to survive . . . somehow.
​ But it was undeniable, he needed immediate medical aid, or this chamber would become a coffin. He screamed when he yanked out the rest of the needles. He cast away the trailing tubes that had sustained his life for . . . ​ How long? ​ No way of telling. ​ You can do this. You must. ​ He pressed against the case once more. It creaked and groaned like a living thing and it took many more tries before, at last, it broke open enough to allow him to squeeze out. Leaning over the edge, he shifted his weight to let it carry him over the side. Hitting the floor with a thump, he began to crawl. It took an hour to reach the wall a mere twenty yards away. There were other chambers along the way, but none appeared operational. ​ “Is anyone there?” he called out—repeatedly. ​ There was never a reply. ​ In a moment of raw honesty, he understood: ​ No one else escaped. ​ Moving along the wall until he reached a door, he wobbled to his feet and managed to stand and press a button. It slid open. ​ He tried to walk, but his legs were unwilling. Leaning against the wall, he let his body slide down to the floor like a sack of sand. ​ There was a dim glow of light at the end of the hall, but crawling took an interminable effort. The light was coming from a control console inside a small room. The dark surroundings offered little information about the devices inside. Exhausted, he hoisted himself into a chair and listened to the rhythmic sound of blood drops smacking onto the floor like a drum beating Taps. ​ As his face blanched, he trembled with dizziness and nausea, his tunnel vision narrowed, the room blackened and spun . . . and then . . . nothingness . . . ​ *** Youngblood woke in a cold sweat. His head throbbed but the room had stopped spinning. He was still sitting at the dysfunctional control panel though only a few dials remained lit. As a computer science major, he thought he should make sense of them, but they were as foreign as a Gödel puzzle. Damn! ​ His fist smashed into the console. ​ HUMMM. ​ He heard the distinct sound but couldn’t pinpoint its location. ​ With every sense alert, he sat waiting . . . ​ “Hello? Is anyone there?” ​ The cold dark concrete walls and poured concrete floor echoed his words but offered no response. ​ There were a couple of doors further down the hall. ​ He stood up. Simply stretching his body took all his effort. His entire body hurt, but slowly he managed to shuffle toward the doors. ​ The first door was locked. ​ The next one was too. ​ He twisted around a corner, but a misstep caused him to fight his own momentum to forestall crashing headfirst into the unyielding wall. The impact to his shoulder knocked him back and whirled him around. Reaching out, he grasped a handle and yanked it to steady himself. ​ The door opened. ​ A storage closet? ​ Catching his breath, he strained his eyes against the dark shadows to identify several large cardboard boxes and a few wooden crates. The largest box was next to a cabinet with symbols he didn’t recognize. Yet, a Red Cross sign was visible on the furthest crate. He stretched his hands toward the old dirt covered wooden crate and pried open the thick heavy lid with his fingers. ​ “Argh.” ​ The cry of anguish was from his own mouth. He placed his suffering hands under his armpits and squeezed until the strained fingers returned to normal. ​ After several minutes, he pulled the lid away and let dirt rattle down into the container. He reached inside to grab a medical package. ​ Thanks. 
He used the meager emergency rations to stop the bleeding and applied analgesic wherever he could reach. The medication flowed through his veins, stifling the shock and blood loss. He started to relax, but his parched throat cried . . . water. He was unable to make out the markings on the other boxes, but he opened the nearest one and groped inside for something familiar. No. Next. No. The last cardboard box . . . Yes. A bottle of water. Taking great gulps, he guzzled what seemed a treasure from an extinct world. He looked for more. There was only one. He tackled another wooden crate. Inside was a flashlight, but it didn’t work. There were batteries on the bottom, but they leaked acid goop. Yet, a few seemed OK. He tried them and felt like a rich man when the flashlight lit and offered the first real peek at his surroundings. There was a nearby room with more defunct hibernation chambers. Another room had medical equipment for reviving patients. But there were no windows anywhere. It’s a bunker. But why put hibernation chambers underground? Putting the puzzle aside, he dug deeper into the storage containers. There were useful items: a butane lighter, a compass, nylon line, a hatchet, a shovel, a hacksaw, and lots of basic tools for repairing electrical and computer equipment. He found a pair of workman’s coveralls hanging from a hook on the closet wall and a pair of large black boots. These will come in handy. He moved on and found another closet full of boxes. These were sealed with a plastic wrap, but there was no auxiliary power system visible. There’s got to be a communication device somewhere. He returned to the console and found a diagram framed on the nearby wall. It appeared to be a network of underground tunnels connecting bunkers. An annotated alphanumerical system designated this bunker HB11. Several others had similar designations, but there were also two unique identifiers, YO and SP. The HB might be for hibernation, but he had no clue what YO or SP might represent, nor could he guess how to access the network of tunnels. As he stared at the tunnel map, memorizing the layout, he imagined all the places he could travel to and the places he might visit. What kind of facilities were at each stop along the way? Maybe someday he would find out. He examined some instruments on the console which still had power. The computer system seemed functional, but there wasn’t any written material or operable viewscreen that could offer him instructions. The instruments were as complicated as a spaceship’s, and when he attempted to patch into the AI system, he heard a noise. He held his breath. Is someone coming? A humming sound continued from a device in the next room. He must have activated it with his random actions. The machine was marked with English letters and symbols that indicated it was a medical treatment apparatus. He could read several tags on the valves and dials and guessed it was a rejuvenation machine. It took several minutes to surmise how it would work. I’ve got to try this. He climbed naked into the rejuvenation tub and opened a faucet. Synth-fluid and hot medicated elixirs filled the vat. Setting the timer for two hours, he lingered while the potions treated his many superficial ailments. A shame this can’t cure my disease. He relaxed during the treatment while it invigorated his frail body. What’s next? He dragged his body about and toured the bunker, returning to the original room.
The flashlight shone on dozens of forsaken hibernation chambers. There was a twin room across the hall, but it too had become a graveyard. A tragedy. I should commemorate them . . . later. He considered revisiting the closets and the locked doors, but that could wait. He looked for more water. No water. The next decision would be critical for his survival. He wanted to use the rejuvenation machine while he restored the power and computer systems, but how long would that take? He had no food or water. Besides, even though he had been a computer whiz, this equipment was far beyond his expertise. Questions exploded in his head like a string of fireworks. Should I stay here? Should I go exploring? Running his fingers through his long shaggy hair, he concluded there was no choice. Putting on the coveralls and boots, he stuffed computer instruments and tools into his cargo pockets. He filled a backpack with survival items, and even though it was heavy, he was chafing to get started. Where are the people? *** “I’ve never seen a sky like this,” said Youngblood as he climbed out of the bunker’s hatch. His eyes took a moment to adjust to the bright sun peeking between a few windswept clouds. “Noon,” he mused and let the hatch drop down. Birds flew overhead, and a glimpse of motion alerted him to a nearby rodent, but there were no roads or worn paths visible. “There’s life, but where are the people?” Originally, he had been placed in hibernation in California’s Stanford Hospital. He had no idea where he was now, but it stood to reason he shouldn’t be too far away. He surveyed the landscape around him. To the east, he saw a prominent hill rising about a mile away. It was surrounded by scrub brush poking out between a few scattered pine trees. “Hmm . . . a good vantage point.” He swung around and noticed similar regions to the north and south. But things were far different to the west. Where the sky kissed the horizon, blue turned into a mosaic of red, brown, and purple swirls, and the silhouette of a city’s bare-bones skeleton rose in the distance like a faraway mirage. An acidic stench of smoke and ash invaded his nostrils, forcing him to cover his mouth to suppress a spasmodic cough. A brownish-yellow haze floated on the hot, dry air, and dark soot settled on his coveralls. His mouth could barely speak the words, “I can’t believe they actually did it.” A single tear ran down his cheek as he brushed the barren residual ash off his clothes. Anyway, it’s better to die in a flash . . . than suffocate. He licked his lips and swallowed to relieve his parched throat. His fear . . . Where are the people? Became . . . Are there people? Swinging the backpack over his shoulder, he faced east and started forward. A slight cooling breeze sent him on his way as he marched toward the hill. He walked only a few hundred yards before he had to stop and rest. The process repeated itself until his muscles cramped and screamed. He wiped the perspiration off his forehead with his sleeve as he passed areas of dead trees and fallen branches. He remained alert for a flash of color, or movement, or any sign of smoke.

  • Rear Admiral Henry Gallant | H Peter Alesso

    Rear Admiral Henry Gallant Chapter 1: Far Away Captain Henry Gallant was still far away, but he could already make out the bright blue marble of Earth floating in the black velvet ocean of space. His day was flat and dreary. Since entering the solar system, he had been unable to sleep. Instead, he found himself wandering around the bridge like a marble rattling in a jar. His mind had seemingly abandoned his body to meander on its own, leaving his empty shell to limp through his routine. He hoped tomorrow would bring something better. I’ll be home soon, he thought. A welcoming image of Alaina flashed into his mind, but it was instantly shattered by the memory of their last bitter argument. The quarrel had occurred the day he was deployed to the Ross star system and had haunted him throughout the mission. Now that incident loomed like a glaring threat to his homecoming. As he stared at the main viewscreen of the Constellation, he listened to the bridge crew’s chatter. “The sensor sweep is clear, sir,” reported an operator. Gallant was tempted to put a finger to his lips and hiss, “shh,” so he could resume his brooding silence. But that would be unfair to his crew. They were as exhausted and drained from the long, demanding deployment as he was. They deserved better. He plopped down into his command chair and said, “Coffee.” The auto-server delivered a steaming cup to the armrest portal. After a few gulps, the coffee woke him from his zombie state. He checked the condition of his ship on a viewscreen. The Constellation was among the largest machines ever built by human beings. She was the queen of the task force, and her crew appreciated her sheer size and strength. She carried them through space with breathtaking majesty, possessing power and might and stealth that established her as the quintessential pride of human ingenuity. They knew every centimeter of her from the forward viewport to the aft exhaust port. Her dull grey titanium hull didn’t glitter or sparkle, but every craggy plate on her exterior was tingling with lethal purpose. She could fly conventionally at a blistering three-tenths the speed of light between planets. And between stars, she warped faster than the speed of light. Even now, returning from the Ross star system with her depleted starfighters, battle damage, and exhausted crew, she could face any enemy by spitting out starfighters, missiles, lasers, and plasma death. After a moment, he switched the readout to scan the other ships in the task force. Without taking special notice, he considered the material state of one ship after another. Several were in a sorrowful, dysfunctional condition, begging for a dockyard’s attention. He congratulated himself for having prepared a detailed refit schedule for when they reached the Moon’s shipyards. He hoped it would speed along the repair process. Earth’s moon would offer the beleaguered Task Force 34 the rest and restoration it deserved after its grueling operation. The Moon was the main hub of the United Planets’ fleet activities. The Luna bases were the most elaborate of all the space facilities in the Solar System. They performed ship overhauls and refits, as well as hundreds of new constructions. Luna’s main military base was named Armstrong Luna and was the home port of the 1st Fleet, fondly called the Home Fleet. Captain Julie Ann McCall caught Gallant’s eye as she rushed from the Combat Information Center onto the bridge. There was a troubled look on her face.
Is she anxious to get home too? Was there someone special waiting for her? Or would she, once more, disappear into the recesses of the Solar Intelligence Agency? After all these years, she’s still a mystery to me. McCall approached him and leaned close to his face. In a hushed, throaty voice, she whispered, “Captain, we’ve received an action message. You must read it immediately.” Her tight self-control usually obscured her emotions, but now something extraordinary appeared in her translucent blue eyes—fear! He placed his thumb over his command console ID recognition pad. A few swipes over the screen, and he saw the latest action message icon flashing red. He tapped the symbol, and it opened.
TOP SECRET: ULTRA - WAR WARNING
Date-time stamp: 06.11.2176.12:00
Authentication code: Alpha-Gamma 1916
To: All Solar System Commands
From: Solar Intelligence Agency
Subject: War Warning
Diplomatic peace negotiations with the Titans have broken down. Repeat: Diplomatic peace negotiations with the Titans have broken down. What this portends is unknown, but all commands are to be on the highest alert in anticipation of the resumption of hostilities.
Russell Rissa, Director, SIA
TOP SECRET: ULTRA - WAR WARNING
He reread the terse communication. As if emerging from a cocoon, Gallant brushed off his preoccupation over his forthcoming liberty. He considered the possibilities. Last month, he sent the sample Halo detection devices to Earth. He hoped that the SIA had analyzed the technology and distributed it to the fleet, though knowing government bureaucracy, he guessed that effort would need his prodding before the technology came into widespread use. Still, there should be time before it becomes urgent. The SIA had predicted that the Titans would need at least two years to rebuild their forces before they could become a threat again. Could he rely on that? Even though he was getting closer to Earth with every passing second, the light from the inner planets was several days old. Something could have already transpired. There was one immutable lesson in war: never underestimate your opponent. A shiver ran down his spine. This is bad. Very bad! Gone was the malaise that had haunted him earlier. Now, he emerged as a disciplined military strategist, intent on facing a major new challenge. Looking expectantly, he examined McCall’s face for an assessment. Shaking her head, she hesitated. “The picture is incomplete. I have little to offer.” Gallant needed her to be completely open and honest with him, but he was unsure how to win that kind of support. He rubbed his chin and spoke softly, “I’d like to tell you a story about a relationship I’ve had with a trusted colleague. And I’d like you to pretend that you were that colleague.” McCall furrowed her brow, but a curious gleam grew in her eyes. He said, “I’ve known this colleague long enough to know her character even though she has been secretive about her personal life and loyalties.” McCall inhaled and visibly relaxed as she exhaled. Her eyes focused their sharp acumen on Gallant. “She is bright enough to be helpful and wise enough not to be demanding,” continued Gallant. “She has offered insights into critical issues and made informed suggestions that have influenced me. She is astute and might know me better than I know myself because of the tests she has conducted. When I’ve strayed into the sensitive topic of genetic engineering, she has soothed my bumpy relationship with politicians.” He hesitated.
Then added, “Yet, she has responsibilities and professional constraints on her candidness. She might be reluctant to speak openly on sensitive issues, particularly to me.” ​ McCall’s face was a blank mask, revealing no trace of her inner response to his enticing words. He said, “If you can relate to this, I want you to consider that we are at a perilous moment. It is essential that you speak frankly to me about any insights you might have about this situation.” She swallowed and took a step closer to Gallant. Their faces were mere centimeters apart. ​ “Very well,” she said. “The Chameleon are a spent force. After the loss of their last Great Ship, they are defenseless. They agreed to an unconditional surrender. They might even beg for our help from the Titans. Their moral system is like ours and should not be a concern in any forthcoming action. However, the Titans have an amoral empathy with other species.” ​ He gave an encouraging nod. ​ She added, “Despite the defeat of Admiral Zzey’s fleet in Ross, the Titans remain a considerable threat. They opened peace negotiations ostensibly to seek a treaty with a neutral zone between our two empires. But we can’t trust them. They are too aggressive and self-interested to keep any peace for long. One option they might try is to eliminate the Chameleon while they have the opportunity. Another is to rebuild their fleet for a future strike against us. However, the most alarming possibility would be an immediate attack against us with everything they currently have. They might even leave their home world exposed. But that would only make sense if they could achieve an immediate and overwhelming strategic victory.” ​ Gallant grimaced as he absorbed her analysis. ​ She concluded, “This dramatic rejection of diplomacy can only mean that they are ready to reignite the war—with a vengeance. They will strike us with swift and ruthless abandon.” ​ Gallant turned his gaze toward the bright blue marble—still far away.

  • Thinking on the Web | H Peter Alesso

    Thinking on the Web Chapter 2 Gödel: What is Decidable? In the last chapter, we suggested that small wireless devices connected to an intelligent Web could produce ubiquitous computing and empower the Information Revolution. The Semantic Web architecture is designed to add some intelligence to the Web through machine processing capabilities. For the Semantic Web to succeed, the expressive power of the logic added to its mark-up languages must be balanced against the resulting computational complexity. Therefore, it is important to evaluate both the expressive characteristics of logic languages as well as their inherent limitations. In fact, some options for Web logic include solutions that may not be solvable through rational argument. In particular, the work of Kurt Gödel identified the concept of undecidability, where the truth or falsity of some statements may not be determined. In this chapter, we review some of the basic principles of logic and relate them to their suitability for Web applications. First, we review the basic concept of logic and discuss various characteristics and limitations of logic analysis. We introduce First-Order Logic (FOL) and its subsets, such as Description Logic and Horn Logic, which offer attractive characteristics for Web applications. These languages set the parameters for how expressive Web markup languages can become. Second, we investigate how logic conflicts and limitations in computer programming and Artificial Intelligence (AI) have been handled in closed environments to date. We consider how errors in logic contribute to significant ‘bugs’ that lead to crashed computer programs. Third, we review how Web architecture is used to partition the delivery of business logic from the user interface. The Web architecture keeps the logic restricted to executable code residing on the server and delivers user-interface presentations residing within the markup languages traveling over the Web. The Semantic Web changes this partitioned arrangement. Finally, we discuss the implications of using logic in markup languages on the Semantic Web. Philosophical and Mathematical Logic Aristotle described man as a “rational animal” and established the study of logic, beginning with the process of codifying syllogisms. A syllogism is a kind of argument in which there are three propositions, two of them premises, one a conclusion. Aristotle was the first to create a logic system which allowed predicates and subjects to be represented by letters or symbols. His logic form allowed one to substitute letters (variables) for subjects and predicates. For example: If A is predicated of all B, and B is predicated of all C, then A is predicated of all C. By predicated, Aristotle means that A belongs to all B's, or that all B's are A's. For instance, we can substitute subjects and predicates into this syllogism to get: If all humans (B's) are mortal (A), and all Greeks (C's) are humans (B's), then all Greeks (C's) are mortal (A). Today, Aristotle's system is mostly seen as being of historical value. Subsequently, other philosophers and mathematicians such as Leibniz developed methods to represent logic and reasoning as a series of mechanical and symbolic tasks. They were followed by logicians who developed mechanical rules to carry out logical deductions. In logic, as in grammar, a subject is what we make an assertion about, and a predicate is what we assert about the subject. Today, logic is considered to be the primary reasoning mechanism for solving problems.
Logic allows us to set up systems and criteria for distinguishing acceptable arguments from unacceptable arguments. The structure of arguments is based upon formal relations between the newly produced assertions and the previous ones. Through argument, we can then express inferences. Inferences are the processes by which new assertions may be produced from existing ones. When relationships are independent of the assertions themselves, we call them ‘formal’. Through these processes, logic provides a mechanism for the extension of knowledge. As a result, logic provides prescriptions for reasoning by machines as well as by people. Traditionally, logic has been studied as a branch of philosophy. However, since the mid-1800s, logic has commonly been studied as a branch of mathematics and, more recently, as a branch of computer science. The scope of logic can therefore be extended to include reasoning using probability and causality. In addition, logic includes the study of structures of fallacious arguments and paradoxes. By logic, then, we mean the study and application of the principles of reasoning and the relationships between statements, concepts, or propositions. Logic incorporates both the methods of reasoning and the validity of the results. In common language, we refer to logic in several ways: logic can be considered as a framework or system of reasoning, a particular mode or process of reasoning, or the guiding principles of a field or discipline. We also use the term "logical" to describe a reasoned approach to solving a problem or reaching a decision, as opposed to the alternative "emotional" approaches of reacting or responding to a situation. As logic has developed, its scope has splintered into many distinctive branches. These distinctions serve to formalize different forms of logic as a science. The distinctions between the various branches of logic lead to their limitations and expressive capabilities, which are central issues in designing the Semantic Web languages. The following sections identify some of the more important distinctions. Deductive and Inductive Reasoning Originally, logic consisted only of deductive reasoning, which was concerned with a premise and a resultant deduction. However, it is important to note that inductive reasoning – the study of deriving a reliable generalization from observations – has also been included in the study of logic. Correspondingly, we must distinguish between deductive validity and inductive validity. The notion of deductive validity can be rigorously stated for systems of formal logic in terms of the well-understood notions of semantics. An inference is deductively valid if and only if there is no possible situation in which all the premises are true and the conclusion false. Inductive validity, on the other hand, requires us to define a reliable generalization of some set of observations. The task of providing this definition may be approached in various ways, some of which use mathematical models of probability. Paradox A paradox is an apparently true statement that seems to lead to a contradiction or to a situation that defies intuition. Typically, either the statements in question do not really imply the contradiction, or the puzzling result is not really a contradiction, or the premises themselves are not all really true (or cannot all be true together). The recognition of ambiguities, equivocations, and unstated assumptions underlying known paradoxes has often led to significant advances in science, philosophy, and mathematics.
Formal and Informal Logic Formal logic (sometimes called ‘symbolic logic’) attempts to capture the nature of logical truth and inference in formal systems. This consists of a formal language, a set of rules of derivation (often called ‘rules of inference’), and sometimes a set of axioms. The formal language consists of a set of discrete symbols, a syntax (i.e., the rules for the construction of a statement), and a semantics (i.e., the relationship between symbols or groups of symbols and their meanings). Expressions in formal logic are often called ‘formulas.’ The rules of derivation and potential axioms then operate with the language to specify a set of theorems, which are formulas that are either basic axioms or true statements that are derivable using the axioms and rules of derivation. In the case of formal logic systems, the theorems are often interpretable as expressing logical truths (called tautologies). Formal logic encompasses a wide variety of logic systems. For instance, propositional logic and predicate logic are kinds of formal logic, as well as temporal logic, modal logic, Hoare logic, and the calculus of constructions. Higher-order logics are logical systems based on a hierarchy of types. For example, Hoare logic is a formal system developed by the British computer scientist C. A. R. Hoare. The purpose of the system is to provide a set of logical rules by which to reason about the correctness of computer programs with the rigor of mathematical logic. The central feature of Hoare logic is the Hoare triple. A triple describes how the execution of a piece of code changes the state of the computation. A Hoare triple is of the form: {P} C {Q} where P and Q are assertions and C is a command. P is called the precondition and Q the post-condition. Assertions are formulas in predicate logic. An interpretation of such a triple is: whenever P holds of the state before the execution of C, then Q will hold afterwards. Alternatively, informal logic is the study of logic as it is used in natural language arguments. Informal logic is complicated by the fact that it may be very hard to extract the formal logical structure embedded in an argument. Informal logic is also more difficult because the semantics of natural language assertions is much more complicated than the semantics of formal logical systems. Mathematical Logic Mathematical logic really refers to two distinct areas of research: the first is the application of the techniques of formal logic to mathematics and mathematical reasoning, and the second is the application of mathematical techniques to the representation and analysis of formal logic. The boldest attempt to apply logic to mathematics was pioneered by the philosopher-logician Bertrand Russell. His idea was that mathematical theories were logical tautologies, and his program was to show this by means of a reduction of mathematics to logic. The various attempts to carry this out met with a series of failures, such as Russell's Paradox, and the defeat of Hilbert's Program by Gödel's incompleteness theorems (which we shall describe shortly). Russell's paradox represents either of two interrelated logical contradictions. The first is a contradiction arising in the logic of sets or classes. Some sets can be members of themselves, while others cannot.
The set of all sets is itself a set, and so it seems to be a member of itself. The null or empty set, however, must not be a member of itself. However, suppose that we can form a set of all sets that, like the null set, are not included in themselves. The paradox arises from asking the question of whether this set is a member of itself. It is, if and only if, it is not! The second form is a contradiction involving properties. Some properties seem to apply to themselves, while others do not. The property of being a property is itself a property, while the property of being a table is not, itself, a table. Hilbert's Program was developed in the early 1920s by the German mathematician David Hilbert. It called for a formalization of all of mathematics in axiomatic form, together with a proof that this axiomatization of mathematics is consistent. The consistency proof itself was to be carried out using only what Hilbert called ‘finitary’ methods. The special epistemological character of this type of reasoning was intended to yield the required justification of classical mathematics. It was also a great influence on Kurt Gödel, whose work on the incompleteness theorems was motivated by Hilbert's Program. In spite of the fact that Gödel's work is generally taken to prove that Hilbert's Program cannot be carried out, Hilbert's Program has nevertheless continued to be influential in the philosophy of mathematics, and work on Revitalized Hilbert Programs has been central to the development of proof theory. Both the statement of Hilbert's Program and its refutation by Gödel depended upon their work establishing the second area of mathematical logic, the application of mathematics to logic in the form of proof theory. Despite the negative nature of Gödel's incompleteness theorems, a result in model theory can be understood as showing how close logic came to capturing mathematics: every rigorously defined mathematical theory can be exactly captured by a First-Order Logical (FOL) theory. Thus it is apparent that the two areas of mathematical logic are complementary. Logic is extensively applied in the fields of artificial intelligence and computer science. These fields provide a rich source of problems in formal logic. In the 1950s and 60s, researchers predicted that when human knowledge could be expressed using logic with mathematical notation, it would be possible to create a machine that reasons, or produces artificial intelligence. This turned out to be more difficult than expected because of the complexity of human reasoning. In logic programming, a program consists of a set of axioms and rules. In symbolic logic and mathematical logic, proofs by humans can be computer-assisted. Using automated theorem proving, machines can find and check proofs, as well as work with proofs too lengthy to be written out by hand. However, the computational complexity of carrying out automated theorem proving is a serious limitation. It is a limitation that, as we will find in subsequent chapters, significantly impacts the Semantic Web. Decidability In the 1930s, the mathematical logician Kurt Gödel shook the world of mathematics when he established that, in certain important mathematical domains, there are problems that cannot be solved, or propositions that cannot be proved or disproved, and that are therefore undecidable. Whether a certain statement of first-order logic is provable as a theorem is one example; whether a polynomial equation in several variables has integer solutions is another.
While humans solve problems in these domains all the time, it is not certain that arbitrary problems in these domains can always be solved. This is relevant for artificial intelligence, since it is important to establish the boundaries for a problem’s solution. Kurt Gödel Kurt Gödel (shown in Figure 2-1) was born on April 28, 1906, in Brünn, Austria-Hungary (now Brno, Czech Republic). He had rheumatic fever when he was six years old, and his health became a chronic concern over his lifetime. Kurt entered the University of Vienna in 1923, where he was influenced by the lectures of Philipp Furtwängler. Furtwängler was an outstanding mathematician and teacher, but in addition he was paralyzed from the neck down, and this forced him to lecture from a wheelchair with an assistant to write on the board. This made a big impression on Gödel, who was very conscious of his own health. As an undergraduate, Gödel studied Russell's book Introduction to Mathematical Philosophy. He completed his doctoral dissertation under Hans Hahn in 1929. His thesis proved the completeness of the first-order functional calculus. He subsequently became a member of the faculty of the University of Vienna, where he belonged to the school of logical positivism until 1938. Gödel is best known for his 1931 proof of the "Incompleteness Theorems." He proved fundamental results about axiomatic systems, showing that in any axiomatic mathematical system there are propositions that cannot be proved or disproved within the axioms of the system. In particular, the consistency of the axioms cannot be proved. This ended a hundred years of attempts to establish axioms and axiom-based logic systems that would put the whole of mathematics on this basis. One major attempt had been by Bertrand Russell with Principia Mathematica (1910-13). Another was Hilbert's formalism, which was dealt a severe blow by Gödel's results. The theorem did not destroy the fundamental idea of formalism, but it did demonstrate that any system would have to be more comprehensive than that envisaged by Hilbert. One consequence of Gödel's results is that a computer can never be programmed to answer all mathematical questions. In 1935, Gödel proved important results on the consistency of the axiom of choice with the other axioms of set theory. He visited Göttingen in the summer of 1938, lecturing there on his set theory research, and returned to Vienna, where he married Adele Porkert in 1938. After settling in the United States, Gödel again produced work of the greatest importance. His “Consistency of the axiom of choice and of the generalized continuum-hypothesis with the axioms of set theory” (1940) is a classic of modern mathematics. In this he proved that if an axiomatic system of set theory of the type proposed by Russell and Whitehead in Principia Mathematica is consistent, then it will remain so when the axiom of choice and the generalized continuum-hypothesis are added to the system. This did not prove that these axioms were independent of the other axioms of set theory, but when Paul Cohen finally established this in 1963, he used the ideas of Gödel. Gödel held a chair at Princeton from 1953 until his death in 1978. Propositional Logic Propositional logic (or calculus) is a branch of symbolic logic dealing with propositions as units and with the combinations and connectives that relate them.
It can be defined as the branch of symbolic logic that deals with the relationships formed between propositions by connectives such as those shown below:

Symbol    Statement                                        Connective
p ∨ q     "either p is true, or q is true, or both"        disjunction
p · q     "both p and q are true"                          conjunction
p ⊃ q     "if p is true, then q is true"                   implication
p ≡ q     "p and q are either both true or both false"     equivalence

A ‘truth table’ is a complete list of the possible truth values of a statement. We use "T" to mean "true" and "F" to mean "false" (or "1" and "0," respectively). Truth tables are adequate to test validity, tautology, contradiction, contingency, consistency, and equivalence. This is important because truth tables are a mechanical application of the rules. Propositional calculus is a formal system for deduction whose atomic formulas are propositional variables. In propositional calculus, the language consists of propositional variables (or placeholders) and sentential operators (or connectives). A well-formed formula is any atomic formula or a formula built up from sentential operators. First-Order Logic (FOL) First-Order Logic (FOL), also known as first-order predicate calculus, is a systematic approach to logic based on the formulation of quantifiable statements such as "there exists an x such that..." or "for any x, it is the case that...". A first-order logic theory is a logical system that can be derived from a set of axioms as an extension of first-order logic. FOL is distinguished from higher-order logic in that the values "x" in the FOL statements are individual values and not properties. Even with this restriction, first-order logic is capable of formalizing all of set theory and most of mathematics. Its restriction to quantification over individuals makes it difficult to use for the purposes of topology, but it is the classical logical theory underlying mathematics. The branch of mathematics called Model Theory is primarily concerned with connections between first-order properties and first-order structures. First-order languages are by their nature very restrictive, and as a result many questions cannot be discussed using them. On the other hand, first-order logics have precise grammars. Predicate calculus is quantificational and based on atomic formulas that are propositional functions. In predicate calculus, as in grammar, a subject is what we make an assertion about, and a predicate is what we assert about the subject. Automated Inference for FOL Automated inference using first-order logic is harder than using propositional logic because variables can take on potentially an infinite number of possible values from their domain. Hence there are potentially an infinite number of ways to apply the Universal-Elimination rule of inference. Gödel's Completeness Theorem implies that FOL is only semi-decidable. That is, if a sentence is true given a set of axioms, there is a procedure that will determine this. However, if the sentence is false, then there is no guarantee that a procedure will ever determine this; the procedure may never halt in this case. As a result, the Truth Table method of inference is not complete for FOL because the truth table size may be infinite. Natural deduction is complete for FOL but is not practical for automated inference because the ‘branching factor’ in the search process is too large. This is the result of the necessity to try every inference rule in every possible way using the set of known sentences.
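To illustrate the mechanical character of the truth-table method described above, here is a minimal Python sketch (an illustration added for this discussion, not code from the book) that enumerates every truth assignment for two propositional variables and reports whether a formula is a tautology. The formula tested is the Modus Ponens pattern taken up next.

from itertools import product

def implies(p, q):
    # Material implication: "if p then q" is false only when p is true and q is false.
    return (not p) or q

def is_tautology(formula, num_vars):
    # Enumerate every row of the truth table; the formula is a tautology
    # if it evaluates to true in every row.
    return all(formula(*row) for row in product([True, False], repeat=num_vars))

# ((p -> q) and p) -> q, the Modus Ponens pattern, is a tautology.
print(is_tautology(lambda p, q: implies(implies(p, q) and p, q), 2))   # True
# p -> q by itself is not: it fails when p is true and q is false.
print(is_tautology(lambda p, q: implies(p, q), 2))                     # False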
Let us consider the rule of inference known as Modus Ponens (MP). Modus Ponens is a rule of inference pertaining to the IF/THEN operator. Modus Ponens states that if the antecedent of a conditional is true, then the consequent must also be true: (MP) Given the statements p and "if p then q," infer q. Generalized Modus Ponens (GMP) is not complete for FOL. However, Generalized Modus Ponens is complete for Knowledge Bases (KBs) containing only Horn clauses. Another very important logic, which we shall discuss in detail in chapter 8, is Horn logic. A Horn clause is a sentence of the form:
(Ax) (P1(x) ^ P2(x) ^ ... ^ Pn(x)) => Q(x)
where there are 0 or more Pi's, and the Pi's and Q are positive (i.e., un-negated) literals. Horn clauses represent a subset of the set of sentences representable in FOL. For example, P(a) v Q(a) is a sentence in FOL, but it is not a Horn clause. Natural deduction using GMP is complete for KBs containing only Horn clauses. Proofs start with the given axioms/premises in the KB, deriving new sentences using GMP until the goal/query sentence is derived. This defines a forward-chaining inference procedure because it moves "forward" from the KB to the goal. For example, let KB = "All cats like fish, cats eat everything they like, and Molly is a cat." In first-order logic:
(1) (Ax) cat(x) => likes(x, Fish)
(2) (Ax)(Ay) (cat(x) ^ likes(x,y)) => eats(x,y)
(3) cat(Molly)
Query: Does Molly eat fish?
Proof: Use GMP with (1) and (3) to derive (4) likes(Molly, Fish). Then use GMP with (3), (4), and (2) to derive eats(Molly, Fish).
Conclusion: Yes, Molly eats fish. (A short code sketch of this forward-chaining procedure appears below.)
Description Logic Description Logics (DLs) allow specifying a terminological hierarchy using a restricted set of first-order formulas. DLs have nice computational properties (they are often decidable and tractable), but the inference services are restricted to classification and subsumption. That means, given formulae describing classes, the classifier associated with a given description logic will place them inside a hierarchy. Given an instance description, the classifier will determine the most specific classes to which the instance belongs. From a modeling point of view, Description Logics correspond to Predicate Logic statements with three variables, suggesting that modeling is syntactically bound. Description Logic is one possibility for inference engines for the Semantic Web. Another possibility is based on Horn logic, which is another subset of First-Order Predicate logic (see Figure 2-2). In addition, Description Logic and rule systems (e.g., Horn Logic) are somewhat orthogonal, which means that they overlap, but one does not subsume the other. In other words, there are capabilities in Horn logic that are complementary to those available in Description Logic. Both Description Logic and Horn Logic are critical branches of logic that highlight essential limitations and expressive powers, which are central issues in designing the Semantic Web languages. We will discuss them further in chapter 8. Using Full First-Order Logic (FFOL) for specifying axioms requires a full-fledged automated theorem prover. However, FOL is semi-decidable, and doing inferencing becomes computationally intractable for large amounts of data and axioms. This means that, in an environment like the Web, FFOL programs would not scale to handle huge amounts of knowledge. Besides, full first-order theorem proving would mean maintaining consistency throughout the Web, which is impossible. (Figure 2-2 illustrates the Description Logic fragment of FOL.)
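To make the forward-chaining proof above concrete, the following short Python sketch (an illustrative implementation under assumed data structures, not the book's code) stores the knowledge base as ground facts and Horn rules and applies Generalized Modus Ponens until no new facts can be derived. Argument names written in lower case are treated as variables.

facts = {("cat", "Molly")}
rules = [
    # (premises, conclusion); argument terms in lower case are variables.
    ([("cat", "x")], ("likes", "x", "Fish")),                   # all cats like fish
    ([("cat", "x"), ("likes", "x", "y")], ("eats", "x", "y")),  # cats eat what they like
]

def substitute(atom, bindings):
    # Replace variables in an atom with the constants they are bound to.
    return tuple(bindings.get(term, term) for term in atom)

def match(premises, known, bindings):
    # Yield every variable binding that makes all premises members of the fact set.
    if not premises:
        yield bindings
        return
    first, rest = premises[0], premises[1:]
    for fact in known:
        if fact[0] != first[0] or len(fact) != len(first):
            continue
        new_bindings = dict(bindings)
        consistent = True
        for pattern, value in zip(first[1:], fact[1:]):
            if pattern.islower():                        # variable term
                if new_bindings.get(pattern, value) != value:
                    consistent = False
                    break
                new_bindings[pattern] = value
            elif pattern != value:                       # constant mismatch
                consistent = False
                break
        if consistent:
            yield from match(rest, known, new_bindings)

def forward_chain(known, rules):
    # Apply every rule to the known facts until nothing new can be derived.
    while True:
        new_facts = set()
        for premises, conclusion in rules:
            for bindings in match(premises, known, {}):
                derived = substitute(conclusion, bindings)
                if derived not in known:
                    new_facts.add(derived)
        if not new_facts:
            return known
        known |= new_facts

print(("eats", "Molly", "Fish") in forward_chain(facts, rules))   # True: Molly eats fish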
FOL includes expressiveness beyond the overlap, notably: positive disjunctions; existentials; and entailment of non-ground and non-atomic conclusions. Horn FOL is another fragment of FOL. Horn Logic Program (LP) is a slight weakening of Horn FOL. "Weakening" here means that the conclusions from a given set of Horn premises that are entailed according to the Horn LP formalism are a subset of the conclusions entailed (from that same set of premises) according to the Horn FOL formalism. However, the set of ground atomic conclusions is the same in Horn LP as in Horn FOL. For most practical purposes (e.g., relational database query answering), Horn LP is thus essentially similar in its power to Horn FOL. Horn LP is a fragment of both FOL and nonmonotonic LP. This discussion may seem esoteric, but it is precisely these types of issues that will decide both the design of the Semantic Web and its likelihood of success. Higher Order Logic Higher-Order Logics (HOLs) provide greater expressive power than FOL, but they are even more difficult computationally. For example, in HOLs, one can have true statements that are not provable (see the discussion of Gödel’s Incompleteness Theorem). There are two aspects of this issue: higher-order syntax and higher-order semantics. If a higher-order semantics is not needed (and this is often the case), a second-order logic can often be translated into a first-order logic. In first-order semantics, variables can only range over domains of individuals or over the names of predicates and functions, but not over sets as such. In higher-order syntax, variables are allowed to appear in places where normally predicate or function symbols appear. Predicate calculus is the primary example of a logic where syntax and semantics are both first-order. There are logics that have higher-order syntax but first-order semantics. Under a higher-order semantics, an equation between predicate (or function) symbols is true if and only if the symbols have the same interpretation; logics with both higher-order syntax and higher-order semantics can express statements about other statements. To state it another way, higher-order logic is distinguished from first-order logic in several ways. The first is the scope of quantifiers; in first-order logic, it is forbidden to quantify over predicates. The second way in which higher-order logic differs from first-order logic is in the constructions that are allowed in the underlying type theory. A higher-order predicate is a predicate that takes one or more other predicates as arguments. In general, a higher-order predicate of order n takes one or more (n − 1)th-order predicates as arguments (where n > 1). Recursion Theory Recursion is the process a procedure goes through when one of the steps of the procedure involves rerunning a complete set of identical steps. In mathematics and computer science, recursion is a particular way of specifying a class of objects with the help of a reference to other objects of the class: a recursive definition defines objects in terms of the already defined objects of the class. A recursive process is one in which objects are defined in terms of other objects of the same type. Using a recurrence relation, an entire class of objects can be built up from a few initial values and a small number of rules. The Fibonacci numbers (i.e., the infinite sequence of numbers starting 0, 1, 1, 2, 3, 5, 8, 13, …, where the next number in the sequence is defined as the sum of the previous two numbers) are a commonly known recursive set.
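A minimal recursive implementation of this definition (an illustrative sketch, not drawn from the text) makes the base case and the recursion step explicit; the fixed base case is what keeps the recursion well founded and prevents the kind of infinite self-recursion cautioned against below.

def fib(n):
    # Base cases: the first two Fibonacci numbers are given outright.
    if n < 2:
        return n
    # Recursion step: each number is defined as the sum of the previous two.
    return fib(n - 1) + fib(n - 2)

print([fib(n) for n in range(9)])   # [0, 1, 1, 2, 3, 5, 8, 13, 21]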
The following is a recursive definition of a person's ancestors: One's parents are one's ancestors (base case). The parents of any ancestor are also ancestors of the person under consideration (recursion step). Therefore, your ancestors include your parents, your parents' parents (grandparents), your grandparents' parents, and everyone else you get by successively adding ancestors. It is convenient to think that a recursive definition defines objects in terms of "previously defined" members of the class. While recursive definitions are useful and widespread in mathematics, care must be taken to avoid self-recursion, in which an object is defined in terms of itself, leading to an infinite nesting (see Figure 1-1: “The Print Gallery” by M.C. Escher is a visual illustration of self-recursion). Knowledge Representation Let’s define what we mean by the fundamental terms “data,” “information,” “knowledge,” and "understanding." An item of data is a fundamental element of an application. Data can be represented by populations and labels. Data is raw; it exists and has no significance beyond its existence. It can exist in any form, usable or not. It does not have meaning by itself. Information, on the other hand, is an explicit association between items of data. Associations represent a function relating one set of things to another set of things. Information can be considered to be data that has been given meaning by way of relational connections. This "meaning" can be useful, but does not have to be. A relational database creates information from the data stored within it. Knowledge can be considered to be an appropriate collection of information, such that it is useful. Knowledge-based systems contain knowledge as well as information and data. A rule is an explicit functional association from a set of information things to a specific information thing. As a result, a rule is knowledge. We can construct information from data and knowledge from information and finally produce understanding from knowledge. Understanding lies at the highest level. Understanding is an interpolative and probabilistic process that is cognitive and analytical. It is the process by which one can take existing knowledge and synthesize new knowledge. One who has understanding can pursue useful actions because he can synthesize new knowledge or information from what is previously known (and understood). Understanding can build upon currently held information, knowledge, and understanding itself. AI systems possess understanding in the sense that they are able to synthesize new knowledge from previously stored information and knowledge. An important element of AI is the principle that intelligent behavior can be achieved through processing of symbolic structures representing increments of knowledge. This has produced knowledge-representation languages that allow the representation and manipulation of knowledge to deduce new facts from the existing knowledge. A knowledge-representation language must have a well-defined syntax and semantics while supporting inference. Three techniques have been popular for expressing knowledge representation and inference: (1) logic-based approaches, (2) rule-based systems, and (3) frames and semantic networks. Logic-based approaches use logical formulas to represent complex relationships. They require a well-defined syntax, semantics, and proof theory. The formal power of a logical theorem prover can be applied to knowledge to derive new knowledge.
Logic is used as the formalism for programming languages and databases. It can also be used as a formalism to implement knowledge methodology. Any formalism that admits a declarative semantics and can be interpreted both as a programming language and a database language is a knowledge language. However, the approach is inflexible and requires great precision in stating the logical relationships. In some cases, common sense inferences and conclusions cannot be derived, and the approach may be inefficient, especially when dealing with issues that result in large combinations of objects or concepts.

Rule-based approaches are more flexible and allow the representation of knowledge using sets of IF-THEN or other conditional rules. This approach is more procedural and less formal in its logic. As a result, reasoning can be controlled through a forward or backward chaining interpreter.

Frames and semantic networks capture declarative information about related objects and concepts where there is a clear class hierarchy and where the principle of inheritance can be used to infer the characteristics of members of a subclass. The two forms of reasoning in this technique are matching (i.e., identification of objects having common properties) and property inheritance, in which properties are inferred for a subclass. Frames and semantic networks are limited to the representation and inference of relatively simple systems.

In each of these approaches, the knowledge-representation component (i.e., problem-specific rules and facts) is separate from the problem-solving and inference procedures. For the Semantic Web to function, computers must have access to structured collections of information and sets of inference rules that they can use to conduct automated reasoning. AI researchers have studied such systems and produced today's Knowledge Representation (KR). KR is currently in a state comparable to that of hypertext before the advent of the Web. Knowledge representation contains the seeds of important applications, but to fully realize its potential, it must be linked into a comprehensive global system.

Computational Logic

Programming a computer involves creating a sequence of logical instructions that the computer will use to perform a wide variety of tasks. While it is possible to create programs directly in machine language, it is uncommon for programmers to work at this level because of the low-level, error-prone nature of the instructions. It is better to write programs in a simple text file using a high-level programming language, which can later be compiled into executable code.

The 'logic model' for programming is a basic element that communicates the logic behind a program. A logic model can be a graphic representation of a program illustrating the logical relationships between program elements and the flow of calculation, data manipulation, or decisions as the program executes its steps.

Logic models typically use diagrams, flow sheets, or some other type of visual schematic to convey relationships between programmatic inputs, processes, and outcomes. Logic models attempt to show the links in a chain of reasoning about relationships to the desired goal. The desired goal is usually shown as the last link in the model.

A logic program may consist of a set of axioms and a goal statement. The logic form can be a set of 'IF-THEN' statements. The rules of inference are applied to determine whether the axioms are sufficient to ensure the truth of the goal statement.
The execution of a logic program corresponds to the construction of a proof of the goal statement from the axioms.

In the logic programming model, the programmer is responsible for specifying the basic logical relationships and does not specify the manner in which the inference rules are applied. Thus: Logic + Control = Algorithms. The operational semantics of logic programs correspond to logical inference. The declarative semantics of logic programs are derived from the term model. The denotational semantics of logic programs are defined in terms of a function that assigns meaning to the program. There is a close relation between the axiomatic semantics of imperative programs and logic programs. The control portion of the equation is provided by an inference engine whose role is to derive theorems based on the set of axioms provided by the programmer. The inference engine uses the operations of resolution and unification to construct proofs. Faulty logic models occur when the essential problem has not been clearly stated or defined.

Program developers work carefully to construct logic models to avoid logic conflicts, recursive loops, and paradoxes within their computer programs. As a result, programming logic should lead to executable code without paradox or conflict, if it is flawlessly produced. Nevertheless, we know that 'bugs,' or programming errors, do occur, some of which are directly or indirectly a result of logic conflicts.

As programs have grown in size from thousands of lines of code to millions of lines, the problems of 'bugs' and logic conflicts have also grown. Today, programs such as operating systems can have over 25 million lines of code and are considered to have hundreds of thousands of 'bugs,' most of which are seldom encountered during routine program usage.

Confining logic issues to beta testing on local servers allows programmers reasonable control of conflict resolution. Now consider applying many lines of application code logic to the Semantic Web, where it may access many information nodes. The magnitude of the potential conflicts could be daunting.

Artificial Intelligence

John McCarthy of MIT coined the term 'Artificial Intelligence' (AI), and by the late 1950s there were many researchers in AI working on programming computers. Eventually, AI expanded into such fields as philosophy, psychology, and biology. AI is sometimes described in two ways: strong AI and weak AI. Strong AI asserts that computers can be made to think on a level equal to humans. Weak AI simply holds that some 'thinking-like' features can be added to computers to make them more useful tools. Examples of weak AI abound: expert systems, drive-by-wire cars, smart browsers, and speech recognition software. These weak AI components may, when combined, begin to approach the expectations of strong AI. AI includes the study of computers that can perform cognitive tasks including: understanding natural language statements, recognizing visual patterns or scenes, diagnosing diseases or illnesses, solving mathematical problems, performing financial analyses, learning new procedures for problem solving, and playing complex games like chess. We will provide a more detailed discussion of Artificial Intelligence on the Web and what is meant by machine intelligence in Chapter 3.
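The unification operation mentioned above can be made concrete with a small sketch. This is our own illustration, not the book's code; the term representation and variable convention are assumptions of the example, and the occurs-check a full implementation would need is omitted.

def unify(a, b, bindings=None):
    """Try to match terms a and b; return variable bindings or None on failure."""
    bindings = dict(bindings or {})

    def is_var(t):
        return isinstance(t, str) and t[:1].isupper()   # 'X', 'Y2' count as variables

    def walk(t):
        while is_var(t) and t in bindings:
            t = bindings[t]
        return t

    a, b = walk(a), walk(b)
    if a == b:
        return bindings
    if is_var(a):
        bindings[a] = b
        return bindings
    if is_var(b):
        bindings[b] = a
        return bindings
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            bindings = unify(x, y, bindings)
            if bindings is None:
                return None
        return bindings
    return None   # constant clash: the terms cannot be unified

# The goal ancestor(ann, Y) unifies with the rule head ancestor(X, Y2):
print(unify(("ancestor", "ann", "Y"), ("ancestor", "X", "Y2")))
# {'X': 'ann', 'Y': 'Y2'}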
Web Architecture and Business Logic

So far we have explored the basic elements, characteristics, and limitations of logic and suggested that errors in logic contribute to many significant 'bugs' that lead to crashed computer programs. Next we will review how Web architecture is used to partition the delivery of business logic from the user interface. The Web architecture keeps the logic restricted to executable code residing on the server, while the user interface presentation resides within the markup languages traveling along the Internet. This simple arrangement of segregating the complexity of logic to the executable programs residing on servers has minimized processing difficulties over the Web itself. Today, markup languages are not equipped with logic connectives. So all complex logic and detailed calculations must be carried out by specially compiled programs residing on Web servers, where they are accessed by server page frameworks. The result is that highly efficient application programs on the server must communicate very inefficiently with other proprietary applications using XML in simple ASCII text. In addition, the difficulty of interoperable programming greatly inhibits the automation of Web Services.

Browsers such as Internet Explorer and Netscape Navigator view Web pages written in HyperText Markup Language (HTML). The HTML program can be written to a simple text file that is recognized by the browser, and it can call embedded script programming. In addition, HTML can include compiler directives that call server pages with access to proprietary compiled programming. As a result, simple-text HTML is empowered with important capabilities to call complex business logic programming residing on servers, both in the frameworks of Microsoft's .NET and Sun's J2EE. These frameworks support Web Services and form a vital part of today's Web. When a request comes into the Web server, the Web server simply passes the request to the program best able to handle it. The Web server doesn't provide any functionality beyond simply providing an environment in which the server-side program can execute and pass back the generated responses. The server-side program provides functions such as transaction processing, database connectivity, and messaging.

Business logic is concerned with the logic of: how we model real-world business objects, such as accounts, loans, and travel; how these objects are stored; how these objects interact with each other (e.g., a bank account must have an owner, and an account holder's portfolio is the sum of his accounts); and who can access and update these objects.

As an example, consider an online store that provides real-time pricing and availability information. The site will provide a form for you to choose a product. When you submit your query, the site performs a lookup and returns the results embedded within an HTML page. The site may implement this functionality in numerous ways. The Web server delegates the response generation to a script; however, the business logic for the pricing lookup is provided by an application server. With that change, instead of the script knowing how to look up the data and formulate a response, the script can simply call the application server's lookup service. The script can then use the service's result when it generates its HTML response. The application server serves the business logic for looking up a product's pricing information.
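A minimal sketch of this separation (our own illustration; the catalog contents, prices, and function names are hypothetical) keeps the pricing lookup in a service function that knows nothing about HTML, while a thin presentation script formats the result:

# Hypothetical pricing-lookup "service" (business logic): it returns data only.
CATALOG = {"widget": {"price": 9.99, "in_stock": True}}   # stand-in for a database

def lookup_price(product_id):
    # No HTML here; any client (Web script, cash register) can call this.
    item = CATALOG.get(product_id)
    if item is None:
        return {"product": product_id, "found": False}
    return {"product": product_id, "found": True, **item}

# Thin presentation layer (e.g., a server page script): formats the data as HTML.
def render_price_page(product_id):
    data = lookup_price(product_id)
    if not data["found"]:
        return f"<p>{data['product']}: not available</p>"
    return f"<p>{data['product']}: ${data['price']:.2f}</p>"

print(render_price_page("widget"))   # <p>widget: $9.99</p>

Because the service returns plain data, any other client could call lookup_price directly and format the result however it needs.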
That functionality doesn't say anything about display or how the client must use the information. Instead, the client and application server send data back and forth. When a client calls the application server's lookup service, the service simply looks up the information and returns it to the client. By separating the pricing logic from the HTML response-generating code, the pricing logic becomes reusable between applications. A second client, such as a cash register, could also call the same service when a clerk checks out a customer. More recently, eXtensible Markup Language (XML) Web Services send an XML payload to a Web server. The Web server can then process the data and respond much as application servers have in the past. XML has become the standard for data transfer for all types of applications. XML provides a data model that is supported by most data-handling tools and vendors. Structuring data as XML allows hierarchical, graph-based representations of the data to be presented to tools, which opens up a host of possibilities. The task of creating and deploying Web Services automatically requires interoperable standards. The most advanced vision for the next generation of Web Services is the development of Web Services over the Semantic Web architecture.

The Semantic Web

Now let's consider using logic within markup languages on the Semantic Web. This means empowering the Web's expressive capability, but at the expense of reducing Web performance. The current Web is built on HTML and XML, which describe how information is to be displayed and laid out on a Web page for humans to read. In addition, HTML is not capable of being directly exploited by information retrieval techniques. XML may have enabled the exchange of data across the Web, but it says nothing about the meaning of that data. In effect, the Web has developed as a medium for humans without a focus on data that could be processed automatically. As a result, computers are unable to automatically process the meaning of Web content. For machines to perform useful automatic reasoning tasks on these documents, the language machines use must go beyond the basic semantics of XML Schema. They will require an ontology language, logic connectives, and rule systems.

By introducing these elements, the Semantic Web is intended to be a paradigm shift just as powerful as the original Web. The Semantic Web will bring meaning to the content of Web pages, where software agents roaming from page to page can carry out automated tasks.

The Semantic Web will be constructed over the Resource Description Framework (RDF) and the Web Ontology Language (OWL). In addition, it will implement logic inference and rule systems. These languages are being developed by the W3C. Data can be defined and linked using RDF and OWL so that there is more effective discovery, automation, integration, and reuse across different applications. These languages are conceptually richer than HTML and allow representation of the meaning and structure of content (interrelationships between concepts). This makes Web content understandable by software agents, opening the way to a whole new generation of technologies for information processing, retrieval, and analysis. If a developer publishes data in XML on the Web, it doesn't require much more effort to take the extra step and publish the data in RDF. By creating ontologies to describe data, intelligent applications won't have to spend time translating various XML schemas.
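Underneath the tools, RDF statements are just subject-predicate-object triples, so the step from raw data to linked data can be pictured with a handful of triples held as plain tuples. This is our own sketch, not W3C syntax, and the URIs and vocabulary are invented for illustration:

# RDF reduced to its essentials: a set of (subject, predicate, object) triples.
triples = {
    ("http://example.org/book/42", "http://example.org/vocab/title", "An Example Book"),
    ("http://example.org/book/42", "http://example.org/vocab/author", "http://example.org/person/1"),
    ("http://example.org/person/1", "http://example.org/vocab/name", "Jane Doe"),
}

# A trivial query: every object linked to a subject by a given predicate.
def objects(subject, predicate):
    return [o for (s, p, o) in triples if s == subject and p == predicate]

print(objects("http://example.org/book/42", "http://example.org/vocab/author"))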
An ontology defines the terms used to describe and represent an area of knowledge. Although XML Schema is sufficient for exchanging data between parties who have agreed to the definitions beforehand, its lack of semantics prevents machines from reliably performing this task with new XML vocabularies. In addition, the ontology support of RDF and RDF Schema (RDFS) is very limited (see Chapter 5). RDF is roughly limited to binary ground predicates, and RDF Schema is roughly limited to a subclass hierarchy and a property hierarchy with domain and range definitions. Adding an ontology language will permit the development of explicit, formal conceptualizations of models (see Chapter 6). The main requirements of an ontology language include: a well-defined syntax, a formal semantics, convenience of expression, an efficient reasoning support system, and sufficient expressive power. Since the W3C established that the Semantic Web would require much more expressive power than RDF and RDF Schema offer, it defined the Web Ontology Language (called OWL). The layered architecture of the Semantic Web would suggest that one way to develop the necessary ontology language is to extend RDF Schema, using the RDF meaning of classes and properties and adding primitives to support richer expressiveness. However, simply extending RDF Schema would fail to achieve the best combination of expressive power and efficient reasoning. In the layered architecture of the Semantic Web, full downward compatibility and reuse of software is achieved only with OWL Full (see Chapter 6), but at the expense of computational intractability.

RDF and OWL (DL and Lite, see Chapter 6) are specializations of predicate logic. They provide a syntax that fits well with Web languages. They also define reasonable subsets of logic that offer a trade-off between expressive power and computational complexity. Semantic Web research has developed from the traditions of Artificial Intelligence (AI) and ontology languages. Currently, the most important ontology languages on the Web are XML, XML Schema, RDF, RDF Schema, and OWL.

Agents are pieces of software that work autonomously and proactively. In most cases, an agent will simply collect and organize information. Agents on the Semantic Web will receive tasks to perform and will seek information from Web resources, while communicating with other Web agents, in order to fulfill their tasks. Semantic Web agents will utilize metadata, ontologies, and logic to carry out their tasks. In a closed environment, Semantic Web specifications have already been used to accomplish many tasks, such as data interoperability for business-to-business (B2B) transactions. Many companies have expended resources to translate their internal data syntax for their partners. As the world migrates towards RDF and ontologies, interoperability will become more flexible to new demands.

An inference is a process of using rules to manipulate knowledge to produce new knowledge. Adding logic to the Web means using rules to make inferences and choosing a course of action. The logic must be powerful enough to describe complex properties of objects, but not so powerful that agents can be tricked by a paradox. A combination of mathematical and engineering issues complicates this task. We will provide a more detailed presentation on paradoxes on the Web and what is solvable on the Web in the next few chapters.
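One of the few inferences that RDF Schema itself licenses is type propagation along the subclass hierarchy: if a resource has type C and C is a subclass of D, then the resource also has type D. The sketch below is our own illustration with invented class names, reusing the triple style shown earlier:

# RDFS-style type propagation, a minimal sketch with invented class names.
subclass_of = {("Novel", "Book"), ("Book", "Document")}
types = {("item1", "Novel")}           # asserted rdf:type facts

changed = True
while changed:                          # derive new facts until a fixed point
    changed = False
    new_types = set()
    # If X has type C and C is a subclass of D, then X has type D.
    for (x, c) in types:
        for (c2, d) in subclass_of:
            if c == c2 and (x, d) not in types:
                new_types.add((x, d))
    # subClassOf is transitive.
    for (a, b) in list(subclass_of):
        for (b2, c) in list(subclass_of):
            if b == b2 and (a, c) not in subclass_of:
                subclass_of.add((a, c))
                changed = True
    if new_types:
        types |= new_types
        changed = True

print(sorted(types))   # item1 is a Novel, a Book, and a Document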
Inference Engines for the Semantic Web

Inference engines process the knowledge available in the Semantic Web by deducing new knowledge from already specified knowledge. Inference engines based on Higher Order Logic (HOL) have the greatest expressive power among all known logics, allowing, for example, the characterization of transitive closure. However, higher-order logics do not have nice computational properties: there are true statements that are unprovable (Gödel's Incompleteness Theorem). Inference engines based on Full First-Order Logic (FFOL) for specifying axioms require a full-fledged automated theorem prover. FOL is semi-decidable, and inference is computationally intractable for large amounts of data and axioms. This means that in an environment like the Web, HOL and FFOL programs would not scale up to handle huge amounts of knowledge. Besides, full first-order theorem proving would require maintaining consistency throughout the Web, which is impossible. Predicate calculus is the primary example of a logic where syntax and semantics are both first-order. From a modeling point of view, Description Logics correspond to Predicate Logic statements with three variables, which suggests that modeling is syntactically bound and makes Description Logic a good candidate language for Web logic. Other possibilities for inference engines for the Semantic Web are languages based on Horn logic, which is another fragment of First-Order Predicate Logic (see Figure 2-2). Description Logic and rule systems (e.g., Horn Logic) have different capabilities. Both are critical branches of logic that highlight essential limitations and expressive powers, which are central issues in designing the Semantic Web languages. We will discuss them further in Chapters 6, 7, 8, and 9.

Conclusion

For the Semantic Web to provide machine-processing capabilities, the expressive power of the logic in markup languages must be balanced against the resulting computational complexity of reasoning. In this chapter, we examined the expressive characteristics of logic languages, as well as their inherent limitations. First-Order Logic (FOL) fragments such as Description Logic and Horn Logic offer attractive characteristics for Web applications and set the parameters for how expressive Web markup languages can become. We also reviewed the concept of Artificial Intelligence (AI) and how logic is applied in computer programming. After exploring the basic elements, characteristics, and limitations of logic and suggesting that errors in logic contribute to many significant 'bugs' that lead to crashed computer programs, we reviewed how Web architecture is used to partition the delivery of business logic from the user interface. The Web architecture keeps the logic restricted to executable code residing on the server, while the user interface presentation resides within the markup languages traveling along the Internet. Finally, we discussed the implications of using logic within markup languages on the Web through the development of the Semantic Web. Our conclusions from this chapter include: Logic is the foundation of knowledge representation, which can be applied to AI in general and the World Wide Web in particular. Logic can provide a high-level language for expressing knowledge and has high expressive power. Logic has a well-understood formal semantics for assigning unambiguous meaning to logic statements. In addition, we saw that proof systems exist that can automatically derive statements syntactically from premises.
Predicate logic uniquely offers a sound and complete proof system, while higher-order logics do not. By tracking the proof that leads to a conclusion, the logic can provide explanations for its answers. Currently, complex logic and detailed calculations must be carried out by specially compiled programs residing on Web servers, where they are accessed by server page frameworks. The result is that highly efficient application programs on the server must communicate very inefficiently with other proprietary applications using XML in simple ASCII text. In addition, this difficulty with interoperable programs greatly inhibits the automation of Web Services. The Semantic Web offers a way to use logic, in the form of Description Logic or Horn Logic, on the Web.

Exercises

2-1. Explain how logic for complex business calculations is currently carried out through .NET and J2EE application servers.
2-2. Explain the difference between FOL and HOL.
2-3. Why is it necessary to consider less powerful expressive languages for the Semantic Web?
2-4. Why is undecidability a concern on the Web?

Website

The website http://escherdroste.math.leidenuniv.nl/ offers a visualization of the mathematical structure behind Escher's Print Gallery, using the Droste effect. This mathematical structure answers some questions about Escher's picture, such as: "What's in the blurry white hole in the middle?" The project is an initiative of Hendrik Lenstra of the Universiteit Leiden and the University of California at Berkeley. Bart de Smit of the Universiteit Leiden runs the project.

Interlude #2: Truth and Beauty

As John passed with a sour look on his face, Mary looked up from her textbook and asked, "Didn't you enjoy the soccer game?"

"How can you even ask that when we lost?" asked John gloomily.

"I think the team performed beautifully, despite the score," said Mary.

This instantly frustrated John, and he said, "Do you know, Mary, that sometimes I find it disarming the way you express objects in terms of beauty? I find that simply accepting something on the basis of its beauty can lead to false conclusions."

Mary reflected upon this before offering a gambit of her own. "Well, John, do you know that sometimes I find that relying on objective truth alone can lead to unattractive conclusions?"

John became flustered and reflected his dismay by demanding, "Give me an example."

Without hesitation, Mary said, "Perhaps you will recall that in the late 1920s, mathematicians were quite certain that every well-posed mathematical question had to have a definite answer: either true or false. For example, suppose they claimed that every even number was the sum of two prime numbers," referring to Goldbach's Conjecture, which she had just been studying in her textbook. Mary continued, "Mathematicians would seek the truth or falsity of the claim by examining a chain of logical reasoning that would lead, in a finite number of steps, to a proof that the claim was either true or false."

"So mathematicians thought at the time," said John. "Even today most people still do."

"Indeed," said Mary. "But in 1931, the logician Kurt Gödel proved that the mathematicians were wrong. He showed that every sufficiently expressive logical system must contain at least one statement that can be neither proved nor disproved following the logical rules of that system. Gödel proved that not every mathematical question has to have a yes or no answer. Even a simple question about numbers may be undecidable.
In fact, Gödel proved that there exist questions that, while being undecidable by the rules of the logical system, can be seen to be actually true if we jump outside that system. But they cannot be proven to be true."

"Thank you for that clear explanation," said John. "But isn't such a fact simply a translation into mathematical terms of the famous Liar's Paradox: 'This statement is false'?"

"Well, I think it's a little more complicated than that," said Mary. "But Gödel did identify the problem of self-reference that occurs in the Liar's Paradox. Nevertheless, Gödel's theorem contradicted the thinking of most of the great mathematicians of his time. The result is that one cannot be as certain as the mathematicians had desired. See what I mean? Gödel may have found an important truth, but it was – well, to be frank – rather disappointingly unattractive," concluded Mary.

"On the contrary," countered John, "from my perspective it was the beauty of the well-posed mathematical question offered by the mathematicians that was proven to be false."

Mary replied, "I'll have to think about that."

  • Fame | H Peter Alesso

    Science Fiction Writers Hall of Fame Isaac Asimov Asimov is one of the foundational voices of 20th-century science fiction. His work often incorporated hard science, creating an engaging blend of scientific accuracy and imaginative speculation. Known for his "Robot" and "Foundation" series, Asimov's ability to integrate scientific principles with compelling narratives has left an enduring legacy in the field. Arthur C. Clarke The author of numerous classics including "2001: A Space Odyssey," Clarke's work is notable for its visionary, often prophetic approach to future technologies and space exploration. His thoughtful, well-researched narratives stand as enduring examples of 'hard' science fiction. Robert A. Heinlein Heinlein, one of science fiction's most controversial and innovative writers, is best known for books like "Stranger in a Strange Land" and "Starship Troopers." His work is known for its strong political ideologies and exploration of societal norms. Philip K. Dick With stories often marked by paranoid and dystopian themes, Dick's work explores philosophical, sociological, and political ideas. His books like "Do Androids Dream of Electric Sheep?" inspired numerous films, solidifying his impact on popular culture. Ray Bradbury Known for his poetic prose and poignant societal commentary, Bradbury's work transcends genre. His dystopian novel "Fahrenheit 451" remains a touchstone in the canon of 20th-century literature, and his short stories continue to inspire readers and writers alike. Ursula K. Le Guin Le Guin's works, such as "The Left Hand of Darkness" and the "Earthsea" series, often explored themes of gender, sociology, and anthropology. Her lyrical prose and profound explorations of human nature have left an indelible mark on science fiction. Frank Herbert The author of the epic "Dune" series, Herbert crafted a detailed and complex future universe. His work stands out for its intricate plotlines, political intrigue, and environmental themes. William Gibson Gibson is known for his groundbreaking cyberpunk novel "Neuromancer," where he coined the term 'cyberspace.' His speculative fiction often explores the effects of technology on society. H.G. Wells Although Wells's works were published on the cusp of the 20th century, his influence carried well into it. Known for classics like "The War of the Worlds" and "The Time Machine", Wells is often hailed as a father of science fiction. His stories, filled with innovative ideas and social commentary, have made an indelible impact on the genre. Larry Niven Known for his 'Ringworld' series and 'Known Space' stories, Niven's hard science fiction works are noted for their imaginative, scientifically plausible scenarios and compelling world-building. Octavia Butler Butler's work often incorporated elements of Afrofuturism and tackled issues of race and gender. Her "Xenogenesis" series and "Kindred" are known for their unique and poignant explorations of human nature and society. Orson Scott Card Best known for his "Ender's Game" series, Card's work combines engaging narrative with introspective examination of characters. His stories often explore ethical and moral dilemmas. Alfred Bester Bester's "The Stars My Destination" and "The Demolished Man" are considered classics of the genre. His work is recognized for its powerful narratives and innovative use of language. 
Kurt Vonnegut Though not strictly a science fiction writer, Vonnegut's satirical and metafictional work, like "Slaughterhouse-Five," often used sci-fi elements to highlight the absurdities of the human condition. Harlan Ellison Known for his speculative and often dystopian short stories, Ellison's work is distinguished by its cynical tone, inventive narratives, and biting social commentary. Stanislaw Lem Lem's work, such as "Solaris," often dealt with philosophical questions. Philip José Farmer Known for his "Riverworld" series, Farmer's work often explored complex philosophical and social themes through creative world-building and the use of historical characters. He is also recognized for his innovations in the genre and the sexual explicitness of some of his work. J. G. Ballard Best known for his novels "Crash" and "High-Rise", Ballard's work often explored dystopian modernities and psychological landscapes. His themes revolved around surrealistic and post-apocalyptic visions of the human condition, earning him a unique place in the sci-fi genre.

AI Science Fiction Hall of Fame

As a science fiction aficionado and AI expert, there's nothing more exciting to me than exploring the relationship between sci-fi literature and artificial intelligence. Science fiction is an innovative genre, often years ahead of its time, and has influenced AI's development in ways you might not expect. But it's not just techies like us who should be interested - students of AI can learn a lot from these visionary authors. So buckle up, as we're about to embark on an insider's journey through the most famous science fiction writers in the hall of fame!

The Science Fiction-AI Connection Science fiction and AI go together like peanut butter and jelly. In fact, one could argue that some of our most advanced AI concepts and technologies sprung from the seeds planted by sci-fi authors. I remember as a young techie, curled up with my dog, reading Isaac Asimov's "I, Robot". I was just a teenager, but that book completely changed how I saw the potential of AI.

The Most Famous Sci-Fi Writers and their AI Visions Ready for a deep dive into the works of the greats? Let's take a closer look at some of the most famous science fiction writers in the hall of fame, and how their imaginations have shaped the AI we know today.

Isaac Asimov: Crafting the Ethics of AI You can't talk about AI in science fiction without first mentioning Isaac Asimov. His "I, Robot" introduced the world to the Three Laws of Robotics, a concept that continues to influence AI development today. As an AI student, I remember being fascinated by how Asimov's robotic laws echoed the ethical considerations we must grapple with in real-world AI.

Philip K. Dick: Dreaming of Synthetic Humans Next up, Philip K. Dick. If you've seen Blade Runner, you've seen his influence at work. In "Do Androids Dream of Electric Sheep?" (the book Blade Runner is based on), Dick challenges us to question what it means to be human and how AI might blur those lines. It's a thought that has certainly kept me up late on more than a few coding nights!

Arthur C. Clarke: AI, Autonomy, and Evolution Arthur C. Clarke's "2001: A Space Odyssey" has been both a source of inspiration and caution in my work. The AI character HAL 9000 is an eerie portrayal of autonomous AI systems' potential power and risks. It's a reminder that AI, like any technology, can be a double-edged sword.
​ William Gibson: AI in Cyberspace Finally, William Gibson's "Neuromancer" gave us a vision of AI in cyberspace before the internet was even a household name. I still remember my shock reading about an AI entity in the digital ether - years later, that same concept is integral to AI in cybersecurity. ​ The Power of Creativity These authors' works are testaments to the power of creativity in imagining the possibilities of AI. As students, you'll need to push boundaries and think outside the box - just like these authors did. ​ Understanding Potential and Limitations The stories these authors spun provide us with vivid scenarios of AI's potential and limitations. They remind us that while AI has massive potential, it's not without its challenges and dangers. ​ Conclusion And there we have it - our deep dive into the most famous science fiction writers in the hall of fame and their influence on AI. Their work is not just fiction; it's a guiding light, illuminating the path that has led us to the AI world we live in today. As students, we have the opportunity to shape the AI of tomorrow, just as these authors did. So why not learn from the best? Science Fiction Greats of the 21st Century Neal Stephenson is renowned for his complex narratives and incredibly detailed world-building. His Baroque Cycle trilogy is a historical masterpiece, while Snow Crash brought the concept of the 'Metaverse' into popular culture. China Miéville has won several prestigious awards for his 'weird fiction,' a blend of fantasy and science fiction. Books like Perdido Street Station and The City & The City are both acclaimed and popular. His work is known for its rich, evocative language and innovative concepts. Kim Stanley Robinson is best known for his Mars trilogy, an epic tale about the terraforming and colonization of Mars. He's famous for blending hard science, social commentary, and environmental themes. He continues this trend in his 21st-century works like the climate-focused New York 2140. Margaret Atwood, while also recognized for her mainstream fiction, has made significant contributions to science fiction. Her novel The Handmaid's Tale and its sequel The Testaments provide a chilling dystopian vision of a misogynistic society. Her MaddAddam trilogy further underscores her unique blend of speculative fiction and real-world commentary. Alastair Reynolds is a leading figure in the hard science fiction subgenre, known for his space opera series Revelation Space. His work, often centered around post-humanism and AI, is praised for its scientific rigor and inventive plotlines. Reynolds, a former scientist at the European Space Agency, incorporates authentic scientific concepts into his stories. Paolo Bacigalupi's works often deal with critical environmental and socio-economic themes. His debut novel The Windup Girl won both the Hugo and Nebula awards and is renowned for its bio-punk vision of the future. His YA novel, Ship Breaker, also received critical acclaim, winning the Michael L. Printz Award. Ann Leckie's debut novel Ancillary Justice, and its sequels, are notable for their exploration of AI, gender, and colonialism. Ancillary Justice won the Hugo, Nebula, and Arthur C. Clarke Awards, a rare feat in science fiction literature. Her unique narrative styles and complex world-building are highly appreciated by fans and critics alike. Iain M. Banks was a Scottish author known for his expansive and imaginative 'Culture' series. Though he passed away in 2013, his work remains influential in the genre. 
His complex storytelling and exploration of post-scarcity societies left a significant mark in science fiction. William Gibson is one of the key figures in the cyberpunk sub-genre, with his novel Neuromancer coining the term 'cyberspace.' In the 21st century, he continued to innovate with his Blue Ant trilogy. His influence on the genre, in terms of envisioning the impacts of technology on society, is immense. Ted Chiang is highly regarded for his thoughtful and philosophical short stories. His collection Stories of Your Life and Others includes "Story of Your Life," which was adapted into the film Arrival. Each of his carefully crafted tales explores a different scientific or philosophical premise. Charlie Jane Anders is a diverse writer who combines elements of science fiction, fantasy, and more in her books. Her novel All the Birds in the Sky won the 2017 Nebula Award for Best Novel. She's also known for her work as an editor of the science fiction site io9. N.K. Jemisin is the first author to win the Hugo Award for Best Novel three years in a row, for her Broken Earth Trilogy. Her works are celebrated for their diverse characters, intricate world-building, and exploration of social issues. She's one of the most influential contemporary voices in fantasy and science fiction. Liu Cixin is China's most prominent science fiction writer and the first Asian author to win the Hugo Award for Best Novel, for The Three-Body Problem. His Remembrance of Earth's Past trilogy is praised for its grand scale and exploration of cosmic civilizations. His work blends hard science with complex philosophical ideas. John Scalzi is known for his accessible writing style and humor. His Old Man's War series is a popular military science fiction saga, and his standalone novel Redshirts won the 2013 Hugo Award for Best Novel. He's also recognized for his blog "Whatever," where he discusses writing, politics, and more. Cory Doctorow is both a prolific author and an advocate for internet freedom. His novel Little Brother, a critique of increased surveillance, is frequently used in educational settings. His other novels, like Down and Out in the Magic Kingdom, are known for their examination of digital rights and technology's impact on society. Octavia Butler (1947-2006) was an award-winning author known for her incisive exploration of race, gender, and societal structures within speculative fiction. Her works like the Parable series and Fledgling have continued to influence and inspire readers well into the 21st century. Her final novel, Fledgling, a unique take on vampire mythology, was published in 2005. Peter F. Hamilton is best known for his space opera series such as the Night's Dawn trilogy and the Commonwealth Saga. His work is often noted for its scale, complex plotting, and exploration of advanced technology and alien civilizations. Despite their length, his books are praised for maintaining tension and delivering satisfying conclusions. Ken Liu is a prolific author and translator in science fiction. His short story "The Paper Menagerie" is the first work of fiction to win the Nebula, Hugo, and World Fantasy Awards. As a translator, he's known for bringing Liu Cixin's The Three-Body Problem to English-speaking readers. Ian McDonald is a British author known for his vibrant and diverse settings, from a future India in River of Gods to a colonized Moon in the Luna series. His work often mixes science fiction with other genres, and his narrative style has been praised as vivid and cinematic. 
He has won several awards, including the Hugo, for his novellas and novels. James S.A. Corey is the pen name of collaborators Daniel Abraham and Ty Franck. They're known for The Expanse series, a modern space opera exploring politics, humanity, and survival across the solar system. The series has been adapted into a critically acclaimed television series. Becky Chambers is praised for her optimistic, character-driven novels. Her debut, The Long Way to a Small, Angry Planet, kickstarted the popular Wayfarers series and was shortlisted for the Arthur C. Clarke Award. Her focus on interpersonal relationships and diverse cultures sets her work apart from more traditional space operas. Yoon Ha Lee's Machineries of Empire trilogy, beginning with Ninefox Gambit, is celebrated for its complex world-building and innovative use of technology. The series is known for its intricate blend of science, magic, and politics. Lee is also noted for his exploration of gender and identity in his works. Ada Palmer's Terra Ignota series is a speculative future history that blends philosophy, politics, and social issues in a post-scarcity society. The first book in the series, Too Like the Lightning, was a finalist for the Hugo Award for Best Novel. Her work is appreciated for its unique narrative voice and in-depth world-building. Charlie Stross specializes in hard science fiction and space opera, with notable works including the Singularity Sky series and the Laundry Files series. His books often feature themes such as artificial intelligence, post-humanism, and technological singularity. His novella "Palimpsest" won the Hugo Award in 2010. Kameron Hurley is known for her raw and gritty approach to science fiction and fantasy. Her novel The Light Brigade is a time-bending military science fiction story, while her Bel Dame Apocrypha series has been praised for its unique world-building. Hurley's work often explores themes of gender, power, and violence. Andy Weir shot to fame with his debut novel The Martian, a hard science fiction tale about a man stranded on Mars. It was adapted into a successful Hollywood film starring Matt Damon. His later works, Artemis and Project Hail Mary, continue his trend of scientifically rigorous, yet accessible storytelling. Jeff VanderMeer is a central figure in the New Weird genre, blending elements of science fiction, fantasy, and horror. His Southern Reach Trilogy, starting with Annihilation, explores ecological themes through a mysterious, surreal narrative. The trilogy has been widely praised, with Annihilation adapted into a major motion picture. Nnedi Okorafor's Africanfuturist works blend science fiction, fantasy, and African culture. Her novella Binti won both the Hugo and Nebula awards. Her works are often celebrated for their unique settings, compelling characters, and exploration of themes such as cultural conflict and identity. Claire North is a pen name of Catherine Webb, who also writes under Kate Griffin. As North, she has written several critically acclaimed novels, including The First Fifteen Lives of Harry August, which won the John W. Campbell Memorial Award for Best Science Fiction Novel. Her works are known for their unique concepts and thoughtful exploration of time and memory. M.R. Carey is the pen name of Mike Carey, known for his mix of horror and science fiction. His novel The Girl With All the Gifts is a fresh take on the zombie genre, and it was later adapted into a film. 
Carey's works are celebrated for their compelling characters and interesting twists on genre conventions. Greg Egan is an Australian author known for his hard science fiction novels and short stories. His works often delve into complex scientific and mathematical concepts, such as artificial life and the nature of consciousness. His novel Diaspora is considered a classic of hard science fiction. Steven Erikson is best known for his epic fantasy series, the Malazan Book of the Fallen. However, he has also made significant contributions to science fiction with works like Rejoice, A Knife to the Meat. His works are known for their complex narratives, expansive world-building, and philosophical undertones. Vernor Vinge is a retired San Diego State University professor of mathematics and computer science and a Hugo award-winning science fiction author. Although his most famous work, A Fire Upon the Deep, was published in the 20th century, his later work including the sequel, Children of the Sky, has continued to influence the genre. He is also known for his 1993 essay "The Coming Technological Singularity," in which he argues that rapid technological progress will soon lead to the end of the human era. Jo Walton has written several novels that mix science fiction and fantasy, including the Hugo and Nebula-winning Among Others. Her Thessaly series, starting with The Just City, is a thought experiment about establishing Plato's Republic in the ancient past. She is also known for her non-fiction work on the history of science fiction and fantasy. Hugh Howey is best known for his series Wool, which started as a self-published short story and grew into a successful series. His works often explore post-apocalyptic settings and the struggle for survival and freedom. Howey's success has been a notable example of the potential of self-publishing in the digital age. Richard K. Morgan is a British author known for his cyberpunk and dystopian narratives. His debut novel Altered Carbon, a hardboiled cyberpunk mystery, was adapted into a Netflix series. His works are characterized by action-packed plots, gritty settings, and exploration of identity and human nature. Hannu Rajaniemi is a Finnish author known for his unique blend of hard science and imaginative concepts. His debut novel, The Quantum Thief, and its sequels have been praised for their inventive ideas and complex, layered narratives. Rajaniemi, who holds a Ph.D. in mathematical physics, incorporates authentic scientific concepts into his fiction. Stephen Baxter is a British author who often writes hard science fiction. His Xeelee sequence is an expansive future history series covering billions of years. Baxter is known for his rigorous application of scientific principles and his exploration of cosmic scale and deep time. C.J. Cherryh is an American author who has written more than 60 books since the mid-1970s. Her Foreigner series, which began in the late '90s and has continued into the 21st century, is a notable science fiction series focusing on political conflict and cultural interaction. She has won multiple Hugo Awards and was named a Grand Master by the Science Fiction and Fantasy Writers of America. Elizabeth Bear is an American author known for her diverse range of science fiction and fantasy novels. Her novel Hammered, which combines cybernetics and Norse mythology, started the acclaimed Jenny Casey trilogy. She has won multiple awards, including the Hugo, for her novels and short stories. 
Larry Niven is an American author best known for his Ringworld series, which won the Hugo, Nebula, and Locus awards. In the 21st century, he continued the series and collaborated with other authors on several other works, including the Bowl of Heaven series with Gregory Benford. His works often explore hard science concepts and future history. David Mitchell is known for his genre-blending novels, such as Cloud Atlas, which weaves six interconnected stories ranging from historical fiction to post-apocalyptic science fiction. The novel was shortlisted for the Booker Prize and adapted into a film. His works often explore themes of reality, identity, and interconnectedness. Robert J. Sawyer is a Canadian author known for his accessible style and blend of hard science fiction with philosophical and ethical themes. His Neanderthal Parallax trilogy, which started in 2002, examines an alternate world where Neanderthals became the dominant species. He is a recipient of the Hugo, Nebula, and John W. Campbell Memorial awards. Daniel Suarez is known for his high-tech thrillers. His debut novel Daemon and its sequel Freedom™ explore the implications of autonomous computer programs on society. His books are praised for their action-packed narratives and thought-provoking themes related to technology and society. Kazuo Ishiguro is a Nobel Prize-winning author, known for his poignant and thoughtful novels. Never Let Me Go, published in 2005, combines elements of science fiction and dystopian fiction in a heartbreaking narrative about cloned children raised for organ donation. Ishiguro's work often grapples with themes of memory, time, and self-delusion. Malka Older is a humanitarian worker and author known for her Infomocracy trilogy. The series, starting with Infomocracy, presents a near-future world where micro-democracy has become the dominant form of government. Her work stands out for its political savvy and exploration of information technology. James Lovegrove is a versatile British author, known for his Age of Odin series and Pantheon series which blend science fiction with mythology. His Firefly novel series, based on the popular Joss Whedon TV show, has been well received by fans. He's praised for his engaging writing style and inventive blending of genres. Emily St. John Mandel is known for her post-apocalyptic novel Station Eleven, which won the Arthur C. Clarke Award and was a finalist for the National Book Award and the PEN/Faulkner Award. Her works often explore themes of memory, fate, and interconnectedness. Her writing is praised for its evocative prose and depth of character. Sue Burke's debut novel Semiosis is an engaging exploration of human and alien coexistence, as well as the sentience of plants. The book was a finalist for the John W. Campbell Memorial Award and spawned a sequel, Interference. Burke's work is known for its realistic characters and unique premise. Tade Thompson is a British-born Yoruba author known for his Rosewater trilogy, an inventive blend of alien invasion and cyberpunk tropes set in a future Nigeria. The first book in the series, Rosewater, won the Arthur C. Clarke Award. His works are celebrated for their unique settings and blend of African culture with classic and innovative science fiction themes.

  • Captain Henry Gallant | H Peter Alesso

    Captain Henry Gallant AMAZON Chapter 1 Streak Across the Sky ​ Cold night air smacked Rob Ryan in the face as he stepped out of the Liftoff bar—a favorite haunt of pilots. He was still weaving his way through the parking terminal looking for his single-seat jet-flyer when a familiar face appeared at his elbow. ​ Grabbing his arm, his friend said, “You shouldn’t fly. Let me give you a ride.” ​ Ryan straightened to his full six-two height and shrugged off his friend’s hand. ​ “I’m fine,” he said, swiping a lock of unkempt brown hair out of his eyes. ​ “Don’t be pigheaded. There’s a difference between self-reliance and foolishness.” ​ He pushed past his friend. “Nonsense. I fly better when I’m . . . mellow.” ​ As he left his buddy behind, he noticed a young woman who had come out of the bar after him. He had spent the past hour eyeing this smokin’ hot redhead, but she had been with somebody. Now she was heading out on her own. She glanced at him and quickened her pace. ​ A thought penetrated the fog in his mind. ​ I’ll show her. ​ At his Cobra 777 jet-flyer, he zipped up his pressure suit, buckled into the cockpit, and pulled on his AI neural interface—all the while imagining a wild take-off that would wow the redhead. ​ He jockeyed his jet along the taxiway onto the runway. When the turbo launch kicked in, the black-and-chrome jet spewed a cloud of exhaust and dust across the strip. He jammed the throttle all the way in and gave a whoop of pure joy at the roar and explosive thrust of the machine. The exhilaration—a primitive, visceral feeling—increased by the second, along with his altitude and speed. His love of speed was only matched by his almost unhealthy fascination with flying machines—too fast was never fast enough. ​ For a few seconds, his mind flashed back to his very first flight. The thrill only lasted a few minutes before the mini flyer spun out and crashed. Without a word, his father picked him up and sat him back down in the seat, restarting the engine with a wink and a grin. Clearest of all was the memory of his father’s approval as he took off again and soared higher and faster than before. ​ Now he sliced through the crisp night air in a military jet that had his name engraved on the side. He ignited an extra thruster to drive the engine even hotter. Riding the rush of adrenaline, he pulled back on the stick to pull the nose up. Atmospheric flying was different than being in space, and for him, it had a sensual rhythm all its own. As he reached altitude, he pulled a tight loop and snapped the jet inverted, giving himself a bird’s-eye view of the ground below. ​ But instead of reveling in admiration as expected, he found himself fighting for control against a powerful shockwave as a Scorpion 699 jet blew past him. The blast of its fuel exhaust was nothing compared to the indignation and shame that burned his face. ​ It was the redhead. ​ Damn. She’s good. ​ His pulse raced as he became fully alert. Determined to pursue her, he angled the ship across air traffic lanes, breaking every safety regulation in the book. Instinctively his eyes scanned the horizon and the edges around him, watching for threats or other machines that might interfere with his trajectory. Pinwheeling in a high-G turn, he felt the crush of gravity against his chest, yet still, his hand on the throttle urged ever more speed from the machine. ​ He lost track of the Scorpion in the clouds, and in mere seconds she maneuvered behind him. 
He tried to shake her using every evasive maneuver he had learned in his fighter training but couldn’t do it. His eyes roamed the sky, watching for potential dangers. The night sky was dark, but several landmarks lit up the ground below him. Earth’s capital, Melbourne, glowed with activity to the north; a mountain range stretched across the horizon 50 km to the west, and an airport lay to the south at the edge of the ocean. As he scanned the skyline, he noticed a radio-telescope antenna. Impulsively he dove toward it, the Scorpion on his tail. ​ At the last moment, the redhead broke pursuit to avoid the antenna, but in a moment of reckless folly, Ryan crashed through the flimsy wire mesh, no more substantial to his Cobra than a wisp of cloud. “That’ll need a patch,” he chuckled. ​ But once more, the Scorpion blew by him. He watched it roar away as if he were in slow motion. As the redhead curved back toward him for another pass, he gritted his teeth in frustration. With thrusters already at max burn, he punched the afterburner to create his own shock wave and turned head-on into her path. “Damn!” he screamed as the other ship twisted away. ​ His golden rule for staying alive while flying was “never yield but always leave yourself an out.” Folly had made him reckless, and he knew his reflexes were sluggish, but he was pissed at himself for letting this pilot provoke him. ​ Recovering his reason, he leveled off and threw down the skid flaps to reach a more reasonable speed. The jet took the torque and inertia strain, and the flashing red lights on his display turned yellow and then green. Despite his irritation, he allowed himself a faint smile when his AI read the Scorpion’s registration: Lorelei Steward. ​ Good sense advised that he throttle back, but pride won out. Spotting the Scorpion silhouetted against a cloud, he jammed the throttle forward yet again. ​ Finally, behind her, his smile broadened. She wouldn’t slip away this time. She pulled her jet into a violent oblique pop, rolled inverted until the nose pointed to the ground then returned to upright. ​ He stuck with her, move for move. ​ Abruptly she angled for the nearby mountain range. He chased her, low and fast, through a pass and down into a twisting canyon, rolling and pitching in a dizzying display of aerobatic skill. He kept close on her six until they blew out of the ravine. ​ In a desperate ploy to shake him, she turned back toward Melbourne’s airspace and headed straight into a crowded flying highway. ​ Ryan was so close behind that it took a few seconds before he realized her blunder. She had turned into an oncoming traffic lane. ​ The cockpit warning lights lit up the cabin as Ryan dodged a stream of oncoming vehicles. Up ahead, Lorelei ducked under a passenger liner that swerved directly into his path. ​ Time slowed to a crawl as he foresaw his fate—he could escape by pulling up—but that would force the crowded passenger liner to dive and crash into the ground. ​ “Damn it all!” he yelled and dove—leaving the liner a clear path to safety. Through the neural interface, his AI shrieked, ​ TOO LOW! PULL UP! TOO LOW! PULL UP! ​ He used every bit of expertise he could muster to twist, turn, and wrestle his jet into a controlled descent. His vision narrowed as the lights of city and ships gave way to a line of unyielding rocks zooming toward him. In a blink, he ran out of time—and altitude. ​ BRACE FOR IMPACT! ​ The Cobra plowed a trough a hundred meters long across the desert floor. 
Ryan sat in the cockpit, stunned and disoriented amid the flames and wreckage until his lungs convulsed from the dense smoke. An acidic stench and the taste of jet fuel assailed his nose and throat, rousing him from his stupor. Fumbling to unbuckle the safety harness, he held his breath until he could release the hatch and climb out of his ruined machine. Shaking hands searched his body for broken bones. To his relief, he was intact . . . if he didn’t count the ringing in his ears and the blood that coursed down his face. ​ The maxim from flight school ran through his mind: “Any landing you walk away from . . .” But as he limped away, his beloved Cobra burned into a twisted mound of molten metal, its nose buried in the dusty red ground. He shook his head at the wreck. “Captain Gallant is going to have my ass.”

  • Portfolio | H Peter Alesso

    The Henry Gallant Saga COURAGE is only a word . . . until you witness it. Then . . . it is contagious. Henry Gallant is the only Natural left in Earth's genetically engineered space navy. Despite overwhelming odds and the doubts of his shipmates, Gallant refuses to back down as he uses his unique abilities to fight for victory at the farthest reaches of the Solar System. Follow Gallant as he finds the spine to stand tall, vanquish fear, and rain violence upon the methane-breathing enemy aliens. The nation needs a hero like Henry Gallant. He fights! For fans of Horatio Hornblower and Honor Harrington.

  • e-Video | H Peter Alesso

    e-Video AMAZON

Chapter 1 Bandwidth for Video

Electronic-Video, or “e-Video”, includes all audio/video clips that are distributed and played over the Internet, either by direct download or streaming video. The problem with video, however, has been its inability to travel over networks without clogging the lines. If you’ve ever tried to deliver video, you know that even after heroic efforts on your part (including optimizing the source video, the hardware, the software, the editing, and the compression process) there remains a significant barrier to delivering your video over the Web. That is the “last mile” connection to the client.

So before we explain the details of how to produce, capture, edit, and compress video for the Web, we had better begin by describing the near-term opportunities for overcoming the current bandwidth limitations for delivering video over the Internet.

In this chapter, we will describe how expanding broadband fiber networks will reach out over the “last mile” to homes and businesses, creating opportunities for video delivery. To accomplish this, we will start by quantifying three essential concerns:

the file size requirements for sending video data over the Internet,
the network fiber capacity of the Internet for the near future, and
the progress from narrowband (28.8 Kbps) to broadband (1.5 Mbps) over the “last mile.”

This will provide an understanding of the difficulties being overcome in transforming video from the current limited narrowband streaming video to broadband video delivery.

Transitioning from Analog to Digital Technology

Thomas Alva Edison’s contributions to the telegraph, phonograph, telephone, motion pictures, and radio helped transform the 20th Century with analog appliances in the home and the factory. Many of Edison’s contributions were based on the continuous electrical analog signal.

Today, Edison’s analog appliances are being replaced by digital ones. Why? Let’s begin by comparing the basic analog and digital characteristics.

Analog signals move along wires as electromagnetic waves. The signal’s frequency refers to the number of times per second that a wave oscillates in a complete cycle. The higher the speed, or frequency, the more cycles of a wave are completed in a given period of time. A baud rate is one analog electric cycle or wave per second. Frequency is also stated in hertz (Hz). (A kilohertz, or kHz, represents 1,000 Hz; MHz represents 1,000,000 Hz; and GHz represents a billion Hz.)

Analog signals, such as voice, radio, and TV, involve oscillations within specified ranges of frequency. For example:

Voice has a range of 300 to 3,300 Hz
Analog cable TV has a range of 54 MHz to 750 MHz
Analog microwave towers have a range of 2 to 12 GHz

Sending a signal along analog wires is similar to sending water through a pipe. The further it travels, the more force it loses and the weaker it becomes. It can also pick up vibrations, or noise, which introduces signal errors.

Today, analog technology has become available worldwide through the following transmission media:

1. Copper wire for telephone (one-to-one communication).
2. Broadcast for radio and television (one-to-many communication).
3. Cable for television (one-to-many communication).

Most forms of analog content, from news to entertainment, have been distributed over one or more of these methods.
Analog technology prior to 1990 was based primarily on the one-to-many distribution system, as shown in the Table below, where information was directed toward individuals from a central point.

Table 1-1 Analog Communication Prior to 1990

Prior to 1990, over 99% of businesses and homes had content reach them from one of the three transmission delivery systems. Only the telephone allowed two-way communication, however. While the other analog systems were reasonably efficient in delivering content, the client could only send feedback, or pay bills, through ordinary postal mail. Obviously, the interactivity level of this system was very low.

The technology used in Coaxial Cable TV (CATV) is designed for the transport of video signals. It is comprised of three systems: AM, FM, and Digital. Since the current CATV system with coaxial analog technology is highly limited in bandwidth, new technology is necessary for applications requiring higher bandwidth. With a digital system, a CATV network will get better performance than AM/FM systems and ease the migration from coaxial to a fiber-based system. Fiber optics in CATV networks will eliminate most bottlenecks and increase channel capacity for high-speed networks.

Analog signals are continuously variable waveforms that are information intensive. They require considerable bandwidth and care in transmission. Analog transmissions over phone lines have some inherent problems when used for sending data. Analog signals lose their strength over long distances and often need to be amplified. Signal processing introduces distortions, which are then amplified along with the signal, raising the possibility of errors.

In contrast to the waveform of analog signals, digital signals are transmitted over wire connections by varying the voltage across the line between a high and a low state. Typically, a high voltage level represents a binary digit 1 and a low voltage level represents a binary digit 0. Because they are binary, digital signals are inherently less complex than analog signals, and over long distances they are more reliable. If a digital signal needs to be boosted, the signal is simply regenerated rather than being amplified.

As a result, digital signals have the following advantages over analog:

Superior quality
Fewer errors
Higher transmission speeds
Less complex equipment

The excitement over converting analog to digital media is, therefore, easy to explain. It is motivated by cost-effective, higher-quality digital processing for data, voice, and video information. In transitioning from analog to digital technologies, however, several significant changes are also profoundly altering broadcast radio and television. The transition introduces fundamental changes from one-way broadcast to two-way transmission, and thereby the potential for interactivity and for scheduling of programming to suit the user’s needs.

Not only is there an analog-to-digital shift, but a synchronous-to-asynchronous shift as well. Television and radio no longer need to be synchronous and simultaneous. Rather, the viewer and listener can control the time of performance.

In addition, transmission can use one of three media: copper wire, cable, or wireless. Also, the receiver is transitioning from a dumb device, such as the television, to an intelligent set-top box with significant CPU power. This potentially changes the viewer from a passive to an interactive participant.
Today, both analog and digital video technologies coexist in the production and creative part of the process, leading up to the point where the video is broadcast.

Currently, businesses and homes can receive content from one to six delivery systems: analog: copper wire (telephones), coaxial cable (TV cable), or broadcast (TV or radio); digital: copper wire (modem, DSL), Ethernet modem, or wireless (satellite).

At the present time, analog systems still dominate, but digital systems are competing very favorably as infrastructure becomes available. Analog/digital telephone and digital cable allow two-way communication, and these technologies are rapidly growing. The digital systems are far more efficient and allow greater interactivity with the client.

Competing Technologies

The race is on as cable, data, wireless, and telecommunications companies scramble to piece together the broadband puzzle and to compete in future markets. The basic infrastructure of copper wire, cable, and satellite, as well as the packaged content, is in place to deliver bigger, richer data files and media types. In special cases, data transmission over the developing computer networks within corporations and between universities already exists. The groups vying to dominate have each brought different technologies and standards to the table. For the logical convergence of hardware, software, and networking technology to occur, the interfaces of these industries must meet specific interoperational capabilities and must achieve customer expectations for quality of service.

Long-distance carriers and local Regional Bell Operating Company (RBOC) telephone companies started with the phone system designed for point-to-point communication, POTS (plain old telephone service), and have evolved into a large switched, distributed network capable of handling millions of simultaneous calls. They track and bill accordingly, with an impressive performance record. They have delivered 99.999% reliability with high-quality audio. Their technology is now evolving toward DSL (Digital Subscriber Line) modems. AT&T has made significant progress in leading broadband technology development now that it has added the vast cable networks of Tele-Communications Inc. and MediaOne Group to its telephone and cellular businesses. Currently, AT&T, with about 45% of the market, can plug into more U.S. households than any other provider. But other telecommunications companies, such as Sprint and MCI, as well as the regional Bell operating companies, are also capable of integrating broadband technology with their voice services.

Although both the routing and the architecture of the telephone network have evolved since the AT&T divestiture, the basics remain the same. About 25,000 central offices in the U.S. connect through 1,200 intermediate switching nodes, called access tandems. The switching centers are connected by trunks designed to carry multiple voice frequency circuits using frequency-division multiplexing (FDM), synchronous time-division multiplexing (TDM), or wavelength-division multiplexing (WDM) for optics.

The cable companies Time Warner, Comcast, Cox Communications, and Charter Communications have 60 million homes wired with coaxial cable, primarily one-way cable offering one-to-many broadcast service. Their technology competes through the introduction of cable modems and the upgrade of their infrastructure to support two-way communication. The merger between AOL and Time Warner demonstrates how Internet and content companies are finding ways to converge.
Cable television networks currently reach 200 million homes. Satellite television, on the other hand, can potentially reach 1 billion homes. Offering nearly complete coverage of the U.S., digital satellite is also competing. DirecTV has DirecPC, which can beam data to a PC. Its rival, EchoStar Corp., is working with interactive TV player TiVo Inc. to deliver video and data service to a set-top box. However, satellite is currently not only a one-way delivery system, but is also the most expensive in the U.S. In regions of the world outside the U.S., where the capital investment in copper wires and cable has yet to be made, satellite may have a better competitive opportunity.

The Internet itself doesn’t own its own connections. Internet data traffic passes along the copper, fiber, coaxial cable, and wireless transmission of the other industries as a digital alternative to analog transmissions. The new media is being built to include text, graphics, audio, and video across platforms of the television, Internet, cable, and wireless industries. The backbone uses wide-area communications technology, including satellite, fiber, coaxial cable, copper, and wireless. Data servers mix mainframes, workstations, supercomputers, and microcomputers, and a diversity of clients populate the end-points of the networks, including conventional PCs, palmtops, PDAs, smart phones, set-top boxes, and TVs.

Figure 1-1 Connecting the backbone of the Internet to Your Home

Web-television hybrids, such as WebTV, provide opportunities for cross-promotion between television and the Internet. Independent developers may take advantage of broadcast-Internet synergy by creating shows for targeted audiences.

Clearly, the future holds a need for interaction between the TV and the Internet. But will it appear as TV-quality video transmitted over the Internet and subsequently displayed on a TV set? Or, alternatively, as URL information embedded within existing broadcast TV pictures? Perhaps both.

Streaming Video

Streaming is the ability to play media, such as audio and video, directly over the Internet without downloading the entire file before play begins. Digital encoding is required to convert the analog signal into a compressed digital format for transmission and playback.

Streaming videos send a constant flow of audio/video information to their audience. While streaming videos may be archived for on-demand viewing, they can also be shown in real time. Examples include play-by-play sports events, concerts, and corporate board meetings. But a streaming video offers more than a simple digitized signal transmitted over the Internet. It offers the ability for interactive audience response and an unparalleled form of two-way communication. The interactive streaming video process is referred to as Webcasting.

Widespread Webcasting will be impractical, however, until audiences have access rates of a minimum of 100 Kbps or faster. Compression technology can be expected to grow more powerful, significantly reducing bandwidth requirements. By 2006, the best estimates indicate that 40 million homes will have cable modems and 25 million will have DSL connections with access rates of 1.5 Mbps.

We shall see in Chapters 5, 6, and 7 how the compression codecs and software standards will competitively change “effective” Internet bandwidth and the quality of delivered video.

The resultant video quality at a given bandwidth is highly dependent upon the specific video compressor.
The human eye is extremely non-linear, and its capabilities are difficult to quantify. The quality of compression, the specific video application, typical content, available bandwidth, and user preferences all must be considered when evaluating compressor options. Some optimize for “talking heads” while others optimize for motion.

To date, the value of streaming video has been primarily the rebroadcast of TV content and redirected audio from radio broadcasts. The success of these services in competing with traditional analog broadcasts will depend upon the ability of streaming video producers to develop and deliver their content using low-cost computers that present a minimal barrier to entry. Small, low-cost independent producers will effectively target audiences previously ignored. Streaming video, steadily moving toward the integration of text, graphics, audio, and video with interactive on-line chat, will find new audiences. In Chapter 2, we present business models to address businesses’ video needs.

Despite these promising aspects, streaming video is still a long way from providing a satisfactory audio/video experience in comparison to traditional broadcasts. The low data transmission rates are a severe limitation on the quality of streaming videos. While a direct broadcast satellite dish receives data at 2 Mbps, an analog modem is currently limited to 0.05 Mbps. The new cable modems and ADSL are starting to offer speeds competitive with satellite, but they will take time to penetrate globally. Unlike analog radio and television, streaming video requires a dynamic connection between the computer providing the content and each viewer. Current computer technology limits the viewing audience to about 50,000. While strategies to overcome this with replicating servers may increase audiences, this too will take effort.

The enhancement of data compression reduces the required video data streaming rates to more manageable levels. The technology has only recently reached the point where video can be digitized and compressed to levels which allow a reasonable appearance during distribution over digital networks. Advances continue to come, improving the look and delivery of video.

Calculating Bandwidth Requirements

So far we have presented the advantages of digital technology; unfortunately, there is one rather large disadvantage - bandwidth limitations. Let’s try some simple math that illustrates the difficulties. Live, or on-demand, streaming video and/or audio is relatively easy to encode. The most difficult part is not the encoding of the files. It is determining what level of data may be transmitted. The following Table contains information that will help with some basic terms and definitions.

Why the difference between Kbps and KB/sec? File sizes on a hard drive are measured in Kilobytes (KB), but data transferred over a modem is measured in Kilobits per second (Kbps), since network transfer rates are conventionally quoted in bits per second.

In the case of a 28.8 Kbps modem, the maximum practical data transfer rate is 2.5 KB/sec, even though the calculated rate is 28.8 Kbps / 8 bits per byte = 3.6 KB/sec. This is because approximately 30% of the transmission capability is lost to Internet “noise,” caused by traffic congestion on the web and by more than one surfer requesting information from the same server.
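To make the Kbps-versus-KB/sec arithmetic concrete, here is a minimal Python sketch; the function name effective_throughput_kb_per_sec is illustrative (not from the book), and the 30% overhead factor is the chapter’s rule of thumb rather than a measured constant.

def effective_throughput_kb_per_sec(rate_kbps, overhead=0.30):
    # Convert a nominal line rate in kilobits per second into an
    # approximate usable rate in kilobytes per second.
    raw_kb_per_sec = rate_kbps / 8            # 8 bits per byte
    return raw_kb_per_sec * (1 - overhead)    # ~30% lost to Internet "noise"

# A 28.8 Kbps modem: 3.6 KB/sec raw, roughly 2.5 KB/sec in practice
print(round(effective_throughput_kb_per_sec(28.8), 1))   # -> 2.5

Note that the chapter’s later figure of 4.25 KB/sec for a 56 Kbps modem implies a somewhat larger real-world derating than this simple 30% rule.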
The following Table 1-4 provides information concerning the characteristics of video files, including pixels per frame and frames per file (file size).

We can use the information in Table 1-4 to check some simple calculations. We will use the following formula to calculate the approximate size in Megabytes of a digitized video file:

(pixel width) x (pixel height) x (color bit depth) x (fps) x (duration in seconds) / 8,000,000 (bits per Megabyte)

For three minutes of video at 15 frames per second, with a color bit depth of 24 bits, in a window that is 320x240 pixels, the digitized source file would be approximately 622 Megabytes:

(320) x (240) x (24) x (15) x (180) / 8,000,000 = 622 Megabytes

We will see in Chapter 4 how data compression will significantly reduce this burden.

Now that we have our terms defined, let’s take the case of a TV station that wants to broadcast its channel live, 24 hours a day, for a month over the web to a target audience of 56 Kbps modem users. In this case, a live stream generates 4.25 KB/sec, since a file encoded for 56 Kbps transfers at 4.25 KB/sec. So how much data would be transferred if one stream were constantly in use, 24 hours a day, for a month?

ANSWER = 4.25 KB/sec x (number of seconds in a day) x 30 days per month = 11 GB/month

So, one stream playing a file encoded for 56 Kbps, 24 hours a day, will generate 11 gigabytes in a month. How is this figure useful?

If you can estimate the average number of viewers in a month, then you can estimate the total amount of data that will be transferred from your server. Ultimately the issue becomes one of having sufficient backbone infrastructure to carry many broadcasts to many viewers across the networks.

For HDTV, with a screen size of 1080x1920 and 24-bit color, a bandwidth of 51.8 Mbps is required. This is a serious amount of data to route around the Internet to millions of viewers.
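The two worked examples above can be reproduced with a short Python sketch; the helper names video_file_size_mb and monthly_stream_gb are illustrative, and the constants follow the chapter’s rounding (1 MB taken as 8,000,000 bits, 1 GB as 1,000,000 KB).

def video_file_size_mb(width, height, bit_depth, fps, seconds):
    # Approximate size of an uncompressed digitized video file in Megabytes.
    return width * height * bit_depth * fps * seconds / 8_000_000

def monthly_stream_gb(kb_per_sec, days=30):
    # Data transferred by one continuous stream over a month, in gigabytes.
    return kb_per_sec * 86_400 * days / 1_000_000

# Three minutes of 320x240, 24-bit video at 15 fps: about 622 MB
print(round(video_file_size_mb(320, 240, 24, 15, 180)))   # -> 622

# One 56 Kbps stream (about 4.25 KB/sec) running around the clock: about 11 GB/month
print(round(monthly_stream_gb(4.25)))                     # -> 11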
Transitioning from Narrowband to Broadband

In telecommunications, bandwidth refers to the data capacity of a channel. For an analog service, the bandwidth is defined as the difference between the highest and lowest frequency within which the medium carries traffic. For example, cabling that carries data between 200 MHz and 300 MHz has a bandwidth of 100 MHz. In addition to analog speeds in hertz (Hz) and digital speeds in bits per second (bps), the carrying rate is sometimes categorized as narrowband or broadband. It is useful to relate this to an analogy in which wider pipes carry more water. TV and cable are carried at broadband speeds. However, most telephone and modem data traffic from the central offices to individual homes and businesses is carried at slower narrowband speeds. This is usually referred to as the “last mile” issue.

The definitions of narrowband and broadband vary within the industries, but for our purposes they are summarized as:

Narrowband refers to rates less than 1.5 Mbps
Broadband refers to rates at or beyond 1.5 Mbps

A major bottleneck of analog services exists between the cabling of residences and telephone central offices. Digital Subscriber Line (DSL) and cable modems are gaining in availability. Cable TV companies are investing heavily in converting their cabling from one-way-only cable TV to two-way systems for cable modems and telephones. In contrast to the “last mile” situation in residential areas, telephone companies are laying fiber cables for digital services from their switches to office buildings, where the high-density client base justifies the additional expense.

We can appreciate the potential target audience for video by estimating how fast “last mile” bandwidth demand is growing. Because installing underground fiber costs more than $20,000 per mile, fiber only makes sense for businesses and network backbones, not for “last mile” access to homes. Table 1-5 shows the estimated number of users connected at various modem speeds in 1999 and 2006. High-speed consumer connections are now being implemented through cable modems and digital subscriber lines (DSL).

Approximately 1.3 million homes had cable modems by the end of 1999, in comparison to 300,000 DSL connections, primarily to businesses. By 2006, we project 40 million cable modems and 25 million DSL lines. Potentially, data at rates greater than one megabit per second could be delivered to over 80 percent of the more than 550 million residential telephone lines in the world. Better than one megabit per second can also be delivered over fiber/coax CATV lines configured for two-way transmission, currently to approximately 10 million out of 200 million total users (though the rest can be upgraded).

In 2000, the median bandwidth in the U.S. is less than 56 Kbps. This is, de facto, a narrowband environment. But worldwide there is virtually limitless demand for communications, as shown by the following growth rates:

The speed of computer connections is soaring. The number of connections at greater than 1.5 Mbps is growing at 45% per year in residential areas and at 55% per year in business areas.
Because of the improving on-line experience, people will stay connected about 20% longer per year.
As more remote areas of the world get connected, messages will travel about 15% farther per year.

The number of people online worldwide in 1999 was 150 million, but the peak Internet load was only 10% of those users, and only 25% of them were actually transferring data at any given moment. With an average access rate of 44 Kbps, this indicates an estimate of about 165 Gbps at peak load. In 2006 there will be about 300 million users, and about 65 million of these will have broadband (>1.5 Mbps) access. With the addition of increased peak load and increased actual transmission time, this will result in an estimated usage of about 16.5 terabits per second of data traffic.
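The 1999 peak-load estimate follows directly from the stated assumptions, as the following Python sketch shows; the function name peak_backbone_demand_gbps is illustrative, and the 2006 figure of roughly 16.5 Tbps is not reproduced here because the chapter does not state the higher peak-load and transmission-time fractions it assumes for that year.

def peak_backbone_demand_gbps(users, peak_fraction, active_fraction, avg_rate_kbps):
    # Users online at peak, times the share actually transferring data,
    # times their average access rate, converted to gigabits per second.
    bits_per_sec = users * peak_fraction * active_fraction * avg_rate_kbps * 1_000
    return bits_per_sec / 1_000_000_000

# 1999: 150 million users, 10% peak load, 25% actively transferring, 44 Kbps average
print(round(peak_backbone_demand_gbps(150_000_000, 0.10, 0.25, 44)))   # -> 165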
It all adds up to a lot of bits. It leads to a demand for total data communications in 2006 nearly 100-fold greater than in 1999. With the number of new users connecting to the Internet growing this fast, can the fiber backbone meet this demand? Figure 1-2 answers this question.

Figure 1-2 shows the growth in Local Area Networks (LANs) from 1980 to 2000, with some projection into the next decade. In addition, it shows Internet capacity over the last few decades and indicates the potential growth rate into the next decade. The jump in Internet capacity due to Dense Wavelength Division Multiplexing (DWDM) is a projection of the multiplying effect of this new technology. As a result, this figure shows that we can expect multi-terabit-per-second performance from the Internet backbone in the years ahead. This will meet the projected growth in demand. Great! But what about that “last mile” of copper, coax, and wireless?

The “last mile” involves servers, networks, content, and the transition from narrowband to broadband. Initially, the “last mile” will convert to residential broadband not as fiber optics, but as a network overlaid on existing telephone and cable television wiring. One megabit per second can be delivered to 80 percent or more of the 550 million residential telephone lines in the world. It can also be delivered over all fiber/coax CATV lines configured for two-way service. The latter represent a small fraction of the worldwide CATV lines, however, only about 10 million homes out of 200 million; but upgrade programs will convert the remainder within 5 years. The endgame of the upgrade process may be fiber directly to the customer’s home, but not for the next decade or two. A fiber signal travels coast to coast in 30 ms, and human latency (the period needed to achieve recognition) is about 50 milliseconds. Thus fiber is the only technology able to deliver viewable HDTV video. However, due to the cost and manpower involved, we’re stuck with a “last mile” of copper, coax, and wireless for a while yet.

Table 1-7 below summarizes how the five delivery approaches for analog and digital technologies will co-exist for the next few years. In Chapter 8, we will present network background on the technologies and standards and revisit this table in more detail.

* (FTTH is fiber to the home; FTTC is fiber to the curb; MPEG-2 is a compression standard, see Chapter 4; ATM is Asynchronous Transfer Mode, see Chapter 8; TDM is Time Division Multiplexing, see Chapter 8.)

Preparing to Converge

To be fully prepared to take advantage of the converging technologies, we must ask and answer the right questions. This is not as easy as it might seem. We could ask, “Which company will dominate the broadband data and telecommunication convergence?” But this would be inadequate because the multi-trillion-dollar world e-commerce market is too big for any one company to monopolize.

We could ask, “Which broadband networks will dominate the Internet backbone?” But this would be inadequate because innovative multiplexing and compression advances will make broadband ubiquitous and subservient to the “last mile” problem.

We could ask, “Which transmission means (cable, wireless, or copper) will dominate the ‘last mile’?” But this would be inadequate because the geographical infrastructure diversity of these technologies throughout the world will dictate different winners in different regions, demonstrating that this is a “local” problem.

Individually, these questions address only part of the convergence puzzle. It is e-commerce’s demand for economic efficiency that will force us to face the important question of the telecommunication convergence puzzle:

“What are meaningful broadband cross-technology standards?”

Without globally accepted standards, hardware and software developers can’t create broad solutions for consumer demand. As a result, we will be concerned throughout this book with pointing out the directions and conflicts of the various competing standards.

Conclusion

In this chapter, we presented the background of analog technology’s transition toward digital technology. This chapter provided a calculation that illustrated why digital video data poses such a difficult bandwidth problem. It evaluated the rate of conversion from narrowband connections to broadband. This rate establishes a critical perspective on the timeline of the demand for Internet video.

On the basis of this chapter, you should conclude that:

The Internet backbone combination of fiber and optical multiplexing will perform in the multi-terabit-per-second range and provide plenty of network bandwidth in the next few years.
The “last mile” connectivity will remain twisted pair, wireless, and coax cable for the next few years, but broadband (1.5 Mbps) access through cable modems and x-DSL will grow to 40 million users in just a few years.

Streaming video was identified as the crossroads of technology convergence. It is the bandwidth crisis of delivering video that will prove decisive in setting global standards and down-selecting competing technologies. The success of streaming video in its most cost-effective and customer-satisfying form will define the final technology convergence model into the 21st Century.

H. Peter Alesso

©2023 by hpeteralesso.com.
