DALL-E Will Revolutionize Art, Bringing Ethical Concerns Along With It

July 5th, 2022

A realistic painting of a half-man, half-cyborg made with DALL-E 2
Artwork made with DALL-E 2 (Source: OpenAI)

DALL-E 2, an artificial intelligence created by OpenAI, can generate high-quality art from nothing but user-inputted text. Impressively, the pictures lack the telltale signs of typical AI-generated images: the composition is handled well, the context of the user's request is parsed correctly, objects don't awkwardly morph into one another, and faces are highly detailed. In contrast, DALL-E mini, an unaffiliated AI created to fulfill the same purpose, is severely lacking. Its limitations, such as its inability to render faces, restrict it to little more than memes and rough visualizations of basic ideas. Still, it represents an important stepping stone, as this technology is advancing rapidly.
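For readers who want to try something similar programmatically, OpenAI later exposed DALL-E through a public image API. The snippet below is only a minimal sketch, not part of the original post: it assumes the current openai Python client with an OPENAI_API_KEY in the environment, and the model choice, prompt, and image size are placeholders.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Request a single image from a plain-text prompt.
result = client.images.generate(
    model="dall-e-2",                      # placeholder model choice
    prompt="a realistic painting of a half-man, half-cyborg",
    n=1,
    size="512x512",
)

print(result.data[0].url)  # temporary URL of the generated image
```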

One point of contention about OpenAI's DALL-E is that it could be used to create malicious deepfakes of celebrities or other public figures in situations that would jeopardize their reputations. Because access is currently restricted to approved users, such misuse is largely prevented. However, it remains to be seen how the general public will use it once access widens. Most likely, it will continue to be used primarily for memes and for testing the boundaries of AI.

Opinion: Limitations on Software Can Be Beneficial

July 20th, 2022

Anyone who owns a Mac knows the typical story. Maybe you want to download a video game or an add-on for one. Maybe you just want to run a program for school. Either way, you know it's going to be a struggle to make sure the program works on your machine. However, this inconvenience comes with a potential benefit: optimization. Because programs have to be built specifically for the platform, developers must rewrite them for macOS, optimizing along the way, and software can't be run on devices it was never designed for. On top of that, the majority of viruses are designed for Windows computers, making it much more difficult for a virus to infect a Mac.

The same goes for iPhones. I have personally struggled to help my grandmother navigate her nearly unusable Android phone, plagued by viruses that opened the web browser and redirected her to random pages every couple of minutes. So I was relieved when I found an iPhone I could buy for her, knowing it was highly unlikely to become infected. To top it all off, Apple recently released a new feature, Lockdown Mode, which further reduces the risk of security flaws through measures such as restricting unknown calls and limiting iMessage attachments, making it even harder for an iPhone to be compromised. However, Apple has stated that this mode is intended only for those whose occupations put them at risk, such as politicians, journalists, and people in related professions. Even without Lockdown Mode, the iPhone's limitations have their drawbacks: for example, it needs to be jailbroken to customize the lock screen or to download certain apps.

While these apps can be fun to tinker with, they ultimately don't provide functionality that the average user truly needs. In addition, many developers now make sure their products are available on every platform, drawn by the appeal of reaching a larger audience. This is reflected in the sizable market share Apple phones hold in the United States: limiting a piece of software to Android would cut developers off from that large demographic. These restrictions are unlikely to ease, as viruses become increasingly hard to detect and companies such as Apple continue to make their phones harder to jailbreak with every update.

Quantum Technologies and Their Potential Impact on Computing

July 31st, 2022

Recently, an astonishing feat involving something called quantum entanglement was achieved. In short, quantum entanglement links two particles so that measuring one immediately tells you about the state of the other, no matter how far apart they are, hence the name "entanglement." Researchers were able to entangle two atoms twenty miles apart by having each atom emit a photon and sending the photons through fiber-optic cables. Does that sound like something we know and love today? The article briefly mentions that this feat is "an important step to realizing a quantum internet." This suggests that we may someday have practically instant upload and download speeds, with distance no longer being a limiting factor. Granted, it would be expensive to implement. But for data-heavy organizations such as NASA, a quantum internet could bring a level of efficiency previously thought to be impossible. As time passes, the technology would undoubtedly become more widespread.

However, quantum technology itself is nothing new. Quantum computers have existed for years. Instead of bits, which can only be 0 or 1, they store data in qubits (quantum bits), which can exist in a superposition of both states at once, letting them represent far more information than a single bit and making them exponentially faster than normal computers for certain problems. With a quantum internet to potentially go with them, it seems we may soon be in a new age of technology, one where near-instantaneous calculations and data transfer are possible. Unfortunately, quantum technology is very fragile: even minor changes in conditions can alter the state of a qubit. Worse, checking a qubit's state forces it to collapse to either 0 or 1, with the outcome determined probabilistically by its amplitudes, which makes quantum data difficult to read even once it is stored. We can only hope that advances in the technology will eventually let consumers take advantage of these developments.
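To make the collapse problem concrete, here is a minimal Python sketch, a classical NumPy simulation rather than real quantum hardware and not from the original post. A qubit's state is a pair of amplitudes; measuring it yields 0 or 1 at random with probabilities set by those amplitudes, and the superposition is destroyed in the process.

```python
import numpy as np

# A qubit's state is a pair of amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1; |alpha|^2 is the probability of measuring 0.
state = np.array([np.sqrt(0.7), np.sqrt(0.3)])  # "mostly 0, partly 1"

def measure(qubit):
    """Simulate measurement: the qubit collapses to 0 or 1 at random,
    with probabilities given by the squared amplitudes."""
    probabilities = np.abs(qubit) ** 2
    outcome = np.random.choice([0, 1], p=probabilities)
    collapsed = np.zeros(2)
    collapsed[outcome] = 1.0  # after measurement the superposition is gone
    return outcome, collapsed

outcome, state = measure(state)
print(f"Measured {outcome}; post-measurement state: {state}")
```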

New iOS Feature Can Benefit Teachers

September 7th, 2022

YouTuber Dave2D recently made a video showcasing Continuity Camera, a new feature arriving with iOS 16. It allows you to mount your iPhone on top of your laptop, using a separately sold mount, so the phone can serve as a virtual webcam. Dave2D claims that it "looks impossible" because it projects a perspective that does not appear achievable with the iPhone's built-in camera. The feature works by using the edge of the camera's field of view: with the phone mounted atop the Mac's screen and facing the user, it captures a live feed of the desk in front of it and streams that feed to the Mac, creating a desk camera without the need for costly equipment or specialized software.

Teachers stand to benefit from this new technology, since many of them rely on large, hard-to-adjust desk cameras to broadcast their worksheets. With the Desk View feature, they can free up space on the desk and eliminate the need for extra peripherals and software. This is especially helpful for teachers who move around often and cannot bring a desk camera with them, and for those who teach virtually, as it recreates the feeling of a real classroom without the struggle of writing on screen with a mouse.

However, the feature comes with some drawbacks. Most notably, the perspective-stretching the camera performs to produce a flat, two-dimensional view causes three-dimensional objects to appear skewed. Dave2D demonstrates this by putting his hands underneath the camera, and his fingers appear stretched to fill the screen. Still, this should not affect the usability of the feature. It will prove immensely valuable to teachers who work from worksheets, such as mathematics, science, and English teachers.

The Nothing Phone: The Future of Technology?

September 17th, 2022

The Nothing Phone(1) in black, showcasing the glyphs on the back
The Nothing Phone(1)'s simple design (Source: The Economic Times)

The aptly-named Nothing Phone(1) is an incredibly minimal phone designed to avoid unnecessary software and hardware. Its maker, Nothing, aims to make technology as simple as possible, a mission that seems at odds with the Android software the phone runs. Yet it pulls it off, pairing a sleek design with software built to be minimally intrusive. The front is a simple full-frame screen with only a small camera cutout at the top left. In the true spirit of minimalism, the phone makes the most of its limited design: a set of LED lights on the back forms various "glyphs" to alert you to different kinds of notifications. Thanks to its relatively stripped-back design, it retails for about $400 USD, making it the perfect device for someone who doesn't need a flashy phone.

The question is whether this trend will inspire tech giants such as Apple and Samsung to cut back on unnecessary features. Samsung, notorious for bundling superfluous apps and features, will likely continue to overload its phones with extra functions to gain an edge over competitors. Apple, on the other hand, has released stripped-back versions of its phones in the past, such as the iPhone SE, indicating a desire to appeal to audiences that want simpler, cheaper phones. If the Nothing Phone(1) is successful enough, Apple may follow in Nothing's footsteps by making its software and design even more minimal, with an added emphasis on functionality.

Why USB-C Should Be the Default

October 4th, 2022

The USB-C and USB 3.1 ports side-by-side, showing the USB-C's smaller size and reversible design
A USB-A 3.1 port (left) next to a USB-C port (right) (Source: OnLogic)

If you're anything like me and have a collection of different devices needing different cables, you'll appreciate the growing popularity of USB-C, a type of connection similar to a regular USB but with crucial differences. Designed to be a "universal standard," USB-C technology is becoming common on a wide range of devices, from phones to tablets, computers, and even headphones.

The connector's symmetrical shape makes it reversible, doing away with the typical USB plug's notoriously frustrating one-way fit. Not only is it easier to plug in, but the move toward a single standard means fewer cables will be needed in the future. In fact, the European Union has found it so convenient that it is requiring Apple to put USB-C ports in its phones instead of Lightning ports. The Lightning port found on Apple devices today is frustrating because it requires a unique cable that is prone to breaking. If the requirement is enforced in the EU, it could encourage Apple to make the same change in other countries, essentially eliminating the need to keep a dedicated Lightning cable on hand. And if every one of your devices uses the same cable, there is no longer any need to keep several types of cable around at all times. Not only is this more convenient, but USB-C connections generally support much faster data transfer than the typical USB-A port.

USB-C does come with a drawback, however: not all cables with the same connector behave the same. Some only carry power, some are built mainly for data, and some do both. Most USB-A cables, by comparison, handle both. For most people, who only need to charge their devices, this will not be an issue. Those who need to transfer data can continue to use USB-A, and if that technology is ever deprecated, they can switch to a USB-C cable that supports both charging and data transfer.

Ultimately, making USB-C the default will take some time, but it will result in increased convenience for many people.

The "Stagnation" of Innovation?

December 18th, 2022

The iPhone 13 Pro and 14 Pro side-by-side
The only notable difference between the iPhone 13 Pro and 14 Pro (Source: MacRumors)

As many people like to joke, the design of the iPhone 14 isn't very different from the iPhone 13. The most notable change was moving the camera "notch" down to form a floating island, and even that change isn't present on the base-model iPhone 14, only on the 14 Pro and Pro Max. And yet countless people upgrade their phones every year for reasons that seem absurd at times. What should we make of this apparent "stagnation" of innovation in technology design? With the time and resources a company like Apple has, shouldn't there be a new design every year? After all, it would make more people want to upgrade, wouldn't it? It's important to remember that scarcity makes something more desirable. A radical phone redesign doesn't come around very often, whether it's part of the Samsung line, the Apple line, or something else entirely. On top of this, it simply isn't feasible for companies to redesign their phones every year: they run the risk of introducing changes that are too radical or too costly, dissuading potential customers from upgrading. And if people will buy new phones regardless of how different they look, why bother?

It's also important to remember that other companies are guilty of this. Besides its flagship S line, for example, Samsung has numerous cheaper phones that recycle the same design. And while Samsung's flagships enjoy more design variation than Apple's, there still isn't much of a difference between, say, the Samsung Galaxy S21 and the S22; the most visible change is the material the back is made of, which tends to go unnoticed once it's hidden behind a case. Yet again, we see people go for the newest Samsung phones because of camera improvements and other factors that seem to get better every year.

In conclusion, just because innovation in the phone industry seems to be at a standstill doesn't mean it isn't happening. It may be disappointing to see a new phone released that looks exactly like the old one, but in the end this matters very little to consumers and companies alike.

How AI Can Be a Learning Tool

January 12th, 2023

Sample ChatGPT Conversation
An example of ChatGPT at work (Source: Mashable)

ChatGPT is just one of the many new AI-based technologies that have taken the world by storm, and it's already receiving backlash. For instance, many school districts, such as New York City's, have chosen to ban it from schools. This is a tale as old as time: people run away from new technology rather than embrace it. The early internet was dismissed as a "fad," and many refused to use it. Only time will tell whether AI will stick around or fade away, but either way, it is here now and has the capability to change the education system. I have seen many concerned computer science students fear that AI will take their jobs and that they need a new career path. However, AI can be worked with in ways that advance their education and their careers. For example, AI can help remove bugs from code: give it your code and the error message you're getting, and it will return an updated version that is (hopefully) error-free. It can also write the basic boilerplate code that shows up in many programs, tailored to exactly what you need. This allows programmers to increase their efficiency.
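As an illustration, here is a minimal sketch of that debugging workflow using OpenAI's Python client, which exposes the same family of models that power ChatGPT. It is not from the original post: the model name, prompt, and buggy snippet are placeholder assumptions, and an OPENAI_API_KEY is expected in the environment.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# A small buggy snippet and the error it produces, pasted in as plain text.
buggy_code = '''
def average(numbers):
    return sum(numbers) / len(numbers)

print(average([]))
'''
error_message = "ZeroDivisionError: division by zero"

# Ask the model to return a corrected version of the code.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; any chat-capable model works
    messages=[
        {"role": "system", "content": "You fix bugs in Python code."},
        {"role": "user", "content": f"This code fails with '{error_message}'. "
                                    f"Please return a fixed version:\n{buggy_code}"},
    ],
)

print(response.choices[0].message.content)  # the (hopefully) fixed code
```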

On the topic of education, AI can be used across many different subjects. Students of all disciplines can use it to have concepts explained in a personalized way that a teacher may not be able to match. Art students can use AI-generated images to explore basic ideas and references before they begin creating work by hand. Even teachers can use AI to generate example essays and to write directions and outlines for projects. However, ChatGPT's crucial flaw is that the information it provides is not always accurate, which is why its output requires careful evaluation before it can be used. With that caveat in mind, it has a promising future and should be embraced in classrooms and workplaces rather than shunned.