Business Ethics In Apps: On Censorship and Data Mining
Business ethics in apps has become a pressing question now that app censorship, data mining, and other unethical mobile app practices are becoming public knowledge.
Smartphones have come to play an increasingly intimate role in human behavior. We spend as much as three hours a day on mobile devices, which have become an integral part of our lives: taking photos of ourselves and our loved ones, making payments with our credit cards, and storing personal data we will need in the future.
This is a lot of trust to put into a device, and that trust is stretched even further when it comes to third-party mobile applications. These apps are ultimately built by individuals outside of the official Apple and Android teams, and while many are trustworthy, others have turned out to work against the user's interests.
A special type of business ethics exists in the mobile app development industry, and these ethics need to be respected in order to protect end users from potential exposure or disaster. Consider how much information a smartphone records about an individual, and it's easy to see why these ethics are essential for developers.
Before we talk about big topics like app censorship, let's take a look at more common examples of unethical mobile app practices and why these are so detrimental to enterprises and businesses launching their own apps:
Examples Of Unethical Mobile App Practices
There have been plenty of examples of less-than-savory behavior in the mobile app world, from collecting personal data for nefarious purposes to violating app store policies and worse.
These are typically applications that fall into two different categories:
- Apps built specifically for illegal purposes, designed to steal credit card details or other personal information that is then sold elsewhere.
- Apps built by inexperienced developers who don't know better when it comes to required permissions, ethics, or ethical UX design.
Many of the lower-quality applications on the app store use purposefully misleading prompts, permissions, or false promises to gain access to a user's information. Common examples include collecting more data than necessary through submission forms, disguising ads as normal content, and offering subscriptions that renew without reminding the user.
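The antidote to over-collection is data minimization: only accept the fields a feature genuinely needs and discard everything else. Here is a minimal, hypothetical sketch of that idea; the field names and the `minimize_submission` helper are illustrative assumptions, not any specific app's API.

```python
# Hypothetical data-minimization sketch: keep only the fields a sign-up
# flow actually needs, and drop anything extra a form might try to collect.

REQUIRED_FIELDS = {"email", "display_name"}  # assumed minimal set for sign-up


def minimize_submission(form_data: dict) -> dict:
    """Return only the fields the feature genuinely requires."""
    return {k: v for k, v in form_data.items() if k in REQUIRED_FIELDS}


submitted = {
    "email": "user@example.com",
    "display_name": "Sam",
    "contacts_list": ["..."],          # over-collection: not needed to sign up
    "precise_location": "42.1,-71.0",  # over-collection: not needed to sign up
}

print(minimize_submission(submitted))
# The contacts list and location never enter the app's storage layer.
```

The same allowlist principle applies to OS-level permission requests: ask only for the permissions a feature requires, at the moment the feature is used.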
These applications fall under the first category: apps made specifically to fool or mislead users. Apps that rely on highly unethical practices are regularly removed from the app store, and any mobile development engineer asked to build one on behalf of a company should see it as a major red flag.
Remember, as a developer, your professional reputation and past work are key to any future role you want to land, and you do not want a stain on your track record. Always do your due diligence when applying for a development role: review the company's existing portfolio and proposals to gauge how ethical its app projects are.
Then there is the less nefarious, but just as dangerous, inexperienced developer or development team. They build applications without always considering the impact on the end user and end up making costly mistakes. This is often the root cause of hacks and data leaks, and it falls under the second category of unethical app practices.
Less-experienced developers may lack the cybersecurity knowledge and familiarity with best practices needed to mitigate these risks. That makes for a risky situation all around: risky for the end users who lose their data, risky for the developers who receive bad reviews, and risky for the app's original funders and investors too.
Big Apps, Big Accusations
Now, it’s true that low-quality apps usually show their true colors soon enough and alert the user that not all is what it seems.
But what about the big apps that everyone uses?
TikTok is a short-video-based app that receives hundreds of millions of views a day. The app’s primary target market is young people, from tweens to people in their mid-twenties, but nearly everyone on the globe has seen some version of a TikTok video somewhere online. So what is the problem with this?
The problem is data mining.
Data mining, in this context, is the practice of gathering information about a user in order to advertise to them more accurately or influence their behavior. TikTok's parent company, ByteDance, has been accused of data mining, a practice that is even more questionable here because so many of the app's users are underage.
And TikTok is not the only app giant that has raised major issues surrounding privacy in recent years. Facebook, Twitter, and other social media platforms have all been accused of similarly accessing private data and even leaking it.
Social media platforms in particular carry a lot of weight in this conversation. Social apps are among the most commonly used applications, and they draw heavily on users' personal information to customize the experience, but there is a fine line between personalization and privacy invasion.
Many social media giants have also been accused of another type of unethical behavior: censorship. Curating social feeds to suit certain regional regimes is one example, but censorship on smartphones takes place in many other forms as well.
The Problem isn’t Just Apps - It’s App Stores
Bad apps aren’t the only source of heartache for mobile users and developers.
Even official app stores have made questionable decisions in the past that have left the public wondering about their role and the level of power they hold over digital content.
Back in 2019, Spotify accused Apple of using unfair practices to keep its own music service on top in the App Store, and that is just the tip of the iceberg. The Apple App Store has seen multiple censorship controversies and app removals in the past, such as Apple removing Parler, a self-described free speech platform, before eventually allowing it back onto the store.
This kind of power is a double-edged sword for mobile app users. On one hand, reviewing what gets uploaded helps app stores keep low-quality or risky apps out; on the other hand, it gives them the power to censor applications based on their own interests.
Solutions To Unethical Behavior In Apps
Getting rid of bad apps isn't as easy as reporting or removing them. Consider application giants like Facebook, whose legal teams would quickly contest and overturn such decisions.
The real solutions need to come from the ground up. From onboarding onwards for mobile developers, there should be a keen focus on using ethical practices and remaining within the stipulations set by official app stores.
Then, there needs to be a level of accountability. Companies running applications that collect personal data need to be taken to task over how they handle data storage and cybersecurity, for example. The punishments for not complying with the rules should be serious enough to set an example for others considering the same path.
And then there is the end user, who also needs to take on a level of responsibility for their own safety when using third-party apps. Greater awareness of how companies use this data will help end users question the intent behind the permissions they agree to, the subscriptions they pay for, and more.
As the old saying goes: trust, but verify.
How Businesses Can Ensure Their Apps Are Built Ethically
Ethical mobile application development takes a moment to learn but a lifetime to master, which is why businesses looking to partner with a development agency need to do their due diligence or risk working with the wrong kind of service provider.
Tip: The reality is that when you commission an application to be developed, you will get what you pay for. And if you choose to go cheap from the get-go, you might end up paying more than you expect when you need someone else to come in and fix it.
The better idea is to work with an experienced team that has a proven track record of delivering quality, ethical applications on time and maintaining them afterward. The problem is that hiring your own in-house team to do that is a time-consuming and costly process, one that could see your competitors reach the market before you do.
But what if you could work with a team that has already been vetted, delivered profitable apps to their clients, and has been recognized as the top mobile development agency in Boston by Clutch.co?
NineTwoThree Venture Studio is that team. We seamlessly integrate into your app plans from concept to completion, ensuring that your goals are met and timelines are kept on target while you focus on sales and marketing. The result? World-class applications loved by founders, developers, and users alike for their quality.
Build your application knowing it will be an asset to your brand, not a costly detriment.
Reach out to us today to learn more about our process and how we can help you get your app to market the right way.