Inline Functions in Props
Mistake: Defining functions inline in props, which creates a new function on every render and can cause unnecessary re-renders of child components.
Wrong Code:
import { Button } from 'react-native';

const MyComponent = () => {
  return (
    <Button title="Press Me" onPress={() => console.log('Button Pressed')} />
  );
};
Correct Practice: Define the function outside the component or use useCallback to memoize it.
import { useCallback } from 'react';
import { Button } from 'react-native';

const MyComponent = () => {
  const handlePress = useCallback(() => {
    console.log('Button Pressed');
  }, []);

  return <Button title="Press Me" onPress={handlePress} />;
};
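Worth noting: useCallback only pays off when the component receiving the handler actually compares props, e.g. when it is wrapped in React.memo (the built-in Button is not memoized). A minimal sketch, with FancyButton as a hypothetical memoized child added purely for illustration:

import { memo, useCallback } from 'react';
import { Pressable, Text } from 'react-native';

// Hypothetical memoized child: it re-renders only when its props change
const FancyButton = memo(({ title, onPress }) => (
  <Pressable onPress={onPress}>
    <Text>{title}</Text>
  </Pressable>
));

const MyComponent = () => {
  // Stable reference across renders, so FancyButton's props stay equal
  const handlePress = useCallback(() => {
    console.log('Button Pressed');
  }, []);

  return <FancyButton title="Press Me" onPress={handlePress} />;
};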
Ignoring Platform-Specific Differences
Mistake: Hardcoding platform-specific logic instead of handling differences dynamically.
Wrong Code:
import { StyleSheet, Text } from 'react-native';

const MyComponent = () => {
  return <Text style={styles.text}>Hello</Text>;
};

const styles = StyleSheet.create({
  text: {
    fontSize: 20,
    paddingTop: 20, // Might look bad on iOS
  },
});
Correct Practice: Use Platform.select or conditional logic to handle differences.

import { StyleSheet, Text, Platform } from 'react-native';

const MyComponent = () => {
  return <Text style={styles.text}>Hello</Text>;
};

const styles = StyleSheet.create({
  text: {
    fontSize: 20,
    paddingTop: Platform.select({ ios: 20, android: 10 }),
  },
});
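When the difference goes beyond a single style value, Platform.OS with a plain conditional works too; a small sketch with illustrative values:

import { Platform, StyleSheet, Text } from 'react-native';

const MyComponent = () => {
  // Branch on the current platform instead of hardcoding one behaviour
  const greeting = Platform.OS === 'ios' ? 'Hello, iOS' : 'Hello, Android';
  return <Text style={styles.text}>{greeting}</Text>;
};

const styles = StyleSheet.create({
  text: {
    fontSize: 20,
    // Platform.select can also return whole style objects per platform
    ...Platform.select({
      ios: { paddingTop: 20 },
      android: { paddingTop: 10 },
    }),
  },
});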
Hardcoding Dimensions Instead of Using Flexbox
Mistake: Using fixed dimensions for layout, making the app non-responsive.
Wrong Code:
import { View, StyleSheet } from 'react-native';

const Box = () => <View style={styles.box} />;

const styles = StyleSheet.create({
  box: {
    width: 100,
    height: 100,
    backgroundColor: 'blue',
  },
});
Correct Practice: Use flex for responsive layouts.
import { View, StyleSheet } from 'react-native';

const Box = () => <View style={styles.box} />;

const styles = StyleSheet.create({
  box: {
    flex: 1,
    backgroundColor: 'blue',
  },
});
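flex also lets sibling views share space proportionally, which is what keeps the layout responsive on any screen size; a quick sketch with illustrative proportions:

import { View, StyleSheet } from 'react-native';

// The two boxes split the available width 1:2 regardless of screen size
const Row = () => (
  <View style={styles.row}>
    <View style={styles.small} />
    <View style={styles.large} />
  </View>
);

const styles = StyleSheet.create({
  row: { flex: 1, flexDirection: 'row' },
  small: { flex: 1, backgroundColor: 'blue' },
  large: { flex: 2, backgroundColor: 'skyblue' },
});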
Hey guys!
Just wanted to share this cool article on React Design Patterns. No matter how much experience we've got, it's always a good idea to refresh the basics – helps us write cleaner code and build better apps!
Check it out:
https://dev.to/codeparrot/react-design-patterns-best-practices-for-scalable-applications-46ja
Hermes
Hermes is enabled by default in new Expo projects, but ensure it's configured:
// app.json
{
  "expo": {
    "jsEngine": "hermes",
    "plugins": [
      ["expo-build-properties", {
        "android": { "enableHermes": true },
        "ios": { "enableHermes": true }
      }]
    ]
  }
}
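To double-check at runtime that your bundle is actually running on Hermes, the usual trick is to look for the HermesInternal global:

// Returns true when the JS bundle is executed by Hermes
const isHermes = () => !!global.HermesInternal;
console.log('Running on Hermes:', isHermes());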
Impact:
- App startup: 60% faster
- Memory usage: 30% savings
- Bundle size: smaller with bytecode pre-compilation
Note: the New Architecture has been enabled by default for new projects since Expo SDK 52 (November 2024), and is the default for all projects since SDK 53 (April 2025).
Splash Screen: Kill the White Flash of Death
Properly configured splash screens prevent the dreaded white flash during app startup:
// app.json
{
  "expo": {
    "plugins": [
      [
        "expo-splash-screen",
        {
          "backgroundColor": "#232323",
          "image": "./assets/images/splash-icon.png",
          "dark": {
            "image": "./assets/images/splash-icon-dark.png",
            "backgroundColor": "#000000"
          },
          "imageWidth": 200
        }
      ]
    ]
  }
}
Control splash screen programmatically:
import { useCallback, useEffect, useState } from 'react';
import * as SplashScreen from 'expo-splash-screen';
import * as Font from 'expo-font';

// Prevent auto-hide (call at module scope, before the app renders)
SplashScreen.preventAutoHideAsync();

// Set animation options
SplashScreen.setOptions({
  duration: 1000,
  fade: true,
});

// Inside your root component:
const [appIsReady, setAppIsReady] = useState(false);

// Hide when app is ready
useEffect(() => {
  async function prepare() {
    try {
      // Load fonts, make API calls, etc.
      // (MyFont and loadUserData stand in for your own assets/helpers)
      await Font.loadAsync(MyFont);
      await loadUserData();
    } catch (e) {
      console.warn(e);
    } finally {
      setAppIsReady(true);
    }
  }
  prepare();
}, []);

const onLayoutRootView = useCallback(() => {
  if (appIsReady) {
    SplashScreen.hide();
  }
}, [appIsReady]);
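For completeness, onLayoutRootView only does its job once it is attached to the root view; a minimal sketch of how the pieces fit together (the prepare() effect and app content come from your own code, as in the snippet above):

import { useCallback, useState } from 'react';
import { View } from 'react-native';
import * as SplashScreen from 'expo-splash-screen';

export default function App() {
  const [appIsReady, setAppIsReady] = useState(false);

  // ...the prepare() effect from above runs here and sets appIsReady to true

  const onLayoutRootView = useCallback(() => {
    if (appIsReady) {
      SplashScreen.hide();
    }
  }, [appIsReady]);

  if (!appIsReady) {
    return null; // keep the native splash visible until everything is loaded
  }

  return (
    <View style={{ flex: 1 }} onLayout={onLayoutRootView}>
      {/* your app content */}
    </View>
  );
}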
Pro tips for a seamless experience:
- Match colors: Use the same backgroundColor as your first screen
- Proper timing: Hide the splash only after content is ready to render
- Dark mode support: Always provide a dark variant for better UX
Changelog vs Release Notes
What's the difference? How do you use them correctly?
My friend recently posted an article about this – check it out!
Hey guys!
I just dropped a new article on how to style React Native apps the right way – best practices, tips, and examples included.
Check it out: The Complete React Native Styling Guide
Let me know what you think!
React Native Date Picker
This library solves the problem of implementing Date/Time picker components in your app.
· Fully native implementation for all platforms
· TurboModules support
· Modal and inline modes
· Extensive customization options
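A rough usage sketch of the modal mode, assuming the package is installed as react-native-date-picker (prop names follow the library's documented modal API, but treat this as a sketch rather than a drop-in snippet):

import { useState } from 'react';
import { Button } from 'react-native';
import DatePicker from 'react-native-date-picker';

const BirthdayPicker = () => {
  const [date, setDate] = useState(new Date());
  const [open, setOpen] = useState(false);

  return (
    <>
      <Button title="Pick a date" onPress={() => setOpen(true)} />
      {/* modal mode; drop the `modal` prop to render the picker inline */}
      <DatePicker
        modal
        open={open}
        date={date}
        onConfirm={(picked) => {
          setOpen(false);
          setDate(picked);
        }}
        onCancel={() => setOpen(false)}
      />
    </>
  );
};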
Hey everyone!
I've decided to take this channel in a new direction.
From now on, React Native Hub will become my personal space – Ars Dev. Don't worry, I'll still be sharing content about React Native, mobile development, and everything around it. The only difference is that it will be more personal, with my own thoughts, tips, and insights as a developer.
If you've been following me for React Native content – you'll still get plenty of that. But you'll also see a bit more of my journey in indie app development.
If you enjoy the content here and want to support the growth of this channel, you can buy me a coffee.
Thanks for being here!
Expo SDK 54 Released
Yesterday Expo announced the release of SDK 54. Here are the highlights and what you need to know when upgrading.
· Faster iOS builds – dependencies are now precompiled XCFrameworks
· iOS 26 support – new Liquid Glass icons + expo-glass-effect for glass UI
· Edge-to-edge on Android enabled by default
· Better updates – progress tracking, custom headers, and new reload options
· New packages – expo-app-integrity, stable expo-file-system, improved expo-sqlite
· Breaking change – the Legacy Architecture ends with SDK 54
Full changelog: expo.dev/changelog/sdk-54
Hey everyone!
Yeah, I know – it's been way too quiet here. I had a vacation and now I'm finally back in the routine.
Posts are coming back. Regularly this time.
And if you ever have topics or ideas you'd like me to cover, feel free to share them anytime.
Good to be back!
I wanted to share a library I recently came across – react-native-animated-glow.
It's a free, open-source package for adding animated glow effects to buttons, images, and UI elements. The cool part? There's a visual builder on their site where you can tweak animations, see results in real-time, and just copy the code. Super satisfying to play with.
It could be nice for those key moments where you want users to feel something – like confirming a payment or highlighting a premium feature.
Google just quietly dropped Code Wiki – a platform that transforms any repository into interactive documentation.
The tool automatically maps your entire project, generates diagrams, and even creates video walkthroughs using NotebookLM. Plus, you can chat with Gemini to clarify anything you're confused about. Oh, and it's completely free.
Definitely bookmarking this one.
Google Released Their Cursor Alternative
So Google released Antigravity (https://antigravity.google) – their answer to Cursor and other AI coding tools.
The interesting part? They're positioning it differently. Instead of just "write code faster," they're saying: focus on building solutions, not writing individual lines of code. The emphasis is on AI agents and an integrated AI experience throughout the development flow.
UPD: Google has already published an introductory course.
Let me know what you think. If there's enough interest, I'll dive deep and share what I find.
Save Money on Expo Builds
Found a really cool tool that helps generate GitHub Actions workflows and build Expo apps on the GitHub free plan.
The basic Expo plan at $19/month gives you at most:
· 30 Android builds, or
· 15 iOS builds
With expobuilder, you can use GitHub Actions' 2000 free minutes, which is roughly enough for:
· 200 Android builds, or
· 20 iOS builds per month (macOS runners consume free minutes at 10x)
How does it work?
· expobuilder generates a workflow that uses expo-cli --local to build on GitHub runners
· you need an Expo account and a generated Expo Token
· even if you're not familiar with GitHub Actions, you can set up a CI/CD pipeline from scratch by following the instructions
Setting up secrets:
· you only need EXPO_TOKEN to get started
· for notifications and store publishing, add the other keys
Build storage:
· artifacts are saved in GitHub Releases by default, but you can connect your own storage
This is a legit money-saver if you're building frequently.
Have you tried building Expo apps with GitHub Actions? I'd love to hear your experience!
Hi guys! Since this channel is growing, I want to get to know you better – what's your stack?
(This helps me know what to share more of!)
Anonymous Poll
· Frontend (web) – 22%
· Backend – 14%
· Mobile (native or cross-platform) – 14%
· Full-stack – 39%
· DevOps / Infrastructure – 5%
· Other – 6%
A new interview with Ilya Sutskever has recently been published.
Here are the main points he talked about:
1. Models today can crush benchmarks, ace IQ tests, and solve olympiad-level problems. But they struggle with real-world tasks. The current training approach has hit a ceiling.
2. The "throw more compute at it" formula is exhausted.
What matters now isn't adding more power – it's discovering new training methods. We have enough compute, but it's not delivering the exponential gains we used to see. The focus is shifting to new algorithms and research with existing models.
3. The difference between a model and a human? Humans learn fast from tiny amounts of data. We literally build our worldview from fragments of information and self-correct along the way. AI needs to consume entire knowledge bases and still struggles with context. Bottom line: humans are way more efficient learners right now.
4. Interesting take on AGI: it's not a model that knows everything. It's a model that knows how to learn. When that happens, progress will accelerate dramatically. Why? Millions of AI workers learning like humans, but faster. That's going to shake up the job market hard.
5. AI can't verify its own actions yet. Humans have emotions, intuition, that gut feeling when something's off – it's a feedback system. AI just executes functions. Without this mechanism, it's unreliable.
6. Big models will be rolled out gradually so society can adapt. Just like GPT existed and functioned for 3 years before being shown to the public.
The overall picture? We're not racing toward one massive Skynet. Instead, we're heading toward specialized AIs, each mastering its own domain. Once we crack that, we'll clone millions of copies – and that's when the real shift begins. Some roles will vanish, others will survive.
So what should we do? Learn to work with these tools, not against them. And double down on the skills that can't be automated.
Watch the interview
Ilya co-founded OpenAI and was one of the key minds behind GPT. In 2024, he left OpenAI to start Safe Superintelligence Inc – a company focused on building superintelligence with safety at its core.