Users own identity, data, and content. Real-time interactions. No central authority.
Everything you expect from modern social media, fully on-chain.
Users fully control their profiles and identity.
Posts stored on ICP canisters with on-chain ownership.
Likes, comments, reposts, and messaging update instantly.
Notifications triggered by likes/comments/reposts, follows, and messages.
Composable interfaces so developers can extend DeSocial.
Rust + ICP with sharded canisters. No single point of failure.
Notifications are triggered by Like/Comment/Repost, Follow/Unfollow, and Real-time Messaging. Profile & Notifications are accessible anytime via the Global Menu.
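The sharded-canister and notification design above can be sketched in plain Rust. This is a minimal illustration, not DeSocial's actual API: the `shard_for` routing function and the `Notification` variants are hypothetical names, assuming users are mapped deterministically to canister shards by hashing their identity.

```rust
// Hypothetical sketch: route a user to a canister shard and model the
// notification triggers (like/comment/repost, follow, message).
// All names here are illustrative assumptions, not the real DeSocial API.

#[derive(Debug)]
enum Notification {
    Like { post_id: u64 },
    Comment { post_id: u64 },
    Repost { post_id: u64 },
    Follow { follower: String },
    Message { from: String },
}

/// Deterministically map a user id to one of `num_shards` canister shards
/// (FNV-1a hash, chosen only for illustration).
fn shard_for(user_id: &str, num_shards: u64) -> u64 {
    let mut hash: u64 = 0xcbf29ce484222325;
    for b in user_id.bytes() {
        hash ^= b as u64;
        hash = hash.wrapping_mul(0x100000001b3);
    }
    hash % num_shards
}

fn main() {
    // The same user always resolves to the same shard, so state for that
    // user lives in one place while load spreads across shards.
    let s1 = shard_for("alice", 8);
    let s2 = shard_for("alice", 8);
    assert_eq!(s1, s2);
    assert!(s1 < 8);

    let n = Notification::Follow { follower: "bob".into() };
    println!("shard {} received {:?}", s1, n);
}
```

Because routing is deterministic and stateless, any frontend or canister can compute a user's shard locally, which is what removes the single point of failure mentioned above.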
flowchart TD
%% Entry & Auth
I[Create Internet Identity] --> A[Register / Login]
A --> H[Home / Feed]
%% Global Menu (always accessible)
M((Menu))
H -.-> M
A -.-> M
%% Menu Sections
M --> H
M --> CP[Create Post]
M --> EP[Edit Profile]
M --> EX[Explore Users]
M --> MSG[Messages]
M --> NTF[Notifications]
M --> FF[Followers & Following]
M --> LO[Log Out]
%% Create Post Options
CP --> PT1[Text Post]
CP --> PT2[Text + Image Post]
CP --> PT3[Text + Video Post]
%% Feed → Posts
H --> P1[View Post]
%% User's Own Post Actions
P1 -->|If Owner| P2[Edit Own Post]
P1 -->|If Owner| P3[Delete Own Post]
%% Interactions by Any User
P1 --> LCR[Like / Comment / Repost]
%% Explore Users
EX --> OU[View Another User's Profile]
OU --> FF2[Follow / Unfollow]
OU --> OPV[View Their Posts]
OPV --> P1
%% Edit Profile
EP --> EP_DONE[Profile Updated]
%% Notifications Triggers
MSG --> NTF
LCR --> NTF
FF2 --> NTF
%% Session End
LO --> X((Log Out))
While DeSocial avoids the problems of centralized platforms, full decentralization brings its own challenges.
Users may upload nudity, hate speech, or illegal content.
Without central moderation, fake accounts or spam could flood the network.
Storing heavy media (images, videos) directly on-chain can be expensive.
Messaging and interactions could be misused for harassment if not moderated properly.
Striking a balance between free speech and responsible use is difficult.
Users could exploit the app for scams, spreading misinformation, or manipulating communities.
Our plan mixes on-chain governance with practical user tools.
DAO-based moderation where users can vote/flag harmful content.
Mark sensitive/NSFW content rather than removing it; users choose what to see.
Bad actors lose reputation/tokens; good actors earn rewards for positive behavior.
Block/report, customizable filters, and strong privacy settings.
ICP canisters for metadata + IPFS/Arweave for large media; community-driven filters.
Users/communities subscribe to moderation lists (family-friendly, strict, or free-speech).
Join governance discussions and propose rules the community can vote on.
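The plan above can be sketched in Rust: posts are marked sensitive rather than deleted, users subscribe to a moderation list that decides visibility, and authors carry a reputation score. The type names and thresholds here are illustrative assumptions, not the shipped implementation.

```rust
// Hypothetical sketch of the moderation plan: mark, don't remove;
// let the subscribed moderation list decide what each user sees.
// Names and the reputation threshold are illustrative assumptions.

struct Post {
    author: String,
    text: String,
    nsfw: bool,             // flagged via DAO voting, never deleted
    author_reputation: i32, // earned/lost through community signals
}

enum ModerationList {
    FamilyFriendly, // hide NSFW content and low-reputation authors
    Strict,         // hide NSFW content only
    FreeSpeech,     // show everything
}

fn visible(post: &Post, list: &ModerationList) -> bool {
    match list {
        ModerationList::FamilyFriendly => {
            !post.nsfw && post.author_reputation >= 0
        }
        ModerationList::Strict => !post.nsfw,
        ModerationList::FreeSpeech => true,
    }
}

fn main() {
    let post = Post {
        author: "mallory".into(),
        text: "flagged content".into(),
        nsfw: true,
        author_reputation: -5,
    };
    // The post still exists on-chain; only its visibility differs per list.
    assert!(!visible(&post, &ModerationList::FamilyFriendly));
    assert!(!visible(&post, &ModerationList::Strict));
    assert!(visible(&post, &ModerationList::FreeSpeech));
}
```

The key design point is that filtering happens at read time per subscriber, so free-speech and family-friendly users can share the same underlying on-chain data.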
sh -ci "$(curl -fsSL https://smartcontracts.org/install.sh)"   # Install the DFINITY SDK (dfx)
dfx --version          # Verify the installation
dfx start --background # Start the local replica
dfx deploy             # Deploy the canisters
npm start              # Start the frontend dev server
dfx start --background # Start local replica
dfx deploy # Deploy canisters
npm run generate # Generate frontend bindings from Candid
npm start # Start React dev server
dfx stop # Stop replica