From Scattered Assets to a Unified Vision: The Problem We Set Out to Solve
In many design communities, valuable resources are scattered across individual hard drives, Slack channels, and forgotten Google Drive folders. Our community faced a familiar pain point: designers and developers repeatedly creating similar components, wasting hours on redundant work. Junior members struggled to find high-quality assets, while senior contributors hoarded their best work out of habit. The lack of a centralized, curated library meant inconsistent brand representation, slower project timelines, and a missed opportunity for collective growth. This fragmentation not only hindered productivity but also dampened community morale—people felt their contributions were siloed rather than celebrated. We knew a better approach was possible, and the idea of a shared, community-curated library began to take shape. But we also recognized that simply dumping files into a repository wouldn't solve the underlying issues. We needed a system that balanced accessibility with quality control, encouraged participation without overwhelming volunteers, and ultimately demonstrated enough value to attract external support—like a design grant. This article chronicles our journey from scattered assets to a winning grant proposal, offering a replicable framework for other communities facing similar challenges.
Why a Shared Library Matters More Than You Think
A shared library does more than save time—it fosters collaboration, standardizes quality, and builds a sense of ownership among members. When everyone contributes to and benefits from a common resource, the community becomes stronger. In our case, the library became a tangible proof of our collective capability, which was instrumental in winning the design grant.
The Initial Assessment: Taking Stock of Our Chaos
We started by surveying our community of about 200 active members. The results were telling: 78% reported spending at least two hours per week searching for design assets, and 45% admitted to recreating components because they couldn't find the originals. This data, while not from a formal study, highlighted the urgent need for change. We documented these pain points to build a case for the grant.
Setting the Stage for Grant Success
Winning a design grant requires more than a good idea—it demands a clear problem statement, a feasible solution, and evidence of community impact. Our fragmented asset situation provided a compelling narrative. We framed the library as a way to reduce waste, accelerate project delivery, and democratize access to design resources. This narrative resonated with the grant committee, who saw the potential for ripple effects beyond our immediate community.
Lessons from Other Communities
We studied similar initiatives in open-source software and design systems. For example, the Bootstrap community's approach to component sharing and the Material Design library's governance model offered useful parallels. We adapted their principles—like clear contribution guidelines and regular review cycles—to our context, avoiding common pitfalls such as unchecked growth or low-quality submissions.
Building Momentum Before the Grant
Before applying for the grant, we launched a pilot library with a small group of trusted contributors. This allowed us to test our curation process, gather feedback, and demonstrate early wins. The pilot's success—measured by a 30% reduction in asset search time among participants—gave us concrete results to include in our grant proposal. It also built internal confidence that the project was viable.
Our initial problem was universal: fragmented resources hindering community productivity. By documenting this pain and piloting a solution, we laid the groundwork for a library that would eventually attract grant funding and serve as a model for others.
Core Frameworks: How We Designed the Curation System
A shared library without a curation framework is just a digital junkyard. We knew that for our community to adopt and trust the library, we needed a transparent, consistent system for evaluating and organizing assets. After weeks of discussion and iteration, we settled on a three-pillar framework: quality standards, categorization taxonomy, and contribution workflow. Each pillar addressed a specific aspect of the curation challenge and was designed to scale with community growth. The quality standards ensured that every asset met a baseline level of polish and usability. The taxonomy made it easy for members to find what they needed without wading through irrelevant items. The workflow balanced openness with oversight, allowing anyone to contribute while maintaining a review process to catch issues early. This framework became the backbone of our library and a key selling point in our grant application. Below, we break down each component and share the rationale behind our choices, including what we learned from initial missteps.
The Quality Standards: Defining 'Good Enough'
We established four criteria for every asset: (1) visual consistency with our community's style guide, (2) proper file naming and metadata, (3) compatibility with commonly used tools (Figma, Sketch, Adobe XD), and (4) inclusion of usage notes or examples. These standards were strict enough to maintain quality but flexible enough to encourage contributions. Early drafts were too rigid, deterring participation; we relaxed them after feedback.
The Taxonomy: Organizing for Discovery
Our taxonomy started with three top-level categories: UI Components, Icons, and Templates. Each had a single level of subcategories; UI Components, for example, contained Buttons, Form Elements, and Navigation Patterns. We kept the hierarchy flat because deep nesting confused users. Tags supplemented categories for cross-referencing (e.g., 'dark mode', 'responsive'). We refined the taxonomy through user testing, observing how members searched and adjusting accordingly.
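To make the taxonomy concrete, here is a minimal sketch in Python of a flat category-plus-tags structure with a simple lookup. The subcategories outside UI Components, the sample assets, and the `find_assets` helper are hypothetical; in practice the categories and tags lived in each asset's metadata fields rather than in code.

```python
# A minimal sketch of the flat taxonomy: top-level categories, one level of
# subcategories, and free-form tags for cross-referencing. Subcategories
# outside UI Components and the sample assets are hypothetical.
TAXONOMY = {
    "UI Components": ["Buttons", "Form Elements", "Navigation Patterns"],
    "Icons": ["System", "Brand"],              # hypothetical subcategories
    "Templates": ["Landing Pages", "Emails"],  # hypothetical subcategories
}

assets = [
    {"name": "button-primary", "category": "UI Components",
     "subcategory": "Buttons", "tags": ["dark mode", "responsive"]},
    {"name": "icon-search", "category": "Icons",
     "subcategory": "System", "tags": ["outline"]},
]

def find_assets(category=None, tag=None):
    """Return assets matching an optional category and/or tag."""
    matches = []
    for asset in assets:
        if category and asset["category"] != category:
            continue
        if tag and tag not in asset["tags"]:
            continue
        matches.append(asset)
    return matches

print(find_assets(category="UI Components", tag="dark mode"))
```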
The Contribution Workflow: From Submission to Approval
The workflow had five steps: (1) Contributor submits asset via a form with metadata, (2) Automated checks verify file integrity and naming, (3) Peer reviewers evaluate against quality standards, (4) Moderator approves or requests revisions, (5) Asset is published with attribution and date. This workflow reduced the burden on any single person and gave contributors clear expectations. We initially tracked submissions on a lightweight Trello board, later moving to a custom system.
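As a rough illustration of those five steps, the sketch below models submission status as a small state machine. The state and function names are hypothetical, since in practice the pipeline ran on a Trello board and later a custom tool.

```python
from enum import Enum, auto

class SubmissionState(Enum):
    SUBMITTED = auto()       # Step 1: contributor submits asset + metadata
    PRECHECKED = auto()      # Step 2: automated checks passed
    IN_PEER_REVIEW = auto()  # Step 3: peer reviewers scoring the asset
    NEEDS_REVISION = auto()  # Step 4a: moderator requests changes
    APPROVED = auto()        # Step 4b: moderator approves
    PUBLISHED = auto()       # Step 5: asset live with attribution and date

# Allowed transitions between states (illustrative, not exhaustive).
TRANSITIONS = {
    SubmissionState.SUBMITTED: {SubmissionState.PRECHECKED, SubmissionState.NEEDS_REVISION},
    SubmissionState.PRECHECKED: {SubmissionState.IN_PEER_REVIEW},
    SubmissionState.IN_PEER_REVIEW: {SubmissionState.APPROVED, SubmissionState.NEEDS_REVISION},
    SubmissionState.NEEDS_REVISION: {SubmissionState.SUBMITTED},
    SubmissionState.APPROVED: {SubmissionState.PUBLISHED},
}

def advance(current, target):
    """Move a submission to the next state if the transition is allowed."""
    if target not in TRANSITIONS.get(current, set()):
        raise ValueError(f"Cannot move from {current.name} to {target.name}")
    return target
```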
Balancing Openness and Control
A key tension was between welcoming contributions and maintaining quality. We addressed this by offering a 'sandbox' area where experimental assets could be shared without formal review, then promoted to the main library after refinement. This approach encouraged experimentation while keeping the core library reliable. It also reduced friction for new contributors who were unsure of the standards.
Iterating Based on Feedback
After three months, we surveyed contributors and users. Common complaints included slow review times (averaging 5 days) and unclear rejection reasons. We responded by adding more reviewers from different time zones and creating a template for rejection feedback. These changes improved satisfaction scores from 3.2 to 4.1 out of 5 within two months.
Our curation framework evolved through trial and error, but its core principles—transparency, consistency, and community input—remained constant. This framework not only made the library usable but also demonstrated to the grant committee that we had a sustainable governance model.
Execution and Workflow: Turning Theory into Practice
Having a framework on paper is one thing; making it work with a volunteer community is another. Execution required careful planning, clear communication, and a willingness to adapt. We formed a small core team of five volunteers: two coordinators, two reviewers, and one technical lead. Each had defined responsibilities, but we emphasized cross-training to avoid bus-factor risks. We set up weekly sync meetings to discuss bottlenecks, celebrate wins, and adjust priorities. The workflow itself was documented in a public wiki, complete with screenshots and video tutorials, to lower the barrier for new contributors. We also established service-level agreements (SLAs) for reviews: 48 hours for standard submissions, 24 hours for urgent ones (e.g., assets needed for a community event). These SLAs were ambitious for a volunteer team, but they motivated us to streamline processes. Below, we detail the specific steps we took to execute the curation workflow, including tools we used, communication channels, and how we handled conflicts.
Step 1: Submitting an Asset
We created a simple submission form using Google Forms, asking for the asset file (via link), a description, category tags, and a brief usage note. The form included a checklist of quality standards to remind contributors of requirements. We also allowed bulk submissions for experienced contributors, with a single form accepting up to 10 assets at once, which encouraged power users to contribute larger sets.
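A hedged sketch of the record behind each form entry follows, with field names mirroring what the form asked for; the `Submission` dataclass itself is illustrative rather than something we actually shipped.

```python
from dataclasses import dataclass, field

@dataclass
class Submission:
    """One entry from the intake form (illustrative field names)."""
    contributor: str
    file_url: str                       # link to the asset file
    description: str
    category_tags: list[str] = field(default_factory=list)
    usage_note: str = ""
    checklist_confirmed: bool = False   # contributor ticked the quality checklist

# A bulk submission is simply a list of up to 10 of these records.
bulk = [
    Submission(
        contributor="amara",
        file_url="https://example.com/button-primary.svg",
        description="Primary button in three sizes",
        category_tags=["UI Components", "Buttons"],
        usage_note="Use for the main call to action on a screen",
        checklist_confirmed=True,
    ),
]
```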
Step 2: Automated Pre-Check
Before human review, an automated script (written by our technical lead) checked file format, naming convention (e.g., 'button-primary.svg'), and file size (max 5 MB). Assets that failed received an automated email with instructions on how to fix issues. This step reduced the manual review burden by about 20% and provided immediate feedback to contributors.
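We haven't published the actual script, but a minimal Python version of the same three checks might look like the sketch below; the allowed extensions and the strict kebab-case pattern are assumptions based on the conventions described here.

```python
import os
import re

ALLOWED_EXTENSIONS = {".svg", ".png", ".fig", ".sketch", ".xd"}  # assumed set
MAX_SIZE_BYTES = 5 * 1024 * 1024  # 5 MB cap from our quality standards
NAME_PATTERN = re.compile(r"^[a-z0-9]+(-[a-z0-9]+)*\.[a-z]+$")  # e.g. button-primary.svg

def precheck(path: str) -> list[str]:
    """Return a list of problems; an empty list means the asset passes."""
    problems = []
    filename = os.path.basename(path)
    _, ext = os.path.splitext(filename)
    if ext.lower() not in ALLOWED_EXTENSIONS:
        problems.append(f"Unsupported file format: {ext}")
    if not NAME_PATTERN.match(filename):
        problems.append("Name must be lowercase kebab-case, e.g. button-primary.svg")
    if os.path.getsize(path) > MAX_SIZE_BYTES:
        problems.append("File exceeds the 5 MB size limit")
    return problems
```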
Step 3: Peer Review
Two peer reviewers from a rotating pool of 10 volunteers evaluated each asset independently using a rubric. They scored on a 1-5 scale for visual quality, usability, and documentation. If both scores averaged 3.5 or above, the asset was approved. If scores diverged by more than 1 point, a third reviewer was called in. This system reduced bias and ensured consistency.
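The decision rule is simple enough to express directly; this sketch assumes each reviewer's three rubric scores have already been averaged into one overall number per reviewer, and checks for divergence before approval so borderline disagreements always get a third opinion.

```python
def review_decision(score_a: float, score_b: float) -> str:
    """Apply the two-reviewer rule described above (illustrative).

    Each argument is a reviewer's overall 1-5 rating (their average across
    visual quality, usability, and documentation).
    """
    if abs(score_a - score_b) > 1:
        return "escalate to third reviewer"
    if (score_a + score_b) / 2 >= 3.5:
        return "approve"
    return "request revisions"

print(review_decision(4.3, 3.0))  # scores diverge by more than 1 point -> escalate
print(review_decision(4.0, 3.5))  # average 3.75 -> approve
```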
Step 4: Moderation and Publication
A moderator (one of the coordinators) reviewed the peer feedback and made the final decision. Approved assets were added to the library's main repository (hosted on GitHub and mirrored on a static site). The moderator also updated the asset metadata with review date and version number. Rejected assets received a detailed explanation and an invitation to revise and resubmit.
Step 5: Ongoing Maintenance
We scheduled quarterly audits to retire outdated assets, update metadata, and merge duplicates. During these audits, we also reviewed usage analytics (download counts, user ratings) to identify popular assets that might need maintenance or expansion. This proactive approach kept the library fresh and relevant.
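A quarterly audit pass can start from something as simple as the sketch below, which flags assets that look stale or rarely downloaded; the one-year and five-download thresholds are assumptions, not the exact rules we used.

```python
from datetime import date, timedelta

STALE_AFTER = timedelta(days=365)   # assumed threshold
LOW_DOWNLOADS = 5                   # assumed threshold per quarter

def audit(catalog, today=None):
    """Return (asset name, reason) pairs worth a closer look this quarter."""
    today = today or date.today()
    flagged = []
    for asset in catalog:
        if today - asset["last_updated"] > STALE_AFTER:
            flagged.append((asset["name"], "stale"))
        elif asset["quarterly_downloads"] < LOW_DOWNLOADS:
            flagged.append((asset["name"], "low usage"))
    return flagged

catalog = [
    {"name": "button-primary", "last_updated": date(2022, 9, 10), "quarterly_downloads": 42},
    {"name": "nav-drawer-old", "last_updated": date(2020, 6, 1), "quarterly_downloads": 2},
]
print(audit(catalog, today=date(2023, 3, 1)))
# [('nav-drawer-old', 'stale')]
```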
Handling Conflicts and Disagreements
Conflicts were inevitable—especially when rejecting a contributor's work. We established a 'reconsideration' process where contributors could appeal decisions to a separate panel of three experienced members. Only about 5% of rejections were appealed, and of those, half were overturned. This process built trust and showed that the system was fair.
Scaling the Workflow for Growth
As the library grew, we recognized that manual review wouldn't scale indefinitely. We began exploring automated quality checks using AI-based tools, but we kept humans in the loop for subjective judgments. We also introduced 'trusted contributor' status, allowing frequent, high-quality contributors to bypass some review steps, which reduced bottlenecks.
Execution required relentless attention to detail and a culture of continuous improvement. By documenting our workflow and iterating based on feedback, we turned a theoretical framework into a living system that served our community effectively.
Tools, Stack, and Economics: Building on a Budget
One of the biggest challenges for any community project is choosing the right tools without breaking the bank. We had zero budget initially—everything was volunteer time and free-tier services. Our technical lead evaluated options based on cost, ease of use, integration capabilities, and community familiarity. We settled on a stack that combined GitHub for version control, a static site generator (Eleventy) for the public-facing library, and a lightweight CMS (Decap CMS) for non-technical contributors to manage metadata. This stack was free, open-source, and extensible. For communication, we used Slack (which the community already had) and a public Discord server for broader discussions. Ongoing costs were negligible: a domain name ($12/year) and occasional cloud storage upgrades (under $10/month once we hit free-tier limits). Below, we break down the tooling decisions, their pros and cons, and the economic realities of maintaining a community library. We also share how we justified these costs in the grant proposal and what we would do differently with more funding.
Version Control: GitHub as the Single Source of Truth
GitHub provided free private repositories for our team of five, with public repositories available for the final library. We used branches for contributions (feature branches for each asset) and pull requests for review. This aligned with our workflow and allowed us to track changes, revert mistakes, and maintain a history. The learning curve was steep for non-developers, so we created step-by-step guides and held two workshops.
Static Site Generator: Eleventy for Speed and Simplicity
Eleventy generated a fast, searchable website from Markdown files. We customized templates to display assets with preview images, tags, and download buttons. The static nature meant low hosting costs (GitHub Pages) and quick load times. The downside was that updates required a rebuild, but we automated that with GitHub Actions triggered by merges to the main branch.
Content Management: Decap CMS for Non-Technical Curators
Decap CMS provided a web-based interface for adding and editing assets without touching code. It integrated with GitHub and allowed us to define custom fields (name, description, category, file URL). This was crucial for community members who weren't comfortable with git. We faced occasional sync issues, but they were manageable.
Communication and Coordination Tools
Slack remained our primary communication channel, with dedicated channels for submissions, reviews, and general discussion. We also used a public Trello board for tracking the pipeline—contributors could see where their asset was in the workflow. This transparency reduced anxiety and questions about status.
Economic Realities and Grant Justification
Our initial costs were negligible, but as the library grew, we needed more storage (for larger asset files) and potentially a dedicated server for faster builds. The grant proposal included a modest budget for these items, plus stipends for future community moderators (to sustain momentum). We emphasized that the grant would allow us to scale without burning out volunteers.
What We'd Do Differently with Funding
With grant money, we would invest in a dedicated asset management platform (like Frontify or a custom solution) to reduce manual overhead, hire a part-time curator to maintain quality, and offer small incentives for top contributors (e.g., gift cards or conference tickets). These investments would accelerate growth and improve user experience.
Lessons on Tooling
The key lesson was to start with simple, free tools and upgrade only when pain points became unbearable. Over-engineering early on wastes momentum. Our stack evolved: we began with a shared Google Drive and a spreadsheet, then moved to a basic website, and finally to the GitHub-based system. Each transition was driven by clear user feedback.
Choosing the right tools on a budget is about trade-offs. We prioritized low cost, ease of adoption, and alignment with our workflow. The grant allowed us to plan for future improvements, but the foundation was built with free resources and community effort.
Growth Mechanics: From Pilot to Grant-Winning Recognition
Growing a community library from a small pilot to a grant-winning project required intentional strategies for visibility, engagement, and persistence. After the initial launch, we focused on three growth levers: increasing contributions, driving usage, and building external recognition. Each lever required different tactics but reinforced the others. For contributions, we gamified the process with leaderboards and 'contributor of the month' awards. For usage, we integrated the library into community design challenges and showcased library assets in our newsletter. For recognition, we submitted the project to design award programs and reached out to design publications. The grant application was the culmination of these efforts—a way to formalize our impact and secure resources for sustainability. Below, we detail the specific growth mechanics that worked, the metrics we tracked, and how we maintained momentum over 18 months. We also share the story of how we stumbled upon the grant opportunity and crafted a proposal that stood out.
Tactic 1: Gamification and Recognition
We introduced a points system: 10 points for a successful contribution, 5 for a helpful review, and 2 for reporting an issue. Monthly leaderboards were shared in Slack, and the top contributor received a shoutout in our newsletter and a small custom badge for their portfolio. This friendly competition boosted contributions by 40% in the first two months.
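The points system is trivial to reproduce; a minimal sketch of the monthly tally follows, with the three point values taken from the scheme above and everything else (names, event format) hypothetical.

```python
from collections import Counter

POINTS = {"contribution": 10, "review": 5, "issue_report": 2}

def leaderboard(events):
    """Tally points per member from a list of (member, action) events."""
    totals = Counter()
    for member, action in events:
        totals[member] += POINTS[action]
    return totals.most_common()

events = [
    ("priya", "contribution"), ("priya", "review"),
    ("diego", "contribution"), ("diego", "issue_report"),
]
print(leaderboard(events))  # [('priya', 15), ('diego', 12)]
```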
Tactic 2: Integration with Community Activities
We embedded the library into our monthly design challenges—participants were encouraged to use library assets and submit new ones. This created a virtuous cycle: challenges drove library usage, and library assets made challenges easier. We also created 'starter kits' from library components for hackathons, reducing setup time for participants.
Tactic 3: External Visibility and Partnerships
We submitted the library to design resource directories (like Design Resources and Freebiesbug) and wrote guest posts for design blogs about our curation process. We also partnered with a local design school, where students contributed assets as part of their coursework. These partnerships brought in fresh perspectives and increased our reach.
The Grant Discovery and Application
A team member spotted the grant opportunity in a design industry newsletter. It was offered by a foundation supporting community-driven design projects. We adapted our existing documentation—problem statement, framework, workflow, and early metrics—into a proposal that highlighted our collaborative model and potential for replication. We also included testimonials from community members about how the library had impacted their work.
Metrics That Mattered
We tracked contributions (total assets, unique contributors), usage (downloads, page views), and engagement (review turnaround time, satisfaction scores). For the grant, we emphasized the reduction in redundant work (estimated 200 hours saved per quarter based on surveys) and the diversity of contributors (30% from underrepresented groups). These numbers, while not from a rigorous study, were grounded in community feedback and were persuasive.
Sustaining Momentum
Growth plateaus are common. After six months, contributions slowed. We reinvigorated the community by hosting a 'library sprint'—a weekend event where contributors aimed to add 50 new assets. The sprint was promoted with a countdown and live progress bar. It resulted in 73 new assets and re-engaged lapsed contributors.
Persistence Pays Off
The grant application process took three months and required multiple revisions. We were rejected on the first attempt but received feedback on improving our sustainability plan. We resubmitted with a clearer budget and a longer-term roadmap, and were accepted on the second try. This taught us the value of persistence and responsiveness to feedback.
Growth doesn't happen by accident. It requires deliberate strategies, consistent effort, and a willingness to iterate based on data. Our community's persistence turned a small library into a recognized project with the resources to thrive.
Risks, Pitfalls, and Lessons Learned the Hard Way
No project of this scale is without its challenges. Despite our best intentions, we encountered several pitfalls that threatened the library's success. Some were predictable—like contributor burnout and scope creep—while others caught us off guard, such as copyright disputes and technical debt from rapid prototyping. In this section, we share the most significant risks we faced, the mistakes we made, and the mitigations we implemented. Our goal is to help other communities avoid these same traps. We cover five major categories: governance failures, quality control breakdowns, technical issues, community dynamics, and grant-related risks. For each, we describe the scenario, the impact, and the corrective actions we took. We also reflect on what we wish we had known from the start, and how our approach evolved over time.
Pitfall 1: Governance Gridlock
In the early months, decisions about taxonomy changes or quality standard updates required consensus from the core team. This led to delays and frustration. We learned to adopt a 'lazy consensus' model: proposals were open for comment for one week, and if no objections were raised, they were implemented. This sped up decision-making while still allowing for input.
Pitfall 2: Quality Control Overreach
Initially, our quality standards were too strict, rejecting assets that were 'good enough' but not perfect. This discouraged contributions from junior members. We relaxed the standards and introduced a 'beta' tag for assets that met basic criteria but needed refinement. This change increased contributions by 60% and improved community satisfaction.
Pitfall 3: Technical Debt from Rapid Prototyping
Our early tooling was cobbled together quickly, resulting in a fragile system. For example, the automated pre-check script had a bug that incorrectly rejected files with underscores in names. We invested time in refactoring and wrote unit tests to prevent regressions. This upfront investment saved countless hours of firefighting later.
Pitfall 4: Contributor Burnout
Our core team of five was doing most of the work, leading to exhaustion. We recognized the signs early—missed deadlines, shorter communications, and decreased enthusiasm—and responded by recruiting more volunteers and rotating responsibilities. We also set explicit expectations: no one was expected to contribute more than 5 hours per week.
Pitfall 5: Copyright Ambiguity
A contributor submitted an icon set that was later found to be derivative of a commercial icon pack. This led to a tense situation where we had to remove the assets and update our contribution agreement to require original work or properly licensed derivatives. We now ask contributors to explicitly state the license of their assets (e.g., MIT, CC0, or custom).
Pitfall 6: Grant-Related Risks
After winning the grant, we faced pressure to deliver on promises made in the proposal. We had committed to a certain number of assets and users within a timeline. To avoid overpromising, we built in buffers and communicated progress transparently with the grantor. We also set aside a contingency budget for unexpected expenses.
Pitfall 7: Scope Creep
Community members often suggested ambitious features (e.g., a plugin for Figma, a mobile app). While these ideas were exciting, they diverted resources from core curation. We created a 'future ideas' document to capture suggestions without committing to them, and revisited it quarterly to prioritize the items that aligned with our strategic goals.
Mitigation Strategies That Worked
We established a risk register early on and reviewed it monthly. This proactive approach allowed us to catch issues before they escalated. We also fostered a culture where mistakes were openly discussed and documented, turning failures into learning opportunities for the entire community.
Pitfalls are inevitable, but they don't have to derail a project. By anticipating risks, responding quickly, and learning from mistakes, we built a more resilient library and a stronger community. The key is to embrace transparency and continuous improvement.
Frequently Asked Questions and Decision Checklist
Over the course of building our shared library, we received countless questions from other community organizers and individual contributors. Some questions were practical ('How do we handle licensing?'), while others were strategic ('When should we apply for a grant?'). In this section, we've compiled the most common questions along with our honest answers based on our experience. We also provide a decision checklist to help you evaluate whether your community is ready to undertake a similar project. This checklist covers readiness in terms of community size, technical skills, organizational capacity, and long-term commitment. Use it as a starting point for your own planning. Remember that every community is unique, so adapt these guidelines to your context.
FAQ 1: How many people do we need to start?
We started with a core team of five, but you can begin with as few as two or three dedicated individuals. The key is to have at least one person with technical skills (for the website) and one with organizational skills (to manage submissions and reviews). As the library grows, you can recruit more volunteers.
FAQ 2: What if our community is small?
A small community can still build a valuable library by focusing on a niche area. For example, if your community specializes in mobile UI design, create a library specifically for mobile components. Quality over quantity will attract users and contributors even if the community is small.
FAQ 3: How do we handle licensing and attribution?
We standardized on the Creative Commons Zero (CC0) license for all assets, meaning contributors waived their rights and the assets were public domain. This simplified reuse and avoided attribution headaches. However, some contributors preferred to retain credit, so we also allowed assets under MIT license with a clear attribution requirement in the file metadata.
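In practice this means every asset record carries an explicit license field; the sketch below shows one hypothetical way to validate that field and decide whether an attribution line is needed.

```python
ALLOWED_LICENSES = {"CC0", "MIT"}  # the two options we standardized on

def validate_license(asset: dict) -> str:
    """Check the license field and return any required attribution text."""
    license_id = asset.get("license")
    if license_id not in ALLOWED_LICENSES:
        raise ValueError(f"Unsupported or missing license: {license_id!r}")
    if license_id == "MIT" and not asset.get("author"):
        raise ValueError("MIT-licensed assets need an author for attribution")
    if license_id == "MIT":
        return f"© {asset['author']} (MIT)"
    return "No attribution required (CC0)"

print(validate_license({"name": "icon-set", "license": "MIT", "author": "Sam Lee"}))
```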
FAQ 4: How do we prevent the library from becoming outdated?
Regular audits are essential. We scheduled quarterly reviews to deprecate assets that were no longer relevant (e.g., outdated design trends) or broken. We also encouraged users to flag outdated assets via a simple form. Additionally, we tracked version updates for tools like Figma and updated assets accordingly.
FAQ 5: Should we apply for a grant before or after launching?
We recommend launching a pilot first. A working prototype with real usage data strengthens your grant proposal. Grant committees want to see that you have a track record, even if it's small. Our pilot ran for six months before we applied, and the results were instrumental in our success.
FAQ 6: What if we don't win a grant?
Not winning a grant doesn't mean the project is a failure. The library itself provides value to your community regardless of external funding. Continue to grow it organically, and consider alternative funding sources like sponsorships from design tool companies or crowdfunding campaigns.
Decision Checklist: Is Your Community Ready?
- Community size: At least 20 active members who can contribute or use the library.
- Technical skills: At least one person comfortable with version control and static site generation.
- Organizational capacity: A small team willing to commit 3-5 hours per week for at least six months.
- Clear use case: A specific gap or pain point that the library will address.
- Long-term commitment: Willingness to maintain the library for at least 12 months (the typical grant cycle).
- Community buy-in: Evidence that members are excited about the idea and willing to contribute.
Additional Resources
We've compiled a list of templates (contribution guidelines, review rubrics, grant proposal outlines) that we use and update regularly. These are available on our community website. Feel free to adapt them for your own projects.
We hope these answers and the checklist help you assess your readiness and avoid some of the bumps we encountered. Remember, the journey is as valuable as the destination—the community bonds formed through collaboration are a reward in themselves.
Synthesis and Next Actions: Turning Your Vision into Reality
Building a community-curated shared library that wins a design grant is an ambitious but achievable goal. Throughout this article, we've walked through the entire process: from identifying the problem of scattered assets, to designing a curation framework, executing a workflow, selecting tools, growing the library, navigating pitfalls, and answering common questions. Now, we synthesize the key takeaways into a set of actionable next steps. Whether you are just starting out or already have a pilot running, these steps will help you move forward with clarity and confidence. The most important lesson is that success depends on community engagement, transparent processes, and a willingness to iterate. The grant is a milestone, not the end goal—the real reward is a thriving community that creates and shares valuable resources together. We encourage you to start small, document everything, and celebrate every contribution.
Step 1: Conduct a Community Survey
Understand the specific pain points of your community. Ask about current challenges with design assets, what types of resources they need most, and their willingness to contribute. Use a simple online form and share it in your community channels. The results will inform your library's scope and build initial buy-in.
Step 2: Define Your Curation Framework
Based on survey results, draft quality standards, a taxonomy, and a contribution workflow. Start simple—you can always add complexity later. Share the draft with the community for feedback before finalizing. This collaborative approach ensures the framework meets everyone's needs.
Step 3: Build a Minimum Viable Library
Launch a pilot with a small set of high-quality assets contributed by the core team. Use free tools like GitHub and a static site generator. The goal is to have a working library that you can show to early adopters and gather feedback. Don't worry about perfection at this stage.
Step 4: Recruit and Train Contributors
Create clear documentation for how to contribute, including video tutorials if possible. Host a kickoff workshop to walk through the process and answer questions. Recognize early contributors publicly to encourage others. Build a pipeline of reviewers to distribute the workload.
Step 5: Promote and Gather Data
Integrate the library into your community's regular activities—challenges, design sprints, newsletters. Track usage metrics and collect testimonials. This data will not only help you improve the library but also strengthen a future grant application. Share success stories to maintain momentum.
Step 6: Explore Grant Opportunities
Research grants that align with your project's goals. Look for foundations or organizations that support open design, community building, or creative technology. Tailor your proposal to highlight community impact, sustainability, and replicability. Don't be discouraged by rejection; use feedback to improve and reapply.
Step 7: Plan for Sustainability
Even if you win a grant, plan for the long term. Consider creating a governance structure that can outlast initial funding. Diversify revenue through sponsorships, donations, or paid premium assets (while keeping the core library free). Continuously engage the community to ensure the library remains relevant and used.
The journey of building a shared library is as rewarding as the destination. You'll learn about collaboration, leadership, and the power of collective effort. We hope our story inspires you to take the first step. Good luck!