Former JP Morgan Private London Club to Open First Investor Pool on Matchpool
In our Use Case Series, we want to showcase how people can use Pools in specific industries.
The Matchmaking Economy enables any community or club owner to connect their network and match individuals for a reward. In this article we look at how former JPMorgan investor Herman Liu will open one of the first investor pools on the platform and bring value to his network.
In October 2016, Matchpool received seed investment from Alphabit Fund. Alphabit's CEO Liam Robertson started Alphabit with Herman Liu as Chief Strategy Officer. Liu has a private members network through which he syndicates deals. Liu organises regular private meetings and formally introduces his clients in person.
When Robertson first introduced Liu to Matchpool, Liu was quick to suggest bringing his private investor club onto the platform. Liu can set up an investor pool with custom parameters and rules so that his network can connect informally with each other inside the pool. Just like his private members network, Liu can
decide the price of the subscription fee to the pool or set an entrance
fee for members to join. He can then decide on how he would like his
network to interact on the platform. Liu would make it free for two members to start a private message conversation, so there would be no matchmaking fee. A matchmaking fee in this use case would compromise the reputation of a respectable community owner.
The three business models inside Matchpool give the Poolmaker the flexibility and choice to create an experience in their pool that meets the needs of their network.
We welcome business and investor community owners in any industry to Matchpool and look forward to enabling matchmakers all over the world.
What does it look like behind-the-scenes when advertisers participate?
Advertisers spend LUN to purchase impressions on pages that are matched with ads for relevance. They submit a quadruplet (A, K, B, G), where:
A is the textual ad
K is a list of keywords they would like associated with the ad that do not appear in A
B is the maximum bid, in LUN per 1000 impressions
G is the total budget for ads, in LUN
Keep in mind that LUN are divisible up to 18 decimal places, so you can send
0.123456789123456789 LUN if you want. Advertisers call the LUN Pool
contract on the blockchain, giving the hash of the ad+keywords as well
as their bid, and they transfer their budget G of LUN to the LUN Pool. (Advertisers must purchase LUN to advertise on the Lunyr platform.) Advertisers also send (A, K, B, G) to the ad auction module, which cross-references the blockchain and then computes an ad rank.
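As a sketch of what such a submission might look like in code. The field names, the `max_impressions` helper, and the smallest-unit convention are illustrative assumptions, not Lunyr's actual contract interface:

```python
# Illustrative sketch of the advertiser's (A, K, B, G) submission.
# LUN amounts are kept as integers of the smallest unit (10^-18 LUN),
# mirroring the 18-decimal divisibility mentioned above.
from dataclasses import dataclass

WEI_PER_LUN = 10 ** 18  # LUN is divisible to 18 decimal places

@dataclass
class AdSubmission:
    ad_text: str     # A: the textual ad
    keywords: list   # K: keywords not appearing in A
    max_bid: int     # B: max bid in smallest units, per 1000 impressions
    budget: int      # G: total budget in smallest units

    def max_impressions(self):
        """How many impressions the budget can buy at the maximum bid."""
        return self.budget // self.max_bid * 1000

sub = AdSubmission(
    ad_text="Learn about Ethereum",
    keywords=["crypto", "blockchain"],
    max_bid=2 * WEI_PER_LUN,    # 2 LUN per 1000 impressions
    budget=100 * WEI_PER_LUN,   # 100 LUN total
)
print(sub.max_impressions())  # 50000
```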
With our word embedding model, we can associate each document (a collection of
words) with a vector whose distance from other document-vectors
represents its semantic similarity to those documents, so we can define a
function relevance(doc1, doc2) that returns a number indicating the
relevance. We can use this function on A concatenated with K (A | K) and
each web page to get a relevance score. We then combine this relevance
score with B to determine the rank of the ad.
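The ranking step above can be sketched as follows. The embedding here is a hypothetical stand-in: a tiny hand-built vocabulary of 2-dimensional vectors replaces the trained word-embedding model, and the combination rule (relevance times bid) is our own simplification, just to show the shape of the computation:

```python
# Toy sketch of relevance scoring between an ad (A | K) and a page,
# using hand-built word vectors in place of a trained embedding model.
import math

# hypothetical pre-trained word vectors (2-dimensional for illustration)
VECTORS = {
    "bitcoin":  (0.9, 0.1),
    "ethereum": (0.8, 0.2),
    "token":    (0.7, 0.3),
    "cooking":  (0.1, 0.9),
    "recipe":   (0.05, 0.95),
}

def doc_vector(words):
    """Average the vectors of the words we know."""
    known = [VECTORS[w] for w in words if w in VECTORS]
    if not known:
        return (0.0, 0.0)
    n = len(known)
    return (sum(v[0] for v in known) / n, sum(v[1] for v in known) / n)

def relevance(doc1, doc2):
    """Cosine similarity between the two document vectors."""
    a, b = doc_vector(doc1), doc_vector(doc2)
    dot = a[0] * b[0] + a[1] * b[1]
    na, nb = math.hypot(*a), math.hypot(*b)
    return dot / (na * nb) if na and nb else 0.0

def ad_rank(ad_words, keywords, page_words, bid):
    """Combine relevance(A | K, page) with the bid B, as described above."""
    return relevance(ad_words + keywords, page_words) * bid

crypto_page = ["bitcoin", "ethereum"]
food_page = ["cooking", "recipe"]
ad, kw, bid = ["ethereum"], ["token"], 5.0
# The ad ranks higher on the semantically closer page:
print(ad_rank(ad, kw, crypto_page, bid) > ad_rank(ad, kw, food_page, bid))  # True
```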
Impression price = (the amount your nearest lower competitor pays / your quality score) + a small number.
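This pricing rule can be sketched directly; the epsilon value and the example figures are illustrative assumptions, not Lunyr's actual parameters:

```python
# Sketch of the impression-pricing rule stated above.
EPSILON = 0.01  # "a small number" -- illustrative value

def impression_price(nearest_lower_payment, quality_score):
    """(nearest lower competitor's payment / your quality score) + a small number."""
    return nearest_lower_payment / quality_score + EPSILON

# If the competitor just below you pays 2.0 LUN per 1000 impressions and your
# quality score is 0.8, you pay about 2.0 / 0.8 + 0.01 = 2.51 LUN.
print(impression_price(2.0, 0.8))
```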
Users click on ads. The ad layer will be running analytics that advertisers
can view to understand their ad performance. When the advertiser’s
budget for impressions has been exhausted, the ad is no longer served.
Ad-ranks are re-evaluated regularly.
How do we update content?
IPFS has a name service IPNS,
similar to DNS. Basically it inserts a layer of indirection between DNS
and IPFS, allowing us to make a permanent DNS record pointing to our
IPFS node’s id,
And then we can have an association between our node’s id and the latest content, which we can update very easily:
ipfs name publish <new content hash>
When anybody requests our node's id, IPFS will automatically search for the
content hash associated with that name that has the largest sequence
number (i.e. the latest).
What is word embedding and why do we want to use it?
The idea behind word embedding
is that words that are used in similar contexts probably have similar
meaning, so if we train a neural network to recognize when words are in
context and out of context, then that network will encode a lot of
semantic information. The reason this works is that the notion of
context is really flexible, and simply represents what the geometry of
the media-vector-space *should* look like locally. So if we have known
matches of context (via peer review), and the content includes images relating to
text, then we could train another neural network to associate vectors
with images that are close to the word vectors for words that are in
context and far from word vectors that are out of context. We can do this with any media. We can even do it with other languages by taking known pairs of synonymous words and treating that as the notion of in-context.
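The intuition above can be illustrated crudely. Real word embeddings train a neural network, but raw co-occurrence counts capture the same idea that words used in similar contexts end up with similar vectors; the corpus here is made up for the example:

```python
# Crude illustration: words occurring in similar contexts get similar
# co-occurrence vectors. This stands in for (and is much weaker than)
# a trained neural word embedding.
from collections import Counter

corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "buy cheap tokens now",
    "buy cheap coins now",
]

# context of a word = the other words in the same sentence
contexts = {}
for sentence in corpus:
    words = sentence.split()
    for w in words:
        ctx = contexts.setdefault(w, Counter())
        ctx.update(x for x in words if x != w)

def similarity(w1, w2):
    """Cosine similarity of co-occurrence count vectors."""
    a, b = contexts[w1], contexts[w2]
    dot = sum(a[k] * b[k] for k in a)
    na = sum(v * v for v in a.values()) ** 0.5
    nb = sum(v * v for v in b.values()) ** 0.5
    return dot / (na * nb)

# "cat" and "dog" share a context, so they are more similar to each other
# than either is to "tokens".
print(similarity("cat", "dog") > similarity("cat", "tokens"))  # True
```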
In a nutshell, this technique is both state-of-the-art and very flexible.
It was developed at Google as a marriage of old Natural Language
Processing (NLP) ideas with new neural network ideas. It has been demonstrated
that the word embedding technique doc2vec does very well on identifying
duplicate questions in Q&A forums. This is essentially what we want
to do: determine the similarity between bodies of text.
How do we use word embeddings?
We have a database of each document, its IPFS hash, its last hash
(documents are edited), its latest vector, and the model version that
produced the vector (the model may be retrained). Additionally, we have
an R-tree (or something like it), which is a way of storing a large number of
vectors in a hierarchy of rectangles in such a way that it makes it fast
to look up the nearest neighbors to a given vector. We compute the
vector corresponding to the text ad, and then use the R-tree to look up
the nearest N neighbors to that vector. We then look up the document
hashes corresponding to those vectors in the database. This gives us a
list of the N most relevant pages for that ad, which we can sort by
distance from the ad vector. The quality of an (ad, document) pair is
essentially how close together their vectors are in this vector space.
This then gets combined with the bid amount for that ad and the bids for
other nearby ads to determine the ad-rank. We store the word embedding
model on IPFS so anyone who wants to audit the process may do so. We
periodically recompute the vectors for the documents to account for edits and model retraining.
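The lookup step can be sketched as follows, with a brute-force distance search standing in for a real R-tree (e.g. one backed by libspatialindex); the hashes, vectors, and schema are illustrative, not Lunyr's actual data:

```python
# Minimal sketch of the ad-to-page matching step. A real deployment would
# use an R-tree for the nearest-neighbour search; brute force stands in
# for it here.
import math

# database rows: (ipfs_hash, document_vector) -- last-hash and
# model-version columns from the text are omitted for brevity
DOCUMENTS = [
    ("QmPageAboutEthereum", (0.8, 0.2)),
    ("QmPageAboutBitcoin",  (0.9, 0.1)),
    ("QmPageAboutCooking",  (0.1, 0.9)),
]

def nearest_documents(ad_vector, n):
    """Return the IPFS hashes of the n documents whose vectors are
    closest to the ad vector, sorted by Euclidean distance."""
    ranked = sorted(DOCUMENTS, key=lambda row: math.dist(ad_vector, row[1]))
    return [ipfs_hash for ipfs_hash, _ in ranked[:n]]

# An ad vector near the "crypto" region matches the crypto pages first.
print(nearest_documents((0.82, 0.18), 2))
# ['QmPageAboutEthereum', 'QmPageAboutBitcoin']
```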
How do the ads get served?
Each page request includes the hash of the content being served, and the ad layer matches this with the latest results from the Ad Repository. If the ad is clicked, the ad layer records the click; it also reports that the page was served, so the Ad Performance Module can track Click-Thru-Rate = Clicks / Impressions and present this to advertisers.
What is the Ad Repository?
Once the ad-rank is determined, several quadruplets are created: (ad-text, document hash, ad-rank, timestamp). These are stored in a database, and when a page is served, the ad layer looks up the top-ranked ads for that document and displays them.
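A minimal sketch of that repository lookup, with illustrative rows and field layout (not Lunyr's actual schema):

```python
# Sketch of the Ad Repository: store (ad_text, document_hash, ad_rank,
# timestamp) quadruplets and fetch the top-ranked ads for a given page.
AD_REPOSITORY = [
    ("Buy ETH here",   "QmPageAboutEthereum", 0.9, 1489500000),
    ("Crypto wallet",  "QmPageAboutEthereum", 0.7, 1489500001),
    ("Pizza delivery", "QmPageAboutCooking",  0.8, 1489500002),
]

def top_ads(document_hash, limit=2):
    """Return the ad texts for a document, highest ad-rank first."""
    matches = [row for row in AD_REPOSITORY if row[1] == document_hash]
    matches.sort(key=lambda row: row[2], reverse=True)
    return [row[0] for row in matches[:limit]]

print(top_ads("QmPageAboutEthereum"))  # ['Buy ETH here', 'Crypto wallet']
```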
What is the Ad Performance Module?
ad performance module records impressions and clicks for every ad, so
that advertisers can view the performance of their ads.
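A toy sketch of that bookkeeping; the class and method names are our own, not Lunyr's:

```python
# Count impressions and clicks per ad and expose the click-through rate,
# as the Ad Performance Module is described to do.
from collections import defaultdict

class AdPerformance:
    def __init__(self):
        self.impressions = defaultdict(int)
        self.clicks = defaultdict(int)

    def record_impression(self, ad_id):
        self.impressions[ad_id] += 1

    def record_click(self, ad_id):
        self.clicks[ad_id] += 1

    def ctr(self, ad_id):
        """Click-Thru-Rate = Clicks / Impressions."""
        shown = self.impressions[ad_id]
        return self.clicks[ad_id] / shown if shown else 0.0

perf = AdPerformance()
for _ in range(1000):
    perf.record_impression("ad-42")
for _ in range(25):
    perf.record_click("ad-42")
print(perf.ctr("ad-42"))  # 0.025
```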
What is the Ad Auction?
This is the engine that determines ad rank, so it interfaces with IPFS and can view the ads and bids that advertisers submit.
What is the LUN Pool?
The LUN Pool stores all the LUN that advertisers pay, along with newly
created LUN. These tokens are distributed at the end of every pay period
to Lunyr and the contributors in proportion to the CBN they earn.
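The pro-rata distribution can be sketched as follows; the names and figures are illustrative, and the handling of Lunyr's own share is omitted:

```python
# Sketch of the end-of-period LUN Pool payout: distribute the pool to
# contributors in proportion to the CBN each has earned.
def distribute(pool_lun, cbn_by_contributor):
    """Split pool_lun pro rata by CBN earned."""
    total_cbn = sum(cbn_by_contributor.values())
    return {
        who: pool_lun * cbn / total_cbn
        for who, cbn in cbn_by_contributor.items()
    }

payouts = distribute(1000.0, {"alice": 60, "bob": 30, "carol": 10})
print(payouts)  # {'alice': 600.0, 'bob': 300.0, 'carol': 100.0}
```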
The overall system design will be revealed soon. Stay tuned for more details.
The first two weeks of March have been an exciting time for the project. Advances in development, publicity, and the large rally in ETH have been the center of our attention during this time — some highlights are to be detailed below.
We’ve made advancements in development during the last two weeks as we’ve created basic prototypes for certain aspects of the fund while also refining other technical concepts. For instance, one of our more rough and early concepts — ‘Credit Proposals’ — which allow a decentralized VC platform to take on leverage, has been overhauled to allow for more secure and reliable collateral. While concepts like this are far from perfect, we enjoy the challenge of trying to bring new ideas into practice and engaging with knowledgeable members of the community along the way.
Additionally, we look forward to progress on the Ethereum Name Service (ENS) and StabL Coin platforms and are actively researching how they could improve Vega.
Last week we were the subject of an article published on both Bitcoin Magazine and NASDAQ by Giulio Prisco. This exposed Vega to a much wider audience and sparked valuable discourse with a variety of interested people. We were also the subject of an article on ETH News, written by Jim Manning. We’d like to thank both Prisco and Manning for the great articles — we appreciate the opportunity to be exposed to so many enthusiastic individuals.
Finally, we are excited to announce that we’ve officially added Mikko Ohtamaa, former CTO of LocalBitcoins and current Co-Founder at TokenMarket, to the Vega team as an advisor. His technical expertise and general experience in this market will be invaluable moving forward.
Thanks to all those who have recently taken interest in the project — we look forward to discussing ideas with you.
If you have any questions, comments, or feedback, please join our Slack, where we are most actively engaged.
The Starbase Project is announcing its Bounty Program for all Starbase supporters, helpers, and contributors. You can now earn Star Token (STAR) for activities all over the web: translating, writing articles, creating videos, and wearing signatures on your profile in the Bitcointalk.org forum.
In this post we will treat 1 STAR as $0.01.
All bounties are awarded on a first-come, first-served basis.
Blog and Media Bounty
If you have a blog or a crypto- or technology-related news website, you can participate in the blog bounty and earn a solid reward. If you can create a good video using your video creation skills, you can earn a solid reward as well.
We’ll pay the equivalent of $20 to $100 in STAR for each blog post, based on the post’s quality and your website’s traffic, and the equivalent of $10 to $25 in STAR based on your video’s quality and views.
1: A blog post must be written in your own words and contain unique content; copy-pasted content will not be accepted.
2: We accept blogs in English, Chinese, and Russian. For all other languages, you must send your blog to [email protected] and get approval.
3: An eligible blog post must contain at least 500 words (without spaces) to be accepted.
4: Your blog post must contain 2 links to the official website: http://starbase.co
5: For the video bounty, your video must be at least 2 minutes long.
6: The video must be descriptive and informative about the Starbase Project.
7: The video must remain visible on the video-sharing site until the end of the ICO; any video removed for any reason (copyright, etc.) will not be counted.
Translation and Local Community Management Bounty
Translate the official announcement thread, whitepaper, and Starbase FAQ into your language, manage the thread, keep it updated and active, and earn Star Token.
Earn the equivalent of $25 to $50 in STAR for each high-quality, artistic translation.
Earn the equivalent of $1 to $5 in STAR for every new post, update, and announcement in the local thread.
1: Announcement thread.
Download the PSD files for the ANN thread: (Link will be updated here)
1: Translations must be original; using Google Translate or any other translation tool will cause your submission to be rejected.
2: You must finish the translation within a week of reserving it.
3: Translators must manage their local threads, posting all important announcements, updates, and news from the main thread, and answering other users’ questions and providing support.
Interested translators should PM @jamalaezaz on Bitcointalk or Slack to reserve their spot.
Signature and Avatar Campaign:
You can now earn STAR by joining the signature and avatar campaign on the Bitcointalk forum, wearing the Starbase signature and avatar on your profile.
Earn the equivalent of $100 in STAR for wearing the Starbase signature/avatar during the whole campaign period.
Payment will be at the following rates (per month):
Jr Member: equivalent of $25 in STAR
Member: equivalent of $30 in STAR
Full Member: equivalent of $50 in STAR
Sr Member: equivalent of $75 in STAR
Hero/Legendary: equivalent of $100 in STAR
Full Members and above can earn an additional $10 for wearing the avatar.
1: Users are not allowed to remove the signature during the campaign period; removing it during that period will result in no payment.
2: Users must make a minimum of 50 posts; accounts with fewer than 50 posts will not be paid.
3: Spam, low-quality, and off-topic posts will not be counted for payment.
4: One account per user. Using multiple accounts to cheat the bounty is not allowed; any user found cheating in any form of bounty will be permanently blacklisted.
Newsletter Subscribers Bounty
Subscribe to the newsletter on the official website by adding your email address, and earn the equivalent of $0.50 in STAR for subscribing.
Only one email per user; users found to be using multiple emails will be blacklisted and will not receive the bounty payment.