Search Results

Search found 6357 results on 255 pages for 'facebook graph'.

Page 142/255 | < Previous Page | 138 139 140 141 142 143 144 145 146 147 148 149  | Next Page >

  • How can I perform sentiment analysis on extracted text from online sources?

    - by aniket69
    I'm working on extracting the sentiment from YouTube comments, blogs, news content, Facebook wall posts, and Twitter feeds. I'm looking for an automated way to do this; the two third-party solutions I've found are AlchemyAPI and RapidMiner. Are these the best options for this project, or should I be using something else? Is there a more efficient way to approach sentiment analysis? What techniques have worked for you in a project like this?
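
    One free, automated route is a lexicon-based scorer. Below is a minimal sketch assuming Python and NLTK's bundled VADER analyzer; the question names no language or library, so both choices (and the sample comments) are assumptions:

        import nltk
        from nltk.sentiment import SentimentIntensityAnalyzer

        # One-time download of the VADER lexicon the analyzer relies on.
        nltk.download("vader_lexicon", quiet=True)

        analyzer = SentimentIntensityAnalyzer()

        comments = [  # hypothetical sample inputs
            "Loved this video, really well explained!",
            "Terrible post, a complete waste of time.",
        ]

        for text in comments:
            scores = analyzer.polarity_scores(text)
            # 'compound' is a normalized score in [-1, 1]; VADER convention
            # treats values above 0.05 as positive, below -0.05 as negative.
            if scores["compound"] > 0.05:
                label = "positive"
            elif scores["compound"] < -0.05:
                label = "negative"
            else:
                label = "neutral"
            print(label, scores)

    VADER is tuned for short social-media text, which suits comments and tweets; longer news articles usually call for sentence-level scoring or a trained classifier.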

    Read the article

  • PRISM: The NSA reportedly also spied on the European Union; European officials shocked by the revelations

    The PRISM project authorizes US federal agencies to search through our data stored online, a former intelligence employee reveals. Based on leaks from a former American intelligence employee, the American newspaper The Washington Post revealed that the US National Security Agency (NSA) and the FBI had access to the databases of nine internet heavyweights, among them Facebook, Google and, more recently, even Apple. The project, code-named PRISM and in place since 2007, allows the two agencies to search those companies' customer data without any prior court order.

    Read the article

  • PRISM: The NSA sets up new IT security systems to monitor system administrators

    The PRISM project authorizes US federal agencies to search through our data stored online, a former intelligence employee reveals. Based on leaks from a former American intelligence employee, the American newspaper The Washington Post revealed that the US National Security Agency (NSA) and the FBI had access to the databases of nine internet heavyweights, among them Facebook, Google and, more recently, even Apple. The project, code-named PRISM and in place since 2007, allows the two agencies to search those companies' customer data without any prior court order.

    Read the article

  • Interview: Eben Moglen - Freedom vs. The Cloud Log

    The H Open: "Free software has won: practically all of the biggest and most exciting Web companies like Google, Facebook and Twitter run on it. But it is also in danger of losing, because those same services now represent a huge threat to our freedom..."

    Read the article

  • Total Solar Eclipse 13/November/2012 - update

    - by TATWORTH
    Panasonic Eclipse Live will start their broadcast at 18:30 UT tonight at https://www.facebook.com/PanasonicEclipseLive/app_435671416492320. Alternative URLs are http://www.ustream.tv/channel/panasonic-eclipse-live-by-solar-power-1 and http://www.ustream.tv/channel/panasonic-eclipse-live-by-solar-power-2/collection/744e86aa753e. (The start time is 04:30 local Australian time, and for them it will be 14 November.)

    Read the article

  • Should I use my own public API on my site (via JS)?

    - by newboyhun
    First of all, this question is quite different from other public-API questions such as "Should a website use its own public API?" (and sorry for my English). You can find the question summarized at the bottom.

    What I want to build is a big website with a public API, so that people who like programming (like me) and like my website can work with my site's data in a much better way (with some restrictions, of course). Almost everything would be available through the public API. Because of this, I have been thinking about making the whole website AJAX-driven. Some parts of the API, such as login and registration, would be limited to my own domain. On the client side there would be only an INTERFACE, which would use the public and private API to do its work. The website would be ONLY CLIENT SIDE; that is, it would use nothing but AJAX calls to the API. I imagine it like a mobile application: the client sends a request to the web server, gets JSON back, parses it, and uses the result to move on in the application (e.g. login).

    My thoughts:

    Pros:
    - The whole website is built in JavaScript, so I don't have to send HTML to the client, which should save bandwidth (I hope so).
    - Anyone can use my website's data to build their own cool things. (Is this a con or a pro?)
    - The public API is always in use, so I can see quickly whether there are any errors.

    Cons:
    - Without JavaScript the website is unusable.
    - Bad actors can easily load the server by requesting too much data (say, 10,000 requests per second), though this can be countered with some rate-limiting PHP code and logging (see the sketch below).
    - Probably much more work.

    So the question, in a few words: should I build my website around my own API? Is it good to work only on the client side? Is this a sound approach for a big website? (Facebook is a different story, but could it run on an architecture like this?)
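
    The rate-limiting idea in the cons list is sketched below. The asker mentions PHP; this illustration uses Python instead, and the window length, request cap, and function names are all assumptions made for the example:

        import time
        from collections import defaultdict

        # Hypothetical fixed-window rate limiter; the numbers are assumptions.
        WINDOW_SECONDS = 1
        MAX_REQUESTS = 20  # allowed API calls per client per window

        _counters = defaultdict(lambda: [0.0, 0])  # client_id -> [window_start, count]

        def allow_request(client_id: str) -> bool:
            """Return True if this client may make another API call right now."""
            now = time.time()
            window_start, count = _counters[client_id]
            if now - window_start >= WINDOW_SECONDS:
                _counters[client_id] = [now, 1]  # start a fresh window
                return True
            if count < MAX_REQUESTS:
                _counters[client_id][1] = count + 1
                return True
            return False  # over the limit; the API would answer HTTP 429

    In production the counters would live in shared storage such as Redis rather than process memory, but the counting logic stays the same.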

    Read the article

  • PRISM: "United Stasi of America"; for a former senior official, whistleblowers are the true patriots, and their numbers will multiply

    The PRISM project authorizes US federal agencies to search through our data stored online, a former intelligence employee reveals. Based on leaks from a former American intelligence employee, the American newspaper The Washington Post revealed that the US National Security Agency (NSA) and the FBI had access to the databases of nine internet heavyweights, among them Facebook, Google and, more recently, even Apple. The project, code-named PRISM and in place since 2007, allows the two agencies to search those companies' customer data without any prior court order.

    Read the article

  • YouTube Can Be Used to Improve Search Engine Ranking

    We all know that YouTube is one of the most popular websites. But did you know that, like Facebook and Twitter, YouTube can also help increase your site's traffic and rankings? Listed below are some helpful tips for getting site traffic for free using YouTube, so take the chance to put these no-charge services to use.

    Read the article

  • SecureAsia@Tokyo 2012

    - by user762552
    [The original Japanese text of this post was garbled in extraction. Recoverable details: a report on SecureAsia@Tokyo 2012 (12-13 July) in three numbered items: 1. a session mentioning Oracle Database Firewall (Day 2, 18 July, 11:15-12:00); 2. an item mentioning Howard A. Schmidt; 3. the ISC2 Information Security Leadership Achievement (ISLA) awards.]

    Read the article

  • NetBeans 7.2

    - by user13137856
    [The original Japanese text of this post was garbled in extraction. Recoverable details: NetBeans 7.2 and its UI; development builds downloadable from http://bits.netbeans.org/download/trunk/nightly/latest/; and pointers to NetBeans accounts on twitter and facebook.]

    Read the article

  • Dropped impressions 25 days after restructure

    - by Hamid
    Our website is a non-English property website (moshaver.com), similar to rightmove.co.uk. In September 2012 it was badly hit by Panda, and our incoming clicks from Google dropped from around 3,000 to less than a thousand. We hoped Google would eventually realize that we are not a spam website and things would improve, but by August 2013 we were almost sure we needed to act, so we started restructuring our content: we used the canonical tag to point our search-result pages to our listing pages, added the noindex tag to listing pages that currently have no properties, and changed our title tags to friendlier ones, among other changes. The changes went live on 10th August.

    As the graph from the Search Engine Optimization section of Google Analytics shows, these changes increased the number of times Google displayed our results: our impressions almost doubled starting 15th August. However, from the same date our CTR dropped from around 15% to 8%. This might be because of the changed title tags (making people less likely to click on them), or it might simply be normal for increased impressions.

    This continued until 10th September, when our impressions decreased dramatically to less than a thousand: roughly 30% of our original impressions (before the restructure) and 15% of the new impressions. At the same time, our CTR increased dramatically to around 50%. I have two theories for this increase. The first is that these statistics are less accurate at low impression counts. The second is that Google now displays our results only for queries directly related to our website (our name, our URL) and no longer for general terms such as "apartments in a specific city"; this theory would also explain the dramatic decrease in impressions.

    After digging into the analytics data a little more, I constructed a table (posting it here might help you understand the situation better) breaking down our impressions, clicks and CTR across Google products (web and image) and in total. What I take from it is that most of our increased impressions after the restructure were in image search, and I don't think image-search users are looking for the kind of content on our website. It also shows that the drop in our web-search CTR is not as dramatic as the drop in the overall CTR (-30% compared to -60%).

    Is it possible that Google tested our new structure for 25 days and then decided to decrease our impressions because of the new low CTR, or should we look for another factor? If this is the case, how long does it usually take for Google to give us another chance? It has been one month since our impressions dropped.

    Read the article

  • Sneak Peek: Social Developer Program at JavaOne

    - by Mike Stiles
    By guest blogger Roland Smart

    We're just days away from what is gunning to be the most exciting installment of OpenWorld to date, so how about an exciting sneak peek at the very first Social Developer Program? If your first thought is, "What's a social developer?", you're not alone. It's an emerging term, and one we think will gain prominence as social experiences become more prevalent in enterprise applications.

    For those who keep an eye on the ever-evolving Facebook platform, you'll recall that they recently rebranded their PDC (preferred developer consultant) group as the PMD (preferred marketing developer), signaling the importance of development resources inside the marketing organization for unlocking the potential of social. The marketing developer they're referring to could be considered a social developer in a broader context. While it's true that social has really blossomed in the marketing context and CMOs are winning more and more technical resources, social is starting to work its way more deeply into the enterprise with the help of developers who work outside marketing. Developers, like the rest of us, have fallen in "like" with social functionality and are starting to imagine how social can transform enterprise applications the way it has transformed consumer-facing experiences.

    The thesis of my presentation is that social developers will take many pages from the marketing playbook as they apply social inside the enterprise. To support this argument, let's walk through a range of enterprise applications and explore how consumer-facing social experiences might be interpreted in this context.

    Here's one example of how a social experience could be integrated into a sales enablement application. As a marketer, I spend a great deal of time collaborating with my sales colleagues, so I have good insight into their working process. While at Involver, we grew our sales team quickly, and it became evident that some of our processes broke with scale. For example, we used to have weekly team meetings at which we'd discuss what was working and what wasn't from a messaging perspective. One aspect of these sessions focused on "objections" and "responses": the salespeople would walk through common objections to purchasing and share appropriate responses. We tried to map each context to the best answers, and we'd capture these on a wiki page. As our team grew, however, participation at scale just wasn't tenable, and our wiki pages quickly lost their freshness.

    Imagine giving salespeople a place where they could submit common objections and responses for their colleagues to see, sort, comment on, and vote on. What you'd get is an up-to-date and relevant repository of information. And if you supported an application like this with a social graph, it would be possible to make good recommendations to individual salespeople about the objections they'd likely hear, based on vertical, product, region or other graph data (see the sketch below). Taking it even further, you could build in a badging/game element to reward the salespeople who participate the most.

    Both these examples are based on proven models at work inside consumer-facing applications. If you want to learn how HR, Operations, Product Development and Customer Support can leverage social experiences, you're welcome to join us at JavaOne or join our Social Developer Community to find some of the presentations after OpenWorld.
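
    As a toy illustration of the recommendation idea above, the sketch below ranks stored objections by how many graph attributes (vertical, product, region) they share with a given salesperson, breaking ties by vote count. It is written in Python, and every record, field, and number in it is hypothetical:

        # Hypothetical objection records; fields mirror the graph data mentioned above.
        objections = [
            {"text": "Too expensive for our team size",
             "vertical": "retail", "product": "suite", "region": "EMEA", "votes": 12},
            {"text": "We already use a competitor",
             "vertical": "finance", "product": "suite", "region": "NA", "votes": 30},
            {"text": "Concerns about data residency",
             "vertical": "finance", "product": "api", "region": "EMEA", "votes": 7},
        ]

        def recommend(salesperson, top_n=2):
            """Rank objections by attribute overlap with the salesperson, then by votes."""
            def score(obj):
                overlap = sum(obj[k] == salesperson[k]
                              for k in ("vertical", "product", "region"))
                return (overlap, obj["votes"])
            return sorted(objections, key=score, reverse=True)[:top_n]

        rep = {"vertical": "finance", "product": "suite", "region": "NA"}
        for obj in recommend(rep):
            print(obj["text"])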

    Read the article

  • Profiling Startup Of VS2012 – JustTrace Profiler

    - by Alois Kraus
    JustTrace is made by Telerik, a company mainly known for its collection of UI controls. The current version (2012.3.1127.0) includes a performance and a memory profiler; it costs 614€ and is currently on sale for 306€, including one year of free upgrades. The uneven € figures come from the 799€ list price and the 50% discount. The UI is already in Metro style and simple to use. Multi-process profiling, attach, and method recording filters are not supported. Like Ants, JustTrace appears to be a Just My Code profiler: for code where you do not have the PDBs, or when you want to dig deeper into BCL code, you will not get far.

    After collecting the profile data you get, in the All Methods grid, a plain list with hit count and own time. The method list for all methods is also suspiciously short, which is a clear sign that you will not get far analyzing foreign code.

    But at least a memory profiler is included. For this I chose "Memory Profiler" as the Profiling Type in the first window, to check the memory consumption of VS. There are some interesting numbers to see, but I really miss YourKit's thread stack window: how am I supposed to get a clue, when a lot of memory is allocated and CPU consumption is high, about which places I should look at?

    The Snapshot summary gives a rough overview, which is fine for a first impression. Next is the Assemblies view, which gives you a list of all loaded assemblies. Not terribly useful.

    The By Type view gives you exactly what it is supposed to. Keep in mind that this list is filtered by the types you checked in the Assemblies list, and the By Type instance list only shows types from assemblies that do not originate from Microsoft; by default mscorlib and System are not checked, which is why my By Type window showed almost nothing the first time. The idea behind this feature is to show only your instances, because you are ultimately responsible for the overall memory consumption. I am not sure I like it, because by default it hides too much: I want to see at least how many strings and arrays are allocated. A simple namespace filter would also do it, in my opinion.

    Now you can examine all string instances and look at who in the object graph keeps a reference to them. That is nice, but YourKit has the big plus that you can also look into the string contents. I am also not sure how cycles are visualized in the graph, and what happens if thousands of objects reference yours.

    That's pretty much it about JustTrace. It can help the average developer pinpoint performance and memory issues by looking only at his own code and instances. Showing him more would not help, because the sheer amount of information would overwhelm him, and you need a pretty good understanding of how the GC and the CLR work. When you have a performance issue on a customer machine, it is sometimes very helpful to be able to bring a profiler onto the machine (no PDBs, …) and get a full snapshot of all processes involved in the problematic use case. For these more advanced use cases JustTrace is certainly the wrong tool.

    Next: SpeedTrace

    Read the article

< Previous Page | 138 139 140 141 142 143 144 145 146 147 148 149  | Next Page >