Search Results

Search found 234 results on 10 pages for 'seattle jorg'.

Page 3/10 | < Previous Page | 1 2 3 4 5 6 7 8 9 10  | Next Page >

  • Finalists for the Microsoft Accelerator for Windows Azure

    - by ScottGu
    Today, I am pleased to announce the ten finalists for the Microsoft Accelerator for Windows Azure powered by TechStars. These startups are about to launch into a three-month program where they will develop new products and businesses using Windows Azure. The response to the program has been fantastic - we received nearly 600 applications from entrepreneurs in 69 countries around the world, spanning a host of industries including retail, travel, entertainment, banking, real estate and more. There were so many innovative ideas and amazing teams that it really made the selection process hard. We finally landed on 10 finalists, based on their experience, qualifications, and innovative business ideas built on the cloud.

    This fall's Windows Azure class includes:

      - Advertory – Berlin, Germany. Advertory helps local businesses increase revenue and build customer loyalty.
      - Appetas – Seattle, WA. Appetas' mission is to make restaurants look as beautiful online as they do on the plate!
      - BagsUp – Sydney, Australia. Find great places from people you trust.
      - Embarke – San Diego, CA. Embarke allows developers and companies the ability to integrate with any human communication channel (Facebook, Email, Text Message, Twitter) without having to learn the specifics, write code, or spend time on any of them.
      - Fanzo – Seattle, WA. Fanzo puts sports fans in the spotlight. Find other fans, show off your fanswagger and get rewarded for your passion.
      - MetricsHub – Bellevue, WA. A service providing cloud monitoring with incident detection and prebuilt workflows for remedying common problems.
      - Mobilligy – Bellevue, WA. Mobilligy revolutionizes how people pay their bills by bringing convenient, secure, and instant bill payment support to mobile devices.
      - Realty Mogul – Los Angeles, CA. Realty Mogul is a crowdfunding platform for real estate where accredited investors pool capital and invest in properties that are acquired, managed and eventually resold by professional private real estate companies and their management teams.
      - Staq – San Francisco, CA. Back-end as a service for APIs.
      - Socedo – Bellevue, WA. A simple and effective web application for lead generation and relationship management on Twitter.

    Each startup will be hosted in Seattle and mentored by entrepreneurs and venture capitalists as well as leaders from Windows Azure and other Microsoft organizations. The teams will spend the first month ideating and refining their business concepts with input and advice from their mentors as well as Microsoft customers, followed by two months of design and development. They will present their results to investors and Microsoft partners at an event in mid-January. We are really looking forward to seeing how their businesses evolve. These teams have demonstrated incredible energy, passion, and innovative capabilities – and they are ready to show the world what's possible with Windows Azure.

    Thanks, Scott

    P.S. And if you are new to Twitter you can also optionally follow me: @scottgu

    Read the article

  • Last chance to enter! Exceptional DBA Awards 2011

    - by Rebecca Amos
    Only 1 day left to enter the Exceptional DBA Awards! Get started on your entry today, and you could be heading to Seattle for the PASS Summit in October. All you need to do is visit the Exceptional DBA website and answer a few questions about:

      - Your career and achievements as a SQL Server DBA
      - Any mistakes you've made along the way (and how you tackled them)
      - Activities you're involved in within the SQL Server community – for example writing, blogging, contributing to forums, speaking at events, or organising user groups
      - Why you think you should be the Exceptional DBA of 2011

    As well as the respect and recognition of your peers – and a great boost to your CV – you could win full conference registration to this year's PASS Summit in Seattle (including accommodation and $300 towards travel expenses) – where the award will be presented, as well as a copy of Red Gate's SQL DBA Bundle, and a chance to be featured here, on Simple-Talk.com. So why not give it a shot? Start your entry now at www.exceptionaldba.com (nominations close on 30 June).

    Read the article

  • SQL SERVER – Weekly Series – Memory Lane – #003

    - by pinaldave
    Here is the list of curated articles from SQLAuthority.com across all these years. Instead of just listing all the articles, I have selected a few of my favorite articles and listed them here with additional notes below each one. Let me know which one of the following is your favorite article from memory lane.

    2006

    This was the first year of my blogging, and I was learning lots of new things as I went. I was indeed an infant in blogging a few years ago. However, as time passed I learned a lot. This year was a year of experiments and new learning.

    2007

    Working as a full-time DBA I often encountered various errors, and I started to learn how to avoid those errors and to document them.

    ERROR Msg 5174 Each file size must be greater than or equal to 512 KB – Whenever I see this error I wonder why someone is trying to create a database which is extremely small. Anyway, it does not matter what I think – I keep on seeing this error often in the industry. The solution to the error is equally simple – just create a larger database.

    Dilbert Humor – This was my very first encounter with database humor and I started to love it. It does not matter how many times we read this cartoon, it does not get old.

    Generate Script with Data from Database – Database Publishing Wizard – Generating a schema script with data is one of the most frequently performed tasks among SQL Server data professionals. There are many ways to do this. In the above article I demonstrated how we can use the Database Publishing Wizard to accomplish it. It was new to me at that time, but I have still not seen much adoption of it in the industry. Here is one of my videos where I demonstrate how we can generate data with schema.

    2008

    Delete Backup History – Cleanup Backup History – Deleting backup history is important too, but it should be done carefully. If this is not carried out at regular intervals there is a good chance that MSDB will be filled up with all the old history. Every organization is different. Some would like to keep the history for 30 days and some for a year, but there should be some limit. One should regularly archive the database backup history.

    South Asia MVP Open Days 2008 – This was my very first year as a Microsoft MVP. I had indeed a big blast at the event and the fun was incredible. After this event I have attended many different MVP events, but the fun and learning this particular event presented were amazing, and just like me many others are not able to forget it. Here are other links related to the event: South Asia MVP Open Day 2008 – Goa, South Asia MVP Open Day 2008 – Goa – Day 1, South Asia MVP Open Day 2008 – Goa – Day 2, South Asia MVP Open Day 2008 – Goa – Day 3.

    2009

    Enable or Disable Constraint – This is a very simple script, but I personally keep on forgetting it, so I blogged it. To this day I keep on referencing it again and again, as sometimes a very little thing is hard to remember.

    Policy Based Management – Create, Evaluate and Fix Policies – This article covers the most spectacular feature of SQL Server 2008 – policy-based management – and how configuring SQL Server with the policy-based management architecture can make a powerful difference. Policy-based management is loaded with several advantages. It can help you implement various policies for reliable configuration of the system. It also provides additional administrative assistance to DBAs and helps them effortlessly manage various tasks of SQL Server across the enterprise.

    SQLPASS 2009 – My Very First SQLPASS Experience – Just brilliant! I had never experienced such a thing in my life. SQL, SQL and SQL – SQL all around! I am listing my own reasons here in order of importance to me: networking with SQL fellows and experts, putting a face to the name or avatar, learning and improving my SQL skills, understanding the structure of the largest SQL Server professional association, and attending my favorite training sessions. Since then I have never missed this event a single time. This event is my favorite event and something that keeps me going. Here are additional posts related to SQLPASS 2009: SQL PASS Summit, Seattle 2009 – Day 1, SQL PASS Summit, Seattle 2009 – Day 2, SQL PASS Summit, Seattle 2009 – Day 3, SQL PASS Summit, Seattle 2009 – Day 4.

    2010

    Get All the Information of Database using sys.databases – Even though we believe that we know everything about our database, there is a lot about it that we do not know. This little script lets us discover many details about our databases which we may not be familiar with. Run it on your server today and see how well you know your database.

    Reducing CXPACKET Wait Stats for High Transactional Database – While engaged in a performance tuning consultation for a client, a situation occurred where they were facing a lot of CXPACKET wait stats. The client asked me if I could help them reduce this huge number of wait stats. I usually receive this kind of request from other clients as well, but the important thing to understand is whether this question has any merit or benefit, or not. I discuss this in the article – a bit long, but insightful for sure.

    Error related to Database in Use – There are many database management operations in SQL Server which require exclusive access to the database, and it is not always possible to get it. When a database is online in SQL Server, applications or system threads often access it. This means the database cannot grant exclusive access, and the operations which require this access throw an error. There is a very easy method to overcome this minor issue – a single line of script can give you exclusive access to the database.

    Difference between DATETIME and DATETIME2 – Developers have found the root cause of the problem when dealing with date functions: when datetime values are converted (implicitly or explicitly) between different data types, some precision is lost, so the results do not match each other as expected. In this blog post I go over very interesting details and the difference between DATETIME and DATETIME2.

    History of SQL Server Database Encryption – I recently met Michael Coles and Rodney Landrum, the authors of the one-of-a-kind book Expert SQL Server 2008 Encryption, at SQLPASS in Seattle. During the conversation we ended up discussing how Microsoft is evolving encryption technology. The same discussion led to talking about the history of encryption tools in SQL Server. Michael pointed me to page 18 of his book on encryption. He explicitly gave me permission to reproduce the relevant part of the history from his book.

    2011

    Functions FIRST_VALUE and LAST_VALUE with OVER clause and ORDER BY – Sometimes an interesting feature and a smart audience make a total difference. For the last two days I had been writing about the SQL Server 2012 features FIRST_VALUE and LAST_VALUE. I created a puzzle which was very interesting and got many people to attempt to resolve it. It was based on the following two articles: Introduction to FIRST_VALUE and LAST_VALUE, and Introduction to FIRST_VALUE and LAST_VALUE with OVER clause. I even provided a hint about how one can solve the problem. The best part was that many people solved it without using the hint! Try your luck!

    A Real Story of Book Getting 'Out of Stock' to A 25% Discount Story Available – This is a great problem and everybody would love to have it. We had it and we loved it. Our book went out of stock within 48 hours of release and the stocks were empty. We faced many issues and learned many valuable lessons. Some we were able to avoid in the future, and some we are still facing, as those problems have no solution. However, since that day our books have never gone out of stock. This was an inspiring learning experience for us, and I am confident that you will love to read about it as well.

    Introduction to LEAD and LAG – Analytic Functions Introduced in SQL Server 2012 – SQL Server 2012 introduces the new analytic functions LEAD() and LAG(). These functions access data from a subsequent row (for LEAD) and a previous row (for LAG) in the same result set without the use of a self-join. It is very difficult to explain this in words, so I will use a small example to explain these functions. I had a fantastic time writing this blog post, and I am very confident that when you read it, you will like it.

    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Memory Lane, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Trying to Organise a Software Craftsman Pilgrimage

    - by Liam McLennan
    As I have previously written, I am trying to organise a software craftsman pilgrimage. The idea is to donate some time working with quality developers so that we learn from each other. To be honest I am also trying to be the worst. “Always be the worst guy in every band you’re in.” Pat Metheny I ended up posting a message to both the software craftsmanship group and the Seattle Alt.NET group and I got a good response from both. I have had discussions with people based in: Seattle, New York, Long Island, Austin and Chicago. Over the next week I have to juggle my schedule and confirm the company(s) I will be spending time with, but the good news is it seems that I will not be left hanging.

    Read the article

  • PASS Summit Survey

    - by andyleonard
    One item mentioned in the PASS Board Q & A was the PASS Summit survey, which should be in your Inbox today if you attended the PASS Summit 2013. Charlotte? Did you enjoy having the Summit in Charlotte? If so, the PASS Summit Survey is your primary and most effective means of communicating this fact to the PASS Board and PASS HQ. The same holds if you didn’t like the Summit in Charlotte. Would you rather have the Summit remain forever in Seattle? Would you like to see the Summit in Seattle two...(read more)

    Read the article

  • Code Contracts with Interfaces: "Method Invocation skipped. Compiler will generate method invocation

    - by Jörg Battermann
    Good evening, I just started playing with Microsoft.Contracts (latest version) and plugging it on top of a sample interface and right now it looks like this: namespace iRMA2.Core.Interfaces { using System; using System.Collections.Generic; using System.ComponentModel.Composition; using System.Diagnostics.Contracts; /// <summary> /// Base Interface declarations for iRMA2 Extensions /// </summary> [InheritedExport] [ContractClass(typeof(IiRMA2ExtensionContract))] public interface IiRMA2Extension { /// <summary> /// Gets the name. /// </summary> /// <value>The name of the Extension.</value> string Name { get; } /// <summary> /// Gets the description. /// </summary> /// <value>The description.</value> string Description { get; } /// <summary> /// Gets the author of the extension. Please provide complete information to get in touch with author(s) and the corresponding department /// </summary> /// <value>The author of the extensions.</value> string Author { get; } /// <summary> /// Gets the major version. /// </summary> /// <value>The major version of the extension.</value> int MajorVersion { get; } /// <summary> /// Gets the minor version. /// </summary> /// <value>The minor version.</value> int MinorVersion { get; } /// <summary> /// Gets the build number. /// </summary> /// <value>The build number.</value> int BuildNumber { get; } /// <summary> /// Gets the revision. /// </summary> /// <value>The revision.</value> int Revision { get; } /// <summary> /// Gets the depends on. /// </summary> /// <value>The dependencies to other <c>IiRMA2Extension</c> this one has.</value> IList<IiRMA2Extension> DependsOn { get; } } /// <summary> /// Contract class for <c>IiRMA2Extension</c> /// </summary> [ContractClassFor(typeof(IiRMA2Extension))] internal sealed class IiRMA2ExtensionContract : IiRMA2Extension { #region Implementation of IiRMA2Extension /// <summary> /// Gets or sets the name. /// </summary> /// <value>The name of the Extension.</value> public string Name { get { Contract.Ensures(!String.IsNullOrEmpty(Contract.Result<string>())); return default(string); } set { Contract.Requires(value != null); } } /// <summary> /// Gets the description. /// </summary> /// <value>The description.</value> public string Description { get { throw new NotImplementedException(); } } /// <summary> /// Gets the author of the extension. Please provide complete information to get in touch with author(s) and the corresponding department /// </summary> /// <value>The author of the extensions.</value> public string Author { get { throw new NotImplementedException(); } } /// <summary> /// Gets the major version. /// </summary> /// <value>The major version of the extension.</value> public int MajorVersion { get { throw new NotImplementedException(); } } /// <summary> /// Gets the minor version. /// </summary> /// <value>The minor version.</value> public int MinorVersion { get { throw new NotImplementedException(); } } /// <summary> /// Gets the build number. /// </summary> /// <value>The build number.</value> public int BuildNumber { get { throw new NotImplementedException(); } } /// <summary> /// Gets the revision. /// </summary> /// <value>The revision.</value> public int Revision { get { throw new NotImplementedException(); } } /// <summary> /// Gets the Extensions this one depends on. 
    /// </summary> /// <value>The dependencies to other <c>IiRMA2Extension</c> this one has.</value> public IList<IiRMA2Extension> DependsOn { get { Contract.Ensures(Contract.Result<IList<IiRMA2Extension>>() != null); return default(IList<IiRMA2Extension>); } } #endregion } } Now why are the two Contract.Ensures(...) 'blurred' out visually with the tooltip saying "Method Invocation skipped. Compiler will generate method invocation because the method is conditional or it is partial method without implementation" and in fact the CodeContracts output does not count/show them... What am I missing & doing wrong here? -J
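
    For context: Contract.Requires and Contract.Ensures are annotated with [Conditional("CONTRACTS_FULL")], so the IDE greys the calls out (and the compiler omits them) until runtime contract checking is enabled in the Code Contracts project settings, which defines that symbol and lets the rewriter emit the checks. Below is a minimal sketch of the same interface/contract-class pattern; the IGreeter and GreeterContract names are invented purely for illustration.

      using System;
      using System.Diagnostics.Contracts;

      [ContractClass(typeof(GreeterContract))]
      public interface IGreeter
      {
          // Callers may rely on a non-empty result.
          string Greet(string name);
      }

      [ContractClassFor(typeof(IGreeter))]
      internal abstract class GreeterContract : IGreeter
      {
          public string Greet(string name)
          {
              Contract.Requires(name != null);
              Contract.Ensures(!string.IsNullOrEmpty(Contract.Result<string>()));
              return default(string); // dummy value; the contract class is never executed directly
          }
      }

    With "Perform Runtime Contract Checking" switched on for the project, the Ensures/Requires calls are compiled in and the greyed-out tooltip disappears; with it off, the behaviour described in the question is expected.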

    Read the article

  • Alternative to System.Web.HttpUtility.HtmlEncode/Decode?

    - by Jörg Battermann
    Is there any 'slimmer' alternative to the System.Web.HttpUtility.HtmlEncode/.Decode functions in .NET 3.5 (SP1)? A separate library is fine... or even 'wanted', at least something that does not pull in a 'whole new world' of dependencies the way System.Web does. I only want to convert a normal string into its XML/XHTML-compliant equivalent (and back).
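
    A commonly suggested lightweight option here – sketched below, not part of the original question – is System.Security.SecurityElement.Escape from mscorlib, which escapes the five XML metacharacters without referencing System.Web; there is no equally small built-in decoder, so the sketch reverses the same five entities by hand.

      using System;
      using System.Security;

      static class XmlText
      {
          // Escapes <, >, &, " and ' using mscorlib only.
          public static string Encode(string s)
          {
              return SecurityElement.Escape(s);
          }

          // Minimal hand-rolled reverse mapping for the same five entities.
          public static string Decode(string s)
          {
              return s.Replace("&lt;", "<")
                      .Replace("&gt;", ">")
                      .Replace("&quot;", "\"")
                      .Replace("&apos;", "'")
                      .Replace("&amp;", "&"); // ampersand last so it is not double-decoded
          }
      }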

    Read the article

  • Proper 'cleartool mkview' for ClearCase Snapshot view creation

    - by Jörg Battermann
    Good afternoon, it seems like I am somewhat stuck in CC-land these days, but I have one (hopefully) final question regarding proper CC handling: when using the CC View Creation Wizard with the two steps / details below, I can create a proper snapshot view on my machine perfectly fine, however when trying to do the same with the mkview command, it fails... Here are the screenshots of the view creation wizard: That results in the following (working) view:

      cleartool> lsview battjo6r_view2
      battjo6r_view2       \\Eh40yd4c\Views\battjo6r_view2.vws
      cleartool> lsview -long battjo6r_view2
      Tag: battjo6r_view2
      Global path: \\Eh40yd4c\Views\battjo6r_view2.vws
      Server host: Eh40yd4c
      Region: CT_WORK
      Active: NO
      View tag uuid:f34cf43f.b4d048df.845d.ed:21:a2:9c:45:ff
      View on host: Eh40yd4c
      View server access path: D:\Views\battjo6r_view2.vws
      View uuid: f34cf43f.b4d048df.845d.ed:21:a2:9c:45:ff
      View attributes: snapshot
      View owner: WW005\battjo6r

    However, when trying to create the view manually via the mkview call shown below, I get the following error:

      cleartool> mkview -snapshot -tag battjo6r_view2 -vws \\Eh40yd4c\Views\battjo6r_view2.vws -host Eh40yd4c -hpath D:\Views\battjo6r_view2.vws -gpath \\Eh40yd4c\Views\battjo6r_view2.vws battjo6r_view2
      Created view.
      Host-local path: Eh40yd4c:D:\Views\battjo6r_view2.vws
      Global path:     \\Eh40yd4c\Views\battjo6r_view2.vws
      cleartool: Error: Unable to find view by uuid:6f99f7ae.6a5d40e4.ba32.37:8e:e5:a4:ed:18, last known at "<viewhost>:<stg_path>".
      cleartool: Error: Unable to establish connection to snapshot view "6f99f7ae.6a5d40e4.ba32.37:8e:e5:a4:ed:18": ClearCase object not found
      cleartool: Warning: Unable to open snapshot view "D:\SnapShotViews\battjo6r_view2".
      cleartool: Error: Unable to create snapshot view "battjo6r_view2".
      Removing the view ...

    Any idea why this is happening? Am I missing something?

    Read the article

  • .Net Library for parsing source code files?

    - by Jörg Battermann
    Does anyone know of a good .NET library that allows me to parse source code files – not only .NET source code files, but also Java, Perl, Ruby, etc.? I need programmatic access to the contents of various source code files (e.g. class/method/parameter names, types, etc.). Has anyone come across something like this? I know within .NET it is reasonably possible and there are some libraries out there, but I need that to be abstracted to more types of programming languages.

    Read the article

  • Backend raising (INotify)PropertyChanged events to all connected clients?

    - by Jörg Battermann
    One of our 'frontend' developers keeps requesting from us backend developers that the backend notifies all connected clients (it's a client/server environment) of changes to objects. As in: whenever one user makes a change, all other connected clients must be notified immediately of the change. At the moment our architecture does not have a notification system of that kind, and we don't have a pub/sub model for explicitly chosen objects (e.g. the one the frontend is currently implementing), which would make sense in such a use case imho, but obviously requires extra implementation. However, I thought frontends typically check for locks to handle concurrent user changes on the same object, and rather pull changes / load on demand and in the background, rather than having the backend push all changes to all clients for all objects constantly, which seems rather excessive to me. However, it's being argued that e.g. the MS Entity Framework does in fact publish (INotify)PropertyChanged not only for local changes, but for all such changes including other client connections, but I have found no proof or details regarding this. Can anyone shed some light on this? Do other ORMs etc. provide broadcasted (INotify)PropertyChanged events on entities?
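
    For reference, the pub/sub model mentioned above usually boils down to the backend keeping a list of subscriber callbacks and broadcasting a small change message after each committed write; Entity Framework's change tracking (and any INotifyPropertyChanged it raises on tracked entities) is local to the context on one client, so cross-client notification needs an explicit mechanism of this kind. The sketch below is purely illustrative: all names are invented, and transport, threading details and failure handling are ignored.

      using System;
      using System.Collections.Generic;

      // Hypothetical server-side broadcaster: clients register a callback and
      // every committed change is pushed to all of them.
      public sealed class ChangeBroadcaster
      {
          private readonly List<Action<ChangeNotification>> subscribers = new List<Action<ChangeNotification>>();
          private readonly object gate = new object();

          public void Subscribe(Action<ChangeNotification> callback)
          {
              lock (gate) { subscribers.Add(callback); }
          }

          // Called by the backend after a change has been persisted.
          public void Publish(string entityType, object entityId, string propertyName)
          {
              Action<ChangeNotification>[] snapshot;
              lock (gate) { snapshot = subscribers.ToArray(); }

              var notification = new ChangeNotification { EntityType = entityType, EntityId = entityId, PropertyName = propertyName };
              foreach (var subscriber in snapshot)
                  subscriber(notification); // in a real system this call would go over the wire to each client
          }
      }

      public sealed class ChangeNotification
      {
          public string EntityType { get; set; }
          public object EntityId { get; set; }
          public string PropertyName { get; set; }
      }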

    Read the article

  • WPF Graph Layout Component

    - by Jörg B.
    Does anyone know of – or even better, can wholeheartedly recommend – a WPF graph layout component (Microsoft Research had GLEE a while back, but it hasn't seen an update since version 1.0 in 2007 and isn't WPF, etc.) as seen e.g. in the screenshot below? I've seen yFiles WPF and Lasalle's AddFlow for WPF, but are there any alternatives? (c) Screenshot: RedGate Ants Memory Profiler

    Read the article

  • Xunit: Perform all 'Assert'ions in one test method?

    - by Jörg Battermann
    Is it possible to tell xUnit.net to perform all e.g. Assert.True() checks in one test method? Basically, in some of our use/test cases all assertions belong logically to one and the same 'scope' of tests, and I have e.g. something like this:

      [Fact(DisplayName = "Tr-MissImpl")]
      public void MissingImplementationTest()
      {
          // parse export.xml file
          var exportXml = Libraries.Utilities.XML.GenericClassDeserializer
              .DeserializeXmlFile<Libraries.MedTrace.ExportXml>(ExportXmlFile);

          // compare parsed results with expected ones
          Assert.True(exportXml.ContainsRequirementKeyWithError("PERS_154163", "E0032A"));
          Assert.True(exportXml.ContainsRequirementKeyWithError("PERS_155763", "E0032A"));
          Assert.True(exportXml.ContainsRequirementKeyWithError("PERS_155931", "E0032A"));
          Assert.True(exportXml.ContainsRequirementKeyWithError("PERS_157145", "E0032A"));
          Assert.True(exportXml.ContainsRequirementKeyWithError("s_sw_ers_req_A", "E0032A"));
          Assert.True(exportXml.ContainsRequirementKeyWithError("s_sw_ers_req_C", "E0032A"));
          Assert.True(exportXml.ContainsRequirementKeyWithError("s_sw_ers_req_D", "E0032A"));
      }

    Now if e.g. the first Assert.True(...) fails, the other ones are not executed/checked. I'd rather not break these seven assertions up into separate methods, since they really do belong together logically (the TC only passes entirely if all seven pass together).
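
    One common way to have every check evaluated independently – a sketch, not part of the original question – is to turn the repeated assertions into a data-driven test: each [InlineData] row becomes its own test case, so one failing key no longer hides the results of the others. In xUnit.net 1.x, [Theory]/[InlineData] come from the xunit.extensions package; in 2.x they are built in. The Libraries.* types and the ExportXmlFile path are taken from the snippet above.

      using Xunit;
      using Xunit.Extensions; // xUnit.net 1.x only; built into the core package in 2.x

      public class MissingImplementationTests
      {
          private const string ExportXmlFile = "export.xml"; // as in the original test

          [Theory]
          [InlineData("PERS_154163")]
          [InlineData("PERS_155763")]
          [InlineData("PERS_155931")]
          [InlineData("PERS_157145")]
          [InlineData("s_sw_ers_req_A")]
          [InlineData("s_sw_ers_req_C")]
          [InlineData("s_sw_ers_req_D")]
          public void Export_contains_requirement_key_with_error(string requirementKey)
          {
              // parsed once per case here; in practice this could be shared via a class fixture
              var exportXml = Libraries.Utilities.XML.GenericClassDeserializer
                  .DeserializeXmlFile<Libraries.MedTrace.ExportXml>(ExportXmlFile);

              Assert.True(exportXml.ContainsRequirementKeyWithError(requirementKey, "E0032A"));
          }
      }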

    Read the article

  • R Question: calculating deltas in a timeseries

    - by Jörg Beyer
    I have a timeseries of samples in R:

      > str(d)
      'data.frame':   5 obs. of  3 variables:
       $ date: POSIXct, format: "2010-03-04 20:47:00" "2010-03-04 21:47:00" ...
       $ x   : num  0 10 11 15.2 20
       $ y   : num  0 5 7.5 8.4 12.5
      > d
                       date    x    y
      1 2010-03-04 20:47:00  0.0  0.0
      2 2010-03-04 21:47:00 10.0  5.0
      3 2010-03-04 22:47:00 11.0  7.5
      4 2010-03-04 23:47:00 15.2  8.4
      5 2010-03-05 00:47:00 20.0 12.5

    In this example samples for x and y are taken every hour (but the time delta is not fixed). The x and y values are always growing (like a mileage counter in a car). I need the deltas – how much the growth was in between – something like this:

      1 2010-03-04 20:47:00  0.0  0.0
      2 2010-03-04 21:47:00 10.0  5.0
      3 2010-03-04 22:47:00  1.0  2.5
      4 2010-03-04 23:47:00  4.2  0.9
      5 2010-03-05 00:47:00  4.8  4.1

    And I also need the deltas per time (the x and y delta divided by the time delta, i.e. delta per hour). How would I do this in R?

    Read the article

  • DeltaXML Diff like library for .Net?

    - by Jörg Battermann
    We are currently using DeltaXML in our .NET application to analyse two versions of .xml files regarding their differences, but since DeltaXML is a Java application/library, we're looking for a more homogeneous way to accomplish that task. Does anyone know a .NET diff library similar to DeltaXML?

    Read the article

  • Referencing a x86 assembly in a 64bit one

    - by Jörg Battermann
    In one project we have a dependency on a legacy system and its .NET assembly, which is delivered as 'x86' only; they do not provide an 'Any CPU' and/or 64-bit build. The project itself juggles lots of data, and we hit the limitations that come with forcing x86 on the whole project because of that one assembly (if we used 64-bit/Any CPU it would raise a BadImageFormatException as soon as that x86 assembly is loaded in a 64-bit process). Is there a workaround to use an x86 assembly from a 64-bit .NET host app? That tiny dependency on the x86 assembly keeps 99% of the project heavily limited.
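
    One workaround often used for this – sketched here with invented names, requiring a reference to System.Runtime.Remoting – is to keep the main application 64-bit and host the x86-only assembly in a small helper process compiled as x86, exposing it over IPC; .NET Remoting with an IpcChannel is one low-ceremony option on .NET 2.0+. The outline shows the x86 host side, with the 64-bit client call in the trailing comment.

      // 32-bit helper executable (compiled with /platform:x86) that hosts the legacy assembly.
      using System;
      using System.Runtime.Remoting;
      using System.Runtime.Remoting.Channels;
      using System.Runtime.Remoting.Channels.Ipc;

      // Shared contract, referenced by both the x86 host and the 64-bit client.
      public interface ILegacyGateway
      {
          string Crunch(string input);
      }

      // Lives only in the x86 host and calls into the legacy x86 assembly.
      public class LegacyGateway : MarshalByRefObject, ILegacyGateway
      {
          public string Crunch(string input)
          {
              // e.g. return new LegacyVendor.Engine().Process(input);  // hypothetical legacy call
              return input;
          }
      }

      internal static class Program
      {
          private static void Main()
          {
              ChannelServices.RegisterChannel(new IpcChannel("legacyHost"), false);
              RemotingConfiguration.RegisterWellKnownServiceType(
                  typeof(LegacyGateway), "gateway", WellKnownObjectMode.Singleton);
              Console.WriteLine("x86 legacy host running; press Enter to exit.");
              Console.ReadLine();
          }
      }

      // 64-bit client side (for illustration):
      //   var gateway = (ILegacyGateway)Activator.GetObject(
      //       typeof(ILegacyGateway), "ipc://legacyHost/gateway");
      //   string result = gateway.Crunch("data");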

    Read the article

  • Match Regex across newlines?

    - by Jörg Battermann
    I have a regex ( "(<lof<).*?(>>)" ) that works and matches perfectly on single-line input. However, if the input contains newlines between the two () parts it does not match at all. What's the best way to ignore any newlines at all in that case?
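
    The usual fix in .NET – shown in the short sketch below, with a made-up input string – is RegexOptions.Singleline, which makes '.' match newline characters as well.

      using System;
      using System.Text.RegularExpressions;

      class SinglelineDemo
      {
          static void Main()
          {
              string input = "prefix <lof< first line\nsecond line >> suffix";

              // Without Singleline '.' stops at '\n'; with it the match spans the newline.
              Match match = Regex.Match(input, "(<lof<).*?(>>)", RegexOptions.Singleline);

              Console.WriteLine(match.Success);  // True
              Console.WriteLine(match.Value);    // <lof< first line ... second line >>
          }
      }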

    Read the article

  • Application leaking Strings?

    - by Jörg B.
    My .NET application does some heavy string loading/manipulation, and unfortunately the memory consumption keeps rising and rising; when looking at it with a profiler I see a lot of unreleased string instances. Now at one point of time or another I do need all objects that have these string fields, but once done, I could get rid of e.g. half of them, and I Dispose() and set the instances to null, but the Garbage Collector does not pick them up... they remain in memory (even half an hour after disposing etc.). How do I properly get rid of unneeded strings/object instances in order to release them? They are not referenced anywhere anymore (afaik), but e.g. aspose's memory profiler says their distance to the GC root is '3'?
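
    Worth noting for the question above: Dispose() does not free managed memory, and nulling a reference only helps if nothing else still points at the object – a reported root distance of 3 means some chain of live references still reaches those strings. A small diagnostic-only sketch like the one below (a hypothetical helper, not from the original post) forces a full collection before and after dropping the references; if the heap size does not drop, something still holds on to them.

      using System;

      static class HeapCheck
      {
          // Diagnostic helper: forces a full, blocking collection and reports the managed heap size.
          public static void DumpManagedHeap(string label)
          {
              GC.Collect();
              GC.WaitForPendingFinalizers();
              GC.Collect();
              Console.WriteLine("{0}: {1:N0} bytes on the managed heap", label, GC.GetTotalMemory(true));
          }
      }

      // Usage: call before and after dropping the references.
      //   HeapCheck.DumpManagedHeap("before release");
      //   largeObjects = null;
      //   HeapCheck.DumpManagedHeap("after release");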

    Read the article

  • .Net OutOfMemory on Server but not Desktop

    - by Jörg Battermann
    Is it possible that the .NET Framework behaves differently when it comes to garbage collection / memory limitations in server environments? I am running explicitly x86-compiled apps on a 64-bit server machine with 32 GB of physical RAM and I am running out of memory (System.OutOfMemoryException), even though nothing but that particular app is running and the server/all other apps utilize 520 MB total... but I cannot reproduce that behaviour on my own (client Win7) machine. Now I know that the app -is- memory intensive, but why is it causing problems on the server and not on the client?
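
    Background that may explain the difference: an x86 process is capped by its virtual address space – roughly 2 GB by default, up to 4 GB for a large-address-aware image on a 64-bit OS – no matter how much physical RAM the server has, and fragmentation of that space can trigger OutOfMemoryException well before the cap. A small, hypothetical diagnostic like the one below, logged when the exception is caught on both machines, helps confirm which limit is actually being hit.

      using System;
      using System.Diagnostics;

      static class MemoryReport
      {
          public static void Log()
          {
              Process current = Process.GetCurrentProcess();
              Console.WriteLine("Pointer size:  {0} bytes (4 = 32-bit process)", IntPtr.Size);
              Console.WriteLine("Managed heap:  {0:N0} bytes", GC.GetTotalMemory(false));
              Console.WriteLine("Private bytes: {0:N0}", current.PrivateMemorySize64);
              Console.WriteLine("Virtual bytes: {0:N0}", current.VirtualMemorySize64);
          }
      }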

    Read the article

  • Memory leak while using emoticons on CRichEditCtrl

    - by Jorg B Jorge
    I'm developing a text editor class (for a chat application) based on CRichEditCtrl (MFC) with emoticon support. After I load the emoticon's bitmap, I use the function OleCreateStaticFromData to insert it into the CRichEditCtrl. After that I just delete the bitmap object I allocated myself. I can verify (using the GDIView utility) that all resources I allocate have been properly released. This works perfectly: the bitmap (emoticon) is drawn on the CRichEditCtrl window and is handled just like a character. My problem is that I don't know how to deallocate the internal memory allocated by OleCreateStaticFromData to manage the bitmap (emoticon). The memory allocated for any emoticon used is never released, even if I delete the CRichEditCtrl object. I'd like to know how to fix that issue. Is that an MFC issue, or am I doing something wrong? Thanks.

    Read the article

  • Using visual studio 6 c++ compiler from within emacs

    - by jörg
    Hey guys, I'm just getting started with C++ development and I would like to use Emacs to write the code and then compile and run it from within Emacs using the Visual Studio 6 compiler. I have already googled around a bit but just can't seem to find an explanation of how this is done. Any pointers? Thanks for your help, joerg

    Read the article

  • Echo cancellation

    - by Jorg B Jorge
    Can any of you suggest a good and stable echo cancellation package (GNU or not) to be linked with my videoconference application (C/C++, Windows / Linux / Mac OS X)? My application should be freeware, so I do not want to pay for each user who downloads the app.

    Read the article

  • Relational vs. Dimensional Databases, what's the difference?

    - by grautur
    I'm trying to learn about OLAP and data warehousing, and I'm confused about the difference between relational and dimensional modeling. Is dimensional modeling basically relational modeling, but allowing for redundant/un-normalized data? For example, let's say I have historical sales data on (product, city, # sales). I understand that the following would be a relational point-of-view:

      Product | City          | # Sales
      Apples  | San Francisco | 400
      Apples  | Boston        | 700
      Apples  | Seattle       | 600
      Oranges | San Francisco | 550
      Oranges | Boston        | 500
      Oranges | Seattle       | 600

    While the following is a more dimensional point-of-view:

      Product | San Francisco | Boston | Seattle
      Apples  | 400           | 700    | 600
      Oranges | 550           | 500    | 600

    But it seems like both points of view would nonetheless be implemented in an identical star schema:

      Fact table:        Product ID, Region ID, # Sales
      Product dimension: Product ID, Product Name
      City dimension:    City ID, City Name

    And it's not until you start adding some additional details to each dimension that the differences start popping up. For instance, if you wanted to track regions as well, a relational database would tend to have a separate region table, in order to keep everything normalized:

      City dimension:   City ID, City Name, Region ID
      Region dimension: Region ID, Region Name, Region Manager, # Regional Stores

    While a dimensional database would allow for denormalization to keep the region data inside the city dimension, in order to make it easier to slice the data:

      City dimension: City ID, City Name, Region Name, Region Manager, # Regional Stores

    Is this correct?

    Read the article
