Search Results

Search found 14532 results on 582 pages for 'dynamic types'.


  • CLSF & CLK 2013 Trip Report by Jeff Liu

    - by jamesmorris
This is a contributed post from Jeff Liu, lead XFS developer for the Oracle mainline Linux kernel team. Recently I attended both the China Linux Storage and Filesystem workshop (CLSF) and the China Linux Kernel conference (CLK), which were held in Shanghai. Here are the highlights of both events. CLSF - 17th October XFS update (led by Jeff Liu) XFS continues to make rapid progress with a lot of changes, focused especially on infrastructure/performance improvements as well as new feature development. This is reflected in some sample statistics for XFS, Ext4+JBD2 and Btrfs, gathered with # git diff --stat --minimal -C -M v3.7..v3.12-rc4 -- fs/xfs (and likewise for fs/ext4 fs/jbd2 and fs/btrfs): XFS: 141 files changed, 27598 insertions(+), 19113 deletions(-); Ext4+JBD2: 39 files changed, 10487 insertions(+), 5454 deletions(-); Btrfs: 70 files changed, 19875 insertions(+), 8130 deletions(-). What made up those changes in XFS? Self-describing metadata (CRC32c): this new feature contributed about 70% of the code changes; it can be enabled via `mkfs.xfs -m crc=1 /dev/xxx` for the v5 superblock. Transaction log space reservation improvements: with this change, the log space reservation is calculated at mount time rather than at runtime, reducing CPU overhead. User namespace support: both XFS and USERNS can be enabled in the kernel configuration beginning with Linux 3.10, thanks to Dwight Engen's efforts. Split project/group quota inodes: originally, project quota could not be enabled together with group quota because they shared the same quota file inode; now it works, but only for the v5 superblock, i.e. with CRC enabled. CONFIG_XFS_WARN: a new lightweight runtime debugger which can be deployed in production environments. Readahead of log objects during recovery: this change speeds up log replay significantly. Speculative preallocation inode tracking, clearing and throttling: the main purpose is to deal with inodes carrying post-EOF space due to speculative preallocation, and to support improved quota management that frees up a significant amount of unwritten space when at or near EDQUOT. It supports background scanning, which occurs on a longish interval (5 minutes by default, tunable), and on-demand scanning/trimming via ioctl(2). Heated arguments ensued from this session, especially around the comparison with Ext4 and Btrfs in different areas; I had to spend the whole morning of the first day answering those questions. We basically agreed that XFS is the best choice on Linux nowadays because: Stability: XFS has a good stability record over the past 10 years, and Fengguang Wu, who leads the 0-day kernel test project, also said that he has observed fewer errors in it than in other filesystems over the past year or more; I owe this to the XFS upstream code reviewers, who always perform serious code review as well as testing. Good performance for large and small files: the claim that XFS does not work well for small files has been an old story for years. Best choice (maybe) for distributed PB-scale filesystems: e.g. Ceph recommends deploying the OSD daemon on XFS because Ext4 has a limited xattr size. Best choice for large storage (>16TB): Ext4 does not support a single file larger than around 15.95TB. Scalability: any objection that XFS is best on this point? :) XFS also handles transaction concurrency better than Ext4; why? The maximum log size in XFS is 2038MB compared to 128MB in Ext4. Misc: Ext4 is widely used and has proven fast and stable under various loads and scenarios, XFS just needs more users, and Btrfs is still on the road to maturity. 
Ceph Introduction (led by Li Wang) This was a hot topic. Li gave us a nice introduction to the design as well as their current work. The Ceph client has actually been included in the Linux kernel since 2.6.34 and supported by OpenStack since Folsom, but it seems it has not yet been widely deployed in production environments. Their major work focuses on inline data support to separate metadata and data storage and reduce file access time: a file access needs two round trips, fetching the metadata from the MDS and then getting the data from an OSD, so small-file access is limited by network latency. The solution is that, for small files, they would like to store the data alongside the metadata, so that when accessing a small file the metadata server can push both metadata and data to the client at the same time. In this way they reduce the overhead of calculating the data offset and save the communication with the OSD. For this feature they have only run some small-scale testing, but they saw noticeable improvements. Test environment: Intel 2 CPU 12 core, 64GB RAM, Ubuntu 12.04, Ceph 0.56.6 with a 200GB SATA disk, 15 OSDs, 1 MDS, 1 MON. Sequential read performance for 1K-sized files improved by about 50%. I asked Li and Zheng Yan (the core developer of Ceph, who also worked on Btrfs) whether Ceph is really stable and can be deployed in production for large-scale, PB-level storage, but they could not give a positive answer; it looks like Ceph has not even been fully rolled out at Dreamhost (subject to confirmation). According to Li, they have only deployed Ceph for small-scale storage (32 nodes), although they would like to try 6000 nodes in the future. Improving Linux swap for flash storage (led by Shaohua Li) Because of its high density, low power and low price, flash storage (SSD) is a good candidate to partially replace DRAM, and a quick answer for this is using SSD as swap. But Linux swap was designed for slow hard disk storage, so there are a lot of challenges in using SSD efficiently for swap. SWAPOUT: swap_map scan. swap_map is the in-memory data structure that tracks swap disk usage, but it relies on a slow linear scan. This becomes a bottleneck when finding many adjacent pages on SSD. Shaohua Li has changed it to a cluster (128K) list, resulting in an O(1) algorithm. However, this approach needs strict cluster alignment and is only enabled for SSDs. IO pattern: in most cases swap IO has an interleaved pattern because of multiple reclaimers, or because a free cluster is shared by all reclaimers. Even though the block layer can merge interleaved IO to some extent, we cannot count on it completely. Hence a per-CPU cluster was added on top of the previous change; it helps each reclaimer do sequential IO, and the block layer can merge the IO more easily. TLB flush: if we are reclaiming an active page, we first move the page from the active LRU list to the inactive LRU list, and then reclaim the page from the inactive LRU to swap it out. During this process we need to clear the PTE twice: first the 'A' (ACCESSED) bit, then the 'P' (PRESENT) bit. Processors need to send lots of IPIs, which makes the TLB flush really expensive. Some work has been done to improve this, including reworking smp_call_function_many() or removing the first TLB flush on x86, but there are still some arguments here and only part of the work has been pushed to mainline. SWAPIN: a page fault does iodepth=1 synchronous IO, but it is a bit wasteful to issue only a page-sized IO. The obvious solution is swap readahead. 
But the current in-kernel swap readahead is arbitrary (always 8 pages), and it does not perform well for either random or sequential access workloads. Shaohua introduced a new flag for madvise(MADV_WILLNEED) to do swap prefetch, so the change happens in the userspace API and leaves the in-kernel readahead unchanged (though I think some improvement could also be done there). Swap discard: as we know, discard is important for SSD write throughput, but the current swap discard implementation is synchronous. He changed it to asynchronous discard, which allows discard and write to run at the same time; meanwhile, the unit of discard has also been optimized to a cluster. Misc: lock contention. With many concurrent swapouts and swapins, contention on locks such as anon_vma or swap_lock is high, so he changed swap_lock to a per-swap-device lock. There is still some lock contention on very high speed SSDs because of the swapcache address_space lock. Zproject (led by Bob Liu) Bob gave us a very nice introduction to the current state of memory compression. There are now three projects (zswap/zram/zcache) which all aim to smooth out swap IO storms and improve performance, but they each have their own pros and cons. ZSWAP is implemented on top of the frontswap API and uses a dynamic allocator named zbud to allocate free pages. Zbud means pairs of zpages are "buddied"; it can store at most two compressed pages in one page frame, so the maximum compression ratio is 50%. Each page frame is LRU-linked and can be shrunk under memory pressure. If the compressed memory pool reaches its limit, shrinking or reclaim happens: the page frame is decompressed into two newly allocated pages which are then written to the real swap device, but this can fail when allocating the two pages. ZRAM acts as a compressed ramdisk used as a swap device, and it uses zsmalloc as its allocator, which has high density but may have fragmentation issues. Besides, page reclaim is hard, since more pages are needed to uncompress and free just one page. ZRAM is preferred by embedded systems, which may not have any real swap device. Both ZRAM and ZSWAP are currently in the drivers/staging tree, and in the mm community there are discussions about merging ZRAM into ZSWAP or vice versa, but no agreement yet. ZCACHE handles file page compression, but it was recently removed from staging. From industry (led by Tang Jie, LSI) An LSI engineer introduced several new products to us. The first was RAID5/6 cards that use full-stripe writes to improve performance. The second was the SandForce flash controller, which can understand data file types (data entropy) to reduce write amplification (WA) for nearly all writes; this is called DuraWrite and the typical WA is 0.5. What's more, if its Dynamic Logical Capacity function is enabled, the controller can do data compression that is transparent to the upper layers. LSI testing shows that with this virtual capacity enabled a 1x TB drive can support up to 2x TB of capacity, but the application must monitor free flash space to maintain optimal performance and to guard against free flash space exhaustion. He said the most useful application is for databases. Another thing I think is worth mentioning is the NV-DRAM memory in NMR/Raptor, which is directly exposed to the host system: applications can access the NV-DRAM directly via a memory address, using the standard mmap() system call. He said it is very useful for database logging now. 
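As a rough illustration of the mmap() access pattern described here, the following C sketch maps a byte-addressable NV-DRAM region and writes a log record into it directly. The device node name (/dev/nvdram0) and the mapping length are assumptions made for the example, not the actual NMR/Raptor interface.

/* Minimal sketch of accessing NV-DRAM through mmap(); the device node
 * name and mapping size are illustrative assumptions. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void)
{
    const size_t len = 4096;                    /* map one page */
    int fd = open("/dev/nvdram0", O_RDWR);      /* hypothetical device node */
    if (fd < 0) { perror("open"); return 1; }

    char *nv = mmap(NULL, len, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    if (nv == MAP_FAILED) { perror("mmap"); close(fd); return 1; }

    /* The application writes its log record straight into the mapping;
     * no read()/write() system calls are needed on the hot path. */
    memcpy(nv, "commit record", 13);
    msync(nv, len, MS_SYNC);                    /* flush the mapped range */

    munmap(nv, len);
    close(fd);
    return 0;
}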
This kind of NVM product has begun to appear in recent years, and it is said that Samsung is building a research center in China for related products. IMHO, NVM will affect the current OS layers, especially the file system; for example, journaling may need to be redesigned to fully utilize such nonvolatile memory. OCFS2 (led by Canquan Shen) Without a doubt, Huawei has been the biggest contributor to OCFS2 over the past two years. They have posted 46 upstream patches and 39 of them have been merged. Their current project is based on 32/64-node clusters, but they have also tried 128 nodes at the experimental stage. The major work they are doing is support for ATS (atomic test and set), which can work together with the DLM. It looks like this idea was inspired by VMware's VMFS locking, i.e. http://blogs.vmware.com/vsphere/2012/05/vmfs-locking-uncovered.html CLK - 18th October 2013 Improving Linux Development with Better Tools (Andi Kleen) This talk focused on how to find and fix bugs as the complexity of Linux grows. Generally, we can do this with the following kinds of tools: Static code checkers, e.g. sparse, smatch, coccinelle, the clang checker, checkpatch, gcc -W/LTO, stanse. These can help check a lot of things, from simple mistakes to complex problems, but the challenges are: some are very slow, false positives, and it may take a concentrated effort to get the false positives down. In particular, no static checker I have found can follow indirect calls (“OO in C”, common in the kernel): struct foo_ops { int (*do_foo)(struct foo *obj); }; foo->do_foo(foo); Dynamic runtime checkers, e.g. thread checkers, kmemcheck, lockdep. Ideally all kernel code would come with a test suite, then someone could run all the dynamic checkers. Fuzzers/test suites, e.g. Trinity is a great tool; it finds many bugs, but needs a manual model for each syscall. Modern fuzzers use automatic feedback, but not for the kernel yet: http://taviso.decsystem.org/making_software_dumber.pdf Debuggers/tracers to understand code, e.g. ftrace, which can dump on events/oops/custom triggers, but in many cases there is still too much overhead to run it all the time while debugging. Tools to read/understand source, e.g. grep/cscope work great for many cases, but do not understand indirect pointers (the OO-in-C model used in the kernel), as in "give me all do_foo instances": struct foo_ops { int (*do_foo)(struct foo *obj); } my_ops = { .do_foo = my_foo }; foo->do_foo(foo); It would be great to have a cscope-like tool that understands this based on types/initializers. XFS: The High Performance Enterprise File System (Jeff Liu) [slides] I gave a talk introducing the disk layout, unique features, and the recent changes. The slides include some charts comparing performance between XFS, Btrfs and Ext4 for small files. About a dozen attendees raised their hands when I asked who had experience with XFS. I remember that when I asked the same question at LinuxCon Japan, only 3 people raised their hands, but they were Chris Mason, Ric Wheeler, and another attendee. The audience questions mainly focused on stability and comparisons with other file systems. Linux Containers (Feng Gao) The speaker explained the purpose of the various namespaces, including mount/UTS/IPC/Network/PID/User, as well as the system API/ABI. For the userspace tools, he mainly focused on Libvirt LXC rather than the LXC userspace tools. 
Libvirt LXC is another userspace container management tool, implemented as one type of libvirt driver; it can manage containers, create namespaces, create a private filesystem layout for a container, create devices for a container, and set up resource controllers via cgroups. In this talk, Feng also mentioned two more possible namespaces in the future: the first is an audit namespace, though it is not yet clear whether it should be folded into the user namespace or not; the other is about syslog, but the question is whether we really need it. In-memory Compression (Bob Liu) Same as at CLSF, a nice introduction that I have already covered above. Misc There were some other talks related to ACPI-based memory hotplug, smart wake-affinity in the scheduler, etc., but my head is not big enough to record all of those things. -- Jeff Liu

    Read the article

  • An Honest look at SharePoint Web Services

    - by juanlarios
INTRODUCTION If you are a SharePoint developer you know that there are two basic ways to develop against SharePoint: 1) the object model, 2) web services. The SharePoint object model has the advantage of being quite rich. Anything you can do through the SharePoint UI as an administrator or end user, you can do through the object model; in fact everything that is done through the UI is done through the object model behind the scenes. The major disadvantage to getting at SharePoint this way is that the code needs to run on the server. This means that web parts, event receivers, features, etc… all of this is code that is deployed to the server. The second way to get to SharePoint is through the built-in web services. There are many articles on how to call these web services, how to authenticate to them and how to interact with them. The basic idea is that a remote application or process can contact SharePoint through a web service, and a lot has been written about how great these web services are. This article is written to document the limitations, issues and frustrations of working with the SharePoint built-in web services. Ultimately, for the tasks I was given, the built-in web services did not suffice, so my evaluation compares them against creating my own WCF services to do what I needed. The project I'm working on right now involved several "integration points": a remote application, installed on a separate server, was to contact SharePoint and perform a task or operation. So I started up Visual Studio and built a DLL with basically two layers of logic: an integration layer and a data layer. A good friend of mine pointed me to the SOLID principles and referred me to some videos and tutorials about them. I decided to follow the methodology (although a lot of the principles are common sense and I had already incorporated them into my coding practices). I was to deliver this DLL to the application team and they would simply call the methods it exposed, and voila! it would perform some task or operation in SharePoint. SOLUTION My integration layer implemented an interface that defined the basic integration tasks I was to put together. My data layer was much the same: it implemented an interface with the tasks I was going to develop. This gave me the opportunity to develop different data layers, and ultimately different ways to get at SharePoint if I needed to. This is a classic SOLID principle. In this case it proved quite helpful, because I wrote one data layer entirely against the SharePoint built-in web services and another against a WCF service that I wrote myself. I should mention there is another layer underneath the data layer. When referencing the SharePoint or WCF services in my Visual Studio project, I created a class for every web service. So for example, for Lists.asmx I created a class called "DocumentRetrieval"; this class would do the grunt work of connecting to the correct URL, performing the basic operation of contacting the service, and so on. For Views.asmx, I implemented a class called "ViewRetrieval" with the same idea, but it would interact with the operations in Views.asmx. This gave my data layer the ability to perform multiple calls without really worrying about the grunt work each class performs. This again is a classic SOLID principle. So, to compare the two side by side, we can look at both data layers and what is involved in each. 
Let's take a look at the "Create Project" task or operation. The integration point is described as: "the dll is to provide a way to create a project in SharePoint". Projects, in this case, are basically document libraries; I am to implement a way in which a remote application can create a document library in SharePoint. Easy enough, right? Use the Lists.asmx web service in SharePoint. So here we go! Let's take a look at the code. I added the Lists.asmx web service reference to my project and this is the class that contacts it: class DocumentRetrieval { private ListsSoapClient _service; private bool _impersonation; public DocumentRetrieval(bool impersonation, string endpt) { _service = new ListsSoapClient(); this.SetEndPoint(string.Format("{0}/{1}", endpt, ConfigurationManager.AppSettings["List"])); _impersonation = impersonation; if (_impersonation) { _service.ClientCredentials.Windows.ClientCredential.Password = ConfigurationManager.AppSettings["password"]; _service.ClientCredentials.Windows.ClientCredential.UserName = ConfigurationManager.AppSettings["username"]; _service.ClientCredentials.Windows.AllowedImpersonationLevel = System.Security.Principal.TokenImpersonationLevel.Impersonation; } } private void SetEndPoint(string p) { _service.Endpoint.Address = new EndpointAddress(p); } /// <summary> /// Creates a document library with a specific name and template ID /// </summary> /// <param name="listName">New list name</param> /// <param name="templateID">Template ID</param> /// <returns></returns> public XmlElement CreateLibrary(string listName, int templateID, ref ExceptionContract exContract) { XmlDocument sample = new XmlDocument(); XmlElement viewCol = sample.CreateElement("Empty"); try { _service.Open(); viewCol = _service.AddList(listName, "", templateID); } catch (Exception ex) { exContract = new ExceptionContract("DocumentRetrieval/CreateLibrary", ex.GetType(), "Connection Error", ex.StackTrace, ExceptionContract.ExceptionCode.error); } finally { _service.Close(); } return viewCol; } } There was a lot more in this class (that I am not including) because I was reusing the grunt work and implementing other operations with Lists.asmx, for example updating content types and changing or configuring lists or document libraries. One of the first things I noticed about working with the built-in services is that you are really at the mercy of what is available to you. Before creating a document library (project) I wanted to expose an IsProjectExisting method, so the integration or data layer could recognize whether a library already exists. Well, there is no service call or method available to do that check. 
So this is what I wrote: public bool DocLibExists(string listName, ref ExceptionContract exContract) { try { var allLists = _service.GetListCollection(); return allLists.ChildNodes.OfType<XmlElement>().ToList().Exists(x => x.Attributes["Title"].Value == listName); } catch (Exception ex) { exContract = new ExceptionContract("DocumentRetrieval/GetList/GetListWSCall", ex.GetType(), "Unable to Retrieve List Collection", ex.StackTrace, ExceptionContract.ExceptionCode.error); } return false; } This really just gets an XmlElement containing all the lists; it was then up to me to sift through the clutter and noise and see whether the document library already existed. That took a little getting used to: instead of working with objects, you are working with the XmlElement response format from the web service. I wrote a LINQ query to go through it and check whether the "Title" attribute existed with a value equal to the list name; if so it returns true, otherwise false. I didn't particularly like working this way, dealing with XmlElement responses and then having to manipulate them to get at the exact data I was looking for. Once the DocLibExists check was done, I would either create the document library or send back an error indicating that the document library already existed. Now let's examine the code that actually creates the document library. It does what you are really after: it creates a document library. Notice how the template ID is really an integer. Every document library template in SharePoint has an ID associated with it: Document Library, Image Library, Custom List, Project Tasks, etc… they all have a unique integer associated with them. Well, that's great, but the client came back to me with some specifics that each "project" (document library) should have. They specified they had 3 types of projects, each project would have unique views (about 10 views per project), and each project specified unique configurations (auditing, versioning, content types, etc…). So what looked like a simple implementation of creating a document library as a repository for a project turned out to be quite involved. The first thing I thought of was to create a template for the document library. There are other ways you can do this too: using the web service calls, you can configure views, versioning, even content types, etc…; the only catch is that you have to work quite extensively with CAML. I am not fond of CAML. I can work with it, I just don't like doing it. It is quite touchy, and at times it is quite tough to understand where errors were made in CAML statements. Working with web services and CAML proved to be quite annoying: the service call would return a generic error message that did not point me to a CAML syntax error, or even to a CAML error at all, and I was not sure whether it was a security, performance or code-based issue. It was also difficult at times because of the way SharePoint handles metadata: there are "Name", "Display Name", and "StaticName" fields, and it was tough to understand which one to use, so it took a lot of trial and error. There are tools that can help with CAML generation, and there is now IntelliSense for CAML statements in Visual Studio that might help, but ultimately I'm not fond of CAML with web services. So I decided on the template. 
So my plan was to create a document library, configure it accordingly and then use the Template Builder that comes with the SharePoint SDK. This tool allows you to create site templates, list templates, etc… It is quite interesting because it does not generate an STP file; it actually generates an XML definition and a feature you can activate to make that template available on a site or site collection. The first issue I experienced with this was that one of the specifications for this template was that the "All Documents" view was to have 2 web parts on it. Well, it turns out that the Template Builder did not include the web parts as part of the list template definition it generated: it backed up the settings, the views and the content types, but not the custom web parts. I still decided to try this even without the web parts on the page. This new template defined a new document library definition with a unique ID. The problem was that the service call accepts an int but only has access to the built-in library template IDs; any new ones added or created are not available for creation. So this made it impossible for me to approach the problem this way. I should also mention that one of the nice features of SharePoint is the ability to create list templates, back them up and then create lists based on those templates, and it can all be done by end-user administrators. These templates are quite different because they are saved as an STP file and not an XML definition. I also went down this route and tried to see whether there was another service call where I could create a document library based on a given template name. Nope! None. After some thinking I decided to implement a WCF service to do this creation for me. I was quite certain that the object model would allow me to create document libraries both from a template that requires an ID and from templates saved as STP files. I won't bother posting the code that contacts the WCF service because it's self-explanatory, but I will post the code that I used to create a list with a custom template. public ServiceResult CreateProject(string name, string templateName, string projectId) { string siteurl = SPContext.Current.Site.Url; Guid webguid = SPContext.Current.Web.ID; using (SPSite site = new SPSite(siteurl)) { using (SPWeb rootweb = site.RootWeb) { SPListTemplateCollection temps = site.GetCustomListTemplates(rootweb); ProcessWeb(siteurl, webguid, web => Act_CreateProject(web, name, templateName, projectId, temps)); }//SPWeb }//SPSite return _globalResult; } private void Act_CreateProject(SPWeb targetsite, string name, string templateName, string projectId, SPListTemplateCollection temps) { var temp = temps.Cast<SPListTemplate>().FirstOrDefault(x => x.Name.Equals(templateName)); if (temp != null) { try { Guid listGuid = targetsite.Lists.Add(name, "", temp); SPList newList = targetsite.Lists[listGuid]; _globalResult = new ServiceResult(true, "Success", "Success"); } catch (Exception ex) { _globalResult = new ServiceResult(false, (string.IsNullOrEmpty(ex.Message) ? 
"None" : ex.Message + " " + templateName), ex.StackTrace.ToString()); } } } private void ProcessWeb(string siteurl, Guid webguid, Action<SPWeb> action) { using (SPSite sitecollection = new SPSite(siteurl)) { using (SPWeb web = sitecollection.AllWebs[webguid]) { action(web); } } } This is some of the code I implemented for the service; there was a lot more I did on project creation, which I will cover in my next blog post. I implemented an Action method to process the web, which allowed me to properly dispose of the SPWeb and SPSite objects without rewriting that code over and over again. So I implemented a WCF service to create projects for me. This allowed me to do a lot more than just create a document library from a template; it gave me the flexibility to do just about anything the client wanted at project creation. Once this was implemented, the client came back to me and said, "we reference all our projects with IDs in our application; we want SharePoint to do the same". This is something I have been doing for a little while now, but I do hope SharePoint 2010 will have more of an answer to this and address it properly. I have been adding metadata to SPWebs through the property bag; I believe I have blogged about it before. This time it required metadata to be added to a document library. No problem!!! I also mentioned the web parts that were to go on the "All Documents" view; I took the opportunity to configure them with the appropriate settings. There were two settings that needed to be set on these web parts, one of them being a Project ID configured in the web part properties. 
The following code enhances and replaces the "Act_CreateProject " method above:  private void Act_CreateProject(SPWeb targetsite, string name, string templateName, string projectId, SPListTemplateCollection temps) {                         var temp = temps.Cast<SPListTemplate>().FirstOrDefault(x => x.Name.Equals(templateName));             if (temp != null)             {                 SPLimitedWebPartManager wpmgr = null;                               try                 {                                         Guid listGuid = targetsite.Lists.Add(name, "", temp);                     SPList newList = targetsite.Lists[listGuid];                     SPFolder rootFolder = newList.RootFolder;                     rootFolder.Properties.Add(KEY, projectId);                     rootFolder.Update();                     if (rootFolder.ParentWeb != targetsite)                         rootFolder.ParentWeb.Dispose();                     if (!templateName.Contains("Natural"))                     {                         SPView alldocumentsview = newList.Views.Cast<SPView>().FirstOrDefault(x => x.Title.Equals(ALLDOCUMENTS));                         SPFile alldocfile = targetsite.GetFile(alldocumentsview.ServerRelativeUrl);                         wpmgr = alldocfile.GetLimitedWebPartManager(PersonalizationScope.Shared);                         ConfigureWebPart(wpmgr, projectId, CUSTOMWPNAME);                                              alldocfile.Update();                     }                                        if (newList.ParentWeb != targetsite)                         newList.ParentWeb.Dispose();                     _globalResult = new ServiceResult(true, "Success", "Success");                 }                 catch (Exception ex)                 {                     _globalResult = new ServiceResult(false, (string.IsNullOrEmpty(ex.Message) ? "None" : ex.Message + " " + templateName), ex.StackTrace.ToString());                 }                 finally                 {                     if (wpmgr != null)                     {                         wpmgr.Web.Dispose();                         wpmgr.Dispose();                     }                 }             }                         }       private void ConfigureWebPart(SPLimitedWebPartManager mgr, string prjId, string webpartname)         {             var wp = mgr.WebParts.Cast<System.Web.UI.WebControls.WebParts.WebPart>().FirstOrDefault(x => x.DisplayTitle.Equals(webpartname));             if (wp != null)             {                           (wp as ListRelationshipWebPart.ListRelationshipWebPart).ProjectID = prjId;                 mgr.SaveChanges(wp);             }         }   This Shows you how I was able to set metadata on the document library. It has to be added to the RootFolder of the document library, Unfortunately, the SPList does not have a Property bag that I can add a key\value pair to. It has to be done on the root folder. Now everything in the integration will reference projects by ID's and will not care about names. My, "DocLibExists" will now need to be changed because a web service is not set up to look at property bags.  I had to write another method on the Service to do the equivalent but with ID's instead of names.  The second thing you will notice about the code is the use of the Webpartmanager. I have seen several examples online, and also read a lot about memory leaks, The above code does not produce memory leaks. The web part manager creates an SPWeb, so just dispose it like I did. 
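The post deliberately skips the code that contacts the WCF service, so purely as an illustrative sketch (the IProjectService contract name, the BasicHttpBinding, the endpoint URL and the ServiceResult stand-in below are assumptions, not details from the post), a remote caller might look something like this:

// Hypothetical client-side sketch for calling the project-creation WCF service.
// Contract name, binding, endpoint URL and the ServiceResult stand-in are assumptions.
using System;
using System.Runtime.Serialization;
using System.ServiceModel;

[DataContract]
public class ServiceResult   // stand-in for the ServiceResult type used in the post
{
    [DataMember] public bool Success { get; set; }
    [DataMember] public string Message { get; set; }
    [DataMember] public string Detail { get; set; }
}

[ServiceContract]
public interface IProjectService
{
    // Mirrors the CreateProject(name, templateName, projectId) operation shown above.
    [OperationContract]
    ServiceResult CreateProject(string name, string templateName, string projectId);
}

public class ProjectServiceCaller
{
    public static void Main()
    {
        var factory = new ChannelFactory<IProjectService>(
            new BasicHttpBinding(),
            new EndpointAddress("http://sharepoint/_vti_bin/ProjectService.svc")); // assumed URL

        IProjectService client = factory.CreateChannel();
        try
        {
            ServiceResult result = client.CreateProject("Project X", "ProjectTemplate", "PRJ-001");
            Console.WriteLine("Created: {0} - {1}", result.Success, result.Message);
        }
        finally
        {
            ((IClientChannel)client).Close();
            factory.Close();
        }
    }
}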
CONCLUSION This is a long, long post, so I will stop here for now; I will continue with more comparisons and limitations in my next post. My conclusion for this example is that the web services will do the trick if you can suffer through CAML and you are doing some simple operations. For everything else, there's WCF! I apologize for the disorganization of this post; I was on a bus on a 12-hour trip to Iowa while I wrote it, half asleep and half awake, and hopefully it makes enough sense to someone.

    Read the article

  • CodePlex Daily Summary for Tuesday, May 18, 2010

    CodePlex Daily Summary for Tuesday, May 18, 2010New ProjectsCafeControl: Supports Remote LOGIN,Remote logout ,Account Creation ,Account LOGOUT ,Temporary Login ,SMS Reported and many other features Requires .net 4.0 Cloud & Contacts: Cloud Contacts makes it easier for share your contacts.Cow connect: Ziel des Projektes Cow connect, ist es ein Tool zu schrieben das Verschiedene Datenbanken, unterschiedlicher Herdenmanagement Tool z.b: Helm Multik...DNN Simple Blog: A simplified blog module that adheres to the DNN API and is designed for a single blog author per module instance. The module also makes use of Web...dotSpatial: dotSpatial is an open source project focused on developing a core set of GIS and mapping libraries that live together harmoniously in the System.Sp...Dynamic Survey Forms - SharePoint Web Part: Create manage dynamic survey forms as SharePoint web part. Record survey score for logged user or for someone else. This project has been designed ...EDXL Sharp: EDXLSharp is a C# / .NET 3.5 implementation of the OASIS Emergency Data Exchange Language (EDXL) family of standards. This set of libraries can be...EPiServer CMS 6 Visual Studio Project Template for VB w/ Public Templates: This is a Visual Studio 2008 Project Template with will allow the creation of a EPiServer CMS 6 project set up as and with all code in Visual Basic...Functional Command Toolbar for Windows: A floating window with a text box for typing functional command and executing it. The tool supports .NET addin, functional scripting and other feat...GameFX: Silverlight Game Development LibraryLightweight Fluent Workflow: ObjectFlow is a lightweight workflow engine for creating & executing workflows. The fluent interface makes it easy to define and understand workf...LinqSpecs: A toolset for use the specification pattern in linq queries.Money Watch: Personal Finances management system written in C#, NHibernate and SQL express.Multi-screen Viewer: This viewer allows to open and view pdf file (presentation) on multiple screens. There is no need to see the file in fullscreen on each screen (mon...neo-tsql: set of stored procedures and functions for sql serverNHTrace: NHTrace is a tool for tracing sql commands executed by NHibernate with syntax highlighting.NQueue: NQueue provides an enterprise level work scheduling and execution framework and toolset that contains no single point of failure. Using a farm of s...Online Cash Manager: Online Cash Manager based on ASP .NET MVC2 VS 2010 RTM MVC 2POCO Bridge: Bridging Silverlight and full .NET apps.REngine - game engine in Silverlight: REngine makes it easier for game developers to develop games in Microsoft Silverlight. RunAs Launcher: RunAs Launcher is a C# utility that provides a GUI for running applications under different credentials. It works in situations where the built-in ...secs4net: SECS-II/GEM/HSMS implementation on .NET. This library provide easy way to communicate with SEMI standard compatible device.SharePoint Admin Dashboard: SharePoint Dashboard for admins. Allows lightening fast multiple server management. RDP doesn't scale. Manage 10 servers easier than 1 with i...Silver spring: saltSocial Map: Social map is a social network based on geograpghical informationTweetZone: TweetZone is new type of twitter client application include DATABASE in it, and it shows you STATS. 
This Application's cache makes it faster to acc...Yet Another Database Viewer: Yet Another Database Viewer is developed for a basic database view and editing so you don't have to install anything. It's developed in c#.New Releases3FD - Framework For Fast Development: Alpha 1: The first test release. There is still some bugs, but it is functional. The garbage collector is showing memory leaking that must be corrected in t...Ajax Toolkit for ASP.NET MVC: MvcAjaxToolkit gridext with ContextMenu and Tmpl: MvcAjaxToolkit gridext with ContextMenu and Tmpl gridext is a extension for flexigridASP.NET MVC Extensions: SP1 Preview: SP1 Preview ========= 1. Autofac support added. 2. Changed Windsor Adapter. IWindsorInstaller is used instead of IModule.Book Cataloger: BookCataloger1.0.7a: New Features: Author editor form prototype Improvements: .NET Framework 4.0 required Input checking improved Comment edit loads and saves text...Braintree Client Library: Braintree-2.2.0: Prevent race condition when pulling back collection results -- search results represent the state of the data at the time the query was run Renam...CassiniDev - Cassini 3.5/4.0 Developers Edition: CassiniDev 3.5.1 and 4.0.1 beta 2: Documentation New in CassiniDev v3.5.1.0/v4.0.1.0 Added .Net 4 / VS10 build. Simplified test fixtures. Un-Refactored the not-so-simple MVP pa...dotNetTips: dotNetTips.Utility 3.5 R3: This is a new release (version 3.5.0.4) compatible with .NET 3.5. Requires SP1 if using the Entity Framework extensions. This is a minor update fro...Dynamic Survey Forms - SharePoint Web Part: Dynamic Survey forms for SharePoint. Alpha 1.0.1: Alpha release. Before running installer create database from script attached in zip file. In your web.config make sure your first connection strin...Event Scavenger: Viewer version 3.2.1: Added quick filters on event source and event id dialog boxes. Collector and Admin tool unaffected.Expression Blend Samples: Blend 4 Sketch Mockups Library: This library provides a set of commonly needed controls, icons and cursors to use in SketchFlow applications. After running the installer, create ...Fluent Ribbon Control Suite: Fluent Ribbon Control Suite 1.3: Fluent Ribbon Control Suite 1.3(supports .NET 3.5 and .NET 4 RTM) Includes: Fluent.dll (with .pdb and .xml) Showcase Application Samples Found...Home Access Plus+: v4.2.2: Version 4.2.2 Change Log: Changes to how mycomputer handles NTFS permissions File Changes: ~/Bin/HAP.Web.dll ~/Bin/HAP.Web.pdbIdeaNMR: IdeaNMR Web App PreAlpha 0.1: This is the first release.IP Informer: Beta Release: V0.8.0.0 Beta.LinkedIn® for Windows Mobile: LinkedIn for Windows Mobile v0.9: Main updates for this release Fixed Status update. (updates where not correctly passed on to LinkedIn) Added landscape/GSensor support. (Currentl...LINQ to Twitter: LINQ to Twitter Beta v2.0.11: New items added since v1.1 include: Support for OAuth (via DotNetOpenAuth), secure communication via https, VB language support, serialization of ...LinqSpecs: Version 1.0 alpha: This is the alpha version of LinqSpecs.miniTodo: mini Todo version 0.2: ・デザインを透明主体に変更  -件数を表示している部分のドラッグでウィンドウ移動  -上記の部分右クリックで、「最前面に表示」、「全アイテム管理」 ・グラフを日/週/月単位の3種類に増やした ・新規作成、完了時にアニメーション追加。完了時にはサウンドも追加 ・テキスト未入力時は追加ボタンも非表示My Notepad: My Notepad (Beta): This is the Beta version of My Notepad. The software is stable enough for you to use. 
Enjoy the flexibility of docking and also the all new Syntax ...NHTrace: NHTrace-45713: NHTrace-45713Nito.KitchenSink: Version 8: New features (since Version 5) Fixed null reference bug in ExceptionExtensions. Added DynamicStaticTypeMembers and RefOutArg for dynamically (lat...Nito.LINQ: Beta (v0.5): Rx version The "with Rx" versions of Nito.LINQ are built against Rx 1.0.2521.102, released 2010-05-14. Breaking changes Corrected internal read-on...Object/Relational Mapper & Code Generator in Net 2.0 for Relational & XML Schema: 2.9: Work on UI templates for associative tables (2-column PK), using users/roles pages as an example. Added templates for Not-In-Group sql and cache-ba...patterns & practices - GAX Extensions Library: GEL for gax2010: This is the GEL for GAX 2010, support Visual Stuido 2010patterns & practices - Smart Client Guidance: Smart Client Software Factory 2010 Documentation: If the right-side pane of the chm file is not displayed correctly, do the following: 1) Download SCSF2010Guide.chm file. 2) Start the windows explo...patterns & practices - Windows Azure Guidance: WAAG - Part 1 - Release Candidate: "Release Candidate" for Part 1 of the Windows Azure Guide Highlights of this release are: Code samples complete. Fixed few bugs on "Dependency Ch...Rawr: Rawr 2.3.17: >Rawr3 Public Beta has been released! Click here for details.< - Lots of improvements to the default data files. There is a known issue with the s...RunAs Launcher: RunAs Launcher 1.2: This is the first version being released to CodePlex. Simply extract the file and run the executable. For those that wish to download the source c...Rx Contrib: V1.5: Bug fixsecs4net: Release 1.0: Notes: Runtime requirement: .Net framework 2.0 SP2 with System.Core(.NET 3.5), System.Threading(Rx for 3.5 SP1)SharePoint Admin Dashboard: SPDashboard v1.0: SPDashboard v1.0ShortURL Creator: ShortURL Creator 1.3.0.0: Added new provider u.nu and minimum UI changesStyleCop+: StyleCop+ 0.7: StyleCop+ is now fully compatible with StyleCop 4.4. The following entities were supported in Advanced Naming Rules: - Delegate - Event - Property...Value Injecter: an aspect oriented mapper: Value Injecter 1.2: ValueInjecter library, Sample solution that contains: web-forms sample project win-forms sample project unit tests samplesVCC: Latest build, v2.1.30517.0: Automatic drop of latest buildVCC: Latest build, v2.1.30517.1: Automatic drop of latest buildVidCoder: 0.4.1: Changes: Marks system as "working" to prevent computer from sleeping during an encode. CPU priority changed to BelowNormal during encodes. Enco...WSP Listener: WSP Listener version 2.0.0.0: This new version includes: All assemblies and required assets in one WSP Seperated code in library assembly Activate the WSP Listener with one...Yet Another Database Viewer: Beta: first release of the programYet another developer blog - Examples: Asynchronous TreeView in ASP.NET WebForms: This sample application shows how to use jQuery TreeView plugin for creating an asynchronous TreeView in ASP.NET WebForms. 
This application is acco...Most Popular ProjectsRawrWBFS ManagerAJAX Control ToolkitMicrosoft SQL Server Product Samples: DatabaseSilverlight ToolkitWindows Presentation Foundation (WPF)patterns & practices – Enterprise LibraryMicrosoft SQL Server Community & SamplesPHPExcelASP.NETMost Active Projectspatterns & practices – Enterprise LibraryPHPExcelRawrBlogEngine.NETMicrosoft Biology FoundationCustomer Portal Accelerator for Microsoft Dynamics CRMWindows Azure Command-line Tools for PHP DevelopersGMap.NET - Great Maps for Windows Forms & PresentationCassiniDev - Cassini 3.5/4.0 Developers EditionDotNetZip Library

    Read the article

  • Advanced TSQL Tuning: Why Internals Knowledge Matters

    - by Paul White
    There is much more to query tuning than reducing logical reads and adding covering nonclustered indexes.  Query tuning is not complete as soon as the query returns results quickly in the development or test environments.  In production, your query will compete for memory, CPU, locks, I/O and other resources on the server.  Today’s entry looks at some tuning considerations that are often overlooked, and shows how deep internals knowledge can help you write better TSQL. As always, we’ll need some example data.  In fact, we are going to use three tables today, each of which is structured like this: Each table has 50,000 rows made up of an INTEGER id column and a padding column containing 3,999 characters in every row.  The only difference between the three tables is in the type of the padding column: the first table uses CHAR(3999), the second uses VARCHAR(MAX), and the third uses the deprecated TEXT type.  A script to create a database with the three tables and load the sample data follows: USE master; GO IF DB_ID('SortTest') IS NOT NULL DROP DATABASE SortTest; GO CREATE DATABASE SortTest COLLATE LATIN1_GENERAL_BIN; GO ALTER DATABASE SortTest MODIFY FILE ( NAME = 'SortTest', SIZE = 3GB, MAXSIZE = 3GB ); GO ALTER DATABASE SortTest MODIFY FILE ( NAME = 'SortTest_log', SIZE = 256MB, MAXSIZE = 1GB, FILEGROWTH = 128MB ); GO ALTER DATABASE SortTest SET ALLOW_SNAPSHOT_ISOLATION OFF ; ALTER DATABASE SortTest SET AUTO_CLOSE OFF ; ALTER DATABASE SortTest SET AUTO_CREATE_STATISTICS ON ; ALTER DATABASE SortTest SET AUTO_SHRINK OFF ; ALTER DATABASE SortTest SET AUTO_UPDATE_STATISTICS ON ; ALTER DATABASE SortTest SET AUTO_UPDATE_STATISTICS_ASYNC ON ; ALTER DATABASE SortTest SET PARAMETERIZATION SIMPLE ; ALTER DATABASE SortTest SET READ_COMMITTED_SNAPSHOT OFF ; ALTER DATABASE SortTest SET MULTI_USER ; ALTER DATABASE SortTest SET RECOVERY SIMPLE ; USE SortTest; GO CREATE TABLE dbo.TestCHAR ( id INTEGER IDENTITY (1,1) NOT NULL, padding CHAR(3999) NOT NULL,   CONSTRAINT [PK dbo.TestCHAR (id)] PRIMARY KEY CLUSTERED (id), ) ; CREATE TABLE dbo.TestMAX ( id INTEGER IDENTITY (1,1) NOT NULL, padding VARCHAR(MAX) NOT NULL,   CONSTRAINT [PK dbo.TestMAX (id)] PRIMARY KEY CLUSTERED (id), ) ; CREATE TABLE dbo.TestTEXT ( id INTEGER IDENTITY (1,1) NOT NULL, padding TEXT NOT NULL,   CONSTRAINT [PK dbo.TestTEXT (id)] PRIMARY KEY CLUSTERED (id), ) ; -- ============= -- Load TestCHAR (about 3s) -- ============= INSERT INTO dbo.TestCHAR WITH (TABLOCKX) ( padding ) SELECT padding = REPLICATE(CHAR(65 + (Data.n % 26)), 3999) FROM ( SELECT TOP (50000) n = ROW_NUMBER() OVER (ORDER BY (SELECT 0)) - 1 FROM master.sys.columns C1, master.sys.columns C2, master.sys.columns C3 ORDER BY n ASC ) AS Data ORDER BY Data.n ASC ; -- ============ -- Load TestMAX (about 3s) -- ============ INSERT INTO dbo.TestMAX WITH (TABLOCKX) ( padding ) SELECT CONVERT(VARCHAR(MAX), padding) FROM dbo.TestCHAR ORDER BY id ; -- ============= -- Load TestTEXT (about 5s) -- ============= INSERT INTO dbo.TestTEXT WITH (TABLOCKX) ( padding ) SELECT CONVERT(TEXT, padding) FROM dbo.TestCHAR ORDER BY id ; -- ========== -- Space used -- ========== -- EXECUTE sys.sp_spaceused @objname = 'dbo.TestCHAR'; EXECUTE sys.sp_spaceused @objname = 'dbo.TestMAX'; EXECUTE sys.sp_spaceused @objname = 'dbo.TestTEXT'; ; CHECKPOINT ; That takes around 15 seconds to run, and shows the space allocated to each table in its output: To illustrate the points I want to make today, the example task we are going to set ourselves is to return a random set of 150 rows from each table.  
The basic shape of the test query is the same for each of the three test tables: SELECT TOP (150) T.id, T.padding FROM dbo.Test AS T ORDER BY NEWID() OPTION (MAXDOP 1) ; Test 1 – CHAR(3999) Running the template query shown above using the TestCHAR table as the target, we find that the query takes around 5 seconds to return its results.  This seems slow, considering that the table only has 50,000 rows.  Working on the assumption that generating a GUID for each row is a CPU-intensive operation, we might try enabling parallelism to see if that speeds up the response time.  Running the query again (but without the MAXDOP 1 hint) on a machine with eight logical processors, the query now takes 10 seconds to execute – twice as long as when run serially. Rather than attempting further guesses at the cause of the slowness, let’s go back to serial execution and add some monitoring.  The script below monitors STATISTICS IO output and the amount of tempdb used by the test query.  We will also run a Profiler trace to capture any warnings generated during query execution. DECLARE @read BIGINT, @write BIGINT ; SELECT @read = SUM(num_of_bytes_read), @write = SUM(num_of_bytes_written) FROM tempdb.sys.database_files AS DBF JOIN sys.dm_io_virtual_file_stats(2, NULL) AS FS ON FS.file_id = DBF.file_id WHERE DBF.type_desc = 'ROWS' ; SET STATISTICS IO ON ; SELECT TOP (150) TC.id, TC.padding FROM dbo.TestCHAR AS TC ORDER BY NEWID() OPTION (MAXDOP 1) ; SET STATISTICS IO OFF ; SELECT tempdb_read_MB = (SUM(num_of_bytes_read) - @read) / 1024. / 1024., tempdb_write_MB = (SUM(num_of_bytes_written) - @write) / 1024. / 1024., internal_use_MB = ( SELECT internal_objects_alloc_page_count / 128.0 FROM sys.dm_db_task_space_usage WHERE session_id = @@SPID ) FROM tempdb.sys.database_files AS DBF JOIN sys.dm_io_virtual_file_stats(2, NULL) AS FS ON FS.file_id = DBF.file_id WHERE DBF.type_desc = 'ROWS' ; Let’s take a closer look at the statistics and query plan generated from this: Following the flow of the data from right to left, we see the expected 50,000 rows emerging from the Clustered Index Scan, with a total estimated size of around 191MB.  The Compute Scalar adds a column containing a random GUID (generated from the NEWID() function call) for each row.  With this extra column in place, the size of the data arriving at the Sort operator is estimated to be 192MB. Sort is a blocking operator – it has to examine all of the rows on its input before it can produce its first row of output (the last row received might sort first).  This characteristic means that Sort requires a memory grant – memory allocated for the query’s use by SQL Server just before execution starts.  In this case, the Sort is the only memory-consuming operator in the plan, so it has access to the full 243MB (248,696KB) of memory reserved by SQL Server for this query execution. Notice that the memory grant is significantly larger than the expected size of the data to be sorted.  SQL Server uses a number of techniques to speed up sorting, some of which sacrifice size for comparison speed.  Sorts typically require a very large number of comparisons, so this is usually a very effective optimization.  One of the drawbacks is that it is not possible to exactly predict the sort space needed, as it depends on the data itself.  SQL Server takes an educated guess based on data types, sizes, and the number of rows expected, but the algorithm is not perfect. 
In spite of the large memory grant, the Profiler trace shows a Sort Warning event (indicating that the sort ran out of memory), and the tempdb usage monitor shows that 195MB of tempdb space was used – all of that for system use.  The 195MB represents physical write activity on tempdb, because SQL Server strictly enforces memory grants – a query cannot ‘cheat’ and effectively gain extra memory by spilling to tempdb pages that reside in memory.  Anyway, the key point here is that it takes a while to write 195MB to disk, and this is the main reason that the query takes 5 seconds overall. If you are wondering why using parallelism made the problem worse, consider that eight threads of execution result in eight concurrent partial sorts, each receiving one eighth of the memory grant.  The eight sorts all spilled to tempdb, resulting in inefficiencies as the spilled sorts competed for disk resources.  More importantly, there are specific problems at the point where the eight partial results are combined, but I’ll cover that in a future post. CHAR(3999) Performance Summary: 5 seconds elapsed time 243MB memory grant 195MB tempdb usage 192MB estimated sort set 25,043 logical reads Sort Warning Test 2 – VARCHAR(MAX) We’ll now run exactly the same test (with the additional monitoring) on the table using a VARCHAR(MAX) padding column: DECLARE @read BIGINT, @write BIGINT ; SELECT @read = SUM(num_of_bytes_read), @write = SUM(num_of_bytes_written) FROM tempdb.sys.database_files AS DBF JOIN sys.dm_io_virtual_file_stats(2, NULL) AS FS ON FS.file_id = DBF.file_id WHERE DBF.type_desc = 'ROWS' ; SET STATISTICS IO ON ; SELECT TOP (150) TM.id, TM.padding FROM dbo.TestMAX AS TM ORDER BY NEWID() OPTION (MAXDOP 1) ; SET STATISTICS IO OFF ; SELECT tempdb_read_MB = (SUM(num_of_bytes_read) - @read) / 1024. / 1024., tempdb_write_MB = (SUM(num_of_bytes_written) - @write) / 1024. / 1024., internal_use_MB = ( SELECT internal_objects_alloc_page_count / 128.0 FROM sys.dm_db_task_space_usage WHERE session_id = @@SPID ) FROM tempdb.sys.database_files AS DBF JOIN sys.dm_io_virtual_file_stats(2, NULL) AS FS ON FS.file_id = DBF.file_id WHERE DBF.type_desc = 'ROWS' ; This time the query takes around 8 seconds to complete (3 seconds longer than Test 1).  Notice that the estimated row and data sizes are very slightly larger, and the overall memory grant has also increased very slightly to 245MB.  The most marked difference is in the amount of tempdb space used – this query wrote almost 391MB of sort run data to the physical tempdb file.  Don’t draw any general conclusions about VARCHAR(MAX) versus CHAR from this – I chose the length of the data specifically to expose this edge case.  In most cases, VARCHAR(MAX) performs very similarly to CHAR – I just wanted to make test 2 a bit more exciting. MAX Performance Summary: 8 seconds elapsed time 245MB memory grant 391MB tempdb usage 193MB estimated sort set 25,043 logical reads Sort warning Test 3 – TEXT The same test again, but using the deprecated TEXT data type for the padding column: DECLARE @read BIGINT, @write BIGINT ; SELECT @read = SUM(num_of_bytes_read), @write = SUM(num_of_bytes_written) FROM tempdb.sys.database_files AS DBF JOIN sys.dm_io_virtual_file_stats(2, NULL) AS FS ON FS.file_id = DBF.file_id WHERE DBF.type_desc = 'ROWS' ; SET STATISTICS IO ON ; SELECT TOP (150) TT.id, TT.padding FROM dbo.TestTEXT AS TT ORDER BY NEWID() OPTION (MAXDOP 1, RECOMPILE) ; SET STATISTICS IO OFF ; SELECT tempdb_read_MB = (SUM(num_of_bytes_read) - @read) / 1024. 
/ 1024., tempdb_write_MB = (SUM(num_of_bytes_written) - @write) / 1024. / 1024., internal_use_MB = ( SELECT internal_objects_alloc_page_count / 128.0 FROM sys.dm_db_task_space_usage WHERE session_id = @@SPID ) FROM tempdb.sys.database_files AS DBF JOIN sys.dm_io_virtual_file_stats(2, NULL) AS FS ON FS.file_id = DBF.file_id WHERE DBF.type_desc = 'ROWS' ; This time the query runs in 500ms.  If you look at the metrics we have been checking so far, it’s not hard to understand why: TEXT Performance Summary: 0.5 seconds elapsed time 9MB memory grant 5MB tempdb usage 5MB estimated sort set 207 logical reads 596 LOB logical reads Sort warning SQL Server’s memory grant algorithm still underestimates the memory needed to perform the sorting operation, but the size of the data to sort is so much smaller (5MB versus 193MB previously) that the spilled sort doesn’t matter very much.  Why is the data size so much smaller?  The query still produces the correct results – including the large amount of data held in the padding column – so what magic is being performed here? TEXT versus MAX Storage The answer lies in how columns of the TEXT data type are stored.  By default, TEXT data is stored off-row in separate LOB pages – which explains why this is the first query we have seen that records LOB logical reads in its STATISTICS IO output.  You may recall from my last post that LOB data leaves an in-row pointer to the separate storage structure holding the LOB data. SQL Server can see that the full LOB value is not required by the query plan until results are returned, so instead of passing the full LOB value down the plan from the Clustered Index Scan, it passes the small in-row structure instead.  SQL Server estimates that each row coming from the scan will be 79 bytes long – 11 bytes for row overhead, 4 bytes for the integer id column, and 64 bytes for the LOB pointer (in fact the pointer is rather smaller – usually 16 bytes – but the details of that don’t really matter right now). OK, so this query is much more efficient because it is sorting a very much smaller data set – SQL Server delays retrieving the LOB data itself until after the Sort starts producing its 150 rows.  The question that normally arises at this point is: Why doesn’t SQL Server use the same trick when the padding column is defined as VARCHAR(MAX)? The answer is connected with the fact that if the actual size of the VARCHAR(MAX) data is 8000 bytes or less, it is usually stored in-row in exactly the same way as for a VARCHAR(8000) column – MAX data only moves off-row into LOB storage when it exceeds 8000 bytes.  The default behaviour of the TEXT type is to be stored off-row by default, unless the ‘text in row’ table option is set suitably and there is room on the page.  There is an analogous (but opposite) setting to control the storage of MAX data – the ‘large value types out of row’ table option.  By enabling this option for a table, MAX data will be stored off-row (in a LOB structure) instead of in-row.  SQL Server Books Online has good coverage of both options in the topic In Row Data. The MAXOOR Table The essential difference, then, is that MAX defaults to in-row storage, and TEXT defaults to off-row (LOB) storage.  You might be thinking that we could get the same benefits seen for the TEXT data type by storing the VARCHAR(MAX) values off row – so let’s look at that option now.  
The MAXOOR Table

The essential difference, then, is that MAX defaults to in-row storage, and TEXT defaults to off-row (LOB) storage.  You might be thinking that we could get the same benefits seen for the TEXT data type by storing the VARCHAR(MAX) values off row – so let’s look at that option now.  This script creates a fourth table, with the VARCHAR(MAX) data stored off-row in LOB pages:

    CREATE TABLE dbo.TestMAXOOR
    (
        id      INTEGER IDENTITY (1,1) NOT NULL,
        padding VARCHAR(MAX) NOT NULL,

        CONSTRAINT [PK dbo.TestMAXOOR (id)]
            PRIMARY KEY CLUSTERED (id),
    ) ;

    EXECUTE sys.sp_tableoption
        @TableNamePattern = N'dbo.TestMAXOOR',
        @OptionName = 'large value types out of row',
        @OptionValue = 'true' ;

    SELECT  large_value_types_out_of_row
    FROM    sys.tables
    WHERE   [schema_id] = SCHEMA_ID(N'dbo')
    AND     name = N'TestMAXOOR' ;

    INSERT INTO dbo.TestMAXOOR WITH (TABLOCKX)
    (
        padding
    )
    SELECT  SPACE(0)
    FROM    dbo.TestCHAR
    ORDER BY id ;

    UPDATE  TM WITH (TABLOCK)
    SET     padding.WRITE (TC.padding, NULL, NULL)
    FROM    dbo.TestMAXOOR AS TM
    JOIN    dbo.TestCHAR AS TC
            ON TC.id = TM.id ;

    EXECUTE sys.sp_spaceused @objname = 'dbo.TestMAXOOR' ;

    CHECKPOINT ;

Test 4 – MAXOOR

We can now re-run our test on the MAXOOR (MAX out of row) table:

    DECLARE @read BIGINT, @write BIGINT ;

    SELECT  @read = SUM(num_of_bytes_read),
            @write = SUM(num_of_bytes_written)
    FROM    tempdb.sys.database_files AS DBF
    JOIN    sys.dm_io_virtual_file_stats(2, NULL) AS FS
            ON FS.file_id = DBF.file_id
    WHERE   DBF.type_desc = 'ROWS' ;

    SET STATISTICS IO ON ;

    SELECT TOP (150)
            MO.id, MO.padding
    FROM    dbo.TestMAXOOR AS MO
    ORDER BY NEWID()
    OPTION  (MAXDOP 1, RECOMPILE) ;

    SET STATISTICS IO OFF ;

    SELECT  tempdb_read_MB = (SUM(num_of_bytes_read) - @read) / 1024. / 1024.,
            tempdb_write_MB = (SUM(num_of_bytes_written) - @write) / 1024. / 1024.,
            internal_use_MB =
            (
                SELECT  internal_objects_alloc_page_count / 128.0
                FROM    sys.dm_db_task_space_usage
                WHERE   session_id = @@SPID
            )
    FROM    tempdb.sys.database_files AS DBF
    JOIN    sys.dm_io_virtual_file_stats(2, NULL) AS FS
            ON FS.file_id = DBF.file_id
    WHERE   DBF.type_desc = 'ROWS' ;

MAXOOR Performance Summary:
- 0.3 seconds elapsed time
- 245MB memory grant
- 0MB tempdb usage
- 193MB estimated sort set
- 207 logical reads
- 446 LOB logical reads
- No sort warning

The query runs very quickly – slightly faster than Test 3, and without spilling the sort to tempdb (there is no sort warning in the trace, and the monitoring query shows zero tempdb usage by this query).  SQL Server is passing the in-row pointer structure down the plan and only looking up the LOB value on the output side of the sort.

The Hidden Problem

There is still a huge problem with this query though – it requires a 245MB memory grant.  No wonder the sort doesn’t spill to tempdb now – 245MB is about 20 times more memory than this query actually requires to sort 50,000 records containing LOB data pointers.  Notice that the estimated row and data sizes in the plan are the same as in Test 2 (where the MAX data was stored in-row).

The optimizer assumes that MAX data is stored in-row, regardless of the sp_tableoption setting ‘large value types out of row’.  Why?  Because this option is dynamic – changing it does not immediately force all MAX data in the table in-row or off-row; values only move when data is added or actually changed.  SQL Server does not keep statistics to show how much MAX or TEXT data is currently in-row, and how much is stored in LOB pages.  This is an annoying limitation, and one which I hope will be addressed in a future version of the product.
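Although the optimizer has no statistics for it, you can at least see how a table’s pages are currently split between in-row and LOB storage.  This is a rough illustration using a catalog DMV, not something from the original article:

    -- Page counts by storage type for the three LOB-capable test tables.
    SELECT  OBJECT_NAME(ps.[object_id])          AS table_name,
            SUM(ps.in_row_data_page_count)       AS in_row_pages,
            SUM(ps.row_overflow_used_page_count) AS row_overflow_pages,
            SUM(ps.lob_used_page_count)          AS lob_pages
    FROM    sys.dm_db_partition_stats AS ps
    WHERE   ps.[object_id] IN
            (
                OBJECT_ID(N'dbo.TestMAX'),
                OBJECT_ID(N'dbo.TestMAXOOR'),
                OBJECT_ID(N'dbo.TestTEXT')
            )
    GROUP BY ps.[object_id] ;

For the tables in this post you would expect TestMAX to show mostly in-row pages, and TestTEXT and TestMAXOOR to show mostly LOB pages; the optimizer, however, does not consult this information when it estimates row sizes.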
So why should we worry about this?  Excessive memory grants reduce concurrency and may result in queries waiting on the RESOURCE_SEMAPHORE wait type while they wait for memory they do not need.  245MB is an awful lot of memory, especially on 32-bit versions where memory grants cannot use AWE-mapped memory.  Even on a 64-bit server with plenty of memory, do you really want a single query to consume 0.25GB of memory unnecessarily?  That’s 32,000 8KB pages that might be put to much better use.

The Solution

The answer is not to switch to the TEXT data type for the padding column.  That solution happens to have better performance characteristics for this specific query, but it still results in a spilled sort, and it is hard to recommend the use of a data type which is scheduled for removal.  I hope it is clear to you that the fundamental problem here is that SQL Server sorts the whole set arriving at a Sort operator.  Clearly, it is not efficient to sort the whole table in memory just to return 150 rows in a random order.

The TEXT example was more efficient because it dramatically reduced the size of the set that needed to be sorted.  We can do the same thing by selecting 150 unique keys from the table at random (sorting by NEWID() for example) and only then retrieving the large padding column values for just the 150 rows we need.  The following script implements that idea for all four tables:

    SET STATISTICS IO ON ;

    WITH    TestTable AS (SELECT * FROM dbo.TestCHAR),
            TopKeys AS (SELECT TOP (150) id FROM TestTable ORDER BY NEWID())
    SELECT  T1.id, T1.padding
    FROM    TestTable AS T1
    WHERE   T1.id = ANY (SELECT id FROM TopKeys)
    OPTION  (MAXDOP 1) ;

    WITH    TestTable AS (SELECT * FROM dbo.TestMAX),
            TopKeys AS (SELECT TOP (150) id FROM TestTable ORDER BY NEWID())
    SELECT  T1.id, T1.padding
    FROM    TestTable AS T1
    WHERE   T1.id IN (SELECT id FROM TopKeys)
    OPTION  (MAXDOP 1) ;

    WITH    TestTable AS (SELECT * FROM dbo.TestTEXT),
            TopKeys AS (SELECT TOP (150) id FROM TestTable ORDER BY NEWID())
    SELECT  T1.id, T1.padding
    FROM    TestTable AS T1
    WHERE   T1.id IN (SELECT id FROM TopKeys)
    OPTION  (MAXDOP 1) ;

    WITH    TestTable AS (SELECT * FROM dbo.TestMAXOOR),
            TopKeys AS (SELECT TOP (150) id FROM TestTable ORDER BY NEWID())
    SELECT  T1.id, T1.padding
    FROM    TestTable AS T1
    WHERE   T1.id IN (SELECT id FROM TopKeys)
    OPTION  (MAXDOP 1) ;

    SET STATISTICS IO OFF ;

All four queries now return results in much less than a second, with memory grants between 6 and 12MB, and without spilling to tempdb.  The small remaining inefficiency is in reading the id column values from the clustered primary key index.  As a clustered index, it contains all the in-row data at its leaf.  The CHAR and VARCHAR(MAX) tables store the padding column in-row, so id values are separated by a 3999-character column, plus row overhead.  The TEXT and MAXOOR tables store the padding values off-row, so id values in the clustered index leaf are separated by the much-smaller off-row pointer structure.  This difference is reflected in the number of logical page reads performed by the four queries:

    Table 'TestCHAR'   logical reads 25511  lob logical reads 0
    Table 'TestMAX'    logical reads 25511  lob logical reads 0
    Table 'TestTEXT'   logical reads 412    lob logical reads 597
    Table 'TestMAXOOR' logical reads 413    lob logical reads 446

We can increase the density of the id values by creating a separate nonclustered index on the id column only.  This is the same key as the clustered index, of course, but the nonclustered index will not include the rest of the in-row column data.
    CREATE UNIQUE NONCLUSTERED INDEX uq1 ON dbo.TestCHAR (id);
    CREATE UNIQUE NONCLUSTERED INDEX uq1 ON dbo.TestMAX (id);
    CREATE UNIQUE NONCLUSTERED INDEX uq1 ON dbo.TestTEXT (id);
    CREATE UNIQUE NONCLUSTERED INDEX uq1 ON dbo.TestMAXOOR (id);

The four queries can now use the very dense nonclustered index to quickly scan the id values, sort them by NEWID(), select the 150 ids we want, and then look up the padding data.  The logical reads with the new indexes in place are:

    Table 'TestCHAR'   logical reads 835  lob logical reads 0
    Table 'TestMAX'    logical reads 835  lob logical reads 0
    Table 'TestTEXT'   logical reads 686  lob logical reads 597
    Table 'TestMAXOOR' logical reads 686  lob logical reads 448

With the new index, all four queries use the same query plan.

Performance Summary:
- 0.3 seconds elapsed time
- 6MB memory grant
- 0MB tempdb usage
- 1MB sort set
- 835 logical reads (CHAR, MAX)
- 686 logical reads (TEXT, MAXOOR)
- 597 LOB logical reads (TEXT)
- 448 LOB logical reads (MAXOOR)
- No sort warning

I’ll leave it as an exercise for the reader to work out why trying to eliminate the Key Lookup by adding the padding column to the new nonclustered indexes would be a daft idea.

Conclusion

This post is not about tuning queries that access columns containing big strings.  It isn’t about the internal differences between TEXT and MAX data types either.  It isn’t even about the cool use of UPDATE .WRITE used in the MAXOOR table load.  No, this post is about something else:

Many developers might not have tuned our starting example query at all – 5 seconds isn’t that bad, and the original query plan looks reasonable at first glance.  Perhaps the NEWID() function would have been blamed for ‘just being slow’ – who knows.  5 seconds isn’t awful – unless your users expect sub-second responses – but using 250MB of memory and writing 200MB to tempdb certainly is!  If ten sessions ran that query at the same time in production that’s 2.5GB of memory usage and 2GB hitting tempdb.  Of course, not all queries can be rewritten to avoid large memory grants and sort spills using the key-lookup technique in this post, but that’s not the point either.

The point of this post is that a basic understanding of execution plans is not enough.  Tuning for logical reads and adding covering indexes is not enough.  If you want to produce high-quality, scalable TSQL that won’t get you paged as soon as it hits production, you need a deep understanding of execution plans, and as much accurate, deep knowledge about SQL Server as you can lay your hands on.  The advanced database developer has a wide range of tools to use in writing queries that perform well in a range of circumstances.

By the way, the examples in this post were written for SQL Server 2008.  They will run on 2005 and demonstrate the same principles, but you won’t get the same figures I did because 2005 had a rather nasty bug in the Top N Sort operator.  Fair warning: if you do decide to run the scripts on a 2005 instance (particularly the parallel query) do it before you head out for lunch…

This post is dedicated to the people of Christchurch, New Zealand.

© 2011 Paul White
email: @[email protected]
twitter: @SQL_Kiwi


  • Currency Conversion in Oracle BI applications

    - by Saurabh Verma
Authored by Vijay Aggarwal and Hichem Sellami

A typical data warehouse contains Star and/or Snowflake schemas, made up of Dimensions and Facts. The facts store various numerical information, including amounts; for example, Order Amount, Invoice Amount etc. With the truly global nature of business nowadays, end-users want to view reports in their own currency or in a global/common currency as defined by their business. This presents a unique opportunity in BI to provide the amounts at converted rates, either by pre-storing them or by doing on-the-fly conversions while displaying the reports to the users.

Source Systems

OBIA caters to various source systems like EBS, PSFT, Siebel, JDE, Fusion etc. Each source has its own unique and intricate ways of defining and storing currency data, doing currency conversions and presenting them to the OLTP users. For example, EBS stores conversion rates between currencies which can be classified by rate types, like Corporate rate, Spot rate, Period rate etc. Siebel stores exchange rates by rate types like Daily. EBS/Fusion store the conversion rates for each day, whereas PSFT/Siebel store them for a range of days. PSFT has a Rate Multiplication Factor and a Rate Division Factor and we need to calculate the rate based on them, whereas the other source systems store the Currency Exchange Rate directly.

OBIA Design

The data consolidation from various disparate source systems poses the challenge of conforming the various currencies, rate types, exchange rates etc., and of designing the best way to present the amounts to the users without affecting performance. When consolidating the data for reporting in OBIA, we have designed mechanisms in the Common Dimension to allow users to report based on their required currencies.

OBIA Facts store amounts in various currencies:

- Document Currency: the currency of the actual transaction. For a multinational company, this can be in various currencies.
- Local Currency: the base currency in which the accounting entries are recorded by the business. This is generally defined in the Ledger of the company.
- Global Currencies: OBIA provides five Global Currencies. Three are used across all modules; the last two are for CRM only. A Global currency is very useful when creating reports where the data is viewed enterprise-wide. For example, a US-based multinational would want to see the reports in USD, so the company will choose USD as one of the global currencies.

OBIA allows users to define up to five global currencies during the initial implementation. The term Currency Preference is used to designate the set of values: Document Currency, Local Currency, Global Currency 1, Global Currency 2, Global Currency 3; which are shared among all modules. There are four more currency preferences, specific to certain modules: Global Currency 4 (aka CRM Currency) and Global Currency 5, which are used in CRM; and Project Currency and Contract Currency, used in Project Analytics.

When choosing Local Currency as the Currency Preference, the data will show in the currency of the Ledger (or Business Unit) in the prompt, so it is important to select one Ledger or Business Unit when viewing data in Local Currency. More on this can be found in the section: Changing Currency Preferences in the Dashboard.

Design Logic

When extracting the fact data, the OOTB mappings extract and load the document amount and the local amount into the target tables. They also load the exchange rates required to convert the document amount into the corresponding global amounts.
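Conceptually, the conversion these mappings perform is just a multiplication of the stored amount by the appropriate stored rate. The fragment below is only an illustrative sketch; W_SAMPLE_F and its column names are placeholders invented for this example, not the shipped OBIA fact structures:

    -- Illustrative only: derive the local and first global amounts for a fact row,
    -- assuming the document amount and the relevant exchange rates were loaded
    -- onto the fact by the ETL mappings.
    SELECT  F.DOC_AMT,
            F.DOC_AMT * F.LOC_EXCHANGE_RATE     AS LOC_AMT,
            F.DOC_AMT * F.GLOBAL1_EXCHANGE_RATE AS GLOBAL1_AMT
    FROM    W_SAMPLE_F F;

The real mappings do this per amount column and per global currency, using the lookup logic described next.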
If the source system only provides the document amount in the transaction, the extract mapping does a lookup to get the Local currency code and the Local exchange rate. The Load mapping then uses the local currency code and rate to derive the local amount. The load mapping also fetches the Global Currencies and looks up the corresponding exchange rates.

The lookup of exchange rates is done via the Exchange Rate Dimension, provided as a Common/Conforming Dimension in OBIA. The Exchange Rate Dimension stores the exchange rates between various currencies for a date range and Rate Type. Two physical tables, W_EXCH_RATE_G and W_GLOBAL_EXCH_RATE_G, are used to provide the lookups and conversions between currencies. The data is loaded from the source system’s Ledger tables. W_EXCH_RATE_G stores the exchange rates between currencies with a date range. W_GLOBAL_EXCH_RATE_G, on the other hand, stores the currency conversions between the document currency and the pre-defined five Global Currencies for each day. Based on the requirements, the fact mappings can decide to use one or both tables to do the conversion.

Currency design in OBIA also taps into the MLS and Domain architecture, allowing users to map the currencies to a universal Domain at implementation time. This is especially important for companies deploying and using OBIA with multiple source adapters.

Some Gotchas to Look for

It is necessary to think through the currencies during the initial implementation.

1) Identify the various types of currencies that are used by your business. Understand what your Local (or Base) and Document currency will be. Identify the various global currencies in which your users will want to look at the reports; this will be based on the global nature of your business. Changes to these currencies later in the project, while permitted, may cause full data loads and hence lost time.

2) If you have a multi-source system, make sure that the Global Currencies and Global Rate Types chosen in Configuration Manager have the corresponding source-specific counterparts. In other words, make sure that for every DW-specific value chosen for Currency Code or Rate Type, there is a source Domain mapping already done.

Technical Section

This section briefly describes the technical scenarios employed in the OBIA adaptors to extract data from each source system. In OBIA, we have two main tables which store the Currency Rate information as explained in the previous sections: W_EXCH_RATE_G and W_GLOBAL_EXCH_RATE_G. W_EXCH_RATE_G stores all the Currency Conversions present in the source system; it captures data for a Date Range. W_GLOBAL_EXCH_RATE_G has Global Currency Conversions stored at a Daily level; the challenge here is to store all 5 Global Currency Exchange Rates in a single record for each From Currency. Let’s voyage further into the Source System Extraction logic for each of these tables and understand the flow briefly.

EBS

In EBS, currency data is stored in the GL_DAILY_RATES table. As the name indicates, the GL_DAILY_RATES EBS table has data at a daily level. However, in our warehouse we store the data with a Date Range and insert a new range record only when the Exchange Rate changes for a particular From Currency, To Currency and Rate Type. Below are the main logical steps that we employ in this process:

0. (Incremental flow only) Clean up the data in W_EXCH_RATE_G: delete the records which have Start Date > minimum conversion date, and update the End Date of the existing records.
1. Compress the daily data from the GL_DAILY_RATES table into Range Records. The Incremental map uses $$XRATE_UPD_NUM_DAY as an extra parameter.
2. Generate the Previous Rate, Previous Date and Next Date for each daily record from the OLTP.
3. Filter out the records whose Conversion Rate is the same as the Previous Rate, or whose Conversion Date lies within a single day range.
4. Mark the records as ‘Keep’ or ‘Filter’ and also get the final End Date for the single Range record (Unique Combination of From Date, To Date, Rate and Conversion Date).
5. Filter the records marked as ‘Filter’ in the INFA map.

The above steps load W_EXCH_RATE_GS. Step 0 updates/deletes W_EXCH_RATE_G directly. The SIL map will then insert/update the GS data into W_EXCH_RATE_G. These steps convert the daily records in GL_DAILY_RATES to Range records in W_EXCH_RATE_G.

We do not need such special logic for loading W_GLOBAL_EXCH_RATE_G. This is a table where we store data at a Daily granular level. However, we need to pivot the data, because data present in multiple rows in the source tables needs to be stored in different columns of the same row in the DW. We use GROUP BY and CASE logic to achieve this (a rough sketch of this pivot pattern appears later in this section).

Fusion

Fusion has extraction logic very similar to EBS. The only difference is that the Cleanup logic mentioned in step 0 above does not use the $$XRATE_UPD_NUM_DAY parameter. In Fusion we bring all the Exchange Rates in Incremental as well and do the cleanup; the SIL then takes care of Inserts/Updates accordingly.

PeopleSoft

PeopleSoft does not have From Date and To Date explicitly in the source tables. Let’s look at an example (note that this is achieved from PS1 onwards only):

    1 Jan 2010 – USD to INR – 45
    31 Jan 2010 – USD to INR – 46

PSFT stores records in the above fashion. This means that the Exchange Rate of 45 for USD to INR is applicable from 1 Jan 2010 to 30 Jan 2010, and we need to store the data in this fashion in the DW. Also, PSFT stores the Exchange Rate as RATE_MULT and RATE_DIV; we need to compute RATE_MULT/RATE_DIV to get the correct Exchange Rate.

We generate the From Date and To Date while extracting data from the source, and this has certain assumptions: if a record gets updated or inserted in the source, it will be extracted in incremental. Also, if this updated/inserted record falls between other dates, then we also extract the preceding and succeeding records (based on dates) of this record. This is required because we need to generate range records and we have 3 records whose ranges have changed. Taking the same example as above, if there is a new record which gets inserted on 15 Jan 2010, the new ranges are 1 Jan to 14 Jan, 15 Jan to 30 Jan, and 31 Jan to the next available date. Even though the 1 Jan and 31 Jan records have not changed, we will still extract them because their ranges are affected. Similar logic is used for Global Exchange Rate extraction: we create the Range records and get them into a temporary table, then join to the Day Dimension, create individual records, and pivot the data to get the 5 Global Exchange Rates for each From Currency, Date and Rate Type.

Siebel

Siebel Facts depend heavily on Global Exchange Rates, and almost none of them really use individual Exchange Rates. In other words, W_GLOBAL_EXCH_RATE_G is the main table used in Siebel from the PS1 release onwards.
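The pivot mentioned above, which turns one source row per target currency into a single warehouse row with five global rate columns, can be sketched in plain SQL along the following lines. The staging table and column names here are simplified placeholders, not the actual OBIA staging structures, and the five literal currency codes stand in for whatever five Global Currencies were configured:

    -- Illustrative GROUP BY / CASE pivot: one output row per From Currency,
    -- date and rate type, with the five configured global rates as columns.
    SELECT  FROM_CURCY,
            EXCH_DT,
            RATE_TYPE,
            MAX(CASE WHEN TO_CURCY = 'USD' THEN EXCH_RATE END) AS GLOBAL1_EXCH_RATE,
            MAX(CASE WHEN TO_CURCY = 'EUR' THEN EXCH_RATE END) AS GLOBAL2_EXCH_RATE,
            MAX(CASE WHEN TO_CURCY = 'GBP' THEN EXCH_RATE END) AS GLOBAL3_EXCH_RATE,
            MAX(CASE WHEN TO_CURCY = 'JPY' THEN EXCH_RATE END) AS GLOBAL4_EXCH_RATE,
            MAX(CASE WHEN TO_CURCY = 'AUD' THEN EXCH_RATE END) AS GLOBAL5_EXCH_RATE
    FROM    STG_DAILY_RATES
    GROUP BY FROM_CURCY, EXCH_DT, RATE_TYPE;

This also makes the edge case discussed later easy to see: if all five global currencies are configured with the same code, a source system with no self-conversion row (for example USD to USD) produces no output row at all for that currency, which is why a generated rate of 1 has to be added.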
As of January 2002, the Euro Triangulation method for converting between currencies belonging to EMU members is not needed for present and future currency exchanges. However, the method is still available in Siebel applications, as are the old currencies, so that historical data can be maintained accurately. The following description applies only to historical data needing conversion prior to the 2002 switch to the Euro for the EMU member countries.

If a country is a member of the European Monetary Union (EMU), you should convert its currency to other currencies through the Euro. This is called triangulation, and it is used whenever either currency being converted has EMU Triangulation checked. Due to this, there are multiple extraction flows in SEBL, i.e. EUR to EMU, EUR to NonEMU, EUR to DMC and so on. We load W_EXCH_RATE_G through multiple flows with these data; this has been kept the same as in previous versions of OBIA. W_GLOBAL_EXCH_RATE_G, being a new table, does not have such needs. However, SEBL does not have From Date and To Date columns in the source tables, similar to PSFT, so we use extraction logic similar to that explained in the PSFT section for SEBL as well.

What if all 5 Global Currencies configured are the same?

As mentioned in previous sections, from PS1 onwards we store Global Exchange Rates in the W_GLOBAL_EXCH_RATE_G table. The extraction logic for this table involves pivoting data from multiple rows into a single row with 5 Global Exchange Rates in 5 columns. As mentioned in previous sections, we use CASE and GROUP BY functions to achieve this. This approach poses a unique problem when all 5 Global Currencies chosen are the same. For example, if the user configures all 5 Global Currencies as ‘USD’, then the extract logic will not be able to generate a record for From Currency = USD. This is because not all source systems will have a USD->USD conversion record. We have _Generated mappings to take care of this case: we generate a record with Conversion Rate = 1 for such cases.

Reusable Lookups

Before PS1, we had a Mapplet for Currency Conversions. In PS1, we only have reusable Lookups: LKP_W_EXCH_RATE_G and LKP_W_GLOBAL_EXCH_RATE_G. These lookups have another layer of logic so that all the lookup conditions are met when they are used in various Fact Mappings. Any user who wants to do a lookup on W_EXCH_RATE_G or W_GLOBAL_EXCH_RATE_G must use these Lookups; a direct join or lookup on the tables might lead to wrong data being returned.

Changing Currency Preferences in the Dashboard

In the 796x series, all amount metrics in OBIA showed the Global1 amount, and the customer needed to change the metric definitions to show them in another Currency Preference. Project Analytics has supported currency preferences since the 7.9.6 release, though, and it published a Tech Note for customers of other modules to add toggling between currency preferences to the solution.

List of Currency Preferences

Starting from the 11.1.1.x release, the BI Platform added a new feature to support multiple currencies. The new session variable (PREFERRED_CURRENCY) is populated through a newly introduced currency prompt. This prompt can take its values from the xml file userpref_currencies_OBIA.xml, which is hosted in the BI Server installation folder, under <home>\instances\instance1\config\OracleBIPresentationServicesComponent\coreapplication_obips1\userpref_currencies.xml. This file contains the list of currency preferences, like “Local Currency”, “Global Currency 1”, … which customers can also rename to give them more meaningful business names. There are two options for showing the list of currency preferences to the user in the dashboard: Static and Dynamic.
In Static mode, all users will see the full list as in the user preference currencies file. In Dynamic mode, the list shown in the currency prompt drop-down is the result of a dynamic query specified in the same file. Customers can build some security into the rpd, so the list of currency preferences will be based on the user roles; BI Applications built a subject area, “Dynamic Currency Preference”, to run this query and give every user only the list of currency preferences required by his application roles.

Adding Currency to an Amount Field

When the user selects one of the items from the currency prompt, all the amounts on that page will show in the Currency corresponding to that preference. For example, if the user selects “Global Currency 1” from the prompt, all data will be shown in Global Currency 1 as specified in the Configuration Manager. If the user selects “Local Currency”, all amount fields will show in the Currency of the Business Unit selected in the BU filter of the same page. If there is no particular Business Unit selected in that filter, and the data selected by the query contains amounts in more than one currency (for example one BU has USD as its functional currency and another has EUR), then subtotals will not be available (USD and EUR amounts cannot be added in one field), and depending on the set-up (see the next paragraph), the user may receive an error.

There are two ways to add the Currency field to an amount metric:

- In the form of a currency code, like USD, EUR, … For this, the user needs to add the field “Apps Common Currency Code” to the report. This field is in every subject area, usually under the table “Currency Tag” or “Currency Code”.
- In the form of a currency symbol ($ for USD, € for EUR, …). For this, the user needs to format the amount metrics in the report as a currency column, by specifying the currency tag column in the Column Properties option of the Column Actions drop-down list. Typically this column should be the “BI Common Currency Code” available in every subject area. Select the Column Properties option in the Edit list of a metric; in the Data Format tab, select Custom as “Treat Number As”; then enter the following syntax under Custom Number Format: [$:currencyTagColumn=Subjectarea.table.column] where column is the “BI Common Currency Code” defined to take the currency code value based on the currency preference chosen by the user in the Currency preference prompt.
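As a concrete (and purely hypothetical) illustration of that syntax, for a subject area whose currency tag column sits under a “Currency Tag” presentation table, the custom format string might look something like the following; the subject area and table names are invented for the example, and the exact quoting conventions may differ by version:

    [$:currencyTagColumn="Financials - AP Transactions"."Currency Tag"."BI Common Currency Code"]

With this format applied, the same amount column renders with a $ or € symbol depending on the currency code returned for the preference selected in the currency prompt.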


  • CodePlex Daily Summary for Wednesday, November 02, 2011

    CodePlex Daily Summary for Wednesday, November 02, 2011Popular ReleasesWYGIWYS Driver: 1.1.2.0 release: Win32, x86, kernel mode driver.Reset IPv6: 1.0.0.0: First release. Only works for Spanish and English operating system editions.BExplorer (Better Explorer): Better Explorer 2.0.0.631 Alpha: Changelog: Added: Some new functions in ribbon Added: Possibility to choose displayed columns Added: Basic Search Fixed: Some bugs after navigation Fixed: Attempt to fix slow navigation and slow start Known issues: - BreadcrumbBar fails on some situations - Basic search not work quite well in some situations Please if anyone find bugs be kind and report them at the Issue Tracker! Thanks!DotNetNuke® Community Edition: 05.06.04: Major Highlights Fixed issue with upgrades on systems that had upgraded the Telerik library to 6.0.0 Fixed issue with Razor Host upgrade to 5.6.3 The logic for module administration checks contains incorrect logic in 1 place, opening the possibility of a user with edit permissions gaining access to functionality they should not have through a particularly crafted url Security FixesBrowsers support the ability to remember common strings such as usernames/addresses etc. Code was adde...Terminals: Version 2.0 - Beta 2 Release: The team has finally put the nail into the official release date for version 2.0. As bugs are winding down on the 2.0 Roadmap we decided to push out another build - the first 2.0 Beta build. Please take time to use and abuse this release. We left logging in place, and this is a debug build so be sure to submit your logs on each bug reported, and please do report all bugs! Check the source code page on the site, this beta includes all commits since (and including) the 90428 check-in back i...iTuner - The iTunes Companion: iTuner 1.4.4322: Added German (unverified, apologies if incorrect) Properly source invariant resources with correct resIDs Replaced obsolete lyric providers with working providers Fix Pseudolater to correctly morph every third char Fix null reference in CatalogBaseTrack Folder Changes: Track Folder Changes 1.0: Track Folder Changes 1.0 (binary)Windows Workflow Foundation on Codeplex: Microsoft.Activities v1.8.8: Microsoft.Activities Overview How do I install Microsoft.Activities? Updates in this release9318Technical Analysis Engine for .NET: Technical Analysis Engine 1.25: What's new in the 1.25 release (2011-11-01): - Added Williams %R indicator - Added Moving Average Envelopes indicatorBF3Rcon.NET: BF3Rcon 3.0: This release is targeted for RCON documentation based on R3. Everything should be beta stable, but it's alpha because I haven't been able to fully test it. When a stable release is ready, a proper changelog will be kept. Important Edit: There's one method that will keep this from working in Mono. GeneratePasswordHash uses void HashAlgorithm.Dispose(), which isn't in Mono. This will have to be changed to Clear() in the next release. If anyone needs a Mono version of this immediately, you can...BoxWorld: BoxWorld_2011.10.30: BoxWorld - 8.0.1110.30 This is the initial release of BoxWorld. I'd recommend downloading the installer as it contains the compiled code and everything all nicely contained. 
By default, you end up with this directory structure: C:\Program Files\ViperWorks\BoxWorld C:\Program Files\ViperWorks\BoxWorld\Data C:\Program Files\ViperWorks\BoxWorld\Interface C:\Program Files\ViperWorks\BoxWorld\Source In the root you have the compiled EXE files, one for the main release, one for the LITE release ...VidCoder: 1.2.1: Fixed a couple regressions: video encoder was blank in queue and crashes with the High Profile preset when opening the Settings window. Fixed problem with auto-update introduced in 1.2.0. If you have 1.2.0 you will need to update manually to get this.Koober: Koober - The Ebook Creator 0.2: The official release of Koober as Open source. Koober is a ebook creator for Windows, and Koob Reader is the reader.patterns & practices: Enterprise Library Contrib: Enterprise Library Contrib - 5.0 (Oct 2011): This release of Enterprise Library Contrib is based on the Microsoft patterns & practices Enterprise Library 5.0 core and contains the following: Common extensionsTypeConfigurationElement<T> - A Polymorphic Configuration Element without having to be part of a PolymorphicConfigurationElementCollection. AnonymousConfigurationElement - A Configuration element that can be uniquely identified without having to define its name explicitly. Data Access Application Block extensionsMySql Provider - ...Network Monitor Open Source Parsers: Network Monitor Parsers 3.4.2748: The Network Monitor Parsers packages contain parsers for more than 400 network protocols, including RFC based public protocols and protocols for Microsoft products defined in the Microsoft Open Specifications for Windows and SQL Server. NetworkMonitor_Parsers.msi is the base parser package which defines parsers for commonly used public protocols and protocols for Microsoft Windows. In this release, NetowrkMonitor_Parsers.msi continues to improve quality and fix bugs. It has included the fo...Media Companion: MC 3.420b Weekly: Ensure .NET 4.0 Full Framework is installed. (Available from http://www.microsoft.com/download/en/details.aspx?id=17718) Ensure the NFO ID fix is applied when transitioning from versions prior to 3.416b. (Details here) Movies Fixed: Fanart and poster scraping issues TV Shows (Re)Added: Rebuild single show Fixed: Issue when shows are moved from original location Ability to handle " for actor nicknames Crash when episode name contains "<" (does not scrape yet) Clears fanart when switch...patterns & practices - Unity: Unity 3.0 for .NET4.5 Preview: The Unity 3.0.1026.0 Preview enables Unity to work on .NET 4.5 with both the WinRT and desktop profiles. The major changes include: Unity projects updated to target .NET 4.5. Dynamic build plans modified to use compiled lambda expressions instead of Reflection.Emit Converting reflection to use the new TypeInfo for reflection. Projects updated to work with the Microsoft Visual Studio 2011 Preview Notes/Known Issues: The Microsoft.Practices.Unity.UnityServiceLocator class cannot be use...Managed Extensibility Framework: MEF 2 Preview 4: Detailed information on this release is available on the BCL team blog.AcDown????? - Anime&Comic Downloader: AcDown????? v3.6: ?? ● AcDown??????????、??????,??????????????????????,???????Acfun、Bilibili、???、???、???、Tucao.cc、SF???、?????80????,???????????、?????????。 ● AcDown???????????????????????????,???,???????????????????。 ● AcDown???????C#??,????.NET Framework 2.0??。?????"Acfun?????"。 ????32??64? Windows XP/Vista/7 ????????????? 
??:????????Windows XP???,?????????.NET Framework 2.0???(x86)?.NET Framework 2.0???(x64),?????"?????????"??? ??????????????,??????????: ??"AcDown?????"????????? ?? v3.6?? ??“????”...Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.33: Add JSParser.ParseExpression method to parse JavaScript expressions rather than source-elements. Add -strict switch (CodeSettings.StrictMode) to force input code to ECMA5 Strict-mode (extra error-checking, "use strict" at top). Fixed bug when MinifyCode setting was set to false but RemoveUnneededCode was left it's default value of true.New ProjectsAlgorithms: algorithms, ACM C++ programs, Electronic design automation algorithmsAll-In-One: Ein Programm - Alle Funktionen Einfach Komponente erstellen - und schon gibt es eine Funktion mehr!Bar Code File Copy: Bar Code File Copy is a mobile app that allows user to initiate file transfer from a PC to another with the help of barcodes and Windows Phone 7.BlekBoox: Google Books API CF-Soft: this is my workspace to study C# programming.CodePaste.NET Tray Application: A simple tray application to create snippets on CodePaste.NET. Right click the app and select New Snippet. A snippet will be created, the contents of your clipboard will be pasted and your clipboard will be replaced with the URL after the code snippet is successfully posted.Common libs of fanthos: Fanthos.Libs.Controls provides panel and button with special draw and special events. Fanthos.Libs.Network provides class that you can use the method likes Async WCF in dotNet 2.0.Database Filesystem: The DBFS project implements a filesystem paradigm to the storage of objects in a database. The primary development language is Visual Basic .NET.Destructible Texture2D: The project shows how to create a destructible texture2D in XNA. This project is developed using the XNA framework and C# programming languageEject and Close CD Tray: For some reason i cannot seem to find any command line program to eject or close the cd tray. So i've created my own version and sharing it for everyone to use. It is an extremely simplistic program with the sole aim of ejecting or closing the cd tray and nothing else.Excel add-in for pricing options: Price options using various distributions.FDFToolkit .NET: FDFToolkit .NET makes it easier to read, write, create, populate, merge FDF, XDP, XFDF, and XML data with Acrobat and LiveCycle PDF forms. FDFToolkit.net utilizes iTextSharp 4.x technologies, under MPL license.Fiction Catalog: A catalog project designed to store information about fictional literature.Gangbee Software: This is customized version of Nopcommerce 1.9Guess Number: This simple program is letting you to guess number from 1 to 20. H? th?ng Khách s?n: H? th?ng qu?n lí khách s?nHelloTipi Photo Downloader: Permet de télécharger les photos en qualité original de votre site de famille [url:HelloTipi|http://www.hellotipi.com].Image Tagger & Resizer: Resize, and text in the lower right of picture with i.e. copyright information.Kinect Sensor Controller: A project using kinect sensor to control objectsKJFramework.Dynamic ??????: KJFramework.Dynamic ???????? KJFramework????????????????????,?????????: KJFramework.Dynamic(??????). ???????????????????????,???????????????,?????????, ????????????????????,????,???????????????,??????????(??,???????)。 ????,?????????????????????(Component), ??????DynamicDomainComponent??。 ??,????????????,???????????DynamicDomainService??。 ??????,???????????? 
???????DynamicDomainService。 ??DynamicDomainService??,????????????????。 ?????, ????????????(Tunnel)???,??????(Tunnel)...Lan ETS Player Registration Software: This is the custom software that the Lan ETS staff uses to register people when they arrive at the LAN party.LINQ to Calil: LINQ to Calil ????? (http://calil.jp/) ???????? API ? LINQ ????????。???????????? LINQ ???????????????。.NET Framework 4 ???????、Silverlight ??????????。 LINQ to Calil ??????????????????????????????????????。Login Form in VB: <project name> makes it easier for <target user group> to <activity>. You'll no longer have to <activity>. It's developed in <programming language>.MbpGpsTester: MbpGpsTester is a gps test tool.Mi proyecto: Mi proyectoMy NuGET Packages: This project contains a custom program that allows you to quickly update NuGET packages from multiple sources and package feeds. It also contains set-up of some packages that are used by HydroDesktop.MyyoutubeforWp7: MyyoutubeforWp7 is a youtube application where users can search and view the videoNERO'S TOOLS: NEROS TOOLS will be an area of programming tools and resources for the automation control industry and the everyday usage of major control systems software.NetWebScript: NetWebScript is a IL to JavaScript compiler. It allow to write complex web application in C# or any other .Net language. It provide Visual Studio integration to debug into orginial sources. It allow to share code with server.Northern Lights WP7 Toolkit: Northern Lights WP7 Toolkit contains some common tools for WP7 Developers.SccmBoundaryHealthTool: .NET application that will search the AD forest for SCCM boundary configurations and report the following: * Overlapping boundaries, including those inter-boundary type, e.g. IP subnet overlaps with AD site. * Unassigned AD sites * Orphaned AD sites in SCCMSeven: Seven is a set of tools for programs development using the Oberon 07 language. SOAP Test Application (with EWS Tools): Application that allows testing of SOAP implementations, in particular Exchange Web Services.SocialCastDotNet: C# library for accessing the SocialCast APISosyal: Sosyal as.sqlce_perfDemo: just a bit of code to demonstrate sqlce insert performance on the desktop. I'm using sqlce on the desktop to persist data as its the fastest thing I can find. Insert perfromance seems to top out at around 65,000records/sec on my laptop but it also seems to limited at this point by my hard drive speeds. The sample code should be able to accept a dataTable, dataReader or dataset and persit to a sqlce database. Also in the code is a small wrapper on the sqlce connectionstring (localconne...SqlMon for OracleClient: This is a project to doing a tool just like Quest SQL Monitor that capture sql sentence in client side.So it's so cool that juect join in and user it. It's developed in C# and Visual Studio 2010 and bases on the EasyHook that is a powerfull .NET hook library.It injects in oci.dll and hook the "OCIStmtPrepare" function which excute before sql excuting,through the parameter "stms" we can get the sqltext and display itStackOverflow Offline Reader using a pdf reader: Using one or more keywords and number of questions to download, the app creates and displays a single pdf file of the questions and answers pertaining to the keyword(s). This enables the user to read questions in an offline manner or to read in a top/down fashion. 
std::streambuf wrapper for COM IStream: This provides a subclass of std::streambuf that wraps a COM IStream, so you can use an IStream with any C++ code that uses iostreams or the STL algorithms with a streambuf_iterator. It does no internal buffering whatsoever (I'd be happy to change this if someone needs it).step out ring signatures .net: Simple program to demonstrate how srs works in real world of .net.System Center Service Manager 2010 SP1 WCF and Web Service: Simple WCF Service and Web Service for SCSM 2010 SP1.System32 - SDK de componentes y acceso a datos para .NET: System32 - SDK de componentes y acceso a datos para .NET Taha Mail: A Java Web Based MailTest TDS: Test TFS - 1Track Folder Changes: Displays in real time a tree with the list of created/deleted/changed files in a specific directory and its subdirectoriesumbTrafficCentral: A plugin for Umbraco that adds robust and scalable handling of redirects, storing the data in the media library. WebDAV Test Application: Application to test WebDAV functionality.XnaXaml: XnaXaml allows you to add a XAML file to your Content Pipeline in XNA and edit straight from Visual Studio. It then takes the XAML file and parses through it to create mock objects that can be used to render controls to the screen in XNA. These mock objects can also be used to pass data to any GUI system you choose.???-?????????? ?? ????????????? ????????? ??????????? ??????? ???: Under construction


  • CodePlex Daily Summary for Monday, November 14, 2011

    CodePlex Daily Summary for Monday, November 14, 2011Popular ReleasesWeapsy: 0.4.1 Alpha: Edit Text bug fixedDesktop Google Reader: 1.4.2: This release remove the like and the broadcast buttons as Google Reader stopped supporting them (no, we don't like this decission...) Additionally and to have at least a small plus: the login window now automaitcally logs you in if you stored username and passwort (no more extra click needed) Finally added WebKit .NET to the about window and removed AwesomiumZombsquare: Solución inicial: Código fuente de la solución. Versión 7099 Tambien puedes descargar de aquí los snippets de código que utilizamos en la demostración.RDRemote: Remote Desktop remote configurator V 1.0.0: Remote Desktop remote configurator V 1.0.0SQL Monitor - tracking sql server activities: SQLMon 4.1 alpha1: 1. improved version compare, now support comparing two text files. right click on object script text box and choose "Compare" or create new query window and right click and choose "Compare" 2. improved version compare, now automatically sync two text boxes. 3. fixed problem with activities (process/job) when refreshing while current activities have less count than previous one. 4. better start up by automatically shows create connection window when there is no connection defined.Rawr: Rawr 4.2.7: This is the Downloadable WPF version of Rawr!For web-based version see http://elitistjerks.com/rawr.php You can find the version notes at: http://rawr.codeplex.com/wikipage?title=VersionNotes Rawr AddonWe now have a Rawr Official Addon for in-game exporting and importing of character data hosted on Curse. The Addon does not perform calculations like Rawr, it simply shows your exported Rawr data in wow tooltips and lets you export your character to Rawr (including bag and bank items) like Char...VidCoder: 1.2.2: Updated Handbrake core to svn 4344. Fixed the 6-channel discrete mixdown option not appearing for AAC encoders. Added handling for possible exceptions when copying to the clipboard, added retries and message when it fails. Fixed issue with audio bitrate UI not appearing sometimes when switching audio encoders. Added extra checks to protect against reported crashes. Added code to upgrade encoding profiles on old queued items.Dynamic PagedCollection (Silverlight / WPF Pagination): PagedCollection: All classes which facilitate your dynamic pagination in Silverlight or WPF !Media Companion: MC 3.422b Weekly: Ensure .NET 4.0 Full Framework is installed. (Available from http://www.microsoft.com/download/en/details.aspx?id=17718) Ensure the NFO ID fix is applied when transitioning from versions prior to 3.416b. (Details here) TV Show Resolutions... Made the TV Shows folder list sorted. Re-visibled 'Manually Add Path' in Root Folders. Sorted list to process during new tv episode search Rebuild Movies now processes thru folders alphabetically Fix for issue #208 - Display Missing Episodes is not popu...DotSpatial: DotSpatial Release Candidate 1 (1.0.823): Supports loading extensions using System.ComponentModel.Composition. DemoMap compiled as x86 so that GDAL runs on x64 machines. How to: Use an Assembly from the WebBe aware that your browser may add an identifier to downloaded files which results in "blocked" dll files. You can follow the following link to learn how to "Unblock" files. Right click on the zip file before unzipping, choose properties, go to the general tab and click the unblock button. 
http://msdn.microsoft.com/en-us/library...XPath Visualizer: XPathVisualizer v1.3 Latest: This is v1.3.0.6 of XpathVisualizer. This is an update release for v1.3. These workitems have been fixed since v1.3.0.5: 7429 7432 7427MSBuild Extension Pack: November 2011: Release Blog Post The MSBuild Extension Pack November 2011 release provides a collection of over 415 MSBuild tasks. A high level summary of what the tasks currently cover includes the following: System Items: Active Directory, Certificates, COM+, Console, Date and Time, Drives, Environment Variables, Event Logs, Files and Folders, FTP, GAC, Network, Performance Counters, Registry, Services, Sound Code: Assemblies, AsyncExec, CAB Files, Code Signing, DynamicExecute, File Detokenisation, GU...CODE Framework: 4.0.11110.0: Various minor fixes and tweaks.Extensions for Reactive Extensions (Rxx): Rxx 1.2: What's NewRelated Work Items Please read the latest release notes for details about what's new. Content SummaryRxx provides the following features. See the Documentation for details. Many IObservable<T> extension methods and IEnumerable<T> extension methods. Many useful types such as ViewModel, CommandSubject, ListSubject, DictionarySubject, ObservableDynamicObject, Either<TLeft, TRight>, Maybe<T> and others. Various interactive labs that illustrate the runtime behavior of the extensio...Player Framework by Microsoft: HTML5 Player Framework 1.0: Additional DownloadsHTML5 Player Framework Examples - This is a set of examples showing how to setup and initialize the HTML5 Player Framework. This includes examples of how to use the Player Framework with both the HTML5 video tag and Silverlight player. Note: Be sure to unblock the zip file before using. Note: In order to test Silverlight fallback in the included sample app, you need to run the html and xap files over http (e.g. over localhost). Silverlight Players - Visit the Silverlig...MapWindow 4: MapWindow GIS v4.8.6 - Final release - 64Bit: What’s New in 4.8.6 (Final release)A few minor issues have been fixed What’s New in 4.8.5 (Beta release)Assign projection tool. (Sergei Leschinsky) Projection dialects. (Sergei Leschinsky) Projections database converted to SQLite format. (Sergei Leschinsky) Basic code for database support - will be developed further (ShapefileDataClient class, IDataProvider interface). (Sergei Leschinsky) 'Export shapefile to database' tool. (Sergei Leschinsky) Made the GEOS library static. geos.dl...Facebook C# SDK: v5.3.2: This is a RTW release which adds new features and bug fixes to v5.2.1. Query/QueryAsync methods uses graph api instead of legacy rest api. removed dependency from Code Contracts enabled Task Parallel Support in .NET 4.0+ (experimental) added support for early preview for .NET 4.5 (binaries not distributed in codeplex nor nuget.org, will need to manually build from Facebook-Net45.sln) added additional method overloads for .NET 4.5 to support IProgress<T> for upload progress added ne...Delete Inactive TS Ports: List and delete the Inactive TS Ports: UPDATEAdded support for windows 2003 servers and removed some null reference errors when the registry key was not present List and delete the Inactive TS Ports - The InactiveTSPortList.EXE accepts command line arguments The InactiveTSPortList.Standalone.WithoutPrompt.exe runs as a standalone exe without the need for any command line arguments.ClosedXML - The easy way to OpenXML: ClosedXML 0.60.0: Added almost full support for auto filters (missing custom date filters). 
See examples Filter Values, Custom Filters Fixed issues 7016, 7391, 7388, 7389, 7198, 7196, 7194, 7186, 7067, 7115, 7144Microsoft Research Boogie: Nightly builds: This download category contains automatically released nightly builds, reflecting the current state of Boogie's development. We try to make sure each nightly build passes the test suite. If you suspect that was not the case, please try the previous nightly build to see if that really is the problem. Also, please see the installation instructions.New ProjectsAwpAdmin: AwpAdmin (name tentative) is a powerful BF3 admin tool. This admin tool is being designed to be easy to setup and maintain while having a great deal of customizability and power. This is written in C# and is being designed to be Mono-compatible.ChainReaction.Net: Extension library that aims to allow method chains to be attached to code statements, allowing them to be read more fluently , allowing extra logic to effectively be bolted on in a fluid wayCodeigniter SQL Azure/SQL Server Unicode supported driver.: Codeigniter?SQL Server 2005/2008 SQLAzure????????????。 NPrefix???????、Unicode???(??????)?????????????。 Active record / ???????????、N?????????????????????????。 ???????????????、N???????????????????、?????????N???????????????????。 ????????????????。 See http://msdn.microsoft.com/ja-jp/library/ms191313.aspx ??、????????????????????。 (Codeplex?Apache????????????、Ellislab license????????。) --- SQL Server 2005 / 2008 / SQL Azure driver class for Codeigniter. this driver N prefix uni...EZ-NFC: EZ-NFC is a .NET library, written in C#, aimed at simplifying the use of NFC in applications.Farigola: A library to organize and manage dynamic data. It's developed in C#/.NET language. Forca_adrikei: Forca implementada para o curso de C# da ufscar sorocaba.GemTD: Gem Tower Defense starcraft 2 map simulator implemented via C# and XNA. Components needed: Microsoft .NET Framework 4 Microsoft XNAGSISWebServiceMVC: This is a sample of an MVC application using the Greek GSIS Web Service at http://www.gsis.gr/wsnp.html (in Greek).Ini4Net: Ini4Net is a simple INI class for parsing INI files in your application. There are many INI solutions available around but non of them met my simplicity so I rolled-my-own. I have been using this since 2008 in several applications that are being used in the enterprise.Jogo da Memória: Projeto de C# - Jogo da memóriaJson DataContract Builder - Create JsonAPI SDK from samples & xmls: Yeah, you can access json with dynamic & Json.Net. But why can't we have the old static way? Is there no one miss the happiness of working with intellisense? There must be a easy way.K-Vizinhos: K-VizinhosLie to Me Windows Phone 7 App: Lie to me - application on WindowsPhone7 platform for testing face expressions, basend on popular serial "Lie To Me".Localization Project: Localization project is C# library to simplify localizing .NET applications and websites. Primary purpose of this project is support instant language switching on the fly.Luminji.wp: luminji's melearning windows phone soft.mergemania - pdf merging .net library based on iTextSharp: Merge PDF documents from different source documents into several destination documents. Set the page ranges to merge from and the page ranges to merge into. Everything is configured via an single XML file. 
You access all elements through strongly typed classes generated from XSD.Metro Pandora: Metro Pandora aims to ship a Pandora SDK and apps for XAML .net platforms.MiniState: MiniState is an attempt to provide simple abstraction layer to reading and writing state information like HTTP cookies to minimise cookie size and increase the quality of code and security.Mocklet: Mocklet is a suite of PowerShell cmdlets designed to help system administrator generate sample or mock data for testing or building test environments.nethelper: silverlight extend libraryNeverForget: NeverForget is a simple KB projectOpenCV examples: Sample project for interprocess image sharing. Using OpenCV and Boost. Server : Capture image from webcam and write image to shared memory region. Client : Read image from shared memory and imshow the image.Portable Class Libraries Contrib: Portable Class Libraries Contrib provides portable adapters and APIs that help bridge the gap between different platforms when using the new Portable Class Library feature. This makes it easier to convert existing platform-specific libraries over to use portable APIs.sbfa: sbfaShortcut Manager: Shortcut Manager (SM) is solution for everyone who is looking for creating keyboard shortcuts in .NET Winforms applications. SM uses Win32 API to create keyboard hook and fires supplied handler after required shortcut is pressed.SIGEMdispro: Proyecto de un curso de la universidadSimpleMvcCaptcha: Captcha HtmlHelper for ASP.NET MVC 3 with simple ariphmetic expression. No WCF required, neither any other communications. Written in C#.Traveling salesman problem solver using google maps: This application provides a solution for the traveling salesman problem using Google maps, developed in C# and ASP.net.WebPALTT: A Web performance and load test tool for testing web sites / applications. Features include easy to use scenario builder and powerful scripting for high customisability. Developed in .Net C#.WriteMyName: Código para escrever o nome do autor no começo de código fonte.zenSQLcompare: Compare SQL


  • CodePlex Daily Summary for Tuesday, June 12, 2012

    CodePlex Daily Summary for Tuesday, June 12, 2012Popular ReleasesWeapsy - ASP.NET MVC CMS: 1.0 RC: - Upgrade to Entity Framework 4.3.1 - Added AutoMapper custom version (by nopCommerce Team) - Added missed model properties and localization resources of Plugin Definitions - Minor changes - Fixed some bugsSolution Extender for Microsoft Dynamics CRM 2011: Solution Extender (1.0.0.0): Initial releaseQTP FT Uninstaller: QTP FT Uninstaller v3: - KnowledgeInbox has made this tool open source - Converted to C# - Better scanning & cleaning mechanismWebSocket4Net: WebSocket4Net 0.7: Changes included in this release: updated ClientEngine added proper exception handling code added state support for callback added property AllowUnstrustedCertificate for JsonWebSocket added properties for sending ping automatically improved JsBridge fixed a uri compatibility issueXenta Framework - extensible enterprise n-tier application framework: Xenta Framework 1.8.0 Beta: Catalog and Publication reviews and ratings Store language packs in data base Improve reporting system Improve Import/Export system A lot of WebAdmin app UI improvements Initial implementation of the WebForum app DB indexes Improve and simplify architecture Less abstractions Modernize architecture Improve, simplify and unify API Simplify and improve testing A lot of new unit tests Codebase refactoring and ReSharpering Utilize Castle Windsor Utilize NHibernate ORM ...Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.55: Properly handle IE extension to CSS3 grammar that allows for multiple parameters to functional pseudo-class selectors. add new switch -braces:(new|same) that affects where opening braces are placed in multi-line output. The default, "new" puts them on their own new line; "same" outputs them at the end of the previous line. add new optional values to the -inline switch: -inline:(force|noforce), which can be combined with the existing boolean value via comma-separators; value "force" (which...Microsoft Media Platform: Player Framework: MMP Player Framework 2.7 (Silverlight and WP7): Additional DownloadsSMFv2.7 Full Installer (MSI) - This will install everything you need in order to develop your own SMF player application, including the IIS Smooth Streaming Client. It only includes the assemblies. If you want the source code please follow the link above. Smooth Streaming Sample Player - This is a pre-built player that includes support for IIS Smooth Streaming. You can configure the player to playback your content by simplying editing a configuration file - no need to co...Liberty: v3.2.1.0 Release 10th June 2012: Change Log -Added -Liberty is now digitally signed! If the certificate on Liberty.exe is missing, invalid, or does not state that it was developed by "Xbox Chaos, Open Source Developer," your copy of Liberty may have been altered in some (possibly malicious) way. -Reach Mass biped max health and shield changer -Fixed -H3/ODST Fixed all of the glitches that users kept reporting (also reverted the changes made in 3.2.0.2) -Reach Made some tag names clearer and more consistent between m...SHA-1 Hash Checker: SHA-1 Hash Checker (for Windows): Fixed major bugs. Removed false negatives.AutoUpdaterdotNET: AutoUpdater.NET 1.0: Everything seems perfect if you find any problem you can report to http://www.rbsoft.org/contact.htmlMedia Companion: Media Companion 3.503b: It has been a while, so it's about time we release another build! 
Major effort has been for fixing trailer downloads, plus a little bit of work for episode guide tag in TV show NFOs.Microsoft SQL Server Product Samples: Database: AdventureWorks Sample Reports 2008 R2: AdventureWorks Sample Reports 2008 R2.zip contains several reports include Sales Reason Comparisons SQL2008R2.rdl which uses Adventure Works DW 2008R2 as a data source reference. For more information, go to Sales Reason Comparisons report.Json.NET: Json.NET 4.5 Release 7: Fix - Fixed Metro build to pass Windows Application Certification Kit on Windows 8 Release Preview Fix - Fixed Metro build error caused by an anonymous type Fix - Fixed ItemConverter not being used when serializing dictionaries Fix - Fixed an incorrect object being passed to the Error event when serializing dictionaries Fix - Fixed decimal properties not being correctly ignored with DefaultValueHandlingLINQ Extensions Library: 1.0.3.0: New to release 1.0.3.0:Combinatronics: Combinations (unique) Combinations (with repetition) Permutations (unique) Permutations (with repetition) Convert jagged arrays to fixed multidimensional arrays Convert fixed multidimensional arrays to jagged arrays ElementAtMax ElementAtMin ElementAtAverage New set of array extension (1.0.2.8):Rotate Flip Resize (maintaing data) Split Fuse Replace Append and Prepend extensions (1.0.2.7) IndexOf extensions (1.0.2.7) Ne...????????API for .Net SDK: SDK for .Net ??? Release 1: 6?12????? ??? - ?????.net2.0?.net4.0????Winform?Web???????。 NET2 - .net 2.0?Winform?Web???? NET4 - .net 4.0?Winform?Web???? Libs - SDK ???,??2.0?4.0?SDK??? 6?11????? ??? - ?Entities???????????EntityBase,???ToString()???????json???,??????4.0???????。2.0?3.5???! ??? - Request????????AccessToken??????source=appkey?????。????,????????,???????public_timeline?????????。 ?? - ???ClinetLogin??????????RefreshToken???????false???。 ?? - ???RepostTimeline????Statuses???null???。 ?? - Utility?Bui...Audio Pitch & Shift: Audio Pitch And Shift 4.5.0: Added Instruments tab for modules Open folder content feature Some bug fixesPython Tools for Visual Studio: 1.5 Beta 1: We’re pleased to announce the release of Python Tools for Visual Studio 1.5 Beta. Python Tools for Visual Studio (PTVS) is an open-source plug-in for Visual Studio which supports programming with the Python language. PTVS supports a broad range of features including: • Supports CPython, IronPython, Jython and PyPy • Python editor with advanced member, signature intellisense and refactoring • Code navigation: “Find all refs”, goto definition, and object browser • Local and remote debugging •...Circuit Diagram: Circuit Diagram 2.0 Beta 1: New in this release: Automatically flip components when placing Delete components using keyboard delete key Resize document Document properties window Print document Recent files list Confirm when exiting with unsaved changes Thumbnail previews in Windows Explorer for CDDX files Show shortcut keys in toolbox Highlight selected item in toolbox Zoom using mouse scroll wheel while holding down ctrl key Plugin support for: Custom export formats Custom import formats Open...Umbraco CMS: Umbraco CMS 5.2 Beta: The future of Umbracov5 represents the future architecture of Umbraco, so please be aware that while it's technically superior to v4 it's not yet on a par feature or performance-wise. What's new? For full details see our http://progress.umbraco.org task tracking page showing all items complete for 5.2. 
In a nutshellPackage Builder Starter Kits Dynamic Extension Methods Querying / IsHelpers Friendly alt template URLs Localization Various bug fixes / performance enhancements Gett...JayData - The cross-platform HTML5 data-management library for JavaScript: JayData 1.0.5: JayData is a unified data access library for JavaScript developers to query and update data from different sources like WebSQL, IndexedDB, OData, Facebook or YQL. See it in action in this 6 minutes video New features in JayData 1.0.5http://jaydata.org/blog/jaydata-1.0.5-is-here-with-authentication-support-and-more http://jaydata.org/blog/release-notes Sencha Touch 2 module (read-only)This module can be used to bind data retrieved by JayData to Sencha Touch 2 generated user interface. (exam...New ProjectsAFS.Aspnet.Collab.Site: alles umfassender testAFS.FallBackService: Afs.ultimate.fallbackserviceAIMS: This project aims at creating an mvc-based application for NGO portfolio management.Card Douglas: Project for card discountCrashReporter.NET: Send crash reports for c# or vb.net applications directly to your mail's inbox Daphne Translator - localizing tool for Daphne: This project was created as a helper project for localization Daphne checkers software [url:http://daphne.codeplex.com]. It can be used by other developers to do translations of resources. Daphne Translator is done to be extensible by any resource type of localized texts. ExcelFileEditor: This project is a free excel file editor class library . ( Support " .xls and .xlsx " )Finegypt: Fin EgyptHuish: Huish MarineINNO DNN Open Registration: The INNO DNN Registration module collection allows for customized DotNetNuke registration, payment processing integration, profile management, transaction history and a powerful contact us form. It is built to be responsive to allow registration on mobile devices such as Android phones and tablets, iPad, iPhone and more. Key focus is on extensibility including options to hook your own custom DotNetNuke modules into it for consistent, easy registration of users.jCardView: jCardViewLinkCard: D? án VDOS - TopLinkLINQ Expression Projection: Project Description LINQ Expression Projection library provides tool that enable using lambda expressions in LINQ projections (to anonymous and predefined types) even if the lambda expression is not defined in the query but rather is stored in a variable or retrieved via a function or property. Goal This library is a tool for development of applications following the DRY principle. It is used to facilitate reuse LINQ expressions.MD5 Message-Digest Algorithm for Microsoft® .NET Micro Framework: MD5 Message-Digest Algorithm for Microsoft® .NET Micro FrameworkMoney Class for C#. Fast, light and flexible: This Money class gives you the ability to work with money of multiple currencies as if it were built-in types. 
It looks and behaves like a decimal with extra features, but can perform much faster.NCT-Medical Record System: I've created this project for learning purposes onlyPaiPaiAPI: ??SDKParadise Shop: sâsâsaPingyThingy: PingyThingy is a basic application that will allow you to ping multiple hosts.SCK - Spine Model Creation Kit: The SCK is a software package for generating FE spine models from a set of anatomical parameters.ScoreMS(????????): The summary is required.SiteCoreDynamicForm: This project's aim is to provide a simple solution in SiteCore for creating dynamic formsSmall Media Library (for Games and other 3D rendering applications): A .NET library that utilises DirectShow to render video into a block of memory that can be easily written to surfaces managed by rendering APIs such as DirectX and OpenGL, or used in any other way the host application requires.SportStore: The project has only training purpose. It is an ecommerce web application to sell sport products on line.Thesis Management System: This is a Thesis Management System(TMIS),Based on a three-tier structure,This is a open source project.There are many question and bugs ,I will repair by other time.This PowerShell script will output details of your default organization.: This is a PowerShell script. You can save this PowerShell file 'DefaultCRMOrganization.ps1' to your server. Microsoft Dynamics CRM administrators and professionals can have a quick browse through the details for their default organization by simply running this file. Note: Sometimes when you run this file, for first time, it might open in Notepad. In order to open this file with a PowerShell application, associate this file (with extension ps1) with C:\Windows\System32\WindowsPowerShell...Tmib Dreampad: Dream Pad is a smart text editor by Tmib Inc. Tags - Notepad , Texteditor , Note Pad , Dream Pad , Wordpad , Word Pad, TmibUAIFAI: Trabalho de EDUmbraco Dynamic Form: This project's aim is to provide a simple solution for creating dynamic forms in UmbracoVDOS - Top Link: H? th?ng tích di?m LinkCardVoteChecker: Vote Checker allows a canvasser to identify a valid Ballot by checking if the corresponding barcode is known to the election committee.WebRozklady: sWindows Application Monitoring and Administration Tool: This projects aims at developing client server architecture based windows application monitoring, administring and deployment. Also Most of the real life applications today are either windows services or Websites. WINADM suite aims at monitoring these and providing the user a graphical interface for interaction with these services.WPF 3 D: How to study wpf

    Read the article

  • Render To Texture Using OpenGL is not working but normal rendering works just fine

    - by Franky Rivera
    things I initialize at the beginning of the program I realize not all of these pertain to my issue I just copy and pasted what I had //overall initialized //things openGL related I initialize earlier on in the project glClearColor( 0.0f, 0.0f, 0.0f, 1.0f ); glClearDepth( 1.0f ); glEnable(GL_ALPHA_TEST); glEnable( GL_STENCIL_TEST ); glEnable(GL_DEPTH_TEST); glDepthFunc( GL_LEQUAL ); glEnable(GL_CULL_FACE); glFrontFace( GL_CCW ); glEnable(GL_COLOR_MATERIAL); glEnable(GL_BLEND); glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); glHint( GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST ); //we also initialize our shader programs //(i added some shader program functions for definitions) //this enum list is else where in code //i figured it would help show you guys more about my //shader compile creation function right under this enum list VVVVVV /*enum eSHADER_ATTRIB_LOCATION { VERTEX_ATTRIB = 0, NORMAL_ATTRIB = 2, COLOR_ATTRIB, COLOR2_ATTRIB, FOG_COORD, TEXTURE_COORD_ATTRIB0 = 8, TEXTURE_COORD_ATTRIB1, TEXTURE_COORD_ATTRIB2, TEXTURE_COORD_ATTRIB3, TEXTURE_COORD_ATTRIB4, TEXTURE_COORD_ATTRIB5, TEXTURE_COORD_ATTRIB6, TEXTURE_COORD_ATTRIB7 }; */ //if we fail making our shader leave if( !testShader.CreateShader( "SimpleShader.vp", "SimpleShader.fp", 3, VERTEX_ATTRIB, "vVertexPos", NORMAL_ATTRIB, "vNormal", TEXTURE_COORD_ATTRIB0, "vTexCoord" ) ) return false; if( !testScreenShader.CreateShader( "ScreenShader.vp", "ScreenShader.fp", 3, VERTEX_ATTRIB, "vVertexPos", NORMAL_ATTRIB, "vNormal", TEXTURE_COORD_ATTRIB0, "vTexCoord" ) ) return false; SHADER PROGRAM FUNCTIONS bool CShaderProgram::CreateShader( const char* szVertexShaderName, const char* szFragmentShaderName, ... ) { //here are our handles for the openGL shaders int iGLVertexShaderHandle = -1, iGLFragmentShaderHandle = -1; //get our shader data char *vData = 0, *fData = 0; int vLength = 0, fLength = 0; LoadShaderFile( szVertexShaderName, &vData, &vLength ); LoadShaderFile( szFragmentShaderName, &fData, &fLength ); //data if( !vData ) return false; //data if( !fData ) { delete[] vData; return false; } //create both our shader objects iGLVertexShaderHandle = glCreateShader( GL_VERTEX_SHADER ); iGLFragmentShaderHandle = glCreateShader( GL_FRAGMENT_SHADER ); //well we got this far so we have dynamic data to clean up //load vertex shader glShaderSource( iGLVertexShaderHandle, 1, (const char**)(&vData), &vLength ); //load fragment shader glShaderSource( iGLFragmentShaderHandle, 1, (const char**)(&fData), &fLength ); //we are done with our data delete it delete[] vData; delete[] fData; //compile them both glCompileShader( iGLVertexShaderHandle ); //get shader status int iShaderOk; glGetShaderiv( iGLVertexShaderHandle, GL_COMPILE_STATUS, &iShaderOk ); if( iShaderOk == GL_FALSE ) { char* buffer; //get what happend with our shader glGetShaderiv( iGLVertexShaderHandle, GL_INFO_LOG_LENGTH, &iShaderOk ); buffer = new char[iShaderOk]; glGetShaderInfoLog( iGLVertexShaderHandle, iShaderOk, NULL, buffer ); //sprintf_s( buffer, "Failure Our Object For %s was not created", szFileName ); MessageBoxA( NULL, buffer, szVertexShaderName, MB_OK ); //delete our dynamic data free( buffer ); glDeleteShader(iGLVertexShaderHandle); return false; } glCompileShader( iGLFragmentShaderHandle ); //get shader status glGetShaderiv( iGLFragmentShaderHandle, GL_COMPILE_STATUS, &iShaderOk ); if( iShaderOk == GL_FALSE ) { char* buffer; //get what happend with our shader glGetShaderiv( iGLFragmentShaderHandle, GL_INFO_LOG_LENGTH, &iShaderOk ); buffer = new char[iShaderOk]; 
glGetShaderInfoLog( iGLFragmentShaderHandle, iShaderOk, NULL, buffer ); //sprintf_s( buffer, "Failure Our Object For %s was not created", szFileName ); MessageBoxA( NULL, buffer, szFragmentShaderName, MB_OK ); //delete our dynamic data free( buffer ); glDeleteShader(iGLFragmentShaderHandle); return false; } //lets check to see if the fragment shader compiled int iCompiled = 0; glGetShaderiv( iGLVertexShaderHandle, GL_COMPILE_STATUS, &iCompiled ); if( !iCompiled ) { //this shader did not compile leave return false; } //lets check to see if the fragment shader compiled glGetShaderiv( iGLFragmentShaderHandle, GL_COMPILE_STATUS, &iCompiled ); if( !iCompiled ) { char* buffer; //get what happend with our shader glGetShaderiv( iGLFragmentShaderHandle, GL_INFO_LOG_LENGTH, &iShaderOk ); buffer = new char[iShaderOk]; glGetShaderInfoLog( iGLFragmentShaderHandle, iShaderOk, NULL, buffer ); //sprintf_s( buffer, "Failure Our Object For %s was not created", szFileName ); MessageBoxA( NULL, buffer, szFragmentShaderName, MB_OK ); //delete our dynamic data free( buffer ); glDeleteShader(iGLFragmentShaderHandle); return false; } //make our new shader program m_iShaderProgramHandle = glCreateProgram(); glAttachShader( m_iShaderProgramHandle, iGLVertexShaderHandle ); glAttachShader( m_iShaderProgramHandle, iGLFragmentShaderHandle ); glLinkProgram( m_iShaderProgramHandle ); int iLinked = 0; glGetProgramiv( m_iShaderProgramHandle, GL_LINK_STATUS, &iLinked ); if( !iLinked ) { //we didn't link return false; } //NOW LETS CREATE ALL OUR HANDLES TO OUR PROPER LIKING //start from this parameter va_list parseList; va_start( parseList, szFragmentShaderName ); //read in number of variables if any unsigned uiNum = 0; uiNum = va_arg( parseList, unsigned ); //for loop through our attribute pairs int enumType = 0; for( unsigned x = 0; x < uiNum; ++x ) { //specify our attribute locations enumType = va_arg( parseList, int ); char* name = va_arg( parseList, char* ); glBindAttribLocation( m_iShaderProgramHandle, enumType, name ); } //end our list parsing va_end( parseList ); //relink specify //we have custom specified our attribute locations glLinkProgram( m_iShaderProgramHandle ); //fill our handles InitializeHandles( ); //everything went great return true; } void CShaderProgram::InitializeHandles( void ) { m_uihMVP = glGetUniformLocation( m_iShaderProgramHandle, "mMVP" ); m_uihWorld = glGetUniformLocation( m_iShaderProgramHandle, "mWorld" ); m_uihView = glGetUniformLocation( m_iShaderProgramHandle, "mView" ); m_uihProjection = glGetUniformLocation( m_iShaderProgramHandle, "mProjection" ); ///////////////////////////////////////////////////////////////////////////////// //texture handles m_uihDiffuseMap = glGetUniformLocation( m_iShaderProgramHandle, "diffuseMap" ); if( m_uihDiffuseMap != -1 ) { //store what texture index this handle will be in the shader glUniform1i( m_uihDiffuseMap, RM_DIFFUSE+GL_TEXTURE0 ); (0)+ } m_uihNormalMap = glGetUniformLocation( m_iShaderProgramHandle, "normalMap" ); if( m_uihNormalMap != -1 ) { //store what texture index this handle will be in the shader glUniform1i( m_uihNormalMap, RM_NORMAL+GL_TEXTURE0 ); (1)+ } } void CShaderProgram::SetDiffuseMap( const unsigned& uihDiffuseMap ) { (0)+ glActiveTexture( RM_DIFFUSE+GL_TEXTURE0 ); glBindTexture( GL_TEXTURE_2D, uihDiffuseMap ); } void CShaderProgram::SetNormalMap( const unsigned& uihNormalMap ) { (1)+ glActiveTexture( RM_NORMAL+GL_TEXTURE0 ); glBindTexture( GL_TEXTURE_2D, uihNormalMap ); } //MY 2 TEST SHADERS also my math order is correct it pertains 
to my matrix ordering in my math library once again i've tested the basic rendering. rendering to the screen works fine ----------------------------------------SIMPLE SHADER------------------------------------- //vertex shader looks like this #version 330 in vec3 vVertexPos; in vec3 vNormal; in vec2 vTexCoord; uniform mat4 mWorld; // Model Matrix uniform mat4 mView; // Camera View Matrix uniform mat4 mProjection;// Camera Projection Matrix out vec2 vTexCoordVary; // Texture coord to the fragment program out vec3 vNormalColor; void main( void ) { //pass the texture coordinate vTexCoordVary = vTexCoord; vNormalColor = vNormal; //calculate our model view projection matrix mat4 mMVP = (( mWorld * mView ) * mProjection ); //result our position gl_Position = vec4( vVertexPos, 1 ) * mMVP; } //fragment shader looks like this #version 330 in vec2 vTexCoordVary; in vec3 vNormalColor; uniform sampler2D diffuseMap; uniform sampler2D normalMap; out vec4 fragColor[2]; void main( void ) { //CORRECT fragColor[0] = texture( normalMap, vTexCoordVary ); fragColor[1] = vec4( vNormalColor, 1.0 ); }; ----------------------------------------SCREEN SHADER------------------------------------- //vertext shader looks like this #version 330 in vec3 vVertexPos; // This is the position of the vertex coming in in vec2 vTexCoord; // This is the texture coordinate.... out vec2 vTexCoordVary; // Texture coord to the fragment program void main( void ) { vTexCoordVary = vTexCoord; //set our position gl_Position = vec4( vVertexPos.xyz, 1.0f ); } //fragment shader looks like this #version 330 in vec2 vTexCoordVary; // Incoming "varying" texture coordinate uniform sampler2D diffuseMap;//the tile detail texture uniform sampler2D normalMap; //the normal map from earlier out vec4 vTheColorOfThePixel; void main( void ) { //CORRECT vTheColorOfThePixel = texture( normalMap, vTexCoordVary ); }; .Class RenderTarget Main Functions //here is my render targets create function bool CRenderTarget::Create( const unsigned uiNumTextures, unsigned uiWidth, unsigned uiHeight, int iInternalFormat, bool bDepthWanted ) { if( uiNumTextures <= 0 ) return false; //generate our variables glGenFramebuffers(1, &m_uifboHandle); // Initialize FBO glBindFramebuffer(GL_FRAMEBUFFER, m_uifboHandle); m_uiNumTextures = uiNumTextures; if( bDepthWanted ) m_uiNumTextures += 1; m_uiTextureHandle = new unsigned int[uiNumTextures]; glGenTextures( uiNumTextures, m_uiTextureHandle ); for( unsigned x = 0; x < uiNumTextures-1; ++x ) { glBindTexture( GL_TEXTURE_2D, m_uiTextureHandle[x]); // Reserve space for our 2D render target glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE); glTexImage2D(GL_TEXTURE_2D, 0, iInternalFormat, uiWidth, uiHeight, 0, GL_RGB, GL_UNSIGNED_BYTE, NULL); glFramebufferTexture2D(GL_DRAW_FRAMEBUFFER, GL_COLOR_ATTACHMENT0 + x, GL_TEXTURE_2D, m_uiTextureHandle[x], 0); } //if we need one for depth testing if( bDepthWanted ) { glFramebufferTexture2D(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT, GL_TEXTURE_2D, m_uiTextureHandle[uiNumTextures-1], 0); glFramebufferTexture2D(GL_FRAMEBUFFER_EXT, GL_STENCIL_ATTACHMENT, GL_TEXTURE_2D, m_uiTextureHandle[uiNumTextures-1], 0);*/ // Must attach texture to framebuffer. 
Has Stencil and depth glBindRenderbuffer(GL_RENDERBUFFER, m_uiTextureHandle[uiNumTextures-1]); glRenderbufferStorage(GL_RENDERBUFFER, /*GL_DEPTH_STENCIL*/GL_DEPTH24_STENCIL8, TEXTURE_WIDTH, TEXTURE_HEIGHT ); glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, m_uiTextureHandle[uiNumTextures-1]); glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_STENCIL_ATTACHMENT, GL_RENDERBUFFER, m_uiTextureHandle[uiNumTextures-1]); } glBindFramebuffer(GL_FRAMEBUFFER, 0); //everything went fine return true; } void CRenderTarget::Bind( const int& iTargetAttachmentLoc, const unsigned& uiWhichTexture, const bool bBindFrameBuffer ) { if( bBindFrameBuffer ) glBindFramebuffer( GL_FRAMEBUFFER, m_uifboHandle ); if( uiWhichTexture < m_uiNumTextures ) glFramebufferTexture(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0 + iTargetAttachmentLoc, m_uiTextureHandle[uiWhichTexture], 0); } void CRenderTarget::UnBind( void ) { //default our binding glBindFramebuffer( GL_FRAMEBUFFER, 0 ); } //this is all in a test project so here's my straight forward rendering function for testing this render function does basic rendering steps keep in mind i have already tested my textures i have already tested my box thats being rendered all basic rendering works fine its just when i try to render to a texture then display it in a render surface that it does not work. Also I have tested my render surface it is bound exactly to the screen coordinate space void TestRenderSteps( void ) { //Clear the color and the depth glClearColor( 0.0f, 0.0f, 0.0f, 1.0f ); glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT ); //bind the shader program glUseProgram( testShader.m_iShaderProgramHandle ); //1) grab the vertex buffer related to our rendering glBindBuffer( GL_ARRAY_BUFFER, CVertexBufferManager::GetInstance()->GetPositionNormalTexBuffer().GetBufferHandle() ); //2) how our stream will be split here ( 4 bytes position, ..ext ) CVertexBufferManager::GetInstance()->GetPositionNormalTexBuffer().MapVertexStride(); //3) set the index buffer if needed glBindBuffer( GL_ELEMENT_ARRAY_BUFFER, CIndexBuffer::GetInstance()->GetBufferHandle() ); //send the needed information into the shader testShader.SetWorldMatrix( boxPosition ); testShader.SetViewMatrix( Static_Camera.GetView( ) ); testShader.SetProjectionMatrix( Static_Camera.GetProjection( ) ); testShader.SetDiffuseMap( iTextureID ); testShader.SetNormalMap( iTextureID2 ); GLenum buffers[] = { GL_COLOR_ATTACHMENT0, GL_COLOR_ATTACHMENT1 }; glDrawBuffers(2, buffers); //bind to our render target //RM_DIFFUSE, RM_NORMAL are enums (0 && 1) renderTarget.Bind( RM_DIFFUSE, 1, true ); renderTarget.Bind( RM_NORMAL, 1, false); //false because buffer is already bound //i clear here just to clear the texture to make it a default value of white //by doing this i can see if what im rendering to my screen is just drawing to the screen //or if its my render target defaulted glClearColor( 1.0f, 1.0f, 1.0f, 1.0f ); glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT ); //i have this box object which i draw testBox.Draw(); //the draw call looks like this //my normal rendering works just fine so i know this draw is fine // glDrawElementsBaseVertex( m_sides[x].GetPrimitiveType(), // m_sides[x].GetPrimitiveCount() * 3, // GL_UNSIGNED_INT, // BUFFER_OFFSET(sizeof(unsigned int) * m_sides[x].GetStartIndex()), // m_sides[x].GetStartVertex( ) ); //we unbind the target back to default renderTarget.UnBind(); //i stop mapping my vertex format 
CVertexBufferManager::GetInstance()->GetPositionNormalTexBuffer().UnMapVertexStride(); //i go back to default in using no shader program glUseProgram( 0 ); //now that everything is drawn to the textures //lets draw our screen surface and pass it our 2 filled out textures //NOW RENDER THE TEXTURES WE COLLECTED TO THE SCREEN QUAD //bind the shader program glUseProgram( testScreenShader.m_iShaderProgramHandle ); //1) grab the vertex buffer related to our rendering glBindBuffer( GL_ARRAY_BUFFER, CVertexBufferManager::GetInstance()->GetPositionTexBuffer().GetBufferHandle() ); //2) how our stream will be split here CVertexBufferManager::GetInstance()->GetPositionTexBuffer().MapVertexStride(); //3) set the index buffer if needed glBindBuffer( GL_ELEMENT_ARRAY_BUFFER, CIndexBuffer::GetInstance()->GetBufferHandle() ); //pass our 2 filled out textures (in the shader im just using the diffuse //i wanted to see if i was rendering anything before i started getting into other techniques testScreenShader.SetDiffuseMap( renderTarget.GetTextureHandle(0) ); //SetDiffuseMap definitions in shader program class testScreenShader.SetNormalMap( renderTarget.GetTextureHandle(1) ); //SetNormalMap definitions in shader program class //DO the draw call drawing our screen rectangle glDrawElementsBaseVertex( m_ScreenRect.GetPrimitiveType(), m_ScreenRect.GetPrimitiveCount() * 3, GL_UNSIGNED_INT, BUFFER_OFFSET(sizeof(unsigned int) * m_ScreenRect.GetStartIndex()), m_ScreenRect.GetStartVertex( ) );*/ //unbind our vertex mapping CVertexBufferManager::GetInstance()->GetPositionTexBuffer().UnMapVertexStride(); //default to no shader program glUseProgram( 0 ); } Last words: 1) I can render my box just fine 2) i can render my screen rect just fine 3) I cannot render my box into a texture then display it into my screen rect 4) This entire project is just a test project I made to test different rendering practices. So excuse any "ugly-ish" unclean code. This was made just on a fly run through when I was trying new test cases.
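    Two things in the code above stand out and are worth checking (hedged observations, not a guaranteed diagnosis). First, glDrawBuffers is called before renderTarget.Bind(), so the draw-buffer list is applied to the default framebuffer rather than to the FBO; the draw-buffer list is per-framebuffer state, so the call needs to come after the FBO is bound. Second, in CRenderTarget::Create the depth/stencil path passes a texture name to glBindRenderbuffer/glRenderbufferStorage, and the framebuffer is never validated with glCheckFramebufferStatus. A minimal sketch of a complete FBO with one colour texture and a proper depth-stencil renderbuffer (all names below are illustrative, not the poster's code) looks roughly like this:

    GLuint fbo = 0, colorTex = 0, depthRb = 0;
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);

    /* colour attachment: an ordinary 2D texture */
    glGenTextures(1, &colorTex);
    glBindTexture(GL_TEXTURE_2D, colorTex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, colorTex, 0);

    /* depth/stencil attachment: a renderbuffer object, generated with glGenRenderbuffers */
    glGenRenderbuffers(1, &depthRb);
    glBindRenderbuffer(GL_RENDERBUFFER, depthRb);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH24_STENCIL8, width, height);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_STENCIL_ATTACHMENT, GL_RENDERBUFFER, depthRb);

    /* always verify completeness before rendering into the FBO */
    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
    {
        /* handle the error: the attachments are inconsistent */
    }
    glBindFramebuffer(GL_FRAMEBUFFER, 0);

    After binding such an FBO, glDrawBuffers(2, buffers) can then be issued so that both colour attachments receive fragment outputs.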

    Read the article

  • ????ASMM

    - by Liu Maclean(???)
    ???Oracle??????????????SGA/PGA???,????10g????????????ASMM????,????????ASMM?????????Oracle??????????,?ASMM??????DBA????????????;????????ASMM???????????????DBA???:????????????DB,?????????????DBA?????????????????????????????????,ASMM??????????,???????????,??????????,??????????????????;?10g release 1?10.2??????ASMM?????????????,???????ASMM????????ASMM?????startup???????????ASMM??AMM??,????????DBA????SGA/PGA?????????”??”??”???”???,???????????DBA????chemist(???????1??2??????????????)? ?????????????????ASMM?????,?????????????…… Oracle?SGA???????9i???????????,????: Buffer Cache ????????????,??????????????? Default Pool                  ??????,???DB_CACHE_SIZE?? Keep Pool                     ??????,???DB_KEEP_CACHE_SIZE?? Non standard pool         ???????,???DB_nK_cache_size?? Recycle pool                 ???,???db_recycle_cache_size?? Shared Pool ???,???shared_pool_size?? Library cache   ?????? Row cache      ???,?????? Java Pool         java?,???Java_pool_size?? Large Pool       ??,???Large_pool_size?? Fixed SGA       ???SGA??,???Oracle???????,?????????granule? ?9i?????ASMM,???????????SGA,??????MSMM??9i???buffer cache??????????,?????????????????????????,???9i?????????????,?????????????????????????? ????SGA?????: ?????shared pool?default buffer pool????????,??????????? ?9i???????????(advisor),?????????? ??????????????? ?????????,?????? ?????,?????ORA-04031?????????? ASMM?????: ?????????? ???????????????? ???????sga_target?? ???????????,??????????? ??MSMM???????: ???? ???? ?????? ???? ??????????,??????????? ??????????????????,??????????ORA-04031??? ASMM???????????:1.??????sga_target???????2.???????,???:????(memory component),????(memory broker)???????(memory mechanism)3.????(memory advisor) ASMM????????????(Automatically set),??????:shared_pool_size?db_cache_size?java_pool_size?large_pool _size?streams_pool_size;?????????????????,???:db_keep_cache_size?db_recycle_cache_size?db_nk_cache_size?log_buffer????SGA?????,????????????????,??log_buffer?fixed sga??????????????? ??ASMM?????????sga_target??,???????ASMM??????????????????db_cache_size?java_pool_size???,?????????????????????,????????????????????(???)????????,Oracle?????????(granule,?SGA<1GB?granule???4M,?SGA>1GB?granule???16M)???????,??????????????buffer cache,??????????????????(granule)??????????????????????sga_target??,???????????????????(dism,???????)???ASMM?????????????statistics_level?????typical?ALL,?????BASIC??MMON????(Memory Monitor is a background process that gathers memory statistics (snapshots) stores this information in the AWR (automatic workload repository). 
MMON is also responsible for issuing alerts for metrics that exceed their thresholds)?????????????????????ASMM?????,???????????sga_target?????statistics_level?BASIC: SQL> show parameter sga NAME TYPE VALUE ------------------------------------ ----------- ------------------------------ lock_sga boolean FALSE pre_page_sga boolean FALSE sga_max_size big integer 2000M sga_target big integer 2000M SQL> show parameter sga_target NAME TYPE VALUE ------------------------------------ ----------- ------------------------------ sga_target big integer 2000M SQL> alter system set statistics_level=BASIC; alter system set statistics_level=BASIC * ERROR at line 1: ORA-02097: parameter cannot be modified because specified value is invalid ORA-00830: cannot set statistics_level to BASIC with auto-tune SGA enabled ?????server parameter file?spfile??,ASMM????shutdown??????????????(Oracle???????,????????)???spfile?,?????strings?????spfile????????????????????,?: G10R2.__db_cache_size=973078528 G10R2.__java_pool_size=16777216 G10R2.__large_pool_size=16777216 G10R2.__shared_pool_size=1006632960 G10R2.__streams_pool_size=67108864 ???spfile?????????????????,???????????”???”?????,??????????”??”?? ?ASMM?????????????? ?????(tunable):????????????????????????????buffer cache?????????,cache????????????????,?????????? IO????????????????????????????Library cache????? subheap????,?????????????????????????????????(open cursors)?????????client??????????????buffer cache???????,???????????pin??buffer???(???????) ?????(Un-tunable):???????????????????,?????????????????,?????????????????????????large pool?????? ??????(Fixed Size):???????????,??????????????????????????????????????? ????????????????(memory resize request)?????????,?????: ??????(Immediate Request):???????????ASMM????????????????????????(chunk)?,??????OUT-OF-MEMORY(ORA-04031)???,????????????????????(granule)????????????????????granule,????????????,?????????????????????????????,????granule??????????????? ??????(Deferred Request):???????????????????????????,??????????????granule???????????????MMON??????????delta. ??????(Manual Request):????????????alter system?????????????????????????????????????????????????granule,??????grow?????ORA-4033??,?????shrink?????ORA-4034??? ?ASMM????,????(Memory Broker)????????????????????????????(Deferred)??????????????????????(auto-tunable component)???????????????,???????????????MMON??????????????????????????????????,????????????????;MMON????Memory Broker?????????????????????????MMON????????????????????????????????????????(resize request system queue)?MMAN????(Memory Manager is a background process that manages the dynamic resizing of SGA memory areas as the workload increases or decreases)??????????????????? ?10gR1?Shared Pool?shrink??????????,?????????????Buffer Cache???????????granule,????Buffer Cache?granule????granule header?Metadata(???buffer header??RAC??Lock Elements)????,?????????????????????shared pool????????duration(?????)?chunk??????granule?,????????????granule??10gR2????Buffer Cache Granule????????granule header?buffer?Metadata(buffer header?LE)????,??shared pool???duration?chunk????????granule,??????buffer cache?shared pool??????????????10gr2?streams pool?????????(???????streams pool duration????) ??????????(Donor,???trace????)???,?????????granule???buffer cache,????granule????????????: ????granule???????granule header ?????chunk????granule?????????buffer header ???,???chunk??????????????????????metadata? ???2-4??,???granule???? 
??????????????????,??buffer cache??granule???shared pool?,???????: MMAN??????????buffer cache???granule MMAN????granule??quiesce???(Moving 1 granule from inuse to quiesce list of DEFAULT buffer cache for an immediate req) DBWR???????quiesced???granule????buffer(dirty buffer) MMAN??shared pool????????(consume callback),granule?free?chunk???shared pool??(consume)?,????????????????????granule????shared granule??????,???????????granule???????????,??????pin??buffer??Metadata(???buffer header?LE)?????buffer cache??? ???granule???????shared pool,???granule?????shared??? ?????ASMM???????????,??????????: _enabled_shared_pool_duration:?????????10g????shared pool duration??,?????sga_target?0?????false;???10.2.0.5??cursor_space_for_time???true??????false,???10.2.0.5??cursor_space_for_time????? _memory_broker_shrink_heaps:???????0??Oracle?????shared pool?java pool,??????0,??shrink request??????????????????? _memory_management_tracing: ???????MMON?MMAN??????????(advisor)?????(Memory Broker)?????trace???;??ORA-04031????????36,???8?????????????trace,???23????Memory Broker decision???,???32???cache resize???;??????????: Level Contents 0×01 Enables statistics tracing 0×02 Enables policy tracing 0×04 Enables transfer of granules tracing 0×08 Enables startup tracing 0×10 Enables tuning tracing 0×20 Enables cache tracing ?????????_memory_management_tracing?????DUMP_TRANSFER_OPS????????????????,?????????????????trace?????????mman_trace?transfer_ops_dump? SQL> alter system set "_memory_management_tracing"=63; System altered Operation make shared pool grow and buffer cache shrink!!!.............. ???????granule?????,????default buffer pool?resize??: AUTO SGA: Request 0xdc9c2628 after pre-processing, ret=0 /* ???0xdc9c2628??????addr */ AUTO SGA: IMMEDIATE, FG request 0xdc9c2628 /* ???????????Immediate???? */ AUTO SGA: Receiver of memory is shared pool, size=16, state=3, flg=0 /* ?????????shared pool,???,????16?granule,??grow?? */ AUTO SGA: Donor of memory is DEFAULT buffer cache, size=106, state=4, flg=0 /* ???????Default buffer cache,????,????106?granule,??shrink?? */ AUTO SGA: Memory requested=3896, remaining=3896 /* ??immeidate request???????3896 bytes */ AUTO SGA: Memory received=0, minreq=3896, gransz=16777216 /* ????free?granule,??received?0,gransz?granule??? */ AUTO SGA: Request 0xdc9c2628 status is INACTIVE /* ??????????,??????inactive?? */ AUTO SGA: Init bef rsz for request 0xdc9c2628 /* ????????before-process???? */ AUTO SGA: Set rq dc9c2628 status to PENDING /* ?request??pending?? 
*/ AUTO SGA: 0xca000000 rem=3896, rcvd=16777216, 105, 16777216, 17 /* ???????0xca000000?16M??granule */ AUTO SGA: Returning 4 from kmgs_process for request dc9c2628 AUTO SGA: Process req dc9c2628 ret 4, 1, a AUTO SGA: Resize done for pool DEFAULT, 8192 /* ???default pool?resize */ AUTO SGA: Init aft rsz for request 0xdc9c2628 AUTO SGA: Request 0xdc9c2628 after processing AUTO SGA: IMMEDIATE, FG request 0x7fff917964a0 AUTO SGA: Receiver of memory is shared pool, size=17, state=0, flg=0 AUTO SGA: Donor of memory is DEFAULT buffer cache, size=105, state=0, flg=0 AUTO SGA: Memory requested=3896, remaining=0 AUTO SGA: Memory received=16777216, minreq=3896, gransz=16777216 AUTO SGA: Request 0x7fff917964a0 status is COMPLETE /* shared pool????16M?granule */ AUTO SGA: activated granule 0xca000000 of shared pool ?????partial granule????????????trace: AUTO SGA: Request 0xdc9c2628 after pre-processing, ret=0 AUTO SGA: IMMEDIATE, FG request 0xdc9c2628 AUTO SGA: Receiver of memory is shared pool, size=82, state=3, flg=1 AUTO SGA: Donor of memory is DEFAULT buffer cache, size=36, state=4, flg=1 /* ????????shared pool,?????default buffer cache */ AUTO SGA: Memory requested=4120, remaining=4120 AUTO SGA: Memory received=0, minreq=4120, gransz=16777216 AUTO SGA: Request 0xdc9c2628 status is INACTIVE AUTO SGA: Init bef rsz for request 0xdc9c2628 AUTO SGA: Set rq dc9c2628 status to PENDING AUTO SGA: Moving granule 0x93000000 of DEFAULT buffer cache to activate list AUTO SGA: Moving 1 granule 0x8c000000 from inuse to quiesce list of DEFAULT buffer cache for an immediate req /* ???buffer cache??????0x8c000000?granule??????inuse list, ???????quiesce list? */ AUTO SGA: Returning 0 from kmgs_process for request dc9c2628 AUTO SGA: Process req dc9c2628 ret 0, 1, 20a AUTO SGA: activated granule 0x93000000 of DEFAULT buffer cache AUTO SGA: NOT_FREE for imm req for gran 0x8c000000 / * ??dbwr??0x8c000000 granule????dirty buffer */ AUTO SGA: Returning 0 from kmgs_process for request dc9c2628 AUTO SGA: Process req dc9c2628 ret 0, 1, 20a AUTO SGA: NOT_FREE for imm req for gran 0x8c000000 AUTO SGA: Returning 0 from kmgs_process for request dc9c2628 AUTO SGA: Process req dc9c2628 ret 0, 1, 20a AUTO SGA: NOT_FREE for imm req for gran 0x8c000000 AUTO SGA: Returning 0 from kmgs_process for request dc9c2628 AUTO SGA: Process req dc9c2628 ret 0, 1, 20a AUTO SGA: NOT_FREE for imm req for gran 0x8c000000 AUTO SGA: Returning 0 from kmgs_process for request dc9c2628 AUTO SGA: Process req dc9c2628 ret 0, 1, 20a AUTO SGA: NOT_FREE for imm req for gran 0x8c000000 AUTO SGA: Returning 0 from kmgs_process for request dc9c2628 AUTO SGA: Process req dc9c2628 ret 0, 1, 20a AUTO SGA: NOT_FREE for imm req for gran 0x8c000000 ......................................... 
AUTO SGA: Rcv shared pool consuming 8192 from 0x8c000000 in granule 0x8c000000; owner is DEFAULT buffer cache AUTO SGA: Rcv shared pool consuming 90112 from 0x8c002000 in granule 0x8c000000; owner is DEFAULT buffer cache AUTO SGA: Rcv shared pool consuming 24576 from 0x8c01a000 in granule 0x8c000000; owner is DEFAULT buffer cache AUTO SGA: Rcv shared pool consuming 65536 from 0x8c022000 in granule 0x8c000000; owner is DEFAULT buffer cache AUTO SGA: Rcv shared pool consuming 131072 from 0x8c034000 in granule 0x8c000000; owner is DEFAULT buffer cache AUTO SGA: Rcv shared pool consuming 286720 from 0x8c056000 in granule 0x8c000000; owner is DEFAULT buffer cache AUTO SGA: Rcv shared pool consuming 98304 from 0x8c09e000 in granule 0x8c000000; owner is DEFAULT buffer cache AUTO SGA: Rcv shared pool consuming 106496 from 0x8c0b8000 in granule 0x8c000000; owner is DEFAULT buffer cache ..................... /* ??shared pool????0x8c000000 granule??chunk, ??granule?owner????default buffer cache */ AUTO SGA: Imm xfer 0x8c000000 from quiesce list of DEFAULT buffer cache to partial inuse list of shared pool /* ???0x8c000000 granule?default buffer cache????????shared pool????inuse list */ AUTO SGA: Returning 4 from kmgs_process for request dc9c2628 AUTO SGA: Process req dc9c2628 ret 4, 1, 20a AUTO SGA: Init aft rsz for request 0xdc9c2628 AUTO SGA: Request 0xdc9c2628 after processing AUTO SGA: IMMEDIATE, FG request 0x7fffe9bcd0e0 AUTO SGA: Receiver of memory is shared pool, size=83, state=0, flg=1 AUTO SGA: Donor of memory is DEFAULT buffer cache, size=35, state=0, flg=1 AUTO SGA: Memory requested=4120, remaining=0 AUTO SGA: Memory received=14934016, minreq=4120, gransz=16777216 AUTO SGA: Request 0x7fffe9bcd0e0 status is COMPLETE /* ????partial transfer?? */ ?????partial transfer??????DUMP_TRANSFER_OPS????0x8c000000 partial granule???????,?: SQL> oradebug setmypid; Statement processed. SQL> oradebug dump DUMP_TRANSFER_OPS 1; Statement processed. SQL> oradebug tracefile_name; /s01/admin/G10R2/udump/g10r2_ora_21482.trc =======================trace content============================== GRANULE SIZE is 16777216 COMPONENT NAME : shared pool Number of granules in partially inuse list (listid 4) is 23 Granule addr is 0x8c000000 Granule owner is DEFAULT buffer cache /* ?0x8c000000 granule?shared pool?partially inuse list, ?????owner??default buffer cache */ Granule 0x8c000000 dump from owner perspective gptr = 0x8c000000, num buf hdrs = 1989, num buffers = 156, ghdr = 0x8cffe000 / * ?????granule?granule header????0x8cffe000, ????156?buffer block,1989?buffer header */ /* ??granule??????,??????buffer cache??shared pool chunk */ BH:0x8cf76018 BA:(nil) st:11 flg:20000 BH:0x8cf76128 BA:(nil) st:11 flg:20000 BH:0x8cf76238 BA:(nil) st:11 flg:20000 BH:0x8cf76348 BA:(nil) st:11 flg:20000 BH:0x8cf76458 BA:(nil) st:11 flg:20000 BH:0x8cf76568 BA:(nil) st:11 flg:20000 BH:0x8cf76678 BA:(nil) st:11 flg:20000 BH:0x8cf76788 BA:(nil) st:11 flg:20000 BH:0x8cf76898 BA:(nil) st:11 flg:20000 BH:0x8cf769a8 BA:(nil) st:11 flg:20000 BH:0x8cf76ab8 BA:(nil) st:11 flg:20000 BH:0x8cf76bc8 BA:(nil) st:11 flg:20000 BH:0x8cf76cd8 BA:0x8c018000 st:1 flg:622202 ............... 
Address 0x8cf30000 to 0x8cf74000 not in cache Address 0x8cf74000 to 0x8d000000 in cache Granule 0x8c000000 dump from receivers perspective Dumping layout Address 0x8c000000 to 0x8c018000 in sga heap(1,3) (idx=1, dur=4) Address 0x8c018000 to 0x8c01a000 not in this pool Address 0x8c01a000 to 0x8c020000 in sga heap(1,3) (idx=1, dur=4) Address 0x8c020000 to 0x8c022000 not in this pool Address 0x8c022000 to 0x8c032000 in sga heap(1,3) (idx=1, dur=4) Address 0x8c032000 to 0x8c034000 not in this pool Address 0x8c034000 to 0x8c054000 in sga heap(1,3) (idx=1, dur=4) Address 0x8c054000 to 0x8c056000 not in this pool Address 0x8c056000 to 0x8c09c000 in sga heap(1,3) (idx=1, dur=4) Address 0x8c09c000 to 0x8c09e000 not in this pool Address 0x8c09e000 to 0x8c0b6000 in sga heap(1,3) (idx=1, dur=4) Address 0x8c0b6000 to 0x8c0b8000 not in this pool Address 0x8c0b8000 to 0x8c0d2000 in sga heap(1,3) (idx=1, dur=4) ???????granule?????shared granule??????,?????????buffer block,????1?shared subpool??????durtaion?4?chunk,duration=4?execution duration;??duration?chunk???????????,??extent???quiesce list??????????????free?execution duration?????????????,??????duration???extent(??????extent????granule)??????? ?????????????ASMM?????????,????: V$SGAINFODisplays summary information about the system global area (SGA). V$SGADisplays size information about the SGA, including the sizes of different SGA components, the granule size, and free memory. V$SGASTATDisplays detailed information about the SGA. V$SGA_DYNAMIC_COMPONENTSDisplays information about the dynamic SGA components. This view summarizes information based on all completed SGA resize operations since instance startup. V$SGA_DYNAMIC_FREE_MEMORYDisplays information about the amount of SGA memory available for future dynamic SGA resize operations. V$SGA_RESIZE_OPSDisplays information about the last 400 completed SGA resize operations. V$SGA_CURRENT_RESIZE_OPSDisplays information about SGA resize operations that are currently in progress. A resize operation is an enlargement or reduction of a dynamic SGA component. V$SGA_TARGET_ADVICEDisplays information that helps you tune SGA_TARGET. ?????????shared pool duration???,?????????
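    The SQL*Plus demonstration earlier in the article shows ORA-00830 when statistics_level is set to BASIC while automatic SGA tuning is active. If BASIC is genuinely wanted, a hedged sketch of the required order of operations (ASMM has to be switched off first by zeroing SGA_TARGET) would be:

    SQL> ALTER SYSTEM SET sga_target = 0;
    SQL> ALTER SYSTEM SET statistics_level = BASIC;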

    Read the article

  • .htaccess - RewriteRules working, but browser address bar displaying full (unfriendly) URL

    - by axol
    Hey there, I haven't been able to find a solution to this around the net or these forums - apologies if I've missed something! My .htaccess RewriteRules are working well - I have search-engine and user-friendly links in my web pages, and unfriendly database URLs running hidden in the background. Except when I added a RewriteRule to add "www." to the front of URLs if the user didn't enter it - to ensure only one form appears in search engines. Here's what's happening now, and I can't figure out why! My friendly URL structure for content is like this, and the database query string uses the first "importantword": www.example.com/importantword-nonimportantword/
    .htaccess snippet:
    Options +FollowSymLinks
    Options -Indexes
    RewriteEngine on
    RewriteOptions MaxRedirects=10
    RewriteBase /
    RewriteRule ^/$ index.php [L]
    RewriteRule ^(.*)-(.*)/overview/$ detail.php?categoryID=$1 [L]
    RewriteCond %{HTTP_HOST} !^www.example.com$
    RewriteRule ^(.*)$ http://www.example.com/$1 [L]
    What's happening since I added the last 2 lines:
    CASE 1: user types (or clicks) www.example.com/honda-vehicles/overview/ - Works correctly - They are taken to the correct page and the browser URL bar says: www.example.com/honda-vehicles/overview/
    CASE 2: user types example.com - Works correctly - They are taken to www.example.com and the browser URL bar says: www.example.com
    CASE 3: user types (or clicks) example.com/honda-vehicles/overview/ i.e. without the "www" prefix - Does NOT work correctly - They are taken to the right page, but the browser URL bar displays the unfriendly URL: www.example.com/detail.php?categoryID=honda
    I figure there's some issue with the order of the RewriteRules, but it's doing my head in trying to logically step through it and figure it out! Any assistance at all or pointers would be most appreciated!
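    A likely explanation, reading the rules in order: for a non-www request, the friendly URL first matches the detail.php rule, and on the next rewrite pass the host condition still fails, so the full-URL rule issues an external redirect on the already-rewritten path - which is exactly the unfriendly URL the browser then displays. One common arrangement (a sketch under that assumption, not a drop-in config) is to put the canonical-host rule first and make the redirect explicit and permanent:

    Options +FollowSymLinks
    Options -Indexes
    RewriteEngine on
    RewriteBase /

    # Canonical host first, as an explicit external redirect
    RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

    # Friendly-URL rewrites run only once the host is already canonical
    RewriteRule ^/?$ index.php [L]
    RewriteRule ^(.*)-(.*)/overview/$ detail.php?categoryID=$1 [L]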

    Read the article

  • Rspec2, Rails3, Authlogic: Can't run specs

    - by Sam
    When I do rspec spec in my rails project, I get No examples were matched. Perhaps {:if=>#<Proc:0x0000010126e998@/Users/samliu/.rvm/gems/ruby-1.9.2-p0@rails3/gems/rspec-core-2.3.1/lib/rspec/core/configuration.rb:50 (lambda)>, :unless=>#<Proc:0x0000010126e970@/Users/samliu/.rvm/gems/ruby-1.9.2-p0@rails3/gems/rspec-core-2.3.1/lib/rspec/core/configuration.rb:51 (lambda)>} is excluding everything? Finished in 0.00004 seconds 0 examples, 0 failures Now, this seems like maybe if I wrote a spec it would work, but as soon as I write a spec (and I do include spec_helper) /Users/samliu/.rvm/gems/ruby-1.9.2-p0@rails3/gems/rspec-core-2.3.1/lib/rspec/core/backward_compatibility.rb:20:in `const_missing': uninitialized constant Authlogic (NameError) from /{myapp}/app/models/user_session.rb:1:in `<top (required)>' from /Users/samliu/.rvm/gems/ruby-1.9.2-p0@rails3/gems/railties-3.0.3/lib/rails/engine.rb:138:in `block (2 levels) in eager_load!' from /Users/samliu/.rvm/gems/ruby-1.9.2-p0@rails3/gems/railties-3.0.3/lib/rails/engine.rb:137:in `each' from /Users/samliu/.rvm/gems/ruby-1.9.2-p0@rails3/gems/railties-3.0.3/lib/rails/engine.rb:137:in `block in eager_load!' from /Users/samliu/.rvm/gems/ruby-1.9.2-p0@rails3/gems/railties-3.0.3/lib/rails/engine.rb:135:in `each' from /Users/samliu/.rvm/gems/ruby-1.9.2-p0@rails3/gems/railties-3.0.3/lib/rails/engine.rb:135:in `eager_load!' from /Users/samliu/.rvm/gems/ruby-1.9.2-p0@rails3/gems/railties-3.0.3/lib/rails/application.rb:108:in `eager_load!' from /Users/samliu/.rvm/gems/ruby-1.9.2-p0@rails3/gems/railties-3.0.3/lib/rails/application/finisher.rb:41:in `block in <module:Finisher>' from /Users/samliu/.rvm/gems/ruby-1.9.2-p0@rails3/gems/railties-3.0.3/lib/rails/initializable.rb:25:in `instance_exec' from /Users/samliu/.rvm/gems/ruby-1.9.2-p0@rails3/gems/railties-3.0.3/lib/rails/initializable.rb:25:in `run' from /Users/samliu/.rvm/gems/ruby-1.9.2-p0@rails3/gems/railties-3.0.3/lib/rails/initializable.rb:50:in `block in run_initializers' from /Users/samliu/.rvm/gems/ruby-1.9.2-p0@rails3/gems/railties-3.0.3/lib/rails/initializable.rb:49:in `each' from /Users/samliu/.rvm/gems/ruby-1.9.2-p0@rails3/gems/railties-3.0.3/lib/rails/initializable.rb:49:in `run_initializers' from /Users/samliu/.rvm/gems/ruby-1.9.2-p0@rails3/gems/railties-3.0.3/lib/rails/application.rb:134:in `initialize!' 
from /Users/samliu/.rvm/gems/ruby-1.9.2-p0@rails3/gems/railties-3.0.3/lib/rails/application.rb:77:in `method_missing' from /{myapp}/config/environment.rb:5:in `<top (required)>' from <internal:lib/rubygems/custom_require>:29:in `require' from <internal:lib/rubygems/custom_require>:29:in `require' from /{myapp}/spec/spec_helper.rb:3:in `<top (required)>' from <internal:lib/rubygems/custom_require>:29:in `require' from <internal:lib/rubygems/custom_require>:29:in `require' from /{myapp}/spec/controllers/pages_controller_spec.rb:1:in `<top (required)>' from /Users/samliu/.rvm/gems/ruby-1.9.2-p0@rails3/gems/rspec-core-2.3.1/lib/rspec/core/configuration.rb:388:in `load' from /Users/samliu/.rvm/gems/ruby-1.9.2-p0@rails3/gems/rspec-core-2.3.1/lib/rspec/core/configuration.rb:388:in `block in load_spec_files' from /Users/samliu/.rvm/gems/ruby-1.9.2-p0@rails3/gems/rspec-core-2.3.1/lib/rspec/core/configuration.rb:388:in `map' from /Users/samliu/.rvm/gems/ruby-1.9.2-p0@rails3/gems/rspec-core-2.3.1/lib/rspec/core/configuration.rb:388:in `load_spec_files' from /Users/samliu/.rvm/gems/ruby-1.9.2-p0@rails3/gems/rspec-core-2.3.1/lib/rspec/core/command_line.rb:18:in `run' from /Users/samliu/.rvm/gems/ruby-1.9.2-p0@rails3/gems/rspec-core-2.3.1/lib/rspec/core/runner.rb:55:in `run_in_process' from /Users/samliu/.rvm/gems/ruby-1.9.2-p0@rails3/gems/rspec-core-2.3.1/lib/rspec/core/runner.rb:46:in `run' from /Users/samliu/.rvm/gems/ruby-1.9.2-p0@rails3/gems/rspec-core-2.3.1/lib/rspec/core/runner.rb:10:in `block in autorun' The important line here seems to be /core/backward_compatibility.rb:20:in `const_missing': uninitialized constant Authlogic (NameError) Now if this were rails 2.3.8, I'd simply put config.gem "authlogic" into the environment.rb, in the initialization code block. However, the rails 3 environment.rb looks way different (there is no config code block, so putting it in arbitrarily causes an error where config is not defined). So my questions are 1) Do I actually have to put the gem config anywhere? I looked at https://github.com/trevmex/authlogic_rails3_example/ and it seems he didn't put it anywhere. 2) Does anyone know what I'm doing wrong in terms of rspec? 
My gem list is *** LOCAL GEMS *** abstract (1.0.0) actionmailer (3.0.3, 3.0.1, 3.0.0, 3.0.0.rc2, 2.3.4) actionpack (3.0.3, 3.0.1, 3.0.0, 3.0.0.rc2, 2.3.4) activemodel (3.0.3, 3.0.1, 3.0.0, 3.0.0.rc2) activerecord (3.0.3, 3.0.1, 3.0.0, 3.0.0.rc2, 2.3.4) activeresource (3.0.3, 3.0.1, 3.0.0, 3.0.0.rc2, 2.3.4) activesupport (3.0.3, 3.0.1, 3.0.0, 3.0.0.rc2, 2.3.4) arel (2.0.6, 1.0.1) asdf (0.5.0) authlogic (2.1.6, 2.1.3) autotest (4.4.6, 4.4.1) autotest-fsevent (0.2.4) autotest-growl (0.2.9) autotest-rails (4.1.0) autotest-rails-pure (4.1.2) bluecloth (2.0.9) builder (2.1.2) bundler (1.0.7, 1.0.2) cgi_multipart_eof_fix (2.5.0) commonwatir (1.6.2) couchrest (0.33) cri (1.0.1) cucumber (0.4.4, 0.4.3, 0.3.11) daemons (1.1.0, 1.0.10) dependencies (0.0.7) diff-lcs (1.1.2) erubis (2.6.6) fastercsv (1.5.0) fastthread (1.0.7) firewatir (1.6.2) flay (1.4.0) flog (2.2.0) funfx (0.2.2) gem_plugin (0.2.3) gemsonrails (0.7.2) giraffesoft-resource_controller (0.6.5) haml (2.2.14) hoe (2.3.3) i18n (0.4.1) jscruggs-metric_fu (1.1.5) json_pure (1.1.9) kramdown (0.12.0) mail (2.2.13, 2.2.6.1) memcache-client (1.8.5) mime-types (1.16) mojombo-chronic (0.3.0) mongrel (1.1.5) monk (0.0.7) nanoc (3.1.5) nanoc3 (3.1.5) nokogiri (1.4.3.1, 1.4.0) open4 (0.9.6) polyglot (0.3.1, 0.2.9) rack (1.2.1, 1.0.1) rack-mount (0.6.13) rack-test (0.5.6) rails (3.0.0, 2.3.4) rails3-generators (0.17.0, 0.14.0) railties (3.0.3, 3.0.1, 3.0.0, 3.0.0.rc2) rake (0.8.7) relevance-rcov (0.9.2.1) rest-client (1.0.3) rspec (2.3.0, 2.0.0.rc, 1.2.9) rspec-core (2.3.1, 2.0.0.rc) rspec-expectations (2.3.0, 2.0.0.rc) rspec-mocks (2.3.0, 2.0.0.rc) rspec-rails (2.3.1, 2.0.0.rc, 1.2.9) ruby_parser (2.0.4) rubyforge (2.0.3) rubygems-update (1.3.6, 1.3.5) rvm (1.0.13) s4t-utils (1.0.4) safariwatir (0.3.7) sexp_processor (3.0.3) spork (0.7.3) sqlite3-ruby (1.3.1, 1.2.5) sys-uname (0.8.5) term-ansicolor (1.0.4) text-format (1.0.0) text-hyphen (1.0.0) thor (0.14.6, 0.14.3, 0.12.0) treetop (1.4.8, 1.4.2) tzinfo (0.3.23) user-choices (1.1.6) vlad (2.0.0) vlad-git (2.1.0) webrat (0.7.1, 0.6.0, 0.5.3) xml-simple (1.0.12) ZenTest (4.4.2) I am using ruby 1.9.2 and rails 3.0.3 installed using RVM on OSX 10.6 Snow Leopard. I just want to be able to run my specs like I used to. As a separate issue, autotest yields an error about an include for autotest/growl but I installed autotest-growl. Maybe this is a gem issue? I tried doing the same things and get the same error when it comes to using my ubuntu 10.04 server machine though. Gemfile source 'http://rubygems.org' gem 'rails', '3.0.3' # Bundle edge Rails instead: # gem 'rails', :git => 'git://github.com/rails/rails.git' gem 'sqlite3-ruby', :require => 'sqlite3' group :couch do gem 'couchrest' end group :user_auth do gem 'authlogic' gem "rails3-generators" gem 'facebooker' end group :markup do gem 'haml' gem 'sass' end group :testing do gem 'rspec-rails' gem 'rspec' gem 'webrat' gem 'cucumber' gem 'capybara' gem 'factory_girl' gem 'shoulda' gem 'autotest' end group :server do gem 'unicorn' end # Use unicorn as the web server # gem 'unicorn' # Deploy with Capistrano # gem 'capistrano' # To use debugger # gem 'ruby-debug' # Bundle the extra gems: # gem 'bj' # gem 'nokogiri' # gem 'sqlite3-ruby', :require => 'sqlite3' # gem 'aws-s3', :require => 'aws/s3' # Bundle gems for the local environment. 
Make sure to # put test-only gems in this group so their generators # and rake tasks are available in development mode: # group :development, :test do # gem 'webrat' # end Gemfile.lock GEM remote: http://rubygems.org/ specs: ZenTest (4.4.2) abstract (1.0.0) actionmailer (3.0.3) actionpack (= 3.0.3) mail (~> 2.2.9) actionpack (3.0.3) activemodel (= 3.0.3) activesupport (= 3.0.3) builder (~> 2.1.2) erubis (~> 2.6.6) i18n (~> 0.4) rack (~> 1.2.1) rack-mount (~> 0.6.13) rack-test (~> 0.5.6) tzinfo (~> 0.3.23) activemodel (3.0.3) activesupport (= 3.0.3) builder (~> 2.1.2) i18n (~> 0.4) activerecord (3.0.3) activemodel (= 3.0.3) activesupport (= 3.0.3) arel (~> 2.0.2) tzinfo (~> 0.3.23) activeresource (3.0.3) activemodel (= 3.0.3) activesupport (= 3.0.3) activesupport (3.0.3) arel (2.0.6) authlogic (2.1.6) activesupport autotest (4.4.6) ZenTest (>= 4.4.1) builder (2.1.2) capybara (0.4.0) celerity (>= 0.7.9) culerity (>= 0.2.4) mime-types (>= 1.16) nokogiri (>= 1.3.3) rack (>= 1.0.0) rack-test (>= 0.5.4) selenium-webdriver (>= 0.0.27) xpath (~> 0.1.2) celerity (0.8.6) childprocess (0.1.6) ffi (~> 0.6.3) couchrest (1.0.1) json (>= 1.4.6) mime-types (>= 1.15) rest-client (>= 1.5.1) cucumber (0.10.0) builder (>= 2.1.2) diff-lcs (~> 1.1.2) gherkin (~> 2.3.2) json (~> 1.4.6) term-ansicolor (~> 1.0.5) culerity (0.2.13) diff-lcs (1.1.2) erubis (2.6.6) abstract (>= 1.0.0) facebooker (1.0.75) json_pure (>= 1.0.0) factory_girl (1.3.2) ffi (0.6.3) rake (>= 0.8.7) gherkin (2.3.2) json (~> 1.4.6) term-ansicolor (~> 1.0.5) haml (3.0.25) i18n (0.5.0) json (1.4.6) json_pure (1.4.6) kgio (2.0.0) mail (2.2.13) activesupport (>= 2.3.6) i18n (>= 0.4.0) mime-types (~> 1.16) treetop (~> 1.4.8) mime-types (1.16) nokogiri (1.4.4) polyglot (0.3.1) rack (1.2.1) rack-mount (0.6.13) rack (>= 1.0.0) rack-test (0.5.6) rack (>= 1.0) rails (3.0.3) actionmailer (= 3.0.3) actionpack (= 3.0.3) activerecord (= 3.0.3) activeresource (= 3.0.3) activesupport (= 3.0.3) bundler (~> 1.0) railties (= 3.0.3) rails3-generators (0.17.0) railties (>= 3.0.0) railties (3.0.3) actionpack (= 3.0.3) activesupport (= 3.0.3) rake (>= 0.8.7) thor (~> 0.14.4) rake (0.8.7) rest-client (1.6.1) mime-types (>= 1.16) rspec (2.3.0) rspec-core (~> 2.3.0) rspec-expectations (~> 2.3.0) rspec-mocks (~> 2.3.0) rspec-core (2.3.1) rspec-expectations (2.3.0) diff-lcs (~> 1.1.2) rspec-mocks (2.3.0) rspec-rails (2.3.1) actionpack (~> 3.0) activesupport (~> 3.0) railties (~> 3.0) rspec (~> 2.3.0) rubyzip (0.9.4) sass (3.1.0.alpha.206) selenium-webdriver (0.1.2) childprocess (~> 0.1.5) ffi (~> 0.6.3) json_pure rubyzip shoulda (2.11.3) sqlite3-ruby (1.3.2) term-ansicolor (1.0.5) thor (0.14.6) treetop (1.4.9) polyglot (>= 0.3.1) tzinfo (0.3.23) unicorn (3.1.0) kgio (~> 2.0.0) rack webrat (0.7.2) nokogiri (>= 1.2.0) rack (>= 1.0) rack-test (>= 0.5.3) xpath (0.1.2) nokogiri (~> 1.3) PLATFORMS ruby DEPENDENCIES authlogic autotest capybara couchrest cucumber facebooker factory_girl haml rails (= 3.0.3) rails3-generators rspec rspec-rails sass shoulda sqlite3-ruby unicorn webrat
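    One likely cause of the uninitialized constant Authlogic error, given the Gemfile above: authlogic sits in a custom group (:user_auth), and a stock Rails 3 config/application.rb only requires the :default group plus the group named after the current environment, so the gem is never loaded when the spec suite boots the application. A sketch of one fix (the group names mirror the Gemfile above; adjust as needed) is to require the extra groups explicitly, or simply move the gem out of the custom group:

    # config/application.rb -- sketch, assuming the custom groups from the Gemfile above
    if defined?(Bundler)
      Bundler.require(:default, :user_auth, :couch, :markup, :testing, Rails.env)
    end

    # or, in the Gemfile, declare it at the top level so the :default group picks it up:
    # gem 'authlogic'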

    Read the article

  • VBS Runtime error code 800A01B6

    - by salvationishere
    I am a newbie to VBS scripting. I am getting the above error on line 54, character 5 in the script below. The error says "Object doesn't support this property or method: 'MimeMapArray'", and the line it refers to is: MimeMapArray(i) = CreateObject("MimeMap"). Can you tell me what I am doing wrong? Here is the script in its entirety. Note, I am trying to run this on an XP OS by double-clicking this VBS file.
    ' This script adds the necessary Windows Presentation Foundation MIME types
    ' to an IIS Server.
    ' To use this script, just double-click or execute it from a command line.
    ' Running this script multiple times results in multiple entries in the IIS MimeMap.
    ' Set the MIME types to be added
    Dim MimeMapObj
    Dim MimeMapArray
    Dim WshShell
    Dim oExec
    Const ADS_PROPERTY_UPDATE = 2
    Dim MimeTypesToAddArray
    MimeTypesToAddArray = Array(".manifest", "application/manifest", ".xaml", _
        "application/xaml+xml", ".application", "application/x-ms-application", _
        ".deploy", "application/octet-stream", ".xbap", "application/x-ms-xbap", _
        ".xps", "application/vnd.ms-xpsdocument")
    ' Get the mimemap object
    Set MimeMapObj = GetObject("IIS://LocalHost/MimeMap")
    ' Call AddMimeType for every pair of extension/MIME type
    For counter = 0 to UBound(MimeTypesToAddArray) Step 2
        AddMimeType MimeTypesToAddArray(counter), MimeTypesToAddArray(counter+1)
    Next
    ' Create a Shell object
    Set WshShell = CreateObject("WScript.Shell")
    ' Stop and Start the IIS Service
    Set oExec = WshShell.Exec("net stop w3svc")
    Do While oExec.Status = 0
        WScript.Sleep 100
    Loop
    Set oExec = WshShell.Exec("net start w3svc")
    Do While oExec.Status = 0
        WScript.Sleep 100
    Loop
    Set oExec = Nothing
    ' Report status to user
    WScript.Echo "Windows Presentation Foundation MIME types have been registered."
    ' AddMimeType Sub
    Sub AddMimeType(ByVal Ext, ByVal MType)
        ' Get the mappings from the MimeMap property.
        MimeMapArray = MimeMapObj.GetEx("MimeMap")
        ' Add a new mapping.
        i = UBound(MimeMapArray) + 1
        ReDim Preserve MimeMapArray(i)
        MimeMapArray(i) = CreateObject("MimeMap")
        MimeMapArray(i).Extension = Ext
        MimeMapArray(i).MimeType = MType
        MimeMapObj.PutEx ADS_PROPERTY_UPDATE, "MimeMap", MimeMapArray
        MimeMapObj.SetInfo()
    End Sub
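    A probable cause of the error on that line: in VBScript an object reference must be assigned with the Set keyword, and without it the engine tries to read a default property from the newly created object, which produces exactly the "Object doesn't support this property or method" message. A sketch of the corrected assignment inside AddMimeType:

    ' Use Set when storing an object reference in the array
    Set MimeMapArray(i) = CreateObject("MimeMap")
    MimeMapArray(i).Extension = Ext
    MimeMapArray(i).MimeType = MType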

    Read the article

  • dll's loaded through reflection - Phantom Bug

    - by Seattle Leonard
    Ok, I got a strange one here and I want to know if any of you have ever run across anything like it. So I've got this web app that loads up a bunch of dll's through reflection. Basically it looks for Types that are derived from certain abstract types and adds them to the list of things it can make. Here's the weird part. While developing there is never a problem. When installing it, there is never a problem to start with. Then, at a seemingly random time, the application breaks while trying to find all the types. I've had 2 sites sitting side by side and one worked while the other did not, and they were configured exactly (and I mean exactly) the same. IISRESETs never helped, but this did: I simply moved all the dll's out of the bin directory then moved them back. That's right: I just moved them out of the bin directory then put them right back where they came from and everything worked fine. Any ideas? Some more info: when the site is working I notice this behavior: after IISRESET it still works, but recycling the app pool will cause it to break. When the site is broken: neither IISRESET nor recycling the app pool fixes it, but moving a single dll out then back in fixes it.
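
    Not from the question itself, but a minimal C# sketch of the kind of type scan described above, assuming the plug-in assemblies live in the web app's bin folder and derive from a hypothetical abstract base called PluginBase (both names are placeholders, not anything from the original post):

        using System;
        using System.Collections.Generic;
        using System.IO;
        using System.Reflection;

        public abstract class PluginBase { }    // hypothetical abstract base the DLLs derive from

        public static class PluginScanner
        {
            // Scan every .dll in the folder and collect the concrete types derived from PluginBase.
            public static List<Type> FindPluginTypes(string binPath)
            {
                var found = new List<Type>();
                foreach (string dll in Directory.GetFiles(binPath, "*.dll"))
                {
                    Assembly asm = Assembly.LoadFrom(dll);
                    Type[] types;
                    try
                    {
                        types = asm.GetTypes();
                    }
                    catch (ReflectionTypeLoadException ex)
                    {
                        // If a dependency fails to load, GetTypes throws; only the types
                        // that did load are usable, the rest come back as null entries.
                        types = Array.FindAll(ex.Types, t => t != null);
                    }
                    foreach (Type t in types)
                    {
                        if (!t.IsAbstract && typeof(PluginBase).IsAssignableFrom(t))
                            found.Add(t);
                    }
                }
                return found;
            }
        }

    If the real scan swallows exceptions anywhere, logging the LoaderExceptions from a ReflectionTypeLoadException is usually the quickest way to see which dependency failed to resolve after an app-pool recycle.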

    Read the article

  • How do I use a .NET class in VBA? Syntax help!

    - by Jordan S
    OK, I have a couple of .NET classes that I want to use in VBA. So I must register them through COM and all that. I think I have the COM registration figured out (finally) but now I need help with the syntax of how to create the objects. Here is some pseudo code showing what I am trying to do. EDIT: Changed AttachedObjects to return an ArrayList instead of a List. The .NET classes look like this... public class ResourceManagment { public ResourceManagment() { // Default Constructor } public static ArrayList AttachedObjects() { ArrayList list = new ArrayList(); return list; } } public class RandomObject { // public RandomObject(int someParam) { } } OK, so this is what I would like to do in VBA (demonstrated in C#) but I don't know how... public class VBAClass { public void main() { ArrayList myList = ResourceManagment.AttachedObjects(); foreach(RandomObject x in myList) { // Do something with RandomObject x like list them in a Combobox } } } One thing to note is that RandomObject does not have a public default constructor. So I cannot create an instance of it like Dim x As New RandomObject. MSDN says that you cannot instantiate an object that doesn't have a default constructor through COM but you can still use the object type if it is returned by another method... Types must have a public default constructor to be instantiated through COM. Managed, public types are visible to COM. However, without a public default constructor (a constructor without arguments), COM clients cannot create an instance of the type. COM clients can still use the type if the type is instantiated in another way and the instance is returned to the COM client. You may include overloaded constructors that accept varying arguments for these types. However, constructors that accept arguments may only be called from managed (.NET) code. Added: Here is my attempt in VB: Dim count As Integer count = 0 Dim myObj As New ResourceManagment For Each RandomObject In myObj.AttachedObjects count = count + 1 Next RandomObject
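
    Not part of the original post: a hedged sketch of what the .NET side might look like once the COM attributes are added. Static members are not callable from COM, so AttachedObjects is exposed here as an instance method; ResourceManagment and RandomObject are the names from the question, everything else is illustrative:

        using System.Collections;
        using System.Runtime.InteropServices;

        [ComVisible(true)]
        [ClassInterface(ClassInterfaceType.AutoDual)]
        public class ResourceManagment
        {
            public ResourceManagment() { }      // public default constructor, required for COM creation

            // Instance method (not static) so VBA can call it on a created object.
            public ArrayList AttachedObjects()
            {
                ArrayList list = new ArrayList();
                list.Add(new RandomObject(1));  // created in managed code, then handed to the COM client
                return list;
            }
        }

        [ComVisible(true)]
        [ClassInterface(ClassInterfaceType.AutoDual)]
        public class RandomObject
        {
            public RandomObject(int someParam) { }  // no default constructor: usable from COM, not creatable there
        }

    With that shape, the VB attempt at the end of the question should work largely as written: create the ResourceManagment object with New, call AttachedObjects on the instance, and iterate the returned collection with For Each.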

    Read the article

  • Why can't I use WCF DataContract and ISerializable on the same class?

    - by Dave
    Hi all, I have a class that I need to be able to serialize to a SQL Server session variable and be available over a WCF Service. I have declared it as follows: namespace MyNM { [Serializable] [DataContract(Name = "Foo", Namespace = "http://www.mydomain.co.uk")] public class Foo : IEntity, ISafeCopy<Foo> { [DataMember(Order = 0)] public virtual Guid Id { get; set; } [DataMember(Order = 1)] public virtual string a { get; set; } [DataMember(Order = 2)] public virtual Bar c { get; set; } /* ISafeCopy implementation */ } [Serializable] [DataContract(Name = "Bar", Namespace = "http://www.mydomain.co.uk")] public class Bar : IEntity, ISafeCopy<Bar> { #region Implementation of IEntity [DataMember(Order = 0)] public virtual Guid Id { get; set; } [DataMember(Order = 1)] public virtual Baz y { get; set; } #endregion /* ISafeCopy implementation */ } [Serializable] [DataContract] public enum Baz { [EnumMember(Value = "one")] one, [EnumMember(Value = "two")] two, [EnumMember(Value = "three")] three } But when I try and call this service, I get the following error in the trace log: "System.Runtime.Serialization.InvalidDataContractException: Type 'BarProxybcb100e8617f40ceaa832fe4bb94533c' cannot be ISerializable and have DataContractAttribute attribute." If I take out the Serializable attribute, the WCF service works, but then the object can't be serialized to session. If I remove the DataContract attribute from class Bar, the WCF service fails, saying: Type 'BarProxy3bb05a31167f4ba492909ec941a54533' with data contract name 'BarProxy3bb05a31167f4ba492909ec941a54533:http://schemas.datacontract.org/2004/07/' is not expected. Add any types not known statically to the list of known types - for example, by using the KnownTypeAttribute attribute or by adding them to the list of known types passed to DataContractSerializer. I've tried adding a KnownType attribute to the Foo class: [KnownType(typeof(Bar))] But I still get the same error. Can anyone help me out with this? Many thanks Dave
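
    As a point of comparison (not from the question): [Serializable] and [DataContract] can coexist on a plain class, and such a class round-trips through DataContractSerializer without complaint, which suggests the exception is really about the runtime-generated proxy type named in the message (BarProxy...) rather than about Bar itself. A minimal sketch, with BarDto and Label as made-up names:

        using System;
        using System.IO;
        using System.Runtime.Serialization;

        [Serializable]
        [DataContract(Name = "Bar", Namespace = "http://www.mydomain.co.uk")]
        public class BarDto
        {
            [DataMember(Order = 0)] public Guid Id { get; set; }
            [DataMember(Order = 1)] public string Label { get; set; }
        }

        public static class RoundTrip
        {
            // Serialize and deserialize through DataContractSerializer, the same
            // serializer WCF uses for data contracts.
            public static BarDto Clone(BarDto source)
            {
                DataContractSerializer serializer = new DataContractSerializer(typeof(BarDto));
                using (MemoryStream stream = new MemoryStream())
                {
                    serializer.WriteObject(stream, source);
                    stream.Position = 0;
                    return (BarDto)serializer.ReadObject(stream);
                }
            }
        }

    One common workaround when proxied entities leak into a service contract is to copy them into plain DTOs like the one above before returning them, so the serializer never sees the proxy type.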

    Read the article

  • Implementing BindingList<T>

    - by EtherealMonkey
    I am trying to learn more about BindingList because I believe that it will help me with a project that I am working on. Currently, I have an object class (ScannedImage) that is a subtype of a class (HashedImage) that subtypes a native .NET object (Image). There is no reason why I couldn't move the two subtypes together. I am simply subtyping an object that I had previously constructed, but I will now be storing my ScannedImage object in an RDB (well, not technically - only the details and probably the thumbnail). Also, the object class has member types that are my own custom types (Keywords). I am using a custom DataGridView to present these objects, but am handling all changes to the ScannedImage object with my own code. As you can probably imagine, I have quite a few events to handle that occur in these base types. So, if I changed my object to implement INotifyPropertyChanged, would the object collection (implementing BindingList) receive notifications of changes to the ScannedImage object? Also, if Keywords were to implement INotifyPropertyChanged, would changes be accessible to the BindingList through the ScannedImage object? Sorry if this seems rather newbish. I only recently discovered BindingList and, not having formal training in C# programming, am having a difficult time moving forward with this. Also, if anyone has any good reference material, I would be thankful for links. Obviously, I have perused the MSDN Library. I have found a few good links on the web, but it seems that a lot of people are now using WPF and ObservableCollection. My project is based on WinForms and the .NET 3.5 framework. TIA
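
    To the first question: yes, a BindingList<T> subscribes to INotifyPropertyChanged on its items and re-raises each change as a ListChanged event, which a bound grid picks up. A minimal sketch (ImageItem and Caption are stand-in names, not the actual ScannedImage members):

        using System;
        using System.ComponentModel;

        public class ImageItem : INotifyPropertyChanged
        {
            private string _caption;
            public string Caption
            {
                get { return _caption; }
                set { _caption = value; OnPropertyChanged("Caption"); }
            }

            public event PropertyChangedEventHandler PropertyChanged;
            protected void OnPropertyChanged(string name)
            {
                PropertyChangedEventHandler handler = PropertyChanged;
                if (handler != null) handler(this, new PropertyChangedEventArgs(name));
            }
        }

        class Demo
        {
            static void Main()
            {
                BindingList<ImageItem> list = new BindingList<ImageItem>();
                list.ListChanged += (s, e) =>
                    Console.WriteLine("{0} at index {1}", e.ListChangedType, e.NewIndex);

                ImageItem item = new ImageItem();
                list.Add(item);                  // ListChanged: ItemAdded
                item.Caption = "scan_001.tif";   // ListChanged: ItemChanged, relayed from the item
            }
        }

    To the second question: changes inside a nested Keywords object do not bubble up automatically; ScannedImage would need to listen to its Keywords instance and raise PropertyChanged for its own property when that happens.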

    Read the article

  • Exporting non-public type through public API

    - by feelgood
    What if I have a few factory methods returning a non-public type, and a pairing set of methods that take variables of this non-public type? This results in the warning message in the title in NetBeans. As a result, the public API will contain only the two pairing sets of methods. The reason is to make my type hierarchy sealed (like sealed classes in Scala) and allow users to instantiate these types only through factory methods. So we get a DSL in some sense. For example, a Schedule class represented by constraints on calendar fields. There are some types of constraints - Range, Singleton, List, FullSet - with the NumberSet interface as a root. We don't want to expose these types and how Schedule interacts with them. We just want the specification from the user. So we make NumberSet package-private. In class Schedule we create a few factory methods for constraints: NumberSet singleton(int value); NumberSet range(int from, int to); NumberSet list(NumberSet ... components); and some methods for creating a Schedule object: Schedule everyHour(NumberSet minutes); Schedule everyDay(NumberSet minutes, NumberSet hours); Users can only use them in this manner: Schedule s = Schedule.everyDay( singleton(0), list(range(10, 15), singleton(8)) ); Is it a bad idea?

    Read the article

  • Toolbar hiding on rotated UISplitView DetailView

    - by Gerry
    I've based my app on Apple's SplitView project type. I have a TableView as the Master, and am using different types of views as the Detail view. To select types of detail view, I'm using the fancy concept of buttons on my DetailView toolbar. When the DetailView is derived from UIViewController, everything is good. When the DetailView derives from UIViewController but contains a UITableView, then I have problems. In portrait view the toolbar is visible. In landscape mode the toolbar is hidden, even though the TableView is moved down to allow space for it. The UIToolbar and UITableView are both defined in my NIB file which is loaded to create the detail view. Why is my toolbar invisible in landscape? BTW, is this the best way to choose Detail view types with UISplitView? Bonus question: if selecting a row in my DetailView tableview should bring up another View, I can't push it like I would with a NavigationController, so how do I go back to the detail tableview? Thanks, Gerry

    Read the article

  • Svn import with auto-props & pre-commit hook

    - by James Tisato
    My company's svn repo has a lot of MS Word docs in it. We've implemented a policy that all .doc files must have the svn:needs-lock property set to prevent parallel access on files that are hard to merge (we've also done this for xls, ppt, pdf etc.). We've implemented the policy by distributing an svn config with auto-props set appropriately for all relevant document types. We've also set up a pre-commit hook that checks that all added files of these types have the needs-lock property set (i.e. if they forget/are too lazy to update their svn config file, they won't be able to add any docs to the repo). The problem I'm having, however, is that the pre-commit hook fails when users try to import files into the repo, e.g. some users like to add files directly through TortoiseSVN's Repo Browser, which effectively is an svn import. Through testing on other file types, I have seen that doing an import does in fact apply the auto-props listed in my config, but they don't seem to be applied at the point that the pre-commit hook runs. When importing .doc files, the hook fails, saying that the needs-lock property is missing. Is there really much difference between adding a single file to a working copy and committing it vs importing a file directly? Do we need to tailor our pre-commit hook in some way to cater for this scenario?

    Read the article

  • What is the correct date format for a Date column in YUI DataTable ?

    - by giulio
    I have produced a data table. All the columns are sortable. It has a date in one column which I formatted as dd/mm/yyyy hh:mm:ss. This is different from the default format as defined in the documentation, but I should be able to define my own format for non-American formats. (See below) The DataTable class provides a set of built-in static functions to format certain well-known types of data. In your Column definition, if you set a Column's formatter to YAHOO.widget.DataTable.formatDate, that function will render data of type Date with the default syntax of "MM/DD/YYYY". If you would like to bypass a built-in formatter in favor of your own, you can point a Column's formatter to a custom function that you define. The table is generated from HTML Markup, so the data is held within "" tags. This gives me some more clues about compatible string dates for JavaScript: In general, the RecordSet expects to hold data in native JavaScript types. For instance, a date is expected to be a JavaScript Date instance, not a string like "4/26/2005", in order to sort properly. Converting data types as data comes into your RecordSet is enabled through the parser property in the fields array of your DataSource's responseSchema. I suspect that I'm missing something in the date format. So what is an acceptable string date for JavaScript that the YUI DataTable will recognise, given that I want to format it as "dd/mm/yyyy hh:mm:ss"?

    Read the article

  • Strange ng-model behavior inside ng-repeat

    - by Mike Fisher
    I'm trying to build up a complex post request to run a report in my Angular app. I have a list of inputs all dynamically generated via an ng-repeat; a simple version of my HTML looks like this. <div ng-repeat="filter in lists.filters"> <input type="checkbox" ng-model="report.options.filters[filter.value]['type']" /> <input type="text" ng-model="report.options.filters[filter.value]['values']" /> </div> ng-repeat is looping over this array: [ {name: 'Advertisers', value: 'advertisers'}, {name: 'Sizes', value: 'sizes'}, {name: 'Campaign IDs', value: 'campaigns'}, {name: 'Creative IDs', value: 'creatives'}, {name: 'Publishers', value: 'publishers'}, {name: 'Placement IDs', value: 'placements'}, {name: 'Seller Types', value: 'seller_types'}, {name: 'Impression Types', value: 'impression_types'}, {name: 'Bid Types', value: 'bid_types'}, {name: 'Seller Members', value: 'seller_members'}, {name: 'Buyer Members', value: 'buyer_members'}, {name: 'Insertion Order Ids', value: 'insertion_orders'}, {name: 'Countries', value: 'countries'}, {name: 'Site Ids', value: 'sites'}, {name: 'Sources', value: 'sources'} ]; The JSON I'm sending back needs to be structured like this: "filters": { "state": "all", "campaigns": {"type":"include", "values":[1,2]}, "creatives": {"type":"exclude","values":[1,2]}, "publishers": {"values":[1,2]}, "placements": {"type":"exclude","values":[1,2]}, "advertisers": {"values":[1,2]}, "sizes": {"values":[1,2]}, "countries": {"values":[1,2]}, "insertion_orders": {"values":[1,2]}, "sites": {"values":[1,2]}, "bid_types": {"values":[1,2]}, "seller_types": {"values":[1,2]}, "impression_types": {"values":[1,2]}, "seller_members": {"values":[1,2]}, "buyer_members": {"values":[1,2]}, "sources": {"values":[1,2]} } When I do this, Angular throws the errors 'Cannot set property 'values' of undefined' and 'Cannot set property 'type' of undefined'. Yet if I do this (inside ng-repeat): <input type="text" ng-model="report.options.filters[filter.value]" /> Or this outside of ng-repeat: <input type="text" ng-model="report.options.filters[filter.value]['values']" /> No errors are thrown and everything works fine. I'm positive that filter.value is defined and available on the scope even though Angular thinks it's not for some reason. I'm not quite sure what I'm doing wrong. Any help is much appreciated.

    Read the article

  • Why are all objects in this extension of usercontrol null at runtime?

    - by csciguy
    All, I have a simple class. public class Container : UserControl { public bool IsClickable { get; set; } } I have a class that extends this class. public class ScrollingContainer : Container { public void Draw() { } public void Update() { } } I have a custom class that then extends ScrollingContainer. public partial class MaskContainer : ScrollingContainer { public MaskContainer() { InitializeComponent(); } } XAML <local:ScrollingContainer x:Class="Test.Types.MaskContainer" xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" xmlns:d="http://schemas.microsoft.com/expression/blend/2008" xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006" xmlns:local="clr-namespace:GameObjects;assembly=GameObjects" mc:Ignorable="d" > </local:ScrollingContainer> In my mainpage.xaml, I have the following. <types:MaskContainer x:Name="maskContainer" Canvas.ZIndex="1" Width="Auto" Height="Auto"> <Canvas x:Name="maskCanvas"> <Button x:Name="button1" Content="test button"/> </Canvas> </types:MaskContainer> Why, at runtime, are both maskCanvas and button1 null? maskContainer is not null. The inheritance should be straightforward here. Container inherits UserControl. ScrollingContainer inherits Container. MaskContainer inherits ScrollingContainer. Why am I losing the functionality of the original base class at this level? Is it incorrect to add the element (button1) to the MaskContainer inside of mainpage.xaml? My end goal is to create a container that is reusable, but inherits all properties/methods that I've specified throughout the chain. Any help is appreciated.
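
    A hedged diagnostic sketch (not a fix, and not from the original post): rather than relying on the generated maskCanvas / button1 fields, walk the visual tree under maskContainer at runtime to see whether the Canvas and Button actually made it into the control. If they are there but the fields are null, the problem is the name lookup across the UserControl's namescope rather than the inheritance chain:

        using System.Windows;
        using System.Windows.Media;

        public static class TreeSearch
        {
            // Depth-first search for a FrameworkElement with the given x:Name.
            public static FrameworkElement FindByName(DependencyObject root, string name)
            {
                for (int i = 0; i < VisualTreeHelper.GetChildrenCount(root); i++)
                {
                    DependencyObject child = VisualTreeHelper.GetChild(root, i);
                    FrameworkElement fe = child as FrameworkElement;
                    if (fe != null && fe.Name == name)
                        return fe;
                    FrameworkElement deeper = FindByName(child, name);
                    if (deeper != null)
                        return deeper;
                }
                return null;
            }
        }

    Called from the page's Loaded handler, something like TreeSearch.FindByName(maskContainer, "maskCanvas") will report what the runtime really built under the container.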

    Read the article

  • .NET XML serialization

    - by ShdNx
    I would like to serialize and deserialize mixed data into XML. After some searches I found out that there were two ways of doing it: System.Runtime.Serialization.Formatters.Soap.SoapFormatter and System.Xml.Serialization.XmlSerializer. However, neither turned out to match my requirements, since SoapFormatter doesn't support serialization of generic types, and XmlSerializer refuses to serialize types that implement IDictionary, not to mention that it's far less easy to use than "normal" serialization (e.g. see this SO question). I'm wondering if there exists an implementation that doesn't have these limitations? I have found attempts (for example CustomXmlSerializer and YAXLib as suggested in a related SO question), but they don't seem to work either. I was thinking about writing such a serializer myself (though it certainly doesn't seem to be a very easy task), but then I find myself limited by the CLR, as I cannot create object instances of types that don't have a parameterless constructor, even if I'm using reflection. I recall having read somewhere that implementations in System.Runtime.Serialization somehow bypass the normal object creation mechanism when deserializing objects, though I'm not sure. Any hints on how this could be done? Could somebody please point me in the right direction with this?
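
    One option worth checking before writing a serializer from scratch (not mentioned in the question): System.Runtime.Serialization.DataContractSerializer handles generic types and dictionaries, produces XML, and does not require a public parameterless constructor on data contract types. A minimal round-trip sketch:

        using System;
        using System.Collections.Generic;
        using System.IO;
        using System.Runtime.Serialization;
        using System.Text;
        using System.Xml;

        class Program
        {
            static void Main()
            {
                // A generic dictionary - the case XmlSerializer refuses to handle.
                Dictionary<string, List<int>> data = new Dictionary<string, List<int>>
                {
                    { "primes", new List<int> { 2, 3, 5 } },
                    { "evens",  new List<int> { 2, 4, 6 } }
                };

                DataContractSerializer serializer = new DataContractSerializer(data.GetType());

                StringBuilder sb = new StringBuilder();
                using (XmlWriter writer = XmlWriter.Create(sb))
                    serializer.WriteObject(writer, data);
                Console.WriteLine(sb.ToString());

                using (XmlReader reader = XmlReader.Create(new StringReader(sb.ToString())))
                {
                    var roundTripped = (Dictionary<string, List<int>>)serializer.ReadObject(reader);
                    Console.WriteLine(roundTripped["primes"].Count);   // 3
                }
            }
        }

    DataContractSerializer shipped with .NET 3.0, so it may already be available in the project; whether its XML shape (key/value pair elements for dictionaries) is acceptable depends on the requirements hinted at in the question.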

    Read the article

  • Partial template specialization on a class

    - by Jonathan Swinney
    I'm looking for a better way to do this. I have a chunk of code that needs to handle several different objects that contain different types. The structure that I have looks like this: class Base { // some generic methods }; template <typename T> class TypedBase : public Base { // common code with template specialization private: std::map<int,T> mapContainingSomeDataOfTypeT; }; template <> class TypedBase<std::string> : public Base { // common code with template specialization public: void set( std::string ); // functions not needed for other types std::string get(); private: std::map<int,std::string> mapContainingSomeDataOfTypeT; // some data not needed for other types }; Now I need to add some additional functionality that only applies to one of the derivative classes. Specifically the std::string derivation, but the type doesn't actually matter. The class is big enough that I would prefer not to copy the whole thing simply to specialize a small part of it. I need to add a couple of functions (an accessor and a modifier) and modify the body of several of the other functions. Is there a better way to accomplish this?

    Read the article

< Previous Page | 176 177 178 179 180 181 182 183 184 185 186 187  | Next Page >