Search Results

Search found 10201 results on 409 pages for 'virtual desktops'.


  • Too many connections to SQL Server 2008

    - by Luis Forero
    I have a C# application on .NET Framework 4.0. Like many apps, it connects to a database to get information; in my case the database is SQL Server 2008 Express, running on my own machine. In my data layer I'm using Enterprise Library 5.0. When I publish the app on my local machine (classic app pool, Windows Professional, IIS 7.5) the application works fine. I'm using this query to check the number of connections my application creates while I'm testing it:

        SELECT db_name(dbid) AS DatabaseName, COUNT(dbid) AS NoOfConnections, loginame AS LoginName
        FROM sys.sysprocesses
        WHERE dbid > 0 AND db_name(dbid) = 'MyDataBase'
        GROUP BY dbid, loginame

    When I start testing, the number of connections grows but tops out at 26. I think that's OK, because the app keeps working. When I publish the app to TestMachine1 (an XP Mode virtual machine running Windows XP Professional with IIS 5.1) it also works fine and the behaviour is the same: the connection count climbs to 24 or 26 and then stays there no matter what I do in the application. The problem: when I publish to TestMachine2 (classic app pool, Windows Server 2008 R2, IIS 7.5) and start testing, the connection count grows very rapidly, does not stop at 24 or 26, keeps climbing until it reaches 100, and the application stops working at that point. I have checked the two deployments for differences, especially between Windows Professional and Windows Server, and they seem to have the same parameters and configuration. Any clues as to why this could be happening, or any suggestions?
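
    One likely culprit on the server is connections or data readers that are opened but never disposed, so they are only returned to the pool when the garbage collector happens to run. Below is a minimal sketch of the disposal pattern that keeps the pool healthy; it uses a plain SqlConnection and a placeholder table name rather than the asker's Enterprise Library code, so treat the names as assumptions.

        using System.Data.SqlClient;

        public static class ConnectionSketch
        {
            public static int CountRows(string connectionString)
            {
                // Both the connection and the command are disposed even if an
                // exception is thrown, so the pooled connection is released
                // immediately instead of lingering until a GC pass.
                using (var conn = new SqlConnection(connectionString))
                using (var cmd = new SqlCommand("SELECT COUNT(*) FROM MyTable", conn))
                {
                    conn.Open();
                    return (int)cmd.ExecuteScalar();
                }
            }
        }

    With Enterprise Library the equivalent rule is to wrap every IDataReader returned by Database.ExecuteReader in a using block.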

    Read the article

  • How can I undo what I did when I accidentally booted the Linux host inside itself with VMware?

    - by ThomasGHenry
    Hello, I'm dual-booting XP and Kubuntu. I wanted to boot my existing raw SCSI XP partition inside Kubuntu, not a virtual XP instance, but I accidentally booted Kubuntu inside itself. I knew this was a big mistake, so I interrupted the VM, which saved its state and closed. I rebooted the host and now I can't load the Kubuntu partition at boot time: I get a maintenance shell and the Kubuntu partition is read-only. I am still able to boot XP as usual. I removed the HDD and tried to mount it on another computer as an external drive, and neither partition (XP or Kubuntu) is recognised; it just appears as one device that still mounts and appears empty. From the maintenance shell I can see that all the files are still on the Kubuntu partition. How can I undo what I did when I accidentally booted Kubuntu inside itself? Is it a matter of unlocking some files somewhere? How can I do that on a read-only filesystem? Thanks!
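
    As a hedged first step from the maintenance shell: the root filesystem is normally left read-only there precisely so it can be checked, so running a filesystem check and then remounting read-write may recover the partition. Device names below are examples; substitute the actual Kubuntu root device.

        # from the maintenance shell
        fsck -f /dev/sda6          # replace with the real Kubuntu root partition
        mount -o remount,rw /      # remount the root filesystem read-write

    If VMware left .lck lock directories or a saved-state file next to the raw-disk VM, removing that saved state from the host before touching the VM again is also worth considering.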

    Read the article

  • Increasing link speed on OpenVPN (bandwidth)

    - by Mike
    I have bought a tunnel service that uses OpenVPN. For a year I had a 10 Mbps maximum upload/download speed, but I've now bought an additional 20 Mbps, making my total available bandwidth 30 Mbps. The provider's page offers some controls, for example restarting the tunnel, which I've done. The page also says the speed has indeed been upgraded to 30 Mbps, and I got an email saying the same. However, after I reboot my machine and OpenVPN has started up and is running as usual, the "Networking" tab in Windows Task Manager (opened by pressing CTRL+SHIFT+ESC) still shows a link speed of only 10 Mbps. Two adapters are listed: Local Area Connection 4 (10 Mbps) and Local Area Connection 5 (100 Mbps). LAC5 is my "real" adapter; I have a 100 Mbps Internet connection when I don't use the tunnel. LAC4 is the virtual adapter used by OpenVPN. The problem is that it still shows 10 Mbps even though I have upgraded to 30 Mbps. How can I fix this?

    Read the article

  • SMTP Server setting on Windows 2008 R2

    - by user223298
    I am very new to this and am just trying to configure an SMTP virtual server. I have followed a few threads to get it all running, but the mail is not being delivered. What I have done so far: 1) Installed the SMTP server. 2) SMTP server properties: on the General tab the IP address is set to 'All Unassigned'; on the Access tab authentication is anonymous access and everything else is left at the default settings; on the Delivery tab outbound security is anonymous access, and in the Advanced section I entered the domain name in the FQDN field and localhost in the Smart host field. 3) Created an inbound firewall rule for the SMTP service to allow connections to port 25. When I try to telnet in, everything works up until the point where the mail has to be sent. The sender's domain is different from the receiver's domain; I'm not sure whether settings have to be changed to allow that. I had set relay restrictions on the SMTP server, but because I couldn't send mail I thought I might as well make it work without the relay restrictions first. The error I see while sending the mail is "451 Timeout waiting for client input". I used to get a different error when relay restrictions were on. Can anyone please point me in the right direction? Please let me know if you need more information. Thanks.
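
    For reference, a complete manual SMTP session over telnet looks like the sketch below; the "451 Timeout waiting for client input" response typically appears when the DATA section is never terminated with a line containing only a single dot. Addresses and domains here are placeholders.

        telnet localhost 25
        HELO example.com
        MAIL FROM:<sender@example.com>
        RCPT TO:<someone@otherdomain.com>
        DATA
        Subject: test message

        test body
        .
        QUIT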

    Read the article

  • Windows 7 VM log on lockout

    - by AKa
    My Windows 7 XP Mode virtual machine has just locked me out of the password logon. In two years of use it never required a password on wake-up, and I never asked for that. Suddenly, yesterday, a password was required. I located the password and used it successfully a couple of times, but now even that is not good enough. Perhaps (unfairly) because I tried to get to the bottom of the new phenomenon and removed the password from the user accounts? Permissions are still set correctly for all users. I have been through all the settings I can access while the VM file is hibernated, and have deleted the previously saved logon credentials, which had always worked automatically before. Now when I attempt to log on it asks me for credentials, which seems like progress, but when I supply them and check the "remember my credentials" box, I still get the splash screen "The system could not log you on. [reminder about Caps Lock...]" Round and round. Backup and restore-point versions of the VM toss me back into the same logon loop. There are no other machines on any network; I am the administrator and sole user. It must be something specific to the logon, a speck of corruption... Is there a way around this? I tried creating a new VM, but the black inner console gets stuck at one point requesting that I insert a boot disc. Thanks for any input, AKa

    Read the article

  • Accessing the partitions on an LVM volume

    - by projix
    Suppose you have an LVM volume /dev/vg0/mylv. You have presented it as a virtual disk to a virtualised or emulated guest system. During installation the guest system sees it as /dev/sda, partitions it into /dev/sda{1,2,5,6}, and completes the installation. Now at some point you need to access those filesystems from within the host system, without running the guest. fdisk sees the partitions just fine:

        # fdisk -l /dev/vg0/mylv
        Device Boot          Start        End     Blocks  Id  System
        /dev/vg0/mylv1        2048     684031     340992  83  Linux
        /dev/vg0/mylv2      686078   20969471   10141697   5  Extended
        /dev/vg0/mylv5      686080    8290303    3802112  83  Linux
        /dev/vg0/mylv6     8292352   11980799    1844224  83  Linux

    However, devices such as /dev/vg0/mylv1 do not actually exist. I guess that because they're inside an LV, the OS does not recognise this kind of nesting by default. Is there any way I can prod Linux so that /dev/vg0/mylv1 or an equivalent appears and becomes mountable within the host system? I understand that this is possible with qemu-nbd, and will use that if necessary, but I was hoping for something more direct, rather than simulating a network block device and attaching it.
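
    One direct way to do this, offered as a hedged suggestion, is kpartx (from the kpartx/multipath-tools package), which reads the partition table inside the logical volume and creates device-mapper entries for each partition:

        # create mappings for the partitions inside the LV; depending on the
        # kpartx version they appear as /dev/mapper/vg0-mylv1 or vg0-mylvp1
        kpartx -av /dev/vg0/mylv

        # mount one of them on the host
        mount /dev/mapper/vg0-mylv1 /mnt

        # when finished, unmount and remove the mappings again
        umount /mnt
        kpartx -dv /dev/vg0/mylv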

    Read the article

  • How to provide users with isolated drive letters in Windows 2008 R2 (Terminal Server) [migrated]

    - by Pierre
    I need to be able to host several RDP sessions on a Terminal Server, where users in group A see a drive X: mapped to a given folder on the server and users in group B see the same drive letter X: mapped to another folder. For instance:

        User 1, Group A    X: --> C:\data\A
        User 2, Group A    X: --> C:\data\A
        User 3, Group B    X: --> C:\data\B
        User 4, Group C    X: --> C:\data\C

    Is this possible? If so, how do I configure the virtual drive mapping so that the user has nothing special to do; i.e. I want the letter X: to be available to RemoteApp programs launched by the user, and when the user logs in to the remote desktop. Can I somehow use subst to get this to work? I would like to avoid, if possible, mapping drive letters to local shares (i.e. I don't like the idea of having to go through \\localhost\data-A to reach the user's data).
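
    One common approach, offered here as a hedged sketch rather than a definitive answer, is a per-group logon script (assigned through Group Policy or the account's logon script field) that substitutes X: according to the user's group membership. Domain, group and folder names below are placeholders. Drive substitutions are per-session on a Terminal Server, so different users should be able to get different targets for the same letter.

        @echo off
        rem remove any stale substitution, then map X: by group membership
        subst X: /d >nul 2>&1
        whoami /groups | find /i "MYDOMAIN\GroupA" >nul && subst X: C:\data\A
        whoami /groups | find /i "MYDOMAIN\GroupB" >nul && subst X: C:\data\B
        whoami /groups | find /i "MYDOMAIN\GroupC" >nul && subst X: C:\data\C

    Using subst rather than net use avoids the \\localhost\data-A shares mentioned in the question; whether a substituted drive is visible to RemoteApp programs in a given configuration is worth verifying before relying on it.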

    Read the article

  • Print over the internet from a remote linux session locally (on a Windows 7 machine) to the shared printers?

    - by obeliksz
    I'm trying to use a Linux virtual machine as a file server for Windows clients. I have successfully implemented remote file sharing (Samba + SSH), and I am able to print locally with a little program I made for this purpose (jetforms style), but I would like to hear about a somewhat more direct approach. How can I attach the printers to the server, so that I can, for example, open a file in the remote session and see my local printers (on the machine from which I established the remote session) in the print dialog box? I guess there should be some kind of PuTTY tunnelling, but I don't know how. I have a Windows 7 machine locally; the remote side is a CentOS 6 VM over the internet, running SSH, CUPS and Samba. I found a question that asks the opposite: there is a Windows-based server to connect to from Linux, but that Windows machine is on a domain, whereas mine is just a simple Windows workstation behind NAT with a dynamic IP. That question is: Print from Linux to Windows networked printer.
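
    One hedged way to get this working is a reverse SSH tunnel from the Windows 7 machine to the CentOS server, so that a port on the server forwards back to the local printer even though the workstation is behind NAT. With PuTTY's command-line client plink, for example (host name, user, and the printer's LAN address are placeholders):

        rem run on the Windows 7 machine; server port 9101 is forwarded
        rem back through the tunnel to the local printer's raw port 9100
        plink -N -R 9101:192.168.1.50:9100 user@centos-server.example.com

    On the CentOS side a CUPS queue can then point at the forwarded port, e.g. lpadmin -p winprinter -E -v socket://localhost:9101, assuming the local printer accepts raw port-9100 (JetDirect-style) printing; a printer shared only through Windows would need the tunnel to target port 445 and an smb:// CUPS URI instead.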

    Read the article

  • Determining how all memory is used in Windows Server 2008

    - by Mojah
    I have a Windows Server 2008 system with 12 GB of RAM. If I list all processes in Task Manager and sum the memory of each process (Working Set, Memory (Private Working Set), Commit Size, ...), I never reach more than 4-5 GB that should be "in use". However, Task Manager's Performance tab reports that this server has 11 GB in use. I'm failing to determine where all that used RAM is going. It doesn't seem to be the system cache, but I can't be sure. It might be a memory leak in one of the applications, but I'm struggling to find out which one. The server's memory keeps filling up and eventually forces us to reboot the machine to clear it. I've been reading up on how RAM assignment works on Windows Server: "RAM, Virtual Memory, Pagefile and all that stuff" (http://support.microsoft.com/kb/2267427), "What's the best way to measure?" (http://www.zdnet.com/blog/bott/windows-7-memory-usage-whats-the-best-way-to-measure/1786), and "Configure the file system cache in Windows" (http://smallvoid.com/article/winnt-system-cache.html). But I fear I'm stuck for ideas at the moment.
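
    Task Manager does not attribute kernel pool, driver, or mapped-file memory to any process, so summing process working sets will rarely match the Performance tab's in-use figure. Two quick checks, offered as a hedged starting point (Sysinternals RAMMap gives the most complete breakdown of where the remaining gigabytes sit):

        # PowerShell: total of all process working sets, in GB
        (Get-Process | Measure-Object WorkingSet -Sum).Sum / 1GB

        # poolmon (from the Windows Driver Kit / Debugging Tools), sorted by bytes;
        # a pool tag that keeps growing usually points at a leaking driver
        poolmon -b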

    Read the article

  • CentOS 6 and Sun/Oracle Java Issue

    - by user1710563
    I have an OpenVZ VPS running CentOS 6.3 64-bit, and when I try to install the 64-bit JRE 7 using the command rpm -Uvh java.rpm it gives me this error:

        Preparing...    ########################################### [100%]
           1:jre        ########################################### [100%]
        Unpacking JAR files...
                rt.jar...
        Error: Could not open input file: /usr/java/jre1.7.0_09/lib/rt.pack
                jsse.jar...
        Error: Could not open input file: /usr/java/jre1.7.0_09/lib/jsse.pack
                charsets.jar...
        Error: Could not open input file: /usr/java/jre1.7.0_09/lib/charsets.pack
                localedata.jar...
        Error: Could not open input file: /usr/java/jre1.7.0_09/lib/ext/localedata.pack

    I then tried the command java -version and it gives me this error:

        Error occurred during initialization of VM
        Could not reserve enough space for object heap
        Error: Could not create the Java Virtual Machine.
        Error: A fatal exception has occurred. Program will exit.

    Why does this happen if I have more than enough RAM on the VPS (1 GB) to run this? Could it be an issue with the host node of the VPS? Thanks. EDIT 1: link to beancounters screenshot: http://puu.sh/1xwxB EDIT 2: link to htop screenshot: http://puu.sh/1xwDl
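
    "Could not reserve enough space for object heap" on OpenVZ usually means the heap and virtual address space the JVM asks for by default exceed the container's memory limits (privvmpages and friends) rather than the physical RAM figure. As a hedged test, forcing a small heap shows whether those limits are the problem:

        # ask for a deliberately small heap; if this works, the default heap
        # sizing was colliding with the OpenVZ memory limits
        java -Xms16m -Xmx64m -version

        # non-zero failcnt values here (e.g. on privvmpages) confirm the
        # container limit is being hit
        cat /proc/user_beancounters

    The rt.pack errors during the install may be a related symptom, since the installer's unpack200 step also needs memory to unpack the bundled JAR files.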

    Read the article

  • Apache mod_disk_cache on a separate drive

    - by pavs
    How can I set up Apache mod_disk_cache on a separate drive from the one where the OS/Apache is installed? I have set this up in my apache2.conf:

        <IfModule mod_cache_disk.c>
            # cache cleaning is done by htcacheclean, which can be configured in
            # /etc/default/apache2
            #
            # For further information, see the comments in that file,
            # /usr/share/doc/apache2/README.Debian, and the htcacheclean(8) man page.
            # This path must be the same as the one in /etc/default/apache2
            CacheRoot /media/cacheHD

            # This will also cache local documents. It usually makes more sense to
            # put this into the configuration for just one virtual host.
            CacheEnable disk /

            # The result of CacheDirLevels * CacheDirLength must not be higher than 20.
            # Moreover, pay attention to file system limits. Some file systems do not
            # support more than a certain number of inodes and subdirectories
            # (e.g. 32000 for ext3)
            CacheDirLevels 2
            CacheDirLength 1
        </IfModule>

    ...and it doesn't seem to be caching anything at all. The drive itself is a freshly installed SSD formatted with ext4.
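
    A hedged first thing to check: on Apache 2.2 (the version shipped with most Debian/Ubuntu releases of that era) the disk cache module is called mod_disk_cache, not mod_cache_disk, so the <IfModule mod_cache_disk.c> test above would never match and the whole block would be silently skipped. Both the front-end mod_cache and the disk backend also have to be enabled, and the cache directory must be writable by the Apache user:

        # Apache 2.2: the module is disk_cache (on 2.4 use "cache_disk" instead)
        a2enmod cache disk_cache
        apache2ctl -M | grep -i cache    # confirm the cache modules are loaded

        # the cache root must be writable by the Apache user
        chown -R www-data:www-data /media/cacheHD
        service apache2 restart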

    Read the article

  • Computers on network crashing

    - by Phil Cross
    We recently upgraded our network to Windows 7 clients with Windows Server 2008 servers. The upgrade was completed by the end of September and until now had been fine (apart from minor bugs). Recently (within the last two weeks) we've noticed all computers on the network (around 1000) slow down to the point of being unusable. It starts at about 08:45 and finishes at 09:15, every day, so we think something may be broadcasting across the network. I can't use my computer at all at the peak of the slowdown, yet looking at Task Manager's performance graphs, physical memory hovers around 35% and CPU usage is at 0-10% (idle) while the machine still grinds to a halt. I've looked at the DHCP server log and can't see anything that stands out. The only change we made prior to the slowdown was installing Adobe CS6 on some computers; however, the slowdown also affects computers without CS6. We have two physical servers, each running around 5-7 virtual machines with ample memory. Does anyone have any suggestions as to how we can narrow down what is causing the slowdowns? Any help, suggestions or advice would be appreciated.

    Read the article

  • Renaming VLAN Interfaces in Linux

    - by rhololkeolke
    I need to know how to rename VLAN interfaces. I'm currently running Ubuntu 11.04, with a networking application that takes frames in on one interface, applies things like delays and errors, and then forwards the frames out another interface. The default naming convention, which names interfaces <interface>.<vlan> (e.g. eth0.2), will not work for my purposes because the program that parses the configuration script for the networking application doesn't like the decimal point in the interface name. I ran vconfig set_name_type VLAN_PLUS_VID, which solves that problem; however, I can then no longer assign the same VLAN ID to multiple interfaces, because they end up with the same name. I know how to change physical interface names using udev rules, but because the VLANs share the physical interface's MAC address and aren't physical interfaces, I can't use those rules to rename them. Is there a way to rename any interface in Linux, including virtual ones? Is there a way to specify my own naming convention for the vconfig set_name_type option without having to recompile vconfig from source?
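
    On a reasonably recent iproute2 (Ubuntu 11.04 should qualify), VLAN interfaces can be created with an arbitrary name directly, which sidesteps vconfig's fixed naming schemes entirely; a hedged sketch with placeholder names:

        # create VLAN 2 on two different parents, each with its own free-form name
        ip link add link eth0 name lanA2 type vlan id 2
        ip link add link eth1 name lanB2 type vlan id 2
        ip link set lanA2 up
        ip link set lanB2 up

        # an existing VLAN interface can also be renamed while it is down
        ip link set eth0.2 down
        ip link set eth0.2 name lanA2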

    Read the article

  • Copy/paste speed very slow for a large number of files on Windows [closed]

    - by Arno2501
    I've run the following test. I created a folder containing 15,000 files of 400 bytes each using this batch script:

        @ECHO off
        SET times=15000
        FOR /L %%i IN (1,1,%times%) DO (
            fsutil file createnew filename%%i.txt 400
        )

    Then I copied it on my Windows computer using this command:

        robocopy LargeNumberOfFiles\ LargeNumberOfFiles2\

    After it completed I could see that the transfer rate was 915,810 bytes/sec, which is less than 1 MB/s; it took several seconds to copy 7 MB. Please note that this is very slow. I tried the same with a folder containing a single 50 MB file, and the transfer rate was 1,219,512,195 bytes/sec (GB/s) - effectively instantaneous. Why does copying a large number of files take so much time and so many resources on a Windows filesystem? Note that I have tried the same thing on a Linux system running on the same computer in a virtual machine (VMware Player) with an ext3 filesystem; using the cp command, the copy is instantaneous. Please also note the following: there is no antivirus; I have tested this behaviour on multiple Windows computers (always NTFS) and always get comparable results (transfer rate under 1 MB/s, on average 7-8 seconds to copy 7 MB); and I have tested on multiple Linux ext3 systems, where the copy is always instantaneous for that amount (15,000 files of 400 bytes). The question is about understanding what makes the Windows filesystem so slow at copying a large number of files compared to, for instance, a Linux one.
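
    Much of the gap is per-file overhead: each small file costs NTFS metadata updates and a synchronous round trip, and robocopy copies files one at a time by default. As a hedged mitigation to measure against, the Windows 7 / Server 2008 R2 version of robocopy can overlap that per-file latency with multithreaded copies:

        rem copy with 32 parallel file-copy threads and minimal console logging
        robocopy LargeNumberOfFiles\ LargeNumberOfFiles2\ /MT:32 /NFL /NDL /NP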

    Read the article

  • GlassFish 3 can't update the JDK no matter what

    - by Parhs
    Hello.. I was using 1.6.0_19 jdk and installed 1.6.0_20 jdk.. Glassfish doesnt like that... Here are my windows environment variables.. ALLUSERSPROFILE=C:\ProgramData ANT_HOME=C:\apache-ant-1.8.1\ APPDATA=C:\Users\Parhs\AppData\Roaming CommonProgramFiles=C:\Program Files\Common Files COMPUTERNAME=PARHS-PC ComSpec=C:\Windows\system32\cmd.exe FP_NO_HOST_CHECK=NO HOMEDRIVE=C: HOMEPATH=\Users\Parhs JAVA_HOME=C:\Program Files\Java\jdk1.6.0_20\ LOCALAPPDATA=C:\Users\Parhs\AppData\Local LOGONSERVER=\\PARHS-PC NUMBER_OF_PROCESSORS=2 OS=Windows_NT Path=C:\Program Files\PHP\;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wb em;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Program Files\Toshiba\Bluetoot h Toshiba Stack\sys\;C:\Program Files\Microsoft SQL Server\90\Tools\binn\;C:\apa che-ant-1.8.1\bin PATHEXT=.COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH;.MSC PHPRC=C:\Program Files\PHP\php.ini PROCESSOR_ARCHITECTURE=x86 PROCESSOR_IDENTIFIER=x86 Family 6 Model 14 Stepping 8, GenuineIntel PROCESSOR_LEVEL=6 PROCESSOR_REVISION=0e08 ProgramData=C:\ProgramData ProgramFiles=C:\Program Files PROMPT=$P$G PSModulePath=C:\Windows\system32\WindowsPowerShell\v1.0\Modules\ PUBLIC=C:\Users\Public SESSIONNAME=Console SystemDrive=C: SystemRoot=C:\Windows TEMP=C:\Users\Parhs\AppData\Local\Temp TMP=C:\Users\Parhs\AppData\Local\Temp USERDOMAIN=Parhs-PC USERNAME=Parhs USERPROFILE=C:\Users\Parhs VS90COMNTOOLS=C:\Program Files\Microsoft Visual Studio 9.0\Common7\Tools\ windir=C:\Windows Also here is my asenv.bat REM DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS HEADER. REM REM Copyright 2004-2009 Sun Microsystems, Inc. All rights reserved. REM REM Use is subject to License Terms REM set AS_IMQ_LIB=....\mq\lib set AS_IMQ_BIN=....\mq\bin set AS_CONFIG=..\config set AS_INSTALL=.. set AS_DEF_DOMAINS_PATH=..\domains set AS_DERBY_INSTALL=....\javadb set AS_JAVA="C:\Program Files\Java\jdk1.6.0_20" And although restarting system and server i am getting this report Operating System Information: Name of the Operating System: Windows 7 Binary Architecture name of the Operating System: x86, Version: 6.1 Number of processors available on the Operating System: 2 System load on the available processors for the last minute: -1.0. (Sum of running and queued runnable entities per minute) General Java Runtime Environment Information for the VM: 6152@Parhs-PC JRE BootClassPath: C:\glassfishv3\glassfish/modules/endorsed\javax.annotation.jar;C:\glassfishv3\glassfish/modules/endorsed\jaxb-api-osgi.jar;C:\glassfishv3\glassfish/modules/endorsed\webservices-api-osgi.jar;C:\Program Files\Java\jdk1.6.0_19\jre\lib\resources.jar;C:\Program Files\Java\jdk1.6.0_19\jre\lib\rt.jar;C:\Program Files\Java\jdk1.6.0_19\jre\lib\sunrsasign.jar;C:\Program Files\Java\jdk1.6.0_19\jre\lib\jsse.jar;C:\Program Files\Java\jdk1.6.0_19\jre\lib\jce.jar;C:\Program Files\Java\jdk1.6.0_19\jre\lib\charsets.jar;C:\Program Files\Java\jdk1.6.0_19\jre\classes;C:\glassfishv3\glassfish\lib\monitor\btrace-boot.jar JRE ClassPath: C:\glassfishv3\glassfish\modules\glassfish.jar;C:\glassfishv3\glassfish\lib\monitor\btrace-agent.jar JRE Native Library Path: C:\Program Files\Java\jdk1.6.0_19\bin;.;C:\Windows\Sun\Java\bin;C:\Windows\system32;C:\Windows;C:\Program Files\PHP\;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Program Files\Toshiba\Bluetooth Toshiba Stack\sys\;C:\Program Files\Microsoft SQL Server\90\Tools\binn\;C:\apache-ant-1.8.1\bin JRE name: Java HotSpot(TM) Client VM Vendor: Sun Microsystems Inc. 
Version: 16.2-b04 List of System Properties for the Java Virtual Machine: ANTLR_USE_DIRECT_CLASS_LOADING = true AS_CONFIG = C:\glassfishv3\glassfish\config\..\config AS_DEF_DOMAINS_PATH = C:\glassfishv3\glassfish\config\..\domains AS_DERBY_INSTALL = C:\glassfishv3\glassfish\config\..\..\javadb AS_IMQ_BIN = C:\glassfishv3\glassfish\config\..\..\mq\bin AS_IMQ_LIB = C:\glassfishv3\glassfish\config\..\..\mq\lib AS_INSTALL = C:\glassfishv3\glassfish\config\.. AS_JAVA = C:\Program Files\Java\jdk1.6.0_20\jre GlassFish_Platform = Felix awt.toolkit = sun.awt.windows.WToolkit catalina.base = C:\glassfishv3\glassfish\domains\domain1 catalina.home = C:\glassfishv3\glassfish\domains\domain1 catalina.useNaming = false com.sun.aas.configRoot = C:\glassfishv3\glassfish\config com.sun.aas.derbyRoot = C:\glassfishv3\javadb com.sun.aas.domainsRoot = C:\glassfishv3\glassfish\domains com.sun.aas.hostName = Parhs-PC com.sun.aas.imqBin = C:\glassfishv3\mq\bin com.sun.aas.imqLib = C:\glassfishv3\mq\lib com.sun.aas.installRoot = C:\glassfishv3\glassfish com.sun.aas.installRootURI = file:/C:/glassfishv3/glassfish/ com.sun.aas.instanceName = server com.sun.aas.instanceRoot = C:\glassfishv3\glassfish\domains\domain1 com.sun.aas.instanceRootURI = file:/C:/glassfishv3/glassfish/domains/domain1/ com.sun.aas.javaRoot = C:\Program Files\Java\jdk1.6.0_19\jre com.sun.enterprise.config.config_environment_factory_class = com.sun.enterprise.config.serverbeans.AppserverConfigEnvironmentFactory com.sun.enterprise.hk2.cacheDir = C:\glassfishv3\glassfish\domains\domain1\osgi-cache\felix com.sun.enterprise.jaccprovider.property.repository = C:\glassfishv3\glassfish\domains\domain1/generated/policy com.sun.enterprise.security.httpsOutboundKeyAlias = s1as common.loader = ${catalina.home}/common/classes,${catalina.home}/common/endorsed/*.jar,${catalina.home}/common/lib/*.jar eclipselink.security.usedoprivileged = true ejb.home = C:\glassfishv3\glassfish\modules\ejb felix.config.properties = file:/C:/glassfishv3/glassfish/osgi/felix/conf/config.properties felix.fileinstall.bundles.new.start = true felix.fileinstall.debug = 1 felix.fileinstall.dir = C:\glassfishv3\glassfish/modules/autostart/ felix.fileinstall.poll = 5000 felix.system.properties = file:/C:/glassfishv3/glassfish/osgi/felix/conf/system.properties file.encoding = Cp1253 file.encoding.pkg = sun.io file.separator = \ glassfish.version = GlassFish v3 (build 74.2) hk2.startup.context.args = #Mon Jun 07 20:27:37 EEST 2010 -startup-classpath=C\:\\glassfishv3\\glassfish\\modules\\glassfish.jar;C\:\\glassfishv3\\glassfish\\lib\\monitor\\btrace-agent.jar __time_zero=1275931657334 hk2.startup.context.mainModule=org.glassfish.core.kernel -startup-args=--domain,,,domain1,,,--domaindir,,,C\:\\glassfishv3\\glassfish\\domains\\domain1 --domain=domain1 -startup-classname=com.sun.enterprise.glassfish.bootstrap.ASMain --domaindir=C\:\\glassfishv3\\glassfish\\domains\\domain1 hk2.startup.context.root = C:\glassfishv3\glassfish\modules http.nonProxyHosts = localhost|127.0.0.1|Parhs-PC java.awt.graphicsenv = sun.awt.Win32GraphicsEnvironment java.awt.printerjob = sun.awt.windows.WPrinterJob java.class.path = C:\glassfishv3\glassfish\modules\glassfish.jar;C:\glassfishv3\glassfish\lib\monitor\btrace-agent.jar java.class.version = 50.0 java.endorsed.dirs = C:\glassfishv3\glassfish/modules/endorsed;C:\glassfishv3\glassfish/lib/endorsed java.ext.dirs = C:\Program Files\Java\jdk1.6.0_19\jre/lib/ext;C:\Program Files\Java\jdk1.6.0_19\jre/jre/lib/ext;C:\glassfishv3\glassfish\domains\domain1/lib/ext 
java.home = C:\Program Files\Java\jdk1.6.0_19\jre java.io.tmpdir = C:\Users\Parhs\AppData\Local\Temp\ java.library.path = C:\Program Files\Java\jdk1.6.0_19\bin;.;C:\Windows\Sun\Java\bin;C:\Windows\system32;C:\Windows;C:\Program Files\PHP\;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Program Files\Toshiba\Bluetooth Toshiba Stack\sys\;C:\Program Files\Microsoft SQL Server\90\Tools\binn\;C:\apache-ant-1.8.1\bin java.net.useSystemProxies = true java.rmi.server.randomIDs = true java.runtime.name = Java(TM) SE Runtime Environment java.runtime.version = 1.6.0_19-b04 java.security.auth.login.config = C:\glassfishv3\glassfish\domains\domain1/config/login.conf java.security.policy = C:\glassfishv3\glassfish\domains\domain1/config/server.policy java.specification.name = Java Platform API Specification java.specification.vendor = Sun Microsystems Inc. java.specification.version = 1.6 java.util.logging.config.file = C:\glassfishv3\glassfish\domains\domain1\config\logging.properties java.vendor = Sun Microsystems Inc. java.vendor.url = http://java.sun.com/ java.vendor.url.bug = http://java.sun.com/cgi-bin/bugreport.cgi java.version = 1.6.0_19 java.vm.info = mixed mode java.vm.name = Java HotSpot(TM) Client VM java.vm.specification.name = Java Virtual Machine Specification java.vm.specification.vendor = Sun Microsystems Inc. java.vm.specification.version = 1.0 java.vm.vendor = Sun Microsystems Inc. java.vm.version = 16.2-b04 javax.net.ssl.keyStore = C:\glassfishv3\glassfish\domains\domain1/config/keystore.jks javax.net.ssl.keyStorePassword = changeit javax.net.ssl.trustStore = C:\glassfishv3\glassfish\domains\domain1/config/cacerts.jks javax.net.ssl.trustStorePassword = changeit javax.rmi.CORBA.PortableRemoteObjectClass = com.sun.corba.ee.impl.javax.rmi.PortableRemoteObject javax.rmi.CORBA.StubClass = com.sun.corba.ee.impl.javax.rmi.CORBA.StubDelegateImpl javax.rmi.CORBA.UtilClass = com.sun.corba.ee.impl.javax.rmi.CORBA.Util javax.security.jacc.PolicyConfigurationFactory.provider = com.sun.enterprise.security.provider.PolicyConfigurationFactoryImpl jdbc.drivers = org.apache.derby.jdbc.ClientDriver jpa.home = C:\glassfishv3\glassfish\modules\jpa line.separator = org.glassfish.web.rfc2109_cookie_names_enforced = false org.jvnet.hk2.osgimain.autostartBundles = osgi-adapter.jar, org.apache.felix.shell.jar, org.apache.felix.shell.remote.jar, org.apache.felix.configadmin.jar, org.apache.felix.fileinstall.jar org.jvnet.hk2.osgimain.bundlesDir = C:\glassfishv3\glassfish\modules org.jvnet.hk2.osgimain.excludedSubDirs = autostart/ org.omg.CORBA.ORBClass = com.sun.corba.ee.impl.orb.ORBImpl org.omg.CORBA.ORBSingletonClass = com.sun.corba.ee.impl.orb.ORBSingleton org.osgi.framework.storage = C:\glassfishv3\glassfish\domains\domain1\osgi-cache\felix os.arch = x86 os.name = Windows 7 os.version = 6.1 osgi.shell.telnet.ip = 127.0.0.1 osgi.shell.telnet.maxconn = 1 osgi.shell.telnet.port = 6666 package.access = package.definition = path.separator = ; security.home = C:\glassfishv3\glassfish\modules\security server.loader = ${catalina.home}/server/classes,${catalina.home}/server/lib/*.jar shared.loader = ${catalina.home}/shared/classes,${catalina.home}/shared/lib/*.jar sun.arch.data.model = 32 sun.boot.class.path = C:\glassfishv3\glassfish/modules/endorsed\javax.annotation.jar;C:\glassfishv3\glassfish/modules/endorsed\jaxb-api-osgi.jar;C:\glassfishv3\glassfish/modules/endorsed\webservices-api-osgi.jar;C:\Program 
Files\Java\jdk1.6.0_19\jre\lib\resources.jar;C:\Program Files\Java\jdk1.6.0_19\jre\lib\rt.jar;C:\Program Files\Java\jdk1.6.0_19\jre\lib\sunrsasign.jar;C:\Program Files\Java\jdk1.6.0_19\jre\lib\jsse.jar;C:\Program Files\Java\jdk1.6.0_19\jre\lib\jce.jar;C:\Program Files\Java\jdk1.6.0_19\jre\lib\charsets.jar;C:\Program Files\Java\jdk1.6.0_19\jre\classes;C:\glassfishv3\glassfish\lib\monitor\btrace-boot.jar sun.boot.library.path = C:\Program Files\Java\jdk1.6.0_19\jre\bin sun.cpu.endian = little sun.cpu.isalist = pentium_pro+mmx pentium_pro pentium+mmx pentium i486 i386 i86 sun.desktop = windows sun.io.unicode.encoding = UnicodeLittle sun.java.launcher = SUN_STANDARD sun.jnu.encoding = Cp1253 sun.management.compiler = HotSpot Client Compiler sun.os.patch.level = user.country = GR user.dir = C:\glassfishv3\glassfish\domains\domain1 user.home = C:\Users\Parhs user.language = el user.name = Parhs user.timezone = Europe/Athens user.variant = web.home = C:\glassfishv3\glassfish\modules\web weld.home = C:\glassfishv3\glassfish\modules\weld Why it is so damn hard??? What am i missing?
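
    One hedged observation from the dump above: the domain is still running on jdk1.6.0_19 (java.home and the boot class path) even though asenv.bat points AS_JAVA at 1.6.0_20, and the AS_JAVA line is quoted. GlassFish's Windows scripts build the launcher path by appending \bin\java to that value, so embedded quotes can break the lookup and let whatever java is found elsewhere win. Something like the following, with no quotes and no trailing \jre, is worth trying, followed by a full stop and start of the domain:

        rem in glassfish\config\asenv.bat
        set AS_JAVA=C:\Program Files\Java\jdk1.6.0_20

    If the domain still reports the old path afterwards, clearing the OSGi bundle cache at glassfish\domains\domain1\osgi-cache (it is rebuilt on the next start) is a commonly suggested companion step.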

    Read the article

  • Reflect.Emit Dynamic Type Memory Blowup

    - by Firestrand
    Using C# 3.5 I am trying to generate dynamic types at runtime using reflection emit. I used the Dynamic Query Library sample from Microsoft to create a class generator. Everything works, my problem is that 100 generated types inflate the memory usage by approximately 25MB. This is a completely unacceptable memory profile as eventually I want to support having several hundred thousand types generated in memory. Memory profiling shows that the memory is apparently being held by various System.Reflection.Emit types and methods though I can't figure out why. I haven't found others talking about this problem so I am hoping someone in this community either knows what I am doing wrong or if this is expected behavior. Contrived Example below: using System; using System.Collections.Generic; using System.Text; using System.Reflection; using System.Reflection.Emit; namespace SmallRelfectExample { class Program { static void Main(string[] args) { int typeCount = 100; int propCount = 100; Random rand = new Random(); Type dynType = null; for (int i = 0; i < typeCount; i++) { List<DynamicProperty> dpl = new List<DynamicProperty>(propCount); for (int j = 0; j < propCount; j++) { dpl.Add(new DynamicProperty("Key" + rand.Next().ToString(), typeof(String))); } SlimClassFactory scf = new SlimClassFactory(); dynType = scf.CreateDynamicClass(dpl.ToArray(), i); //Optionally do something with the type here } Console.WriteLine("SmallRelfectExample: {0} Types generated.", typeCount); Console.ReadLine(); } } public class SlimClassFactory { private readonly ModuleBuilder module; public SlimClassFactory() { AssemblyName name = new AssemblyName("DynamicClasses"); AssemblyBuilder assembly = AppDomain.CurrentDomain.DefineDynamicAssembly(name, AssemblyBuilderAccess.Run); module = assembly.DefineDynamicModule("Module"); } public Type CreateDynamicClass(DynamicProperty[] properties, int Id) { string typeName = "DynamicClass" + Id.ToString(); TypeBuilder tb = module.DefineType(typeName, TypeAttributes.Class | TypeAttributes.Public, typeof(DynamicClass)); FieldInfo[] fields = GenerateProperties(tb, properties); GenerateEquals(tb, fields); GenerateGetHashCode(tb, fields); Type result = tb.CreateType(); return result; } static FieldInfo[] GenerateProperties(TypeBuilder tb, DynamicProperty[] properties) { FieldInfo[] fields = new FieldBuilder[properties.Length]; for (int i = 0; i < properties.Length; i++) { DynamicProperty dp = properties[i]; FieldBuilder fb = tb.DefineField("_" + dp.Name, dp.Type, FieldAttributes.Private); PropertyBuilder pb = tb.DefineProperty(dp.Name, PropertyAttributes.HasDefault, dp.Type, null); MethodBuilder mbGet = tb.DefineMethod("get_" + dp.Name, MethodAttributes.Public | MethodAttributes.SpecialName | MethodAttributes.HideBySig, dp.Type, Type.EmptyTypes); ILGenerator genGet = mbGet.GetILGenerator(); genGet.Emit(OpCodes.Ldarg_0); genGet.Emit(OpCodes.Ldfld, fb); genGet.Emit(OpCodes.Ret); MethodBuilder mbSet = tb.DefineMethod("set_" + dp.Name, MethodAttributes.Public | MethodAttributes.SpecialName | MethodAttributes.HideBySig, null, new Type[] { dp.Type }); ILGenerator genSet = mbSet.GetILGenerator(); genSet.Emit(OpCodes.Ldarg_0); genSet.Emit(OpCodes.Ldarg_1); genSet.Emit(OpCodes.Stfld, fb); genSet.Emit(OpCodes.Ret); pb.SetGetMethod(mbGet); pb.SetSetMethod(mbSet); fields[i] = fb; } return fields; } static void GenerateEquals(TypeBuilder tb, FieldInfo[] fields) { MethodBuilder mb = tb.DefineMethod("Equals", MethodAttributes.Public | MethodAttributes.ReuseSlot | MethodAttributes.Virtual | 
MethodAttributes.HideBySig, typeof(bool), new Type[] { typeof(object) }); ILGenerator gen = mb.GetILGenerator(); LocalBuilder other = gen.DeclareLocal(tb); Label next = gen.DefineLabel(); gen.Emit(OpCodes.Ldarg_1); gen.Emit(OpCodes.Isinst, tb); gen.Emit(OpCodes.Stloc, other); gen.Emit(OpCodes.Ldloc, other); gen.Emit(OpCodes.Brtrue_S, next); gen.Emit(OpCodes.Ldc_I4_0); gen.Emit(OpCodes.Ret); gen.MarkLabel(next); foreach (FieldInfo field in fields) { Type ft = field.FieldType; Type ct = typeof(EqualityComparer<>).MakeGenericType(ft); next = gen.DefineLabel(); gen.EmitCall(OpCodes.Call, ct.GetMethod("get_Default"), null); gen.Emit(OpCodes.Ldarg_0); gen.Emit(OpCodes.Ldfld, field); gen.Emit(OpCodes.Ldloc, other); gen.Emit(OpCodes.Ldfld, field); gen.EmitCall(OpCodes.Callvirt, ct.GetMethod("Equals", new Type[] { ft, ft }), null); gen.Emit(OpCodes.Brtrue_S, next); gen.Emit(OpCodes.Ldc_I4_0); gen.Emit(OpCodes.Ret); gen.MarkLabel(next); } gen.Emit(OpCodes.Ldc_I4_1); gen.Emit(OpCodes.Ret); } static void GenerateGetHashCode(TypeBuilder tb, FieldInfo[] fields) { MethodBuilder mb = tb.DefineMethod("GetHashCode", MethodAttributes.Public | MethodAttributes.ReuseSlot | MethodAttributes.Virtual | MethodAttributes.HideBySig, typeof(int), Type.EmptyTypes); ILGenerator gen = mb.GetILGenerator(); gen.Emit(OpCodes.Ldc_I4_0); foreach (FieldInfo field in fields) { Type ft = field.FieldType; Type ct = typeof(EqualityComparer<>).MakeGenericType(ft); gen.EmitCall(OpCodes.Call, ct.GetMethod("get_Default"), null); gen.Emit(OpCodes.Ldarg_0); gen.Emit(OpCodes.Ldfld, field); gen.EmitCall(OpCodes.Callvirt, ct.GetMethod("GetHashCode", new Type[] { ft }), null); gen.Emit(OpCodes.Xor); } gen.Emit(OpCodes.Ret); } } public abstract class DynamicClass { public override string ToString() { PropertyInfo[] props = GetType().GetProperties(BindingFlags.Instance | BindingFlags.Public); StringBuilder sb = new StringBuilder(); sb.Append("{"); for (int i = 0; i < props.Length; i++) { if (i > 0) sb.Append(", "); sb.Append(props[i].Name); sb.Append("="); sb.Append(props[i].GetValue(this, null)); } sb.Append("}"); return sb.ToString(); } } public class DynamicProperty { private readonly string name; private readonly Type type; public DynamicProperty(string name, Type type) { if (name == null) throw new ArgumentNullException("name"); if (type == null) throw new ArgumentNullException("type"); this.name = name; this.type = type; } public string Name { get { return name; } } public Type Type { get { return type; } } } }
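
    For what it's worth, two hedged directions. First, the sample creates a new AssemblyBuilder and ModuleBuilder per generated class (in SlimClassFactory's constructor), and each dynamic assembly carries its own bookkeeping, so reusing a single ModuleBuilder for all types should shrink the per-type overhead considerably. Second, under .NET 3.5 a Run dynamic assembly can never be unloaded short of unloading the whole AppDomain; .NET 4 added collectible assemblies, sketched below, whose emitted types can be reclaimed by the GC once nothing references them:

        // .NET 4 or later only: a collectible dynamic assembly
        AssemblyBuilder assembly = AppDomain.CurrentDomain.DefineDynamicAssembly(
            new AssemblyName("DynamicClasses"),
            AssemblyBuilderAccess.RunAndCollect);
        ModuleBuilder module = assembly.DefineDynamicModule("Module");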

    Read the article

  • Managing the layout of Canvas3D objects in a Java MainFrame

    - by John N
    Hi, Im trying to organise the layout of four canvas3d objects in a single MainFrame. Iv tried using some layout managers but none are working (or im doing it wrong). Can anyone give me advice or point me to a way to get this to display the four canvas's as a grid of four? Thanks, John public class Main { public static void Main(){ Window win = new Window(); } } import javax.media.j3d.BranchGroup; import javax.media.j3d.Canvas3D; import javax.media.j3d.Locale; import javax.media.j3d.PhysicalBody; import javax.media.j3d.PhysicalEnvironment; import javax.media.j3d.Transform3D; import javax.media.j3d.TransformGroup; import javax.media.j3d.View; import javax.media.j3d.ViewPlatform; import javax.media.j3d.VirtualUniverse; import javax.vecmath.Vector3f; import com.sun.j3d.utils.picking.PickCanvas; public class Universe { boolean camera = true; Canvas3D canvas1, canvas2, canvas3, canvas4; VirtualUniverse universe; Locale locale; TransformGroup vpTrans1, vpTransRight, vpTransFront, vpTransPers; TransformGroup mouseTransform = null; View view1, view2, view3, view4; BranchGroup scene; PickCanvas pickCanvas1 = null; PickCanvas pickCanvas2 = null; PickCanvas pickCanvas3 = null; PickCanvas pickCanvas4 = null; BranchGroup obj = new BranchGroup(); // Create a BranchGroup node for the view platform BranchGroup vpRoot = new BranchGroup(); //Temp vars for cam movement public Universe(Canvas3D c1, Canvas3D c2, Canvas3D c3, Canvas3D c4, BranchGroup scene) { this.canvas1 = c1; this.canvas2 = c2; this.canvas3 = c3; this.canvas4 = c4; this.scene = scene; // Establish a virtual universe that has a single // hi-res Locale universe = new VirtualUniverse(); locale = new Locale(universe); // Create a PhysicalBody and PhysicalEnvironment object PhysicalBody body = new PhysicalBody(); PhysicalEnvironment environment = new PhysicalEnvironment(); // Create a View and attach the Canvas3D and the physical // body and environment to the view. view1 = new View(); view1.addCanvas3D(c1); view1.addCanvas3D(c2); view1.addCanvas3D(c3); view1.addCanvas3D(c4); view1.setPhysicalBody(body); view1.setPhysicalEnvironment(environment); // Create a BranchGroup node for the view platform BranchGroup vpRoot = new BranchGroup(); // Create a ViewPlatform object, and its associated // TransformGroup object, and attach it to the root of the // subgraph. Attach the view to the view platform. Transform3D t = new Transform3D(); t.set(new Vector3f(0.0f, 0.0f, 2.0f)); ViewPlatform vp = new ViewPlatform(); vpTrans1 = new TransformGroup(t); vpTrans1.addChild(vp); vpRoot.addChild(vpTrans1); vpRoot.addChild(scene); view1.attachViewPlatform(vp); // Attach the branch graph to the universe, via the // Locale. The scene graph is now live! 
locale.addBranchGraph(vpRoot); } } import javax.media.j3d.BranchGroup; import javax.media.j3d.Canvas3D; import javax.media.j3d.Locale; import javax.media.j3d.PhysicalBody; import javax.media.j3d.PhysicalEnvironment; import javax.media.j3d.Transform3D; import javax.media.j3d.TransformGroup; import javax.media.j3d.View; import javax.media.j3d.ViewPlatform; import javax.media.j3d.VirtualUniverse; import javax.vecmath.Vector3f; import com.sun.j3d.utils.picking.PickCanvas; public class Universe { boolean camera = true; Canvas3D canvas1, canvas2, canvas3, canvas4; VirtualUniverse universe; Locale locale; TransformGroup vpTrans1, vpTransRight, vpTransFront, vpTransPers; TransformGroup mouseTransform = null; View view1, view2, view3, view4; BranchGroup scene; PickCanvas pickCanvas1 = null; PickCanvas pickCanvas2 = null; PickCanvas pickCanvas3 = null; PickCanvas pickCanvas4 = null; BranchGroup obj = new BranchGroup(); // Create a BranchGroup node for the view platform BranchGroup vpRoot = new BranchGroup(); //Temp vars for cam movement public Universe(Canvas3D c1, Canvas3D c2, Canvas3D c3, Canvas3D c4, BranchGroup scene) { this.canvas1 = c1; this.canvas2 = c2; this.canvas3 = c3; this.canvas4 = c4; this.scene = scene; // Establish a virtual universe that has a single // hi-res Locale universe = new VirtualUniverse(); locale = new Locale(universe); // Create a PhysicalBody and PhysicalEnvironment object PhysicalBody body = new PhysicalBody(); PhysicalEnvironment environment = new PhysicalEnvironment(); // Create a View and attach the Canvas3D and the physical // body and environment to the view. view1 = new View(); view1.addCanvas3D(c1); view1.addCanvas3D(c2); view1.addCanvas3D(c3); view1.addCanvas3D(c4); view1.setPhysicalBody(body); view1.setPhysicalEnvironment(environment); // Create a BranchGroup node for the view platform BranchGroup vpRoot = new BranchGroup(); // Create a ViewPlatform object, and its associated // TransformGroup object, and attach it to the root of the // subgraph. Attach the view to the view platform. Transform3D t = new Transform3D(); t.set(new Vector3f(0.0f, 0.0f, 2.0f)); ViewPlatform vp = new ViewPlatform(); vpTrans1 = new TransformGroup(t); vpTrans1.addChild(vp); vpRoot.addChild(vpTrans1); vpRoot.addChild(scene); view1.attachViewPlatform(vp); // Attach the branch graph to the universe, via the // Locale. The scene graph is now live! locale.addBranchGraph(vpRoot); } }
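
    For a simple two-by-two grid of equally sized canvases, java.awt.GridLayout is usually enough; Canvas3D is a heavyweight AWT component, so an AWT layout manager on the frame works fine. A hedged, self-contained sketch (the frame and class names are placeholders, and the canvases would then be handed to the Universe class from the question):

        import java.awt.GraphicsConfiguration;
        import java.awt.GridLayout;
        import javax.media.j3d.Canvas3D;
        import javax.swing.JFrame;
        import com.sun.j3d.utils.universe.SimpleUniverse;

        public class FourCanvasFrame {
            public static void main(String[] args) {
                GraphicsConfiguration gc = SimpleUniverse.getPreferredConfiguration();
                JFrame frame = new JFrame("Four views");
                frame.setLayout(new GridLayout(2, 2, 4, 4)); // 2 rows, 2 columns, 4px gaps
                Canvas3D c1 = new Canvas3D(gc), c2 = new Canvas3D(gc);
                Canvas3D c3 = new Canvas3D(gc), c4 = new Canvas3D(gc);
                frame.add(c1); frame.add(c2);
                frame.add(c3); frame.add(c4);
                // ... construct the scene and pass c1..c4 to the Universe class ...
                frame.setSize(800, 600);
                frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                frame.setVisible(true);
            }
        }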

    Read the article

  • C++ Multithreading with pthread is blocking (including sockets)

    - by Sebastian Büttner
    I am trying to implement a multi threaded application with pthread. I did implement a thread class which looks like the following and I call it later twice (or even more), but it seems to block instead of execute the threads parallel. Here is what I got until now: The Thread Class is an abstract class which has the abstract method "exec" which should contain the thread code in a derive class (I did a sample of this, named DerivedThread) Thread.hpp #ifndef THREAD_H_ #define THREAD_H_ #include <pthread.h> class Thread { public: Thread(); void start(); void join(); virtual int exec() = 0; int exit_code(); private: static void* thread_router(void* arg); void exec_thread(); pthread_t pth_; int code_; }; #endif /* THREAD_H_ */ And Thread.cpp #include <iostream> #include "Thread.hpp" /*****************************/ using namespace std; Thread::Thread(): code_(0) { cout << "[Thread] Init" << endl; } void Thread::start() { cout << "[Thread] Created Thread" << endl; pthread_create( &pth_, NULL, Thread::thread_router, reinterpret_cast<void*>(this)); } void Thread::join() { cout << "[Thread] Join Thread" << endl; pthread_join(pth_, NULL); } int Thread::exit_code() { return code_; } void Thread::exec_thread() { cout << "[Thread] Execute" << endl; code_ = exec(); } void* Thread::thread_router(void* arg) { cout << "[Thread] exec_thread function in thread" << endl; reinterpret_cast<Thread*>(arg)->exec_thread(); return NULL; } DerivedThread.hpp #include "Thread.hpp" class DerivedThread : public Thread { public: DerivedThread(); virtual ~DerivedThread(); int exec(); void Close() = 0; DerivedThread.cpp [...] #include "DerivedThread.cpp" [...] int DerivedThread::exec() { //code to be executed do { cout << "Thread executed" << endl; usleep(1000000); } while (true); //dummy, just to let it run for a while } [...] Basically, I am calling this like the here: DerivedThread *thread; cout << "Creating Thread" << endl; thread = new DerivedThread(); cout << "Created thread, starting..." << endl; thread->start(); cout << "Started thread" << endl; cout << "Creating 2nd Thread" << endl; thread = new DerivedThread(); cout << "Created 2nd thread, starting..." << endl; thread->start(); cout << "Started 2nd thread" << endl; What is working great if I am only starting one of these Threads , but if I start multiple which should run together (not synced, only parallel) . But I discovered, that the thread is created, then as it tries to execute it (via start) the problem seems to block until the thread has closed. After that the next Thread is processed. I thought that pthread would do it unblocked for me, so what did I wrong? A sample output might be: Creating Thread [Thread] Thread Init Created thread, starting... [Thread] Created thread [Thread] exec_thread function in thread [Thread] Execute Thread executed Thread executed Thread executed Thread executed Thread executed Thread executed Thread executed .... Until Thread 1 is not terminated, a Thread 2 won't be created not executed. The process above is executed in an other class. Just for the information: I am trying to create a multi threaded server. 
The concept is like this: MultiThreadedServer Class has a main loop, like this one: ::inet::ServerSock *sock; //just a simple self made wrapper class for sockets DerivedThread *thread; for (;;) { sock = new ::inet::ServerSock(); this->Socket->accept( *sock ); cout << "Creating Thread" << endl; //Threads (according to code sample above) thread = new DerivedThread(sock); //I did not mentoine the parameter before as it was not neccesary, in fact, I pass the socket handle with the connected socket to the thread cout << "Created thread, starting..." << endl; thread->start(); cout << "Started thread" << endl; } So I thought that this would loop over and over and wait for new connections to accept. and when a new client arrives, I am creating a new thread and give the thread the connected socket as a parameter. In the DerivedThread::exec I am doing the handling for the connected client. Like: [...] do { [...] if (this-sock_-read( Buffer, sizeof(PacketStruc) ) 0) { cout << "[Handler_Base] Recv Packet" << endl; //handle the packet } else { Connected = false; } delete Buffer; } while ( Connected ); So I loop in the created thread as long as the client keeps the connection. I think, that the socket may cause the blocking behaviour. Edit: I figured out, that it is not the read() loop in the DerivedThread Class as I simply replaced it with a loop over a simple cout-usleep part. It did also only execute the first one and after first thread finished, the 2nd one was executed. Many thanks and best regards, Sebastian
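
    pthread_create itself returns immediately, so a hedged way to isolate the problem is to strip the sockets out and confirm that two bare threads really do run concurrently on this setup; a minimal self-contained sketch along the lines of the question's Thread class:

        #include <pthread.h>
        #include <unistd.h>
        #include <iostream>

        static void* worker(void* arg) {
            const char* name = static_cast<const char*>(arg);
            for (int i = 0; i < 5; ++i) {   // output from both threads should interleave
                std::cout << name << " tick " << i << std::endl;
                usleep(200000);
            }
            return 0;
        }

        int main() {
            pthread_t t1, t2;
            pthread_create(&t1, 0, worker, const_cast<char*>("thread A"));
            pthread_create(&t2, 0, worker, const_cast<char*>("thread B"));
            // join only after both threads have been started
            pthread_join(t1, 0);
            pthread_join(t2, 0);
            return 0;
        }

    If the two names interleave here (compile with g++ -pthread) but the real program still serialises, the blocking is coming from the accept()/read() calls around the listening socket rather than from pthread_create, and it is worth checking that nothing joins or otherwise waits on the first thread before the second connection is accepted.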

    Read the article

  • C++ simple logging class with UTF-8 output [code example]

    - by Andrew
    Hello everyone, I was working on one of my academic projects and for the first time I needed pure C++ without GUI. After googling for a while, I did not find any simple and easy to use implementation for logging and created my own. This is a simple implementation with iostreams that logs messages to screen and to the file simultaneously. I was thinking of using templates but then I realized that I do not expect any changes and removed that. It is modified std::wostream with two added modifiers: 1. TimeStamp - prints time-stamp 2. LogMode(LogModes) - switches output: file only, screen only, file+screen. *Boost::utf8_codecvt_facet* is used for UTF-8 output. // ############################################################################ // # Name: MyLog.h # // # Purpose: Logging Class Header # // # Author: Andrew Drach # // # Modified by: <somebody> # // # Created: 03/21/10 # // # SVN-ID: $Id$ # // # Copyright: (c) 2010 Andrew Drach # // # Licence: <license> # // ############################################################################ #ifndef INCLUDED_MYLOG_H #define INCLUDED_MYLOG_H // headers -------------------------------------------------------------------- #include <string> #include <iostream> #include <fstream> #include <exception> #include <boost/program_options/detail/utf8_codecvt_facet.hpp> using namespace std; // definitions ---------------------------------------------------------------- // ---------------------------------------------------------------------------- // DblBuf class // Splits up output stream into two // Inspired by http://wordaligned.org/articles/cpp-streambufs // ---------------------------------------------------------------------------- class DblBuf : public wstreambuf { private: // private member declarations DblBuf(); wstreambuf *bf1; wstreambuf *bf2; virtual int_type overflow(int_type ch) { int_type eof = traits_type::eof(); int_type not_eof = !eof; if ( traits_type::eq_int_type(ch,eof) ) return not_eof; else { char_type ch1 = traits_type::to_char_type(ch); int_type r1( bf1on ? bf1->sputc(ch1) : not_eof ); int_type r2( bf2on ? bf2->sputc(ch1) : not_eof ); return (traits_type::eq_int_type(r1,eof) || traits_type::eq_int_type(r2,eof) ) ? eof : ch; } } virtual int sync() { int r1( bf1on ? bf1->pubsync() : NULL ); int r2( bf2on ? bf2->pubsync() : NULL ); return (r1 == 0 && r2 == 0) ? 
0 : -1; } public: // public member declarations explicit DblBuf(wstreambuf *bf1, wstreambuf *bf2) : bf1(bf1), bf2(bf2) { if (bf1) bf1on = true; else bf1on = false; if (bf2) bf2on = true; else bf2on = false; } bool bf1on; bool bf2on; }; // ---------------------------------------------------------------------------- // logstream class // Wrapper for a standard wostream with access to modified buffer // ---------------------------------------------------------------------------- class logstream : public wostream { private: // private member declarations logstream(); public: // public member declarations DblBuf *buf; explicit logstream(wstreambuf *StrBuf, bool isStd = false) : wostream(StrBuf, isStd), buf((DblBuf*)StrBuf) {} }; // ---------------------------------------------------------------------------- // Logging mode Class // ---------------------------------------------------------------------------- enum LogModes{LogToFile=1, LogToScreen, LogToBoth}; class LogMode { private: // private member declarations LogMode(); short mode; public: // public member declarations LogMode(short mode1) : mode(mode1) {} logstream& operator()(logstream &stream1) { switch(mode) { case LogToFile: stream1.buf->bf1on = true; stream1.buf->bf2on = false; break; case LogToScreen: stream1.buf->bf1on = false; stream1.buf->bf2on = true; break; case LogToBoth: stream1.buf->bf1on = true; stream1.buf->bf2on = true; } return stream1; } }; logstream& operator<<(logstream &out, LogMode mode) { return mode(out); } wostream& TimeStamp1(wostream &out1) { time_t time1; struct tm timeinfo; wchar_t timestr[512]; // Get current time and convert it to a string time(&time1); localtime_s (&timeinfo, &time1); wcsftime(timestr, 512,L"[%Y-%b-%d %H:%M:%S %p] ",&timeinfo); return out1 << timestr; } // ---------------------------------------------------------------------------- // MyLog class // Logs events to both file and screen // ---------------------------------------------------------------------------- class MyLog { private: // private member declarations MyLog(); auto_ptr<DblBuf> buf; string mErrorMsg1; string mErrorMsg2; string mErrorMsg3; string mErrorMsg4; public: // public member declarations explicit MyLog(string FileName1, wostream *ScrLog1, locale utf8locale1); ~MyLog(); void NewEvent(wstring str1, bool TimeStamp = true); string FileName; wostream *ScrLog; wofstream File; auto_ptr<logstream> Log; locale utf8locale; }; // ---------------------------------------------------------------------------- // MyLog constructor // ---------------------------------------------------------------------------- MyLog::MyLog(string FileName1, wostream *ScrLog1, locale utf8locale1) : // ctors mErrorMsg1("Failed to open file for application logging! []"), mErrorMsg2("Failed to write BOM! []"), mErrorMsg3("Failed to write to file! []"), mErrorMsg4("Failed to close file! 
[]"), FileName(FileName1), ScrLog(ScrLog1), utf8locale(utf8locale1), File(FileName1.c_str()) { // Adjust error strings mErrorMsg1.insert(mErrorMsg1.length()-1,FileName1); mErrorMsg2.insert(mErrorMsg2.length()-1,FileName1); mErrorMsg3.insert(mErrorMsg3.length()-1,FileName1); mErrorMsg4.insert(mErrorMsg4.length()-1,FileName1); // check for file open errors if ( !File ) throw ofstream::failure(mErrorMsg1); // write UTF-8 BOM File << wchar_t(0xEF) << wchar_t(0xBB) << wchar_t(0xBF); // switch locale to UTF-8 File.imbue(utf8locale); // check for write errors if ( File.bad() ) throw ofstream::failure(mErrorMsg2); buf.reset( new DblBuf(File.rdbuf(),ScrLog->rdbuf()) ); Log.reset( new logstream(&*buf) ); } // ---------------------------------------------------------------------------- // MyLog destructor // ---------------------------------------------------------------------------- MyLog::~MyLog() { *Log << TimeStamp1 << "Log finished." << endl; // clean up objects Log.reset(); buf.reset(); File.close(); // check for file close errors if ( File.bad() ) throw ofstream::failure(mErrorMsg4); } //--------------------------------------------------------------------------- #endif // INCLUDED_MYLOG_H Tested on MSVC 2008, boost 1.42. I do not know if this is the right place to share it. Hope it helps anybody. Feel free to make it better.

    Read the article

  • MVC4 Model in View has nested data - cannot get data in model

    - by Taersious
    I have a Model defined that gets me a View with a list of RadioButtons, per IEnumerable. Within that Model, I want to display a list of checkboxes that will vary based on the item selected. Finally, there will be a Textarea in the same view once the user has selected from the available checkboxes, with some dynamic text there based on the CheckBoxes that are selected. What we should end up with is a Table-per-hierarchy. The layout is such that the RadioButtonList is in the first table cell, the CheckBoxList is in the middle table cell, and the Textarea is ini the right table cell. If anyone can guide me to what my model-view should be to achieve this result, I'll be most pleased... Here are my codes: // // View Model for implementing radio button list public class RadioButtonViewModel { // objects public List<RadioButtonItem> RadioButtonList { get; set; } public string SelectedRadioButton { get; set; } } // // Object for handling each radio button public class RadioButtonItem { // this object public string Name { get; set; } public bool Selected { get; set; } public int ObjectId { get; set; } // columns public virtual IEnumerable<CheckBoxItem> CheckBoxItems { get; set; } } // // Object for handling each checkbox public class CheckBoxViewModel { public List<CheckBoxItem> CheckBoxList { get; set; } } // // Object for handling each check box public class CheckBoxItem { public string Name { get; set; } public bool Selected { get; set; } public int ObjectId { get; set; } public virtual RadioButtonItem RadioButtonItem { get; set; } } and the view @model IEnumerable<EF_Utility.Models.RadioButtonItem> @{ ViewBag.Title = "Connect"; ViewBag.Selected = Request["name"] != null ? Request["name"].ToString() : ""; } @using (Html.BeginForm("Objects" , "Home", FormMethod.Post) ){ @Html.ValidationSummary(true) <table> <tbody> <tr> <td style="border: 1px solid grey; vertical-align:top;"> <table> <tbody> <tr> <th style="text-align:left; width: 50px;">Select</th> <th style="text-align:left;">View or Table Name</th> </tr> @{ foreach (EF_Utility.Models.RadioButtonItem item in @Model ) { <tr> <td> @Html.RadioButton("RadioButtonViewModel.SelectedRadioButton", item.Name, ViewBag.Selected == item.Name ? true : item.Selected, new { @onclick = "this.form.action='/Home/Connect?name=" + item.Name + "'; this.form.submit(); " }) </td> <td> @Html.DisplayFor(i => item.Name) </td> </tr> } } </tbody> </table> </td> <td style="border: 1px solid grey; width: 220px; vertical-align:top; @(ViewBag.Selected == "" ? "display:none;" : "")"> <table> <tbody> <tr> <th>Column </th> </tr> <tr> <td><!-- checkboxes will go here --> </td> </tr> </tbody> </table> </td> <td style="border: 1px solid grey; vertical-align:top; @(ViewBag.Selected == "" ? 
"display:none;" : "")"> <textarea name="output" id="output" rows="24" cols="48"></textarea> </td> </tr> </tbody> </table> } and the relevant controller public ActionResult Connect() { /* TEST SESSION FIRST*/ if( Session["connstr"] == null) return RedirectToAction("Index"); else { ViewBag.Message = ""; ViewBag.ConnectionString = Server.UrlDecode( Session["connstr"].ToString() ); ViewBag.Server = ParseConnectionString( ViewBag.ConnectionString, "Data Source" ); ViewBag.Database = ParseConnectionString( ViewBag.ConnectionString, "Initial Catalog" ); using( var db = new SysDbContext(ViewBag.ConnectionString)) { var objects = db.Set<SqlObject>().ToArray(); var model = objects .Select( o => new RadioButtonItem { Name = o.Name, Selected = false, ObjectId = o.Object_Id, CheckBoxItems = Enumerable.Empty<EF_Utility.Models.CheckBoxItem>() } ) .OrderBy( rb => rb.Name ); return View( model ); } } } What I am missing it seems, is the code in my Connect() method that will bring the data context forward; at that point, it should be fairly straight-forward to set up the Html for the View. EDIT ** So I am going to need to bind the RadioButtonItem to the view with something like the following, except my CheckBoxList will NOT be an empty set. // // POST: /Home/Connect/ [HttpPost] public ActionResult Connect( RadioButtonItem rbl ) { /* TEST SESSION FIRST*/ if ( Session["connstr"] == null ) return RedirectToAction( "Index" ); else { ViewBag.Message = ""; ViewBag.ConnectionString = Server.UrlDecode( Session["connstr"].ToString() ); ViewBag.Server = ParseConnectionString( ViewBag.ConnectionString, "Data Source" ); ViewBag.Database = ParseConnectionString( ViewBag.ConnectionString, "Initial Catalog" ); using ( var db = new SysDbContext( ViewBag.ConnectionString ) ) { var objects = db.Set<SqlObject>().ToArray(); var model = objects .Select( o => new RadioButtonItem { Name = o.Name, Selected = false, ObjectId = o.Object_Id, CheckBoxItems = Enumerable.Empty<EF_Utility.Models.CheckBoxItem>() } ) .OrderBy( rb => rb.Name ); return View( model ); } } }

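    A minimal sketch of one way the GET Connect() action could hydrate CheckBoxItems for the item the user picked, which is the piece described as missing above. This is only an illustration: SqlColumn (and its Name / Object_Id properties) is a hypothetical entity standing in for whatever column metadata SysDbContext really exposes, and the name parameter simply lets MVC bind the ?name= value the view already posts back.

    // Sketch only - SqlColumn is a hypothetical entity; adjust to the real metadata set.
    public ActionResult Connect(string name)   // MVC binds /Home/Connect?name=... here
    {
        if (Session["connstr"] == null)
            return RedirectToAction("Index");

        ViewBag.ConnectionString = Server.UrlDecode(Session["connstr"].ToString());
        // (Message / Server / Database ViewBag settings omitted for brevity.)

        using (var db = new SysDbContext(ViewBag.ConnectionString))
        {
            var objects = db.Set<SqlObject>().ToArray();

            var model = objects
                .Select(o => new RadioButtonItem
                {
                    Name = o.Name,
                    Selected = (o.Name == name),
                    ObjectId = o.Object_Id,
                    // Load columns only for the object the user picked;
                    // every other radio button keeps an empty set.
                    CheckBoxItems = (o.Name == name)
                        ? db.Set<SqlColumn>()                      // hypothetical entity
                            .Where(c => c.Object_Id == o.Object_Id)
                            .ToList()
                            .Select(c => new CheckBoxItem
                            {
                                Name = c.Name,
                                Selected = false,
                                ObjectId = c.Object_Id
                            })
                            .ToList()
                        : Enumerable.Empty<CheckBoxItem>()
                })
                .OrderBy(rb => rb.Name)
                .ToList();   // materialise while the context is still open

            return View(model);
        }
    }

    With the selected RadioButtonItem carrying a populated CheckBoxItems collection, the middle table cell can render an @Html.CheckBox per item, and the textarea text can then be composed from whichever boxes come back checked.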
    Read the article

  • Announcing the Release of Visual Studio 2013 and Great Improvements to ASP.NET and Entity Framework

    - by ScottGu
    Today we released VS 2013 and .NET 4.5.1. These releases include a ton of great improvements, and include some fantastic enhancements to ASP.NET and the Entity Framework.  You can download and start using them now. Below are details on a few of the great ASP.NET, Web Development, and Entity Framework improvements you can take advantage of with this release.  Please visit http://www.asp.net/vnext for additional release notes, documentation, and tutorials. One ASP.NET With the release of Visual Studio 2013, we have taken a step towards unifying the experience of using the different ASP.NET sub-frameworks (Web Forms, MVC, Web API, SignalR, etc), and you can now easily mix and match the different ASP.NET technologies you want to use within a single application. When you do a File-New Project with VS 2013 you’ll now see a single ASP.NET Project option: Selecting this project will bring up an additional dialog that allows you to start with a base project template, and then optionally add/remove the technologies you want to use in it.  For example, you could start with a Web Forms template and add Web API or Web Forms support for it, or create a MVC project and also enable Web Forms pages within it: This makes it easy for you to use any ASP.NET technology you want within your apps, and take advantage of any feature across the entire ASP.NET technology span. Richer Authentication Support The new “One ASP.NET” project dialog also includes a new Change Authentication button that, when pushed, enables you to easily change the authentication approach used by your applications – and makes it much easier to build secure applications that enable SSO from a variety of identity providers.  For example, when you start with the ASP.NET Web Forms or MVC templates you can easily add any of the following authentication options to the application: No Authentication Individual User Accounts (Single Sign-On support with FaceBook, Twitter, Google, and Microsoft ID – or Forms Auth with ASP.NET Membership) Organizational Accounts (Single Sign-On support with Windows Azure Active Directory ) Windows Authentication (Active Directory in an intranet application) The Windows Azure Active Directory support is particularly cool.  Last month we updated Windows Azure Active Directory so that developers can now easily create any number of Directories using it (for free and deployed within seconds).  It now takes only a few moments to enable single-sign-on support within your ASP.NET applications against these Windows Azure Active Directories.  Simply choose the “Organizational Accounts” radio button within the Change Authentication dialog and enter the name of your Windows Azure Active Directory to do this: This will automatically configure your ASP.NET application to use Windows Azure Active Directory and register the application with it.  Now when you run the app your users can easily and securely sign-in using their Active Directory credentials within it – regardless of where the application is hosted on the Internet. For more information about the new process for creating web projects, see Creating ASP.NET Web Projects in Visual Studio 2013. Responsive Project Templates with Bootstrap The new default project templates for ASP.NET Web Forms, MVC, Web API and SPA are built using Bootstrap. Bootstrap is an open source CSS framework that helps you build responsive websites which look great on different form factors such as mobile phones, tables and desktops. 
For example in a browser window the home page created by the MVC template looks like the following: When you resize the browser to a narrow window to see how it would like on a phone, you can notice how the contents gracefully wrap around and the horizontal top menu turns into an icon: When you click the menu-icon above it expands into a vertical menu – which enables a good navigation experience for small screen real-estate devices: We think Bootstrap will enable developers to build web applications that work even better on phones, tablets and other mobile devices – and enable you to easily build applications that can leverage the rich ecosystem of Bootstrap CSS templates already out there.  You can learn more about Bootstrap here. Visual Studio Web Tooling Improvements Visual Studio 2013 includes a new, much richer, HTML editor for Razor files and HTML files in web applications. The new HTML editor provides a single unified schema based on HTML5. It has automatic brace completion, jQuery UI and AngularJS attribute IntelliSense, attribute IntelliSense Grouping, and other great improvements. For example, typing “ng-“ on an HTML element will show the intellisense for AngularJS: This support for AngularJS, Knockout.js, Handlebars and other SPA technologies in this release of ASP.NET and VS 2013 makes it even easier to build rich client web applications: The screen shot below demonstrates how the HTML editor can also now inspect your page at design-time to determine all of the CSS classes that are available. In this case, the auto-completion list contains classes from Bootstrap’s CSS file. No more guessing at which Bootstrap element names you need to use: Visual Studio 2013 also comes with built-in support for both CoffeeScript and LESS editing support. The LESS editor comes with all the cool features from the CSS editor and has specific Intellisense for variables and mixins across all the LESS documents in the @import chain. Browser Link – SignalR channel between browser and Visual Studio The new Browser Link feature in VS 2013 lets you run your app within multiple browsers on your dev machine, connect them to Visual Studio, and simultaneously refresh all of them just by clicking a button in the toolbar. You can connect multiple browsers (including IE, FireFox, Chrome) to your development site, including mobile emulators, and click refresh to refresh all the browsers all at the same time.  This makes it much easier to easily develop/test against multiple browsers in parallel. Browser Link also exposes an API to enable developers to write Browser Link extensions.  By enabling developers to take advantage of the Browser Link API, it becomes possible to create very advanced scenarios that crosses boundaries between Visual Studio and any browser that’s connected to it. Web Essentials takes advantage of the API to create an integrated experience between Visual Studio and the browser’s developer tools, remote controlling mobile emulators and a lot more. You will see us take advantage of this support even more to enable really cool scenarios going forward. ASP.NET Scaffolding ASP.NET Scaffolding is a new code generation framework for ASP.NET Web applications. It makes it easy to add boilerplate code to your project that interacts with a data model. In previous versions of Visual Studio, scaffolding was limited to ASP.NET MVC projects. With Visual Studio 2013, you can now use scaffolding for any ASP.NET project, including Web Forms. 
When using scaffolding, we ensure that all required dependencies are automatically installed for you in the project. For example, if you start with an ASP.NET Web Forms project and then use scaffolding to add a Web API Controller, the required NuGet packages and references to enable Web API are added to your project automatically.  To do this, just choose the Add->New Scaffold Item context menu: Support for scaffolding async controllers uses the new async features from Entity Framework 6. ASP.NET Identity ASP.NET Identity is a new membership system for ASP.NET applications that we are introducing with this release. ASP.NET Identity makes it easy to integrate user-specific profile data with application data. ASP.NET Identity also allows you to choose the persistence model for user profiles in your application. You can store the data in a SQL Server database or another data store, including NoSQL data stores such as Windows Azure Storage Tables. ASP.NET Identity also supports Claims-based authentication, where the user’s identity is represented as a set of claims from a trusted issuer. Users can login by creating an account on the website using username and password, or they can login using social identity providers (such as Microsoft Account, Twitter, Facebook, Google) or using organizational accounts through Windows Azure Active Directory or Active Directory Federation Services (ADFS). To learn more about how to use ASP.NET Identity visit http://www.asp.net/identity.  ASP.NET Web API 2 ASP.NET Web API 2 has a bunch of great improvements including: Attribute routing ASP.NET Web API now supports attribute routing, thanks to a contribution by Tim McCall, the author of http://attributerouting.net. With attribute routing you can specify your Web API routes by annotating your actions and controllers like this: OAuth 2.0 support The Web API and Single Page Application project templates now support authorization using OAuth 2.0. OAuth 2.0 is a framework for authorizing client access to protected resources. It works for a variety of clients including browsers and mobile devices. OData Improvements ASP.NET Web API also now provides support for OData endpoints and enables support for both ATOM and JSON-light formats. With OData you get support for rich query semantics, paging, $metadata, CRUD operations, and custom actions over any data source. Below are some of the specific enhancements in ASP.NET Web API 2 OData. Support for $select, $expand, $batch, and $value Improved extensibility Type-less support Reuse an existing model OWIN Integration ASP.NET Web API now fully supports OWIN and can be run on any OWIN capable host. With OWIN integration, you can self-host Web API in your own process alongside other OWIN middleware, such as SignalR. For more information, see Use OWIN to Self-Host ASP.NET Web API. More Web API Improvements In addition to the features above there have been a host of other features in ASP.NET Web API, including CORS support Authentication Filters Filter Overrides Improved Unit Testability Portable ASP.NET Web API Client To learn more go to http://www.asp.net/web-api/ ASP.NET SignalR 2 ASP.NET SignalR is library for ASP.NET developers that dramatically simplifies the process of adding real-time web functionality to your applications. Real-time web functionality is the ability to have server-side code push content to connected clients instantly as it becomes available. SignalR 2.0 introduces a ton of great improvements. 
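Stepping back to the Web API 2 attribute routing mentioned a few paragraphs above: the inline snippet it refers to ("...like this:") is not reproduced in this text-only summary, so here is a small, hedged sketch of the style being described. The controller name, route templates and data below are illustrative and not taken from the original post.

using System.Collections.Generic;
using System.Web.Http;

// Illustrative only - made-up controller and routes.
[RoutePrefix("api/customers")]
public class CustomersController : ApiController
{
    // GET api/customers/5/orders  ({customerId:int} is an inline route constraint)
    [Route("{customerId:int}/orders")]
    public IEnumerable<string> GetOrdersByCustomer(int customerId)
    {
        return new[] { "Order A for customer " + customerId, "Order B" };
    }
}

Attribute routes only take effect once config.MapHttpAttributeRoutes() is called in WebApiConfig.Register; the Visual Studio 2013 Web API template wires that up for you.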
We’ve added support for Cross-Origin Resource Sharing (CORS) to SignalR 2.0. iOS and Android support for SignalR have also been added using the MonoTouch and MonoDroid components from the Xamarin library (for more information on how to use these additions, see the article Using Xamarin Components from the SignalR wiki). We’ve also added support for the Portable .NET Client in SignalR 2.0 and created a new self-hosting package. This change makes the setup process for SignalR much more consistent between web-hosted and self-hosted SignalR applications. To learn more go to http://www.asp.net/signalr. ASP.NET MVC 5 The ASP.NET MVC project templates integrate seamlessly with the new One ASP.NET experience and enable you to integrate all of the above ASP.NET Web API, SignalR and Identity improvements. You can also customize your MVC project and configure authentication using the One ASP.NET project creation wizard. The MVC templates have also been updated to use ASP.NET Identity and Bootstrap as well. An introductory tutorial to ASP.NET MVC 5 can be found at Getting Started with ASP.NET MVC 5. This release of ASP.NET MVC also supports several nice new MVC-specific features including: Authentication filters: These filters allow you to specify authentication logic per-action, per-controller or globally for all controllers. Attribute Routing: Attribute Routing allows you to define your routes on actions or controllers. To learn more go to http://www.asp.net/mvc Entity Framework 6 Improvements Visual Studio 2013 ships with Entity Framework 6, which bring a lot of great new features to the data access space: Async and Task<T> Support EF6’s new Async Query and Save support enables you to perform asynchronous data access and take advantage of the Task<T> support introduced in .NET 4.5 within data access scenarios.  This allows you to free up threads that might otherwise by blocked on data access requests, and enable them to be used to process other requests whilst you wait for the database engine to process operations. When the database server responds the thread will be re-queued within your ASP.NET application and execution will continue.  This enables you to easily write significantly more scalable server code. Here is an example ASP.NET WebAPI action that makes use of the new EF6 async query methods: Interception and Logging Interception and SQL logging allows you to view – or even change – every command that is sent to the database by Entity Framework. This includes a simple, human readable log – which is great for debugging – as well as some lower level building blocks that give you access to the command and results. Here is an example of wiring up the simple log to Debug in the constructor of an MVC controller: Custom Code-First Conventions The new Custom Code-First Conventions enable bulk configuration of a Code First model – reducing the amount of code you need to write and maintain. Conventions are great when your domain classes don’t match the Code First conventions. For example, the following convention configures all properties that are called ‘Key’ to be the primary key of the entity they belong to. This is different than the default Code First convention that expects Id or <type name>Id. Connection Resiliency The new Connection Resiliency feature in EF6 enables you to register an execution strategy to handle – and potentially retry – failed database operations. 
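As a quick aside on the EF6 async and logging features mentioned just above (their inline examples are also missing from this summary), here is a hedged sketch of what the two patterns typically look like together in a Web API action. The context and entity names are illustrative.

using System.Collections.Generic;
using System.Data.Entity;      // EF6: DbContext, DbSet, ToListAsync
using System.Diagnostics;
using System.Linq;
using System.Threading.Tasks;
using System.Web.Http;

// Illustrative only - Product and ProductsContext are made-up types.
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public int UnitsInStock { get; set; }
}

public class ProductsContext : DbContext
{
    public DbSet<Product> Products { get; set; }
}

public class ProductsController : ApiController
{
    private readonly ProductsContext _db = new ProductsContext();

    public ProductsController()
    {
        // Interception/logging: send every command EF issues to the debug output.
        _db.Database.Log = s => Debug.WriteLine(s);
    }

    // Async query: the request thread is released while the database works.
    public async Task<IEnumerable<Product>> GetInStockProducts()
    {
        return await _db.Products
                        .Where(p => p.UnitsInStock > 0)
                        .ToListAsync();
    }

    protected override void Dispose(bool disposing)
    {
        if (disposing) _db.Dispose();
        base.Dispose(disposing);
    }
}

Because ToListAsync() frees the request thread while SQL Server does its work, the same action scales better under load than its synchronous counterpart.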
This is especially useful when deploying to cloud environments where dropped connections become more common as you traverse load balancers and distributed networks. EF6 includes a built-in execution strategy for SQL Azure that knows about retryable exception types and has some sensible – but overridable – defaults for the number of retries and time between retries when errors occur. Registering it is simple using the new Code-Based Configuration support: These are just some of the new features in EF6. You can visit the release notes section of the Entity Framework site for a complete list of new features. Microsoft OWIN Components Open Web Interface for .NET (OWIN) defines an open abstraction between .NET web servers and web applications, and the ASP.NET “Katana” project brings this abstraction to ASP.NET. OWIN decouples the web application from the server, making web applications host-agnostic. For example, you can host an OWIN-based web application in IIS or self-host it in a custom process. For more information about OWIN and Katana, see What's new in OWIN and Katana. Summary Today’s Visual Studio 2013, ASP.NET and Entity Framework release delivers some fantastic new features that streamline your web development lifecycle. These feature span from server framework to data access to tooling to client-side HTML development.  They also integrate some great open-source technology and contributions from our developer community. Download and start using them today! Scott P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu
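One last aside, on the Connection Resiliency registration described above (its Code-Based Configuration snippet is likewise missing here): a minimal sketch of how the built-in SQL Azure strategy is typically registered. The class name is illustrative; EF6 discovers a public DbConfiguration subclass in the same assembly as the DbContext automatically.

using System.Data.Entity;
using System.Data.Entity.SqlServer;

// Sketch only - the class name is illustrative.
public class AppDbConfiguration : DbConfiguration
{
    public AppDbConfiguration()
    {
        // Retry transient SQL Azure failures; the defaults for retry count and
        // delay can be overridden via the (int, TimeSpan) constructor overload.
        SetExecutionStrategy("System.Data.SqlClient",
            () => new SqlAzureExecutionStrategy());
    }
}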

    Read the article

  • CodePlex Daily Summary for Tuesday, February 08, 2011

    CodePlex Daily Summary for Tuesday, February 08, 2011Popular ReleasesyoutubeFisher: youtubeFisher 3.0 [beta]: What's new: Supports YouTube's new layout Complete internal refactoringNearforums - ASP.NET MVC forum engine: Nearforums v5.0: Version 5.0 of the ASP.NET MVC Forum Engine, containing the following improvements: .NET 4.0 as target framework using ASP.NET MVC 3. All views migrated to Razor for cleaner markup. Alternate template (Layout file) for mobile devices 4 Bug Fixes since Version 4.1 Visit the project Roadmap for more details.fuv: 1.0 release, codename Chopper Joe: features: search/replace :o to open file :s to save file :q to quitASP.NET MVC Project Awesome, jQuery Ajax helpers (controls): 1.7: A rich set of helpers (controls) that you can use to build highly responsive and interactive Ajax-enabled Web applications. These helpers include Autocomplete, AjaxDropdown, Lookup, Confirm Dialog, Popup Form, Popup and Pager html generation optimized new features for the lookup (add additional search data ) live demo went aeroEnhSim: EnhSim 2.3.6 BETA: 2.3.6 BETAThis release supports WoW patch 4.06 at level 85 To use this release, you must have the Microsoft Visual C++ 2010 Redistributable Package installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=A7B7A05E-6DE6-4D3A-A423-37BF0912DB84 To use the GUI you must have the .NET 4.0 Framework installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=9cfb2d51-5ff4-4491-b0e5-b386f32c0992 Changes since 2.3.0 ...TestApi - a library of Test APIs: TestApi v0.6: TestApi v0.6 comes with the following changes: TestApi code development has been moved to Codeplex: Moved TestApi soluton to VS 2010; Moved all source code to Codeplex. All development work is done there now. Fault Injection API: Integrated the unmanaged FaultInjectionEngine.dll COM component in the build; Cleaned up FaultInjectionEngine.dll to build at warning level 4; Implemented “FaultScope” which allows for in-process fault injection; Added automation scripts & sample program; ...AutoLoL: AutoLoL v1.5.5: AutoChat now allows up to 6 items. Items with nr. 7-0 will be removed! News page url's are now opened in the default browser Added a context menu to the system tray icon (thanks to Alex Banagos) AutoChat now allows configuring the Chat Keys and the Modifier Key The recent files list now supports compact and full mode Fix: Swapped mouse buttons are now properly detected Fix: Sometimes the Play button was pressed while still greyed out Champion: Karma Note: You can also run the u...mojoPortal: 2.3.6.2: see release notes on mojoportal.com http://www.mojoportal.com/mojoportal-2362-released.aspx Note that we have separate deployment packages for .NET 3.5 and .NET 4.0 The deployment package downloads on this page are pre-compiled and ready for production deployment, they contain no C# source code. To download the source code see the Source Code Tab I recommend getting the latest source code using TortoiseHG, you can get the source code corresponding to this release here.Rawr: Rawr 4.0.19 Beta: Rawr is now web-based. The link to use Rawr4 is: http://elitistjerks.com/rawr.phpThis is the Cataclysm Beta Release. More details can be found at the following link http://rawr.codeplex.com/Thread/View.aspx?ThreadId=237262 As of the 4.0.16 release, you can now also begin using the new Downloadable WPF version of Rawr!This is a pre-alpha release of the WPF version, there are likely to be a lot of issues. 
If you have a problem, please follow the Posting Guidelines and put it into the Issue Trac...IronRuby: 1.1.2: IronRuby 1.1.2 is a servicing release that keeps on improving compatibility with Ruby 1.9.2 and includes IronRuby integration to Visual Studio 2010. We decided to drop 1.8.6 compatibility mode in all post-1.0 releases. We recommend using IronRuby 1.0 if you need 1.8.6 compatibility. In this release we fixed several major issues: - problems that blocked Gem installation in certain cases - regex syntax: the parser was replaced with a new one that is much more compatible with Ruby 1.9.2 - cras...Pyxis 2: Production Release: Pyxis 2.0.0.13 - Full Production Release This release of Pyxis 2 offers you a wide range of features: Launch Applications in their own threads & domains Render alpha-blended icons on the desktop Support for SD & USB drives Online App Store Dynamic & Static IP support Menus & Modals Over a dozen GUI controls File selection dialogs Folder selection dialog Application, Bootloader, and Firmware Updating Update Release Notes Much More!Microsoft All-In-One Code Framework: Sample Browser v2 (CTP Release): Sample Browser v2 (CTP Release) http://i3.codeplex.com/Project/Download/FileDownload.aspx?ProjectName=1code&DownloadId=205917MVVM Light Toolkit: MVVM Light Toolkit V3 SP1 (4): There was a small issue with the previous release that caused errors when installing the templates in VS10 Express. This release corrects the error. Only use this if you encountered issues when installing the previous release. No changes in the binaries.Finestra Virtual Desktops: 1.0: Finally the version 1.0 release! Sorry for the long delay since the last release, but I think that you'll find this release to be really smooth, really stable, and a really great enhancement to Windows. New features include: Windows 7 taskbar integration Major performance and usability improvements Redesigned look and feel New name: Finestra Better automatic updating Much faster full-screen switcher Fixes Windows 7 hotkey collisions by default Updated installerFacebook C# SDK: 5.0.2 (BETA): PLEASE TAKE A FEW MINUTES TO GIVE US SOME FEEDBACK: Facebook C# SDK Survey This is third BETA release of the version 5 branch of the Facebook C# SDK. Remember this is a BETA build. Some things may change or not work exactly as planned. We are absolutely looking for feedback on this release to help us improve the final 5.X.X release. This release contains some breaking changes. Particularly with authentication. After spending time reviewing the trouble areas that people are having using th...ASP.NET MVC SiteMap provider: MvcSiteMapProvider 3.0.0 for MVC3: Using NuGet?MvcSiteMapProvider is also listed in the NuGet feed. Learn more... Like the project? Consider a donation!Donate via PayPal via PayPal. ChangelogTargeting ASP.NET MVC 3 and .NET 4.0 Additional UpdatePriority options for generating XML sitemaps Allow to specify target on SiteMapTitleAttribute One action with multiple routes and breadcrumbs Medium Trust optimizations Create SiteMapTitleAttribute for setting parent title IntelliSense for your sitemap with MvcSiteMapSchem...patterns & practices SharePoint Guidance: SharePoint Guidance 2010 Hands On Lab: SharePoint Guidance 2010 Hands On Lab consists of six labs: one for logging, one for service location, and four for application setting manager. Each lab takes about 20 minutes to walk through. Each lab consists of a PDF document. 
You can go through the steps in the doc to create solution and then build/deploy the solution and run the lab. For those of you who wants to save the time, we included the final solution so you can just build/deploy the solution and run the lab.Value Injecter - object(s) to -> object mapper: 2.3: it lets you define your own convention-based matching algorithms (ValueInjections) in order to match up (inject) source values to destination values. inject from multiple sources in one InjectFrom added ConventionInjectionMobile Device Detection and Redirection: 0.1.11.11: Improvements to Beta Release The following changes have been made in version 0.1.11.11: BlackBerry Version 6 devices (such as the 9800 Torch) are now correctly identified with a dedicated handler. Android powered devices are now correctly identified. Minor change to Provider.cs to improve performance and optimise data sent to 51Degrees.mobi if the option is enabled. GC.collect is no longer called at any point. All garbage collection now happens automatically IMPORTANT CHANGES This rele...TweetSharp: TweetSharp v2.0.0.0 - Preview 10: Documentation for this release may be found at http://tweetsharp.codeplex.com/wikipage?title=UserGuide&referringTitle=Documentation. Note: This code is currently preview quality. Preview 9 ChangesAdded support for trends Added support for Silverlight 4 Elevated WP7 fixes Third Party Library VersionsHammock v1.1.7: http://hammock.codeplex.com Json.NET 4.0 Release 1: http://json.codeplex.comNew Projectsbisolu_sendus: bisolu smsBlack & Scholes: Black & Scholes OptionsCosLabs: CosLabs makes it easy to see the full capabilities of the Cosmos operating system toolkit. CosLabs has a number of experiments to find new and unique uses for Cosmos. It's developed in C#.DeployToAzure: DeployToAzure allows automating deployment of Windows Azure project and making it a part of TFS 2010 build process without using PowerShell and Azure Management CmdLets. Digital: Educational purposesDiscogsNet: DiscogsNet is a .NET library to query the Discogs.com API and parse the Discogs XML data dump files. It's developed in C# and usable from any .NET language. The API of the library is focused on ease of use and intuitiveness.DJME - The jQuery extensions for ASP.NET MVC: DJME - The jQuery extensions for ASP.NET MVC is a lightweight framework which helps you build rich user interfaces for ASP.NET MVC while enjoying great developer productivity. DokuDB: Visio AddIn to document different versions of a databaseDtsConfig Explorer: This project is a windows froms application that helps explore a SSIS dtsconfig file with an easy wayEveryDNS Service: Windows service to automatically send your current IP address to everydns.net for dynamic domains. Automatically monitors and sends IP address changes without having to be logged in. The project is built in C# targeting the .NET 4.0 framework.EvoSim - Evolution Simulation: EvoSim is a pet project to learn aspects of MVVM, WPF/Silverlight, and Parallel Processing features of .NET 4. The goal is to simulate a world filled with creatures that move, eat, reproduce, and die according to Darwinian evolutionary principles. 
It is tile and turn based.EZStorage: A storage library for XNA based on EasyStorage but abstracting in an XML based file list for each folder, allowing file listings on all folders and details like file creation dates which are no longer possible in XNA 4.0ForeverBell's Snake: A snake game written in VB.NET.Glauca Browser: This is a browser with a webkit core inside.Grauers SharePoint 2010 Custom MasterPage Feature: This is a SharePoint 2010 Custom MastePage feature. Custom Master and custom CSS Blog post about the project: http://www.grauers.net/archive/2011/02/07/build-a-sharepoint-2010-custom-mastepage-with-visual-studio-2010.aspxHamming Code Sample: HamCode demonstrate how the hamming code work, result of coded and decoded data. Shows the benefits of hamming method/codeHydroDesktop Mercurial Test: This is a trial version of the HydroDesktop project (hydrodesktop.codeplex.com) running on Mercurial source code repository. We first want to test whether the data update/download is happening fine before switching the official HydroDesktop website to the new system.LateBindingApi: Creates .Net Proxy Components from COM Type Libraries.Middlesex County College Library Wireless Auto Login: The Middlesex County College Library Wireless Auto Login automatically logs a computer in to the Middlesex County College (Edison, NJ) Library's WifiMobility: Mobility is a small program that reads from a text file. The text file includes everything about the program, right down to that cute little bunny on your desktop! Try Mobility today!OpenMFC: OpenMFC is opensource version of MFC for using with C/C++ compiler without MFC.Orchard Content Sharing: This Orchard module adds content sharing functionality via integration with AddThis.com sharing service.Orchard Wunder Weather Widget: This project is used to maintain the source code for the Wunder Weather widget in the Orchard gallery.Professional Audio Recorder: Professional Audio Recorder is a Audio Recorder for Windows Phone 7Python Node Info for Umbraco: Designed to assist Umbraco macro authors who are writing their macros in Python. Provides a helper macro that, when inserted into a page, will enable the display of an info panel showing the page node properties as represented in Python.Softina.Graphs: Softina.Graphs is a graph managment and algorithm utility. It includes a set of libraries to work with graph processing. It is written using .net framework with c# language and MS Visual Studio 2008. Client applications is created using devexpress components.spangesharp: Application to help people learn c#Visual Studio Private Extension Gallery: Add a new tab in the Visual Studio extension manager to manage private extensions.WCF Home Framework: Home Framework is a service-oriented framework able to facilitate deploy and management of wcf services in a mid-size scenario project.Windows Phone 7 Video Player: This project contains all the source of an application for Windows Phone 7 to consume an RSS Media exposed by our Smooth Streaming Video Player plugin for WordPress (http://smooth.codeplex.com).WOL Shopping List: This is a shoppingList version of WOL's NearMeWP7RSSReader: Updated RSSReader project for Windows Phone 7 using the RTM tools with the October 2010 update.Xen (XNA Extended) Framework: The Xen Framework is a set of libraries that provides and extends XNA 4.0 functionality to make game development easier and faster while producing clean, maintainable code. Xen frees you up to spend more time building your game.

    Read the article

  • Laissez les bon temps rouler! (Microsoft BI Conference 2010)

    - by smisner
    "Laissez les bons temps rouler" is a Cajun phrase that I heard frequently when I lived in New Orleans in the mid-1990s. It means "Let the good times roll!" and encapsulates a feeling of happy expectation. As I met with many of my peers and new acquaintances at the Microsoft BI Conference last week, this phrase kept running through my mind as people spoke about their plans in their respective businesses, the benefits and opportunities that the recent releases in the BI stack are providing, and their expectations about the future of the BI stack. Notwithstanding some jabs here and there to point out the platform is neither perfect now nor will be anytime soon (along with admissions that the competitors are also not perfect), and notwithstanding several missteps by the event organizers (which I don't care to enumerate), the overarching mood at the conference was positive. It was a refreshing change from the doom and gloom hovering over several conferences that I attended in 2009. Although many people expect economic hardships to continue over the coming year or so, everyone I know in the BI field is busier than ever and expects to stay busy for quite a while. Self-Service BI Self-service was definitely a theme of the BI conference. In the keynote, Ted Kummert opened with a look back to a fairy tale vision of self-service BI that he told in 2008. At that time, the fairy tale future was a time when "every end user was able to use BI technologies within their job in order to move forward more effectively" and transitioned to the present time in which SQL Server 2008 R2, Office 2010, and SharePoint 2010 are available to deliver managed self-service BI. This set of technologies is presumably poised to address the needs of the 80% of users that Kummert said do not use BI today. He proceeded to outline a series of activities that users ought to be able to do themselves--from simple changes to a report like formatting or an addtional data visualization to integration of an additional data source. The keynote then continued with a series of demonstrations of both current and future technology in support of self-service BI. Some highlights that interested me: PowerPivot, of course, is the flagship product for self-service BI in the Microsoft BI stack. In the TechEd keynote, which was open to the BI conference attendees, Amir Netz (twitter) impressed the audience by demonstrating interactivity with a workbook containing 100 million rows. He upped the ante at the BI keynote with his demonstration of a future-state PowerPivot workbook containing over 2 billion records. It's important to note that this volume of data is being processed by a server engine, and not in the PowerPivot client engine. (Yes, I think it's impressive, but none of my clients are typically wrangling with 2 billion records at a time. Maybe they're thinking too small. This ability to work quickly with large data sets has greater implications for BI solutions than for self-service BI, in my opinion.) Amir also demonstrated KPIs for the future PowerPivot, which appeared to be easier to implement than in any other Microsoft product that supports KPIs, apart from simple KPIs in SharePoint. (My initial reaction is that we have one more place to build KPIs. Great. It's confusing enough. I haven't seen how well those KPIs integrate with other BI tools, which will be important for adoption.) One more PowerPivot feature that Amir showed was a graphical display of the lineage for calculations. 
(This is hugely practical, especially if you build up calculations incrementally. You can more easily follow the logic from calculation to calculation. Furthermore, if you need to make a change to one calculation, you can assess the impact on other calculations.) Another product demonstration will be available within the next 30 days--Pivot for Reporting Services. If you haven't seen this technology yet, check it out at www.getpivot.com. (It definitely has a wow factor, but I'm skeptical about its practicality. However, I'm looking forward to trying it out with data that I understand.) Michael Tejedor (twitter) demonstrated a feature that I think is really interesting and not emphasized nearly enough--overshadowed by PowerPivot, no doubt. That feature is the Microsoft Business Intelligence Indexing Connector, which enables search of the content of Excel workbooks and Reporting Services reports. (This capability existed in MOSS 2007, but was more cumbersome to implement. The search results in SharePoint 2010 are not only cooler, but more useful by describing whether the content is found in a table or a chart, for example.) This may yet be the dawning of the age of self-service BI - a phrase I've heard repeated from time to time over the last decade - but I think BI professionals are likely to stay busy for a long while, and need not start looking for a new line of work. Kummert repeatedly referenced strategic BI solutions in contrast to self-service BI to emphasize that self-service BI is not a replacement for the services that BI professionals provide. After all, self-service BI does not appear magically on user desktops (or whatever device they want to use). A supporting infrastructure is necessary, and grows in complexity in proportion to the need to simplify BI for users. It's one thing to hear the party line touted by Microsoft employees at the BI keynote, but it's another to hear from the people who are responsible for implementing and supporting it within an organization. Rob Collie (blog | twitter), Kasper de Jonge (blog | twitter), Vidas Matelis (site | twitter), and I were invited to join Andrew Brust (blog | twitter) as he led a Birds of a Feather session at TechEd entitled "PowerPivot: Is It the BI Deal-Changer for Developers and IT Pros?" I would single out the prevailing concern in this session as the issue of control. On one side of this issue were those who were concerned that they would lose control once PowerPivot is implemented. On the other side were those who believed that data should be freely accessible to users in PowerPivot, and even acknowledgment that users would get the data they want even if it meant they would have to manually enter into a workbook to have it ready for analysis. For another viewpoint on how PowerPivot played out at the conference, see Rob Collie's observations. Collaborative BI I have been intrigued by the notion of collaborative BI for a very long time. Before I discovered BI, I was a Lotus Notes developer and later a manager of developers, working in a software company that enabled collaboration in the legal industry. Not only did I help create collaborative systems for our clients, I created a complete project management from the ground up to collaboratively manage our custom development work. In that case, collaboration involved my team, my client contacts, and me. I was also able to produce my own BI from that system as well, but didn't know that's what I was doing at the time. 
Only in recent years has SharePoint begun to catch up with the capabilities that I had with Lotus Notes more than a decade ago. Eventually, I had the opportunity at that job to formally investigate BI as another product offering for our software, and the rest - as they say - is history. I built my first data warehouse with Scott Cameron (who has also ventured into the authoring world by writing Analysis Services 2008 Step by Step and was at the BI Conference last week where I got to reminisce with him for a bit) and that began a career that I never imagined at the time. Fast forward to 2010, and I'm still lauding the virtues of collaborative BI, if only the tools will catch up to my vision! Thus, I was anxious to see what Donald Farmer (blog | twitter) and Rita Sallam of Gartner had to say on the subject in their session "Collaborative Decision Making." As I suspected, the tools aren't quite there yet, but the vendors are moving in the right direction. One thing I liked about this session was a non-Microsoft perspective of the state of the industry with regard to collaborative BI. In addition, this session included a better demonstration of SharePoint collaborative BI capabilities than appeared in the BI keynote. Check out the video in the link to the session to see the demonstration. One of the use cases that was demonstrated was linking from information to a person, because, as Donald put it, "People don't trust data, they trust people." The Microsoft BI Stack in General A question I hear all the time from students when I'm teaching is how to know what tools to use when there is overlap between products in the BI stack. I've never taken the time to codify my thoughts on the subject, but saw that my friend Dan Bulos provided good insight on this topic from a variety of perspectives in his session, "So Many BI Tools, So Little Time." I thought one of his best points was that ideally you should be able to design in your tool of choice, and then deploy to your tool of choice. Unfortunately, the ideal is yet to become real across the platform. The closest we come is with the RDL in Reporting Services which can be produced from two different tools (Report Builder or Business Intelligence Development Studio's Report Designer), manually, or by a third-party or custom application. I have touted the idea for years (and publicly said so about 5 years ago) that eventually more products would be RDL producers or consumers, but we aren't there yet. Maybe in another 5 years. Another interesting session that covered the BI stack against a backdrop of competitive products was delivered by Andrew Brust. Andrew did a marvelous job of consolidating a lot of information in a way that clearly communicated how various vendors' offerings compared to the Microsoft BI stack. He also made a particularly compelling argument about how the existence of an ecosystem around the Microsoft BI stack provided innovation and opportunities lacking for other vendors. Check out his presentation, "How Does the Microsoft BI Stack...Stack Up?" Expo Hall I had planned to spend more time in the Expo Hall to see who was doing new things with the BI stack, but didn't manage to get very far. Each time I set out on an exploratory mission, I got caught up in some fascinating conversations with one or more of my peers. I find interacting with people that I meet at conferences just as important as attending sessions to learn something new. There were a couple of items that really caught me eye, however, that I'll share here. 
Pragmatic Works. Whether you develop SSIS packages, build SSAS cubes, or author SSRS reports (or all of the above), you really must take a look at BI Documenter. Brian Knight (twitter) walked me through the key features, and I must say I was impressed. Once you've seen what this product can do, you won't want to document your BI projects any other way. You can download a free single-user database edition, or choose from more feature-rich standard or professional editions. Microsoft Press ebooks. I also stopped by the O'Reilly Media booth to meet some folks that one of my acquisitions editors at Microsoft Press recommended. In case you haven't heard, Microsoft Press has partnered with O'Reilly Media for distribution and publishing. Apart from my interest in learning more about O'Reilly Media as an author, an advertisement in their booth caught me eye which I think is a really great move. When you buy Microsoft Press ebooks through the O'Reilly web site, you can receive it in any (or all) of the following formats where possible: PDF, epub, .mobi for Kindle and .apk for Android. You also have lifetime DRM-free access to the ebooks. As someone who is an avid collector of books, I fnd myself running out of room for storage. In addition, I travel a lot, and it's hard to lug my reference library with me. Today's e-reader options make the move to digital books a more viable way to grow my library. Having a variety of formats means I am not limited to a single device, and lifetime access means I don't have to worry about keeping track of where I've stored my files. Because the e-books are DRM-free, I can copy and paste when I'm compiling notes, and I can print pages when necessary. That's a winning combination in my mind! Overall, I was pleased with the BI conference. There were many more sessions that I couldn't attend, either because the room was full when I got there or there were multiple sessions running concurrently that I wanted to see. Fortunately, many of the sessions are accessible for viewing online at http://www.msteched.com/2010/NorthAmerica along with the TechEd sessions. You can spot the BI sessions by the yellow skyline on the title slide of the presentation as shown below. Share this post: email it! | bookmark it! | digg it! | reddit! | kick it! | live it!

    Read the article

  • CodePlex Daily Summary for Thursday, March 15, 2012

    CodePlex Daily Summary for Thursday, March 15, 2012Popular ReleasesPulse: Pulse Beta 4: This version is still in development but should include: Logging and error handling have been greatly improved. If you run into an error or Pulse crashes make sure to check the Log folder for a recently modified log file so you can report the details of the issue A bunch of new features for the Wallbase.cc provider. Cleaner separation between inputs, downloading and output. Input and downloading are fairly clean now but outputs are still mixed up in the mix which I'm trying to resolve ...Google Books Downloader for Windows: Google Books Downloader-2.0.0.0.: Google Books DownloaderFinestra Virtual Desktops: 2.5.4501: This is a very minor update release. Please see the information about the 2.5 and 2.5.4500 releases for more information on recent changes. This update did not even have an automatic update triggered for it. Adds error checking and reporting to all threads, not only those with message loopsAcDown????? - Anime&Comic Downloader: AcDown????? v3.9.2: ?? ●AcDown??????????、??、??????,????1M,????,????,?????????????????????????。???????????Acfun、????(Bilibili)、??、??、YouTube、??、???、??????、SF????、????????????。??????AcPlay?????,??????、????????????????。 ● AcDown???????????????????????????,???,???????????????????。 ● AcDown???????C#??,????.NET Framework 2.0??。?????"Acfun?????"。 ????32??64? Windows XP/Vista/7/8 ????????????? ??:????????Windows XP???,?????????.NET Framework 2.0???(x86),?????"?????????"??? ??????????????,??????????: ??"AcDo...C.B.R. : Comic Book Reader: CBR 0.6: 20 Issue trackers are closed and a lot of bugs too Localize view is now MVVM and delete is working. Added the unused flag (take care that it goes to true only when displaying screen elements) Backstage - new input/output format choice control for the conversion Backstage - Add display, behaviour and register file type options in the extended options dialog Explorer list view has been transformed to a custom control. New group header, colunms order and size are saved Single insta...Windows Azure Toolkit for Windows 8: Windows Azure Toolkit for Windows 8 Consumer Prv: Windows Azure Toolkit for Windows 8 Consumer Preview - Preview Release v 1.2.0 Please download this for Windows Azure Toolkit for Windows 8 functionality on Windows 8 Consumer Preview. The core features of the toolkit include: Automated Install – Scripted install of all dependencies including Visual Studio 2010 Express and the Windows Azure SDK on Windows 8 Consumer Preview. Project Templates – Windows 8 Metro Style app project templates in Dev 11 in both XAML/C# and HTML5/JS with a suppor...CODE Framework: 4.0.20312.0: This version includes significant improvements in the WPF system (and the WPF MVVM/MVC system). This includes new styles for Metro controls and layouts. Improved color handling. It also includes an improved theme/style swapping engine down to active (open) views. There also are various other enhancements and small fixes throughout the entire framework.ScintillaNET: ScintillaNET 2.4: 3/12/2012 Jacob Slusser Added support for annotations. 
Issues Fixed with this Release Issue # Title 25012 25012 25018 25018 25023 25023 25014 25014 Visual Studio ALM Quick Reference Guidance: v3 - For Visual Studio 11: RELEASE README Welcome to the BETA release of the Quick Reference Guide preview As this is a BETA release and the quality bar for the final Release has not been achieved, we value your candid feedback and recommend that you do not use or deploy these BETA artifacts in a production environment. Quality-Bar Details Documentation has been reviewed by Visual Studio ALM Rangers Documentation has not been through an independent technical review Documentation ...AvalonDock: AvalonDock 2.0.0345: Welcome to early alpha release of AvalonDock 2.0 I've completely rewritten AvalonDock in order to take full advantage of the MVVM pattern. New version also boost a lot of new features: 1) Deep separation between model and layout. 2) Full WPF binding support thanks to unified logical tree between main docking manager, auto-hide windows and floating windows. 3) Support for Aero semi-maximized windows feature. 4) Support for multiple panes in the same floating windows. For a short list of new f...Windows Azure PowerShell Cmdlets: Windows Azure PowerShell Cmdlets 2.2.2: Changes Added Start Menu Item for Easy Startup Added Link to Getting Started Document Added Ability to Persist Subscription Data to Disk Fixed Get-Deployment to not throw on empty slot Simplified numerous default values for cmdlets Breaking Changes: -SubscriptionName is now mandatory in Set-Subscription. -DefaultStorageAccountName and -DefaultStorageAccountKey parameters were removed from Set-Subscription. Instead, when adding multiple accounts to a subscription, each one needs to be added ...IronPython: 2.7.2.1: On behalf of the IronPython team, I'm happy to announce the final release IronPython 2.7.2. This release includes everything from IronPython 54498 and 62475 as well. Like all IronPython 2.7-series releases, .NET 4 is required to install it. Installing this release will replace any existing IronPython 2.7-series installation. Unlike previous releases, the assemblies for all supported platforms are included in the installer as well as the zip package, in the "Platforms" directory. IronPython 2...Kooboo CMS: Kooboo CMS 3.2.0.0: Breaking changes: When upgrade from previous versions, MUST reset the all the content type templates, otherwise the content manager might get a compile error. New features Integrate with Windows azure. See: http://wiki.kooboo.com/?wiki=Kooboo CMS on Azure Complete solution to deploy on load balance servers. See: http://wiki.kooboo.com/?wiki=Kooboo CMS load balance Update Jquery and Jquery ui to the lastest version(Jquery 1.71, Jquery UI 1.8.16). Tree style text content editing. See:h...SubExtractor: Release 1026: Fix: multi-colored bluray subs will no longer result in black blob for OCR Fix: dvds with no language specified will not cause exception in name creation of subtitle files Fix: Root directory Dvds will use volume label as their directory nameExtensions for Reactive Extensions (Rxx): Rxx 1.3: Please read the latest release notes for details about what's new. Related Work Items Content SummaryRxx provides the following features. See the Documentation for details. Many IObservable<T> extension methods and IEnumerable<T> extension methods. Many wrappers that convert asynchronous Framework Class Library APIs into observables. 
Many useful types such as ListSubject<T>, DictionarySubject<T>, CommandSubject, ViewModel, ObservableDynamicObject, Either<TLeft, TRight>, Maybe<T>, Scala...Skype Auto Recorder: SkypeAutoRecorder 1.2: Fixed the issue when application doesn't record MP3-file for some reason. Implemented support of Skype disconnects and connection problems during conversation. Implemented {duration} placeholder. Improved settings loading. Several code improvements and optimizations. Read more about changes on my blogPlayer Framework by Microsoft: Player Framework for Windows 8 Metro (Preview): Player Framework for HTML/JavaScript and XAML/C# Metro Style Applications. Additional DownloadsIIS Smooth Streaming Client SDK for Windows 8WPF Application Framework (WAF): WAF for .NET 4.5 (Experimental): Version: 2.5.0.440 (Experimental): This is an experimental release! It can be used to investigate the new .NET Framework 4.5 features. The ideas shown in this release might come in a future release (after 2.5) of the WPF Application Framework (WAF). More information can be found in this dicussion post. Requirements .NET Framework 4.5 (The package contains a solution file for Visual Studio 11) The unit test projects require Visual Studio 11 Professional Changelog All: Upgrade all proje...SSH.NET Library: 2012.3.9: There are still few outstanding issues I wanted to include in this release but since its been a while and there are few new features already I decided to create a new release now. New Features Add SOCKS4, SOCKS5 and HTTP Proxy support when connecting to remote server. For silverlight only IP address can be used for server address when using proxy. Add dynamic port forwarding support using ForwardedPortDynamic class. Add new ShellStream class to work with SSH Shell. Add supports for mu...Test Case Import Utilities for Visual Studio 2010 and Visual Studio 11 Beta: V1.2 RTM: This release (V1.2 RTM) includes: Support for connecting to Hosted Team Foundation Server Preview. Support for connecting to Team Foundation Server 11 Beta. Fix to issue with read-only attribute being set for LinksMapping-ReportFile which may have led to problems when saving the report file. Fix to issue with “related links” not being set properly in certain conditions. Fix to ensure that tool works fine when the Excel file contained rich text data. Note: Data is still imported in pl...New ProjectsAjayLabs: ajaylabsAltairStudios.Core: AltairStudios.Core is a MVC framework extension with utils and administrationBdRise: BdRiseBenchmark.NET: Benchmark.NET makes it easier to measure the execution time of a piece of code. Internally it uses System.Diagnostics.Stopwatch so it is of higher precision that querying System.DateTime.Now. Instead of var sw = Stopwatch.StartNew(); int i; for(i = 0; i < 1000; i++) { TestSomePieceOfCode(); } sw.Stop(); var ticksPerIteration = sw.Elapsed.Ticks / i; you can just var result = Benchmark.Sequentially(()=>TestSomePieceOfCode()); Also, you can do parallel benchmarks, use...CodedUITest OrderedTest BatchRunner: CUITBatchRunner makes it easier to create a suite of CodedUITest orderedtests and execute for desired iterations(re-iterates only for failed tests). Execute CUIT orderedtests unattended with or without VSTS 2010 (at least Test Agent) installed. 
It's developed using C#/.NET 4.0CoseaCRM: Cosea CRM (CRM especializado en empresas de Recursos Humanos)CoseaRecluta2: Sistema de reclutamiento de personal CoseaRH: Sistema de Recursos Humanos InternoDevme.Diagnostics: Diagnostics tools collection for .NETDiplomová práce: Diplomová práceDynamics AX (Axapta) Sync AutoFix: A Dynamics AX 2012 class that will change the IDs of the tables from the SQLDictionary to match the AOT IDs.EHS Parents' Guild Cafeteria Volunteer Reminder System: EHS has a large group of volunteers who assist during lunch time at the school. Volunteers are assigned a specific day of the month, such as the 2nd Monday or the 3rd Friday. There is a need for a reminder email to be sent to volunteers 3-4 days in advance.EladPlus Source Code: EladPlus Source Code Offical EladPlus makes it easier for Everyone to Open Things On your PC. You'll no longer have to Search Things On Your Computer. It's developed in C#. EladPlus 1.0.1 Beta 2 ChangeLog: 1.Spotlight - Search Things Fastly On The Web And On EladPlus 2.EladPlus Utilties System Requirements: Windows 8Dev/Consumer Windows 7 Windows Vista - May Work Slowly Windows XP Adobe Flash Player 10.0 IE 7eSheet - H? th?ng qu?n lý n?i dung: eSheet - H? th?ng qu?n lý n?i dungEwk.Math: Math libraryIndonesia News: Menyajikan berita nasional dan informasi terkini tentang berbagai peristiwa yang terjadi di Indonesia.Liuyi.Phone.StartTileGroup: Liuyi.Phone.StartTileGroup 2012 LiuyiMDX Query Reader: Mdx Query Reader can read mdx queries in ssrs report rdl files with parameters replaced by default values. That way you can transfer your query to ssms and run it without rewriting parameter values manually. Works without installing and is extremely simple to use.Microsoft Script Explorer for Windows PowerShell: Microsoft Script Explorer for Windows PowerShell (Script Explorer) allows users to search for scripts in local and online script repositories such as the TechNet Script Center and PoshCode. Available scripts returned by searching are organized by category, and you can also search for scripts from local and trusted community repositories by applying filters based on focus areas. Search results return code samples, information about script usage, and articles about the scripts. When you find th...Orchard Simple Media Picker: A simple way to fill an input field with the url for a media file in Orchard CMS. POSSchemas: Aplicación Windows Form para generar codigo SQL y codigo C# a partir de tablas en SQL Serverpvmapper: PVMapper is an open source project focused on developing web tools for mapping locations for photovoltaic energy development.Skeleton.NET: Skeleton.NET is targeted to be a RAD Framework for Desktop and Web Applications. It will contain several blocks that are used in application development, like logging ,repository, crud, messaging, ....Softcenter Ado Library: This library can be used to implement runtime polymorphism to support any registered ADO.NET DataProvider. SPBSU IFMO schedule parser ^_^: ?????? ?????????? ? ????? ifmo.ruSudoku Library: Sudoku Library will provide a .Net library capable of creating and solving sudoku puzzles. The goal of this project is to be; light weight, efficient, and fast. This will not include an implementation of the game that is playable.System.Windows.Explorer.ContextMenu: This project aims to make developing Windows Explorer Context Menu shell extensions as simple as possible. 
The resulting code is event driven and hides all of the Win32 API and Shell Interfaces away from the developer.test2: test2 projektTime Sheet Management: Project personalTvUnit: TvUnit?TvRock????????、??????·???????々?????????????Windows???????????。???????.NET Framework 4.0????????。Vincent: Test projectWPF Table View: WPF DataGrid replacement. Simple WPF control to display a table of data with improved performance. Developed in C#.

    Read the article

  • Laissez les bon temps rouler! (Microsoft BI Conference 2010)

    - by smisner
    Laissez les bons temps rouler" is a Cajun phrase that I heard frequently when I lived in New Orleans in the mid-1990s. It means "Let the good times roll!" and encapsulates a feeling of happy expectation. As I met with many of my peers and new acquaintances at the Microsoft BI Conference last week, this phrase kept running through my mind as people spoke about their plans in their respective businesses, the benefits and opportunities that the recent releases in the BI stack are providing, and their expectations about the future of the BI stack.Notwithstanding some jabs here and there to point out the platform is neither perfect now nor will be anytime soon (along with admissions that the competitors are also not perfect), and notwithstanding several missteps by the event organizers (which I don't care to enumerate), the overarching mood at the conference was positive. It was a refreshing change from the doom and gloom hovering over several conferences that I attended in 2009. Although many people expect economic hardships to continue over the coming year or so, everyone I know in the BI field is busier than ever and expects to stay busy for quite a while.Self-Service BISelf-service was definitely a theme of the BI conference. In the keynote, Ted Kummert opened with a look back to a fairy tale vision of self-service BI that he told in 2008. At that time, the fairy tale future was a time when "every end user was able to use BI technologies within their job in order to move forward more effectively" and transitioned to the present time in which SQL Server 2008 R2, Office 2010, and SharePoint 2010 are available to deliver managed self-service BI.This set of technologies is presumably poised to address the needs of the 80% of users that Kummert said do not use BI today. He proceeded to outline a series of activities that users ought to be able to do themselves--from simple changes to a report like formatting or an addtional data visualization to integration of an additional data source. The keynote then continued with a series of demonstrations of both current and future technology in support of self-service BI. Some highlights that interested me:PowerPivot, of course, is the flagship product for self-service BI in the Microsoft BI stack. In the TechEd keynote, which was open to the BI conference attendees, Amir Netz (twitter) impressed the audience by demonstrating interactivity with a workbook containing 100 million rows. He upped the ante at the BI keynote with his demonstration of a future-state PowerPivot workbook containing over 2 billion records. It's important to note that this volume of data is being processed by a server engine, and not in the PowerPivot client engine. (Yes, I think it's impressive, but none of my clients are typically wrangling with 2 billion records at a time. Maybe they're thinking too small. This ability to work quickly with large data sets has greater implications for BI solutions than for self-service BI, in my opinion.)Amir also demonstrated KPIs for the future PowerPivot, which appeared to be easier to implement than in any other Microsoft product that supports KPIs, apart from simple KPIs in SharePoint. (My initial reaction is that we have one more place to build KPIs. Great. It's confusing enough. I haven't seen how well those KPIs integrate with other BI tools, which will be important for adoption.)One more PowerPivot feature that Amir showed was a graphical display of the lineage for calculations. 
    Another product demonstration will be available within the next 30 days--Pivot for Reporting Services. If you haven't seen this technology yet, check it out at www.getpivot.com. (It definitely has a wow factor, but I'm skeptical about its practicality. However, I'm looking forward to trying it out with data that I understand.)

    Michael Tejedor (twitter) demonstrated a feature that I think is really interesting and not emphasized nearly enough--overshadowed by PowerPivot, no doubt. That feature is the Microsoft Business Intelligence Indexing Connector, which enables search of the content of Excel workbooks and Reporting Services reports. (This capability existed in MOSS 2007, but was more cumbersome to implement. The search results in SharePoint 2010 are not only cooler, but also more useful, describing whether the content is found in a table or a chart, for example.)

    This may yet be the dawning of the age of self-service BI - a phrase I've heard repeated from time to time over the last decade - but I think BI professionals are likely to stay busy for a long while, and need not start looking for a new line of work. Kummert repeatedly referenced strategic BI solutions in contrast to self-service BI to emphasize that self-service BI is not a replacement for the services that BI professionals provide. After all, self-service BI does not appear magically on user desktops (or whatever device they want to use). A supporting infrastructure is necessary, and it grows in complexity in proportion to the need to simplify BI for users.

    It's one thing to hear the party line touted by Microsoft employees at the BI keynote, but it's another to hear from the people who are responsible for implementing and supporting it within an organization. Rob Collie (blog | twitter), Kasper de Jonge (blog | twitter), Vidas Matelis (site | twitter), and I were invited to join Andrew Brust (blog | twitter) as he led a Birds of a Feather session at TechEd entitled "PowerPivot: Is It the BI Deal-Changer for Developers and IT Pros?" I would single out the prevailing concern in this session as the issue of control. On one side of this issue were those who were concerned that they would lose control once PowerPivot is implemented. On the other side were those who believed that data should be freely accessible to users in PowerPivot, and there was even acknowledgment that users will get the data they want even if it means they have to manually enter it into a workbook to have it ready for analysis. For another viewpoint on how PowerPivot played out at the conference, see Rob Collie's observations.

    Collaborative BI

    I have been intrigued by the notion of collaborative BI for a very long time. Before I discovered BI, I was a Lotus Notes developer and later a manager of developers, working in a software company that enabled collaboration in the legal industry. Not only did I help create collaborative systems for our clients, I also created a complete project management system from the ground up to collaboratively manage our custom development work. In that case, collaboration involved my team, my client contacts, and me. I was also able to produce my own BI from that system as well, but I didn't know that's what I was doing at the time.
    Only in recent years has SharePoint begun to catch up with the capabilities that I had with Lotus Notes more than a decade ago. Eventually, I had the opportunity at that job to formally investigate BI as another product offering for our software, and the rest - as they say - is history. I built my first data warehouse with Scott Cameron (who has also ventured into the authoring world by writing Analysis Services 2008 Step by Step and was at the BI Conference last week, where I got to reminisce with him for a bit), and that began a career that I never imagined at the time.

    Fast forward to 2010, and I'm still lauding the virtues of collaborative BI, if only the tools will catch up to my vision! Thus, I was anxious to see what Donald Farmer (blog | twitter) and Rita Sallam of Gartner had to say on the subject in their session "Collaborative Decision Making." As I suspected, the tools aren't quite there yet, but the vendors are moving in the right direction. One thing I liked about this session was a non-Microsoft perspective on the state of the industry with regard to collaborative BI. In addition, this session included a better demonstration of SharePoint collaborative BI capabilities than appeared in the BI keynote. Check out the video in the link to the session to see the demonstration. One of the use cases demonstrated was linking from information to a person, because, as Donald put it, "People don't trust data, they trust people."

    The Microsoft BI Stack in General

    A question I hear all the time from students when I'm teaching is how to know which tools to use when there is overlap between products in the BI stack. I've never taken the time to codify my thoughts on the subject, but I saw that my friend Dan Bulos provided good insight on this topic from a variety of perspectives in his session, "So Many BI Tools, So Little Time." I thought one of his best points was that ideally you should be able to design in your tool of choice, and then deploy to your tool of choice. Unfortunately, the ideal is yet to become real across the platform. The closest we come is with the RDL in Reporting Services, which can be produced from two different tools (Report Builder or Business Intelligence Development Studio's Report Designer), manually, or by a third-party or custom application. I have touted the idea for years (and publicly said so about 5 years ago) that eventually more products would be RDL producers or consumers, but we aren't there yet. Maybe in another 5 years.

    Another interesting session that covered the BI stack against a backdrop of competitive products was delivered by Andrew Brust. Andrew did a marvelous job of consolidating a lot of information in a way that clearly communicated how various vendors' offerings compare to the Microsoft BI stack. He also made a particularly compelling argument about how the existence of an ecosystem around the Microsoft BI stack provides innovation and opportunities lacking for other vendors. Check out his presentation, "How Does the Microsoft BI Stack...Stack Up?"

    Expo Hall

    I had planned to spend more time in the Expo Hall to see who was doing new things with the BI stack, but I didn't manage to get very far. Each time I set out on an exploratory mission, I got caught up in some fascinating conversations with one or more of my peers. I find interacting with people that I meet at conferences just as important as attending sessions to learn something new. There were a couple of items that really caught my eye, however, that I'll share here.

    Pragmatic Works. Whether you develop SSIS packages, build SSAS cubes, or author SSRS reports (or all of the above), you really must take a look at BI Documenter. Brian Knight (twitter) walked me through the key features, and I must say I was impressed. Once you've seen what this product can do, you won't want to document your BI projects any other way. You can download a free single-user database edition, or choose from more feature-rich standard or professional editions.

    Microsoft Press ebooks. I also stopped by the O'Reilly Media booth to meet some folks that one of my acquisitions editors at Microsoft Press recommended. In case you haven't heard, Microsoft Press has partnered with O'Reilly Media for distribution and publishing. Apart from my interest in learning more about O'Reilly Media as an author, an advertisement in their booth caught my eye, and I think it describes a really great move: when you buy Microsoft Press ebooks through the O'Reilly web site, you can receive them in any (or all) of the following formats where possible: PDF, epub, .mobi for Kindle, and .apk for Android. You also have lifetime DRM-free access to the ebooks. As someone who is an avid collector of books, I find myself running out of room for storage. In addition, I travel a lot, and it's hard to lug my reference library with me. Today's e-reader options make the move to digital books a more viable way to grow my library. Having a variety of formats means I am not limited to a single device, and lifetime access means I don't have to worry about keeping track of where I've stored my files. Because the e-books are DRM-free, I can copy and paste when I'm compiling notes, and I can print pages when necessary. That's a winning combination in my mind!

    Overall, I was pleased with the BI conference. There were many more sessions that I couldn't attend, either because the room was full when I got there or because there were multiple sessions running concurrently that I wanted to see. Fortunately, many of the sessions are available for viewing online at http://www.msteched.com/2010/NorthAmerica, along with the TechEd sessions. You can spot the BI sessions by the yellow skyline on the title slide of the presentation.

    Read the article

< Previous Page | 352 353 354 355 356 357 358 359 360 361 362 363  | Next Page >