Search Results


  • What technologies/tools do people use to implement live websites?

    - by MadSeb
    Hi, I have the following situation:

    - Server A is hooked up to a piece of hardware that sends out values and information every second. Programs on that machine can read these values. Server A is in a very remote location, so the Internet connection is very slow and unreliable, but it does exist. Let's say it's a weather station in the Arctic.
    - Users at the home location want to monitor the weather values somehow. They could use a remote desktop connection to server A, but that would be far too slow.

    My idea is to have a website on a web server (call it server B, located at the home site), have server A connect to server B and send it values, and have the web application read and display those values. But how do I build such a system? I know I could run MySQL on server B, have server A connect to it and send INSERT queries, and have the web application on server B constantly read from that database, but I think that would be way too slow and that there has to be a better solution. Any ideas?

    By the way, users should also be able to send information to the weather station from the website (for example, an admin user should be allowed to shut the weather station down from the website). Best regards, MadSeb
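    One common alternative to exposing MySQL over the WAN is to have server A batch its readings and push them to a small HTTP endpoint hosted by the web app on server B, with B answering each push with any pending admin commands. Below is a minimal sketch of the station side using only the Python standard library; the URL, field names and the commands-in-the-response convention are hypothetical, not something from the original question.

        import json
        import time
        import urllib.request

        INGEST_URL = "https://server-b.example.com/api/readings"   # hypothetical endpoint on server B

        def read_sensor():
            # Stand-in for whatever program on server A exposes the hardware values.
            return {"temp_c": -21.4, "wind_ms": 7.2, "ts": time.time()}

        def push_batch(batch):
            req = urllib.request.Request(
                INGEST_URL,
                data=json.dumps(batch).encode("utf-8"),
                headers={"Content-Type": "application/json"},
            )
            with urllib.request.urlopen(req, timeout=30) as resp:
                # Server B can piggyback pending admin commands (e.g. "shutdown") on the
                # response, so the station never needs to accept inbound connections.
                return json.loads(resp.read().decode("utf-8"))

        buffer = []
        while True:
            buffer.append(read_sensor())
            if len(buffer) >= 30:                 # send roughly 30 seconds of readings at a time
                try:
                    reply = push_batch(buffer)
                    buffer.clear()
                    for cmd in reply.get("commands", []):
                        print("command from the web UI:", cmd)
                except OSError:
                    pass                          # slow/unreliable link: keep buffering and retry
            time.sleep(1)

    Batching keeps the number of round trips low on a slow link, and because the station always initiates the connection, no inbound port has to be opened at the remote site.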

    Read the article

  • Statistical analysis on large data set to be published on the web

    - by dassouki
    I have a non-computer-related data logger that collects data from the field. The data is stored as text files, and I manually lump the files together and organize them. The current format is one CSV file per year per logger; each file is around 4,000,000 lines, across 7 loggers and 5 years, which adds up to a lot of data. Some of the data is organized as bins (item_type, item_class, item_dimension_class), and other fields are more unique, such as item_weight, item_color, date_collected, and so on.

    Currently I do statistical analysis on the data using a Python/NumPy/matplotlib program I wrote. It works fine, but the problem is that I'm the only one who can use it, since it and the data live on my computer. I'd like to publish the data on the web using a Postgres database; however, I need to find or implement a statistical tool that will take a large Postgres table and return statistical results within an adequate time frame. I'm not familiar with Python for the web, but I'm proficient with PHP on the web side and Python on the offline side.

    Users should be able to create their own histograms and data analyses. For example, one user might want all items that are blue and shipped between week x and week y, while another might want the weight distribution of all items by hour across a whole year. I was thinking of creating and indexing my own statistical tools, or automating the process somehow to emulate most queries, but that seems inefficient. I'm looking forward to hearing your ideas. Thanks
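    Since the heavy lifting (filtering and binning millions of rows) can be done by Postgres itself, one workable pattern is to have the web layer (PHP or Python) issue aggregate queries and ship only the bucket counts to the browser. Here is a rough sketch in Python with psycopg2; the items table and its columns are made-up names matching the description above, not a real schema.

        import psycopg2   # assumed: pip install psycopg2-binary

        def weight_histogram(color, start, end, lo=0.0, hi=100.0, bins=20):
            """Return [(bucket_number, row_count), ...] computed inside Postgres."""
            sql = """
                SELECT width_bucket(item_weight, %(lo)s, %(hi)s, %(bins)s) AS bucket,
                       count(*) AS n
                  FROM items
                 WHERE item_color = %(color)s
                   AND date_collected BETWEEN %(start)s AND %(end)s
                 GROUP BY bucket
                 ORDER BY bucket
            """
            with psycopg2.connect("dbname=logger_data") as conn, conn.cursor() as cur:
                cur.execute(sql, dict(lo=lo, hi=hi, bins=bins,
                                      color=color, start=start, end=end))
                return cur.fetchall()

        # Example: weight distribution of blue items collected in a two-week window.
        # print(weight_histogram("blue", "2009-03-01", "2009-03-14"))

    With an index on the columns used in the WHERE clause (for example item_color and date_collected), this kind of GROUP BY usually comes back quickly enough for interactive use, so most user-defined histograms can be answered on demand rather than precomputed.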

    Read the article

  • Java web templates across multiple WAR files

    - by Casey
    I have a multi-WAR web application that was designed badly. There is a single WAR that is responsible for handling some authorization against a database and defines a standard web page using a JSP taglib. The main WAR basically checks the privileges of the user and then, based on that, displays links to the context paths of the other deployed WARs. Each of the other deployed WARs includes this custom taglib. I am working on redesigning this application, and one of the nice things that I want to retain is that other project teams have developed WAR modules that "plug into" our current system to take advantage of other things we have to offer. I am not entirely sure how to handle the page templates, though. I need a templating system that is easy to use across multiple WARs (I was thinking of JSP fragments?). I really only need to define a consistent header and main navigation section; whatever else is displayed on the page is up to the individual web project. Any suggestions? I hope this is clear; if not, I can elaborate more.

    Read the article

  • Sending information between JavaScript and Web Services using AJAX

    - by COB-CSU-AM
    Alright, so I'm using Microsoft's Web Services and AJAX to get information from a SQL database for use with JavaScript on the client side, and I'm wondering what the best method is. Before I started working on the project, the web services were set up to return a C# List filled with some objects. Those objects' fields (ints, strings, etc.) contain the data I want to use. Of course, JavaScript can't do much with this, to the best of my knowledge. I then modified the web service to return a 2D array, but JavaScript got confused, and to the best of my knowledge it can't handle 2D arrays returned from C#. I then tried to use a regular array, but then I found that the length property of an array doesn't carry over to JS, so I couldn't perform a for loop over all the items, because there wasn't any way of knowing how many elements there were. The only other thing I can think of is returning a string with special characters to separate the data, but this seems way too convoluted. Any suggestions? Thanks in advance!

    Read the article

  • General web service ideas

    - by user2014175
    I have a question regarding different types of web services. I'll preface this by saying that I have built a number of apps (for both iOS and Android) for personal use that interact with the web via PHP and SQL. I have taught myself these languages, and as such don't have the broader background knowledge that many of you do. My question is: in what other ways can you perform an interaction between a web service and a mobile device, other than mobile - PHP - SQL - etc.? For example, if I built a very simple tracking app for my car, my current method would be to push GPS coordinates from my iPhone to my database at a set interval, then write a simple bit of JavaScript that pulls those coordinates out of the database and superimposes them on a Google map. Is there a different way to do this, such as the server acting as a live middleman that simply pushes the coordinates directly to a target browser, without the database in the middle? If so, are there advantages and disadvantages to these different methods for achieving different goals? I know it's a broad question, but I'm really intrigued and I'm finding it difficult to word a Google search for it. Any info or reading-material suggestions would be excellent. Thanks

    Read the article

  • Migrating Ajax web application to web socket

    - by Bastan
    Hi, I think I'm just missing a little detail that is preventing me from seeing the whole picture. I have a web application which uses an AJAX request every x seconds to update the client with new information or tasks. I also have a long-running process on the server, a Java computation engine, and I would like this engine to send updates to the client. I am wondering how to migrate my web app to using WebSockets, probably phpwebsocket or something similar. Can my server 'decide' to send information to a specific client? It seems possible looking at php-websocket. Can my Java backend long-running process use the WebSocket server to send a notification to a specific client? How? I can see that my Java app could use a class that sends over a WebSocket instead of HTTP, but how does the WebSocket server know which client to send the 'info' to? I am puzzled by all this. Is there any document that explains this in more detail? It also seems that the WebSocket server would have to create an instance of my web application. Thanks
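    The missing detail is usually just bookkeeping: the WebSocket (or any socket) server keeps a registry of open connections keyed by a client id, and the long-running engine tells that server which client a message is for. The sketch below illustrates the idea in Python with plain TCP standing in for the WebSocket layer; the ports, the newline-delimited JSON framing and the client_id handshake are all assumptions, and a real stack (php-websocket, a Java WebSocket container, etc.) would do the same lookup against its own connection objects.

        import asyncio
        import json

        clients = {}   # client_id -> StreamWriter for that browser's open connection

        async def handle_client(reader, writer):
            # First line from the browser identifies it, e.g. {"client_id": "alice"}
            hello = json.loads((await reader.readline()).decode())
            cid = hello["client_id"]
            clients[cid] = writer
            try:
                await reader.read()              # returns when the client disconnects
            finally:
                clients.pop(cid, None)

        async def handle_engine(reader, writer):
            # The long-running computation engine connects here and sends one
            # {"to": "<client_id>", "msg": {...}} object per line.
            while line := await reader.readline():
                note = json.loads(line)
                target = clients.get(note["to"])
                if target is not None:
                    target.write((json.dumps(note["msg"]) + "\n").encode())
                    await target.drain()

        async def main():
            await asyncio.start_server(handle_client, "0.0.0.0", 9000)    # browsers connect here
            await asyncio.start_server(handle_engine, "127.0.0.1", 9100)  # engine connects here
            await asyncio.Event().wait()                                   # serve forever

        asyncio.run(main())

    In this scheme the web application itself is not re-instantiated; the socket server simply holds one live connection per client and forwards each message to whichever connection the engine names.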

    Read the article

  • License For (Mostly) Open Source Website / Service

    - by Ryan Sullivan
    I have an interesting setup and am not sure how to license a website. I know this is not legal advice, and I am not asking for any. There are so many different open source licenses, and I do not have the time to read every last one to see which best fits my situation. Really, I am looking for suggestions and a nudge in the right direction. My setup is: I give away a free version of my web service with a clean website interface. The implementation I use on the actual web site is (almost) identical to what I give away. The main service works exactly the same way, but the website interface for managing features of the service is fairly different. The web interfaces have the same exact backend, and the front ends accomplish the same tasks, but the service I offer on my site is very rich and uses a good deal of JavaScript, whereas I kept the interface in the version I give away as simple and JavaScript-free as possible, mostly so it is easy to understand and integrate into other sites. I am not entirely sure how I should license this. It is more like I develop an open source service but have a separate site built upon it. I like the GPLv3, but I am not sure if I can use it in this case, especially since I am making some money off of Google ads on the site and plan on using Amazon affiliates as well. Any help would be greatly appreciated. I do want to open it up as much as possible, but I still want to be able to continue with my own implementation. Thanks in advance for any information or help anyone can provide.

    Read the article

  • Why doesn't the system information message when accessing an Ubuntu server match free -m?

    - by Andres
    Each time I SSH into my AWS Ubuntu servers I see a system information message showing load, memory usage, and packages available to install, like this:

        Welcome to Ubuntu 12.04.3 LTS (GNU/Linux 3.2.0-51-virtual x86_64)

         * Documentation:  https://help.ubuntu.com/

          System information as of Sun Nov 10 18:06:43 EST 2013

          System load:  0.08              Processes:           127
          Usage of /:   4.9% of 98.43GB   Users logged in:     1
          Memory usage: 69%               IP address for eth0: 10.236.136.233
          Swap usage:   100%

          Graph this data and manage this system at https://landscape.canonical.com/

          13 packages can be updated.
          0 updates are security updates.

          Get cloud support with Ubuntu Advantage Cloud Guest
            http://www.ubuntu.com/business/services/cloud

          Use Juju to deploy your cloud instances and workloads.
            https://juju.ubuntu.com/#cloud-precise

        *** /dev/xvda1 will be checked for errors at next reboot ***

        *** System restart required ***

    My question is about the memory percentage shown. In this case it shows 69% memory usage, but since the swap usage was 100% I checked it myself. When I run free -m I get this:

                     total       used       free     shared    buffers     cached
        Mem:          1652       1635         17          0          4         29
        -/+ buffers/cache:       1601         51
        Swap:          895        895          0

    And that is of course closer to 100% than to 69%.
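    For reference, here is the arithmetic behind the two common ways of reading that free -m output, as a quick sanity check on the numbers above (it does not explain where the banner's 69% comes from):

        # Numbers taken from the free -m output above (all in MB).
        total, used, buffers, cached = 1652, 1635, 4, 29

        print(f"used incl. buffers/cache: {used / total:.0%}")                        # ~99%
        print(f"used excl. buffers/cache: {(used - buffers - cached) / total:.0%}")   # ~97%

    Either way the box is nearly full; the banner figure is whatever was measured "as of" the timestamp it prints, so it can disagree with what free -m shows when you run it later.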

    Read the article

  • Is it recommended to use Windows XP System Restore?

    - by Stan
    I usually only enable System Restore on the OS drive, but even so, I rarely use it. Usually, when the machine gets infected, System Restore can't help resolve the issue, and apart from infections I can't think of any case that requires System Restore. So, is it recommended to enable it? Thanks.

    Read the article

  • Windows XP restore point file from disk.

    - by Dragos Toader
    Suppose I copied a Windows XP restore point file to a USB memory stick. I copied

        C:\System Volume Information\MountPointManagerRemoteDatabase
        C:\System Volume Information\tracking.log
        C:\System Volume Information\_restore{45B5E8B9-949A-471E-999D-F381DA56A2D3}
        C:\System Volume Information\catalog.wci

    to F:\System Volume Information\.

    How can I restore this restore point? Can I fool the system into using those files if I copy them back from F:\System Volume Information\ into the original restore point folder, C:\System Volume Information\?

    Read the article

  • Two network interfaces and two IP addresses on the same subnet in Linux

    - by Scott Duckworth
    I recently ran into a situation where I needed two IP addresses on the same subnet assigned to one Linux host so that we could run two SSL/TLS sites. My first approach was to use IP aliasing, e.g. using eth0:0, eth0:1, etc., but our network admins have some fairly strict security settings in place that squashed this idea:

    - They use DHCP snooping and normally don't allow static IP addresses. Static addressing is accomplished by using static DHCP entries, so the same MAC address always gets the same IP assignment. This feature can be disabled per switchport if you ask and you have a reason for it (thankfully I have a good relationship with the network guys and this isn't hard to do).
    - With DHCP snooping disabled on the switchport, they had to put in a rule on the switch that said MAC address X is allowed to have IP address Y. Unfortunately this had the side effect of also saying that MAC address X is ONLY allowed to have IP address Y. IP aliasing required that MAC address X was assigned two IP addresses, so this didn't work.

    There may have been a way around these issues in the switch configuration, but in an attempt to preserve good relations with the network admins I tried to find another way. Having two network interfaces seemed like the next logical step. Thankfully this Linux system is a virtual machine, so I was able to easily add a second network interface (without rebooting, I might add - pretty cool). A few keystrokes later I had two network interfaces up and running, and both pulled IP addresses from DHCP.

    But then the problem came in: the network admins could see (on the switch) the ARP entries for both interfaces, but only the first network interface that I brought up would respond to pings or any sort of TCP or UDP traffic. After lots of digging and poking, here's what I came up with. It seems to work, but it also seems to be a lot of work for something that should be simple. Any alternate ideas out there?

    Step 1: Enable ARP filtering on all interfaces:

        # sysctl -w net.ipv4.conf.all.arp_filter=1
        # echo "net.ipv4.conf.all.arp_filter = 1" >> /etc/sysctl.conf

    From the file networking/ip-sysctl.txt in the Linux kernel docs:

        arp_filter - BOOLEAN
            1 - Allows you to have multiple network interfaces on the same
                subnet, and have the ARPs for each interface be answered
                based on whether or not the kernel would route a packet from
                the ARP'd IP out that interface (therefore you must use source
                based routing for this to work). In other words it allows control
                of which cards (usually 1) will respond to an arp request.
            0 - (default) The kernel can respond to arp requests with addresses
                from other interfaces. This may seem wrong but it usually makes
                sense, because it increases the chance of successful communication.
                IP addresses are owned by the complete host on Linux, not by
                particular interfaces. Only for more complex setups like
                load-balancing, does this behaviour cause problems.

            arp_filter for the interface will be enabled if at least one of
            conf/{all,interface}/arp_filter is set to TRUE,
            it will be disabled otherwise

    Step 2: Implement source-based routing

    I basically just followed the directions from http://lartc.org/howto/lartc.rpdb.multiple-links.html, although that page was written with a different goal in mind (dealing with two ISPs). Assume that the subnet is 10.0.0.0/24, the gateway is 10.0.0.1, the IP address for eth0 is 10.0.0.100, and the IP address for eth1 is 10.0.0.101.

    Define two new routing tables named eth0 and eth1 in /etc/iproute2/rt_tables:

        ... top of file omitted ...
        1 eth0
        2 eth1

    Define the routes for these two tables:

        # ip route add default via 10.0.0.1 table eth0
        # ip route add default via 10.0.0.1 table eth1
        # ip route add 10.0.0.0/24 dev eth0 src 10.0.0.100 table eth0
        # ip route add 10.0.0.0/24 dev eth1 src 10.0.0.101 table eth1

    Define the rules for when to use the new routing tables:

        # ip rule add from 10.0.0.100 table eth0
        # ip rule add from 10.0.0.101 table eth1

    The main routing table was already taken care of by DHCP (and it's not even clear that it's strictly necessary in this case), but it basically equates to this:

        # ip route add default via 10.0.0.1 dev eth0
        # ip route add 10.0.0.0/24 dev eth0 src 10.0.0.100
        # ip route add 10.0.0.0/24 dev eth1 src 10.0.0.101

    And voila! Everything seems to work just fine. Sending pings to both IP addresses works fine. Sending pings from this system to other systems and forcing the ping to use a specific interface works fine (ping -I eth0 10.0.0.1, ping -I eth1 10.0.0.1). And most importantly, all TCP and UDP traffic to/from either IP address works as expected.

    So again, my question is: is there a better way to do this? This seems like a lot of work for a seemingly simple problem.

    Read the article

  • Is it possible to clone system drive in Windows 7?

    - by Ladislav Mrnka
    My current problem is that my Windows 7 system drive is unstable. I would like to try to clone this drive to the same type of disk (OCZ Vertex 2 120GB to OCZ Vertex 2 120GB) and replace the system drive with the created clone. My installation doesn't have ProgramData and user profiles on the system drive. Later on (after warranty replacement of the problematic drive), I would like to copy ProgramData and the user profiles to a different disk (Samsung SpinPoint 750GB to OCZ Vertex 2 120GB) and use the new disk instead. Note: the data is only a few GB, so there should not be any problem with disk size. Is it possible? What is the best way to do it? Or is it better to simply reinstall the system from scratch (which I would like to avoid)?

    Read the article

  • Web Matrix released

    - by TATWORTH
    Microsoft has now released WebMatrix (and ASP.NET MVC 3, if you are so inclined!). One significant utility is IIS Express, which will replace Cassini. It is also worth noting that SP1 for VS2010 should be out in Q1.

    Links:

    - http://www.hanselman.com/blog/ASPNETMVC3WebMatrixNuGetIISExpressAndOrchardReleasedTheMicrosoftJanuaryWebReleaseInContext.aspx
    - http://www.hanselman.com/blog/LinkRollupNewDocumentationAndTutorialsFromWebPlatformAndTools.aspx
    - http://arstechnica.com/microsoft/news/2011/01/microsoft-releases-free-webmatrix-web-development-tool.ars

    I am impressed by the copious tutorials on MVC, which I include below.

    Intro to ASP.NET MVC 3 onboarding series - a Scott Hanselman and Rick Anderson collaboration, with Mike Pope (Editor); both C# and VB versions:

    - Intro to ASP.NET MVC 3
    - Adding a Controller
    - Adding a View
    - Entity Framework Code-First Development
    - Accessing your Model's Data from a Controller
    - Adding a Create Method and Create View
    - Adding Validation to the Model
    - Adding a New Field to the Movie Model and Table
    - Implementing Edit, Details and Delete
    - Source code for this series

    MVC 3 updated and new tutorials / API Reference on MSDN - Rick Anderson (Lead Programming Writer), Keith Newman and Mike Pope (Editor):

    - ASP.NET MVC 3 Content Map
    - ASP.NET MVC Overview
    - MVC Framework and Application Structure
    - Understanding MVC Application Execution
    - Compatibility of ASP.NET Web Forms and MVC
    - Walkthrough: Creating a Basic ASP.NET MVC Project
    - Walkthrough: Using Forms Authentication in ASP.NET MVC
    - Controllers and Action Methods in ASP.NET MVC Applications
    - Using an Asynchronous Controller in ASP.NET MVC
    - Views and UI Rendering in ASP.NET MVC Applications
    - Rendering a Form Using HTML Helpers
    - Passing Data in an ASP.NET MVC Application
    - Walkthrough: Using Templated Helpers to Display Data in ASP.NET MVC
    - Creating an ASP.NET MVC View by Calling Multiple Actions
    - Models and Validation in ASP.NET MVC
    - How to: Validate Model Data Using DataAnnotations Attributes
    - Walkthrough: Using MVC View Templates
    - How to: Implement Remote Validation in ASP.NET MVC
    - Walkthrough: Adding AJAX Scripting
    - Walkthrough: Organizing an Application using Areas
    - Filtering in ASP.NET MVC
    - Creating Custom Action Filters
    - How to: Create a Custom Action Filter
    - Unit Testing in ASP.NET MVC Applications
    - Walkthrough: Using TDD with ASP.NET MVC
    - How to: Add a Custom ASP.NET MVC Test Framework in Visual Studio
    - ASP.NET MVC 3 Reference
    - System.Web.Mvc
    - System.Web.Mvc.Ajax
    - System.Web.Mvc.Async
    - System.Web.Mvc.Html
    - System.Web.Mvc.Razor

    Read the article

  • Announcing: Great Improvements to Windows Azure Web Sites

    - by Leniel Macaferi
    I'm excited to announce some great improvements to the Windows Azure Web Sites capability we introduced earlier this summer. Today's improvements include: a new low-cost shared scaling/hosting option, custom domain support for websites hosted in shared or reserved mode using CNAME and A-records (the latter enabling naked domains), continuous deployment support using both CodePlex and GitHub, and FastCGI extensibility. All of these improvements are now live in production and available to use immediately.

    New Scalable "Shared" Tier

    Windows Azure allows you to deploy and host up to 10 websites in a free, shared, multi-tenant hosting environment. You can start developing and testing websites at no cost using this free shared mode. Free mode supports the ability to run sites that serve up to 165 MB/day of content (5 GB/month). All of the capabilities we introduced in June with this free tier remain unchanged with today's update. Starting with today's release, you can now elastically scale your website beyond this capacity using a new low-cost "shared" option (which we are introducing today), as well as the "reserved instance" option we have supported since June. Scaling up with either of these modes is easy: just click the "scale" tab of your website within the Windows Azure Portal, choose the hosting mode you want to use with it, and click the "Save" button. Changes take only a few seconds to apply, require no code changes, and do not require the application to be redeployed.

    Below are a few more details on the new "shared" option, as well as the existing "reserved" option.

    Shared Mode

    With today's release we are introducing a new low-cost "shared" hosting mode for Windows Azure Web Sites. A website running in shared mode is deployed into a hosting environment shared with several other applications. Unlike the free mode option, a website in shared mode has no quota/cap on the amount of bandwidth it can serve. The first 5 GB/month of bandwidth you serve with a shared website is free; after that, you pay the standard "pay as you go" Windows Azure outbound bandwidth rate for traffic beyond 5 GB.

    A website running in shared mode now also supports mapping multiple custom DNS domain names to it, using both CNAMEs and A-records. The new A-record support we are introducing with today's release lets you support "naked domains" (domains without the www prefix) with your websites (for example, http://microsoft.com in addition to http://www.microsoft.com). In the future we will also enable SNI-based SSL as a built-in feature of websites running in shared mode (this capability is not supported with today's release, but is coming later this year for both the shared and reserved hosting options).

    You pay for a shared-mode website using the standard "pay as you go" model we support with other Windows Azure resources (meaning no upfront costs, and you only pay for the hours the resource is active). A website running in shared mode costs only 1.3 cents/hour during this preview period (which works out to roughly $9.36/month, or about R$ 19.00/month at the R$ 2.03 exchange rate of September 17, 2012).

    Reserved Mode

    In addition to running sites in shared mode, we also support running them within a reserved instance. When running in reserved instance mode, your sites are guaranteed to run isolated within your own Small, Medium or Large VM (meaning no other Windows Azure customer's applications run inside your VM, only yours). You can run any number of websites within a virtual machine, and there are no CPU or memory quotas. You can run your sites on a single reserved-instance VM, or scale out to multiple instances (for example, 2 medium-sized VMs, etc.). Scaling up or down is easy: just select the "reserved" instance VM within the "scale" tab of the Windows Azure Portal, choose the VM size you want and the number of instances to run, and click save. Changes take effect in seconds.

    Unlike shared mode, there is no per-site cost when running in reserved mode. Instead you pay only for the reserved VM instances you use, and you can run as many websites as you want inside them at no extra cost (for example, you can run a single site within a reserved VM instance, or 100 websites inside it, for the same price). Reserved VM instances start at 8 cents/hour (about R$ 0.16/hour) for a small reserved VM.

    Elastic Scale Up/Down

    Windows Azure Web Sites let you scale your capacity up or down within seconds. This allows you to deploy a site using the shared mode option to begin with, and then dynamically scale up to the reserved mode option only when you need it, without having to change any code or redeploy your application. If your site's traffic drops, you can reduce the number of reserved instances you are using, or move back to the shared mode tier - all within seconds and without having to change code, redeploy the application, or adjust DNS mappings. You can also use the "Dashboard" within the Windows Azure Portal to easily monitor your site's load in real time (it shows not only requests/second and bandwidth consumed, but also statistics such as CPU and memory usage). Because of Windows Azure's "pay as you go" pricing model, you only pay for the compute capacity you use in a given hour.

    So if your site runs most of the month in shared mode (at 1.3 cents/hour, about R$ 0.0264/hour), but there is a weekend when it gets very popular and you decide to scale up by putting it in reserved mode so it runs on its own dedicated VM (at 8 cents/hour, about R$ 0.16/hour), you only pay the additional cents/hour for the hours the site runs in reserved mode. There is no upfront cost to enable this, and once you return your site to shared mode you go back to paying 1.3 cents/hour. This makes the option extremely flexible and low cost.

    Improved Custom Domain Support

    Websites running in "shared" or "reserved" mode support having custom host names associated with them (for example www.mysitename.com), and you can associate multiple custom domains with each Windows Azure Web Site. With today's release we are introducing support for A-records (a frequently requested feature). With A-record support you can now associate 'naked' domains with your Windows Azure Web Site - that is, instead of having to use www.mysitename.com you can simply use mysitename.com (without the www prefix). Since you can map multiple domains to a single site, you can optionally enable both the www and the 'naked' domain for a site (and then use a URL rewrite/redirect rule to avoid SEO issues).

    We have also improved the user interface for managing custom domains within the Windows Azure Portal as part of today's release. Clicking the "Manage Domains" button in the tray at the bottom of the portal now brings up a dedicated UI that makes it easy to manage/configure domains. As part of this update we have also made validating ownership of custom domains significantly smoother, and made it easier to move existing sites/domains over to Windows Azure Web Sites without the website going offline.

    Continuous Deployment Support with Git and CodePlex or GitHub

    One of the most popular features we launched earlier this summer was support for publishing sites directly to Windows Azure using source control systems such as TFS and Git. This provides a very powerful way to manage application deployments using source control, and it is really easy to enable from a website's dashboard page.

    The TFS option we launched earlier this summer offers a very rich continuous deployment solution that lets you automate builds and run unit tests every time you update your website's repository and, if the tests pass, automatically publish/deploy the application to Windows Azure. With today's release we are expanding our Git support to also enable continuous deployment scenarios by integrating it with projects hosted on CodePlex and GitHub. This support is enabled for all websites (including those using the "free" mode).

    Starting today, when you choose the "Set up Git publishing" link on a website's dashboard page, you will see two additional options once Git-based publishing is enabled for the website. You can click either the "Deploy from my CodePlex project" or "Deploy from my GitHub project" link to follow a simple step-by-step wizard that sets up a connection between your website and a code repository you host on CodePlex or GitHub. Once this connection is established, CodePlex or GitHub will automatically notify Windows Azure every time a checkin occurs. Windows Azure will then download the code and build/deploy the new version of your application automatically. The following two videos (in English) show how easy it is to enable this workflow by deploying an initial app and then making a change to it:

    - Enabling Continuous Deployment with Windows Azure Websites and CodePlex (2 minutes)
    - Enabling Continuous Deployment with Windows Azure Websites and GitHub (2 minutes)

    This approach enables a really clean continuous deployment workflow, and makes it much easier to support a team development environment using Git. Note: today's release supports connections to public GitHub/CodePlex repositories; support for private repositories will be enabled in a few weeks.

    Support for Multiple Branches

    Previously we only supported deploying code located in the 'master' branch of the Git repository. Often, though, developers want to deploy from alternative branches (for example, a test branch or a branch with a future version of the application). This is now a supported scenario, both with local Git-based projects and with projects linked to CodePlex or GitHub, and it enables a variety of useful workflows. For example, you can now have two websites - one for "production" and one for "testing" - both linked to the same repository on CodePlex or GitHub. You can configure one website to always pull whatever is in the master branch, and the other to always pull whatever is in the test branch. This provides a very clean way to do final testing of your site before it goes into production. A 1-minute video (in English) demonstrates how to configure which branch a website uses.

    Summary

    The features above are now live in production and available to use immediately. If you don't yet have a Windows Azure account, you can sign up for a free trial and start using them today. Visit the Windows Azure Developer Center (in English) to learn more about building applications for the cloud. We will have even more new features and improvements coming in the next few weeks - including support for the recent Windows Server 2012 and .NET 4.5 releases (we will enable new web and worker role images with Windows Server 2012 and .NET 4.5 next month). Keep an eye on my blog for details as these new features become available.

    Hope this helps, - Scott

    P.S. In addition to the blog, I am also using Twitter for quick updates and to share links. Follow me at: twitter.com/ScottGu. Translated from the original post by Leniel Macaferi.
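    To make the billing example above concrete, here is the post's arithmetic as a tiny Python calculation, using the preview prices it quotes ($0.013/hour for shared mode, $0.08/hour for a small reserved VM); the 48-hour "busy weekend" split is just an illustration:

        SHARED_PER_HOUR   = 0.013   # shared mode preview price quoted in the post (USD)
        RESERVED_PER_HOUR = 0.08    # small reserved VM preview price quoted in the post (USD)

        hours_in_month = 24 * 30
        print(f"all month in shared mode: ${SHARED_PER_HOUR * hours_in_month:.2f}")   # $9.36

        # Hypothetical example: one busy 48-hour weekend in reserved mode, shared otherwise.
        weekend_hours = 48
        mixed = (RESERVED_PER_HOUR * weekend_hours
                 + SHARED_PER_HOUR * (hours_in_month - weekend_hours))
        print(f"shared month with one reserved weekend: ${mixed:.2f}")                # $12.58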

    Read the article

  • Migrate ASP.Net web site from IIS6 to IIS7

    - by David.Chu.ca
    I have to migrate an ASP.NET web site from IIS6 to IIS7. I tried to copy all the files for the web site from IIS6 (c:\inetpub\wwwroot\MySite) to another box with Windows Server 2008 R2, where IIS7 is the default web server. However, the simple copy does not seem to work. Should I rebuild the web site for IIS7, or should I make changes on the new box with IIS7, such as in web.config? Thanks for the comments.

    On further investigation I found that the httpHandlers section seems to have caused the exception:

        <!--httpHandlers>
            <add path="Reserved.ReportViewerWebControl.axd" verb="*"
                 type="Microsoft.Reporting.WebForms.HttpHandler, Microsoft.ReportViewer.WebForms,
                       Version=8.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"
                 validate="false"/>
        </httpHandlers-->

    After I commented out the above handler in web.config, the web page works fine. This is just my initial test. I am not sure whether I should rebuild the web site from source code or not. If so, do I need to target IIS7 specifically?

    Read the article

  • What is my HttpContext.GetLocalResourceObject Method Virtual Path?

    - by ARUNRAJ
    I have read http://msdn.microsoft.com/en-us/library/ms149953.aspx and need to work out what my GetLocalResourceObject virtual path should be. My local resource files are located on my PC at C:\inetpub\wwwroot\GlobalX\Input\App_LocalResources. Within this folder are my resource files for all the languages that the site handles (InputContactDetails.aspx.ro.resx, InputContactDetails.aspx.hi.resx, etc.), as well as the default resource file (InputContactDetails.aspx.resx). I am receiving an error when I attempt to implement the virtual path string. Below is my offending line of code:

        return '<%= HttpContext.GetLocalResourceObject("~/GlobalX/Input/App_LocalResources/InputContactDetails.aspx.resx", "ContactDetails.Text", new System.Globalization.CultureInfo("ro")) %>';

    I have tried ~/GlobalX/Input/App_LocalResources as the virtual path, and several other permutations, but I get the same error. If someone could show me what I am doing wrong, I would appreciate it greatly. Here is the error message I am getting: The resource class for this page was not found. Please check if the resource file exists and try again. Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code. Exception Details: System.InvalidOperationException: The resource class for this page was not found. Please check if the resource file exists and try again. Source Error: Line 410: function languageContactPromptPhone(var_lcs) { Line 411: if (var_lcs == "af") { Line 412: return '<%= HttpContext.GetLocalResourceObject("~/GlobalX/Input/App_LocalResources/InputContactDetails.aspx.resx", "ContactDetails.Text", new System.Globalization.CultureInfo("ro")) %'; Line 413: } Line 414: else if (var_lcs == "sq") { Source File: c:\inetpub\wwwroot\GlobalX\Input\InputContactDetails.aspx Line: 412 Stack Trace: [InvalidOperationException: The resource class for this page was not found. Please check if the resource file exists and try again.]
System.Web.Compilation.LocalResXResourceProvider.CreateResourceManager() +2785818 System.Web.Compilation.BaseResXResourceProvider.EnsureResourceManager() +24 System.Web.Compilation.BaseResXResourceProvider.GetObject(String resourceKey, CultureInfo culture) +15 System.Web.Compilation.ResourceExpressionBuilder.GetResourceObject(IResourceProvider resourceProvider, String resourceKey, CultureInfo culture, Type objType, String propName) +23 System.Web.HttpContext.GetLocalResourceObject(String virtualPath, String resourceKey, CultureInfo culture) +38 ASP.input_inputcontactdetails_aspx.__RenderContentInputContactDetails(HtmlTextWriter __w, Control parameterContainer) in c:\inetpub\wwwroot\GlobalX\Input\InputContactDetails.aspx:412 System.Web.UI.Control.RenderChildrenInternal(HtmlTextWriter writer, ICollection children) +109 System.Web.UI.Control.RenderChildren(HtmlTextWriter writer) +8 System.Web.UI.Control.Render(HtmlTextWriter writer) +10 System.Web.UI.Control.RenderControlInternal(HtmlTextWriter writer, ControlAdapter adapter) +27 System.Web.UI.Control.RenderControl(HtmlTextWriter writer, ControlAdapter adapter) +8991378 System.Web.UI.Control.RenderControl(HtmlTextWriter writer) +25 System.Web.UI.Control.RenderChildrenInternal(HtmlTextWriter writer, ICollection children) +208 System.Web.UI.Control.RenderChildren(HtmlTextWriter writer) +8 System.Web.UI.Control.Render(HtmlTextWriter writer) +10 System.Web.UI.Control.RenderControlInternal(HtmlTextWriter writer, ControlAdapter adapter) +27 System.Web.UI.Control.RenderControl(HtmlTextWriter writer, ControlAdapter adapter) +8991378 System.Web.UI.Control.RenderControl(HtmlTextWriter writer) +25 System.Web.UI.Control.RenderChildrenInternal(HtmlTextWriter writer, ICollection children) +208 System.Web.UI.UpdatePanel.RenderChildren(HtmlTextWriter writer) +256 System.Web.UI.UpdatePanel.Render(HtmlTextWriter writer) +37 System.Web.UI.Control.RenderControlInternal(HtmlTextWriter writer, ControlAdapter adapter) +27 System.Web.UI.Control.RenderControl(HtmlTextWriter writer, ControlAdapter adapter) +8991378 System.Web.UI.Control.RenderControl(HtmlTextWriter writer) +25 ASP.masterpages_masterinput_master.__RenderformMasterInput(HtmlTextWriter __w, Control parameterContainer) in c:\inetpub\wwwroot\GlobalX\MasterPages\MasterInput.master:140 System.Web.UI.Control.RenderChildrenInternal(HtmlTextWriter writer, ICollection children) +109 System.Web.UI.HtmlControls.HtmlForm.RenderChildren(HtmlTextWriter writer) +173 System.Web.UI.HtmlControls.HtmlContainerControl.Render(HtmlTextWriter writer) +31 System.Web.UI.HtmlControls.HtmlForm.Render(HtmlTextWriter output) +53 System.Web.UI.Control.RenderControlInternal(HtmlTextWriter writer, ControlAdapter adapter) +27 System.Web.UI.Control.RenderControl(HtmlTextWriter writer, ControlAdapter adapter) +8991378 System.Web.UI.HtmlControls.HtmlForm.RenderControl(HtmlTextWriter writer) +40 System.Web.UI.Control.RenderChildrenInternal(HtmlTextWriter writer, ICollection children) +208 System.Web.UI.Control.RenderChildren(HtmlTextWriter writer) +8 System.Web.UI.Control.Render(HtmlTextWriter writer) +10 System.Web.UI.Control.RenderControlInternal(HtmlTextWriter writer, ControlAdapter adapter) +27 System.Web.UI.Control.RenderControl(HtmlTextWriter writer, ControlAdapter adapter) +8991378 System.Web.UI.Control.RenderControl(HtmlTextWriter writer) +25 System.Web.UI.Control.RenderChildrenInternal(HtmlTextWriter writer, ICollection children) +208 System.Web.UI.Control.RenderChildren(HtmlTextWriter writer) +8 
System.Web.UI.Page.Render(HtmlTextWriter writer) +29 System.Web.UI.Control.RenderControlInternal(HtmlTextWriter writer, ControlAdapter adapter) +27 System.Web.UI.Control.RenderControl(HtmlTextWriter writer, ControlAdapter adapter) +8991378 System.Web.UI.Control.RenderControl(HtmlTextWriter writer) +25 System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) +3060

    Read the article

  • Partner Webcast – Implementing Web Services & SOA Security with Oracle Fusion Middleware - 20 September 2012

    - by Thanos
    Security has always been one of the main pain points for the IT industry, and new security challenges have been introduced with the proliferation of the service-oriented approach to building modern software. Oracle Fusion Middleware provides a wide variety of features that ease building service-oriented solutions, but how can these services be secured? Should we implement the security features in each and every service, or is there a better way? During the webinar we are going to show how to implement non-intrusive, declarative security for your SOA components by introducing the Oracle product portfolio in this area, such as Oracle Web Services Manager and Oracle IDM.

    Agenda:

    - SOA & Web Services basics: quick refresher
    - Building your SOA with Oracle Fusion Middleware: product review
    - Common security risks in the Web Services world
    - SOA & Web Services security standards
    - Implementing Web Services Security with the Oracle products
    - Web Services Security with Oracle – the big picture
    - Declarative end point security with Oracle Web Services Manager
    - Perimeter Security with Oracle Enterprise Gateway
    - Utilizing the other Oracle IDM products for the advanced scenarios
    - Q&A session

    Delivery Format: This FREE online LIVE eSeminar will be delivered over the Web. Registrations received less than 24 hours prior to start time may not receive confirmation to attend.

    Thursday, September 20, 2012 - 10:00 AM to 11:00 AM CET (GMT/UTC+1). Duration: 1 hour. Register Now.

    Send your questions and migration/upgrade requests to [email protected]. Visit our ISV Migration Center blog regularly or follow us @oracleimc to learn more about Oracle technologies, upcoming partner webcasts and events. All content is made available through our YouTube - SlideShare - Oracle Mix.

    Read the article

  • Writing files in a sub folder of the web folder (apache security)

    - by Homunculus Reticulli
    I need to save session data for a dynamic web page script by writing it to a file. I have two questions:

    1. Are there any security preferences as to whether to save the data UNDER the web folder or OUTSIDE the web folder? I attempted to write to the folder and (unsurprisingly) got a 'file permission refused' type error.
    2. Should I set the folder ownership to the Apache user (600, 640 or 644)?

    [[Edit]]

        core   <- 'OUTSIDE' the web folder (PHP scripts live here)
        data   <- 'OUTSIDE' the web folder (session data and other misc data reside here)
        web    <- web root folder (any folder below is 'INSIDE' the web folder)
            js
            css
            html

    For example, in a PHP script (i.e. a dynamic PHP page), I can attempt to write to a file using something like fput('../data', data), yet (as I understand it) ../data should not be accessible, for security reasons. Could someone please provide a simple example that shows how to provide access to ../data/ in the layout given above? What are the actual SPECIFIC steps required? BTW, I am running on a LAMP stack.

    Read the article

  • Difference between Cassini [IIS Express] and VS Development Server or Expression Web

    - by anirudha
    An MVC3 project can be run within Expression Web as well as Visual Studio, opened as a website rather than a project, and they work much the same. Even so, if you open a BlogEngine.NET project in VS it takes a while to start when you have many themes, whereas Expression Web debugs it in a second, because theme design does not matter to the code. Expression Web is good because it saves the time spent compiling when the changes are small design tweaks with nothing touched in the backend code.

    I found a small difference between Cassini and the VS development server: if an image is referenced the wrong way, like <img src="//img.png"/> instead of <img src="/img.png"/> (the mistake being // instead of /), that does not work when debugging in Expression Web or Visual Studio, but in Cassini it works fine.

    I also found that debugging BlogEngine.NET in Expression Web is a great thing, because in VS it can take around a minute the first time you start debugging. Expression Web also saves time when you design themes within it, which is a good option because the web is made for design too.

    So if you want to debug an application faster, use Cassini; but Expression Web debugging is a good option when a project takes a long time to debug in Visual Studio and EW debugs it in seconds.

    Read the article

  • How to write code that communicates with an accelerator in the real address space (real mode)?

    - by ysap
    This is a preliminary question for a task where I was asked to write a host-accelerator program on an embedded system we are building. The system is composed of (among the standard peripherals) an ARM core and an accelerator processor. Both processors access the system bus via their bus interfaces and share the same 32-bit global physical memory space; both share access to the system's DRAM through the system bus. (The concept is similar to a BeagleBoard or Raspberry Pi, but with a specialized accelerator added.) The accelerator has its own internal memory (SRAM), which is exposed to the system and occupies a portion of the global address space (as opposed to how a graphics card would talk to the CPU via a "small" aperture in the system memory space). On the ARM core (the host) we plan on running Ubuntu 12.04.

    The mode of operation for communicating between the processors should be that the host issues memory transactions on the system bus that are targeted at the accelerator's internal memory. As far as my understanding goes, if I write a program for the host that simply writes to the physical address of the accelerator, chances are the program will crash due to a segmentation violation. So I assume I need some way of communicating with the device in "real mode". What is the easiest way to achieve this mode of operation?
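    On Linux the usual answer is not real mode but mapping the accelerator's physical window into your process with mmap. Below is a rough user-space sketch in Python; the base address and window size are hypothetical placeholders, it must run as root, access to /dev/mem may be restricted by CONFIG_STRICT_DEVMEM, and a production design would normally hide this behind a small kernel driver or a UIO device instead.

        # Sketch: map a (hypothetical) accelerator SRAM window into user space via /dev/mem.
        import mmap
        import os
        import struct

        ACCEL_BASE = 0x4000_0000          # hypothetical physical base of the accelerator SRAM
        ACCEL_SIZE = 0x1000               # hypothetical window size (page aligned)

        fd = os.open("/dev/mem", os.O_RDWR | os.O_SYNC)
        try:
            window = mmap.mmap(fd, ACCEL_SIZE,
                               flags=mmap.MAP_SHARED,
                               prot=mmap.PROT_READ | mmap.PROT_WRITE,
                               offset=ACCEL_BASE)
        finally:
            os.close(fd)                  # the mapping stays valid after the fd is closed

        # 32-bit, register-style accesses into the mapped window.
        def read32(off):
            return struct.unpack_from("<I", window, off)[0]

        def write32(off, value):
            struct.pack_into("<I", window, off, value)

        write32(0x0, 0x1)                 # e.g. poke a (hypothetical) control word
        print(hex(read32(0x4)))           # and read back a (hypothetical) status word
        window.close()

    A kernel driver or UIO device wraps up the same idea more safely: the driver does the ioremap and exposes an mmap-able node, and user space only ever sees the window it is allowed to touch.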

    Read the article

  • Announcing release of ASP.NET MVC 3, IIS Express, SQL CE 4, Web Farm Framework, Orchard, WebMatrix

    - by ScottGu
    I’m excited to announce the release today of several products: ASP.NET MVC 3 NuGet IIS Express 7.5 SQL Server Compact Edition 4 Web Deploy and Web Farm Framework 2.0 Orchard 1.0 WebMatrix 1.0 The above products are all free. They build upon the .NET 4 and VS 2010 release, and add a ton of additional value to ASP.NET (both Web Forms and MVC) and the Microsoft Web Server stack. ASP.NET MVC 3 Today we are shipping the final release of ASP.NET MVC 3.  You can download and install ASP.NET MVC 3 here.  The ASP.NET MVC 3 source code (released under an OSI-compliant open source license) can also optionally be downloaded here. ASP.NET MVC 3 is a significant update that brings with it a bunch of great features.  Some of the improvements include: Razor ASP.NET MVC 3 ships with a new view-engine option called “Razor” (in addition to continuing to support/enhance the existing .aspx view engine).  Razor minimizes the number of characters and keystrokes required when writing a view template, and enables a fast, fluid coding workflow. Unlike most template syntaxes, with Razor you do not need to interrupt your coding to explicitly denote the start and end of server blocks within your HTML. The Razor parser is smart enough to infer this from your code. This enables a compact and expressive syntax which is clean, fast and fun to type.  You can learn more about Razor from some of the blog posts I’ve done about it over the last 6 months Introducing Razor New @model keyword in Razor Layouts with Razor Server-Side Comments with Razor Razor’s @: and <text> syntax Implicit and Explicit code nuggets with Razor Layouts and Sections with Razor Today’s release supports full code intellisense support for Razor (both VB and C#) with Visual Studio 2010 and the free Visual Web Developer 2010 Express. JavaScript Improvements ASP.NET MVC 3 enables richer JavaScript scenarios and takes advantage of emerging HTML5 capabilities. The AJAX and Validation helpers in ASP.NET MVC 3 now use an Unobtrusive JavaScript based approach.  Unobtrusive JavaScript avoids injecting inline JavaScript into HTML, and enables cleaner separation of behavior using the new HTML 5 “data-“ attribute convention (which conveniently works on older browsers as well – including IE6). This keeps your HTML tight and clean, and makes it easier to optionally swap out or customize JS libraries.  ASP.NET MVC 3 now includes built-in support for posting JSON-based parameters from client-side JavaScript to action methods on the server.  This makes it easier to exchange data across the client and server, and build rich JavaScript front-ends.  We think this capability will be particularly useful going forward with scenarios involving client templates and data binding (including the jQuery plugins the ASP.NET team recently contributed to the jQuery project).  Previous releases of ASP.NET MVC included the core jQuery library.  ASP.NET MVC 3 also now ships the jQuery Validate plugin (which our validation helpers use for client-side validation scenarios).  We are also now shipping and including jQuery UI by default as well (which provides a rich set of client-side JavaScript UI widgets for you to use within projects). Improved Validation ASP.NET MVC 3 includes a bunch of validation enhancements that make it even easier to work with data. Client-side validation is now enabled by default with ASP.NET MVC 3 (using an onbtrusive javascript implementation).  
Today’s release also includes built-in support for Remote Validation - which enables you to annotate a model class with a validation attribute that causes ASP.NET MVC to perform a remote validation call to a server method when validating input on the client. The validation features introduced within .NET 4’s System.ComponentModel.DataAnnotations namespace are now supported by ASP.NET MVC 3.  This includes support for the new IValidatableObject interface – which enables you to perform model-level validation, and allows you to provide validation error messages specific to the state of the overall model, or between two properties within the model.  ASP.NET MVC 3 also supports the improvements made to the ValidationAttribute class in .NET 4.  ValidationAttribute now supports a new IsValid overload that provides more information about the current validation context, such as what object is being validated.  This enables richer scenarios where you can validate the current value based on another property of the model.  We’ve shipped a built-in [Compare] validation attribute  with ASP.NET MVC 3 that uses this support and makes it easy out of the box to compare and validate two property values. You can use any data access API or technology with ASP.NET MVC.  This past year, though, we’ve worked closely with the .NET data team to ensure that the new EF Code First library works really well for ASP.NET MVC applications.  These two posts of mine cover the latest EF Code First preview and demonstrates how to use it with ASP.NET MVC 3 to enable easy editing of data (with end to end client+server validation support).  The final release of EF Code First will ship in the next few weeks. Today we are also publishing the first preview of a new MvcScaffolding project.  It enables you to easily scaffold ASP.NET MVC 3 Controllers and Views, and works great with EF Code-First (and is pluggable to support other data providers).  You can learn more about it – and install it via NuGet today - from Steve Sanderson’s MvcScaffolding blog post. Output Caching Previous releases of ASP.NET MVC supported output caching content at a URL or action-method level. With ASP.NET MVC V3 we are also enabling support for partial page output caching – which allows you to easily output cache regions or fragments of a response as opposed to the entire thing.  This ends up being super useful in a lot of scenarios, and enables you to dramatically reduce the work your application does on the server.  The new partial page output caching support in ASP.NET MVC 3 enables you to easily re-use cached sub-regions/fragments of a page across multiple URLs on a site.  It supports the ability to cache the content either on the web-server, or optionally cache it within a distributed cache server like Windows Server AppFabric or memcached. I’ll post some tutorials on my blog that show how to take advantage of ASP.NET MVC 3’s new output caching support for partial page scenarios in the future. Better Dependency Injection ASP.NET MVC 3 provides better support for applying Dependency Injection (DI) and integrating with Dependency Injection/IOC containers. With ASP.NET MVC 3 you no longer need to author custom ControllerFactory classes in order to enable DI with Controllers.  
You can instead just register a Dependency Injection framework with ASP.NET MVC 3 and it will resolve dependencies not only for Controllers, but also for Views, Action Filters, Model Binders, Value Providers, Validation Providers, and Model Metadata Providers that you use within your application. This makes it much easier to cleanly integrate dependency injection within your projects. Other Goodies ASP.NET MVC 3 includes dozens of other nice improvements that help to both reduce the amount of code you write, and make the code you do write cleaner.  Here are just a few examples: Improved New Project dialog that makes it easy to start new ASP.NET MVC 3 projects from templates. Improved Add->View Scaffolding support that enables the generation of even cleaner view templates. New ViewBag property that uses .NET 4’s dynamic support to make it easy to pass late-bound data from Controllers to Views. Global Filters support that allows specifying cross-cutting filter attributes (like [HandleError]) across all Controllers within an app. New [AllowHtml] attribute that allows for more granular request validation when binding form posted data to models. Sessionless controller support that allows fine grained control over whether SessionState is enabled on a Controller. New ActionResult types like HttpNotFoundResult and RedirectPermanent for common HTTP scenarios. New Html.Raw() helper to indicate that output should not be HTML encoded. New Crypto helpers for salting and hashing passwords. And much, much more… Learn More about ASP.NET MVC 3 We will be posting lots of tutorials and samples on the http://asp.net/mvc site in the weeks ahead.  Below are two good ASP.NET MVC 3 tutorials available on the site today: Build your First ASP.NET MVC 3 Application: VB and C# Building the ASP.NET MVC 3 Music Store We’ll post additional ASP.NET MVC 3 tutorials and videos on the http://asp.net/mvc site in the future. Visit it regularly to find new tutorials as they are published. How to Upgrade Existing Projects ASP.NET MVC 3 is compatible with ASP.NET MVC 2 – which means it should be easy to update existing MVC projects to ASP.NET MVC 3.  The new features in ASP.NET MVC 3 build on top of the foundational work we’ve already done with the MVC 1 and MVC 2 releases – which means that the skills, knowledge, libraries, and books you’ve acquired are all directly applicable with the MVC 3 release.  MVC 3 adds new features and capabilities – it doesn’t obsolete existing ones. You can upgrade existing ASP.NET MVC 2 projects by following the manual upgrade steps in the release notes.  Alternatively, you can use this automated ASP.NET MVC 3 upgrade tool to easily update your  existing projects. Localized Builds Today’s ASP.NET MVC 3 release is available in English.  We will be releasing localized versions of ASP.NET MVC 3 (in 9 languages) in a few days.  I’ll blog pointers to the localized downloads once they are available. NuGet Today we are also shipping NuGet – a free, open source, package manager that makes it easy for you to find, install, and use open source libraries in your projects. It works with all .NET project types (including ASP.NET Web Forms, ASP.NET MVC, WPF, WinForms, Silverlight, and Class Libraries).  You can download and install it here. NuGet enables developers who maintain open source projects (for example, .NET projects like Moq, NHibernate, Ninject, StructureMap, NUnit, Windsor, Raven, Elmah, etc) to package up their libraries and register them with an online gallery/catalog that is searchable.  
The client-side NuGet tools, which include full Visual Studio integration, make it trivial for any .NET developer who wants to use one of these libraries to find and install it within the project they are working on. NuGet handles dependency management between libraries (for example: library1 depends on library2). It also makes it easy to update (and optionally remove) libraries from your projects later. It supports updating web.config files (if a package needs configuration settings), and it allows packages to add PowerShell scripts to a project (for example: scaffold commands). Importantly, NuGet is transparent and clean; it does not install anything at the system level. Instead it is focused on making it easy to manage the libraries you use with your projects. Our goal with NuGet is to make it as simple as possible to integrate open source libraries within .NET projects.

NuGet Gallery

This week we also launched a beta version of the http://nuget.org web site, which allows anyone to easily search and browse an online gallery of open source packages available via NuGet. The site also now allows developers to optionally submit new packages that they wish to share with others. You can learn more about how to create and share a package here. There are hundreds of open source .NET projects already within the NuGet Gallery today. We hope to have thousands there in the future.

IIS Express 7.5

Today we are also shipping IIS Express 7.5. IIS Express is a free version of IIS 7.5 that is optimized for developer scenarios. It works for both ASP.NET Web Forms and ASP.NET MVC project types.

We think IIS Express combines the ease of use of the ASP.NET Web Server (aka Cassini) currently built into Visual Studio with the full power of IIS. Specifically:

- It's lightweight and easy to install (less than a 5 MB download and a quick install).
- It does not require an administrator account to run/debug applications from Visual Studio.
- It enables a full web server feature set, including SSL, URL Rewrite, and other IIS 7.x modules.
- It supports the same extensibility model and web.config file settings that IIS 7.x supports.
- It can be installed side by side with the full IIS web server as well as the ASP.NET Development Server (they do not conflict at all).
- It works on Windows XP and higher operating systems, giving you a full IIS 7.x developer feature set on all Windows OS platforms.

IIS Express (like the ASP.NET Development Server) can be quickly launched to run a site from a directory on disk. It does not require any registration or configuration steps, which makes it really easy to launch and run for development scenarios. You can also optionally redistribute IIS Express with your own applications if you want a lightweight web server; the standard IIS Express EULA now includes redistribution rights.

Visual Studio 2010 SP1 adds support for IIS Express. Read my VS 2010 SP1 and IIS Express blog post to learn more about what it enables.

SQL Server Compact Edition 4

Today we are also shipping SQL Server Compact Edition 4 (aka SQL CE 4). SQL CE is a free, embedded database engine that enables easy database storage.

No Database Installation Required

SQL CE does not require you to run a setup or install a database server in order to use it. You can simply copy the SQL CE binaries into the \bin directory of your ASP.NET application, and then your web application can use it as a database engine. No setup or extra security permissions are required for it to run.
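For example, once the binaries are bin-deployed, ordinary ADO.NET code can open a database file that lives under App_Data (a sketch; the .sdf file name and table are invented):

```csharp
using System.Data.SqlServerCe;

public static class ProductCatalog
{
    // |DataDirectory| resolves to the App_Data folder of the ASP.NET application.
    private const string ConnectionString = @"Data Source=|DataDirectory|\Catalog.sdf";

    public static int CountProducts()
    {
        using (var connection = new SqlCeConnection(ConnectionString))
        using (var command = new SqlCeCommand("SELECT COUNT(*) FROM Products", connection))
        {
            connection.Open();
            return (int)command.ExecuteScalar();
        }
    }
}
```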
You do not need to have an administrator account on the machine. Just copy your web application onto any server and it will work. This is true even of medium-trust applications running in a web hosting environment.

SQL CE runs in-memory within your ASP.NET application; it starts up when you first access a SQL CE database and automatically shuts down when your application is unloaded. SQL CE databases are stored as files that live within the \App_Data folder of your ASP.NET applications.

Works with Existing Data APIs

SQL CE 4 works with existing .NET-based data APIs and supports a SQL Server compatible query syntax. This means you can use existing data APIs like ADO.NET, as well as higher-level ORMs like Entity Framework and NHibernate, with SQL CE. This enables you to use the same data programming skills and data APIs you know today.

Supports Development, Testing and Production Scenarios

SQL CE can be used for development scenarios, testing scenarios, and light production usage scenarios. With the SQL CE 4 release we've done the engineering work to ensure that SQL CE won't crash or deadlock when used in a multi-threaded server scenario (like ASP.NET). This is a big change from previous releases of SQL CE, which were designed for client-only scenarios and explicitly blocked running in web server environments. Starting with SQL CE 4 you can use it in a web server as well. There are no license restrictions with SQL CE, and it is totally free.

Tooling Support with VS 2010 SP1

Visual Studio 2010 SP1 adds support for SQL CE 4 and ASP.NET projects. Read my VS 2010 SP1 and SQL CE 4 blog post to learn more about what it enables.

Web Deploy and Web Farm Framework 2.0

Today we are also releasing Microsoft Web Deploy V2 and Microsoft Web Farm Framework V2. These services provide a flexible and powerful way to deploy ASP.NET applications onto either a single server or across a web farm of machines. You can learn more about these capabilities from my previous blog posts on them:

- Introducing the Microsoft Web Farm Framework
- Automating Deployment with Microsoft Web Deploy

Visit the http://iis.net website to learn more and install them. Both are free.

Orchard 1.0

Today we are also releasing Orchard v1.0. Orchard is a free, open source, community-based project. It provides Content Management System (CMS) and blogging system support out of the box, and makes it possible to easily create and manage websites without having to write code (site owners can customize a site through the browser-based editing tools built into Orchard). Read these tutorials to learn more about how you can set up and manage your own Orchard site.

Orchard itself is built as an ASP.NET MVC 3 application using Razor view templates (and by default uses SQL CE 4 for data storage). Developers wishing to extend an Orchard site with custom functionality can open and edit it as a Visual Studio project and add new ASP.NET MVC Controllers/Views to it.

WebMatrix 1.0

WebMatrix is a new, free web development tool from Microsoft that provides a suite of technologies that make website development easier. It enables a developer to start a new site by browsing and downloading an app template from an online gallery of web applications (which includes popular apps like Umbraco, DotNetNuke, Orchard, WordPress, Drupal and Joomla). Alternatively, it also enables developers to create and code web sites from scratch. WebMatrix is task focused and helps guide developers as they work on sites.
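To give a feel for the programming model, a page in a WebMatrix site is just a Razor .cshtml file; the sketch below uses the built-in Database helper against an invented database and table:

```cshtml
@{
    // Opens a SQL CE database from App_Data by name (the name is illustrative).
    var db = Database.Open("PhotoGallery");
    var photos = db.Query("SELECT Id, Title FROM Photos ORDER BY Title");
}
<ul>
    @foreach (var photo in photos) {
        <li>@photo.Title</li>
    }
</ul>
```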
WebMatrix includes IIS Express, SQL CE 4, and ASP.NET, providing an integrated web server, database, and programming framework combination. It also includes built-in web publishing support which makes it easy to find and deploy sites to web hosting providers.

You can learn more about WebMatrix from my Introducing WebMatrix blog post this summer. Visit http://microsoft.com/web to download and install it today.

Summary

I'm really excited about today's releases; they provide a bunch of additional value that makes web development with ASP.NET, Visual Studio and the Microsoft Web Server a lot better. A lot of folks worked hard to share this with you today.

On behalf of my whole team, we hope you enjoy them!

Scott

P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu


  • Windows Azure Recipe: Social Web / Big Media

    - by Clint Edmonson
With the rise of social media there's been an explosion of special interest media web sites on the web. From athletics to board games to funny animal behaviors, you can bet there's a group of people somewhere on the web talking about it. Social media sites allow us to interact, share experiences, and bond with like-minded enthusiasts around the globe. And through the power of software, we can follow trends in these unique domains in real time.

Drivers

- Reach
- Scalability
- Media hosting
- Global distribution

Solution

Here's a sketch of how a social media application might be built out on Windows Azure:

Ingredients

- Traffic Manager (optional) – can be used to provide hosting and load balancing across different instances and/or data centers. Perfect if the solution needs to be delivered to different cultures or regions around the world.
- Access Control – this service is essential to managing user identity. It's backed by a full-blown implementation of Active Directory and allows the definition and management of users, groups, and roles. A pre-built ASP.NET membership provider is included in the training kit to leverage this capability, but it's also flexible enough to be combined with external identity providers including Windows LiveID, Google, Yahoo!, and Facebook. The provider model has extensibility points to hook into other identity providers as well.
- Web Role – hosts the core of the web application and presents a central social hub to users.
- Database – used to store core operational, functional, and workflow data for the solution's web services.
- Caching (optional) – as a web site's traffic grows, caching can be leveraged to keep frequently used read-only, user-specific, and application resource data in a high-speed distributed in-memory cache for faster response times and ultimately higher scalability without spinning up more web and worker roles. It includes a token-based security model that works alongside the Access Control service.
- Tables (optional) – for semi-structured data streams that don't need relational integrity, such as conversations, comments, or activity streams, tables provide a faster and more flexible way to store this kind of historical data.
- Blobs (optional) – users may be creating or uploading large volumes of heterogeneous data such as documents or rich media. Blob storage provides a scalable, resilient way to store terabytes of user data. The storage facilities can also integrate with the Access Control service to ensure users' data is delivered securely. (A minimal upload sketch appears at the end of this recipe.)
- Content Delivery Network (CDN) (optional) – for sites that serve users around the globe, the CDN is an extension to blob storage that, when enabled, will automatically cache frequently accessed blobs and static site content at edge data centers around the world. The data can be delivered statically or streamed in the case of rich media content.

Training

These links point to online Windows Azure training labs and resources where you can learn more about the individual ingredients described above. (Note: The entire Windows Azure Training Kit can also be downloaded for offline use.)

Windows Azure (16 labs)

Windows Azure is an internet-scale cloud computing and services platform hosted in Microsoft data centers, which provides an operating system and a set of developer services that can be used individually or together. It gives developers the choice to build web applications; applications running on connected devices, PCs, or servers; or hybrid solutions offering the best of both worlds.
New or enhanced applications can be built using existing skills with the Visual Studio development environment and the .NET Framework. With its standards-based and interoperable approach, the services platform supports multiple internet protocols, including HTTP, REST, SOAP, and plain XML.

SQL Azure (7 labs)

Microsoft SQL Azure delivers on the Microsoft Data Platform vision of extending SQL Server capabilities to the cloud as web-based services, enabling you to store structured, semi-structured, and unstructured data.

Windows Azure Services (9 labs)

As applications collaborate across organizational boundaries, ensuring secure transactions across disparate security domains is crucial but difficult to implement. Windows Azure Services provides hosted authentication and access control using powerful, secure, standards-based infrastructure.

See my Windows Azure Resource Guide for more guidance on how to get started, including links to web portals, training kits, samples, and blogs related to Windows Azure.
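As referenced in the Blobs ingredient above, here is a minimal upload sketch using the Windows Azure StorageClient library; the account credentials, container name, and class are placeholders rather than part of the recipe itself:

```csharp
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

public class MediaStore
{
    // Placeholder connection string; a real role would read this from its service configuration.
    private readonly CloudBlobClient blobClient =
        CloudStorageAccount
            .Parse("DefaultEndpointsProtocol=https;AccountName=ACCOUNT;AccountKey=KEY")
            .CreateCloudBlobClient();

    public string SaveUpload(string fileName, byte[] contents)
    {
        // Containers behave roughly like top-level folders for blobs.
        CloudBlobContainer container = blobClient.GetContainerReference("usermedia");
        container.CreateIfNotExist();

        CloudBlockBlob blob = container.GetBlockBlobReference(fileName);
        blob.UploadByteArray(contents);

        // The blob URI can be served directly, or fronted by the CDN ingredient.
        return blob.Uri.ToString();
    }
}
```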


  • TFS Build Automation - Web Deployment Project error

    - by gracejz
I'm trying to build a web deployment project using the TFS automated build process. When I build the project directly in Visual Studio 2008 it works fine, but from TFS I get the following error:

"C:\Users\tfsservice\AppData\Local\Temp\TestProduct\TestSolution\BuildType\TFSBuild.proj" (EndToEndIteration target) (1) ->
"C:\Users\tfsservice\AppData\Local\Temp\TestProduct\TestSolution\BuildType\TFSBuild.proj" (CoreCompile target) (1:2) ->
"C:\Users\tfsservice\AppData\Local\Temp\TestProduct\TestSolution\BuildType\TFSBuild.proj" (CompileConfiguration target) (1:3) ->
"C:\Users\tfsservice\AppData\Local\Temp\TestProduct\TestSolution\BuildType\TFSBuild.proj" (CompileSolution target) (1:4) ->
"C:\Users\tfsservice\AppData\Local\Temp\TestProduct\TestSolution\Sources\TestSolution.sln" (default target) (6) ->
"C:\Users\tfsservice\AppData\Local\Temp\TestProduct\TestSolution\Sources\WebDeployment\WebDeployment.wdproj" (default target) (48) ->
(CreateVirtualDirectory target) ->
  C:\Program Files\MSBuild\Microsoft\WebDeployment\v9.0\Microsoft.WebDeployment.targets(676,5): error : Some or all identity references could not be translated.

I made sure that the NETWORK SERVICE account has permission to access all the web folders. Any ideas?

