.NET Services - Cloud Interoperability
Azure Dev Challenge Questions Tackled
Last month Microsoft announced its first-ever Azure Developer Challenge. Since then, we've received strong interest and several questions about the competition.
The question I most frequently receive is, "Why is the Azure Dev Challenge open to US residents only?"
I am pleased to announce that starting today the Azure Developer Challenge is open to developers outside of the US. Without starting to sound like a lawyer: there were legal and regulatory considerations that took us a little bit longer to resolve for international participants. We now have those worked out and can't wait to see your applications. See my previous post for more info on getting registered.
Separately, we received questions about interoperability and the use of existing code. Specifically... "Can we use jQuery?" The answer is yes.
Experiment, have fun, and send us feedback about your experiences during the process of submitting an entry for the competition.
Remember, the deadline to submit contest app entries is June 18 for US developers. Developers outside the US have until July 9th to submit their application. So hurry up and go to www.newcloudapp.com to get started today! To follow the contest chatter on Twitter, search for "#newCloudApp".
Azure Developer Challenge – Judged by Om Malik and Michael Cote
As you know, I'm always up for a little friendly competition… I'm excited to announce that today, we are kicking off the first Azure Services Platform Developer Challenge.
Over the next two months, developers will have the opportunity to show off what they can do on the Azure Services Platform. We're looking for innovative apps, developed with the user experience in mind, that are applicable to the real world and highlight the new opportunities cloud computing brings to developers.
There are a couple of surprise twists to this challenge. The first twist is that this contest will have three winners – a best .NET application, a top PHP application, and one more we'll talk about in just a bit. For the .NET application category, we want to see a great .NET application running on Windows Azure that uses ASP.NET or Silverlight and incorporates additional Azure services such as .NET Services and Live Services. Incorporate other Microsoft services, 3rd party services, or other cloud services and author a unique web, mobile, or desktop application.
For the PHP contest category running on Windows Azure, we want to see a PHP application taking interoperability to the next level by integrating with other Azure services, 3rd party web services and APIs, and services provided by other cloud providers.
The second twist to this contest is that we're really excited that the winner of the .NET and PHP application categories will be judged by industry leaders Om Malik, founder and senior writer for the GigaOM Network, and Michael Cote, IT Management Lead analyst at RedMonk. The winners chosen by the judges will be announced at Structure 09 on June 25.
And now the third and final twist… We are inviting the community of web and software developers to decide the third overall community winner via online voting. This winner will be announced on June 30.
But before you head off to create your masterpiece, make sure you check out www.newcloudapp.com for official rules, registration info, important deadlines and to learn what money, fame, and glory are in store for three creative developers.
Expect to see some chatter about this on Twitter. Search for "#newCloudApp" as developers start sharing their creations online.
Good luck!
BizTalk Adapter Pack 2.0 announced at WPC
This week I returned to my home state of Texas to attend the Worldwide Partner Conference. Returning to my home state didn't get me back to speaking in my southern drawl, but it did get me speaking with partners about something we are both excited about. The news may not be as big as the state of Texas (but what is?); however, the BizTalk Adapter Pack 2.0 is far from a Rhode Island sized announcement.
So what is it? The BizTalk Adapter Pack is a separate SKU from the rest of the BizTalk Server family of products. It continues to simplify the ways that customers and partners can connect to line-of-business (LOB) systems. Generally speaking, when developers want to build an application that draws information from an LOB system, they use a message broker technology with an application adapter, or they write directly to the LOB APIs. Neither is particularly productive, especially in simple scenarios. The BizTalk Adapter Pack changes this by giving developers simple technology to connect directly to the LOB system without using a heavy mid-tier server.
What's new in V2? Specifically, in V2 we are delivering new adapters for the Oracle E-Business Suite and SQL Server. This builds on existing functionality from the first version, which RTM'd a few months ago and includes adapters for SAP R/3, Siebel and Oracle DBMS. A key value for partners is the WCF LOB Adapter SDK (available as a free download from MSDN), on which Microsoft has built the Adapter Pack. The SDK enables a platform that makes adapter development much easier by providing support for key capabilities (like metadata browse & search and connection pooling) out of the box.
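To make the adapter idea concrete, here is a minimal sketch of the two capabilities called out above, metadata browse & search and connection pooling. It is written in Python purely for illustration (the real WCF LOB Adapter SDK is a .NET library with a different API), and the operation names are hypothetical:

```python
# Illustrative sketch only: the real WCF LOB Adapter SDK is a .NET library
# with a different API. This just shows the two capabilities named above.

class LobAdapter:
    """A toy 'adapter' over a line-of-business (LOB) system."""

    def __init__(self, system_name):
        self.system_name = system_name
        self._pool = []          # reusable connections (connection pooling)
        self._metadata = {       # hypothetical operation catalog
            "GetCustomer": {"args": ["customer_id"]},
            "CreateOrder": {"args": ["customer_id", "items"]},
        }

    def browse_metadata(self, search=""):
        """Metadata browse & search: list operations the LOB system exposes."""
        return [op for op in self._metadata if search.lower() in op.lower()]

    def acquire_connection(self):
        """Hand out a pooled connection instead of opening a new one each call."""
        return self._pool.pop() if self._pool else {"system": self.system_name}

    def release_connection(self, conn):
        self._pool.append(conn)

adapter = LobAdapter("Oracle E-Business Suite")
print(adapter.browse_metadata("customer"))   # ['GetCustomer']
```

The point of the sketch is the shape of the surface area: the developer asks the adapter what operations exist rather than reading LOB API docs, and connections are recycled rather than re-opened per call.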
Overall, the BizTalk Adapter Pack is another demonstration of Microsoft's long term commitment to interoperability. Customers choose how they want to connect application platforms and people, and we provide them with the tools.
BizTalk Server 2009
Today, we announced updated plans for the next major version of BizTalk Server. We launched the first version of BizTalk Server back in 2000. Eight years later, we've seen our installed base grow to 8,200 customers, making it the most widely deployed solution for enterprise connectivity in heterogeneous environments. We're hearing from our customers that BizTalk has become a core part of their infrastructure, running mission-critical applications. Our partners (over 1,500 of them) tell us that the applications and adapters they build for BizTalk have become a significant part of their business. This positive feedback is our greatest reward.
We're excited to offer more details on the next version of BizTalk Server—now dubbed BizTalk Server 2009 to reflect the full release that it is. We initially disclosed this as BizTalk Server 2006 R3, but it has so many exciting new features that it deserves to be referenced as a full release. BizTalk Server 2009 will focus on a few key areas; as always, these areas are determined based on what customers have told us are their priorities. They are platform support, SOA and Web Services, B2B integration and developer productivity. In particular, the platform updates enable greater scalability and reliability, new Hyper-V virtualization support, and many advances in the latest developer tools.
I should also note that we're still on track for the final release of BizTalk Server 2009 in 1H of CY2009. For all the features and details, go here or to PressPass.
We've actually already delivered a first Community Technology Preview (CTP) to select customers and we're getting great feedback! The next CTP update is coming sometime in Q4 of CY08. We'll use this broad feedback from customers and partners to help us validate the features and readiness of the product.
Looking into the future, the goal is to continue to provide a BizTalk Server release approximately every two years, plus additional interim releases of service packs as appropriate. At each milestone, we will take advantage of as much platform technology as is reasonable and consumable by our customers and will take advantage of updates to .NET, Visual Studio, Windows Server and SQL.
We're also hearing from many of our BizTalk customers that they're beginning to accelerate the development of more complex composite applications. As you know, one of our missions with "Oslo" is to simplify the development/deployment/management of composite applications through a model-driven approach to the application lifecycle. We see our BizTalk customers benefitting from Oslo's core technologies, and are committed to providing choice, flexibility and a clear integration path for those who are interested in taking advantage.
BizTalk Services "R12" Release – Workflow
Drum roll please… for those of you who are as interested in the advent of cloud services as we are, this is big. Today, we released the BizTalk Services "R12" Community Technology Preview (CTP). Just as a refresher: "BizTalk Services" is the code name for an incubation of our SOA platform-in-the-cloud offering from Microsoft. BizTalk Services provides Messaging, Identity and Workflow (our latest addition), enabling developers to extend existing premises applications and build new composite applications. See my previous post for additional information.
While the BizTalk Services "R12" CTP includes a variety of updates, the piece that stands out is the release of the anticipated Workflow capabilities. The new cloud-based Workflow capabilities enable 'service orchestration' from the cloud. This functionality is based on Windows Workflow Foundation (a .NET Framework component) and can orchestrate services that connect to systems in your enterprise, or to systems running anywhere on the Internet, via Web services messages. Using this service, you can define the interaction of any web-addressable services.
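As a rough mental model (this is not the BizTalk Services or Windows Workflow Foundation API, just a Python sketch of the idea), 'service orchestration' means a workflow passes a message through an ordered set of activities, each of which would stand in for a call to some web-addressable service:

```python
# Conceptual sketch of 'service orchestration': a sequential workflow runs
# a message through an ordered set of activities. In the real service each
# activity would invoke a web-addressable endpoint; here they are stubs.

def sequence(*activities):
    """Compose activities so each one's output feeds the next."""
    def run(message):
        for activity in activities:
            message = activity(message)
        return message
    return run

# Stand-ins for two hypothetical services:
def validate_order(msg):
    msg["validated"] = True
    return msg

def route_to_fulfillment(msg):
    msg["routed_to"] = "fulfillment"
    return msg

workflow = sequence(validate_order, route_to_fulfillment)
result = workflow({"order_id": 42})
print(result)  # {'order_id': 42, 'validated': True, 'routed_to': 'fulfillment'}
```

The value of hosting this in the cloud is that the sequencing logic lives outside any one enterprise, so the services being coordinated can sit on either side of the firewall.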
In addition to the Workflow functionality, the BizTalk Services Identity Service has been expanded and enhanced to enable more flexibility for scenarios demanded by our customers. R12 introduces a new approach for creating, viewing, and managing access control rules.
The new BizTalk Services "R12" CTP is online and available now for your use and the SDK is available at https://labs.biztalk.net. Whether or not you currently have an account, now's the time to try it!
Bring back the red shirt
Sign the petition here! See you at Mix.
Convergence - SOA, SaaS, Modeling, Virtualization
Last year at TechEd we talked about deploying technologies in a blended way – both on premises and in the cloud. We also talked about a framework for thinking about which applications might benefit the most from the blended world. Specifically, applications can be grouped in two buckets – core and commodity. Core – things that make your business really unique. Commodity – things that are important, but not secret sauce – think Expense Reporting. Those who started the journey a year ago are starting to see real benefits.
Here we are at TechEd a year later with some new trends on the horizon – this time, it's the convergence of technologies that has us really excited. There are four distinct major technology trends which are beginning to converge. Looking into the future, we see a perfect storm of productivity and application richness brewing. Specifically, SOA, SaaS, Application Virtualization and Modeling will collide and spark a wave of application creation that we haven't seen since Al Gore invented the Internet. Let me paint you a picture - developers will compose business critical applications from services they didn't author, run them in datacenters they don't own, manage them at a policy level, and pay for them by the drink.
Let's take a look at these four trends. Today, developers build services and expose them across the firewall, which looks a lot like SaaS. Modeling technologies are being used to aggregate services into composite apps. Cloud-based application virtualization will allow developers to think about their infrastructure in logical terms rather than physical – imagine datacenter-as-you-go. No more worrying about building something and having too much or too little. You eliminate the Three Bears problem: not too hot, not too cold, but just right.
Microsoft is architecting for convergence. When it comes to building apps that will live in the cloud, we know developers will want to harness their existing skills, leverage their existing apps and connect to existing third-party apps. Today we have several active incubations running – BizTalk Services, SSDS and Live Mesh, to name a few. We'll say a lot more about our plans for the cloud at the PDC in October.
Countdown to Oslo: Introducing "M" and "Quadrant"
Robert Wahbe and I were out on the road this week meeting with press and members of the dev community to talk more about Oslo. For those of you who have been following Oslo, the timing won't come as a surprise: almost exactly one year ago, we announced it at the SOA BP Conference. Our goal? To make it easier to build apps using models and to break down silos of application development. That means that whether you write the user specification, write the code, or have to manage the application, the model for what the app should do is shared. Any of us who have written or managed apps know that getting everyone to think about requirements in the same way can be a huge challenge. Ultimately, for developers and others in the application lifecycle, this will mean more productivity, transparency and flexibility. For partners, it will mean more opportunities to design domain-specific apps (e.g., through the creation of DSLs).
One of the things we promised when we made that first announcement was to deliver a CTP of Oslo at the PDC – a commitment I'm very proud to say we'll keep. In less than three weeks, those attending the PDC will get a first-hand look at the three technologies that make up Oslo:
- A language – codenamed "M" – that helps people create and use textual domain-specific languages (DSLs) and data models
- A relational repository – that makes models available to both tools and platform components
- A tool – codenamed "Quadrant" – that helps people define and interact with models in a rich and visual manner
Now, in case those two codenames sound new: they are. While we've talked about both the tool and the language before, today is the first time we've publicly referred to them by their codenames, "Quadrant" and "M," which you'll see reflected in the CTP packaging.
The upcoming CTP was a pretty hot topic during a conversation Robert and I had with a group of developers during dinner a few nights ago. Readers probably know some of these folks already: Shashi Raina, Don Demsak, Marc Adler, Nick Landry, Ambrose Little and Bill Zack. Across the board, these guys seem eager to get their hands on the bits to see for themselves what Oslo is all about (obviously pretty encouraging for me and the rest of the team). The language ("M") in particular seems to be of interest, which is consistent with what we've been hearing over the past year. Here's why… when we talk to developers about "Modeling" many have the same immediate reaction. They say something like "modeling isn't for me, I'll leave the pretty pictures to the business folks and architects. I write code and live in a text editor."
Which is where "M" comes in. We respond by saying that modeling and writing code are not mutually exclusive; in fact, they are intertwined. "M" gives developers the ability to model applications in text rather than graphics. Nothing against graphics – we have a great tool for that too: Quadrant. "M" will add the text component that has been missing in the industry so far.
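To give a flavor of what textual modeling looks like, here is a small, hypothetical model in the spirit of "M". This is a sketch based only on early public examples; the syntax in the shipped CTP may differ:

```
module CRM
{
    // A textual data model: a Contact type and a Contacts extent.
    type Contact
    {
        Id : Integer32;
        Name : Text;
        Email : Text?;    // '?' marks the field as optional
    }

    Contacts : Contact*;  // a collection (extent) of Contact values
}
```

The appeal for the text-editor crowd is exactly this: the model is source code you can diff, review and version, not a diagram.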
The dinner conversation got very engaged in a few spots, especially when we started discussing the idea of bringing business analysts and other users closer to the application via the more visual part of the equation, a.k.a. Quadrant. For instance, most people only think about business analysts participating in application design. Oslo will allow folks to go well beyond that – if a developer delivers a set of DSLs, an end user could potentially compose the application they need in real time. Most at the table saw the value (and business opportunities) in this, but a couple expressed some distaste at the idea of analysts and others directly interacting with an app. Makes me wonder if everyone else is as passionate about this…?
XAML was another big topic of discussion, with some lively debate over its full potential and role in the Microsoft platform as well as the ability to target other non-MSFT runtimes.
I'll be talking about all of these areas in more detail over the next couple of weeks. In the meantime, only about 17 days to go until that Oslo CTP is available…
Datacenter Expansion and Capacity Planning
Update on recent developments regarding Windows Azure Datacenter Expansion and Capacity planning - here.
Higher Standards for Web Standards
Since the emergence of web services in the 90s, we've seen an explosion of standards and standards bodies. Sometimes, they emerge based on new innovations, other times they're created to unblock a stalemate on a similar standard or organization. Occasionally they are created simply to change the technology landscape in a way that is more favorable for certain vendors.
A question that I am asked over and over is – "Does Microsoft support standard X?" or "Is Microsoft going to join standards body Y?" The question we should spend more time debating is "What are the key technology or interoperability gaps and how do we fill them?" As new initiatives emerge, we research the business and technical need before taking action. We do this by putting ourselves in the shoes of actual developers and IT Pros and asking "what are the barriers I'm facing today, and what do I need to solve them?" Pragmatism over theory, always.
In many cases, the right answer isn't necessarily to define something new, but instead to carefully consider whether technologies or initiatives already exist to solve the problem. In the end, we should judge the strength of standards on industry and customer adoption alone. As an example, IBM recently announced a consortium called WSTF (the Web Services Test Forum), which leaves us a tad puzzled.
As of today, the WS-* standards are largely complete within W3C, OASIS, WS-I, DMTF, etc., and are widely implemented in infrastructure products and used by organizations all over the world. We were thrilled to participate in the OASIS announcement just last week on WS-RX, WS-TX and WS-SX. With regard to testing, we think it is critical that customers be able to propose scenarios that match their real-world interoperability needs. Equally important: both successes and failures must be made public. This is why we're still evaluating our participation in WSTF.
Microsoft and other vendors have been participating in a variety of forums for quite some time to help crack the interoperability code. A few examples of forums that have yielded real world results for developers over the years are:
- WS-* specification development at W3C and OASIS. This formal process defined the protocol specifications for enabling service composition through addressable, secure, transacted, reliable, policy-based, end-to-end messaging.
- WS-I is the base-layer process for integration and interoperability, upon which other, more domain-specific or scenario-specific tests, profiles, and guidance are built. It is the primary WS-* interoperability testing focus for Microsoft.
- Interoperability Plug-fests are more informal events at which multiple vendors get together to test interoperability against all other interested attendees, using agreed-upon scenarios for current and forthcoming products. The test tools that Microsoft developed remain available at https://131.107.72.15/endpoints/Default.aspx. These endpoints (and similar endpoints from other vendors) implement dozens of scenarios that customers and vendors can use to validate interoperability.
- Greg Leake runs one of the largest interoperability labs in the world and publishes results and guidance on WS-* / WebSphere / .NET interop. Stay tuned for more here – Greg is just completing his work on WebSphere 7.
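To ground what wire-level interoperability testing actually exercises, here is a small offline sketch in Python that builds a minimal SOAP 1.1 envelope and checks that it parses with the expected namespace. The operation name and its namespace are made up; real plug-fest tests run the same kind of check against live vendor endpoints:

```python
# Offline sketch of the wire format that WS-* interop testing exercises:
# build a minimal SOAP 1.1 envelope and verify it is well-formed XML in
# the expected namespaces. Real plug-fest tests hit live endpoints.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def make_envelope(body_xml):
    """Wrap a payload in a SOAP 1.1 Envelope/Body."""
    return (
        f'<soap:Envelope xmlns:soap="{SOAP_NS}">'
        f"<soap:Body>{body_xml}</soap:Body>"
        "</soap:Envelope>"
    )

# Hypothetical payload: the operation name and namespace are made up.
envelope = make_envelope('<Echo xmlns="urn:example:interop">hello</Echo>')

root = ET.fromstring(envelope)                 # must parse as XML
assert root.tag == f"{{{SOAP_NS}}}Envelope"    # correct envelope namespace
body = root.find(f"{{{SOAP_NS}}}Body")
print(body[0].tag)  # {urn:example:interop}Echo
```

When two vendors' stacks each produce and accept envelopes that pass checks like these for an agreed scenario, that is an interop result worth publishing, success or failure.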
Separately, but potentially equally interesting… An interoperability project that Microsoft recently joined is the Apache Stonehenge incubator effort. We look forward to expanding our efforts in partnership with other vendors on this front. Here's the latest.
Do we need additional standards? The answer is almost certainly yes. But before touting a new standard or standards organization, vendors need to be clear about what specific issue is being solved and hold all parties accountable for doing so. Public access is a key criterion we have in mind as we think about WSTF. So what can you do? Continue to contribute at all levels; standards are only as good as the community formed around them. As always, let us know what you think.
IDS Scheer announces the release of ARIS for BizTalk
June has been an active month for the Business Process Alliance. In a recent post, I shared some thoughts on how several cutting-edge technologies—SOA, modeling, SaaS and virtualization—are beginning to converge. Recent product announcements show that it's already beginning to happen. That said, you're probably asking, 'why do I care again?' The short answer… we've watched business and IT slowly move toward the right levels of alignment over the last decade, and many have cited SOA & Business Process technologies as vehicles.
At the Process World event in Berlin, IDS Scheer announced the release of ARIS for BizTalk, which generates BizTalk applications from process models. They will release a second product in the near future that allows data from BizTalk to be exposed in their process monitoring software. In other words, you can now model a process in ARIS SOA Architect, take the process represented in either BPEL or BPMN code from the SOA Architect product and convert it directly into a BizTalk application using ARIS for Microsoft BizTalk. This is a great demonstration of design and runtime technologies working together thru standards. The second, as-yet-unnamed product also announced at Process World will enable a full-cycle process modeling and execution experience by extracting data from the BizTalk BAM database and presenting it in the ARIS PPM dashboard.
The combination of these two products will allow users to model a process, generate an executable application, run the application and monitor the resulting application performance so that adjustments can be made to the model with the intent of optimizing performance. This release is the result of over a year of work by a joint MS and IDS Scheer team to realize the goal of end-to-end support of process model execution. We think this represents a huge step forward for developers, and look forward to continuing work with the BPA on Oslo—this is just the beginning!
A couple of other key announcements: on June 3, K2 announced their new BlackPoint product, a derivative of BlackPearl. Aimed at providing a low-cost way of building workflow applications in MOSS with an Office-style designer, BlackPoint carries over innovations from BlackPearl that make it friendly to non-technical users. Later, on June 10, SOA Software announced that they had certified their SOA governance solution to work with WCF, allowing .NET applications to act as fully governed peers in heterogeneous SOA environments that might include other vendors' products.
The BPA members just continue to create exciting new products, making it easier to build great SOA & BPM solutions with Microsoft technology!
Know SOA or No SOA?
You've probably seen the chatter as a result of Anne's recent post. So, the question on all SOA minds now turns to the fate of SOA: have we seen SOA's last act or will there be an encore? With the economic downturn and companies increasing their focus on near term results, we're a bit surprised over the fervor. A perfectly logical response might have been "Yeah, and?"
This debate will continue, that's for certain. There are folks on either side of the fence who are passionate about this topic, and more than a few vendors have tied their revenue to it in a material way. Read some of the responses thru that lens – opinions on SOA being dead, or alive and well, seem to reflect economic passion more than anything else.
The 'SO' of SOA has been around for over a decade. It started in the form of Web Services back in the late 90s; companies were looking to find efficiencies between systems to do things like employee on-boarding. Soon after, the light bulb went on, and organizations extended this federation beyond internal systems to foster better integration with partners and customers. Federation will continue as customers turn smart decisions into technical reality through service orientation in a blended world of computing (on premises and in the cloud, something we've been discussing and investing in for quite some time now).
If SOA fostered all the fantastic innovation above, why are companies struggling with it and abandoning their SOA projects? Let's be clear: many SOA projects have been successful and will continue, specifically the ones born from a middle-out, project-based approach and grounded in business value. Customers like M.D. Anderson Cancer Center and Vail Resorts are great examples. However, oftentimes there has been too much focus on the "A" in SOA instead of the "SO." The bottom line: projects that have been monolithic undertakings are the ones that never made it out of the lab, and are probably being discontinued today.
We've been out on the road discussing this topic with customers as part of the SOA Roadshows, and we have a lot more to say, so look for more soon from the Microsoft SOA & Business Process Conference happening later this month. In the meantime, enjoy the show…
Microsoft Participating in the Object Management Group (OMG)
This week I had the privilege of sitting with Bob Muglia while he taped a short video on his thoughts about how model-driven development could transform the way we develop and manage applications. Bob also announced Microsoft's participation in the Object Management Group (OMG) standards organization, which owns key modeling standards like UML and BPMN. Given Bob's history, it was great to hear his perspective on how modeling will impact the next generation of computing.
Bob spoke about the opportunity that Microsoft sees to take this kind of approach mainstream – not just the Fortune 50 or Fortune 500, but the Fortune 5 million… In the video, Bob talks about modeling as a core focus of Microsoft's Dynamic IT strategy and highlights three areas of focus for Microsoft: platform, personas and partners.
What is the significance of these three "Ps" aside from being a super-cool alliteration? Well, they are the three-part strategy for bringing modeling into the mainstream. We think that by driving modeling capabilities into the core .NET platform, expanding the types of personas within an organization that can interact with models, and finally expanding the partner ecosystem to create a broad array of solutions and standards, we give customers of all sizes the ability to benefit from modeling as a part of their application development. In many cases, customers will benefit from modeling through the natural evolution of existing technologies. They won't necessarily know they are using modeling when they deploy an app – their life will just get easier.
I've talked about "Oslo" many times here. "Oslo" is the codename for a set of technical investments that will significantly enhance modeling capabilities for Microsoft, our partner ecosystem and our customers. It consists of:
· A tool that helps people define and interact with models in a rich and visual manner
· A language that helps people create and use textual domain-specific languages and data models
· A repository that makes models available to both tools and platform components
It's also important to note that modeling is a company-wide investment, which means it doesn't start or stop with "Oslo" – it includes a lot of exciting work we're doing as part of Visual Studio Team System "Rosario", System Center, BizTalk Server, SQL Server and more.
Want to know more about the OMG news or what Bob had to say? Check out the video.
Microsoft Positioned in the Leaders Quadrant of Latest Magic Quadrants for Application Infrastructure
The latest round of research is out from Gartner, and they have positioned Microsoft in the Leaders Quadrant of all three Application Infrastructure Magic Quadrants: the Magic Quadrant for Application Infrastructure for SOA Composite Application Projects, the Magic Quadrant for Application Infrastructure for New Systematic SOA Application Projects, and the Magic Quadrant for Application Infrastructure for Back-End Application Integration Projects.
In my humble opinion, placement in the Leaders Quadrant validates Microsoft as a leading provider of platform technology for service orientation and integration. We believe that these reports highlight the depth of our offering in these spaces and recognize our potential for future innovation.
For more information on today's news, please click here for details on the press release.
On the legalese:
The Gartner Magic Quadrant is copyrighted 2008 by Gartner, Inc., and is reused with permission. The Magic Quadrant is a graphical representation of a marketplace at and for a specific time period. It depicts Gartner's analysis of how certain vendors measure against criteria for that marketplace, as defined by Gartner. Gartner does not endorse any vendor, product or service depicted in the Magic Quadrant, and does not advise technology users to select only those vendors placed in the "Leaders" quadrant. The Magic Quadrant is intended solely as a research tool, and is not meant to be a specific guide to action. Gartner disclaims all warranties, express or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.
The Magic Quadrant graphic was published by Gartner, Inc., as part of a larger research note and should be evaluated in the context of the entire report. The Gartner report is available upon request from Microsoft Corporation.
Microsoft's Annual SOA and BP conference - SOA meets Cloud
Today we wrapped up the 6th annual SOA and BP conference here in Redmond. This was the 14th stop on the tour which continues around the world thru April. Yesterday, I had the privilege of delivering the keynote to kick off the conference and took that opportunity to talk about both the last few weeks and the road ahead.
As we talk to customers around the world, it's clear that the message on SOA being a "how" rather than a "what" is really sinking in. Now we spend more of our time talking about extending Service Orientation than we do defining it. I took a very non-Microsoft approach – instead of blathering on about technology, I instead spent time debunking the myths (which still pervade the space) and connecting customers with their peers to share what they have learned and best practices.
One thing that was clear from talking to customers this week is that the SOA landscape continues to evolve. As I wrote months ago, the convergence of key technologies like Virtualization, Cloud, Modeling, SOA and SaaS continues at a rapid pace. It's clear that cloud computing will have a significant impact on SOA moving forward. So much so that one might argue that sooner rather than later, the app pattern that is Service Orientation will be a given, much the same way that no one refers to Object Orientation as a "feature" of an app anymore. Service Orientation will be the flour of the composite app cake.
While many have taken time to question Microsoft's strategy with Service Orientation, what we have always said still rings true – it's about web services, starting small and being pragmatic. We are taking Service Orientation (and as much Architecture as is needed to solve the problem) to the masses by making key components available directly in the .NET Framework. Our investments in cloud computing will only help as developers will have additional options for instantiating services in a geo-scalable way.
Thanks to all who attended and watched it live all over the world! As always, tell us what you think…
MIX 09 - Return on Experience
Posting from The Venetian Hotel in Las Vegas where I'm taking part in Microsoft's 4th annual MIX event, where the crowd is a sea of red… red shirts that is, in honor of Scott Guthrie's style.
Looking around I see a diverse group of web designers and developers eager to learn how businesses can achieve a 'return on experience' by using a new generation of development tools, web platform technologies and software plus services to better engage their customers. Web 2.0 technologies have become a business norm and an essential customer interaction point for most businesses.
So what does this have to do with MIX? Well, MIX provides a forum for developers to learn about Microsoft's newest technologies and developer tools -- whether they be on the web or in the cloud -- so that they can engage their customers and forge deeper connections through rich online experiences.
From the Web to the Cloud
Using the web as a key access point to build rich content, social media applications and innovative business applications isn't new. In fact, Microsoft has provided, and continues to provide, the tools and technologies that enable web developers to build these rich applications, including the Web App Gallery, the Silverlight 3 Beta, Expression, and more.
With changing economics and technical innovations, we are taking this opportunity to extend the web workload into the cloud, helping developers harness the power they want and need, at scale, without having to manage the complexities of hardware. Moving beyond hardware, Microsoft's Azure Services Platform helps developers focus on their applications and user experiences.
When Microsoft began thinking about its software plus services vision over three years ago, we knew that in order to help our customers solve their business problems, they would need a balance between on-premises and cloud solutions, and the choice to determine which workloads they moved in and out of the cloud. We've created a dynamic internet-scale cloud services platform hosted in Microsoft data centers, which provides an operating system and a broad set of developer services that can be used individually or together. Today, I'm happy to announce the roadmap updates with the SDS relational database capabilities, Windows Azure updates (including FastCGI, .NET Full Trust & Geolocation), and the .NET Services interop enhancements.
For those who weren't able to attend MIX in person – here are few things to consider.
Get Involved
· Visit the MIX 09 website and check out the latest innovations on the web and cloud, or stream the keynotes from your desk.
· Begin leveraging Microsoft's cloud, visit the Azure website and register for the Community Tech Preview.
· Download the Silverlight 3 beta and take advantage of the newest tools outlined above.
· Check out the WebPI and the App Gallery.
For those of you who made the trip to MIX, I'll see you at tonight's party at TAO!
MIX Day 3: Oslo - "M" and the MSC
Last Friday we spent time talking about Oslo, which included a drill into "M" and demos. Doug Purdy and Chris Sells presented their session 'Developing RESTful Services and Clients with "M"'. During the session, Doug demonstrated two DSLs: a sample DSL for RESTful client development called "MUrl", and a second DSL called "MService" which targets RESTful service development. These DSLs are part of our broader efforts to highlight how Oslo can help increase productivity in web development.
We also announced a community-driven process for getting input on the "M" specification called the "M" Specification Community, or MSC. The MSC is an online discussion group that we will use to gather feedback on the M specification, before we finalize it under the terms of the Open Specification Promise (OSP), a commitment we disclosed at PDC. The MSC will allow a diverse set of industry folks to participate in M's development, which will in turn make M an even better and more approachable tool for developers. We are interested in working with the industry to generalize an approach to model-driven programming and develop consistent programming concepts to make this a mainstream application development activity.
So, join the MSC, download the Oslo CTP, and check out the DSLs here, and please keep the feedback coming. Web developers - we think this content is especially relevant to you, and are excited to hear what you think.
Moving beyond the "Manifesto"
As you might expect, several of us spent most of Thursday and Friday last week in conversations with developers, standards body members and other vendors regarding open standards for cloud computing and how we get there collaboratively. Being in this industry for so many years, I remember a time when new technologies and platforms did not produce much interest in standards and interoperability. It was great this time around to see broad support for openness in the cloud and transparency on the approach to interoperability. I was also happy to see a number of community-driven efforts spin up last week, which will provide enormously valuable feedback in defining the desired end-state. It's important for everyone to take a step back and remember this isn't about vendors; it's about developers and end-users.
As I indicated on Wednesday night, Microsoft welcomes the opportunity for open dialogue on cloud standards. To that end, we have accepted an invitation to meet on Monday at 4pm in New York at the Cloud Computing Expo with other vendors and members of standards bodies. From our perspective, this represents a fresh start on the conversation – a collaborative "do-over" if you will.
Moving Toward an Open Process on Cloud Computing Interoperability
From the moment we kicked off our cloud computing effort, openness and interop stood at the forefront. As those who are using it will tell you, the Azure Services Platform is an open and flexible platform that is defined by web addressability, SOAP, XML, and REST. Our vision in taking this approach was to ensure that the programming model was extensible and that the individual services could be used in conjunction with applications and infrastructure that ran on both Microsoft and non-Microsoft stacks. This is something that I've written about previously and is an area where we receive some of the most positive feedback from our users. At MIX, we highlighted the use of our Identity Service and Service Bus with an application written in Python and deployed into Google App Engine, which may have been the first public cloud-to-cloud interop demo.
But what about web and cloud-specific standards? Microsoft has enjoyed a long and productive history working with many companies regarding standardization projects; a great example being the WS* work which we continue to help evolve. We expect interoperability and standards efforts to evolve organically as the industry gradually shifts focus to the huge opportunity provided by cloud computing.
Recently, we've heard about a "Cloud Manifesto," purportedly describing principles and guidelines for interoperability in cloud computing. We love the concept. We strongly support an open, collaborative discussion with customers, analysts and other vendors regarding the direction and principles of cloud computing. When the center of gravity is standards and interoperability, we are even more enthusiastic because we believe these are the key to the long term success for the industry, as we are demonstrating through a variety of technologies such as Silverlight, Internet Explorer 8, and the Azure Services Platform. We have learned a lot from the tens-of-thousands of developers who are using our cloud platform and their feedback is driving our efforts. We are happy to participate in a dialogue with other providers and collaborate with them on how cloud computing could evolve to provide additional choices and greater value for customers.
We were admittedly disappointed by the lack of openness in the development of the Cloud Manifesto. What we heard was that there was no desire to discuss, much less implement, enhancements to the document, despite the lessons we have learned through direct experience. Very recently we were privately shown a copy of the document, warned that it was a secret, and told that it must be signed "as is," without modifications or additional input. It appears to us that one company, or just a few companies, would prefer to control the evolution of cloud computing, as opposed to reaching a consensus across key stakeholders (including cloud users) through an "open" process. An open Manifesto emerging from a closed process is at least mildly ironic.
To ensure that the work on such a project is open, transparent and complete, we feel strongly that any "manifesto" should be created, from its inception, through an open mechanism like a Wiki, for public debate and comment, all available through a Creative Commons license. After all, what we are really seeking are ideas that have been broadly developed, meet a test of open, logical review and reflect principles on which the broad community agrees. This would help avoid biases toward one technology over another, and expand the opportunities for innovation.
In our view, large parts of the draft Manifesto are sensible. Other parts arguably reflect the authors' biases. Still other parts are too ambiguous to know exactly what the authors intended.
Cloud computing is an exciting, important, but still nascent marketplace. It will, we expect, be driven in beneficial ways by a lot of innovation that we're dreaming up today. Innovation lowers costs and increases utility, but it needs freedom to develop. Freezing the state of cloud computing at any time and (especially now) before it has significant industry and customer experience across a wide range of technologies would severely hamper that innovation. At the same time, we strongly believe that interoperability (achieved in many different ways) and consensus-based standards will be valuable in allowing the market to develop in an open, dynamic way in response to different customer needs.
To net this out… In the coming days or weeks you may hear about an "Open Cloud Manifesto." We love the idea of openness in cloud computing and are eager for industry dialogue on how best to think about cloud computing and interoperability. Cloud computing provides fertile ground that will drive innovation, and an open cloud ecosystem is rich with potential for customers and the industry as a whole. So, we welcome an open dialogue to define interoperability principles that reflect the diversity of cloud approaches. If there is a truly open, transparent, inclusive dialogue on cloud interoperability and standards principles, we are enthusiastically "in".
Here are some principles on the approach we think better serve customers and the industry overall:
· Interoperability principles and any needed standards for cloud computing need to be defined through a process that is open to public collaboration and scrutiny.
· Creation of interoperability principles and any standards effort that may result should not be a vendor-dominated process. To be fair as well as relevant, they should have support from multiple providers as well as strong support from customers and other stakeholders.
· Due recognition should be given to the fact that the cloud market is immature, with a great deal of innovation yet to come. Therefore, while principles can be agreed upon relatively soon, the relevant standards may take some time to develop and coalesce as the cloud computing industry matures.
What do you think? Where do you think this best lives? An open Wiki? A conference? A summit where a lively give-and-take can get all the issues recognized in an open way? What elements of an open cloud are most important to you? Let us (all) know…
.NET Services - Cloud Interoperability
Speaking of standards - I'm thrilled to report that we will release the "M5" (Milestone 5) CTP (Community Technology Preview – think Beta) for .NET Services (part of the Azure Services Platform) tomorrow! For those who aren't familiar with this effort, here's the primer… Almost two years ago, we introduced these services – Service Bus (secure messaging across networks and firewalls), Access Control (user access to web apps and services across multiple standards-based identity providers), and Workflow (for orchestrating and routing Service Bus messages). From the beginning, .NET Services was designed for multi-cloud, multi-platform use. Developers can use the .NET Services in conjunction with ANY programming language (using support for industry-standard protocols, or via available SDKs for .NET, Java and Ruby) on ANY platform to create or extend federated applications. A good overview of .NET Services is available here.
This milestone contains enhancements to all of the services, including expanded support for standards like REST, ATOM, SOAP and HTTP. As I mentioned previously, we demonstrated cloud-to-cloud interop in action at MIX. Specifically, we showed how the Access Control Service and Service Bus could be integrated with a Python application deployed into Google App Engine using just two lines of code. As always, feedback from developers is critical to us. So, please take time to sign up for the CTP, and tell us what you think. We're on our way to commercial availability later this year and we need your help to get there.
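The exact code from the MIX demo isn't reproduced here, but because the services are exposed over plain HTTP, the general shape of such an integration can be sketched from any language. Here is a minimal Python sketch that composes a Service Bus endpoint URL and a token-request body; the solution name, endpoint path, and form-field names below are hypothetical placeholders for illustration, not the demo's actual values or the CTP's exact wire format:

```python
import urllib.parse

# "mysolution" is a hypothetical solution (namespace) name, not a real endpoint.
SOLUTION = "mysolution"

def access_control_token_body(solution, password):
    """Build a URL-encoded form body for a token request to the Access
    Control Service. Field names here are illustrative placeholders; the
    actual CTP protocol may use a different shape."""
    return urllib.parse.urlencode({"name": solution, "password": password})

def service_bus_url(solution, path):
    """Compose the HTTPS URL of a Service Bus endpoint from a solution
    name and an endpoint path."""
    return "https://{0}.servicebus.windows.net/{1}".format(solution, path.strip("/"))

# An app on any platform (e.g. Python on Google App Engine) could then POST
# a message to this URL with the token in a header, using its own HTTP stack.
url = service_bus_url(SOLUTION, "/messages")
body = access_control_token_body(SOLUTION, "secret")
print(url)   # https://mysolution.servicebus.windows.net/messages
```

The point of the sketch is simply that nothing in the flow requires the .NET SDK: it is URLs, headers, and form bodies that any HTTP-capable runtime can produce.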
If you haven't already, you can follow our cloud efforts by adding @Azure on Twitter.